Posted Fri Apr 23, 2010 at 11:00 AM PDT by Joshua Zyber
Editor's Note: Each Friday, High-Def Digest's own HD Advisor will answer a new round of questions from our readers. If you have home theater questions you need answered, send an email to [email protected]
Answers by Joshua Zyber
Is HD Audio Worth It?
Q: The difference between HD Audio and non-HD Audio was explained to me using the analogy of listening to an audio CD off of the store shelf versus converting it to MP3. MP3 compresses and loses some sound quality. To me, an MP3 audio track is just fine, and there is no major difference to the original CD. With that being said, I have been very hesitant to make the plunge to purchase a good HD Audio receiver. Then I went back to the store and spoke to someone else and they said the difference was HUGE! I don't know if they were just trying to sell me the product. I asked if they could demonstrate the difference between HD and non-HD Audio in their showroom. I am unable to find a place where I can hear the difference. I can tell a major difference between HD and SD Video. Would I be able to tell the difference between HD and regular audio? Is it a major difference? Are there more surround tracks that aren't normally heard with non-HD audio, or is just the quality different? I am sure there is a difference and an audio engineer could probably tell the difference. But for me, would it be worth dropping $500+ for HD Audio for the difference? I don't even know exactly what the difference is.
A: I've never pretended to be an audiophile. Like most viewers, I'm more attuned to differences in video quality than in audio. If you asked this question of a more golden-eared listener, you'd probably get a different answer. I try to approach decisions about home theater equipment as pragmatically as possible.
Yes, there is a difference in audio quality between the lossless HD audio formats (Dolby TrueHD and DTS-HD Master Audio) available on Blu-ray and the standard Dolby Digital and DTS formats carried over from DVD. The improvement does exist. However, the difference has perhaps been exaggerated by a lot of hype and placebo effect. Whether the difference will be noticeable enough to you to justify spending hundreds or thousands of dollars in new audio hardware will ultimately be a personal decision that only you can make. Some people are willing to spend top dollar for even the smallest incremental improvements, while others might find that frivolous.
The difference between a lossy audio format and a lossless format rarely has anything to do with the number of channels in the soundtrack. Although both TrueHD and DTS-HD Master Audio support up to 7.1 discrete channels, the vast majority of movie soundtracks are mastered at only 5.1 channels. Further, processing a 5.1 soundtrack through Dolby Pro Logic IIx decoding can simulate 7.1 almost indistinguishably from a track mixed discretely that way.
The real benefit of a lossless format is the clarity and fidelity of the audio. In order to compress a movie soundtrack, standard Dolby Digital and DTS discard frequencies in the audio. A lossless track, on the other hand, decodes to a bit-for-bit identical copy of the original studio master, without any data or frequency loss. This is usually most discernible in the subtle audio details, which come through more clearly. Many listeners may not notice a dramatic improvement at first, but once you get used to lossless quality, lossy formats will sound flat and less compelling by comparison.
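The "bit-for-bit identical" property can be demonstrated in miniature with any general-purpose lossless codec. The sketch below uses Python's `zlib` purely as a stand-in for TrueHD or DTS-HD Master Audio; the specific codecs differ enormously, but the defining guarantee is the same: decoding restores the original data exactly.

```python
import zlib

# A stand-in for raw PCM audio samples from a studio master.
master = bytes(range(256)) * 100

# Lossless compression shrinks the data for storage on the disc...
compressed = zlib.compress(master)

# ...but decoding restores every byte exactly, with nothing thrown away.
decoded = zlib.decompress(compressed)
assert decoded == master  # bit-for-bit identical to the master
print(len(master), "bytes in,", len(compressed), "bytes compressed")
```

A lossy codec, by contrast, could never pass that assertion: what comes out of the decoder is an approximation of what went in.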
With that said, regular Dolby Digital and DTS are not bad by any means. They are also very good audio formats, which served the needs of DVD viewers perfectly well for more than a decade. While it's true that both throw out parts of the original audio, they do so through a process called perceptual encoding. This means that they first discard frequencies either beyond the range of human hearing, or frequencies that would be masked by other frequencies in the soundtrack anyway.
The more compression used (and thus the lower the bit rate), the more audible frequencies are lost. The less compression (and the higher the bit rate), the more of those frequencies are preserved. With that in mind, it's worth noting that when Blu-ray players downconvert Dolby TrueHD and DTS-HD Master Audio soundtracks to standard Dolby Digital or DTS for output to older A/V receivers, they do so at the maximum allowable bit rates for those formats. The Dolby Digital and DTS you get from your Blu-ray player now are already higher quality than DVD, even though you're not using the lossless soundtracks to their fullest potential.
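To put some rough numbers on that, the arithmetic below compares soundtrack sizes at the commonly cited bit rates for these formats. Treat the rates as illustrative ballpark figures rather than authoritative specifications; the point is simply how directly bit rate translates into data, and how much more data the Blu-ray downconverts carry than typical DVD tracks.

```python
# Approximate size of a 2-hour soundtrack at various bit rates.
# Rates (kbps) are commonly cited figures, used here for illustration.
RATES_KBPS = {
    "Dolby Digital (typical DVD)": 448,
    "Dolby Digital (Blu-ray downconvert)": 640,
    "DTS (half-rate, common on DVD)": 768,
    "DTS (full-rate Blu-ray downconvert)": 1509,
}

def track_size_mb(kbps, hours=2):
    bits = kbps * 1000 * hours * 3600  # total bits over the runtime
    return bits / 8 / 1_000_000        # convert to megabytes

for name, rate in RATES_KBPS.items():
    print(f"{name}: {track_size_mb(rate):.0f} MB")
```

A full-rate DTS downconvert carries more than three times the data of a typical DVD Dolby Digital track, which is why even owners of older receivers hear an improvement from Blu-ray.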
The truth of the matter is that upgrading your speakers will have a much more dramatic impact on the sound quality of all the sources you listen to than the difference between lossy and lossless audio codecs. Most small home theater speakers are incapable of resolving the full frequency and dynamic range of a lossless soundtrack in the first place. After speakers, the next most important components in your audio chain are the Digital-to-Analog converters and the amplifiers in your A/V receiver. If you have an older receiver, but one that was considered high-end in its day and has high quality DACs and amps, you'll still get better results with that than a newer but lower-end receiver, whether or not it includes HD audio decoding.
If you're unhappy with your current receiver and have considered upgrading it anyway, focus on finding a new model that's better all around. I recommend one with HD audio support, of course, but that's not the only factor to look for. On the other hand, if you're happy with the audio quality you get now, you can probably push off a receiver upgrade to a later time. It sounds to me like video is a higher priority for you than audio.
Q: I realize consumers will need to have an HDMI 1.4 television and an HDMI 1.4 Blu-ray player to see 3-D. My main concern for my system is my A/V receiver. Will I also have to get a new receiver that's HDMI 1.4 compatible to pass the 3-D signal between the Blu-ray player and the TV?
A: At the present time, some of these details are still sketchy. I've heard conflicting answers to this question. My assumption is that most existing HDMI 1.3 receivers will not pass through a 3-D Blu-ray signal. If you plan to use your receiver for video switching, you will probably need to upgrade to a model that supports HDMI 1.4.
Your other option is to bypass the receiver for video passthrough. The Panasonic DMP-BDT350 3-D Blu-ray player has two HDMI outputs, one for video and one dedicated to audio. This will allow you to connect the player directly to the 3-D TV for video, while routing audio to your current receiver separately. It's awkward and a bit of a nuisance, but should work.
120 Hz Frame Rate
Q: I have a 32" Samsung 1080p 120 Hz TV. When I upconvert DVDs, is it best to have the 120 Hz feature on or off? I like the feature for Blu-rays, but I wonder if it affects DVDs at all? I'm thinking that it could possibly make fast motions pixelate. Also, when you review Blu-rays, do you review with 120 Hz or 60 Hz?
A: As I've mentioned previously, a 120 Hz HDTV always runs at 120 Hz. Any content fed into the TV will be converted to that frame rate. There is no way to turn this feature off, per se.
The TV has two methods of converting video to 120 Hz. The first is to simply multiply the original frames in the source. The TV will apply 2:2 Pulldown to a 60 Hz input signal, or 5:5 Pulldown to a 24 fps signal (such as from Blu-ray). A simple multiplication of the original frames will look seamless to your eye, but will not reduce the motion blur inherent to LCD displays.
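The frame-multiplication method is nothing more than repeating each source frame until the panel's fixed refresh rate is filled. A minimal sketch (the function name and frame labels are invented for illustration):

```python
# Frame multiplication ("pulldown"): repeat each source frame enough
# times to fill the panel's fixed 120 Hz refresh rate.
def pulldown(frames, source_fps, panel_hz=120):
    repeat = panel_hz // source_fps  # 2 for 60 fps input, 5 for 24 fps
    return [f for f in frames for _ in range(repeat)]

film = ["A", "B", "C"]               # three frames of a 24 fps film source
print(pulldown(film, 24))            # 5:5 pulldown -> each frame shown 5x
print(len(pulldown(film, 24)))       # 15 panel refreshes for 3 source frames
```

Because no new image content is created, the picture is exactly what was on the disc; each frame simply stays on screen across several refreshes.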
The second method is to apply frame interpolation, which analyzes the original frames in the content and artificially creates new "in-between" frames. This will make motion smoother, but also tends to make film-based movies look like they were shot on a camcorder. Many viewers, myself included, find this very distasteful. I'm going to assume that when you ask about turning the "120 Hz feature on or off," you're referring specifically to frame interpolation. Some TVs call this MotionFlow, or Auto Motion Plus, or SuperSmooth Motion Reality Jazzamatazz, or other similar names.
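Frame interpolation, by contrast, does synthesize new image content. Real TVs use sophisticated motion estimation; the toy sketch below only averages adjacent frames, which is enough to show where the artificial "in-between" frames come from (the function and the brightness values are invented for the example).

```python
# Toy frame interpolation: synthesize an "in-between" frame by
# averaging adjacent source frames. Real TVs use motion estimation,
# not simple averaging -- this only sketches the idea.
def interpolate(frames):
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)                                   # original frame
        out.append([(x + y) / 2 for x, y in zip(a, b)])  # synthesized frame
    out.append(frames[-1])
    return out

source = [[0, 0], [10, 20], [20, 40]]  # per-frame brightness values
print(interpolate(source))              # originals with new frames between
```

Those synthesized frames are what smooth out motion, and also what gives film the video-like "soap opera" look, since the 24 fps cadence of film is replaced by an artificially continuous one.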
Personally, I say turn all that junk off. But I realize that this will come down to personal preference. If you like giving all your movies that soap opera look, by all means turn it on.
As for why Blu-rays look different from DVDs with this feature, many HDTVs will only apply frame interpolation to 60 Hz sources. If your Blu-ray player is set for 1080p 24 fps output, this may trigger the TV to automatically disable frame interpolation.
My current D-ILA projector that I use for reviews is not a 120 Hz model. It displays 60 Hz content at 60 Hz, and multiplies 24 fps content to 96 Hz with 4:4 Pulldown. It has no frame interpolation features. Even if it did, I would never use that for review purposes. The point of our Blu-ray reviews on this site is to analyze the video quality of the content as it's encoded on the disc. Frame interpolation alters that content, and reflects the quality of the TV more than the quality of the disc.
The HD Advisor knows many things, but he doesn't know everything. Some questions are best answered with a consensus of opinions from our readers. If you can help to answer the following question, please post your response in our forum thread linked at the end of this article. Your advice and opinions matter too!
HD Advisor Column Titles & Pictures
JZ: I've put this call out there once before. It's time for me to solicit more reader suggestions for HD Advisor column titles that fit the numerical theme. Basically, I've got nothing for 56 and very little above that. Throw out those bad numerical puns, people! I'm also running out of ideas for pictures that fit the question-and-answer theme, preferably movie- or TV-related.
Check back soon for another round of answers. Keep those questions coming.
Joshua Zyber's opinions are his own and do not necessarily reflect those of this site, its owners or employees.