Editor's Note: Each Friday, High-Def Digest's own HD Advisor will answer a new round of questions from our readers. If you have home theater questions you need answered, send an email to [email protected]
Answers by Joshua Zyber
Correction re: HD Audio Downconversion
Q: Last week, I read something in your Is HD Audio Worth it? answer that sounded strange to me. I've learned from you that a Dolby TrueHD track is completely different from a regular Dolby Digital track. When a Blu-ray has a Dolby TrueHD track, it must also contain a standard Dolby Digital track for compatibility reasons. This is different from DTS-HD Master Audio, which has a "core" conventional DTS track inside it. What sounded strange to me was these words: "When Blu-ray players downconvert both Dolby TrueHD and DTS-HD Master Audio soundtracks to standard Dolby Digital or DTS for output to older A/V receivers, they do so at the maximum allowable bit rates for those formats."
I don't see what you mean by a Blu-ray player downconverting Dolby TrueHD to standard Dolby Digital. As we know, they are different. I wonder if what you meant was that, on Blu-ray discs that have a standard Dolby Digital track, that track is recorded at a bit rate higher than the one used on DVDs. Can you clarify?
A: You are absolutely correct that I should have been more clear in that response. I used the word "downconvert" as shorthand to explain that (in the scenario described) the player outputs lossy Dolby Digital or DTS when playing Blu-ray discs authored with lossless Dolby TrueHD or DTS-HD Master Audio tracks. However, the word "downconvert" implies that the player itself applies processing to the original lossless tracks to scale them down or compress them. That is not the case.
As you mentioned, a DTS-HD Master Audio track contains an audio "core" in standard DTS format. If the Blu-ray player is connected to an older A/V receiver that isn't compatible with lossless audio, the player will extract the core and disregard the lossless extension. It then transmits that core to the receiver. This is, technically, different from "downconversion."
Likewise, a Blu-ray disc authored with Dolby TrueHD audio must also contain a backwards-compatible Dolby Digital track. In this case, the lossy track is a separate entity, not a "core." Sometimes, the Dolby Digital track is "hidden," in that you can't actively select it from the disc's menus. However, if the player is connected to a receiver incompatible with TrueHD, it will ignore the TrueHD track and default to the DD track instead. Again, this is technically different from downconversion.
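For readers who think in code, the fallback behavior described above can be sketched as a simple decision function. Note that this is purely an illustrative model of the logic, not an actual player API; the track descriptions and the capability flag are hypothetical names invented for this sketch.

```python
def select_audio_track(disc_tracks, receiver_supports_lossless):
    """Model which audio stream a Blu-ray player sends to the receiver.

    `disc_tracks` is a list of dicts describing the streams authored on
    the disc. Illustrative only -- not a real player interface.
    """
    for track in disc_tracks:
        codec = track["codec"]
        if codec == "DTS-HD MA":
            if receiver_supports_lossless:
                return "DTS-HD MA (lossless)"
            # The lossy DTS core is embedded *inside* the lossless track:
            # the player extracts it and discards the lossless extension.
            return "DTS core (lossy, extracted)"
        if codec == "Dolby TrueHD":
            if receiver_supports_lossless:
                return "Dolby TrueHD (lossless)"
            # TrueHD has no embedded core, so the player falls back to
            # the *separate* Dolby Digital track authored on the disc.
            dd = next(t for t in disc_tracks if t["codec"] == "Dolby Digital")
            return f"Dolby Digital (separate track, {dd['bitrate_kbps']} kbps)"
    return None

tracks = [
    {"codec": "Dolby TrueHD"},
    {"codec": "Dolby Digital", "bitrate_kbps": 640},
]
print(select_audio_track(tracks, receiver_supports_lossless=False))
# -> Dolby Digital (separate track, 640 kbps)
```

Either way, the key point stands: the player never processes the lossless stream down to a lossy one on the fly; it simply picks the appropriate pre-authored stream (or embedded core) and sends it along.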
I hope that clears it up. I apologize for the confusion this caused.
2k or 4k Video at Home?
Q: If you have already answered this in some way before, you may just let me know where to find the answer. Some people went crazy buying movies in the DVD era. Now that Blu-ray is here, DVD is the new VHS. After 1080p, is the next jump in home theater resolution, such as 2k, going to see obvious diminishing returns? Not that you can predict the future, but mathematically. If people purchase a lot of Blu-rays, is there a good chance that the next generation format might not feel so far removed from Blu-ray quality?
A: You're right that a similar question was asked once before. But it's been a while, so the subject is worth revisiting.
Our home video HD standard of 1080p is not far removed from the 2k resolution used in a majority of digital cinemas. Technically, 1080p refers to a resolution of 1920x1080 pixels. "2k" digital cinema is 2048x1080. That's only about a 7% increase in pixel count. Now consider that Blu-ray's 1080p resolution is a 500% increase over DVD.
An upgrade from 1080p to 2k would not be the same sort of quantum leap improvement as we've seen going from standard definition to high definition. In terms of resolution, it would barely be noticeable at the screen sizes most home theater owners use. 2k digital cinema also has a deeper color space that might help with the purity and accuracy of colors (and eliminate banding artifacts). Still, all in all, this would very much be an incremental improvement.
Some premium digital cinemas use high-end 4k projectors with a pixel resolution of 4096x2160. That's a more significant increase of more than 325% from 1080p. That can be greatly beneficial on large theater screens. However, in the home environment, all those extra pixels would have to be packed into the same screen sizes we're watching now. The result is that the pixels would be so small that the human eye couldn't resolve all that extra detail. This is much like why there's little point in buying a 15" 1080p display. At sizes that small, your eye can't tell the difference between 480p and 1080p. The pixels are just too tiny.
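The percentage figures above are simple pixel-count arithmetic, which you can verify yourself. Here's a quick sketch (the labels are just descriptive names for this example):

```python
# Pixel-count comparison of common video resolutions (width x height).
resolutions = {
    "DVD (NTSC)": (720, 480),
    "1080p": (1920, 1080),
    "2k DCI": (2048, 1080),
    "4k DCI": (4096, 2160),
}

def pixel_count(name):
    w, h = resolutions[name]
    return w * h

def percent_increase(from_name, to_name):
    """Percentage increase in total pixels going from one format to another."""
    return 100 * (pixel_count(to_name) / pixel_count(from_name) - 1)

print(f"DVD -> 1080p: {percent_increase('DVD (NTSC)', '1080p'):.0f}%")  # 500%
print(f"1080p -> 2k:  {percent_increase('1080p', '2k DCI'):.1f}%")      # 6.7%
print(f"1080p -> 4k:  {percent_increase('1080p', '4k DCI'):.0f}%")      # 327%
```

The jump from DVD to 1080p multiplied the pixel count six times over; the jump from 1080p to 2k barely moves the needle, and even 4k is a far smaller relative leap than the one we just made.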
Keep in mind also that the vast majority of modern feature films that are either shot digitally or use a Digital Intermediate stage during production are currently rendered and archived at 2k resolution. 4k productions are very rare. (Even 'Avatar' was shot with 2k digital cameras.) When a 2k movie is projected at a 4k theater, it's upconverted to the higher resolution, similar to how DVDs are upconverted for display on an HDTV.
What all of this comes down to is that I think 1080p is the "sweet spot" where home video will remain for the foreseeable future. Some manufacturers may experiment with higher resolutions, but 1080p will very likely remain the standard for quite some time.
'Avatar' vs. Blu-ray Players
Q: I just got an email from Best Buy because I ordered 'Avatar' from them. It is suggesting that I check my Blu-ray player for firmware updates. It states, "In order to provide the best possible picture and sound, this Blu-ray Disc uses advanced technology that may cause compatibility issues with some Blu-ray players." I can understand if it needs an update for newer copy protection or Java features, but what can newer firmware do with the main feature's picture and sound that couldn't be done before (especially considering this is a 2-D disc)? Is this maybe just a fib in order to excuse a disc that has compatibility issues with several players?
A: You've got it exactly right. There's nothing a firmware update for your player will do to improve the picture or sound of a Blu-ray disc that otherwise plays without issue. (Meaning, the disc actually loads and plays through without glitches.) The notice from Best Buy was really referring to the BD+ copy protection that 20th Century Fox used on the disc, which has already been reported to cause incompatibility problems with many standalone Blu-ray players. Unfortunately, Fox is the most paranoid of all the major Hollywood studios when it comes to concerns about video piracy, and it constantly changes its encryption protocols, to the detriment of law-abiding consumers who've purchased its product and can't get it to actually work.
You'll notice the ambiguous wording of the Best Buy announcement. When it says, "In order to provide the best possible picture and sound," that doesn't mean that the firmware update will give you better picture or sound. It means that without the firmware update, your player may not provide you with picture or sound from this disc at all.
The HD Advisor knows many things, but he doesn't know everything. Some questions are best answered with a consensus of opinions from our readers. If you can help to answer the following question, please post your response in our forum thread linked at the end of this article. Your advice and opinions matter too!
Plasma Screen Cleaners
Q: What screen cleaner would you recommend for my Pioneer plasma TV? I love my baby (and so does the girlfriend) and I want to keep her screen looking pretty. Pioneer said to use a dry microfiber cloth, but I've heard there are chemicals you can spray on a microfiber cloth and then clean the screen. Are these a waste of time, or do they really work? Do you have any personal experience with them?
Check back soon for another round of answers. Keep those questions coming.
Joshua Zyber's opinions are his own and do not necessarily reflect those of this site, its owners or employees.