Editor's Note: Each Friday, High-Def Digest's own HD Advisor will answer a new round of questions from our readers. If you have home theater questions you need answered, send an email to [email protected].
If you've already sent a question and don't see it answered yet, please be patient as we work our way through them. To browse through previously answered questions, visit the main HD Advisor page.
Answers by Joshua Zyber
A/V Receivers and PAL Video
Q: I'm originally from the UK, but I'm moving to the U.S. with all my PAL DVDs. I'm going to get a completely new home theater set-up and decided to go for a multi-region/multi-system Blu-ray player and a multi-system TV. I'm guessing the quality will be slightly better without being first converted to NTSC. However, I am unsure about the A/V receiver. I know it deals with the video signal as well as the audio. So would the A/V receiver also have to be multi-system, or would the signal just pass through to the TV?
A: On most modern A/V receivers, if you route your video through the receiver, you have two options. You can set the receiver to simply pass through the video signal unprocessed. (Whatever you send in is what comes out the other end.) Or you can have the receiver deinterlace and/or scale the video. (For example, upconverting a standard-definition source to HD.) In most cases, I recommend simply passing the video signal through untouched. DVD upconversion is generally best performed in the disc player. However, many recent receivers include top-end scaling solutions such as Silicon Optix or Anchor Bay processing, so this isn't an inflexible rule.
I can't speak for every receiver brand or model on the market, but in my experience, most receivers will pass through both NTSC and PAL without issue. PAL compatibility is a more critical issue in the disc player and the TV, either of which might reject a PAL signal.
At the very least, all of the Denon receiver models that I've used in my own home theater have passed through native PAL successfully. If some of our other readers have experience with different receivers that aren't compatible with PAL video, I invite them to post those details in the forum thread linked at the bottom of this article.
Keep in mind that even if you buy a multi-system capable Blu-ray player, you will still encounter the issue of region coding. Blu-ray players sold in North America are locked to Region 1 for DVD and Region A for Blu-ray. If the player is PAL-compatible, it may play European discs that are authored as region-free, but it won't play any DVDs that have been authored as Region 2, or any Blu-rays authored as Region B.
To get around that problem, you'll have to specifically purchase a unit that's been modified to disable the region code restriction, or purchase a Do-It-Yourself modification kit. JVB Digital is a good source for either. I personally have an OPPO BDP-83 player that I modified with a DIY kit from JVB. The kit was relatively affordable, and very easy to install. The machine has successfully played any DVD or Blu-ray from anywhere in the world that I've thrown at it.
What Happened to Dolby TrueHD?
Q: Having upgraded my system to take full advantage of the lossless codecs from Dolby and DTS, it's been somewhat disheartening to see the number of titles available in Dolby TrueHD slow to little more than a trickle in 2010. Can we expect to see more Dolby TrueHD titles in the future or has DTS-HD Master Audio become the de facto standard for Blu-ray? Also, what is the likelihood of seeing both TrueHD and DTS-HD Master Audio as options on a single Blu-ray disc?
A: It's true that there does seem to be a trend in recent months for home video studios to move away from Dolby TrueHD and toward DTS-HD Master Audio. TrueHD isn't completely being phased out, but the momentum in the market is with DTS.
I suspect that this primarily has to do with the core+extension design of DTS-HD Master Audio, which incorporates a backwards-compatible standard DTS soundtrack as a "core" within the audio signal. Hardware not compatible with the full lossless format can simply ignore the MA extension and read only the core. This means that the studio only has to author one soundtrack on the disc for all purposes. Dolby TrueHD, on the other hand, requires that a separate backwards-compatible DD 5.1 soundtrack be added to the disc along with the lossless TrueHD track. (Sometimes it's "hidden," so you may not see it listed in the disc menus; but the player will fall back to it if your equipment doesn't support TrueHD.)
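To illustrate the core+extension idea, here's a toy sketch in Python. (The frame structure, field names, and `decode` function are invented purely for illustration; they bear no relation to the actual DTS bitstream syntax.) The point is that a legacy decoder simply ignores the extension payload, while a lossless-capable decoder uses both parts of the same stream:

```python
# Toy model of a core+extension audio frame. One stream serves both
# legacy and lossless-capable hardware, so only one track is authored.
from dataclasses import dataclass

@dataclass
class Frame:
    core: bytes       # backwards-compatible lossy core (all hardware reads this)
    extension: bytes  # lossless residual data (newer hardware only)

def decode(frame: Frame, supports_ma: bool) -> bytes:
    """Return the payload a given decoder would actually use."""
    if supports_ma:
        # Lossless path: core plus extension reconstructs the full signal.
        return frame.core + frame.extension
    # Legacy path: the extension is simply ignored.
    return frame.core

frame = Frame(core=b"CORE", extension=b"MA-RESIDUAL")
assert decode(frame, supports_ma=False) == b"CORE"
assert decode(frame, supports_ma=True) == b"COREMA-RESIDUAL"
```

With TrueHD, by contrast, there is no lossy core inside the lossless stream, so the fallback DD 5.1 track has to exist as a second, separate soundtrack on the disc.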
This isn't to say that one lossless format is better than the other. Either format decodes to a bit-for-bit identical copy of the studio master. However, for backwards compatibility purposes, DTS-HD Master Audio may be more convenient. The lossy DTS core also tends to offer higher fidelity than DD 5.1 for listeners who can't take advantage of the full lossless track.
As far as I'm aware, only two movie discs have been authored with both Dolby TrueHD and DTS-HD Master Audio options: 'Close Encounters of the Third Kind' and 'Top Gun'. [Correction: Reader Stephan has pointed out that 'The Final Countdown' does as well.]
In the case of 'Top Gun', the two tracks come from different sound mixes. The DTS version was remixed for 6.1, while the TrueHD version is an older 5.1 mix. However, other than isolated examples like this, there is little reason for any Blu-ray to be authored with both lossless formats. TrueHD and Master Audio produce identical results. Lossless is lossless. The inclusion of both is just a waste of disc space that could be better utilized for other purposes, such as including bonus features or increasing the video bit rate.
LFE Calibration and 'Digital Video Essentials'
Q: I have a question regarding the LFE +10 dB boost that happens in equipment. I have a Sony BDP-S550 BD player with the newest firmware, connected to an Onkyo 707 receiver. (The Sony can't be forced to output LPCM on encoded signals like DD, DTS, TrueHD and DTS-MA.) After calibrating the system with the Audyssey automatic setup, I manually checked the calibration with 'Digital Video Essentials' and an SPL meter. Everything was fine except for the subwoofer, which spiked up like mad. Then I lowered the LFE volume setting in the Onkyo for all encoded signals by 10 dB, and presto, the subwoofer level was dead-on! This makes me think that the Sony BDP-S550 player already applies the +10 dB boost on encoded signals (when this should only be done in the receiver). It doesn't happen on LPCM tracks. Am I right about this, or is something else going on?
JZ: To get to the bottom of this issue, I sought assistance from my friend Chase, who knows a lot more about the nitty-gritty of audio engineering than I do. In researching the problem, he made an interesting discovery.
Chase: My initial thought was that the problem was misdiagnosed. Assuming he is connected via HDMI using bitstream, the BD player is not adding the +10 dB boost. This would only happen with analog or possibly native/decoded PCM. Even that's pretty rare.
It's possible that Audyssey MultEQ missed the SPL for the subwoofer. It's also possible that he didn't account for all the variables involved in verifying SPL, such as the actual modulation level on the disc for LFE vs. mains, the DialNorm value, the frequency content of the LFE test noise, and the effect of the subwoofer low-pass filter(s) on that test noise.
The Onkyo 707 is THX certified, which makes it especially tricky to verify levels using a test disc like 'DVE' that is encoded at DialNorm value -31. (Tests must be conducted with master volume at a confusing -4 dB from reference.) 'DVE' doesn't have a PCM track, so I'm not sure how he would verify that the problem doesn't occur with a PCM input.
With that said, I ran some tests and it looks like your reader discovered something interesting. Let me start by saying that I verify channel levels using an excellent test disc distributed by Gold Line. I have never employed 'DVE' for that purpose. However, I got 'DVE' out today to see if anything interesting turned up, and it did.
In a THX controller, channel levels should always be set using the internally generated noise, as it's accurate. If verification is needed, the standard procedure is to set the master volume to reference (0dB) and measure 500 Hz to 2 kHz narrowband pink noise at -30 dBFS for main channels and 40-80 Hz narrowband pink noise at -40 dBFS for LFE (DialNorm -27 for DD-encoded signals). This should produce 75 dB C-weighted at the listening position for all channels.
(For a non-THX controller, use test noise encoded at DialNorm -31, or turn the master volume up 4dB.)
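The DialNorm arithmetic behind that "confusing -4 dB" offset can be shown with a short calculation. (A sketch only; the function names are my own. The attenuation rule follows the Dolby Digital dialnorm convention: a value of -31 means no attenuation, and each dB above -31 adds 1 dB of attenuation.)

```python
# Attenuation a Dolby Digital decoder applies for a given dialnorm value:
# dialnorm -31 -> 0 dB of attenuation; dialnorm -27 -> 4 dB of attenuation.
def dialnorm_attenuation(dialnorm_db: int) -> int:
    return 31 + dialnorm_db

# A THX controller's reference calibration assumes dialnorm -27 material.
# To verify levels with a disc encoded at a different dialnorm, offset the
# master volume by the difference in decoder attenuation.
def master_volume_offset(disc_dialnorm: int, reference_dialnorm: int = -27) -> int:
    return dialnorm_attenuation(disc_dialnorm) - dialnorm_attenuation(reference_dialnorm)

# 'DVE' is encoded at dialnorm -31, so it plays 4 dB hotter than the
# reference assumption; the master volume must sit 4 dB below reference.
assert master_volume_offset(-31) == -4
```

The same logic explains the non-THX note above: with test noise encoded at DialNorm -31 on a controller calibrated around -27, you either accept the 4 dB discrepancy by adjusting the master volume, or use test material whose encoding matches your calibration assumption.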
Posted Fri Jul 2, 2010 at 11:45 AM PDT by:
Joshua Zyber