Posted Fri May 28, 2010 at 09:55 AM PDT by Joshua Zyber
Editor's Note: Each Friday, High-Def Digest's own HD Advisor will answer a new round of questions from our readers. If you have home theater questions you need answered, send an email to [email protected]
Answers by Joshua Zyber
1080i Scaled to 1080p
Q: Everybody agrees that a movie displayed in 1080p resolution looks better than 1080i. But I am still confused about one thing: since a 1080p screen will upscale a 1080i signal to 1080p anyway, why should we bother with the player's output resolution? My understanding is that the 1080p TV combines the even and odd lines of the 1080i signal to display a 1080p image, which means that any 1080p screen won't care about either input resolution. For example, a Blu-ray player can be set to output its video signal in 1080i or 1080p, so if the screen is 1080p, no real difference will exist at either setting, right? Let's assume here that the upscaler in the Blu-ray player is of similar quality to the TV's. In the same line of thought, my Toshiba A3 HD DVD player outputs its signal in 1080i as reported by my LCD TV (a 1080p set), but the picture is upscaled to 1080p by the TV. So am I right/wrong/lost?
A: You're correct that a 1080p HDTV will display all content at 1080p resolution. If you feed it an input signal other than 1080p, it will scale the video to its native resolution. In the case of a 1080i signal, that video will be deinterlaced to form a progressive scan image.
However, if your HDTV will accept a 1080p input signal, I would recommend setting your Blu-ray player for that resolution, especially if the TV will accept a 1080p signal at the 24 fps frame rate. For more information on this, see my What's the Big Deal About 1080p24? article. You will not be able to take advantage of 24 fps playback with a 1080i signal.
Most Blu-ray movies are natively encoded at 1080p24 resolution on disc. It's generally best to output a "native" signal to a TV of the same resolution, with as few processing steps in between as possible. If you set your Blu-ray player to 1080i output, the player will break each of the original 1920x1080 pixel frames apart into separate 1920x540 interlaced fields, then apply 3:2 pulldown to modify the frame rate to 60 Hz before transmission. On the receiving end, the TV will have to deinterlace these fields to reconstruct the original frames. Although, theoretically, this shouldn't be too difficult a process, there's little point in adding extra and unnecessary processing steps to your signal chain. Doing so runs the risk of errors at some point along the way.
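To make the frame-to-field math concrete, here's a minimal sketch (not any actual player's code) of the 2-3-2-3 cadence that 3:2 pulldown uses to stretch 24 film frames into 60 video fields per second. The function name and frame labels are illustrative only.

```python
# Illustrative sketch of 3:2 pulldown: four 24 fps film frames
# are expanded into ten 60 Hz interlaced fields (24 -> 60 per second).
def three_two_pulldown(frames):
    """Expand a list of frame labels into a 2-3-2-3 field cadence."""
    fields = []
    for i, frame in enumerate(frames):
        if i % 2 == 0:
            # "2" part of the cadence: top field, then bottom field
            fields += [(frame, "top"), (frame, "bottom")]
        else:
            # "3" part: top, bottom, then the top field repeated
            fields += [(frame, "top"), (frame, "bottom"), (frame, "top")]
    return fields

fields = three_two_pulldown(["A", "B", "C", "D"])
print(len(fields))  # 4 film frames -> 10 video fields
```

The TV's deinterlacer must detect this cadence and discard the repeated fields to reconstruct the original frames, which is exactly the extra processing step you avoid by outputting 1080p24 directly.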
Downmixing Lossless Audio to Stereo
Q: My Samsung BD-P1600 is capable of decoding Dolby TrueHD and DTS-HD Master Audio. I do not have a multi-channel audio receiver (yet, and I'm in no hurry to get one). Can the player downmix these audio codecs to stereo and output them via stereo analog or HDMI? I am inclined to say yes, because I have compared the lossless tracks to the standard Dolby Digital track, and the lossless tracks sound richer. Am I right? Or is this simply a placebo effect?
A: I can't speak for how the BD-P1600 specifically does things, but if a Blu-ray player has the ability to decode Dolby TrueHD and DTS-HD Master Audio internally, it should be able to downmix these tracks to stereo without losing the lossless nature of the signal (as far as data compression goes). The player will collapse the original 5.1 channels to 2, but should not apply additional lossy compression.
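For the curious, a conventional 5.1-to-stereo downmix looks something like the sketch below, using ITU-style -3 dB weights on the center and surround channels. This is an assumption about typical behavior, not the BD-P1600's actual algorithm; real players may use different coefficients or metadata-driven weights, and the LFE channel is commonly dropped.

```python
import math

# Hedged sketch of a conventional 5.1 -> 2.0 downmix.
# ATT is a -3 dB attenuation, roughly 0.707.
ATT = 1 / math.sqrt(2)

def downmix_51_to_stereo(l, r, c, lfe, ls, rs):
    """Collapse one set of 5.1 samples to stereo.

    Center and surrounds fold into both/either side at -3 dB;
    the LFE channel is omitted, as is common practice.
    """
    lo = l + ATT * c + ATT * ls
    ro = r + ATT * c + ATT * rs
    return lo, ro

# A signal present only in the left channel passes through unchanged:
print(downmix_51_to_stereo(1.0, 0.0, 0.0, 0.0, 0.0, 0.0))  # (1.0, 0.0)
```

The key point is that this arithmetic happens on fully decoded samples, so no lossy compression stage is involved.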
In a worst case scenario, a player might default to decoding only the DTS "core" or DD 5.1 backup track if you've set it to 2-channel output. That would be a very poor design, however. I'm of the impression that a player with the proper hardware will decode the full lossless track first before downmixing.
Resolution of CRT HDTVs
Q: My parents have one of the last 30" Toshiba CRT HDTVs manufactured with DVI. When my dad watches a Blu-ray film on the set, he claims it doesn't look much better than an upconverted DVD. At first I thought it was his eyesight, but after watching a few Blu-rays on his television and then watching them on my 1080p LCD set, I can see that he's right. I read somewhere that CRT HDTVs were incapable of outputting a true 720p/1080i picture and used a technique called bobbing and weaving to produce a faux 1080i image. When watching a Blu-ray, the max resolution would be 540p (or some odd number). Have you heard of this before, and is it possible that, if I upgrade my parents' set to a 1080p model, they will see a marked improvement when viewing Blu-ray films?
A: Most CRT HDTVs were 720p or 1080i models, but most (especially tube sets in that size range) were incapable of resolving detail on screen that actually approached those numbers. This was a combination of poor video processing (the "bobbing and weaving" you refer to) and limitations in the raster as to how much detail could make it through onto the screen. Generally speaking, 540p is probably a close estimate for what you'd actually see on such a television. As you note, that's not much better than DVD quality.
A modern 1080p digital set is able to produce a much more detailed picture than those CRT models. Perhaps dramatically so. Whether your parents will notice or be able to tell the difference is another matter, of course. But technically, the new sets will be better in this regard.
This question ties in nicely with one from two weeks ago about how the black levels of digital HDTVs compare with old CRT sets. I'll refer you back to that article as well, because it's worth noting that digital sets do have some tradeoffs.
In any case, if your parents were to upgrade their TV now, they could buy a larger model that weighs much less, takes up less space, and is easier to install than their old set. I have a feeling that they'd find some benefit in that, if nothing else.
In last week's column, I solicited some expert advice to help explain why most movie soundtracks are mixed with bass in the main speaker channels rather than directed exclusively toward the LFE channel. To provide some additional background and clarification, reader Chase offers the following further details.
Large vs. Small Speakers Revisited
Feedback: In addition to the LF output aspect you covered, one of the primary reasons for using a subwoofer crossover system at home is to control the LF resonances that dominate bass response in small rooms. It is much easier to place one or (preferably) more subwoofers for smooth bass across the listening area as opposed to 5 or 7 large speakers. Some experts contend that electronically summing LF from all channels to mono like this has a negative impact on the spatial character of LF (i.e. bass is stereo), but I would characterize that as a minority view (not saying it's right or wrong). It can also have some unintended side effects, most notably too much bass. The acoustic summing of 5 or 7 large speakers plus the subwoofer in a room is not as perfect, so the effect is less. Also, if there happen to be phase discrepancies between channels (most common between mains and LFE), then electronically summing them can actually cancel bass out!
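Chase's point about phase discrepancies canceling bass can be shown with a toy calculation: two equal-level low-frequency signals summed 180 degrees out of phase produce (near) silence, while summing them in phase doubles the amplitude. This is purely an illustration of the physics, not a model of any real crossover.

```python
import math

# Toy illustration of phase cancellation when summing two channels.
def sample_sum(freq_hz, phase_rad, t):
    """Sum two equal sine waves of the same frequency at time t,
    with the second offset by phase_rad radians."""
    a = math.sin(2 * math.pi * freq_hz * t)
    b = math.sin(2 * math.pi * freq_hz * t + phase_rad)
    return a + b

t = 0.001  # an arbitrary instant, in seconds
in_phase = sample_sum(40, 0.0, t)           # doubled amplitude
out_of_phase = sample_sum(40, math.pi, t)   # cancels to ~0
print(round(in_phase, 4), round(out_of_phase, 4))
```

In a real room the acoustic sum of separate speakers is never this perfect, which is why Chase notes the effect is less severe acoustically than when channels are summed electronically inside the receiver.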
Overall, subwoofer crossovers are still preferred because of the huge benefits of resonance control and flexibility in the placement of smaller main speakers. It's not a perfect world, though.
On the production side, it's important to note that, while many of the main monitors and cinema speakers used to mix movie and music programs are physically large and powerful, they still do not reach to the lowest octaves with much punch. Look at the specs for big 2- and 3-way cinema speakers, and you'll find that LF cutoffs between 35-45 Hz are common. These boxes are also usually vented, which means the response dives quickly after the cutoff. (Even more interesting, you'll find a similar occurrence with cinema subwoofers!) In the studio, it's not uncommon for a surround mix to be performed on monitors with limited frequency response (like nearfields) and a subwoofer monitoring only LFE. As a result, mix engineers sometimes artificially boost the low end in their mixes to compensate for speaker roll-off. This works just fine in the cinema world, where the speakers in movie houses work just like the dubbing stages, but when you take those mixes home to systems that will play 20-35 Hz, you get more bass than the mixer intended. This is exacerbated by home systems employing subwoofer crossovers, because they play lower louder. Fortunately, studios are getting wise to this and checking mixes on systems with subwoofer crossovers prior to minting the home release.
Another note on the production side…some cinemas do not have subwoofers, so it is inappropriate to route all LF content to LFE.
I think there is greater awareness of the whole bass management issue now vs. a few years ago. However, as far as I know, the production side of things still favors "large" speakers with no crossover and subwoofer for LFE only. Lucky dogs. They don't have to contend with resonances.
Oh...forgot to mention one thing on the why-don't-mixers-use-just-LFE topic. It's theoretically possible that mixing all the bass in a soundtrack to LFE would overmodulate (clip) the LFE channel. I don't remember the exact numbers -- it may be that all channels at full scale would be 121 dB, or 6 dB higher than the LFE channel can handle. In fact, the LFE channel was originally created to prevent a similar problem from occurring with the main channels: overmodulation from extreme LF demands. You can't just turn the mix level down to allow more LF to be mixed in, because the track has to play at reference.
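As a rough sanity check on Chase's headroom argument, the back-of-the-envelope sketch below sums channel levels in decibels. The reference levels assumed here (each main channel peaking at 105 dB SPL, the LFE channel at 115 dB SPL thanks to its +10 dB in-band gain) are standard cinema calibration figures; the worst-case in-phase sum is my own illustration, not Chase's exact math.

```python
import math

# Assumed standard cinema reference levels (not from the column itself):
MAIN_PEAK_DB = 105.0  # per main channel
LFE_PEAK_DB = 115.0   # LFE, including its +10 dB in-band headroom

def coherent_sum_db(level_db, n_channels):
    """Worst-case (in-phase) acoustic sum of n identical channels.

    Coherent summing adds 20*log10(n) dB to a single channel's level.
    """
    return level_db + 20 * math.log10(n_channels)

five_mains = coherent_sum_db(MAIN_PEAK_DB, 5)
print(round(five_mains, 1), "dB vs. an LFE limit of", LFE_PEAK_DB, "dB")
```

Five main channels summed coherently land near 119 dB, already several dB beyond what the LFE channel alone can reproduce, which is consistent with the general point that you can't simply pour all of a mix's bass into LFE.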
P.S. Automatic speaker setup systems are often wrong about the small/large size of a speaker and the appropriate crossover frequency. To be used as a guideline only.
Check back soon for another round of answers. Keep those questions coming.
Joshua Zyber's opinions are his own and do not necessarily reflect those of this site, its owners or employees.