Posted Fri Feb 19, 2010 at 12:00 PM PST by Joshua Zyber
Editor's Note: Each Friday, High-Def Digest's own HD Advisor will answer a new round of questions from our readers. If you have home theater questions you need answered, send an email to [email protected]
Answers by Joshua Zyber
Blu-ray vs. Theaters
Q: It has been stated several times in this column and on many sites that 35mm film holds more resolution than even Blu-ray and the best HDTVs can capture. However, having seen most films in theaters and then again later on Blu-ray, am I crazy for thinking that the Blu-ray looks better than the movie did in theaters? Now, I understand that 35mm film has a higher resolution and more detail. So why does it seem to me that my Blu-rays sometimes look 10 times better than the movie did in theaters? When I watch a film on Blu, my eyes tend to wander and are taken aback by the sheer beauty and vast amount of detail on the screen. This is not the case in theaters.
A: The first thing to keep in mind is that movie theater screens are much larger than your HDTV at home. Even if you have a big-screen TV or a high-def projector, your home theater display benefits from a smaller, brighter picture than the one at a movie theater. If you were to project a movie from a Blu-ray source onto a 50-foot screen, it would suffer a lot of the same problems that you see in commercial theaters. Even small visual artifacts that are virtually invisible to you now would be glaringly obvious at that size. Depending on the projection equipment used, the picture would also likely appear dimmer and consequently softer.
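To put some rough numbers on that, here's a back-of-the-envelope calculation of how much bigger each pixel (and therefore each artifact) gets at theater size. The screen widths are assumptions chosen for illustration, not measurements of any particular setup:

```python
# Back-of-the-envelope math: how big one 1080p pixel becomes at
# different screen sizes. The screen widths below are assumptions
# for illustration only.

def pixel_width_inches(screen_width_inches, horizontal_pixels=1920):
    """Physical width of one pixel when a 1080p image fills the screen."""
    return screen_width_inches / horizontal_pixels

# A 50" 16:9 TV is roughly 43.6" wide; a 50-foot screen is 600" wide.
for label, width in [("50-inch TV", 43.6), ("50-foot theater screen", 600.0)]:
    print(f"{label}: each pixel is ~{pixel_width_inches(width):.3f} inches wide")

# 50-inch TV: each pixel is ~0.023 inches wide
# 50-foot theater screen: each pixel is ~0.312 inches wide
# The same compression artifact covers roughly 14x more width in the theater.
```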
With that said, the increased resolution available on 35mm film is often only a theoretical advantage when it comes to theatrical projection. All that detail really resides on the original camera negative. Of course, that's not what's being projected at your local theater. The dupe prints distributed to theaters are several generations removed from the negative. Because film prints are optically duplicated, each subsequent generation loses some quality. As a result, a lot of that detail in the original photography never makes it to your theater screen. However, Blu-rays are typically mastered from internegative or interpositive film elements much closer to the source, and thus retain much more of that original detail.
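To make the generational loss concrete, here's a crude, hypothetical model. The 4,000-line figure for the camera negative is an often-cited ballpark, and the per-generation retention factor is purely an assumption for illustration; real losses vary with lab work and film stocks:

```python
# A crude, hypothetical model of optical generation loss. The 4,000-line
# starting figure is an often-cited ballpark for a 35mm camera negative,
# and the 0.8 retention factor per generation is an assumption for
# illustration only.

def effective_detail(negative_lines, generations, retention=0.8):
    """Approximate resolvable detail left after N duplication steps."""
    return negative_lines * (retention ** generations)

NEGATIVE = 4000
for element, gens in [("Interpositive", 1), ("Internegative", 2),
                      ("Release print", 3)]:
    print(f"{element} ({gens} gen): ~{effective_detail(NEGATIVE, gens):.0f} lines")

# Interpositive (1 gen): ~3200 lines
# Internegative (2 gen): ~2560 lines
# Release print (3 gen): ~2048 lines
```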
This is the same reason why D-Cinema digital screenings usually appear more vibrant and detailed than 35mm screenings, even though the 2K projectors used in most digital theaters are barely higher in resolution than the 1080p used on Blu-ray.
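The raw pixel counts bear that out. Both figures here are standard container resolutions, not assumptions:

```python
# Pixel-count comparison: DCI 2K projection vs. Blu-ray 1080p.

dci_2k = 2048 * 1080   # DCI 2K: 2,211,840 pixels
bluray = 1920 * 1080   # Blu-ray 1080p: 2,073,600 pixels

print(f"2K has only {dci_2k / bluray - 1:.1%} more pixels than 1080p")
# -> 2K has only 6.7% more pixels than 1080p
```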
Another contributing factor to your disappointment with theatrical screenings is that many commercial theaters underpower the lamps in their projectors in the misguided belief that the lamps will last longer that way. Some theaters also keep old lamps in service long after they have dimmed beyond the point of acceptability.
Update: One of our forum readers has added the following:
As a former projectionist, I can tell you that the biggest contributing factors to theater pictures looking like crap are:
1. Projectionist error (focus is often slightly off, framing almost always is to some degree).
2. Wear and tear on the print (if projected from film, it gets pretty beat up, especially if the projector has some problems and scratches or wears it more on the way through).
3. Overall condition of the projector. As I said above, it can further damage the film, but the bigger problem I see is jitter. When the picture is slightly shaky, just enough to almost seem like it's a focus problem, it's usually because the projector isn't lining up each frame exactly, or isn't properly secured, so its own motor vibrates it.
Of course the reasons mentioned in the column play big parts too, but if we could conquer the 3 reasons I listed, you'd be seeing something MUCH better.
"Source Direct" Video Mode
Q: I recently borrowed a Sony BDP-S500 Blu-ray player. It has the Source Direct option when choosing the video format. In the case of playing a DVD, it outputs the disc at 480/60i. I was very surprised that the disc looked pretty good. What does Source Direct do for output of the video? And what are the benefits over upconversion?
A: The "Source Direct" option bypasses any internal processing that your Blu-ray player may otherwise apply to a video signal. It outputs the video at the resolution and frame rate that was encoded onto the disc. For an NTSC DVD, this means that it will send a 480i signal at 60 Hz, without upconversion. When a Blu-ray disc is played, it will output at 1080p resolution and 24 fps. (Unless it's a 1080i disc, of course.)
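If it helps, here's the passthrough behavior described above laid out in one place. The PAL entry is my own assumption (the BDP-S500 is an NTSC-market player), and the 1080i refresh rate reflects typical disc authoring:

```python
# What "Source Direct" passes through for common disc types.
# The PAL line is an assumption; the rest restates the column above.

SOURCE_DIRECT_OUTPUT = {
    "NTSC DVD":        "480i at 60 Hz (no upconversion)",
    "PAL DVD":         "576i at 50 Hz (assumed)",
    "Blu-ray (1080p)": "1080p at 24 fps",
    "Blu-ray (1080i)": "1080i at 60 Hz (typical)",
}

for disc, signal in SOURCE_DIRECT_OUTPUT.items():
    print(f"{disc:15s} -> {signal}")
```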
If you have a digital HDTV, your set will have one and only one native resolution. If it's a 1080p set, the native resolution is 1920x1080 pixels. Any video signal your TV receives will be displayed at that resolution. If the input signal is less than the native resolution, the set will upconvert it. It is not possible to watch a DVD on a digital HDTV without upconversion.
The question is where that upconversion should be performed. It can occur either in the DVD/Blu-ray player or in the TV. If the Blu-ray player has better deinterlacing and scaling quality, you'll want to do the upconversion there. In that case, set the player to output DVDs at 1080p. On the other hand, if the TV has a better processing chip, you're better off setting the Blu-ray player for Source Direct mode and letting the TV do the hard work.
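For a sense of how much work that scaler actually does, consider the raw scale factors. This is simplified: it ignores deinterlacing and the fact that DVD pixels aren't square, but the magnitude of the job is accurate:

```python
# Scale factors for DVD-to-1080p upconversion (simplified: ignores
# deinterlacing and DVD's non-square pixels).

dvd_w, dvd_h = 720, 480        # NTSC DVD frame
panel_w, panel_h = 1920, 1080  # native 1080p panel

print(f"Horizontal: {panel_w / dvd_w:.2f}x")   # 2.67x
print(f"Vertical:   {panel_h / dvd_h:.2f}x")   # 2.25x
print(f"Pixel increase: {panel_w * panel_h / (dvd_w * dvd_h):.1f}x")
# Every one of the panel's ~2.07 million pixels must be interpolated
# from the DVD's ~0.35 million -- a 6x increase.
```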
I haven't used the Sony BDP-S500 personally, but I used to own the S300, which is very similar. The S300 didn't have particularly good DVD upconversion, and I'd expect the S500 to be the same. If you like the results you're getting with Source Direct, I'd recommend sticking with it.
HDTV Preset Picture Calibration Modes
Q: For years, TVs (and now HDTVs) have come equipped with preset picture modes such as Vivid, Standard, Cinema, etc. When we see those TVs at the store, they're set to Vivid mode because that's the best way to catch one's attention. But for a true home theater enthusiast, a professional calibration is needed, or at least some picture adjustments we make ourselves with the aid of a calibration disc. Why don't TV manufacturers include in their models a picture mode that is professionally calibrated with proper settings for color accuracy, black level, shadow detail, etc.? I know that some TVs include a "better" picture mode (THX, Pure, Cinema, etc.), but even these are usually not perfectly calibrated.
A: TVs or HD displays with proper calibration presets built-in are not completely unheard of. Years ago, video calibration guru Joe Kane worked with a company called Princeton Graphics to design a 32" CRT HD set that met his demanding standards for picture accuracy. (Unfortunately, the set's mechanical workings were notoriously unreliable.) More recently, he's collaborated with Samsung to design the SP-A900B DLP projector. Unrelated to Kane, the Planar PD8150 DLP projector is also said to have nearly perfect calibration out-of-the-box. I'm sure other examples exist as well.
But your point stands. Most TVs and HD displays on the market do not come with accurate picture calibration, no matter which of their preset modes you choose. You'd think this would be an important selling point. So why don't more manufacturers design their displays with a feature like this?
Well, frankly, because it would cost them more to do so. Precise calibration requires measurement, testing, and adjustment of every single unit sold. The smallest of unit-to-unit manufacturing variances can throw video calibration way off. This generally isn't something that can be automated. Even on those displays with THX calibration presets, the THX mode is usually a "ballpark" setting. A professional calibration of your specific set might be able to do better.
It's also worth pointing out that the calibration settings for most TVs and projectors will drift over time as the light source ages. That's why it's recommended to recalibrate once a year or so. A unit that's properly calibrated when you first buy it may not remain quite so accurate over time.
Beyond all that, the truth of the matter is that those ridiculously inaccurate preset modes like Vivid, Sports, and whatnot are designed to prey on the ignorance of consumers who don't know any better. As you noted, the brightest picture (typically the Vivid mode, with Brightness and Contrast cranked to their maximums) is the most eye-catching on a retail sales floor. Sadly, it works. Modes like that sell a lot more TVs than those that are properly calibrated. When it comes down to it, most manufacturers will choose the cheapest option that results in the most sales.
The HD Advisor knows many things, but he doesn't know everything. Some questions are best answered with a consensus of opinions from our readers. If you can help to answer the following question, please post your response in our forum thread linked at the end of this article. Your advice and opinions matter too!
Bitstream Audio Problems
Q: I have a Memorex MVBD2520 Blu-ray player that I've set to Bitstream. This connects via HDMI to my Yamaha HTR-6250 receiver. When I play a Dolby TrueHD track, I can see the logo on the display. But when I play a disc using the DTS-HD track, I either get PCM or the connection # on the display. My receiver supports both formats so I'm not sure what I'm doing wrong. Is my receiver decoding the DTS track correctly or is it downgrading the AQ?
JZ: Are any of our readers familiar with these particular models? Please post your set-up tips in the forum.
Check back soon for another round of answers. Keep those questions coming.
Joshua Zyber's opinions are his own and do not necessarily reflect those of this site, its owners or employees.
See what people are saying about this story in our forums area, or check out other recent discussions.