Posted Fri May 1, 2009 at 12:00 PM PDT by Joshua Zyber
Editor's Note: Each Friday, High-Def Digest's own HD Advisor will answer a new round of questions from our readers. If you have home theater questions you need answered, send an email to [email protected].
Answers by Joshua Zyber
Native vs. Dynamic Contrast
Q: What is the difference between native contrast and dynamic contrast? I know enough to realize that they're overblown and kind of a cheat. Also, since most companies now only publish the dynamic contrast ratio, is there any way to find out what a TV's true ratio is?
A: The contrast ratio is the difference between the brightest and darkest points in a video image. The higher the contrast ratio, the more range the display has between those two points. Displays with a low contrast ratio will have milky black levels and poor definition of details in shadow areas. A high contrast ratio will not only correct those problems, but will also greatly improve the sense of depth in the picture.
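To make the math behind the term concrete: the ratio is simply peak white luminance divided by black-level luminance, both measured in cd/m². The numbers below are illustrative only, not measurements of any particular set:

```python
# Illustrative only: contrast ratio is peak white luminance divided by
# black-level luminance, both in cd/m^2. These sample numbers are made up.
def contrast_ratio(white_nits, black_nits):
    """Return the contrast ratio as a number (2000.0 means 2000:1)."""
    return white_nits / black_nits

# A display measuring 120 cd/m^2 on a white field and 0.06 cd/m^2 on black:
print(f"{contrast_ratio(120.0, 0.06):.0f}:1")  # prints "2000:1"
```

Notice that lowering the black level helps the ratio just as much as raising peak brightness, which is why a display with truly dark blacks can post a huge number without being searingly bright.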
Contrast is still one of the biggest obstacles faced by digital displays. Although modern HDTVs boast higher and higher contrast ratios every year, I have yet to see any digital display (LCD, DLP, LCoS, or even Plasma) that can achieve the inky, almost total darkness generated by an analog CRT display with a pure black frame on screen. Plasmas come the closest (especially the Pioneer KURO line), but you'll always see some light output. In the worst cases (many LCDs), blacks are really just dark gray.
Of course, digital displays have other advantages over CRT. A set with a high contrast ratio will look great in most movie scenes, even if it can't achieve absolute black in the darkest scenes.
As you noted, there's a difference between a digital display's native contrast ratio and dynamic contrast ratio. Native contrast is the measured difference between bright and dark achieved by the pixel panel, assuming that the light source is held at a constant intensity at all times. However, many newer HD displays attempt to cheat the contrast by raising and lowering the intensity of the light source depending on the content of the image. The most common method is to place a dynamic iris in front of the lamp that clamps down to dim light output in dark scenes, and opens up to increase light output in bright scenes. Some dynamic contrast implementations may also modulate power to the lamp, electronically manipulate the gamma curve, or use a combination of these methods.
The effect of dynamic contrast is that darks are darker and brights are brighter. Sounds great, doesn't it? Unfortunately, when the light output drops during dark scenes, bright points in those scenes (like a small lamp in the background of a darkened room) are also dimmed. Likewise, dark points in bright scenes will be washed out. In either case, light output is adjusted across the board. This is especially problematic in dark scenes with changing light sources, which can cause a visible "pumping" of the contrast as the algorithm attempts to adjust between bright and dark.
My torture test for dynamic contrast is the opening scene in 'Star Wars: Episode IV – A New Hope'. After the prologue scroll, we see a dark star field. As the Imperial Star Destroyer flies over the camera, it occupies a larger and larger portion of the screen, thus raising the average brightness level. A dynamic contrast implementation will react to this by raising the light level. As it does, you can see the blackness of space in the background turn grayer and grayer.
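The behavior described above can be sketched with a toy model. This is a hypothetical scheme that scales the lamp with the average picture level (APL); real implementations are proprietary and far more sophisticated, but the side effects fall out of even this simple version:

```python
# Hypothetical sketch of an APL-driven dynamic-contrast scheme (not any
# manufacturer's actual algorithm). Pixel levels and APL run from 0.0 to 1.0.

def lamp_scale(average_picture_level, floor=0.4):
    """Scale lamp output between `floor` and 1.0 based on the frame's APL."""
    return floor + (1.0 - floor) * average_picture_level

def displayed_luminance(pixel_level, average_picture_level, peak_nits=120.0):
    """Luminance (cd/m^2) a pixel actually emits once the lamp is adjusted."""
    return pixel_level * lamp_scale(average_picture_level) * peak_nits

# A star field: mostly black frame (APL ~0.05), so the lamp dims and a
# full-brightness star is dimmed along with everything else.
star_in_dark_scene = displayed_luminance(1.0, 0.05)

# As the Star Destroyer fills the frame (APL ~0.5), the lamp opens up and
# the "black" of space (pixel level 0.02) gets measurably brighter.
space_black_early = displayed_luminance(0.02, 0.05)
space_black_late = displayed_luminance(0.02, 0.5)
print(star_in_dark_scene, space_black_early, space_black_late)
```

In this toy model, the bright star loses more than half its intended output in the dark scene, while the black of space brightens as the ship fills the frame, which is exactly the graying effect described above.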
Dynamic contrast can be very effective if implemented well. You may never notice it in the majority of movie scenes. But it is a compromise and does have trade-offs.
Additionally, you must take any contrast ratios published by the manufacturers (whether native or dynamic) with a grain of salt. The circumstances used to measure those ratios in the factory are usually not achievable at home, certainly not after the set has been properly calibrated for movie and TV viewing. Your best bet is to find a trustworthy hardware review publication that uses professional instruments to measure the contrast at calibrated settings.
DTS-HD Master Audio on 'Die Hard 2'
Q: I noticed that in your review of 'Die Hard 2', you never mentioned that there is no DTS-HD track. I noticed that the other two movies, 'Die Hard' and 'Die Hard with a Vengeance', do have DTS-HD tracks. 'Die Hard 2' only has a DTS 5.1 track. I spoke to Fox Home Video; they are aware of this issue, but nothing has been done about it. How come you guys didn't notice it, and if Fox is aware of the problem, why aren't they doing something about it?
A: While I didn't review this title myself, I do have a copy of the disc and have read up on the issues with it. From what I've gathered, all four of the movies in the 'Die Hard Collection' are indeed encoded with DTS-HD Master Audio 5.1 soundtracks. However, there's an authoring issue with the 'Die Hard 2' disc that cuts off the Master Audio extension when the soundtrack's native bitstream is transmitted from a Blu-ray player to a compatible receiver. Instead, only the standard DTS core is sent. On the other hand, some players that decode their audio internally are able to read the MA extension and decode it to PCM.
I tested the disc in two Blu-ray players today. My Panasonic DMP-BD50 is set to transmit audio in native bitstream form. When I play 'Die Hard 2', my receiver indicates that it's receiving only a standard DTS 5.1 track. Meanwhile, when I try the disc in my PlayStation 3, the console's Display screen clearly shows that it's decoding DTS-HD MA 5.1. Reviewers watching 'Die Hard 2' on a PS3 would have no indication that anything was wrong with the disc.
As for why Fox hasn't corrected and issued replacements for the disc yet, you'll have to take that up with the studio.
HDMI Switchers
Q: I have an older LCD TV that only has one HDMI jack on the back. As I have several high-def devices (Blu-ray player, PS3, Blu-ray laptop), I'd like to be able to hook them all up to the TV. I was told I could purchase an HDMI switcher (I believe that's what it's called), but have seen many different price ranges, from $40 to over $300. Is there any difference in the quality of these? Is the price difference justified?
A: In my column from a few weeks ago, I wrote about the differences between HDMI switchers and HDMI splitters. To recap, a switcher allows you to connect multiple video sources to one screen, while a splitter will take the output of one source and distribute it to multiple displays. Because the HDMI signal only travels in one direction, the two devices cannot be substituted for one another. What you need is an HDMI switcher.
Due to the complexities of the HDMI and HDCP encryption protocols, HDMI repeater devices are often prone to handshaking errors. My experience with HDMI splitters was problematic. I had to try several products to find one that worked reliably; of course, it was one of the more expensive options. For whatever reason, switchers are generally less glitchy. Nonetheless, you take your chances when you buy a cheap HDMI switcher. If you're on a budget and want to try inexpensive switchers, make sure you do so at a retailer with a good return/exchange policy.
If you aren't in the mood to experiment, the Oppo HM-31 is a very good 3x1 switcher for a reasonable $99. There may be other, less expensive options that will also fit your needs, but this is one I'm fairly confident in.
Q: Can you explain the process of how studios go about creating a master from a film, and then the steps involved to transfer that to Blu-ray? Additionally, when/why/how does various tinkering go on, both necessary and unnecessary, such as digital cleanup, added edge enhancement, contrast/color boosting, etc.? How often does this actually improve things, rather than just altering something because they can (i.e. would things appear closer to as intended if less post-processing were done)?
A: In order to transfer a movie from film to video, the film elements must be digitally scanned in a device called a telecine. After the movie has been digitized, the transfer operators have various tools to adjust the picture attributes. Among these are color correction, contrast enhancement, dirt and scratch removal, Digital Noise Reduction, and Edge Enhancement. After they've completed these steps, the files are archived to a digital master, usually at either 2K or 4K resolution. That HD master will then be used as the basis for all subsequent video editions. The master will be scaled to 1080p for Blu-ray or downconverted to standard-def for DVD.
All those tools at the studio's disposal are a mixed blessing. In a worst case scenario, too much tinkering will eradicate the original film textures and leave the image looking unnaturally electronic and "digital." In the days before Blu-ray, it was common for studios to wipe away any trace of film grain with DNR, because grain is difficult to compress using MPEG-2 at DVD's compression ratios. When that left the picture soft, they'd try to "sharpen it back up" with Edge Enhancement. We picky DVD reviewers may have complained about the process at the time, but average viewers watching on 20" tube sets thought it looked terrific.
Now that we're in the age of Blu-ray and large HD displays, those digital artifacts are downright painful to see. Unfortunately, in far too many cases, studios have simply recycled old outdated masters plagued with DNR and Edge Enhancement for their catalog titles. When they do, we try to call them on it in our reviews here at High-Def Digest.
With that said, digital tweaking has its place as well. When used correctly and in moderation, these tools can help to clean up or correct imperfections in the source material. For example, Fox's stunning Blu-ray transfer for 'The Sand Pebbles' was based on an extensive restoration effort performed entirely in the digital realm. The results look perfectly natural and film-like. The tools themselves are neither inherently good nor inherently evil. The skill and intent of the person operating them makes all the difference.
Dust Blobs & Stuck Pixels
Q: I have a 3 year-old HD LCD projector. About two years ago, two green spots appeared on the screen. I am pretty certain that it has nothing to do with the lens. Do you have any idea what could cause this?
A: There are a couple possible causes for this. If dust particles have gotten into the light path of your projector between the lamp and the lens, they may be obscuring part of the image. Remove the lamp and try (carefully) cleaning the insides of the projector with compressed air.
Another possibility is that your projector's LCD panel may have some stuck pixels. When that happens, the pixels are unable to refresh with new picture information even though the rest of the screen is actively changing from frame to frame. Unfortunately, I don't believe there's a fix for this on a projector. In that event, perhaps this is the excuse you need to upgrade to a newer model.
Some questions that the HD Advisor receives are best answered with a consensus of opinions from our readers. If you can help to answer the following question, please post your response in our forum thread linked at the end of this article. Your advice and opinions matter too!
Blu-ray Burner Recommendations
Q: Is there a BD burner for a PC that will burn 1080p from my Canon EOS 5D Mark II that will play on a Playstation 3?
Check back next week for another round of answers. Keep those questions coming.