Posted Fri Mar 11, 2011 at 11:00 AM PST by Joshua Zyber
Editor's Note: Each Friday, High-Def Digest's own HD Advisor will answer a new round of questions from our readers. If you have home theater questions you need answered, send an email to [email protected]
Answers by Joshua Zyber
Inconsistent Video Quality on Some Blu-rays
Q: I have a general question. What is the reason that even spectacular-looking Blu-rays will sometimes have a grainy shot here and there? Is it a slip-up in the transfer process or something else?
A: There are a number of possible reasons for something like this. It could be a video transfer or disc authoring issue. For example, movie scenes with a lot of complex picture information in motion typically require careful attention during digital compression (and generally a higher-than-average bit rate), or else the image might exhibit digital noise or pixelation artifacts. These aren't "grain," strictly speaking, but many viewers might describe the resulting noisy or blocky picture as "grainy." A studio (ahem, Warner Bros.) with a habit of simply running its movies through one automated encoding pass dialed down to the lowest-allowable bit rate without human supervision might let issues like this slip past quality control. It's sloppy, but it happens.
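For a rough sense of the bit-rate budgets involved, the arithmetic is simple: subtract the space needed for audio and extras from the disc capacity, then divide what's left by the running time. The capacities and reserve figures below are illustrative assumptions, not any studio's actual encoding settings:

```python
def average_video_bitrate_mbps(disc_capacity_gb, runtime_minutes, audio_extras_gb):
    """Average video bit rate (Mbps) after reserving space for audio/extras."""
    video_bits = (disc_capacity_gb - audio_extras_gb) * 1e9 * 8
    return video_bits / (runtime_minutes * 60) / 1e6

# A 120-minute film on a dual-layer BD-50, reserving ~8 GB for
# lossless audio and extras, leaves generous headroom:
print(round(average_video_bitrate_mbps(50, 120, 8), 1))   # ~46.7 Mbps
# Squeeze a 150-minute film plus the same extras onto a single-layer
# BD-25, and the encoder has far less room to work with:
print(round(average_video_bitrate_mbps(25, 150, 8), 1))   # ~15.1 Mbps
```

(The Blu-ray spec actually caps the video stream around 40 Mbps, so a BD-50 encode is rarely bit-starved. The point is how quickly the budget shrinks on a smaller disc or a longer film, which is when sloppy one-pass encoding starts to show.)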
However, in most cases, inconsistencies in the amount of grain (real film grain) visible from scene to scene, or even shot to shot, are more likely a simple side effect of the way that movies are made. The average feature-length movie is composed of thousands of individual shots filmed over a period of weeks to months. Because movies are filmed out of sequence, it's common for selected shots in a scene to be completed early in production, while other shots in the same scene aren't photographed until much later. A lot of conditions can change in the meantime. Perhaps part of the scene was shot in daytime and other parts at night. Perhaps it was sunny one day and rainy the next. Filmmakers do their best to even out these variances during the color timing or (nowadays) Digital Intermediate stage of post-production, but it's not always possible to maintain 100% consistency.
Back in the days before digital compositing, special effects shots, dissolves, and credit text overlays were achieved by optically compositing multiple layers of film on top of one another. Each composite caused a generation loss in quality, which commonly resulted in increased grain levels. That's why a lot of older movies have very soft or grainy opening credit sequences that snap into better clarity when the credits are done. Likewise for movies with grainy special effects scenes.
Even though modern film stocks (and digital sensors) are usually fine-grained by nature, all photographic capture processes are prone to grain or noise when shooting in low light. The less light available during filming, the wider the camera's aperture must be opened to expose an image, and the faster (and inherently grainier) the film stock or the higher the digital gain required. The slickest, glossiest, and least grainy photography is achieved by pumping an enormous amount of light onto the scene and closing down the camera aperture. You'd be amazed at how brightly illuminated a "dark" nighttime scene really is if you walked onto the set. Even daytime scenes shot outdoors in bright sunlight are often supplemented by additional artificial light. What your eye sees might not bear much resemblance to what gets captured on film.
That much light of course requires complicated lighting rigs and an increased amount of both time and money to shoot the scene. Low budget productions, or filmmakers who prefer to shoot in a fast-and-loose style, might opt to film in natural light instead, even if it means a grainier image.
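The trade-off described above, in which less light on set forces a wider aperture and/or a faster, grainier stock, can be expressed with the standard exposure-value formula. The specific f-numbers and ISO figures below are illustrative assumptions, not values from any real production:

```python
import math

def exposure_value(f_number, shutter_seconds, iso=100):
    """Exposure value referenced to ISO 100: EV = log2(N^2 / t) - log2(ISO / 100)."""
    return math.log2(f_number ** 2 / shutter_seconds) - math.log2(iso / 100)

# Film cameras running at 24 fps typically expose each frame for ~1/48 second.
# A brightly lit set shot at f/8 on slow, fine-grained ISO 100 stock:
bright_set = exposure_value(8, 1 / 48)          # ~11.6 EV of scene light needed
# A dim natural-light scene at f/2 on fast ISO 1600 stock:
dim_set = exposure_value(2, 1 / 48, 1600)       # ~3.6 EV suffices
# The ~8-stop difference is bought back with a wide aperture and fast,
# grainy stock instead of with added light on the set.
```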
Finally, there are also times when grain can be used for deliberate artistic effect. Grain isn't always undesirable. The grain in a photographed image is much like the brush strokes in a painting. It provides texture. Many horror movies use grain to set an oppressive mood. Steven Spielberg famously used heavy grain in the battle sequences of 'Saving Private Ryan' to emulate old war documentaries and heighten the viewer's emotional reaction to the scenes.
2-Channel PCM vs. 5.1 Lossy Audio
Q: I was watching a standard-def DVD concert disc that had three soundtracks: DTS, Dolby Digital, and 2-channel PCM. Normally I tend to favor DTS lossy over Dolby lossy in that situation, as the bit rates are higher. However, I got to wondering whether the 2-channel PCM would be the better choice since it's not compressed. Dolby's steering logic would allow full 5.1 or 7.1 surround playback (albeit not with fully discrete channels). Do 2-channel DVD PCM tracks have the matrix audio cues built in to allow proper surround sound to be produced with Dolby Pro Logic IIx (like the old stereo tracks on Laserdisc)? If so, you'd have pretty much the same thing as a lossless Blu-ray track, except of course for the lesser picture.
A: Unlike Blu-ray, which has much greater storage capacity, standard DVD discs are only capable of being authored with up to two channels of uncompressed PCM audio. (Blu-ray can hold a full 7.1 soundtrack in PCM, which you might find on a disc like '3:10 to Yuma'.) The DVD spec requires that multi-channel soundtracks be authored in either the Dolby Digital or DTS lossy formats.
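The space problem is easy to quantify: uncompressed PCM's data rate is simply sample rate × bit depth × channel count. A quick sketch, assuming the common DVD case of 48 kHz / 16-bit audio (exact rates vary by title):

```python
def pcm_bitrate_kbps(sample_rate_hz, bit_depth, channels):
    """Uncompressed PCM data rate in kilobits per second."""
    return sample_rate_hz * bit_depth * channels / 1000

print(pcm_bitrate_kbps(48000, 16, 2))   # 1536.0 kbps for 2-channel PCM
print(pcm_bitrate_kbps(48000, 16, 6))   # 4608.0 kbps if DVD allowed 5.1 PCM
```

Even the 2-channel track runs well above the typical 448 kbps of Dolby Digital 5.1 or DTS's 754/1509 kbps, and a hypothetical 5.1 PCM track would blow past DVD's bandwidth limits entirely.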
Aside from concert titles, 2-channel PCM is fairly rare on DVD, due to the amount of disc space it takes up. Whether a particular PCM soundtrack has the directional cues that can be picked up by Dolby Pro Logic IIx processing will vary on a case-by-case basis, depending on how the content was mixed and how the disc was authored. Even in a best-case scenario, Pro Logic IIx processing is only a simulation of what a true multi-channel mix would sound like. It might come close, but it will lack the refined directional steering of a true 5.1 or 7.1 soundtrack.
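To make the "matrix audio cues" idea concrete, here's a minimal sketch of the passive sum/difference core that Dolby's matrix surround formats are built on. Pro Logic IIx adds active steering logic and a rear-channel phase shift on top, which this illustration doesn't model:

```python
import math

def passive_matrix_decode(lt, rt):
    """Recover center and surround feeds from a matrix-encoded stereo pair.

    In Dolby's matrix scheme, center-channel material is mixed in phase
    into both channels, while surround material is mixed out of phase.
    A passive decoder recovers them as the sum and the difference.
    """
    center = [(l + r) / 2.0 for l, r in zip(lt, rt)]
    surround = [(l - r) / 2.0 for l, r in zip(lt, rt)]
    return center, surround

# A test tone mixed identically into both channels (i.e., "in the center")
# decodes entirely to the center output, with a silent surround feed.
tone = [math.sin(2 * math.pi * 5 * n / 100) for n in range(100)]
center, surround = passive_matrix_decode(tone, tone)
```

The catch is that an ordinary stereo music mix wasn't necessarily encoded this way, so surround extraction from a 2-channel PCM track works only as well as the phase relationships the mixers happened to put there.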
When forced to choose between a 5.1 lossy soundtrack or a 2-channel PCM soundtrack, each has its trade-offs. The PCM track will have greater fidelity, but the 5.1 track will have better directionality and immersiveness, as well as a dedicated LFE channel. Your decision will come down to your personal preferences and priorities, as well as the specific content you're watching.
For what it's worth, my personal feeling is that I'd choose to watch a typical movie in lossy 5.1, but I'd rather watch a music-intensive program (like a concert) in 2-channel PCM. Fidelity is of greater importance to the music program, while surround directionality is usually less critical there. Your average Hollywood movie soundtrack, on the other hand, will place a lot more emphasis on the surround mix and throbbing bass, and a lot less emphasis on musical clarity.
Naturally, Blu-ray offers the best of both worlds with its full multi-channel lossless or uncompressed audio formats. Sadly, a lot of desirable music and concert programs aren't available on Blu-ray yet.
The HD Advisor knows many things, but he doesn't know everything. Some questions are best answered with a consensus of opinions from our readers. If you can help to answer the following question, please post your response in our forum thread linked at the end of this article. Your advice and opinions matter too!
White Flashes on Screen
Q: I recently purchased an LG55LX6500 LED LCD and an ONKYO HT-S3300. I have video and audio running through an older overpriced HDMI (1.3, I think) cable from SONY out of my first generation 80GB PS3 into the ONKYO. Then I run the video from the ONKYO out to the LG using a newer, more modestly priced, Amazon Basics HDMI 1.4 cable that I purchased at the same time as the new TV and receiver. Intermittently, and I mean really randomly (sometimes a couple in a ten minute span, sometimes none for hours of BD and DVD viewing, gaming, or Netflix streaming), the entire screen of the TV will flash white for a fraction of a second. There is no loss of sound, no change in the synchronization of audio and video, and the flash won't repeat if I "rewind" and play the same spot, so I'm pretty sure it's not a content issue. I am not watching any 3D content. Does this scream "bad HDMI cable" to you? (I've been using the older cable for 2-3 years now with no problems.) Or should I be worried about the receiver or the TV? Do you have any troubleshooting suggestions, and is there a way to test an HDMI cable?
JZ: My first instinct is to suggest either a bad HDMI cable or that you're possibly running an HDMI cable too long for your signal chain. (HDMI doesn't carry a signal well over about 15 feet. But that's assuming that you're running a cable longer than that, which you didn't say.) The fact that you never had trouble with this same cable in the past with older equipment might undercut that theory, however. This could be an issue with the HDMI transmitters or receivers in the hardware, or something else entirely. At this point, I'll throw it out there to our other readers to see if anyone else has experienced this same problem.
Check back soon for another round of answers. Keep those questions coming.
Joshua Zyber's opinions are his own and do not necessarily reflect those of this site, its owners or employees.