Editor's Note: Each Friday, High-Def Digest's own HD Advisor will answer a new round of questions from our readers. If you have home theater questions you need answered, send an email to HDanswers@gmail.com.
Answers by Joshua Zyber
Blu-ray Volume Issues
Q: Both my parents and my sister have Sony Blu-ray players, but different models. My sister has the BDP-S360. I'm not sure about my parents. Both players are hooked directly to the TV with HDMI cables. The issue that both of them are having is that when they switch over from watching TV to Blu-ray, they have to crank the volume nearly all the way up to max on the TV just to hear the Blu-ray movie they are watching. When they finish, they have to crank the volume all the way back down before switching back over. Is there a setting they are missing in the Audio portion of the Blu-ray player that is causing this? I would think that when you switch over from TV to Blu-ray and vice versa, the volume would be consistent.
A: What you describe actually isn't unexpected. The lossless soundtracks on most Blu-ray discs have much greater dynamic range than TV shows or movies broadcast on television (even when they're the same movies). Dynamic range is defined as the difference between the lowest low and the highest high. In other words, there should be a huge difference in volume between whispered dialogue and huge explosions in a movie soundtrack.
In order to accommodate this dynamic range, Blu-ray soundtracks are generally authored at a low default volume. (Normal speaking dialogue in the soundtrack is considered the median baseline.) This gives them some headroom so that the really loud sounds (like explosions) don't clip or distort. If the soundtrack were authored with too high a median volume, those loud sounds might hit the audio ceiling and cause distortion.
Programs broadcast on television do not have quite as wide a dynamic range. When movies are aired on TV, they're usually run through a Dynamic Range Compression filter, which raises the low sounds and lowers the highs, pushing everything closer together towards the middle. This often gives the perception that the entire soundtrack is louder, because the portion of the soundtrack where your baseline for volume is set (the dialogue) is higher, and is much closer to the "loud" sounds in the mix.
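To make the idea concrete, here's a toy sketch of how compression pulls everything toward the dialogue baseline. The `compress` function and all of the numbers are purely illustrative assumptions, not any broadcaster's or player's actual algorithm:

```python
# Toy illustration of Dynamic Range Compression (DRC).
# Sample levels are on an arbitrary 0.0-1.0 scale; the "dialogue"
# baseline and the ratio are made-up values for demonstration only.

def compress(samples, baseline=0.2, ratio=0.5):
    """Scale each sample's deviation from the baseline by `ratio`.

    With ratio < 1.0, quiet sounds are raised and loud sounds are
    lowered, squeezing the whole mix toward the dialogue level.
    """
    return [baseline + (s - baseline) * ratio for s in samples]

quiet_whisper, dialogue, explosion = 0.05, 0.2, 0.9
print(compress([quiet_whisper, dialogue, explosion]))
# The whisper comes out louder, the explosion quieter, and the
# dialogue baseline stays put -- so the mix "feels" louder overall.
```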
As for what your family can do about this, I would suggest that they check the Audio section of the Blu-ray player setup menus to see if those models offer Dynamic Range Compression (sometimes called "Night Mode"). While viewers with full-blown surround sound systems that can handle lossless audio would not want to engage Dynamic Range Compression, it might be beneficial to someone listening through limited-range TV speakers.
"Sharpness" vs. Detail
Q: How much of a difference in sharpness is there when viewing a 1080p image on a 1080p 32" TV, as opposed to viewing the same 1080p image on a 1080p 58" TV? I would think the 32" set has the advantage because it is smaller and these 1080 lines of resolution are closer together. Is this a correct way of thinking?
A: When it comes to high definition video, "sharpness" is something of a misnomer, and is arguably an outdated standard to judge a picture by. What you should be looking for in a high-def picture is detail. When you see a close-up of an actor's face, how well resolved are the pores on his skin, or the individual hairs on his head or in his beard? Can you make out the thread pattern and texture of the fabric of the clothes he's wearing?
Depending on a number of factors such as the way the movie was photographed, perhaps you won't be able to see those specific details. It may even be that you're not supposed to see them. (For example, it's common for vain actors and actresses to be photographed with soft focus in order to hide things like skin pores and wrinkles.) Nonetheless, these are examples of the types of fine object detail that high definition is capable of resolving far better than standard definition.
Detail is not necessarily the same thing as "sharpness." From a technical standpoint, one 1080p HDTV has the exact same pixel resolution as any other 1080p HDTV, and is capable of displaying the same amount of detail. A large screen and a small screen of the same 1080p resolution will both render the exact same 1920x1080 pixels of content.
However, as you mention, the smaller screen may leave the impression of having a sharper picture than the larger screen, because the same amount of detail is crammed together more tightly into a smaller area. But this is really a false sense of sharpness. The actual detail is the same.
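You can put rough numbers on that pixel density with a quick calculation. The `ppi` helper below is just an illustrative sketch of the standard pixels-per-inch formula (diagonal resolution divided by diagonal screen size):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch for a screen of the given resolution and diagonal."""
    diagonal_px = math.hypot(width_px, height_px)  # pixels along the diagonal
    return diagonal_px / diagonal_in

print(round(ppi(1920, 1080, 32)))  # ~69 PPI on the 32" set
print(round(ppi(1920, 1080, 58)))  # ~38 PPI on the 58" set
```

Both screens display the same 1920x1080 pixels; the 32" set simply packs them almost twice as densely, which is where that impression of extra "sharpness" comes from.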
Further, beyond a certain point, the human eye may not be capable of seeing all of that 1080p detail on a small screen. It's generally suggested that there's little to no visible difference between 720p and 1080p at screen sizes around 30" or less. The larger the screen, the more discerning your eyes will be, and the more able you'll be to pick out all of that 1080p detail.
Yes, it's true that if you go with too large a screen, your image quality will seem to degrade. The pixels will become so spread apart that the picture may seem perceptibly "soft," and you may notice the pixel structure itself (which is not desirable). But it takes a really large screen to make that happen with Blu-ray. Most high-end home theater viewers report that 1080p Blu-ray picture quality holds up quite nicely when projected at 100". So, you should have no problems at all with a 58" HDTV.
The HD Advisor knows many things, but he doesn't know everything. Some questions are best answered with a consensus of opinions from our readers. If you can help to answer the following question, please post your response in our forum thread linked at the end of this article. Your advice and opinions matter too!
Blu-ray Laptop Recommendations
Q: I need a new laptop and I need to buy it soon. I want one with a Blu-ray drive, but I have no idea what specs are good these days for processing power and whatnot. What should I look for? What brands give the best value?
JZ: I see that we asked this question in the Homework a little over a year ago. But things move so quickly in the computer world that I have a feeling a lot of that information may already be outdated. I think it's worth posing the question to our expert readers again.
Check back soon for another round of answers. Keep those questions coming.
Joshua Zyber's opinions are his own and do not necessarily reflect those of this site, its owners or employees.