Posted Fri Feb 20, 2009 at 11:00 AM PST by Joshua Zyber
Editor's Note: It's Friday, which means it's time for another round of questions and answers with High-Def Digest's own HD Advisor!
If you have home theater questions you need answered, our HD Advisor will try to help you out.
Send an email to [email protected] to submit a question for consideration.
To browse through previously answered questions, visit the main HD Advisor page.
Answers by Joshua Zyber
PAL vs. NTSC
Q: Does the advent of hi-def resolutions make the PAL/NTSC issue redundant? Do all regions now run at identical frame rates?
A: The PAL/NTSC frame rate issue is mostly redundant in the High-Def world, but not entirely. To explain, let me start with the basics.
The majority of movies are shot on film or digital video at a rate of 24 frames per second. Unfortunately, when the Standard-Def formats were established, neither NTSC nor PAL was built to accommodate that frame rate. NTSC runs at 60 interlaced fields per second (equivalent to 30 frames per second) and PAL runs at 50 interlaced fields per second (or 25 fps). In order to maintain a movie's original running speed, NTSC employs a 3:2 Pulldown pattern for film-based material. The first film frame is held for 3 video fields, the second frame for 2 fields, then the next for 3, then 2, and so on. This keeps the overall speed and length of the movie the same, but tends to cause judder during panning movements.
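To make the cadence concrete, here's a minimal Python sketch of the 3:2 pattern. The function name and the letter "frames" are my own illustration, not anything a real player runs:

```python
# Sketch of 3:2 pulldown: mapping 24 film frames per second
# onto 60 interlaced video fields per second.
# Each film frame is held for 3 fields, the next for 2, alternating.

def three_two_pulldown(film_frames):
    """Expand a list of film frames into a list of video fields."""
    fields = []
    for i, frame in enumerate(film_frames):
        repeats = 3 if i % 2 == 0 else 2  # 3, 2, 3, 2, ...
        fields.extend([frame] * repeats)
    return fields

# 4 film frames (1/6 second of film) become 10 fields (1/6 second of video):
print(three_two_pulldown(["A", "B", "C", "D"]))
# -> ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
```

Because frame "A" lingers longer on screen than frame "B," horizontal pans don't advance at a steady rate, which is exactly the judder described above.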
PAL simply speeds the movie up to 25 fps. Movies transferred to PAL run about 4% faster (and therefore shorter), with a corresponding increase in audio pitch.
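The arithmetic behind that speedup is easy to check yourself. A rough Python sketch, using a hypothetical two-hour movie as the example:

```python
import math

# Sketch of PAL speedup arithmetic: playing 24 fps film at 25 fps.
film_minutes = 120.0       # a hypothetical two-hour movie
speedup = 25 / 24          # ~1.042, i.e. roughly a 4% speed increase

pal_minutes = film_minutes / speedup
print(round(pal_minutes, 1))   # -> 115.2, almost 5 minutes shorter

# Audio pitch rises by the same ratio: a bit under one semitone
# (a semitone is a frequency ratio of 2 ** (1/12), about 1.0595).
pitch_shift = 12 * math.log2(speedup)
print(round(pitch_shift, 2))   # -> 0.71 semitones
```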
For compatibility purposes, HDTVs have maintained the same 60 Hz (in NTSC territories) and 50 Hz (in PAL territories) frame rates, even though the overall High-Def resolution is much higher. As a result, movies are still subjected to either 3:2 Pulldown or 4% speedup. However, many modern HDTVs are also able to display a movie image at the original 24 fps without 3:2 Pulldown or speedup if you set your Blu-ray player for that frame rate.
Most feature films available on Blu-ray are natively encoded at 1080p24 resolution (the last two digits refer to the frame rate), regardless of where in the world they're released. If your HDTV is compatible with 24 fps playback, that's how it will be displayed. If not, either the Blu-ray player or your TV will frame rate convert the video to 60 Hz or 50 Hz as needed. Since the movie data is the same format, many Blu-ray discs from Europe work just fine in American Blu-ray players, and vice versa.
However, this doesn't necessarily take into account other content on the disc, including the menus and bonus features. Some European Blu-rays have menus encoded at 50 Hz or supplements in PAL SD resolution, which will not work in an American disc player. For example, for the recent UK release of 'Son of Rambow', the disc has no region coding and the movie portion is encoded at a worldwide-compatible 1080p24 resolution. But the disc's copyright warnings and menus are encoded at 1080p50. If you try to play that disc in an American Blu-ray player, most players won't be able to access the menus and will simply reject the disc.
Some feature content may also be natively encoded at 60 Hz, 50 Hz, or possibly 25 fps. This includes concerts and documentaries shot on video, or the occasional movie (European Blu-ray editions of the Spanish zombie thriller '[Rec]' are authored at 1080p25). These will also face compatibility issues.
You should also keep in mind that everything I've described above is separate from the issue of region coding, which is another matter entirely. Even if a disc is authored at a worldwide-compatible 1080p24 resolution for all its content, the studio releasing it may still choose to lock that disc to Region A, B, or C.
1080i vs. 1080p
Q: I have a Panasonic Viera FullHD plasma, and have difficulty determining a difference between 1080i and 1080p. I was under the impression that a progressive image made fast motion and swift pans “jerk-free,” yet it seems far from smooth on my system. Are you able to suggest some differences to look out for to help a layman like myself?
A: I think you're confusing the issues of progressive scan and frame rate. Your plasma HDTV has a native resolution of 1080p. Whatever input signal you feed it, the set will automatically scale it to 1080p. If you send it a 1080i signal, it will deinterlace the fields before display. No matter how you set your Blu-ray player, what you watch on screen is always 1080p.
"1080p" doesn't necessarily tell you what frame rate the content is being displayed at, though. As per my response above, most American HDTVs still run at 60 Hz, which will add judder to film-based movies. To reduce the judder, you'll need to verify whether your TV can accept a 1080p24 input signal from a Blu-ray player, and whether it will display that signal at the original 24 fps (or an even multiple such as 48 Hz, 72 Hz, 96 Hz, etc.). Many 1080p HDTVs simply add 3:2 Pulldown to a 24 fps signal to bring it back to 60 Hz, judder and all.
Let me direct you to my What's the Big Deal about 1080p24? article for more information about 24 fps and judder. As noted in that article, even a native 24 fps display will still have some amount of inherent jerkiness due to the slow photographic capture rate.
Just to add another wrinkle to this story, some newer 120 Hz HDTVs have a Frame Interpolation feature (sometimes called TruMotion, PureMotion, MotionFlow, Auto Motion Plus, etc.) designed to reduce judder and jerkiness. Unfortunately, Frame Interpolation has some very serious drawbacks, which I'll address in the next question.
Frame Interpolation
Q: I see this question pop up quite often, with some arguing that the feature makes HD look better. I think a better explanation of this feature in new TV sets would be beneficial. The preference for 120 Hz over 24 Hz is really more subjective; 120 Hz seems more artificial than the film-like appearance of 24 Hz.
A: The 120 Hz refresh rate on newer HDTVs was chosen because 120 is an even multiple of both 24 and 60. When displaying content, a 120 Hz set has two options for what to do with the video signal.
On the one hand, the TV may simply multiply the frames in the input signal to repeat them more often. If fed a 24 fps input signal, the set can apply 5:5 Pulldown to repeat each frame 5 times. The net effect is indistinguishable to the human eye. Likewise, if fed a 60 Hz input signal, it will apply 2:2 Pulldown to repeat each frame twice. Once again, this is visibly identical to the original. (If that 60 Hz signal was built by applying 3:2 Pulldown to a 24 fps source, those 3:2 judder artifacts will remain unchanged.)
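A minimal Python sketch of this frame-repetition option (the function name is my own illustration, not anything a TV actually runs):

```python
# Sketch: simple frame repetition on a 120 Hz display.
# No new images are invented; each source frame is just shown more often.

def repeat_to_120hz(frames, source_fps):
    """Repeat each source frame so the output rate is 120 per second."""
    factor = 120 // source_fps  # 5 for 24 fps input, 2 for 60 Hz input
    out = []
    for frame in frames:
        out.extend([frame] * factor)
    return out

print(repeat_to_120hz(["A", "B"], 24))  # 5:5 Pulldown
# -> ['A', 'A', 'A', 'A', 'A', 'B', 'B', 'B', 'B', 'B']
print(repeat_to_120hz(["A", "B"], 60))  # 2:2 Pulldown
# -> ['A', 'A', 'B', 'B']
```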
The other option is to apply Frame Interpolation, which invents all-new frames to insert between the original frames in the video signal. It does this by analyzing the original frames and estimating what the "in-between" frames should look like. When starting from a 24 fps movie source, each of the original frames will be followed by 4 brand new frames, each gradually different from the last, until hitting the next original frame.
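To show the idea in miniature, here is a toy Python sketch where each "frame" is reduced to a single brightness number and the in-betweens are simple linear blends. Real TVs use far more sophisticated motion estimation; this only illustrates the cadence:

```python
# Toy sketch of frame interpolation from 24 fps up to 120 Hz.
# Each original frame is followed by 4 synthesized in-between frames.

def interpolate_to_120hz(frames):
    """Insert 4 linearly blended frames between each pair of originals."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)                           # the original frame
        for step in range(1, 5):
            out.append(a + (b - a) * step / 5)  # 4 invented in-betweens
    out.append(frames[-1])
    return out

print(interpolate_to_120hz([0, 10]))
# -> [0, 2.0, 4.0, 6.0, 8.0, 10]
```

The values 2.0 through 8.0 never existed in the source; the TV fabricated them, which is why the result can look unnaturally smooth.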
This is a much more invasive process than simple repetition. Frame Interpolation will visibly change the appearance of the movie. The intent is to smooth out judder and 24 fps jerkiness. It can also make the picture look "crisper," for lack of a better word. Some viewers find it appealing, claiming that it adds more of that elusive "3-D pop."
However, Frame Interpolation has a nasty side effect of making film-based content look like it was shot on a camcorder. Instead of watching a movie, it often feels like you're watching the behind-the-scenes footage from the making of that movie. While judder may be reduced, all those artificial frames can make the picture look smeary instead. Most movie fans find the tradeoff undesirable.
Some viewers will argue that this is a matter of personal preference. I suppose that it is, inasmuch as it's also personal preference to crank a TV's color control all the way to the maximum to make people's skin tones glow orange. If that floats your boat, I can't stop you. Just know that there's a difference between personal preference and accuracy.
As far as I'm concerned, the purpose of the home theater hobby is to present movies as faithfully as possible to the way the original photography was shot and intended for theatrical projection. That means maintaining the Original Aspect Ratio, calibrating colors for accuracy, and not adding any additional processing that will visibly change the look of the movie. Frame Interpolation is a very serious alteration of a movie's photography, and I personally can't stand it.
Audio Sync
Q: Why is it that on some Blu-ray discs the sound is out of sync when I play them at 1080p? I have to resort to 1080i to have the picture and sound match. I have an LG Blu-ray/HD DVD combo player running through a brand new Pioneer Elite receiver using HDMI cables. Do I need a more up-to-date Blu-ray player to solve this?
A: There are a couple of possible causes to this problem. First things first, make sure that your Blu-ray player is up to date with the latest firmware. There may be a bug in your player that causes audio to fall out of sync at 1080p resolution. It's worth checking to see if the manufacturer is aware of this and whether they've already issued a fix.
But let's assume that doesn't solve the problem. You didn't mention whether your HDTV is a 1080p model. If not, the set will need to scale the incoming signal to its native resolution before display. It's very possible that the TV's internal hardware may take longer to do that with 1080p input signals than with 1080i. In that case, you're better off leaving the Blu-ray player set for 1080i output.
I would also recommend turning off any additional video processing such as Noise Reduction or Frame Interpolation. These can also cause delays and hence audio sync issues.
Player Load Times
Q: I've always wanted to know: What exactly is being "loaded" when a Blu-ray disc first starts up? Java language? Blu-ray Gnomes? What?
A: As I'm sure you've noticed, some Blu-ray discs load faster than others. If you see a "Loading" icon on screen before playback, that means the disc is Java-enabled and the player needs time to prep and sort through all the Java programming. Sometimes, this is worth the wait, if the disc has useful Bonus View or BD-Live features. Sometimes it isn't. Believe me, I know how frustrating it can be to sit there waiting for a disc to load, just for a simple Bookmark feature you'll never use or annoying animated menus.
Discs that aren't Java-enabled will load faster, but (on most players) are still noticeably slower than Standard-Def DVD. There are several things going on during this delay. For one, the player must authenticate the encryption keys on the disc. Blu-ray has more complex encryption than DVD, and some discs have several layers of it. Another thing the Blu-ray player does is scan the disc to buffer the video. This video buffer is used to ensure that there's no pause during layer changes, for example. Because 1080p video is much higher resolution than DVD's 480i video, it takes longer to buffer.
And then there's the simple fact that some Blu-ray players are much faster or slower at loading a disc than others. The PS3 is still the standard bearer for Blu-ray loading times. A few recent standalone player models have caught up with its speed, but most trail far behind.
That will close out the latest edition of the HD Advisor. Check back next week for another round of answers. Keep those questions coming.