Editor's Note: Each Friday, High-Def Digest's own HD Advisor will answer a new round of questions from our readers. If you have home theater questions you need answered, send an email to [email protected]
Answers by Joshua Zyber
"Resume Play" Feature on Blu-ray
Q: When the PS3 was first introduced as a Blu-ray player, a disc had to be restarted from the beginning every time the disc was stopped. Then, one of the first firmware upgrades to the PS3 brought the capability to automatically return to the spot where the disc had been stopped without re-starting. Unfortunately, not all discs utilize this feature. It seems like at least half of the BDs I watch restart by going back to the beginning. Why don't all discs work the same way?
A: Although it's been a while since I last addressed this question here in the Advisor column, I get asked about the "resume play" feature on Blu-ray quite frequently. In fact, browsing through my backlog of questions, this may even be one of the most common questions I'm asked. I suppose it's time that I actually brought it up again here.
Erratic support for "resume play" is one of those areas where Blu-ray has sadly taken a step back from the convenience and ease-of-use we've grown accustomed to since the DVD era. Most DVD players support this function for any DVD disc. If you stop a movie before you're done with it, you can simply hit Play to restart right where you left off. Many models will even store this information in persistent memory after you eject the disc, for up to the last 5-10 discs you watched. Unfortunately, things are more complicated on Blu-ray.
When the Blu-ray format debuted in 2006, there was no "resume play" option at all. Every disc had to be restarted from the beginning if you stopped playback. Later player models started to incorporate the ability, but only for some discs, not all. (Some older players, such as the PS3, were also upgraded by firmware to add the feature.) However, even today, not all discs can be resumed from a stop-point, no matter which player you use.
BD-Java programming is the culprit here. Discs without Java will allow the "resume play" option if the player offers it. Unfortunately, discs with Java by default do not allow resume play. You can generally tell whether a disc has Java or not the first time you try to load it. If one of the first things you see is a "Loading" icon or status bar on screen, the disc has Java. If not, it probably doesn't (though this isn't necessarily a 100% guarantee).
Recently, some studios have discovered how to add a form of resume play within the Java programming itself. However, instead of the disc simply starting right up where you left off, you'll typically get a message on screen asking whether you want to resume your position or not. This may not appear until after the studio logo, copyright notices, and other annoyances. So there may still be a minute or two of delay before you can jump back into the movie. And this feature has to be specifically authored onto the disc by the studio ahead of time. At present, there's no way for any Blu-ray player to get around this if the disc has Java but wasn't authored with resume play.
The worst part of all this? Most discs that are authored with Java don't actually need it for anything useful. For example, take the gimmicky menu system on most Sony Blu-ray releases – the ones with the menu options that jump around on screen when you cycle through them. That's the sort of pointless function that Java is being wasted on, at the expense of basic convenience features like resume play.
That's not to say that Java is always worthless on Blu-ray. Some discs actually do need it. Bonus View and BD-Live features require Java, for instance. However, in many cases it doesn't offer much value, and is more nuisance than benefit.
Film-to-Video Transfer Process
Q: How are Blu-ray catalog titles authored? Do the studios start with the film negative or is some kind of digital master used? It confuses me when I hear about a Blu-ray using the same master used for the DVD. That talk makes me nervous. Wouldn't the studio have to start from scratch with a film negative, and clean the negative, to get the best image possible?
A: For most of the history of cinema, the majority of movies were shot on 35mm film. These days, many movies (but still a minority) are shot digitally. Even a significant portion of those shot on film today go through a Digital Intermediate (or "DI") process in post production. After the photography is completed, the film elements will be scanned to produce a digital file, which is used for all editing, color grading, etc. When all of that is done, the filmmakers are left with a master file that can be output back onto film or used as the basis for D-Cinema distribution. The DI will also form the basis of the video master used for all home video releases.
This isn't quite what you asked me. I mention it because the process for mastering catalog titles has some commonalities.
Of course, Digital Intermediates are a fairly recent development in the history of filmmaking, having risen to prominence only in the last decade. Before that time, all movie post production work was completed on film from start to finish. After editing and color timing were completed from dupe film elements, a movie's original camera negative (OCN) would be conformed to match the final edit by a person called a Negative Cutter. Intermediate film elements called interpositives and internegatives would be created from that, and the OCN would then be shipped off to archive. Theatrical release prints were struck from the interpositives or internegatives. To reduce wear and tear, the OCN itself would be used very infrequently, if at all.
In the digital home video era, when a studio decides to transfer a movie that was produced without a Digital Intermediate, typically either an IP or IN will be scanned to create a digital video master. (Release prints are usually only used in a worst case scenario.) That video master is similar to a Digital Intermediate. It will be used as the basis for all subsequent video releases. Copies of the master will be transcoded to the desired format, scaled to the necessary output resolution, and digitally compressed.
In the early days of video, those film elements were often only scanned at standard-definition resolutions, leaving the studio with standard-def video masters. Since about the mid-'90s, most studios have scanned their movies at HD resolutions, even though the only home video formats then available were standard-def. For DVD release, the HD masters would be downconverted to SD resolution before authoring. That's still the case. Today, the exact same HD master will be used for both DVD and Blu-ray. One copy will be scaled to 1080p and the other to 480i, but the source remains the same.
At the time, it was assumed that these HD masters would "futureproof" the studios' movie catalogs for decades to come. As we learned throughout the history of DVD and now Blu-ray, that isn't necessarily the case. The quality of video transfer technology has improved dramatically over the years. HD masters created originally for DVD may be riddled with Edge Enhancement, Digital Noise Reduction, and other problems that leave them looking quite dreadful when re-used for Blu-ray.
Depending on the title and the studio involved, some studios (ahem, Universal) may be content to re-use these old, inferior DVD masters anyway for catalog titles. However, high priority titles may merit being "remastered." In a best case scenario, this means going back to the film elements (often the IP or IN again) to rescan using modern equipment and techniques. If the quality of the IP or IN isn't satisfactory, the original camera negative may even be pulled out of storage. But studios try to avoid this if possible, because they don't want to put unnecessary wear on the negative.
Unfortunately, lately we've seen some studios (I'm going to have to pick on Universal again) that throw around the word "remastered" to mean instead that the old DVD master has simply been digitally tweaked with new sharpening or DNR tools in the hopes of making it look somewhat better – or at least somewhat less awful. This is a cheapskate option that will result in limited returns. For the best quality, re-scanning from the film elements is practically a necessity. Some studios may use the buzzword "restored" to mean that they went to this much trouble and expense. That word may also imply that more advanced techniques such as frame-by-frame scratch removal and damage repair were employed.
Now, keep in mind that not all older video masters are necessarily awful. That depends on the age of the master, the studio, the equipment used, and the people who did the work. Some may still produce satisfying, even very good results. Also, not every studio labels every title that has had a new film-to-video scan. You could be getting a fresh remaster and not even know it. Just because the movie you want to watch is an older title that doesn't boast the words "remastered" or "restored" on the case, that doesn't automatically mean that it's going to look like crap. These things really have to be evaluated on a case-by-case basis, and that's exactly how sites like ours treat them when reviewing the discs.
The HD Advisor knows many things, but he doesn't know everything. Some questions are best answered with a consensus of opinions from our readers. If you can help to answer the following question, please post your response in our forum thread linked at the end of this article. Your advice and opinions matter too!
A/V Receiver Upgrade Advice
Q: I'm considering the purchase of a Samsung PN58C7000 plasma and would like to tie it into some existing components previously used primarily for music. I'd also like to add other components to provide an enhanced video environment. This is in a 15'x21' family room with 18' ceiling. I'm using an old Denon AVC-3000 Surround Amplifier and pair of Vandersteen Model 2 front speakers. I plan to add a PSB Image C5 center speaker and pair of PSB Alpha LR1 rear surround speakers. Although my primary source for TV viewing is Verizon's FiOS System, I would still want to use VCR and disc sources. Will the Denon AVC-3000 properly handle both audio & video inputs/outputs of the new TV without degrading signals? Or should I consider buying a new amp/receiver? If a new one needed, do you have any recommendation in the $500-$1,000 range of sales price (not MSRP)?
JZ: From my Googling, it appears that the Denon AVC-3000 is a 1990 model Dolby Pro Logic receiver. It lacks digital inputs of any sort, either audio or video. Forget about Blu-ray lossless audio; this model can't even do discrete 5.1 surround. The best you could get from it would be to connect the stereo analog outputs from your cable box to the receiver for a very basic form of matrixed surround sound. Personally, I'd suggest that it's time for you to upgrade. As for what you should buy, I'll leave those suggestions to our other helpful readers.
Check back soon for another round of answers. Keep those questions coming.
Joshua Zyber's opinions are his own and do not necessarily reflect those of this site, its owners or employees.