In Space, No One Can See Your Grain

For the moment, let’s classify this as speculation mixed with just a wee bit of paranoia. Until we can get some more clarification or confirmation, anyway. (For the record, the banner image above this post is exaggerated for effect.) Nonetheless, after an interview James Cameron gave this week, fans suddenly have a lot of reasons to worry about the video transfer quality for the upcoming Blu-ray release of his sci-fi classic ‘Aliens’.

Remember what a terrible hack-job Fox did with that atrocious ‘Predator: Ultimate Hunter Edition‘ Blu-ray recently? You know, the one smothered in so much hideous Digital Noise Reduction that any resemblance to motion picture film has been sapped away and the actors all look like plastic mannequins on screen? Yeah, that one. Naturally, that left fans (entirely reasonably) concerned about what the studio would do to the ‘Alien Anthology‘ box set scheduled for release this October. ‘Aliens,’ in particular, has always been a gritty, grainy movie. Would the studio really have the audacity to treat ‘Aliens’ as badly as it treated ‘Predator’?

Then came word that director James Cameron had been working with Fox on the release and had personally signed off on the new video transfer. Fans breathed a sigh of relief. Surely, a notorious perfectionist like Cameron would never let the studio mangle his work, would he?

Well, as it turns out, maybe. In a sit-down chat with ComingSoon.net to promote the theatrical re-release of ‘Avatar‘ (now even longer than ever!), the director slips in a few words about the ‘Aliens’ Blu-ray remaster at the end, where he more or less boasts that it’s been DNR’ed to hell and back, as if that were a good thing. Ah, crap.

Here’s what he actually says:

I just did a complete remaster of ‘Aliens’ personally, with the same colorist I worked on with ‘Avatar’. And it’s spectacular. We went in and we completely de-noised and de-grained it, up-rezzed it, color-corrected it end to end every frame. And it looks amazing. It looks better than it looked in theaters originally, because it was shot on a high-speed negative that was a new negative that didn’t pan out too well and got replaced the following year. So it was pretty grainy. We got rid of all the grain. It’s sharper and clearer and more beautiful than it’s ever looked. And we did that to the long version, to the “director’s cut” or the extended play. I call it the “FM Mix.” That’s dating myself!

My heart just sinks when I hear “completely de-noised and de-grained it” and “We got rid of all the grain.”

Cameron specifies that, “We did that to the long version, to the ‘director’s cut’ or the extended play.” Could that mean that only the extended cut will be DNR’ed but not the theatrical cut? (The Blu-ray has been announced to contain both versions.) Unfortunately, I find that doubtful. The Blu-ray will most assuredly use seamless branching to integrate the additional bits for the extended cut into the theatrical version. I can’t imagine that Fox will squeeze two completely separate video transfers onto the same disc. I think Cameron was just trying to emphasize that he cleaned up the whole movie, even the long version. Whatever happens with this video transfer will be an all-or-nothing decision that affects both.

Now, to be fair, it’s entirely possible that Cameron was just throwing out some buzzwords there, expecting that he was speaking to a non-techie audience. Perhaps he really meant that the film will not have any excessive grain or noise, beyond that originally photographed during the production, because previous releases of the movie had been transferred from poor-quality dupe film elements. In which case, I’d be perfectly fine with it.

But then I remember that Cameron also signed off on the ‘Terminator 2: Judgment Day – Skynet Edition‘ Blu-ray, which has a fair amount of obvious DNR, and looks inferior to the Blu-ray and HD DVD versions of that film released previously in Europe.

Digging way back into the archives, I also pulled up an article from a 1996 issue of ‘Widescreen Review’ magazine that explains the horror story behind how Cameron (not explicitly named, but the wink-wink-nudge-nudge implication is clear) screwed up the Laserdisc transfer of ‘Aliens’ back in the day. It’s a fun read.

Reprinted from Widescreen Review magazine, published by Gary Reber: Volume 5, Number 3, Issue 20. Dated August 1996. Article “Film-to-Tape Horror Stories: Inside Telecine” by Marc Wielage.

THE CASE OF THE KILLER GRAIN

One last story. There’s an infamous laserdisc out there – you might guess which one it is, but you won’t find out from me – which is known for being a particularly ugly transfer of a major science-fiction blockbuster.

Why do good-looking films wind up looking so bad sometimes? In this case, the director was too busy to supervise the film transfer, so he sent over his official representative, Mr. Y, to do the day-to-day work; the director would then come by once a week to view what had been done and ask for changes where he felt they were necessary. So far so good.

Unfortunately, the facility chosen for this film transfer wasn’t exactly one of the top transfer houses in LA. (In fact, it’s since gone bankrupt.) Their Rank Flying-Spot Scanners were ten years old at the time of the transfer, and although well-maintained, there’s only so much that old equipment can do with difficult films. This particular Science Fiction epic had numerous opticals, explosions, and effects, which made it particularly difficult to transfer, with widely-varying exposures and density levels.

To make things worse, the telecine operator assigned to this job was a bit inexperienced. That, combined with the poor judgement of Mr. Y, made many of the film’s dark scenes come out somewhat grainy and “pushed”-looking. This is a typical problem when the gamma (mid-range) settings for the Rank are misadjusted, which tends to exaggerate grain while it brings out more detail.
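
(A quick aside from me, not from the article: here is a minimal Python sketch, purely illustrative and nothing like the Rank's actual analog circuitry, of why a misadjusted mid-range lift "pushes" dark scenes. Raising the curve in the shadows spreads small brightness differences further apart, so the same grain suddenly covers a much wider range of values.)

    import numpy as np

    # My own toy illustration (not the Rank's actual circuit): a simple
    # power-law "gamma" lift applied to normalized pixel values.
    def apply_gamma(frame, gamma):
        """frame: values scaled 0.0-1.0; gamma < 1.0 lifts the mid-range/shadows."""
        return np.power(np.clip(frame, 0.0, 1.0), gamma)

    # A dark pixel of 0.05 with +/-0.01 of grain around it...
    dark = np.array([0.04, 0.05, 0.06])
    lifted = apply_gamma(dark, 0.45)

    # ...ends up with its grain spread over more than twice the brightness
    # range it occupied before the lift (roughly 0.05 vs. 0.02 here).
    print(dark.max() - dark.min(), lifted.max() - lifted.min())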

When the director came in to view the finished result, he decided to bring in his own personal TV set to view the transfer. All the engineers and telecine operators at the facility were aghast, and tried mightily to explain to the director that there's no way a $1,000 TV set can reproduce the subtleties of a $10,000 broadcast monitor, but the director wouldn't hear of it. "Nonsense," he retorted. "I've watched hundreds of films on this set. This is my own personal standard, and I just want to use it as an additional reference." Numerous changes were made, just comparing the image on the expensive lab-grade monitor and the cheap consumer set, sometimes averaging a compromise setting between the two. This necessitated even more time and expense, since sometimes, the frustrated telecine colorist could make the image look good on one of the monitors, but not both at the same time.

The director was also unhappy with the grain in the problem scenes. As luck would have it, the transfer was recorded on the component digital D-1 video format*, which is now the standard for the telecine industry. Mr. Y suggested that they remove the grain by dubbing the master tape through a noise-reduction device. These noise-reducers (also known as grain reducers) essentially use a computer to make intelligent decisions on a pixel-by-pixel basis, analyzing which part of the picture is noise and which is actual detail, and then subtracting the noise pixels. Unfortunately, when overused, the noise-reducers tend to add a degree of “lag” and “smearing” to the image, as the overtaxed circuits can’t make their decisions fast enough. This adds artifacts and flaws, and also tends to make the picture soft.
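
(Another aside from me rather than the article: a bare-bones Python sketch of the kind of pixel-by-pixel decision a temporal noise reducer makes. This is only an assumption about the general technique, not the specific box used on this transfer; push the threshold and blend settings too hard and you get exactly the lag and smearing described above.)

    import numpy as np

    def temporal_denoise(prev_out, cur_frame, threshold=8.0, blend=0.5):
        """Recursive temporal noise reduction, one frame at a time.

        prev_out, cur_frame: float arrays (H, W), values 0-255.
        threshold: frame-to-frame differences below this are judged to be noise.
        blend: how far "noise" pixels are pulled toward the previous output
               (0 = off; close to 1 = heavy averaging, i.e. lag and smearing).
        """
        diff = np.abs(cur_frame - prev_out)
        is_noise = diff < threshold  # the per-pixel noise-vs-detail decision
        out = cur_frame.copy()
        out[is_noise] = ((1.0 - blend) * cur_frame[is_noise]
                         + blend * prev_out[is_noise])
        return out

    # Usage: run frames through in order, feeding each output back in.
    # prev = frames[0]
    # for frame in frames[1:]:
    #     prev = temporal_denoise(prev, frame)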

When he looked at the new tape, the director felt that the image was better, but now it lacked the crispness of the original. Now, the second-generation D-1 tape was fed through an image enhancer, which sharpens images by delaying one line of information and adding a subtle black outline to sharp edges. The director viewed this tape, and he pronounced it better still, but now, too much of the grain was back!
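
(One more modern-terms sketch from me, again an assumption about the general line-delay technique rather than the particular enhancer used: the picture is compared against a copy delayed by one line, the difference is treated as edge detail, and a scaled version of it is added back. That is what draws the "subtle black outline" along sharp edges, and at higher gain it also brings the grain right back.)

    import numpy as np

    def line_delay_enhance(frame, gain=0.5):
        """Vertical edge enhancement via a one-line delay.

        frame: float array (H, W), values 0-255.
        gain: how much of the edge signal to add back; higher values sharpen
              more but also re-exaggerate grain.
        """
        delayed = np.roll(frame, 1, axis=0)   # the one-line delay
        delayed[0, :] = frame[0, :]           # don't wrap the first line around
        edges = frame - delayed               # high-frequency vertical detail
        return np.clip(frame + gain * edges, 0.0, 255.0)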

Rather than do the transfer over again, Mr. Y made the decision to again feed the enhanced D-1 dub through the noise-reduction box, only this time, it would be done at a much lower setting. At last, the director viewed the fourth-generation D-1 tape, and approved the noise-reduced, enhanced, and noise-reduced image. But that doesn’t mean the picture looked good.

Any audiophile knows that the ideal amplifier is a straight wire with gain. The more processing you add to the circuit, the more distortion and noise gets added to the sound. The same is true of video. All this extra processing to this film image created a very strange, “mushy” kind of image. The grain patterns tended to “float” around the screen in odd, unnatural patterns – one of the artifacts of certain kinds of noise reducers – and the extra enhancement added a harsh edginess that exaggerated details and made them very unpleasant. In short, what you basically had was a terribly over-processed picture. If the original transfer had been made on decent equipment to start with, most likely, none of this would have been necessary.

A few weeks after the laserdisc came out, laserdisc buffs started complaining about how strange the transfer looked. This particular disc began attracting a reputation as one of the ugliest on the market. Someone managed to get ahold of Mr. Y, who concocted a story that the grain was an intentional part of the film, giving a deliberate "texture" to those shots, and was a result of the high-speed stock used for the production. But even Mr. Y was at a loss to explain why back-to-back shots sometimes had different degrees of grain and enhancement.

A few months later, after the studio had been deluged with complaints, they decided to try some tests to see if the film could look better with different equipment. A friend of mine was assigned the task of retransferring the first 10 minutes, just to compare it with the original version.

Mr. Y came in and was understandably perplexed and chagrined to find that the new transfer was sharp, crisp, full of detail, and yet had hardly a speck of grain. He was even more embarrassed to discover that the inexperienced operator who had done the first transfer had misframed the Rank and cut off quite a bit of the image on one side of the frame. The new transfer revealed at least 10 percent more picture, showing more detail and more of the sets and characters.

“I don’t care that it’s better,” he snapped. “We’re not going to go back to square one and re-do this picture from scratch. Besides, the director is much too busy to concern himself with this. The old transfer stays as-is.” And with that, he stormed out of the facility.

A few years later, a new transfer was quietly prepared and reissued to much fanfare. It was light-years better than the old one. True, the director again brought in his trusty TV set as his own “personal reference” and – despite the fact that the facility managed to drop it and wound up buying a new one – the new transfer was beautiful. Mr. Y subsequently left the director’s production company and went on to some success in a different part of the industry.

I don’t know if there’s a moral here, except that bad equipment and inexperienced people will invariably result in bad-looking laserdiscs. In fact, I’d argue that a great telecine operator can probably make better pictures on a mediocre Rank than a horrible telecine operator could on a great Rank. But the key is that excessive video processing isn’t the right way to fix a bad transfer.

———-

* The professional D-1 digital component format stores the luminance and color-difference signals separately, providing unusually high detail and very low noise compared to any other format. The D-1 tape signal is virtually identical to that coming out of the Rank itself, and subsequent dubs and tape generations should theoretically be identical to the original.
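
(If "luminance and color-difference signals" sounds abstract, here is my own toy Python example of the standard-definition component split, the BT.601 math that turns an RGB pixel into one luma value and two color-difference values, which is essentially the representation a D-1 deck records.)

    def rgb_to_ycbcr(r, g, b):
        """BT.601 split of an RGB pixel (values 0.0-1.0) into luma + color-difference."""
        y = 0.299 * r + 0.587 * g + 0.114 * b  # luminance (Y)
        cb = 0.564 * (b - y)                    # blue color-difference (Cb)
        cr = 0.713 * (r - y)                    # red color-difference (Cr)
        return y, cb, cr

    # A neutral gray pixel carries all of its information in the luma channel:
    print(rgb_to_ycbcr(0.5, 0.5, 0.5))  # -> (0.5, 0.0, 0.0)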

Yeah, that’s right, I remember this sort of thing and file it away for when I need it. What other blog are you going to read that will dredge up a 14-year-old magazine article about a Laserdisc transfer to find relevance with a current event? No other blog, that’s where. I can guarantee you that.

Anyway, that certainly was a long time ago. I’d expect that Cameron has learned a thing or two about video transfers in the meantime. However, this does seem to speak to his long-standing dissatisfaction with the movie’s photography, and his desire to “enhance” it digitally beyond what was originally photographed.

I have obviously not yet seen the ‘Aliens’ Blu-ray for myself. I will reserve judgment on its quality until I have the real thing in my hands. I’m still holding out hope that this new interview is misleading, and that the disc will in fact have an excellent transfer without any of that ugly, plastic-y DNR look that disgraces the ‘Predator’ Blu-ray. I guess I won’t know for sure until the disc is released in a couple months. Let’s all cross our fingers.

28 comments

      • Colin

        I’m one of the “nuts” who thought “Predator: Ultimate Hunter Edition” looked absolutely stellar.

        I’m no fan of grain, and always claimed that it existed on accident because of limitations in film technology. After all, film’s purpose was to create a good looking, realistic moving image.

        Black and white film and no soundtrack used to be the norm because of limited technology. Technology improved, however, and films gained soundtracks and primitive colors. Film and sound technology improved so colors became more accurate/interesting and sound became more detailed and immersive. Technology advanced, but it wasn’t perfect in the 1980’s.

        You still had grain on film, which was a defect, not an aesthetic enhancement! The human eye doesn't see grain in real life; it's just a limitation that gets in the way of creating a progressively more true-to-life, interesting moving image.

        Nowadays, we have digital video and films like Avatar which I (and most others) claim is THE best looking Blu Ray currently in existence, PERIOD.

        We also have technology that can erase some of the defects of lower technology older films with the dreaded “DNR”.

        Interestingly, many of the arguments I heard against DNR were originally about preserving the director’s vision, but now we see a director using many of my arguments! Grain didn’t preserve the director’s vision, it was simply a byproduct of limited technology that the director now wants to remove!

        Now the tables have turned a bit and people are saying “to heck with what the director wants, we want grain and noise!”. I have to do a bit of a facepalm, given my perspective.

        Anyway, to me it’s like this: shadow detail is great, but it’s only one aspect of creating a great, interesting, real-to-life image. Grain isn’t real-to-life and is something that gets in the way of a pristine picture.

        Undoubtedly, some detail is going to be lost in the process of removing film grain, but you gain a lot of picture quality in one area at the same time as losing a bit in another, and I think the tradeoff is worth it.

        Now you have two of the folks behind “Avatar”, which is THE BEST LOOKING Blu Ray out there talking about how great they made “Aliens” look and I’m not scared a bit. I’m looking forward to seeing it!

        • EM

          If you think “Avatar” looks realistic, you must be on some serious drugs or else may need to be prescribed some. Sorry, the real world is not populated by blue giants with tails who live under floating mountains.

          I don’t know enough about what fellows like Edison and the Lumières were thinking when they were inventing the cinema, for me to able to dispute your claim that the purpose of (motion-picture) film was to create a good-looking and realistic moving image. Indeed, that very well may have been their purpose. (Or at least one of them—particularly for Edison, financial gain comes to mind as another purpose.) But it’s clear that such is not always the purpose of persons who use film to make masterpieces. Good-looking though they may be, good films are rarely mistakable for reality or even records of reality. This is because they are works of art, and art—while it may imitate life and be imitated by life—by its very nature does not have a 1:1 relationship with reality.

          Does this mean that there are no films that are better without grain than with? No. Does this mean that there are no films that benefit from color? No. Does this mean that there are no films that work better with live action than with animated illustrations? No. But reality is not everything (if it is, then why bother with fiction?)—and reality, like beauty, is often in the eye of the beholder.

          By the way, if you dislike film grain, black-and-white photography, silent cinema, subtitles, editing, two-dimensional imagery, non-olfactory presentations, and/or other such unrealistic artifices often found in cinema, that is perfectly your prerogative. I make no claim that you are wrong to have your esthetic preferences. I do make the claim that you are utterly wrong to imply that others are wrong to have theirs.

          • Colin

            Ah, but I think you confuse the purpose of video with the purpose of storytelling.

            Stories should be fascinating, moving, awe-inspiring, fill you with dread or bring you to tears in an extraordinary way, but that doesn’t negate video’s purpose of making the experience real to the viewer: realness aids in suspension of disbelief and also the enjoyment of the film by extension.

            Make no mistake about it, if computer animation were at “Lawnmower Man” levels today, “Avatar” never would have been made and certainly wouldn’t be the highest grossing film of all time if it were. In fact, James Cameron wanted to make “Avatar” years ago, but lacked the technology to make the world of Pandora “real” enough to the viewer.

            As far as aesthetic and artistic preferences go, I can fully appreciate the artistic value of films like “A Scanner Darkly” or “Sin City”, which I think are visually impressive directly because of the warped sense of realism (fantasy?) the worlds in the movies create, and I can even understand the desire to leave the detail that accompanies grain.

            Where I think some so-called videophiles go awry is in the seemingly religious worship of “grain”; not detail, mind you, but grain, to the point where you get thousands of internet bloggers rambling on about the moral travesty of degraining a remastered Blu Ray that they haven’t even given themselves a chance to watch, let alone enjoy.

            The fact is that I liked “Predator: Ultimate Hunter Edition” and thought its image was mind-blowingly good, and I suspect if you grabbed a bunch of average people off the street to show them the degrained, recolored version of the movie, they’d enjoy it more than the original version until they were told not to by the grain cult who try to convince them “grain is good”.

            How about this, I’ll stop preaching about the superiority of picture quality after removing grain when the grain haters stop filling the internet with regurgitated lines about the horrors of DNR. If the haters are willing to respect my picture quality preferences so much, I’d like them to stop trying to influence (adversely, IMO) the movie industry by writing articles saying “My heart just sinks when I hear ‘completely de-noised and de-grained it'”.

        • Josh Zyber
          Author

          “The human eye doesn’t see grain in real life.” You know what else the human eye doesn’t see in real life? Aliens. Movies are not real life.

          • EM

            EM types and types
            Along comes Josh with Cliff’s Notes
            Either way, grain’s good

            (Oh, wait, this isn’t the contest…)

          • Colin

            Please see above.

            Detail is good because it adds to a realistic, interesting image. Grain is bad because it detracts from the same.

            Sometimes you end up with a better picture when you sacrifice some detail to get rid of grain.

          • Josh Zyber
            Author

            Say you want to look at the Mona Lisa or Monet's Water Lilies. You can either look at the original paintings and see the brush strokes that composed the image, or you can look at a smoothed-over Xerox copy that loses all of the textural detail.

            You’re arguing in favor of the Xerox version right now.

          • EM

            Josh, why would you want to look at those oily scraps? Their outmoded paint technology lacks real-life accuracy and movement (c’mon, a woman who never blinks?) and is therefore defective. And the lilies are all blurry!! They would need some serious digital scrubbing to insert much-needed detail. I don’t see how anyone can recognize that they’re looking at real-life aliens on other planets while viewing this stuff. And besides, I’m tired of other people saying they prefer the paintings “as is” while proceeding from premises different from yours; therefore, only my viewpoint is right! But I’ll make a deal: I promise I’ll keep quiet about the whole thing only once you all shut up and let the original versions be lost forever. That’s fair, right?

          • Colin

            EM, my response to Josh is "awaiting moderation", I think because I included links to what I referred to, but to summarize my points:

            1. Renaissance paintings and remastered films can’t be directly compared for a number of reasons.
            2. The Mona Lisa has been “remastered”, if you will, several times during its lifespan, like most old paintings.
            3. Modern technology recently showed what the Mona Lisa may have originally looked like, and the colors certainly are more vivid and interesting than what you see in the Louvre. Google it if you'd like; I'll withhold the links this time.

            I’ll amend the response by adding the question of if you’d begrudge artists who want to improve the quality of subsequent copies of their work and also ask if you were against the restoration of the Sistine Chapel. After all, paint technology was such that works cracked and faded over time.

            For that matter, do you advocate erasing the brilliance of Michelangelo's work and trying to get to the painting of the stars and sky that originally lay beneath? That was the ORIGINAL historic work.

  1. Tim

    I’m also a bit torn on this issue. Typically, guys like Josh hate it when companies, such as Alliance in Canada, mess with the aspect ratio of films because it differs from how the director wanted the film to appear.

    Well, in this case, Cameron is the one tweaking the film. Could it be that Cameron — assuming the Blu-ray version of Aliens turns out the way some people are expecting — wanted Aliens to be virtually grain-free when he originally shot it? Was he simply limited by a low budget and poorer technology back when he first created Aliens?

    Finally, if "director's intent" is the be-all-and-end-all, shouldn't we (read: Josh) embrace this version and not reject it simply because it won't be like we remember it? It reminds me of that ridiculous "The book was better than the movie" refrain…

    • Josh Zyber
      Author

      It’s a matter of revisionism. Cameron may want Aliens to be grain free now, but that’s not how he made the movie in 1986.

      Maybe there were factors beyond his control at that time, such as budget or available resources, that prevented him from making the movie a certain way. I would say that's the case for at least 99% of all movies that have ever been made. At the end of the day, the movie is what it is. It's a grainy movie. It's been a grainy movie for the last 24 years. It seems pretty ridiculous that, rather than just accept Aliens for what it is, he'd decide that he needs to digitally re-mold it to be as sparkly and smooth as Avatar, a completely different picture.

      Take a look at The French Connection on Blu-ray. William Friedkin destroyed that movie. You can say, “Well, that’s what the director wants.” When do we draw the line and tell the director that he’s making a bad decision?

      Directors are not gods. They’re fallible human beings, just like the rest of us. Sometimes they do stupid things. And when they do, we should call them on it.

      With that said, we’re still basing this argument entirely on speculation. We won’t really know how much Cameron has screwed up this transfer (if at all) until the disc is released.

  2. It can be director-approved and people would still piss about it. See the Star Wars movies as an example. It's always an odd situation: people love grain, people hate grain. I happen to think the T2 Skynet Edition is a great-looking Blu. Some don't. My feeling is people look at these discs with a fine-tooth comb, then go "ah ha!!!", then snap a shot and go, "See, on the DVD it looked like this and now it looks like this. Well, it didn't look like this on HDNet when I saw it 13 million years ago, but I remember seeing this 14 million years ago and it looked nothing like this in the theater."

    • EM

      The “Star Wars” movies are a mixed bag as an example. Richard Marquand, director of “Return of the Jedi”, died in 1987 and has been in no position to approve of the changes made for that film's 1997 “special edition” or any subsequent release. I'm less clear on the situation with Irvin Kershner, director of “The Empire Strikes Back”. My impression is that he has condoned the changes to his “Star Wars” film (at least for the 1997 version), though I'm pretty sure he had no input into them, and I'm not sure his condoning, which might have been simply polite deference to George Lucas, extended to actual endorsement. Perhaps someone here knows more about Kershner's stance. In any case, as the “Jedi” situation makes clear, not all the changes to the “Star Wars” films were made with their directors' approval. When discussing the issue, Lucas usually talks about “artists' rights”—not “directors' rights”—apparently wanting us to believe that in the case of all the “Star Wars” films, he is the only artist involved.

  3. Turd Furgeson

    Well, it's a good thing that I'm not a fan of Aliens, or else I'd be quite worried about this issue. However, what worries me even more is the fact that Alien (my only favorite of the bunch) has [with virtual certainty] been given a director-unsupervised transfer. That's really why Predator turned out so bad, because Fox just sent some fool to run the errand.

  4. pete

    That was a fascinating read. My fingers are crossed tight, but I think Josh's rant is totally on the money. I don't want revisionism to ruin Blu-ray.

  5. jgslima

    Regardless of video quality issues, I will not buy this box because it is too expensive (besides the fact that we are forced to buy the complete box; I only like Alien and Aliens).
    Compare that price with the Back to the Future box. One is $100 and the other $50. Ok, the Alien Anthology has 4 movies instead of 3, but that does not justify the difference.

    • jgslima

      Actually, in my point of view, the Alien Anthology has only 3 movies. Because Alien Resurrection… oh dear, this S-U-C-K-S. I remember leaving the theater with 30 minutes of projection.

  6. Patrick A Crone

    Call me crazy, but I've always liked the grainy look Aliens has. Given the gritty nature of the movie, it always seemed like a nice fit. I recently watched Aliens on Netflix streaming in HD and thankfully it still had the same film grain. I'll be really disappointed if the Blu-ray looks like a glossy, waxy hack job.

  7. Hi.
    I wanted to ask something, since you lot seem like the people who really know something about quality and the things behind it. I'm a huge Alien fan (I think all four movies are exceptional and I actually would have loved to see a fifth one; the topic of my thesis is the Alien franchise, too), and I just recently bought the Quadrilogy box off an auction site (I know it's old news, but I didn't have the money way back then). It's all official and original, but I noticed that Alien Resurrection looks unbelievably bad grain-wise compared to even the very first movie. Why is that? Making screencaptures is pointless; there's not one sharp frame in the whole movie. I'd really appreciate it if you could help out a total layman here who's very curious.

    • Josh Zyber
      Author

      You know, I have all the Alien films on DVD, but I don’t think I’ve bothered to watch Resurrection since I saw it in the theater. (No, that’s not true. I also had it on Laserdisc.)

      Keep in mind that all of the Alien films were made by different people and are each very stylistically different from one another. The cinematographer on Resurrection was Darius Khondji, who's most famous for shooting Se7en. (He also worked on director Jeunet's Delicatessen and The City of Lost Children.) Khondji favors imagery that's dark and grainy.

      As I recall, the Resurrection DVD from the Quadrilogy set was given pretty respectable reviews when it came out, but of course that was a good 7 years ago.

  8. Shayne Blakeley

    You know, I always liked Resurrection. It certainly wasn’t my favorite of the series or anything, but I thought it was enjoyable enough. Maybe it’s just because I’ve had a crush on Winona Ryder as long as I can remember, I don’t know, but I thought it was better than Alien3 and leaps and bounds better than either AvP.