When 4k Is Not 4k

For the second year in a row, 4k Ultra High Definition was all over the Consumer Electronics Show in Las Vegas. This year, the manufacturers promise not only more 4k TVs, but (with the arrival of Ultra HD Blu-ray) some actual 4k content to watch on them. There’s just one catch: Most of the movies you’ll watch in “4k” aren’t 4k at all.

Here’s the dirty secret about the industry’s move to 4k and higher displays: The majority of modern movies are either photographed digitally at 2k resolution or finished with a 2k Digital Intermediate (DI). While it’s true that some movies are now being photographed with 4k cameras (and movies shot on film may be scanned at 4k resolution), most of them still get downgraded to 2k for the post-production workflow. The higher pixel count of 4k requires a big increase in storage and processing resources that most post houses can’t handle. And, ultimately, most viewers can’t tell the difference between 2k and 4k anyway.

Think I’m exaggerating? Let’s look at some of the launch titles that have been announced for early release on the Ultra HD Blu-ray format this spring.

Here are the titles that Warner Home Video has announced:

‘The Lego Movie’ – Animated on a 2k DI
‘Mad Max: Fury Road’ – Shot in 2k, with a 2k DI
‘Man of Steel’ – Shot on 35mm, with a 2k DI
‘Pacific Rim’ – Shot in 5k, but only a 2k DI
‘Pan’ – Shot in 3k, with a 2k DI
‘San Andreas’ – Shot in 3k, DI is not listed but probably 2k

Yes, every single film that Warner plans to release on the 4k Ultra HD format is a 2k movie.

The 20th Century Fox release titles are only marginally better:

‘Exodus: Gods and Kings’ – Shot in 5k, with a 2k DI
‘Fantastic Four’ – Shot in 2k, with a 2k DI
‘Kingsman: The Secret Service’ – Shot mostly in 2k, with a 2k DI
‘Life of Pi’ – Shot in 2k, with a 2k DI
‘The Martian’ – Shot in 5k, with a 2k DI
‘The Maze Runner’ – Shot mostly in 2k mixed with some 5k, with a 4k DI
‘Wild’ – Shot in 2k, with a 2k DI
‘X-Men: Days of Future Past’ – Shot in 2k, with a 2k DI

That’s 14 launch titles from two major studios, and only a single one (‘The Maze Runner’) was actually finished at 4k resolution – and even that one was mostly photographed in 2k. These aren’t just old movies made before 4k was possible, either. Even major big-budget tentpole blockbusters from the past year were made in 2k, and many more will continue to be made in 2k this year and beyond.

Only Sony appears to have a genuine commitment to making movies in 4k. Here are that studio’s Ultra HD Blu-ray launch titles:

‘The Amazing Spider-Man 2’ – Shot on 35mm, with a 4k DI
‘Chappie’ – Shot in 5k, with a 4k DI
‘Hancock’ – Shot on 35mm, with a 4k DI
‘Pineapple Express’ – Shot on 35mm, with a 2k DI
‘Salt’ – Shot on 35mm, with a 4k DI
‘The Smurfs 2’ – Shot in 4k, with a 4k DI

Forget About 3D with Ultra HD

In all the hype about Ultra HD, the manufacturers and home video studios have also been careful to downplay another issue that some viewers will find disappointing. If you happen to be a fan of 3D (and it seems that fewer and fewer people are these days), you’re completely out of luck. The Ultra HD format does not support 3D. I say again for emphasis: The Ultra HD format does not support 3D. At all. Period. End of discussion. It’s not in the spec. Nobody has any interest in adding it to the spec anytime soon. As far as Ultra HD is concerned, 3D is dead.

How can this be? Why would the new, super-advanced format drop a feature that’s already available on regular Blu-ray?

The first thing you need to understand is that there is no such thing as a 4k 3D movie at the present time. Not in theaters, not anywhere. All 3D movies are 2k. Yes, this includes that special overpriced screening of ‘Star Wars: The Force Awakens’ you just saw in super deluxe IMAX 3D Laser Projection from dual 4k projectors. Even that was upconverted from 2k. Nobody in Hollywood is making 3D movies at 4k; the resource requirements are simply too great. Given that the public’s interest in 3D is waning, there’s been no big push in the industry to invest in 4k 3D. That being the case, the Ultra HD Alliance decided to dump it altogether.

If you enjoy 3D and want to continue watching movies in that format, you’re stuck with standard Blu-ray.

Ultra HD Is About More Than 4k

If most of the films getting released on 4k Ultra HD Blu-ray are really 2k movies, what’s the point of Ultra HD at all? Honestly, the increase in pixel resolution from 1920×1080 to 3840×2160 is the least interesting thing about Ultra HD. At the screen sizes found in almost all home theaters, 1080p already hits a sweet spot for delivering richly detailed images with no visible pixel structure. Human eyes simply aren’t capable of resolving much of the additional detail 4k offers, except perhaps on the largest projection screens. That extra resolution is genuinely beneficial on a huge 50-foot cinema screen, but for the needs of home theater, it’s basically irrelevant.
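
To put rough numbers on that claim, here’s a back-of-the-envelope sketch in Python using the common rule of thumb that 20/20 vision resolves detail of roughly one arcminute. The 65-inch screen size and nine-foot couch distance are my own illustrative assumptions, not figures from the article or any spec:

    import math

    def max_resolving_distance_ft(diagonal_in, horizontal_pixels, aspect=16/9):
        """Farthest distance (in feet) at which one pixel still spans ~1 arcminute."""
        width_in = diagonal_in * aspect / math.hypot(aspect, 1)  # screen width in inches
        pixel_in = width_in / horizontal_pixels                  # width of a single pixel
        return pixel_in / math.tan(math.radians(1 / 60)) / 12    # inches -> feet

    for label, pixels in [("1080p", 1920), ("4k UHD", 3840)]:
        d = max_resolving_distance_ft(65, pixels)
        print(f'65" {label}: individual pixels blend together beyond ~{d:.1f} ft')
    # 1080p: ~8.5 ft, 4k UHD: ~4.2 ft. From a typical nine-foot couch, the extra
    # resolution of 4k on a 65-inch set is effectively invisible.

By this math, you’d have to sit unusually close (or buy a far bigger screen) before the extra pixels rise above what the eye can resolve.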

Fortunately, Ultra HD brings other new improvements over regular High Definition. The most notable of these are enhanced colors and High Dynamic Range.

You may have read about how Ultra HD will offer millions of new colors that HDTVs of the past were never capable of reproducing. While technically accurate, those claims are largely overblown. The 10-bit color depth and expanded color gamut will be subtle improvements. Ask yourself: when was the last time you watched a Blu-ray and thought it wasn’t colorful enough? (Please spare me the inevitable snark about watching black-and-white movies.) Many of the new colors in the expanded gamut are beyond the range of human vision – and of those that are visible, most of today’s two-tone, digitally graded, teal-and-orange movies will never use them. However, the 10-bit color depth does mean the elimination of banding artifacts in color gradients, which are a genuine limitation of the 8-bit color that standard Blu-rays are encoded with. Artifacts like that are already pretty rare, but Ultra HD shouldn’t suffer them at all, which is a good thing.
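
For a quick illustration of the bit-depth arithmetic (nothing here comes from the UHD spec beyond the 8-bit and 10-bit figures above):

    # Shades per channel and total representable colors at each bit depth.
    levels_8bit = 2 ** 8        # 256 shades per channel (standard Blu-ray)
    levels_10bit = 2 ** 10      # 1,024 shades per channel (Ultra HD Blu-ray)

    print(f"8-bit:  {levels_8bit} shades/channel, {levels_8bit ** 3:,} colors")
    print(f"10-bit: {levels_10bit} shades/channel, {levels_10bit ** 3:,} colors")
    # 8-bit:  256 shades/channel, 16,777,216 colors
    # 10-bit: 1024 shades/channel, 1,073,741,824 colors

    # A smooth gradient (a sunset sky, say) that spans only a narrow slice of the
    # brightness range may have just a handful of 8-bit code values to work with,
    # so the steps between them can show up as visible bands. 10-bit video offers
    # four times as many steps across that same slice.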

High Dynamic Range is by far the most interesting development of Ultra HD. HDR movies have much darker darks and much brighter brights than those of the past, yielding a richer, more vibrant and lifelike image. HDR projection started rolling out to theaters over the past year, and the response from viewers has been overwhelmingly positive. Now that experience is coming to the home as well.

With that said, be aware that not every movie is HDR. A movie has to be specifically graded for the extended dynamic range in post-production. So far, only a handful of movies have undergone that treatment. The very first HDR movie was Disney’s ‘Tomorrowland’, which was released theatrically on May 22nd of last year. Other notable HDR titles include ‘Inside Out’, ‘Pixels’, ‘Mission: Impossible – Rogue Nation’, ‘The Martian’ and ‘Star Wars: The Force Awakens’.

Not every movie that gets released on Ultra HD Blu-ray will be encoded in a High Dynamic Range format. (The UHD spec contains three competing HDR standards.) However, it is possible to re-grade older movies into HDR, and of the supporting studios, Warner Bros. has announced that it plans to do so for all of its Ultra HD Blu-ray releases. I’m not entirely sure how I feel about this. Re-grading an older movie for HDR is a form of revisionism, imposing a look the filmmakers never intended when they originally made it. If those filmmakers are still alive and approve the decision, I might be interested to see the results, but I have no more interest in watching ‘Lawrence of Arabia’ in HDR than I have in watching ‘Casablanca’ colorized.

That 4k TV You Just Bought Is Already Obsolete

Sadly, the Ultra HD rollout has been a confusing mess. The UHD Alliance only recently settled on some of these critical features, and 4k TVs purchased in the past (including many still available in stores today) may not be compatible with either the enhanced colors or High Dynamic Range. To truly take advantage of everything that Ultra HD Blu-ray offers, you need a display labeled with the new “Ultra HD Premium” branding.

Even then, with three competing optional HDR standards, there’s no guarantee that the HDR decoder built into any given Ultra HD Premium set will be able to decode the HDR format on a specific Ultra HD Blu-ray disc. What a disaster!

Comments

  1. OmarF

    “Only a white man would think that tearing a strip of cloth off the bottom of his blanket and sewing to the top, gives him a longer blanket.”
    –A Native American’s statement about Daylight Saving Time.

    Richard,

    In your reply, you finish by saying:
    “A UHD 2160p TV has exactly 4 times the amount of pixels as an HD 1080p TV.”

    So this statement admits that the UHD TVs everyone is running around buying today are indeed 2160p – double the (vertical) resolution of 1080p TVs. They are 2K televisions, not 4K televisions, as they are being advertised to be. This was exactly my point to begin with. It’s a simple, self-evident truth. Our UHD TVs are 2K televisions, not 4K. You say that cinema resolution and television resolution are two different things. Fine, grant that; but as long as we are talking about the TVs people are shelling out their hard-earned money for, we are moving from 1K (1080p) televisions to 2K (2160p) televisions, NOT to 4K televisions (which would actually be 4320p on the vertical).

    [Related side point: when I said that 2160 is double the resolution of 1080, I was speaking in terms of vertical resolution, for purposes of reference to model lineups and what people are familiar with buying, not in terms of absolute pixel count. Of course, 2k is double 1k on each of two axes, which is quadruple the number of actual pixels. I didn’t bring that up in my first post because I didn’t want to obfuscate the point.]

    You point out that digital cinema measures resolution horizontally while television measures it vertically. And this ties directly to my point – the ACTUAL resolution of home TV, vs. the consumer’s PERCEPTION of their home TV’s resolution, is what I’m talking about. My contention is that somewhere along the line there’s been a marketing sleight of hand, and home sets are being referred to in terms of what you’re calling “cinema resolution” rather than their native “home resolution”, without any explanation being made, and people are being left with the impression that they’re getting a lot more than they actually are. And that’s not fair to the consumer.

    Today’s UHD TVs are indeed 2K televisions, not 4K televisions. Just as you pointed out. Calling them 4K TVs is a marketing ploy to make people think they are making a far bigger leap forward in technology than they really are. It’s so much easier and more sellable to say, “Hey, you’re going from 1K to 4K overnight!”… (Without telling the buyer that one TV counts its resolution vertically, but the other counts across!) …Rather than, “Hey, you’re going from 1K to 2K! And really, you may not think that’s a very big leap, but you’re actually getting quadruple the number of pixels! Let’s park it over on top of the washing machines and do the math, and you’ll understand why you should fork out a quarter of your yearly income for a new TV and updated receiver, to say nothing of those Atmos speakers…”

    So, along comes the ploy. Make them think they’re getting a lot more, so they’ll buy…again.

    Further thoughts regarding the difference between digital cinema resolution and home theater resolution.

    You said,
    “Digital Cinema 2K and now 4K and 8K has ALWAYS been based, unlike TV resolutions, on the HORIZONTAL resolution not the vertical resolution. And NO, the resolutions have NOT been reversed or “flipped” either.”

    …Well, yes, when I say flipped, that’s exactly what I mean: using horizontal pixel count to determine the resolution of the image/display, vs. vertical pixel count. Reversing the way in which the pixel count is expressed, 1080 x 1920 vs. 1920 x 1080, IS flipping it. Both statements are mathematically equivalent. So WHY has the industry suddenly chosen to “always” refer to digital cinema, which is barely ten years old, by horizontal resolution instead of vertical? And by the way, it has not “always” been so, either. Recall when Collateral was shot, it was said to have been “shot in 1080p”, and there were concerns about its ability to survive long-term in the home theater venue, because indeed the resolution of the camera Michael Mann used was 1080 x 1920, and we knew things would eventually scale upwards. Nobody EVER spoke of it as having been filmed in “2k”, did they? No one said its “digital cinema resolution” was 1920 x 1080. When the early digital movies were filmed, we spoke of them as being “1080p”. They were still digital cinema back then, so why flip everything around now?

    When movies were shot and watched on film, we never spoke of resolution. On the consumer level, the whole business of counting pixels was introduced with the advent of LD and DVD. Before that, resolution was confined to discussions of broadcast standards, and most of us never heard about it. Most people were never even aware of what the pixel count on their televisions was. When it was brought before the public, it was as a way of easily quantifying the improvement in the image they were receiving on their new televisions and DVD players over VHS and standard broadcast, and it was always expressed in terms of vertical resolution, consistent with some 50 years of broadcast standards.

    Vertical pixel count was not just relevant to television; it became relevant to cinema when we started digitizing all video media, whether for computer rendering of special effects in television, or cinematic post-production, or products on shiny discs. We were digitizing both TV and cinema, and we were talking about them in terms of vertical resolution by default. From ST:TNG to O Brother, Where Art Thou?, there was never a discussion of “cinema resolution” vs. “television resolution”. Exactly when and where and by whom was it decided that this difference needed to occur? Why was it decided that “cinema resolution” should be expressed in terms of horizontal resolution and not vertical? Regardless of whether we’re counting kilobytes or scan lines, why, after over 50 years of precedent, should we reverse the way in which we express resolution and suddenly start using a different variable (horizontal rather than vertical)? …Other than the fact that in widescreen presentation, horizontal resolution is almost double the vertical, and a much more impressive-sounding number when you’re trying to get someone to buy another new television – a couple of years after you just told him that 1080p was the broadcast standard for the next twenty years and that that 1080p set would be all he’d need for the foreseeable future…with the exception of that 3D TV you just suckered him into buying, too, for which he has a meager dribble of overpriced discs to choose from, along with those ridiculous glasses.

    My point still stands. At the end of the day, we are watching 2K UHD television video, on the 2K UHD televisions we bought. The television broadcast material we are watching is not 4K, and the televisions are not 4K, regardless of how you want to account for resolution during the filming process.

    Omar

    • Richard

      Omar,

      2K, 4K, and 8K are not “true” home resolutions. The “monikers” are simply “CARRY-OVERS” (if you will) from the Digital Cinema world. As I already explained above, Digital Cinema 4K is 1,024-based and is 100% accurate. However, when the 4K moniker is carried over to TV resolutions, it is no longer 100% accurate DUE TO THE FACT THAT TV HAS A DIFFERENT “ASPECT RATIO” THAN DIGITAL CINEMA – IT HAS NOTHING TO DO WITH THE RESOLUTION NUMBERS BEING “REVERSED”.

      Therefore, since TV and Digital Cinema have different “ASPECT RATIOS”, when you transfer the movie from Digital Cinema to TV you have to either: A. crop out the sides of the movie, reducing the horizontal resolution, or B. letterbox the movie (place “black bars” at the top and bottom), reducing the vertical resolution.

      If you crop the movie, the movie now has a different horizontal resolution (3.8K) than the Cinema’s 4K resolution. THIS is the reason that so many people argue and disagree with using the 4K moniker for TV. IT HAS NOTHING TO DO WITH USING THE “HORIZONTAL” RESOLUTION VS. USING THE “VERTICAL” RESOLUTION. This is a completely separate issue.

      As to your comments about 1080p being 1K and 2160p being 2K. (Now read this slowly and carefully). 1080p HAS NEVER BEEN REFERRED TO AS 1K, NOR HAS 2160p EVER BEEN REFERRED TO AS 2K. That’s not how it works.

      1080p has sometimes (incorrectly) been referred to as 2K (based on its HORIZONTAL resolution of 1920 and NOT its vertical resolution of 1080). However, this, like the 4K designation, caused a lot of arguments and disagreements BECAUSE OF THE DIFFERENCE BETWEEN DIGITAL CINEMA’s “TRUE” 2K (2048 pixels) HORIZONTAL RESOLUTION VS. TV’s HORIZONTAL RESOLUTION OF ONLY 1920 PIXELS. The disagreements had NOTHING to do with the use of the horizontal resolution (1920) vs. the use of the vertical resolution (1080) – even though this did cause a lot of confusion (as you well know), since the vertical resolution is what has “traditionally” been used to describe a TV’s resolution.

      As I already stated above, the 2K and 4K monikers are simply “CARRY-OVERS” from Digital Cinema’s way of expressing a movie’s resolution. The 2K and 4K monikers are STILL based on the HORIZONTAL resolution and NOT on the vertical resolution (even for TV). This has ALWAYS been the case. That is why a TV resolution of 1920 x 1080 is said to be 2K (even though it’s not 100% accurate) and a 3840 x 2160 resolution is said to be 4K (again not 100% correct). However (I’ll say it one more time in case you were not listening the first 10 times), they are “carry-overs” from Digital Cinema’s HORIZONTAL resolution, which is why they are not 100% accurate.

      Just one more thing about resolution being either horizontal or vertical: A TV display (and a movie screen, for that matter) is TWO-dimensional, NOT one-dimensional. Therefore, to arrive at the ACTUAL resolution of the display, you NEED to multiply the horizontal resolution by the vertical resolution. This, of course, gives you the total number of pixels on a given display (which is the panel’s resolution).

      Anyways, enough about this now.

      Richard

      • Richard

        I forgot to add.

        When we speak of 480 (DVD), 720 (HD), 1080 (Full HD), or 2160 (Ultra HD), we are talking about horizontal LINES of resolution, not pixels. For instance, 1080i (interlaced) or 1080p (progressive) has 1,080 LINES with 1,920 pixels on each line, for a total of 2,073,600 pixels.

        Richard

        • OmarF

          This is dead wrong. These designations apply to how many VERTICAL pixels there are in an image, regardless of how long the lines proceeding from them are on the horizontal plane. As an extreme example to make the point: one could stack 480 pixels vertically and have no other pixels running horizontally, for a vertical resolution of 480, and it would be denoted 480 x 1. This would still be a 480 display. It is the number of pixels stacked vertically which determines those resolutions. I certainly hope you’re not working in the film industry.

          And please stop speaking to me in a condescending tone of voice; spend some time understanding a simple X/Y coordinate graph, first.

          Omar

          • Richard

            Omar,

            I apologize if my post sounded “condescending”.

            As to my comment on resolution, I was simply contrasting a display’s horizontal resolution and vertical resolution with the TOTAL number of pixels of the display.

            Yes, I know that the resolution is always spoken of as either the vertical or the horizontal number of pixels. I guess I should have been more clear in that post.

            From Wikipedia:

            Quote:

            “The display resolution or display modes of a digital television, computer monitor or display device is the number of distinct pixels in each dimension that can be displayed. It is usually quoted as width × height, with the units in pixels.

            One use of the term “display resolution” applies to fixed-pixel-array displays, and is simply the physical number of columns and rows of pixels creating the display (e.g. 1920 × 1080), which does not tell anything about the pixel density of the display on which the image is actually formed: broadcast television resolution properly refers to the pixel density, the number of pixels per unit distance or area.

            In digital measurement, the display resolution would be given in pixels per inch (PPI). In analog measurement, if the screen is 10 inches high, then the horizontal resolution is measured across a square 10 inches wide. This is typically stated as “lines horizontal resolution, per picture height.”

            End quote.

          • Richard

            Forgot to say,

            You said, quote: “This is dead wrong.” in regards to what I said about resolution.

            I’m sorry, but no, everything I said was 100% correct. You simply misread it or misunderstood what I meant.

            But again, I apologize for my “condescending tone”.

            Sorry about that.

            Richard

          • Richard

            Just to add,

            A 2K (2048 x 1080) Digital Camera is referred to as a 2.2 megapixel camera because it has roughly 2.2 million pixels (actually 2,211,840 pixels).

            A 4K (4096 x 2160) Digital Camera is referred to as an 8.8 megapixel camera because it has roughly 8.8 million pixels (actually 8,847,360 pixels).
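
            For anyone who wants to double-check the arithmetic in this thread, here’s a quick sketch in plain Python (the resolutions listed are simply the common ones under discussion, not specs of any particular camera or TV):

                # Total pixel counts for the resolutions discussed in this thread.
                resolutions = {
                    "HD TV (1080p)":     (1920, 1080),
                    "UHD TV ('4K')":     (3840, 2160),
                    "Digital Cinema 2K": (2048, 1080),
                    "Digital Cinema 4K": (4096, 2160),
                }
                for name, (w, h) in resolutions.items():
                    total = w * h
                    print(f"{name}: {w} x {h} = {total:,} pixels (~{total / 1e6:.1f} MP)")

                # UHD has exactly 4x the pixels of 1080p: double on each axis.
                assert 3840 * 2160 == 4 * (1920 * 1080)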

      • OmarF

        I have no disagreement with your point that home TV and digital cinema have different aspect ratios, which invariably leads to differences in horizontal and vertical pixel counts between the two media, and thus disagreements on what should be called what. I get it, and those disagreements are not significant to me. Others may argue that home 1080×1920 is not “real” 2K (2048×1556) because it’s missing a couple hundred pixels per axis…that’s not my issue at all. My issue is why a piece of media or playback device should be called 2K because its horizontal resolution is near or around 2000 pixels. More on this in a moment…

        Here is the kernel of our contention. You write:
        “As I already stated above, the 2K and 4K monikers are simply “CARRY OVERS” from the Digital Cinema’s way of expressing a movie’s resolution. The 2K and 4K monikers are STILL based on the HORIZONTAL resolution and NOT on the vertical resolution (even for TV). *****This has ALWAYS been the case. That is why a TV’s resolution of 1920 x 1080 is said to be 2K***** (even though it’s not 100% accurate) ”

        [asterisks inserted for emphasis]
        WHOA THERE! Never, ever, ever have I heard of a 1080p set being referred to as 2k – slightly inaccurately or not – until recently. They have ALWAYS been referred to as 1080p, denoting their vertical resolution, and in the terms 1080 x 1920. [Just as 480 was 480 x 720, and 720 was 720 x 1280.] They were always considered to possess 1000(ish) lines of (vertical) resolution. And each multiplication of the vertical resolution would determine the next level of resolution. Thus, 2160 of vertical resolution would make for a 2K television, and so on. I UNDERSTAND THAT THIS IS DIFFERENT FOR DIGITAL CINEMA PRODUCTION, but in the world of televisions and the home market, this was the understanding until recently.

        More to the point, why has one format (home TV) always been “monikered”, as you say, by its number of vertical pixels, while another format (digital cinema) uses horizontal pixel counts? And why is the new crop of TVs being defined by the latter rather than the former? A television should be defined by television standards; a digital master, by digital cinema standards. We’d like both worlds to match up, but in the case that they don’t, there’d better be some explanation given, especially if you’re going to start using one set of standards to co-opt another’s!

        If the world of digital cinema wants to turn 60 years of broadcast convention on its head and agree upon the “monikers” 2K, 4K, 8K based on the horizontal resolution of the images they capture, so be it. But this “carry over” is an entirely new thing for televisions, and seems to have arrived only now with the introduction of “4K” televisions, with NO formal declaration to consumers that this is being done. It would be like the auto industry measuring a car’s width by the distance between the left and right front tires for a hundred years, then suddenly changing to a front-to-back axle measurement without telling anyone that’s what they were doing. “Last year’s Prius was 4′ wide, but this year’s is 7′ wide!” Why the change, sir? “Well, on the production line the cars run through sideways and it’s more useful for us to discuss their width from front axle to back axle…but that’s the way it’s ALWAYS been, so we just decided to carry our little moniker over to the advertising and promotional materials without telling anyone…”

        Back to the point of digital cinema. I believe one reason for the use of horizontal resolution becomes apparent in the quote from the Editor’s Guild that I posted earlier:

        “Scanning film takes time, and time is money. The result is that filmmakers and vendors must make choices about how much data is scanned from each frame. This number, the scan resolution, influences the economics of the entire process. Scans are measured in thousands of pixels of horizontal resolution. One “K” means 1,024 pixels. A full-aperture “4K” scan has 4,096 pixels horizontally, and 3,112 pixels vertically. 4K is the current gold standard,”

        Money and cost may have driven it. Especially considering the limitations of storage space and processing power ten years ago, I can see why they would choose to express “resolution monikers” in terms of horizontal resolution over the traditional vertical: because the horizontal number is significantly bigger, and it sounds like a “High(er) Definition” archive/master/DI is being created. Do I want to tell the guys who produced and directed this 100 million dollar movie that I’ve archived their film in 1080p, or that I’ve done it in 2K? Forget for a moment slight differences in aspect ratio and a couple hundred pixels this way or that… What makes better business sense? What sounds like better future-proofing? Saying something is archived in 1080p, home television resolution (or thereabouts), or ostensibly double that, “2K”?

        Another possibility might be something as innocent as carrying over the means of measuring film width, 35mm for example, being 35mm wide, into digitization standards. In other words, when 35mm was first being digitized, it would have been logical to refer to the digital copy by its width in pixels, just as the film was referred to by its width in mm.

        Perhaps both factors had a play in the use of horizontal resolution as a measure for digital cinema.

        Regarding your point about “actual resolution”, yes, I agree. To determine the full resolution is to count the total number of pixels. To discuss the resolution of a device or piece of media, we use references such as vertical or horizontal resolution. I simply say everything should be above board and consistent.

        • Richard

          Quote: “Never, ever ever have I heard of a 1080p set being referred to as 2k –slightly inaccurately or not–until recently. They have ALWAYS been referred to as 1080p, denoting their vertical resolution, and in the terms, 1080 x 1920.”

          True. I agree.

          It wasn’t ’til the “4K” moniker started being used for 2160p TV resolution that the 2K moniker started being thrown around. “HD” was first used to denote the 720p resolution and later the 1080i/p resolution (which is sometimes called “Full HD” or “FHD”).

          This is one of the reasons that people argue that 2160p TVs (there is no 2160i since interlace is no longer used) should simply be referred to as “Ultra HD” or “UHD”. However, this is not 100% accurate either since the UHD specs include both the 3840 x 2160 AND the 7680 x 4320 (inaccurately referred to as 8K) resolutions.

          Just sayin’ 🙂

          Note: Again, as Josh, I, and the Wikipedia article I quoted above have all said, the horizontal resolution (the larger number) is always listed first for both Digital Cinema and TV resolutions (e.g. it’s 1920 x 1080, not 1080 x 1920).

          Richard

        • Richard

          Omar,

          You said, quote: “Thus, 2160 vertical resolution would make for a 2K television and so on. I UNDERSTAND THAT THIS IS DIFFERENT FOR DIGITAL CINEMA PRODUCTION, but in the world of televisions and the home market, this was the understanding until recently.”

          Again, I’m afraid that is incorrect.

          A TV with a VERTICAL resolution of 2160 would have a HORIZONTAL resolution of 3840 (3840 x 2160), and therefore would be considered a 4K TV not a 2K TV (even though 3,840 pixels would in fact only be 3.75K instead of “true” 4K which is 4,096 pixels).

          Again, the “4K” designation ALWAYS refers to the horizontal resolution NEVER to the vertical resolution regardless of whether you are talking about Cinema resolutions or TV resolutions. This is separate from the 1080p or 2160p which, as you pointed out, are in fact based on a TV’s vertical resolution.

          I’m not sure how else to put this. The “4K” designation was simply “copied” or “borrowed” from digital cinematography and (even though not 100% accurate due to the TV’s 16:9 aspect ratio being different than a cinema’s aspect ratio) applied to TV’s 3840 horizontal resolution. But it NEVER has been used to refer to a TV’s vertical resolution of 2160.

          I hope this clarifies it.

          Richard

        • Richard

          Omar,

          You asked, “Why are the new crop of TVs being defined by their horizontal rather than their vertical resolutions?”

          You’re right, it does get pretty confusing. I think this was purely a “MARKETING” decision. What all the reasons behind it were… well, your guess is as good as mine.

          Maybe because it’s easier to say 4K than 2160p and to say 8K than 4320p; maybe because the numbers were simply getting too large, too numerous, or too hard to remember. Some have suggested that since a 2160p TV has exactly 4 times the total number of pixels of a 1080p TV, it would make it easier to market the new TVs (e.g. 4K means 4 times the resolution – but not really). Maybe all of the above. Who knows?

          You also have to remember that when 720p TVs and 1080i/p TVs came out, almost all movies were still being shot on film. Therefore, this whole “Digital Cinema resolution vs. TV resolution” debate did not exist at that time.

          Richard

          • OmarF

            Ah, at long last, we are friends 🙂

            You’re now touching on the issue I have with the nomenclature for TVs as they have changed.

            I’m sure you see my point, that had TVs proceeded down the same “naming path”, this crop of “4K” televisions would’ve logically next been called 2K. But rather than follow the ascending path of designation by vertical resolution, they took a sharp turn and started using as you say the naming sequence from digital cinema.

            To me, this was a way of giving people the idea that they’ve made a greater leap in TV tech than they really have. Perhaps it was also from the desire to standardize the nomenclature, so that people won’t be confused by talk of playing 2K movies on their 1080p TV, or the like.

            Anyway, thanks for the lively discussion. I have a better understanding of how digital and digitized cinema are being handled.

            Omar

    • Richard

      Quote:

      You said, “When movies were shot and watched on film, we never spoke of resolution.”

      That’s because “Film” doesn’t have “Pixels”. Period.

      Richard

  2. Trond Michelsen

    “So this statement admits that the UHD TVs everyone is running around buying today, are indeed, 2160p—double the (vertical) resolution of 1080p TVs.”

    I don’t think anyone has ever claimed otherwise.

    4k = UHD = 2160p = 3840×2160

    What, in your opinion, is the exact horizontal and vertical resolution of a 2K digital intermediate scan? The Editor’s Guild Magazine seems to think it is 2048×1556:

    http://www.editorsguild.com/v2/magazine/Newsletter/MayJun02/digital_intermediate.html

    • OmarF

      Hey Trond,

      My point is that UHD is not 4K, as everyone seems to think. It’s 2K. People have simply been led to believe it is 4K by marketing strategy.

      1080p is 1K, or (approximately) 1000 lines of vertical resolution, and is expressed as 1080 (vertical) x 1920 (horizontal).
      2160 x 3840 is 2K, approximately 2000 lines of vertical resolution.
      True 4K would be, 4320 x 7680, or 4 times the vertical resolution.

      Television resolution has always been expressed as Vertical x Horizontal (like 2160 x 3840, or 1080 x 1920). Suddenly, now that “4K” TVs are being marketed, the expression of the resolution has been flipped to read Horizontal x Vertical, so the much larger number comes first. Then that larger number is being used to define the resolution of the image. Just as you said, 3840 x 2160. But the relevant piece of information is that 2160 lines of vertical resolution. This is double the vertical resolution of 1080, and thus is actually 2K, not 4K. We do not have actual 4K televisions. We have 2K televisions, but people are being led to believe they have something of much higher resolution because of clever marketing.

      Omar

      • Josh Zyber
        Author

        The idea of televisions being measured in lines of resolution is a very outdated analog concept. I’ve never personally heard of “HD” resolution being referred to as 1080 x 1920. It has always been 1920 x 1080, just as the aspect ratio is expressed as 16:9, not 9:16.

        With that said, you have a very valid point that HD was always marketed as “1080p,” never as “1920p.” To suddenly switch to using the horizontal number is indeed misleading, and probably deliberately so to make it seem like a bigger jump than it actually is.

        • Richard

          True.

          Except that it does make it easier to express and understand that 1080i (interlaced) is 1,080 lines of resolution drawn/displayed in an “interlaced” manner, whereas 1080p (progressive) is 1,080 lines of resolution drawn/displayed in a “progressive” manner. Therefore, it would get kind of confusing to refer to it as 1920i and 1920p.

          If I’m not mistaken (feel free to correct me), the reason the vertical resolution was used rather than the horizontal resolution has to do with “scan lines” of resolution, that is, the way a TV renders a picture on its display.

          Again, as I explained in my previous response to Omar above, “film” doesn’t have a set resolution given that it doesn’t use pixels. Therefore this did not apply to film at all.

          P.S. Be sure to read my previous responses to Omar above.

          Richard

          • OmarF

            Hey Richard, trying to figure out what you mean here, exactly. TVs do not render pictures horizontally; they render vertically.

            Neither 1080i nor 1080p is a horizontal resolution; they are both a vertical resolution of 1080. The “scanning”, or drawing, occurs top to bottom, not across. So in a 1080 monitor or TV, there are 1080 lines of pixels stacked top to bottom (vertically) and 1920 running left to right (horizontally). When the TV scans, it draws the image from top to bottom, vertically. Hence, probably, why TV resolution was always spoken of in terms of vertical lines.

            To simplify, I’ll try and make a diagram.

            ———————————- (line 1)
            ———————————- (line 2)
            ^
            |
            | (1080 lines running top to bottom, vertical resolution)
            |
            v
            ———————————– (line 1079)
            ———————————– (line 1080)

            Those 1080 lines are scanned by the TV, top to bottom. Thus they are vertical resolution.

            Likewise, each one of those dashes I used to make the lines is in a column running left to right, and thus are horizontal resolution, like this:

            1 2 3 4 5 1919 1920
            – – – – – – – – – – – – – – – – –
            (1920 columns running across, left to right. Horizontal resolution)

            PS, I never said film had a vertical or horizontal resolution, nor did you mention it in your reply that I saw. We were discussing horizontal and vertical resolution of digital cinema and home television.

          • Richard

            Omar,

            Quote: “When the TV scans, it draws the image from top to bottom, vertically.”

            Yes, you are correct. Thank you for catching that.

            I meant to say “1,080 lines of VERTICAL resolution” in my previous post.

            Unfortunately, I could not go back and correct it.

            Richard

        • OmarF

          Thanks for your reply, Josh.

          For myself, until recently I’d always heard HD resolution expressed as 1080 x 1920, because primacy was put on the vertical resolution, so it was listed first. The same went for 480 x 720, etc. This is, as you point out, at odds with the way we express aspect ratio. But those are also two different topics.

          Be well,
          Omar

    • OmarF

      Regarding 2K digital intermediate scans, I really don’t know. I’ve not taken a close interest in following the current trends in digital cinema photography, and that really isn’t where my issue is. It’s with the marketing of the newer TVs. I first noticed the issue a few months ago, when I was shopping for a 4K TV. Just glancing at the info card next to the TV told me that somewhere in the past five years or so (I’ve been using a 1080p projector for quite a while and never followed up on the newer equipment until now), there’s been a dramatic marketing change. Honestly, I expected to see a true 4K vertical resolution on the set. It was a bit of a shock to see a 2K television being listed as 4K, and I pretty much got the idea right there of what was going on when I saw the way they’d listed its horizontal resolution first. Something that was not done previously.

      [I still bought the TV because it looks damn good. Great black levels, bright, very accurate grays and colors…and I don’t have to deal with pesky bulbs. By sitting a couple feet closer, I can get the same relative size as my projector screen.]

      As I mentioned in my past several posts, what shocks me the most is that even people in the know seem to be going along with this shell game. In the article you quote, they write:

      “Scanning film takes time, and time is money. The result is that filmmakers and vendors must make choices about how much data is scanned from each frame. This number, the scan resolution, influences the economics of the entire process. Scans are measured in thousands of pixels of horizontal resolution. One “K” means 1,024 pixels. A full-aperture “4K” scan has 4,096 pixels horizontally, and 3,112 pixels vertically. 4K is the current gold standard,”

      This is obviously the standard in digital post production now, how or when it got to be this way is a question that interests me. Is it only recently that horizontal resolution is used to determine the image resolution, or has it been that way from the beginning? Has the reference point been changed to horizontal resolution because it gives the impression that a more detailed intermediate or archival is being made? I know I would rather have “a gold standard 4K” master of my film if I were a director.

      One thing I know for certain is that televisions were always rated by vertical resolution, and that this has been co-opted by the terminology used for current digital post-production, which even my detractor Richard said is not the same thing. So why are they being treated that way? Why are we being sold TVs with 2160 lines of 2K resolution and being told they are 4K? I doubt it’s an error or happenstance. Marketing and advertising people get paid a LOT of money for their jobs, and what appeals to the market is their top concern.

      Anyway, have a good one,
      Omar

      • Richard

        Omar,

        Quote: “Why are we being sold TVs with 2160 lines of 2K resolution, and being told they are 4K?”

        Like I already explained in my previous posts, we are NOT being told, and have NEVER been told, that “2160 lines of 2K resolution” are 4K (as you stated above). You are mistaken.

        This is simply your misunderstanding of where the 2K and 4K designations originated (as I explained in my previous posts above in great detail). 2,160 lines of [vertical] resolution DOES NOT equal 2K resolution. 2,048 pixels of horizontal resolution equals 2K (where one “K” means 1,024 pixels as you quoted above and I mentioned in a previous post) [Cinema] resolution (2 x K (1,024 pixels) = 2,048 pixels).

        Richard

  3. Chapz Kilud

    This has been discussed many times, and the CNET article summed it up best:

    http://www.cnet.com/news/why-ultra-hd-4k-tvs-are-still-stupid/

    But I wanted to point out that some people claimed they could see a night-and-day difference between 4K and 1080p. They were probably right, because they were situated between the red solid line and the purple solid line on the Carlton Bale chart – the area where the “benefit of 4K starts to become noticeable”. But I’d argue that one could not tell the difference between 4K and Blu-ray upscaled to 4K at those distances. For the tiny details in 4K to be visible to the human eye, you have to be at the solid purple line or closer. So for an 80″ you have to be sitting at 5′ or closer, or else you cannot see the additional detail offered by 4K.

    The exact analogy is upscaling DVD. It looks better than plain DVD at any distance, and that’s where people claim to see a big difference between 4K and 1080p. But upscaled DVD doesn’t offer the extra detail provided by Blu-ray. If you compare Blu-ray and upscaled DVD, then you’ll see the extra detail from Blu-ray. For that to happen, you only need a 40″ TV viewed from 5′ away. Most of us have a TV bigger than that and sit closer than that distance, so we can fully reap the benefit of 1080p – something you can’t do with 4K because of its steeper requirements.
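
    Here’s a rough check of those distance figures, using the same one-arcminute rule of thumb for 20/20 vision that charts like Carlton Bale’s are built on (the acuity figure is an assumption, so treat the outputs as ballpark numbers):

        import math

        def max_detail_distance_ft(diagonal_in, horizontal_pixels, aspect=16/9):
            """Farthest distance (feet) at which one pixel still spans ~1 arcminute."""
            width_in = diagonal_in * aspect / math.hypot(aspect, 1)
            pixel_in = width_in / horizontal_pixels
            return pixel_in / math.tan(math.radians(1 / 60)) / 12

        print(f'80" 4k UHD: ~{max_detail_distance_ft(80, 3840):.1f} ft')  # ~5.2 ft
        print(f'65" 4k UHD: ~{max_detail_distance_ft(65, 3840):.1f} ft')  # ~4.2 ft
        print(f'40" 1080p:  ~{max_detail_distance_ft(40, 1920):.1f} ft')  # ~5.2 ft

    Both of the 5-foot figures above (80″ at 4K, 40″ at 1080p) fall out of that same calculation.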

    • OmarF

      You are very correct. Joe Kane did a fair amount of testing on this in his home by inviting people over to view test images with built-in artifacts at various distances and report what artifacts they saw, if any. I don’t recall the exact distances he reported, but he was using a 10′-wide projector screen, and he said that was the minimum necessary at some common sitting distance, perhaps >8′. (I personally sit about 6′ from my 65″ UHD to get the benefit of the extra resolution.) He gave a long interview that should still be available on AVS about all the flaws in the current implementation of “4K”. As I said previously, I find my 4K set to be an excellent doubler for my Blu-rays. With less space between pixels and double the pixel density, I don’t get any “more” information than what is on the Blu-ray, of course, but the presentation is denser and richer to the eye.

      Omar

      • Chapz Kilud

        Even 6′ from a 65″ is too far to get the full benefit of 4K. Without a doubt, at that distance you can distinguish 4K from 1080p side by side. Between 4′ and 8′, a person with 20/20 vision should begin to see the difference. To see the extra detail you have to be at 4′ or closer, which is uncomfortably close for a 65″.

        I always tell people to use Samsung’s 4K split-screen demo. With a 65″, you can see the difference even from 8′ away. There was one scene in a library with bookshelves and books. Up close you can see book titles, detailed book-spine textures, etc. But when you move away to sit at the home theater recliners, you cannot see those details. The pictures do look better and sharper, but I’m willing to bet that’s what it would look like if you substituted Blu-ray upscaled to 4K for the 4K material.

        This brings up a very interesting question: can 4K Blu-ray survive? If you’re building up a 4K library whose extra detail you cannot see, and the pictures are indistinguishable from regular Blu-ray upscaled to 4K, then the money is better spent elsewhere. Given how relevant 3D is, 4K will face even bigger challenges.

          • Chapz Kilud

            Resolution is dead last in its contribution to picture quality. The most important factors are contrast and artifacts. With other things being equal, a TV with better contrast will always look sharper. So when people compare TV #1 with 4K and TV #2 with 1080p, there are too many factors in play. But with the same TV in split-screen mode, everything is almost equal and it’s a better comparison. Regarding simulation, it’s still a very valid test for the eyes. 4K offers extra details/information; the only reason 1080p can’t have it is because the details are too small for its resolution. If you cannot see those details at a normal sitting distance, then you’re better off keeping your existing Blu-ray collection and using an upscaler on a 4K TV.

        • Richard

          Chapz,

          You said: “This brings up a very interesting question. Can 4K Blu-ray survive? If you’re building up a 4K library that you cannot see the extra details offered, and the pictures are indistinguishable from the regular Blu-ray upscaled to 4K, then the money is better spent elsewhere.”

          Well if resolution alone was the only “upgrade” from regular 1080p Blu-rays, then I’d agree. However, a simple increase in resolution is thankfully NOT the only thing that the new Ultra HD Blu-ray format brings to the table.

          (See my next post below for more on the extra features of the UHD format.)

          Richard

      • Richard

        @ Omar and @ Chapz

        I also have a 65″ 4K/UHD TV (Samsung SUHD 65JS9000).

        Great TV! 🙂

        I agree with both of you that at more than say 8-10 feet it’s very hard to see a difference in resolution.

        Thankfully resolution is only 1 aspect of the new UHD spec (and probably the least important one at that).

        Things like HDR (High Dynamic Range), WCG (Wide Color Gamut) and Deep Color (10 bit color depth) are far more important and are a much bigger improvement than a simple increase in resolution.

        And, those things ARE noticeable from much greater distances than resolution.

        Richard

        • Chapz Kilud

          I believe Joe Kane also listed the factors contributing to picture quality in the order of importance. But here is the article I found since I couldn’t find his: http://www.cnet.com/news/three-tv-improvements-more-worthwhile-than-ultra-hd-4k/

          As you can see, compression is second on the list. I don’t know the spec of 4K Blu-ray. I was told they have a similar compression ratio, but I was also told the 4K Blu-ray codec is more efficient (more compression?). I don’t know. As for color, you’re correct; it happens to be third on the list, above resolution. However, your TV isn’t going to display all the colors, because some are impossible using the current TV infrastructure. So how much improvement in color will be realized remains a big question. Don’t forget that unless you’re sitting at the Carlton Bale distance for 4K, it’s impossible to see the extra detail and added benefit of 4K. Without a doubt you will notice a better picture, but that’s mostly a sharpening/smoothing effect.

          We’ve been through this before with audio. Those higher bit depths and sampling rates are just a gimmick; that’s why SACD and DVD-A both flopped. Even in the camera world, manufacturers have been pushing for more pixels, but when you pack in so many pixels without increasing the size of the sensor, the result is more noise. That’s why the Fuji F-31FD is so highly sought after: even though it’s a 6MP P&S, it has a large sensor and produces DSLR-like pictures. Most 12MP or 16MP P&S cameras don’t come close. But that’s what TV manufacturers like to market: more resolution. It’s easier to sell, even though there are better (but more expensive) ways of improving picture quality. I know there is a limit to hearing. I also know there is a limit to vision. I think we’re getting pretty close with what we have.

          • Richard

            Hi Chapz,

            You DO realize that the Cnet article you linked to is from Oct 2012 right?

            Just sayin’ 🙂

            I agree with him that Contrast is the single most important aspect of picture quality. Which is where HDR (High Dynamic Range) comes in.

            HDR provides a significant increase in contrast ratio, and so much more.

            HDR is probably the single most important and exciting advancement in picture quality in the last 10 to 15 years, and single most important part of the new UHD (Ultra High Definition) standard.

            In essence, HDR is about brighter whites and deeper blacks, with more detail at each end. It is about brighter, more detailed “specular highlights” and darker blacks with more “shadow detail”. And that IS very noticeable – even at a distance.

            Additionally, UHD, which of course includes HDR, is also about more vibrant, richer, and more saturated colors. It is about more shades of color (Color Depth) and a wider color palette (Color Gamut). It is about trying to reproduce the world around us on a display as accurately, realistically, and with as much detail as possible.

            Chapz, you said:
            “However, your TV isn’t going to display all the colors because some are impossible using current TV infrastructure. So how much improvement in color will be realized remains a big question.”

            I don’t know what TV you have, however, the Samsung JS9000 I own has a 10-bit panel. Unlike an 8-bit panel, which can display up to a maximum of “only” 16.7 million colors and shades, a 10-bit panel can display up to 1.07 BILLION colors and shades. “So how much improvement in color will be realized?”, you ask. Answer: A HUGE IMPROVEMENT!!!

            As to the first part of your quote above, “some [colors] are impossible using current TV infrastructure” – the JS9000 can display up to 93% of the DCI-P3 (Digital Cinema Initiatives) color space. In other words, it can display practically all the colors Digital Cinema can display. Some 2016 TVs will be able to display 100%+ of DCI-P3. Television HAS already caught up to Digital Cinema as far as its ability to display an extremely wide range of vibrant colors.

            You said:
            “Without a doubt you will notice better picture, but that’s mostly sharpening/smoothing effect.”

            Hmm… NO. That statement is simply just plain WRONG.

            Now as to your concerns regarding the new Ultra HD Blu-ray player and “compression artifacts”: when have you ever watched a recent, quality Blu-ray movie on a good quality TV and thought, “Man, this movie just looks absolutely awful. Look at all those compression artifacts”? Do you honestly think that a (recent) UHD Blu-ray movie is going to look “bad” (or worse than the same movie on regular Blu-ray) or be full of “compression artifacts”? Really???

            Again, what the new UHD Blu-rays bring to the table is a lot more than an increase in resolution and more efficient compression algorithms, it includes, like modern high-end UHD TVs, HDR, WCG, and Deep Color support. And I believe it will be a VERY noticeable improvement on the current Blu-ray discs.

            I’ll be buying one. Can’t wait. 🙂

            Richard

          • Chapz Kilud

            I couldn’t find Joe Kane’s article, so I had to find something similar. I read the review of your Samsung. If what you said were true, then it should have easily shattered the record for darkest blacks set by the Panasonic VT plasmas. But it didn’t. I thought the improvement was significant? It should have beaten the plasma’s contrast, but it didn’t. Maybe when 4K Blu-ray finally comes out, its UHD specs will shine. Having it on the UHD standard doesn’t mean much. Do you think a $300 Seiki UHD could beat a Panasonic plasma? Not even close. Also, we’re way past the point of diminishing returns. Human eyes cannot detect that tiny a variance in shade, just as human ears cannot tell the difference between 16-bit and 24-bit audio. Again, you have to sit extremely close, right up on the purple line of the Carlton Bale chart, to see everything you claim you’ll be seeing.

            The difference between the marketing for 4K Blu-ray and 4K TV is that manufacturers will slowly move to 4K TV, and perhaps 1080p will become hard to find just as 720p is today. I doubt anybody will stop making 1080p, but let’s say it’s possible. I’m not so sure studios will stop releasing standard Blu-ray, though – DVD hasn’t died yet. So there is a real danger 4K Blu-ray may become irrelevant, just as 3D Blu-ray is already suffering from a shrinking market. I don’t think a lot of people are going to invest in a new library of movies. I own over 2000 Blu-rays. I know it makes no sense to start another library.

          • Richard

            Hi Chapz,

            You said:
            “I read the review on your Samsung. If what you said was true then it should have easily shattered the record for darkest black set by Panasonic VT plasmas. But it didn’t. I thought the improvement was significant? It should have beat the plasma’s contrast, but it didn’t.”

            I have no idea which review you read (there have been many), however, some of the reviews I read did say that the Samsung JS Series TV WAS superior to most Plasma TVs.

            Plasma TVs, as you well know, use a completely different technology than LCD TVs (they do not have a backlight). This of course enables them to achieve almost (but not quite) perfect black levels. In other words, you are comparing apples and oranges here.

            That said, modern high-end TVs, such as Samsung’s JS Series, are capable of displaying very close to plasma blacks. Unless you have them side by side in a dark room, you can barely tell the difference in their black levels anymore.

            Now when you speak of “contrast ratio”, black levels are only one half of the story. You cannot talk about the contrast ratio without mentioning the peak brightness as well. Modern LED LCD can get extremely bright – far brighter than Plasma TVs – which is very important when displaying HDR content.

            Speaking of HDR, Plasma TVs don’t do HDR. They don’t display 4K/UHD resolutions either. Plasma TVs, as I’m sure you already know, are DEAD; they no longer make them. Therefore, it is kind of pointless to compare modern LCD TVs to Plasma TVs. You can always compare them to OLED TVs if you want though.

            Now as far as OVERALL picture quality is concerned, I know quite a few former Plasma TV owners who have said that the picture on their new high-end LED TV totally “blows their old Plasma TV out of the water”. I, like them, also believe the “overall” picture quality on the new “flagship” TVs to be superior to the picture on a Plasma TV.

            However, feel free to disagree.

            🙂

            Richard

          • Richard

            Chapz,
            You said:
            “Also we’re way past the point of diminishing returns. Human eyes cannot detect that tiny of variance in shade, just as human ears cannot tell the difference between 16-bit and 24-bit audio. Again you have to sit extremely close right up on the purple line in Carlton Bale scale to see everything you claim you’ll be seeing.”

            I CAN see a significant difference in picture quality, especially with UHD HDR WCG content. I CAN see much more detail; more vibrant, vivid colors (as well as “new” colors and/or shades of color); and a significant increase in contrast ratio and dynamic range. The differences ARE very substantial.

            Obviously you haven’t had the chance yet to see HDR WCG content on one of these new high-end TVs if you’re claiming that you can’t see a difference or that you have to sit extremely close to the screen to see it (you don’t).

            Richard

          • Richard

            Chapz,

            You said:
            “…manufacturers will slowly move to 4K TVs, and perhaps 1080p will become hard to find, just as 720p is today. I doubt anybody will stop making 1080p, but let’s say it’s possible.”

            You can still purchase a 720p TV? I haven’t seen one of those in a long time. They must only be available in the very small sizes (32″ or less, maybe?).

            Manufacturers are NOT “slowly” moving to 4K/UHD TVs as you said above; they have ALREADY moved to 4K/UHD TVs. I don’t think they showed a single HD/1080p model at CES this year (they did show a few 8K TVs, however). By next year, the only 1080p TVs you will be able to find are going to be small budget TVs.

            You doubt they will stop making HD/1080p TVs? They will most definitely stop making 1080p TVs (and in the very near future at that).

            Richard

          • Richard

            Chapz,

            You said:
            “So there is a real danger that 4K Blu-ray may become irrelevant, just as 3D Blu-ray is already suffering from a shrinking market. I don’t think a lot of people are going to invest in a new library of movies. I own over 2,000 Blu-rays. I know it makes no sense to start another library.”

            I agree that the new Ultra HD Blu-ray players and discs will cater mostly to a niche market. That said, physical media is not dead yet. It will still be around for quite some time.

            The biggest competition for UHD Blu-ray right now comes from streaming services. However, the quality of streaming media, despite many recent advances, still has a long way to go. Now add to that other factors such as internet/download speeds, data caps, etc. The bottom line is, people ARE still going to continue buying discs.

            Now, as for people’s “current” collections of DVDs and regular Blu-rays, the new UHD Blu-ray players will still be able to play them. And once people start upgrading their current TVs and players and start seeing the quality improvement, at least some of them (how many remains to be seen) will start purchasing the new UHD Blu-ray discs as well.

            Richard

          • Chapz Kilud

            There was an LED TV that produced darker blacks than the Panasonic VT plasmas (or any plasma, for that matter): the Sharp Elite from many years ago. It also had some of the best contrast ever measured on a TV. But it was super expensive to make; the 70″ model retailed for 8 grand. So I don’t doubt that an LED TV can outperform plasma given an unlimited budget. Plasma also suffers from deteriorating colors, so I don’t doubt that plasma owners find their LED TVs to be better years later.

            Sony stopped making 1080p XBRs a couple of years ago, but their mid-range lineup still has plenty of 1080p models. Same with Samsung. If you put your money where your mouth is, you’ll lose pretty badly, because there will be plenty of 1080p TVs around next year. If you ask people working in TV manufacturing, they will tell you that a 4K TV is more expensive to make than a 1080p TV, having four times the pixels. So 1080p will always be cheaper, everything else being equal.

            You’re assuming average Americans make as much as you do. Most families are living paycheck to paycheck. The last time I checked, the average TV screen size sold was 46″; it may have gone up this year, but I doubt it’s more than 50″. Most people still buy smaller TVs. They don’t have a big house or living room to put a 70″ or 80″ set in, and that’s never going to change. They’re going to save a few hundred bucks buying 1080p (and they’d be smart to do so), since 4K doesn’t offer much improvement at those sizes and viewing distances. Any manufacturer that plans to drop 1080p will lose that significant market segment; nobody will do it. So you can bet your house that 1080p will be around next year and the year after. Don’t forget that the best cable and over-the-air broadcasts can do today is 1080i.
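
            (Just to put numbers on the “four times the pixels” point, here’s a quick tally; the 46″ size is only an example.)

            import math

            def pixel_stats(h_px, v_px, diagonal_in):
                """Total pixel count and approximate pixels per inch for a 16:9 panel."""
                total = h_px * v_px
                ppi = math.hypot(h_px, v_px) / diagonal_in
                return total, ppi

            for name, (h, v) in {"1080p": (1920, 1080), "4K UHD": (3840, 2160)}.items():
                total, ppi = pixel_stats(h, v, diagonal_in=46)
                print(f'{name}: {total:,} pixels (~{ppi:.0f} ppi at 46 inches)')

            # 1080p: 2,073,600 pixels; 4K UHD: 8,294,400 pixels -- exactly 4x as many.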

            I haven’t seen 4K Blu-ray yet. The only thing I’ve seen was a simulated split-screen demo from Samsung. As I’ve said many times, I can still see a difference from far away, but I’m not getting the fine details at a normal viewing distance. It doesn’t offer very much. But to each his own. Some people may find the improvement justified by the added premium. All I can say to people is: make sure you’re seeing what you think you’re seeing before dumping your precious money into a new format.

          • Richard

            Chapz,

            You are correct. The Sharp Elite was by far the best LCD TV of its time (and for a few years after). It is still considered a reference set. However, the picture quality of the new flagship TVs has now surpassed it. This has been reported by a number of different owners and reviewers.

            I never said that 1080p TVs would no longer be available/sold next year – nor did I say that people would no longer be buying them. What I said was that they would mostly be available only in smaller screen sizes and that they would be “budget” TVs.

            By “budget TV” I didn’t simply mean “less expensive”. I also meant that manufacturers are putting all their best and newest tech in their UHD/4K TVs – and these are the TVs they are currently promoting and will continue to promote more and more over the next couple of years.

            That said, YES, I agree with you that there will most likely be plenty of 1080p TVs still available next year, and maybe even in 2018; however, they will slowly be “phased out”, just like 720p sets were.

            Oh and one more thing. With more and more companies jumping on board with 4K/UHD, the prices are starting to come down (and will continue to do so in the next couple of years). Companies such as Hisense and Vizio, for instance, are offering decent quality 4K TVs at much lower prices than the big name brands.

            Richard

          • Richard

            Here’s a quote from a 3/7/2014 article on LightReading.com called “Is 4K Ultra HD in Cable’s Future?”

            “During the Consumer Electronics Show (CES), exhibitors displayed new television models with much clearer pictures. The sets created a buzz despite questions about consumer demand, high prices, transmission capabilities and a lack of content for them.

            But this wasn’t 2014 [or 2015]. The year was 1998, and the new marvel was high-definition TV. The first HDTV sets hit the market that year, bearing a hefty retail price of about $8,000 [US], according to The New York Times.

            History appears to be repeating itself. 4K Ultra High Definition TV has arrived and is producing similar buzz, questions, and pronouncements…” (Emphasis mine)

            Does this at all sound familiar?

            Back then (in 1998), just like today with UHD TVs, there were countless people who said that they did not need or want an HDTV and that they would never buy one.

            Today, most people own at least one HDTV; some own several of them, including those who had emphatically stated, just a few short years before, that they had zero interest in these new fancy HD TVs and would never buy one. “I’m perfectly happy and content with my current CRT TV and VHS VCR.”

            There will always be “naysayers” who question, resist, oppose, or are skeptical about advances in new technologies.

            And there will always be those who embrace it with open arms.

            There will always be “early adopters”; and those who “show up late to the party”.

            But the bottom line is:

            Regardless of how you may feel about 4K/UHD (for or against; now or later; neutral), UHD (which, by the way, actually includes both 4K/2160p resolution and 8K/4320p resolution, and so much more) has arrived and is the future and next evolution of television.

            🙂 🙂 🙂

            Richard

  4. Pedram

    I heard someone say that for shots where there is no VFX, the studios will be using the original raw footage shot in 4k+ (as opposed to the 2k DI), so those shots will be in “real” 4k when put on a UHD Blu-ray. It seems kind of odd to me to use variable-resolution original footage to make a UHD master, but stranger things have happened. Anyone know if this is true?

    • Josh Zyber
      Author

      The original raw footage will be unedited and have no color grading. Seems unlikely to me, but I can’t say with certainty what the studios are doing. If they’re re-grading for HDR, they may need to make a new DI anyway.

  5. ray

    Josh, I’d like to point out that this article is a little misleading.

    Nowhere in your article does it mention that some of the 2k movies being remastered for UHD discs are also incorporating the 4k assets that are available. For ‘The Martian’, there is lots of 5k footage that was shot with the Red Epic Dragon/Scarlet, as well as a GoPro 4k camera. A Fox spokesman has commented that the 5k footage was used in the rebuild. He also stated that all Fox movies will use this same method.

    Also, Dolby has publicly said that they are starting with the raw files as well. So you have to believe that these processes for building a UHD movie will be used by all the companies, as it seems that’s the “common sense” way to build a UHD movie.

    Therefore, stating that every movie with a 2k DI is a 100% upscale is also 100% false.

    • Josh Zyber
      Author

      First, this article was published before the Fox spokesman made that statement.

      Regardless, the VFX for ‘The Martian’ were rendered at 2k. Any shot with CGI in it will come from the 2k DI, not from the raw camera data. In that movie, this means that any shot where you see Mars (you know, the parts of the movie people might expect to be visually impressed by) is 2k. Close-ups of potatoes or Matt Damon’s face might be redone in true 4k, maybe.

      The majority of movies in the initial launch wave were shot with 2k cameras anyway, making this argument moot.

  6. Dan Stanley

    I just want to thank you for this article. I don’t want to go as far as to say it changed my life, but it certainly changed my Home Theater life.

  7. So is there anywhere that maintains a list like the one you made? Wikipedia has a list like this for 3D (what was really shot in 3D, what was converted to 3D in post). Is there a list like this for 4k?

    ‘The Amazing Spider-Man 2’ – Shot on 35mm, with a 4k DI
    ‘Chappie’ – Shot in 5k, with a 4k DI
    ‘Hancock’ – Shot on 35mm, with a 4k DI
    ‘Pineapple Express’ – Shot on 35mm, with a 2k DI
    ‘Salt’ – Shot on 35mm, with a 4k DI
    ‘The Smurfs 2’ – Shot in 4k, with a 4k DI

  8. kelly anthony

    Why do the 4k movies that are true 4k not look as good as the demo videos we see on YouTube or in stores like Best Buy? Walk into any electronics store and the videos will blow you away. Buy a true 4k movie and it looks only a shade better than a standard Blu-ray. I have a Panasonic HDR-ready TV and the 4k Samsung player; it’s not worth the price of a 4k movie. Best to just stay with normal Blu-ray.

  9. Peter II

    Hey Josh, thanks for the very helpful article. Even though it has been nearly two years since you wrote it, it still seems to be one of the more informative pieces on the subject. I wonder if you can answer one question, though: Wouldn’t HDR actually better reflect the colors of movies originally shot on 35mm film?
