Now that 1080p is pretty well entrenched as the current home theater standard for HDTV and display resolution, the consumer electronics industry is already making moves to push us to new 4k screens. At a resolution of 4096×2304, that’s four-and-a-half times as many pixels as we’re using now. Are you willing to make this leap, or do you question the need for it?
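For anyone who wants to check that pixel math, here's a quick sketch comparing the 4k resolution cited above against 1080p (the variable names are just for illustration):

```python
# Pixel counts for the two resolutions discussed above
pixels_4k = 4096 * 2304     # the 4k resolution cited in the article
pixels_1080p = 1920 * 1080  # standard Full HD

ratio = pixels_4k / pixels_1080p
print(f"4k pixels:    {pixels_4k:,}")     # 9,437,184
print(f"1080p pixels: {pixels_1080p:,}")  # 2,073,600
print(f"ratio:        {ratio:.2f}x")      # ~4.55x, i.e. about four-and-a-half times
```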
Honestly, I have to question the need for it. The jump from standard definition to high definition (even 720p) was a revolutionary improvement in the quality of watching movies and television at home. The increase from 720p to 1080p may be less apparent on TVs smaller than 30″, but is still clearly beneficial on larger screens. Frankly, for the vast majority of viewers, 1080p is a sweet spot where video images are richly detailed without visible pixel structure, even on large screens. Beyond that, it becomes a case of diminishing returns. Aside from projection users with extremely large screens (at least 100″), the increase in resolution from 1080p to 4k will not be visible to the eye.
In fact, viewing the content currently available to us from Blu-ray, DVD, Netflix, broadcast TV, etc. on 4k screens will require that those video images be upconverted to the higher resolution, which will introduce the possibility of visible scaling artifacts. I would expect Netflix and standard-definition sources like DVD to fare particularly poorly in this regard. They’d likely look better on a same-size screen that requires less scaling.
Right now, consumers have no native 4k video sources available to watch. While that may change in the future (reportedly, Sony is hard at work on a new 4k format), we’ll still run into a problem of limited content availability. Most movies today that are either shot digitally or post-produced with a Digital Intermediate are permanently locked to 2k resolution or less. (Even ‘Avatar’ was shot with only 1080p cameras!) Very few movies are natively shot at 4k yet. Even among movies originally photographed on 35mm film, the majority have been transferred to video at 2k. Studios would need to rescan most of their old movies to create new 4k masters. Considering the sad state of catalog titles on Blu-ray now, can you really imagine a studio like Universal (notorious for dusting off aging DVD masters for reuse on Blu-ray) going to that trouble and expense?
Don’t get me wrong, I definitely see the benefit of 4k in movie theaters, and as an archival medium as studios convert their old films to digital. However, I see much less need for it in the home.
On the other hand, a brand new video format may bring with it some fringe benefits missing on Blu-ray, such as improved color depth (the so-called “DeepColor”) that would reduce the presence of color banding artifacts. That might be a good thing. Too bad it’s too late to implement that in our current Blu-ray standard.
I also see great potential in using a 4k screen for passive 3D. A 4k screen can display two full 1080p images simultaneously, eliminating the need for active shutter glasses and the problems associated with them, such as flicker, headaches, and the discomfort of wearing the glasses themselves.
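The arithmetic behind that claim is simple if you assume row interleaving, the usual approach in passive 3D (exact schemes vary by manufacturer). A consumer 4k panel has 2160 rows, so alternating rows between the left and right eyes still leaves each eye a full 1080-row image:

```python
# Row-interleaved passive 3D on a 3840x2160 panel (an assumed scheme for illustration)
panel_rows = 2160
rows_per_eye = panel_rows // 2  # alternate rows polarized for left/right eyes

print(rows_per_eye)  # 1080 -> each eye still sees a full 1920x1080 image

# Contrast with passive 3D on a 1080p panel, where each eye gets only half vertical resolution
print(1080 // 2)     # 540
```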
Sony has already released one 4k projector to market (at a cost of $25,000!), and JVC has a few more affordable models that use a pseudo-4k format called e-Shift (which has a resolution of 3840×2160). Early adopters have been pleased with the results, but these same early adopters are also likely to have exceptionally large screens. More manufacturers are preparing to unleash 4k TVs in the coming months.
Do you see this as a welcome evolution in video technology, or a shameless cash-grab from an industry desperate to get us to buy new televisions all over again? Vote in our poll.