Actually, there are many funny things about Hollywood. But there is no denying that the tech this industry uses is amazing. Today I want to focus on resolution. Walk into any Best Buy and three dozen young, energetic salesmen rush up to sell you a 4k TV. They start feeding you all this mush about the picture quality and how it “blows HD away!” But there are several things you need to know about the home 4k format, known as UHD (Ultra High Definition). Does it really blow HD away?
First let us define 4k and UHD. There are three primary resolutions used in the home format these days: 720 HD, Full HD, and UHD. Others have existed, like Standard Definition (NTSC, or the old square TVs), but let’s concentrate on the top two as of 2018: Full HD and UHD. As you probably know, Full HD is the name given to the digital video format with a resolution of 1,920 pixels by 1,080 pixels. Picture this as a rectangle that is 16 units wide and 9 units tall. The length (X axis) contains a row of 1,920 pixels. The Y axis, or the vertical height of the rectangle, contains a column of 1,080 pixels. Multiply these together, and you get a whopping 2,073,600 pixels (roughly 2 megapixels) per frame. Keep in mind the vast majority of Hollywood films are shot at 24 frames per second. That’s a lot of visual information hitting your eyes each second.
Now, UHD quadruples the pixel count! Depending on the shape of the screen, the resolution numbers vary, but we are looking at 3,840 x 2,160. That’s 3,840 pixels horizontally by 2,160 vertically. Same 16 x 9 shape (aspect ratio) as HD. There is a wider aspect ratio now available, but we will get into that at another time. So, these numbers come to a whopping 8,294,400 pixels per frame, or more simply put, 8.3 megapixels per frame. Interestingly, this is not ACTUALLY 4k, as it is really only 3.8k (3,840 pixels across). To be truly 4k, the image would need to be at least 4,000 pixels wide. Hollywood uses a true 4k cinema format (4,096 x 2,160), which is slightly higher resolution than the UHD “4k” format.
Why didn’t they just call it 3.8k? Better yet, why didn’t they just make the UHD format the same resolution as cinema 4k? It takes a greater (or slightly more unstable) mind to make sense of these things, and even then, only after six drinks at a local pub. Keep in mind, however, that almost every theater in the country only projects at 2k (2,048 x 1,080), which is only slightly (almost insignificantly) higher resolution than HD, itself technically a 1.9k format (1,920 pixels across). Again, why didn’t they simply make HD the same 2k standard that theaters project in?
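If you want to check the pixel math from the last few paragraphs yourself, here is a minimal sketch. The resolution numbers are the standard consumer and cinema formats discussed above; nothing here is specific to any particular display.

```python
# Pixel counts for the formats discussed in the text.
FORMATS = {
    "Full HD":   (1920, 1080),
    "Cinema 2K": (2048, 1080),
    "UHD":       (3840, 2160),
    "Cinema 4K": (4096, 2160),
}

for name, (w, h) in FORMATS.items():
    pixels = w * h
    print(f"{name:>9}: {w} x {h} = {pixels:>9,} px ({pixels / 1e6:.1f} MP)")

# UHD really does quadruple Full HD's pixel count:
print("UHD / Full HD =", (3840 * 2160) / (1920 * 1080))  # -> 4.0
```

Note that cinema 2k has only about 6.7% more pixels than Full HD, which is why the text calls the difference almost insignificant, while UHD is a true 4x jump.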
Does your head explode yet? Well, there’s more to consider.
UHD offers a wider color range and more dynamic range than HD. The dynamic-range piece is called HDR (High Dynamic Range), and it is quite stunning, visually more perceptible than the added pixel density of 4k over HD. This means the average viewer will appreciate the added color depth and HDR more than all the added pixels.
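A quick illustration of where that “added color depth” comes from. Standard Blu-ray stores 8 bits per color channel, while UHD Blu-ray uses 10; the jump from 256 to 1,024 shades per channel is what makes smooth gradients and bright highlights look so much better.

```python
# Distinct values per channel, and total colors, at each bit depth.
for bits in (8, 10):
    shades = 2 ** bits          # values per channel (R, G, or B)
    colors = shades ** 3        # combinations across all three channels
    print(f"{bits}-bit: {shades:,} shades/channel, {colors:,} total colors")
```

That is roughly 16.8 million colors for 8-bit versus over a billion for 10-bit, a far bigger perceptual jump for most viewers than the extra pixels.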
Okay, I know I have covered a lot. But consider a few additional items. Why were these resolutions conceived in the first place? Well, HD was fashioned at 1,920 x 1,080 to closely match the resolution and quality of a standard theatrical release print. Back in the film days, theatrical release prints had (according to the experts) roughly 2k worth of resolution. This is because a 35mm theatrical release print was third or fourth generation removed from the original camera negative (OCN). Release prints were never struck directly from the OCN; this protected the negative, since running film through projectors can damage prints (tears, scratches, etc.), and thousands of prints had to be made. Each time you copy from a copy, picture quality is reduced. So, HD and 2k give you basically the same viewing experience as watching a 35mm film projection.
The 4k resolution, on the other hand, was intended to be a good archival resolution, as a 4k scan resolves nearly all the detail an OCN contains. So, in effect, when I am watching Close Encounters of the Third Kind or Blade Runner on my UHD Blu-ray, I am, in essence, seeing it at nearly the resolution at which the film was shot. That’s darn cool! We are FINALLY seeing these films in the quality the creators always wanted us to see them in.
Now, for the original question: are the Best Buy salespeople correct? Does UHD “blow away” Full HD? Well, yes and no. Yes, meaning it does on paper, but no, meaning you can’t tell UNLESS you are seeing the image on a HUGE screen, or are sitting pretty damn close to a moderately sized monitor.
If you have a 54” display and you are sitting six feet away from it, I would say you can see little (if any) of the added detail. Why? Because the human eye can only resolve detail down to a certain size, and anything smaller than that is basically invisible to us at normal viewing distances. Now, if you are sitting right in front of your screen or are projecting in excess of 100 inches, you might very much appreciate the added detail. And let me tell you from personal experience, the added detail is STUNNING! Seeing is believing. If you are right in front of the screen, of course. This is why computer monitors like the Apple iMac 5k (27-inch screen) look so gorgeous: you are sitting right in front of them!
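The “can you even see it?” argument above can be sketched with a little geometry. A common rule of thumb is that normal (20/20) vision resolves about 1 arcminute of detail; the 54-inch screen and six-foot distance are the example numbers from the text, and the 1-arcminute figure is a standard approximation, not a hard limit.

```python
import math

def pixel_arcmin(diagonal_in, horiz_px, distance_in):
    """Angular size of one pixel, in arcminutes, for a 16:9 screen."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)  # screen width from diagonal
    pitch_in = width_in / horiz_px                   # size of one pixel
    return math.degrees(math.atan2(pitch_in, distance_in)) * 60

# 54-inch display viewed from six feet (72 inches):
for name, px in [("Full HD", 1920), ("UHD", 3840)]:
    a = pixel_arcmin(54, px, 72)
    print(f"{name}: one pixel spans about {a:.2f} arcmin")
```

A Full HD pixel on that screen sits right around the ~1 arcminute acuity limit, while a UHD pixel falls well below it, which is exactly why the extra detail is largely invisible at that size and distance. Move closer, or make the screen much bigger, and the UHD pixels climb back above the threshold.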
In conclusion, it all depends on your perspective, I guess. But make no mistake, 4k is just the beginning. I am already seeing monitors advertised at 8k resolution. Can we say overkill? I can’t wait to see how the Best Buy dudes spin this one!