
Why UltraHD is Not 4K

post #1 of 40
Thread Starter 


UltraHD and 4K have been the big talk at CES this year, and it seems the terms have been getting mixed up.
Quote:
But for industry pros, all the talk about 4K being UltraHD, and vice-versa is heresy.

Forbes spoke with Patrick-Pierre Garcia, an industry expert in both pro and consumer video, who states:
Quote:
The UltraHD format has a horizontal resolution of only 3,840 pixels whereas the 4K format has 4,096 pixels. There’s nothing secret about this. UHD and 4K are official formats and I don’t understand why the CE industry is confusing consumers by calling UltraHD products, 4K.

The confusion might even delay the adoption of UltraHD.

Here are some facts to help clarify the mix-up:
Quote:
1. 4K is the format used by professionals (movie studios, TV channels…) to create content and has a resolution of 4096 × 2160. It replaces 2K and will be replaced by 8K.
2. The UltraHD format was voted in by the Consumer Electronics Association (CEA) last October and refers to devices with a horizontal resolution of at least 3,840 pixels. Sharp also calls it QuadHD.
3. And what about 4K UltraHD or UltraHD 4K? Well, these are marketing gimmicks not based on any kind of reality, says Garcia.


So I guess consumers will be buying UHD displays and watching downscaled 4K content. So even after spending a couple of thousand dollars, we get to watch what?... downscaled content?
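(For anyone who wants the raw numbers behind those two formats, here is a quick Python sketch of my own; the figures simply restate the resolutions quoted above.)
Code:
# Rough comparison of DCI 4K vs. consumer UltraHD, using the resolutions above.
DCI_4K = (4096, 2160)   # professional 4K container
UHD    = (3840, 2160)   # CEA "Ultra HD" / QuadHD / 2160p

dci_px = DCI_4K[0] * DCI_4K[1]   # 8,847,360 pixels
uhd_px = UHD[0] * UHD[1]         # 8,294,400 pixels

print(f"DCI 4K: {dci_px:,} px  |  UHD: {uhd_px:,} px")
print(f"UHD has {100 * (1 - uhd_px / dci_px):.2f}% fewer pixels")   # 6.25%
print(f"Horizontal difference: {DCI_4K[0] - UHD[0]} columns")       # 256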







Source
post #2 of 40
Why did they bother with such a small difference to begin with? All it does is confuse people. Talk about not being willing to go the extra mile and add the measly 256 extra columns of pixels!

I thought I was going to be able to get the true cinema resolution 4K experience at home, so this is pretty lame, even if minor.
post #3 of 40
It is understandable if it cuts cost; I don't think one would be able to tell the difference between the two, so it's all good to me. I am still on the fence about whether we need this or not, but once it's affordable I will be buying a 4K or UltraHD passive 3D set.

I have a 42" Hitachi plasma with a resolution of 1024 x 1024, purchased in the early days of HD, and it still looks fabulous. It rivals my native 1920 x 1080 42" Sharp LCD set, but that's due to the deep blacks of a plasma and the size. I do own a 55" Vizio as well, and all the files that once looked good on the 42" sets at 720p now look less impressive on the 55" Vizio LED.

Size makes you need the higher resolution, but with a 256-pixel difference I don't think you will notice anything other than the satisfaction of owning the set with the higher pixel count...
Edited by PlayNice - 1/11/13 at 4:09pm
post #4 of 40
The reason for the resolution (pixel count) difference is the aspect ratio. 2K, 4K and 8K are based more closely on the aspect ratio of 35mm film so they can capture all the information contained in a frame. The aspect ratio of 35mm is close to the standard flat aspect ratio of 1.85:1. The aspect ratio of HDTVs is slightly less at 1.78:1, more commonly referred to as 16:9. So that is the main difference. The CE industry had no intention of making a slight change of aspect ratio for consumer 4K, UHD, QuadHD or whatever they end up calling it.

This brings up a slight quandary when transferring titles from the theatrical masters, due to the slight difference in pixel resolution between 2K and 1920x1080, and it will be an issue again going from 4K to UHD.

Option 1
Basically, in the transfer process you could just take a subset of the pixels (1920x1080) from the center of the slightly larger 2K image. This is basically cropping. The downside is that you sacrifice a little of the original image. Aspect ratio purists tend to lose sleep over this, but in reality it is nothing like the days when we threw away a quarter to half of the image because our TVs had an aspect ratio of 1.33:1. This approach yields a very sharp image since there is no interpolation/scaling involved.

Option 2
The other approach is to scale the original 2K image in its entirety. The benefit is that the original aspect ratio is preserved, but the bad part is that the original aspect ratio and the one it is being scaled to are so close and yet slightly different that the resulting quality is entirely dependent on the quality of the scaling algorithm. The sad fact is that unless you are scaling by integer factors in each direction, none of the original pixel information survives untouched in the resulting image.

This is exactly why UHD/QuadHD is exactly 2x the pixels of 1920x1080 in each axis. It makes it much easier for the scaling algorithms in the new UHD sets to work on 1080 content. You could basically call it line doubling: an old trick applied to a new format that requires minimal processing power.

The general consensus among professionals is that option #1 gives the higher quality image, especially when dealing with flat (i.e. non-scope) films.

With scope films there will always be scaling involved if we want to retain the original aspect ratio, since our displays are going to be based on the HDTV aspect ratio for the foreseeable future.
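(To make the two options above concrete, here is a rough sketch using Python and Pillow. It is only my illustration of the idea, not anything from an actual mastering pipeline, and the function names are made up.)
Code:
from PIL import Image  # pip install Pillow

def center_crop(img, target_w, target_h):
    # Option 1: take a target-sized window from the center of the larger frame.
    # No interpolation, so every pixel that is kept remains untouched.
    w, h = img.size
    left, top = (w - target_w) // 2, (h - target_h) // 2
    return img.crop((left, top, left + target_w, top + target_h))

def full_rescale(img, target_w, target_h):
    # Option 2: resample the whole frame to fit, preserving its aspect ratio
    # (thin letterbox bars make up the difference). Every output pixel is
    # interpolated, because 2048 -> 1920 is not an integer ratio.
    w, h = img.size
    scale = target_w / w
    resized = img.resize((target_w, round(h * scale)), Image.LANCZOS)
    canvas = Image.new("RGB", (target_w, target_h))  # black bars
    canvas.paste(resized, (0, (target_h - resized.height) // 2))
    return canvas

master_2k = Image.new("RGB", (2048, 1080))          # stand-in for a 2K flat master
bd_crop   = center_crop(master_2k, 1920, 1080)      # loses 64 columns on each side
bd_scaled = full_rescale(master_2k, 1920, 1080)     # keeps framing, resamples everything

# 1080p -> UHD, by contrast, is an exact 2x in each axis: simple pixel doubling.
hd_frame  = Image.new("RGB", (1920, 1080))
uhd_frame = hd_frame.resize((3840, 2160), Image.NEAREST)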
Edited by Toknowshita - 1/11/13 at 11:25am
post #5 of 40


SD vs HD vs UltraHD (QuadHD in the pic) vs 4K
Edited by PlayNice - 1/11/13 at 1:35pm
post #6 of 40
It is for 4K, but it's just not native; it is flat-cropped 4K UHDTV. Native would be 4096 × 2160.

4K UHDTV (2160p) has a resolution of 3840 × 2160 (8.3 megapixels), 4 times the pixels of 1080p
post #7 of 40
Quote:
Originally Posted by trackmaster1 View Post

It is for 4K, but it's just not native; it is flat-cropped 4K UHDTV. Native would be 4096 × 2160.

4K UHDTV (2160p) has a resolution of 3840 × 2160 (8.3 megapixels), 4 times the pixels of 1080p

The reason you are not getting native is because of the aspect ratio. They aren't going to change aspect ratio for content that isn't even available yet. 16:9 is going to be the standard aspect ratio for a long time.
post #8 of 40
Quote:
Originally Posted by Toknowshita View Post

The reason you are not getting native is because of the aspect ratio. They aren't going to change aspect ratio for content that isn't even available yet. 16:9 is going to be the standard aspect ratio for a long time.

A few sets today, mostly monitors rather than TVs, use the 16:10 ratio. I think it will be the same with 4K: only a few will adopt it and go 16:10. Either way it's not that noticeable and shouldn't be an issue when buying a new set.
post #9 of 40
Quote:
Originally Posted by PlayNice View Post

A few sets today, mostly monitors rather than TVs, use the 16:10 ratio. I think it will be the same with 4K: only a few will adopt it and go 16:10. Either way it's not that noticeable and shouldn't be an issue when buying a new set.

Except 16:10 is going in the wrong direction if we want to keep a native wider aspect ratio.

2K, 4K, 8K have ARs that are roughly 1.9:1. HDTV/UHD is 16:9 or 1.78:1. The bastardized computer 16:10 ratio is 1.6:1.
post #10 of 40
Quote:
Originally Posted by PlayNice View Post

A few sets today, mostly monitors rather than TVs, use the 16:10 ratio. I think it will be the same with 4K: only a few will adopt it and go 16:10. Either way it's not that noticeable and shouldn't be an issue when buying a new set.

Quote:
Originally Posted by Toknowshita View Post

Except 16:10 is going in the wrong direction if we want to keep a native wider aspect ratio.

2K, 4K, 8K have ARs that are roughly 1.9:1. HDTV/UHD is 16:9 or 1.78:1. The bastardized computer 16:10 ratio is 1.6:1.

OK, I see what you are getting at, but I don't think you see what I am getting at… I am saying the difference in aspect ratio and resolution is so minuscule it won't matter.

4096x2160 is 16:8.4375 and 3840x2160 is 16:9. When you have a TV set or projector up on the wall, no one is going to notice the half a unit of ratio missing or gained. We will be in awe of the outstanding clarity, though.
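(The arithmetic behind those 16:x figures, for anyone who wants to check it; just a trivial sketch.)
Code:
def as_16x(width, height):
    # Express an aspect ratio in the "16:x" form used in the post above.
    return 16 * height / width

print(as_16x(4096, 2160))   # 8.4375 -> DCI 4K is 16:8.4375, i.e. about 1.90:1
print(as_16x(3840, 2160))   # 9.0    -> UHD is exactly 16:9,  i.e. about 1.78:1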
post #11 of 40
Quote:
Originally Posted by Toknowshita View Post

The reason for the resolution (pixel count) difference is the aspect ratio. 2K, 4K and 8K are based more closely on the aspect ratio of 35mm film so they can capture all the information contained in a frame. The aspect ratio of 35mm is close to the standard flat aspect ratio of 1.85:1. The aspect ratio of HDTVs is slightly less at 1.78:1, more commonly referred to as 16:9. So that is the main difference. The CE industry had no intention of making a slight change of aspect ratio for consumer 4K, UHD, QuadHD or whatever they end up calling it.

This brings up a slight quandary when transferring titles from the theatrical masters, due to the slight difference in pixel resolution between 2K and 1920x1080, and it will be an issue again going from 4K to UHD.

Option 1
Basically, in the transfer process you could just take a subset of the pixels (1920x1080) from the center of the slightly larger 2K image. This is basically cropping. The downside is that you sacrifice a little of the original image. Aspect ratio purists tend to lose sleep over this, but in reality it is nothing like the days when we threw away a quarter to half of the image because our TVs had an aspect ratio of 1.33:1. This approach yields a very sharp image since there is no interpolation/scaling involved.

Option 2
The other approach is to scale the original 2K image in its entirety. The benefit is that the original aspect ratio is preserved, but the bad part is that the original aspect ratio and the one it is being scaled to are so close and yet slightly different that the resulting quality is entirely dependent on the quality of the scaling algorithm. The sad fact is that unless you are scaling by integer factors in each direction, none of the original pixel information survives untouched in the resulting image.

This is exactly why UHD/QuadHD is exactly 2x the pixels of 1920x1080 in each axis. It makes it much easier for the scaling algorithms in the new UHD sets to work on 1080 content. You could basically call it line doubling: an old trick applied to a new format that requires minimal processing power.

The general consensus among professionals is that option #1 gives the higher quality image, especially when dealing with flat (i.e. non-scope) films.

With scope films there will always be scaling involved if we want to retain the original aspect ratio, since our displays are going to be based on the HDTV aspect ratio for the foreseeable future.

Best post so far.

The good (or bad?) news is that many 1.85:1 films are already "cropped" to 1.78:1 (16x9) to make a simpler transition to HD. So that trend is already there and should carry over to the UltraHD conversion from true 4K, preserving pixel integrity.

I personally hope that this approach is adopted, as I'd rather lose a minuscule amount of left/right info that would have been matted out in the theater anyway than risk the compromise of recalculating the entire pixel domain.

As far as 2.35:1 films are concerned, I keep dreaming of a "native 2.35" or "20x9" or something encoding format in the future, so our software could be "native" 20x9 and allow for downscaling on the fly in consumer hardware, like we did with 16x9 DVDs on 4x3 TVs. Then high-end home theaters able to run native 20x9 would have significantly more detail than standard "letterboxed" UltraHD images. Imagine Ben-Hur in native 20x9 "4K" (naturally that encoding would extend the horizontal resolution beyond the UltraHD spec to widen from 16x9 to 20x9). It could be called "UltraHDW" or something.
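(Putting rough numbers on that daydream: the "20x9" panel and the "UltraHDW" name are entirely hypothetical, as in the post above, and 20:9 is about 2.22:1 rather than a true 2.35:1.)
Code:
# Hypothetical "UltraHDW": keep UHD's 2160-pixel height but widen to a 20:9 canvas
# instead of letterboxing scope films inside a 16:9 frame.
UHD_W, UHD_H = 3840, 2160

wide_w        = round(UHD_H * 20 / 9)   # 4800 columns at the full 2160-line height
letterboxed_h = round(UHD_W * 9 / 20)   # only 1728 active lines when a 20:9 image
                                        # is letterboxed inside 16:9 UHD
print(wide_w, letterboxed_h)            # 4800 1728
print(f"{UHD_H / letterboxed_h:.2f}x more vertical detail")   # 1.25x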
post #12 of 40
Here's my 2 cents on the matter. I think the timing is wrong: after the failure of 3D, and with many people unwilling to upgrade their TVs because they never really understood the new 3D technology, introducing a new technology that isn't even fully developed is pointless and will only add to the confusion for consumers wanting to upgrade their TV sets today. What I believe they should have done is introduce a glasses-free 3D TV, since that seemed popular among consumers compared with the hassle of glasses, and in the meantime further develop UltraHD. I just hope that when 4K TVs hit the mainstream they either drop the gimmick of 3D (active and passive) or make it glasses-free.
post #13 of 40
Quote:
Originally Posted by mr.marts View Post

Here's my 2 cents on the matter. I think the timing is wrong: after the failure of 3D, and with many people unwilling to upgrade their TVs because they never really understood the new 3D technology, introducing a new technology that isn't even fully developed is pointless and will only add to the confusion for consumers wanting to upgrade their TV sets today. What I believe they should have done is introduce a glasses-free 3D TV, since that seemed popular among consumers compared with the hassle of glasses, and in the meantime further develop UltraHD. I just hope that when 4K TVs hit the mainstream they either drop the gimmick of 3D (active and passive) or make it glasses-free.

Technology can't evolve into mature, deliverable products overnight just because we wish it so... autostereoscopic 3D has some difficult challenges that are still being worked out and won't really be "ready" for a while (though hopefully some compromised, yet "good enough" for non-videophiles, solutions will be available this year). But guess what... many autostereoscopic technologies benefit greatly from added screen resolution, so autostereoscopic 3D and 4K aren't mutually exclusive R&D investments. Passive-polarized 3D and 4K resolution are also good friends (one way of getting "full 1080p HD 3D" from a passive set). Win-win.
post #14 of 40
Really? 200-some-odd pixels is the big hang-up? It's close enough to 4K, jeez, people.
post #15 of 40
Quote:
Originally Posted by Toknowshita View Post

Except 16:10 is going in the wrong direction if we want to keep a native wider aspect ratio.

2K, 4K, 8K have ARs that are roughly 1.9:1. HDTV/UHD is 16:9 or 1.78:1. The bastardized computer 16:10 ratio is 1.6:1.
For televisions, going taller is of course the wrong direction to take things, since most movies are shot in a wider aspect ratio. But as far as computer monitors go, a taller vertical resolution is wanted by many, including myself, just because of the vertical real estate needed for reading documents, photo editing, having room for complex menus and many other things. I deliberately purchased computer monitors with 1920x1200 instead of 1920x1080 just because it makes such a big difference without having to turn a monitor vertical.
post #16 of 40
Why can't they give me a 100 inch screen of my choice, either 16:9 or 2.35:1, and have it be Kuro PQ??? I'D BE HAPPY

They can keep their 8K or 24K. Who cares..
post #17 of 40
If they gave me a set with native CinemaScope 4K resolution that conformed to the D-Cinema standard out of the box, with some kind of masking system, I'd start saving right now.

With a built-in hard drive that received content right from the same network that D-Cinema-equipped movie theaters get their digital content from, the studios would have a viable option to replace theaters without sacrificing a thing. This is the future, in my opinion.
post #18 of 40
What about the differences in color space...

I know a few professionals were explaining to me how UltraHD doesn't use the same color space as 4K DCI (P3, I think?)
post #19 of 40
Quote:
Originally Posted by mr.marts View Post

Here's my 2 cents on the matter. I think the timing is wrong: after the failure of 3D, and with many people unwilling to upgrade their TVs because they never really understood the new 3D technology, introducing a new technology that isn't even fully developed is pointless and will only add to the confusion for consumers wanting to upgrade their TV sets today. What I believe they should have done is introduce a glasses-free 3D TV, since that seemed popular among consumers compared with the hassle of glasses, and in the meantime further develop UltraHD. I just hope that when 4K TVs hit the mainstream they either drop the gimmick of 3D (active and passive) or make it glasses-free.

I don't know what I think of glasses-free 3D; there's not much content of value out there (I own one of the best discs, Avatar) -- and there seems to be very little broadcast 3D content.

That seems to be the biggest problem with "4K": Content. Will the existing over-the-air, cable and satellite bandwidths support the much-higher bandwidth of 4K? Or perhaps there is some technical magic I've not yet heard about.

Phil
post #20 of 40
Basically, all of this confusion sounds like the chatter surrounding HDTV about 10 years ago....

What it all means is anyone's guess, but I won't be an early adopter this time. I went HD in 2005, and although I am glad that I did, it may still have been early.

I like the progression of tech, and I look forward to 4K and 8K in the home one day, as I may be one of the lucky ones with a Man Cave, so the wife doesn't care about the size of the TV.

However, the confusion, cost, and mystery surrounding content delivery will make the adoption of 4K very slow I think.

The prices will drop soon, but the bandwidth issue is a long way from being solved. I personally don't like compression, even though I am forced to deal with it (DirecTV MPEG-4, iTunes, etc.). Most others don't seem to care and don't notice the difference.

UltraHD, or 4K? Neither will matter much in the consumer world for at least the next 5-8 years. For those of you who move forward within the next 2 years, good luck, as I'm rooting for you.
post #21 of 40
Quote:
Originally Posted by mr.marts View Post

Here's my 2 cents on the matter. I think the timing is wrong: after the failure of 3D ...[snip]
What? Did I miss a board meeting? 3D failed? I suppose DTS 7.1 and Dolby 7.1 failed as well, since you didn't buy extra speakers...
post #22 of 40
This entire mess is really nauseating to me. Hard to imagine them devising a way to defuse public interest in the whole thing any better than they have so far.

Forget it- for me, anyway. I'm simply going to buy one of these absurd-performing $1000 projectors and "suffer" with it for a few years until this is all sorted.


James
post #23 of 40
Quote:
Originally Posted by BAMABLUHD View Post

The prices will drop soon, but the bandwidth issue is a long way from being solved. I personally don't like compression, even though I am forced to deal with it (DirecTV MPEG-4, iTunes, etc.). Most others don't seem to care and don't notice the difference.

Already solved. H.265 can cut the required bandwidth roughly in half. That means a 2-hour 4K/UHD movie can fit on a 3- or 4-layer BD. Broadcasters can dial the compression up even higher to fit it into a 1080p-sized slot with minimal distortion.
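(A back-of-the-envelope check on that claim. The 30 Mbit/s 1080p figure and the 2x HEVC efficiency are ballpark assumptions of mine, not numbers from any spec.)
Code:
# Does a 2-hour UHD movie fit on a multi-layer Blu-ray, assuming H.265 halves
# the bitrate and UHD needs roughly 4x the bits of 1080p with the same codec?
h264_1080p_mbps  = 30      # assumed average 1080p H.264 Blu-ray video bitrate
uhd_pixel_factor = 4
hevc_efficiency  = 0.5

uhd_hevc_mbps = h264_1080p_mbps * uhd_pixel_factor * hevc_efficiency   # ~60 Mbit/s
runtime_s     = 2 * 60 * 60

size_gb = uhd_hevc_mbps * runtime_s / 8 / 1000   # megabits -> gigabytes (decimal)
print(f"~{size_gb:.0f} GB")   # ~54 GB: too big for a 50 GB dual-layer disc, but
                              # fine on a 100 GB or 128 GB BDXL (3 or 4 layers)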
post #24 of 40
Quote:
Originally Posted by Augerhandle View Post

What? Did I miss a board meeting? 3D failed? I suppose DTS 7.1 and Dolby 7.1 failed as well, since you didn't buy extra speakers...

StereoVision was trotted out after two decades of slumber purely to make more money than mono films. It largely failed to do so: it's rare to see a breakdown, but from what I can tell the mono version of the same film still makes more money, and the films not in StereoVision rake in more cash overall.

The often claimed 'all films will be in StereoVision' (and TV) statement is something that clearly will not happen.
post #25 of 40
Quote:
Originally Posted by wuther View Post

Quote:
Originally Posted by Augerhandle View Post

What? Did I miss a board meeting? 3D failed? I suppose DTS 7.1 and Dolby 7.1 failed as well, since you didn't buy extra speakers...

StereoVision was trotted out after two decades of slumber purely to make more money than mono films. It largely failed to do so: it's rare to see a breakdown, but from what I can tell the mono version of the same film still makes more money, and the films not in StereoVision rake in more cash overall.

The often claimed 'all films will be in StereoVision' (and TV) statement is something that clearly will not happen.

Anyone can look up box office receipts and see the breakdown. 3D is alive and well, and some movies will do better than others, 3D or not. According to Boxofficemojo.com, R-rated movies were down 13.5% in 2011. Does this mean R-rated films have failed? Not every film is in 3D, nor does every film necessarily do better because it is in 3D. This doesn't denote failure of 3D, just as choosing to watch a film in Dolby 5.1 (or in mono on your cellphone) doesn't denote a failure of 7.1.

I don't know what you're talking about when you state 'all films will be in StereoVision', as I've never seen that claim before. 3D is just another choice in the marketplace, just as 5.1 or 7.1 surround, or even extra butter on your popcorn. The fact that it isn't chosen by EVERYONE doesn't make it a failure.

Anyway, we are way off-topic. There are plenty of threads for debating the viability of 3D without polluting this one further. Let's move on.
post #26 of 40
When the HD "standard" was decided upon in the U.S., people pointed out that 16:9 doesn't match ANYTHING. No theatrical format used this aspect ratio, which set us up for the continued compromises and incompatibility of our acquisition formats and TV presentation.

But then, what do you expect when we still have 1930s interlacing and 1950s non-integer frame rates in our digital TV system?
post #27 of 40
16:9 was selected as the fairest compromise between all of the aspect ratios, because everything from 4:3 to 2.35:1 fits inside its rectangle with relatively little wasted screen area (16:9 is roughly the geometric mean of those two extremes).
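(One way to see the compromise in numbers; a quick sketch of my own, computing how much of a 16:9 screen each common ratio actually uses once it is letterboxed or pillarboxed.)
Code:
def screen_usage(content_ar, screen_ar=16 / 9):
    # Fraction of the screen area used when content is letter/pillarboxed to fit.
    return min(content_ar / screen_ar, screen_ar / content_ar)

for name, ar in [("4:3 TV", 4 / 3), ("1.85:1 flat", 1.85),
                 ("2.35:1 scope", 2.35), ("16:9 HDTV", 16 / 9)]:
    print(f"{name:>13}: {screen_usage(ar):.0%} of a 16:9 screen")
# 4:3 and 2.35:1 each fill roughly 75% of the frame -- neither extreme is favored.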
Edited by Augerhandle - 1/17/13 at 4:39pm
post #28 of 40
As someone already said, this is all so reminiscent of the early 2000s. Back then everyone was so concerned over a few pixels and 720p vs 1080i vs 1080p, but what really matters is how good the picture looks to the viewer. If you can't see it yourself, don't worry about it. Whether it's 4K or UltraHD, if all else is equal, can you see the difference?
post #29 of 40
Thread Starter 
I think it comes down to some people won't stop until they can get anything in a native format, whether it be audio or video.
post #30 of 40
Great. I knew I shouldn't have spent so much money on Blu-ray discs! Blu-rays on 4K TVs are going to look like DVDs on your 1080p TV. I can't imagine what a 720x480 movie would look like on a 4K TV. Is the answer not to adopt any type of tangible media and just go to subscription? I feel like Blu-rays are just becoming mainstream now. I would imagine that we are probably looking at 2-3 years before 4K is reasonably priced, and maybe another 2-3 years before there is a considerable amount of 4K content? Starting to feel a little better.
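(The Blu-ray/DVD analogy roughly holds if you look at the per-axis upscaling factors; quick sketch, nothing rigorous.)
Code:
# Vertical (per-axis) upscaling factors for common sources on 1080p vs. UHD panels.
sources = {"DVD (720x480)": 480, "Blu-ray (1920x1080)": 1080}
panels  = {"1080p panel": 1080, "UHD panel": 2160}

for src, src_h in sources.items():
    for panel, panel_h in panels.items():
        print(f"{src} on a {panel}: {panel_h / src_h:.2f}x upscale")
# DVD -> 1080p is 2.25x; Blu-ray -> UHD is 2.00x, so the comparison above is fair
# (and DVD -> UHD would be a hefty 4.5x).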