AVS Forum

1 - 20 of 40 Posts

·
Registered
Joined
·
604 Posts
Discussion Starter #1

UltraHD and 4K have been the big talk at CES this year, and it seems the terms have been getting mixed up.
Quote:
But for industry pros, all the talk about 4K being UltraHD, and vice versa, is heresy.

Forbes spoke with Patrick-Pierre Garcia, an industry expert in both pro and consumer video topics, who states:
Quote:
The UltraHD format has a horizontal resolution of only 3,840 pixels whereas the 4K format has 4,096 pixels. There’s nothing secret about this. UHD and 4K are official formats and I don’t understand why the CE industry is confusing consumers by calling UltraHD products, 4K.


The confusion might even delay the adoption of UltraHD.

Here are some fun facts to help clear up some of the mix-up:
Quote:
1. 4K is the format used by professionals (movie studios, TV channels…) to create content and has a resolution of 4096 × 2160. It replaces 2K and will be replaced by 8K.

2. The UltraHD format was voted on by the Consumer Electronics Association (CEA) last October and refers to devices with a horizontal resolution of at least 3,840 pixels. Sharp also calls it QuadHD.

3. And what about 4K UltraHD or UltraHD 4K? Well, these are marketing gimmicks not based on any kind of reality, says Garcia.


So I guess consumers will be buying UHD displays and watching downscaled 4K content. So even after spending a couple thousand dollars, we'll be watching what?....downscaled content?
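The difference between the two formats is easy to put in concrete terms. A quick sketch (resolutions taken from the article above):

```python
# Quick arithmetic on the formats discussed in the article.
formats = {
    "DCI 4K":  (4096, 2160),  # professional cinema format
    "UltraHD": (3840, 2160),  # consumer format (a.k.a. QuadHD)
    "1080p":   (1920, 1080),  # current HD, for reference
}

for name, (w, h) in formats.items():
    print(f"{name}: {w}x{h} = {w * h / 1e6:.2f} MP, aspect {w / h:.3f}:1")

# UltraHD is exactly 4x the pixels of 1080p...
print((3840 * 2160) / (1920 * 1080))  # 4.0
# ...but carries only 93.75% of the width (and pixels) of DCI 4K.
print(3840 / 4096)                    # 0.9375
```

So "4 times HD" and "6.25% narrower than cinema 4K" are both accurate ways to describe the same UltraHD panel.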






Source
 

·
Registered
Joined
·
918 Posts
Why did they bother with such a small difference to begin with? All it does is confuse people. Talk about not being willing to go the extra mile and add 256 measly columns of pixels!


I thought I was going to be able to get the true cinema resolution 4K experience at home, so this is pretty lame, even if minor.
 

·
Registered
Joined
·
783 Posts
It is understandable if it cuts cost; I don't think one would be able to tell the difference between the two, so it's all good to me. I am still on the fence about whether we need this or not, but once it's affordable I will be buying a 4K or UltraHD passive-3D set.


I have a 42" Hitachi plasma with a resolution of 1024 x 1024, purchased in the early days of HD, and it still looks fabulous. It rivals my native 42" Sharp 1920 x 1080 LCD set, but that's due to the deep blacks of a plasma and the size. I own a 55" Vizio as well, and all the files that once looked good on the 42" sets at 720p now look less impressive on the 55" Vizio LED.


Size makes you need the higher resolution, but with a 256-pixel difference I don't think you will notice, other than the obsession with having the set with the higher pixel count...
 

·
Registered
Joined
·
1,438 Posts
The reason for the resolution (pixel count) difference is the aspect ratio. 2K, 4K, and 8K are based more closely on the aspect ratio of 35mm film so they can capture all the information contained in a frame. The aspect ratio of 35mm is close to the standard flat aspect ratio of 1.85:1. The aspect ratio of HDTVs is slightly less at 1.78:1, more commonly referred to as 16:9. So that is the main difference. The CE industry had no intention of slightly changing the aspect ratio for consumer 4K, UHD, QuadHD, or whatever they end up calling it.


This brings up a slight quandary when transferring titles from the theatrical masters, due to the slight difference in pixel resolution between 2K and 1920x1080, and it will be an issue again going from 4K to UHD.


Option 1

Basically, in the transfer process you could just take a subset of the pixels (1920x1080) from the center of the slightly larger 2K image. This is basically cropping. The downside is that you sacrifice a little of the original image. Aspect ratio purists tend to lose sleep over this, but in reality it is nothing like throwing away a quarter to half of the image back when our TVs had an aspect ratio of 1.33:1. This approach yields a very sharp image since there is no interpolation/scaling involved.
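A minimal sketch of that center-crop, using NumPy (the 2048x1080 frame size is the standard DCI 2K container):

```python
import numpy as np

def center_crop(frame, target_w, target_h):
    """Return the central target_w x target_h window of an (H, W, C) frame."""
    h, w = frame.shape[:2]
    x0 = (w - target_w) // 2
    y0 = (h - target_h) // 2
    return frame[y0:y0 + target_h, x0:x0 + target_w]

# DCI 2K is 2048x1080, so cropping to 1920x1080 drops 64 columns per side
# and never touches a pixel value -- no interpolation involved.
frame_2k = np.zeros((1080, 2048, 3), dtype=np.uint8)
print(center_crop(frame_2k, 1920, 1080).shape)  # (1080, 1920, 3)
```

The same slicing works for a 4096x2160-to-3840x2160 crop: 128 columns discarded per side, every surviving pixel untouched.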


Option 2

The other approach is to scale the original 2K image in its entirety. The benefit is that the original aspect ratio is preserved, but the downside is that the original aspect ratio and the one being scaled to are so close, and yet slightly different, that the resulting quality is entirely dependent on the quality of the scaling algorithm. The sad fact is that unless you are scaling by integer values in each direction, none of the original pixel values survive intact in the resulting image.


This is exactly why UHD/QuadHD is exactly 2x the number of pixels of 1920x1080 in each axis. It makes it much easier for the scaling algorithms in the new UHD sets to work on 1080 content. You could basically call it line doubling: an old trick applied to a new format that requires minimal processing power.
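That 2x relationship means a 1080p frame can be taken to UHD with pure pixel repetition, no interpolation at all; a sketch:

```python
import numpy as np

def double_2x(frame):
    """Integer 2x upscale: repeat every row and every column once."""
    return np.repeat(np.repeat(frame, 2, axis=0), 2, axis=1)

hd = np.arange(1920 * 1080, dtype=np.uint32).reshape(1080, 1920)
uhd = double_2x(hd)
print(uhd.shape)                          # (2160, 3840)
# Every original pixel value survives unchanged:
print(np.array_equal(uhd[::2, ::2], hd))  # True
```

Real sets use smarter filters than bare repetition, of course, but the integer ratio is what lets even the cheap path be lossless.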


The general consensus among professionals is that the higher-quality image comes from option #1, especially when dealing with flat (i.e., non-scope) films.


With scope films there will always be scaling involved if we want to retain the original aspect ratio, since our displays are going to be based on the HDTV aspect ratio for the foreseeable future.
 

·
Registered
Joined
·
783 Posts



SD vs HD vs UltraHD (QuadHD in the pic) vs 4K
 

·
Registered
Joined
·
48 Posts
It is 4K, but it's just not native; it is flat-cropped 4K UHDTV. Native would be 4096 × 2160.


4K UHDTV (2160p) has a resolution of 3840 × 2160 (8.3 megapixels), 4 times the pixels of 1080p
 

·
Registered
Joined
·
1,438 Posts

Quote:
Originally Posted by trackmaster1  /t/1451171/why-ultrahd-is-not-4k#post_22816935


It is 4K, but it's just not native; it is flat-cropped 4K UHDTV. Native would be 4096 × 2160.


4K UHDTV (2160p) has a resolution of 3840 × 2160 (8.3 megapixels), 4 times the pixels of 1080p

The reason you are not getting native is the aspect ratio. They aren't going to change the aspect ratio for content that isn't even available yet. 16:9 is going to be the standard aspect ratio for a long time.
 

·
Registered
Joined
·
783 Posts

Quote:
Originally Posted by Toknowshita  /t/1451171/why-ultrahd-is-not-4k#post_22816983


The reason you are not getting native is the aspect ratio. They aren't going to change the aspect ratio for content that isn't even available yet. 16:9 is going to be the standard aspect ratio for a long time.

A few sets today, mostly monitors rather than TVs, use the 16:10 ratio. I think it will be the same with 4K: only a few will adopt 16:10. Either way, it's not that noticeable and shouldn't be an issue when buying a new set.
 

·
Registered
Joined
·
1,438 Posts

Quote:
Originally Posted by PlayNice  /t/1451171/why-ultrahd-is-not-4k#post_22817015


A few sets today, mostly monitors rather than TVs, use the 16:10 ratio. I think it will be the same with 4K: only a few will adopt 16:10. Either way, it's not that noticeable and shouldn't be an issue when buying a new set.

Except 16:10 is going in the wrong direction if we want to keep a native wider aspect ratio.


2K, 4K, 8K have ARs that are roughly 1.9:1. HDTV/UHD is 16:9 or 1.78:1. The bastardized computer 16:10 ratio is 1.6:1.
 

·
Registered
Joined
·
783 Posts

Quote:
Originally Posted by PlayNice  /t/1451171/why-ultrahd-is-not-4k#post_22817015


A few sets today, mostly monitors rather than TVs, use the 16:10 ratio. I think it will be the same with 4K: only a few will adopt 16:10. Either way, it's not that noticeable and shouldn't be an issue when buying a new set.
Quote:
Originally Posted by Toknowshita  /t/1451171/why-ultrahd-is-not-4k#post_22817178


Except 16:10 is going in the wrong direction if we want to keep a native wider aspect ratio.


2K, 4K, 8K have ARs that are roughly 1.9:1. HDTV/UHD is 16:9 or 1.78:1. The bastardized computer 16:10 ratio is 1.6:1.

OK, I see what you are getting at, but I don't think you see what I am getting at… I am saying the difference in aspect ratio and resolution is so minuscule it won't matter.


4096x2160 is 16:8.4375 and 3840x2160 is exactly 16:9. When you have a TV set or projector up on the wall, no one is going to notice the half a unit missing or gained. We will be in awe of the outstanding clarity, though.
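For anyone who wants to check that arithmetic, expressing each resolution's aspect ratio on a 16:x scale is one line apiece:

```python
# Express each aspect ratio as 16:x by solving w/h = 16/x for x.
print(16 * 2160 / 4096)  # 8.4375 -> DCI 4K is 16:8.4375 (about 1.90:1)
print(16 * 2160 / 3840)  # 9.0    -> UltraHD is exactly 16:9 (about 1.78:1)
```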
 

·
Registered
Joined
·
6,463 Posts

Quote:
Originally Posted by Toknowshita  /t/1451171/why-ultrahd-is-not-4k#post_22816160


The reason for the resolution (pixel count) difference is the aspect ratio. 2K, 4K, and 8K are based more closely on the aspect ratio of 35mm film so they can capture all the information contained in a frame. The aspect ratio of 35mm is close to the standard flat aspect ratio of 1.85:1. The aspect ratio of HDTVs is slightly less at 1.78:1, more commonly referred to as 16:9. So that is the main difference. The CE industry had no intention of slightly changing the aspect ratio for consumer 4K, UHD, QuadHD, or whatever they end up calling it.


This brings up a slight quandary when transferring titles from the theatrical masters, due to the slight difference in pixel resolution between 2K and 1920x1080, and it will be an issue again going from 4K to UHD.


Option 1

Basically, in the transfer process you could just take a subset of the pixels (1920x1080) from the center of the slightly larger 2K image. This is basically cropping. The downside is that you sacrifice a little of the original image. Aspect ratio purists tend to lose sleep over this, but in reality it is nothing like throwing away a quarter to half of the image back when our TVs had an aspect ratio of 1.33:1. This approach yields a very sharp image since there is no interpolation/scaling involved.


Option 2

The other approach is to scale the original 2K image in its entirety. The benefit is that the original aspect ratio is preserved, but the downside is that the original aspect ratio and the one being scaled to are so close, and yet slightly different, that the resulting quality is entirely dependent on the quality of the scaling algorithm. The sad fact is that unless you are scaling by integer values in each direction, none of the original pixel values survive intact in the resulting image.


This is exactly why UHD/QuadHD is exactly 2x the number of pixels of 1920x1080 in each axis. It makes it much easier for the scaling algorithms in the new UHD sets to work on 1080 content. You could basically call it line doubling: an old trick applied to a new format that requires minimal processing power.


The general consensus among professionals is that the higher-quality image comes from option #1, especially when dealing with flat (i.e., non-scope) films.


With scope films there will always be scaling involved if we want to retain the original aspect ratio, since our displays are going to be based on the HDTV aspect ratio for the foreseeable future.

Best post so far.


The good (or bad?) news is that many 1.85:1 films are "cropped" to 1.78:1 (16x9) already to make a simpler transition to HD. So that trend is already there and should carry over to the UltraHD conversion from true 4K, preserving pixel integrity.


I personally hope this approach is adopted, as I'd rather lose a minuscule amount of left/right information that would have been matted out in the theater anyway than risk the compromise of recalculating the entire pixel domain.


As far as 2.35:1 films are concerned, I keep dreaming of a "native 2.35" or "20x9" or similar encoding format in the future, so our software could be "native" 20x9 and allow for downscaling on the fly in consumer hardware, like we did with 16x9 DVDs on 4x3 TVs. Then high-end home theaters able to run native 20x9 would have significantly more detail than standard "letterboxed" UltraHD images. Imagine Ben-Hur in native 20x9 "4K" (naturally that encoding would extend the horizontal resolution beyond the UltraHD spec to widen from 16x9 to 20x9). It could be called "UltraHDW" or something.
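To put a number on what letterboxing costs, here is the arithmetic for how much of a 16:9 UltraHD panel a scope film actually lights up (2.35:1 assumed for the film):

```python
panel_w, panel_h = 3840, 2160          # UltraHD 16:9 panel
scope_ar = 2.35                        # common 'scope' aspect ratio

active_h = round(panel_w / scope_ar)   # picture rows inside the letterbox
print(active_h)                        # 1634
print(f"{active_h / panel_h:.1%} of the panel's rows carry picture")
```

Roughly a quarter of the panel's vertical resolution goes to black bars, which is exactly the detail a native-width scope format would recover.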
 

·
Registered
Joined
·
34 Posts
Here's my 2 cents on the matter: I think the timing is wrong. After the failure of 3D, and given that many people did not want to upgrade their TVs because they never understood the new 3D technology, introducing a new technology that is not even fully developed is pointless and will further add to the confusion for consumers wanting to upgrade their TV sets today. What I believe they should have done is introduce a glasses-free 3D TV, as that seemed to be popular among consumers rather than the facade of glasses, and in the meantime further developed UltraHD. I just hope that when the 4K TVs hit the mainstream they either get rid of the gimmick of 3D (active and passive) or make it glasses-free.
 

·
Registered
Joined
·
6,463 Posts

Quote:
Originally Posted by mr.marts  /t/1451171/why-ultrahd-is-not-4k#post_22817803


Here's my 2 cents on the matter: I think the timing is wrong. After the failure of 3D, and given that many people did not want to upgrade their TVs because they never understood the new 3D technology, introducing a new technology that is not even fully developed is pointless and will further add to the confusion for consumers wanting to upgrade their TV sets today. What I believe they should have done is introduce a glasses-free 3D TV, as that seemed to be popular among consumers rather than the facade of glasses, and in the meantime further developed UltraHD. I just hope that when the 4K TVs hit the mainstream they either get rid of the gimmick of 3D (active and passive) or make it glasses-free.

Technology can't evolve into mature, deliverable products overnight just because we wish it so... autostereoscopic 3D has some difficult challenges that are still being worked out and won't really be "ready" for a while (though hopefully some compromised, yet "good enough" for non-videophiles, solutions will be available this year). But guess what... many autostereoscopic technologies benefit greatly from added screen resolution, so autostereoscopic 3D and 4K aren't mutually exclusive R&D investments. Passive-polarized 3D and 4K resolution are also good friends (one way of getting "full 1080p HD 3D" from a passive set). Win-win.
 

·
Registered
Joined
·
2,317 Posts
Really? 200-some-odd pixels is the big hang-up? It's close enough to 4K, jeez, people.
 

·
Registered
Joined
·
210 Posts

Quote:
Originally Posted by Toknowshita  /t/1451171/why-ultrahd-is-not-4k#post_22817178


Except 16:10 is going in the wrong direction if we want to keep a native wider aspect ratio.


2K, 4K, 8K have ARs that are roughly 1.9:1. HDTV/UHD is 16:9 or 1.78:1. The bastardized computer 16:10 ratio is 1.6:1.
For televisions, going taller is of course the wrong direction to take things, since most movies are shot in a wider aspect ratio. But as far as computer monitors go, a taller vertical resolution is wanted by many, including myself, just due to the vertical real estate needed for reading documents, photo editing, having room for complex menus, and many other things. I deliberately purchased computer monitors with 1920x1200 instead of 1920x1080 just because it makes such a big difference without having to turn a monitor vertical.
 

·
Registered
Joined
·
600 Posts
Why can't they give me a 100-inch screen of my choice, either 16:9 or 2.35:1, and have it be Kuro PQ??? I'D BE HAPPY.


They can keep their 8K or 24K. Who cares..
 

·
Registered
Joined
·
681 Posts
If they gave me a set with a native CinemaScope 4K resolution that conformed to the D-Cinema standard out of the box, with some kind of masking system, I'd start saving right now.


With a built-in hard drive that received content right from the same network the D-Cinema-equipped movie theaters get their digital content from, the studios could have a viable option to replace theaters without sacrificing a thing. This is the future, in my opinion.
 

·
Registered
Joined
·
5,049 Posts
What about the differences in color space...


I know a few professionals were explaining to me how UltraHD doesn't use the same color space as 4K DCI (P3, I think?).
 

·
Registered
Joined
·
748 Posts

Quote:
Originally Posted by mr.marts  /t/1451171/why-ultrahd-is-not-4k#post_22817803


Here's my 2 cents on the matter: I think the timing is wrong. After the failure of 3D, and given that many people did not want to upgrade their TVs because they never understood the new 3D technology, introducing a new technology that is not even fully developed is pointless and will further add to the confusion for consumers wanting to upgrade their TV sets today. What I believe they should have done is introduce a glasses-free 3D TV, as that seemed to be popular among consumers rather than the facade of glasses, and in the meantime further developed UltraHD. I just hope that when the 4K TVs hit the mainstream they either get rid of the gimmick of 3D (active and passive) or make it glasses-free.

I don't know what I think of glasses-free 3D; there's not much content of value out there (I own one of the best discs, Avatar) -- and there seems to be very little broadcast 3D content.


That seems to be the biggest problem with "4K": Content. Will the existing over-the-air, cable and satellite bandwidths support the much-higher bandwidth of 4K? Or perhaps there is some technical magic I've not yet heard about.


Phil
 

·
Registered
Joined
·
458 Posts
Basically, all of this confusion sounds like the chatter surrounding HDTV about 10 years ago....


What it all means is anyone's guess, but I won't be an early adopter this time. I went HD in 2005, and although I am glad that I did, it may still have been early.


I like the progression of tech, and I look forward to 4K and 8K in the home one day, as I may be one of the lucky ones with a Man Cave, so the wife doesn't care about the size of the TV.


However, the confusion, cost, and mystery surrounding content delivery will make the adoption of 4K very slow I think.


The prices will drop soon, but the bandwidth issue is a long way from being solved. I personally don't like compression, even though I am forced to deal with it (DirecTV MPEG-4, iTunes, etc.). It seems most others don't care and don't notice the difference.


UltraHD, or 4K? Neither will matter much in the consumer world for at least the next 5-8 years. For those of you who move forward within the next 2 years, good luck, as I'm rooting for you.
 