
· Banned · 10,830 Posts · Discussion Starter · #1




There is a growing problem within the CE industry today. It's a collaborative one, born of ambiguity and laziness among enthusiasts and within the CE industry itself. I am, of course, talking about the widely used "4K" moniker when referring to UHD. The two formats are obviously not the same; otherwise, we'd only be calling it one or the other. Part of the problem has been a lengthy wait, wild speculation about the formats' specifics, and manufacturers themselves treating the two names as synonymous.


Like 2K, 4K is a professional format used on the commercial side of video production, most often seen by everyday consumers at movie theaters equipped with the latest digital projectors. 4K also has a different native aspect ratio than UHD: a true 4K image (4096x2160) is 1.9:1, while a true UHD image (3840x2160) is 1.78:1. A 4K raster is wider by 256 pixels, a trivial difference that does little for the overall resolution or clarity of the image. I'm fairly certain this minimal difference in resolution is what fuels many of us to call UHD "4K."
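If you want to check the arithmetic yourself, here's a quick back-of-the-envelope sketch in Python (my own numbers, nothing official):

# Comparing the DCI 4K and UHD rasters.
dci_w, dci_h = 4096, 2160   # DCI 4K container
uhd_w, uhd_h = 3840, 2160   # consumer UHD
print(f"DCI 4K aspect ratio: {dci_w / dci_h:.2f}:1")    # ~1.90:1
print(f"UHD aspect ratio:    {uhd_w / uhd_h:.2f}:1")    # ~1.78:1 (16:9)
print(f"Extra width: {dci_w - uhd_w} pixels")           # 256
print(f"UHD pixel count vs. DCI 4K: {uhd_w * uhd_h / (dci_w * dci_h):.1%}")  # 93.8%

In other words, UHD still carries about 94 percent of the pixels of a true 4K frame, which is exactly why the difference looks so trivial on paper.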


That 256-pixel difference in width causes at least one major issue with consumer content. Almost all television content is presented in a 1.78:1 aspect ratio. If we were to view it on a true 4K display, we would see black bars on the left and right sides of the screen to keep the original aspect ratio intact. While enthusiasts understand the reasoning behind this, most everyday viewers would find their TV content annoyingly masked with black bars, much as they find the black bars on their 1080p televisions annoying when viewing 'scope films. This is one of the main reasons 3840x2160 was chosen as the next-gen consumer resolution: it keeps the 1.78:1 aspect ratio in which most broadcast TV content is presented.
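To put a number on those black bars, here's a rough sketch (again, my own arithmetic) of pillarboxing 16:9 content on a 1.9:1 panel:

# Pillarboxing 16:9 content on a DCI 4K panel at full height, no scaling or cropping.
panel_w, panel_h = 4096, 2160
content_w = round(panel_h * 16 / 9)     # 3840 pixels wide at full height
bar = (panel_w - content_w) // 2
print(f"Black bar per side: {bar} px")  # 128 px on the left and right
print(f"Panel width wasted: {2 * bar / panel_w:.1%}")  # 6.2%

So every hour of regular TV on a true 4K panel would burn a 128-pixel black strip down each side.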


True 4K is the resolution specified by the DCI (Digital Cinema Initiatives) commercial standard. This is another area where UHD and 4K differ. Much as Blu-ray is the 1080p standard for encoding and presentation, 4K has its own set of standards dictated by the DCI. These standards are high end, resulting in exemplary image quality: DCI 4K uses JPEG 2000 video compression, up to 250Mbps video bitrate, 12-bit 4:4:4 video, and a much wider color gamut. While it isn't totally clear yet what video-encoding standards the new UHD format will use, all rumors point to something sub-par. HDMI 2.0 will most likely dictate the standards for UHD Blu-ray (or whatever they decide to call it), and unfortunately, HDMI has very little left to give as an interconnect standard. As a result, there is no way to transport the amount of information needed to exceed, or even match, the DCI 4K standard. Those in the know are under NDAs (non-disclosure agreements), which means we won't learn the specifics for at least another month or two. Rumors point to 10-bit 4:2:2 video for UHD content at 24 frames per second and a doubling of throughput to support higher bitrates.
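To put some rough numbers on the bandwidth question, here's a sketch of uncompressed video data rates for a few of the configurations mentioned above. These are my own back-of-the-envelope figures; they ignore blanking intervals, audio, and link overhead, so real interconnect requirements run higher:

# Uncompressed data rate = width x height x fps x bit depth x samples per pixel.
def rate_gbps(w, h, fps, bits, subsampling):
    # 4:4:4 carries 3 full-resolution components per pixel;
    # 4:2:2 halves the two chroma components, averaging 2 samples per pixel.
    samples = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[subsampling]
    return w * h * fps * bits * samples / 1e9

print(f"DCI 4K, 24p, 12-bit 4:4:4: {rate_gbps(4096, 2160, 24, 12, '4:4:4'):.1f} Gbps")  # ~7.6
print(f"UHD, 24p, 10-bit 4:2:2:    {rate_gbps(3840, 2160, 24, 10, '4:2:2'):.1f} Gbps")  # ~4.0
print(f"UHD, 60p, 12-bit 4:4:4:    {rate_gbps(3840, 2160, 60, 12, '4:4:4'):.1f} Gbps")  # ~17.9

If the rumored doubling of HDMI 1.4's 10.2Gbps pans out, that's a raw ceiling somewhere around 20Gbps, and you can see how quickly higher frame rates with full 12-bit 4:4:4 video eat through it once encoding overhead and blanking are factored in.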


As we can see, the term "4K" encompasses more than just resolution. I'm going to give everyone who's called UHD "4K" the benefit of the doubt and assume they know the differences. Heck, I'll admit it: I was one of you. So, if that's the case, why does everyone still call UHD "4K"?


The issue stems from a time when no one knew what the new format was going to be. We've only known for a fairly short time that the new resolution would be 3840x2160, not the 4096x2160 that enthusiasts wanted. The A/V enthusiast community, including us here at AVS Forum, is partly to blame for this naming error. We had way too much time on our hands to gossip, speculate, and, more importantly, assume we were going to get a format derived from the DCI 4K standard in not only resolution but also video compression, bitrates, bit depth, chroma subsampling, color space, and so on. This infatuation with wanting the best of the best turned anything next-generation into "4K," even if that isn't what we got. What we now have is an entire industry afraid to let go of the 4K moniker because of how well the name has stuck, even though what we're getting isn't 4K in either resolution or video-encoding quality.


I've noticed many UHD products being described as "4K-UHD." Even the Wikipedia page for UHD now refers to it as "4K-UHD." If you go to Newegg or Amazon, many HDTVs and computer monitors have both 4K and UHD in the title. To those who don't know the difference, it's confusing. If I were shopping for a 1080p HDTV and saw both 2K and 1080p in the title, I'd be very confused. If I were shopping for a computer monitor and saw it listed as both 1920x1200 and 1920x1080, I'd be even more confused. Which one is it?!? I'm sorry, folks, but UHD and 4K are not the same. It boggles my mind that not one major manufacturer has called BS on this.


I blame the CE industry for letting this continue. UHD is still esoteric enough that all the CE manufacturers would need to do is change their branding scheme to fix the issue. If you stopped 100 regular folks on the street to see if they knew what 4K or UHD is, I'd wager that fewer than 10 percent could give you a correct answer. Sony has flat-out said it will not drop the "4K" naming scheme even if its products aren't really 4K. It seems 4K is a much more marketable name to early adopters than UHD.

In Scott Wilkinson's recent interview with video guru Joe Kane, Kane speaks about this same issue. He seems just as upset as I am, but he offers a solution, and it's a fairly simple one: get people to start referring to UHD as "2160p." His reasoning comes from why we call our Full HD displays "1080p." As Kane explains, we have always referenced consumer displays by their vertical resolution and commercial displays by their horizontal resolution. So 2160p could be a great alternative to UHD, just as 1080p, 720p, and 480p were before it. I, for one, agree. It seems like the logical solution, even if it doesn't roll off the tongue as easily as 4K does.


To get the change that’s needed, someone big needs to take a stand and completely drop the 4K naming scheme for home-theater products. I find this issue particularly troublesome because even enthusiasts seem completely content with making the mistake. I guess a good way to sum it up would be to say that the term "4K" already has a meaning—it refers to a resolution and a strict set of rules for presentation. We aren't getting the same resolution or the same set of rules with UHD, so why call it something it isn't? I think we here at AVSForum have an obligation to fix this. We're a large enough community to make a difference. This site alone gets over 2 million unique visitors per month. With enough word of mouth, or more specifically, forum posting, we can turn this thing around. Who's with me?
 

· Registered · 854 Posts
I am. Remember when there was EDTV and HDTV? HDTV won. Fox was one of the first networks to adopt EDTV widescreen while HDTV was around, calling it "Fox Widescreen" for EDTV and "Fox High Resolution Widescreen" for HD presentations. 24 was one of the landmark shows presented this way, as were the 2001-02 Super Bowl and World Series.
 

· Registered · 1,438 Posts
I like the name 4K a lot better than UHD. So it's not true 4K, who cares?
 

· Registered · 10,851 Posts
Currently, I call it Pseudo 4K and Real 4K.
When they finally get the specifications worked out so that proper 4K can be presented to consumers, then I'll call it Actual 4K. But will that happen in time for January's CES, or are they still fighting?
 

· Registered · 116 Posts
Call it 4K-PRO and 4K-UHD.

4K-PRO will be for professional equipment, and it will be much easier to add that name to a small field of products. Companies will stick with it because they are selling to a sophisticated market.

4K-UHD, or just "4K," will refer to the consumer version, because consumers have grabbed the 4K name and will not let go.

Steve
 

· Banned · 10,830 Posts · Discussion Starter · #7
What you guys need to remember is that this isn't a format war. 4K is 4K, and UHD is UHD. They're different, and the UHD standards being created right now are not going to meet the high standards 4K already has. HDMI 2.0 WILL NOT allow for it. There simply isn't enough bandwidth to transmit the information needed to meet 4K standards. We are going to be stuck with lower-quality source material, which is why we shouldn't be calling UHD "4K." We aren't getting the same resolution, and we won't be getting the same set of standards 4K already has.

Regarding EDTV vs. HDTV: this is not the same situation, because we don't have two competing resolutions. There is only UHD (3840x2160). There is no "winner" and no "proper" 4K coming, only the UHD we already have.
 

· Registered · 222 Posts
JVC brought out the 4K label for their projectors a couple of years back, and it's sort of stuck. In reality, 4K means 4000, so both 4096 and 3840 are technically incorrect. JVC talks 4K and the professionals talk 4K, but they're talking about different things.
I kind of prefer 4K for 3840, but I get your point: they need to be differentiated, as they mean different things. I think consumers like 4K, so maybe the pros should adopt new terminology; after all, it's a smaller pool of people to ask to convert than the general public.
 

· Registered · 51 Posts
I know it's probably sacrilege to say this, but why not dump HDMI and begin switching to DisplayPort/Thunderbolt? HDMI could still be included, but it would not give the best picture quality, similar to when they phased over from component video.
 

· Registered · 623 Posts
I think HDMI can do it; we'd just need to team up two connections. Lots of high-res computer monitors used to require two simultaneous DVI connections from the PC video card; I believe each DVI connection essentially carried either the top or bottom half of the display resolution. I think HDMI 2.0 could probably do true 4K if you split the signal over two HDMI cables.
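Here's the rough arithmetic behind that idea (hypothetical numbers on my part, just dividing the raster between two links):

# Hypothetical two-cable split: each link carries half the raster (say, the top and
# bottom halves), so each needs roughly half the uncompressed data rate.
w, h, fps, bits, samples = 4096, 2160, 60, 12, 3    # true 4K, 60p, 12-bit 4:4:4
total_gbps = w * h * fps * bits * samples / 1e9
print(f"Single link would need: {total_gbps:.1f} Gbps")      # ~19.1 Gbps
print(f"Each of two links:      {total_gbps / 2:.1f} Gbps")  # ~9.6 Gbps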

And I have to agree with a previous poster. Back in the early days of HD, everyone thought the consumer would be fine with 720p/1080i. But before we knew it, manufacturers started marketing 1080p as Full HD or TrueHD, Joe Average caught on, and soon enough you couldn't even sell a 720p or 1080i set. I can absolutely see this happening again (come on, home A/V has been going for 50 years, and history keeps repeating itself every few years from what I've seen). So I see 4K-UHD starting off, and then some manufacturers will realize they can sell more by tossing in the extra pixels (and, hopefully, the ability to accept actual 4K video) and calling it True4K.

I understand the OP's frustration, but it is what it is. It's just a name, and as long as enthusiasts know the difference, that's all that matters. Well, okay, what really matters is whether they will actually change course and give us real 4K instead of this UHD "4K lite." Aw man, I can just see how bit-starved even UHD is going to be over broadcast, cable, and satellite.
 

· Registered · 1,176 Posts
I don't really think the term "2160p" will ever catch on. It's a good idea, but it's not as much of a buzzword as 4K or Ultra HD, and we all know how much people (big-box shoppers) like buzzwords.
 

· Banned · 10,830 Posts · Discussion Starter · #13
Lazarus Dark -

I'm not really upset about people using a nickname instead of the proper technical name; being upset over that would be ridiculous. I'm upset that people are calling it 4K, because 4K is already something, has been for a number of years, and isn't anything close to what we're going to get at home. There's no way the standard will match what we get in commercial theaters. For that to work, we'd be buying mechanical hard drives instead of discs; 4K files are so massive that not even the rumored 300GB Blu-ray discs would be large enough to hold them.
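For a sense of scale, here's my back-of-the-envelope math using the DCI 250Mbps image ceiling:

# Rough file size for a feature encoded at DCI's 250 Mbps image ceiling.
bitrate_bps = 250e6
for hours in (2, 3):
    gigabytes = bitrate_bps * hours * 3600 / 8 / 1e9
    print(f"{hours}-hour feature: ~{gigabytes:.0f} GB")  # ~225 GB, ~338 GB

And that's the image track alone, before audio and subtitles.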

The interconnect for 4K works as you propose: two dual-link HD-SDI connections are used in tandem to carry the data from media server to projector. HDMI and single-link DVI use the same TMDS signaling and are, for all intents and purposes, the same connection. They are revising this interconnect AGAIN, after multiple revisions already, to squeeze as much from it as possible. From the information already leaked about the finalized HDMI 2.0 spec, we know it cannot handle UHD or 4K resolution with 12-bit, 4:4:4 video at higher frame rates. Given that HDMI brought a single all-in-one interconnect to the table some years ago and basically revolutionized things, I highly doubt they will take a step back and go with a dual-interconnect solution.
 

· Coyote Waits · 27,308 Posts
I've tried in various display threads to point out the simple distinction between image retention and burn-in. Good luck with this one.
 

· Registered · 10,401 Posts
Technically, it can't be called UHD either, because NHK has reserved "Ultra High Vision" (Hi-Vision is what they call high definition) for 8K resolution.

Besides, JVC and Sony were already calling their displays and projectors 4K before the term UHD was even coined.
 

· Registered · 605 Posts
Nice write-up, Seegs. I have to agree with you that AVS is a great place to start to possibly turn things around.

On the other hand (not to be negative or anything), manufacturers are always trying to find ways to differentiate themselves from others in the market. Rewind to when HD was surfacing: TVs were labeled "HD Ready," which didn't mean much and wasn't entirely true. I believe manufacturers will use whichever term they feel will gain more traction. As David mentioned above, 4K is grabbing everyone's attention, and Sony and JVC are using that.
 

· Registered · 105 Posts
I am a little more upset by Sony printing "Mastered in 4K" on their Blu-rays and displaying them next to their new "4K" TVs without making it clear that they are not 4K Blu-rays. They have the same 1080p resolution as other Blu-rays, except Sony took "1080p" off the description on the back to mislead customers. It's shameful behavior, disguising the fact that there will never be a consumer 4K movie format, which makes the debate academic. The difference between the home and theater standards, again, exists because they know there is no native 4K consumer format planned, so the only thing they have to concern themselves with is easy 1080p upscaling.

Yet again, we get another standard full of compromise. Most movies will still use only two-thirds of the screen, losing the rest to black bars. I saw four different sizes of "4K" TVs in the Sony store, and the image looked very clear. I asked the assistant to switch 120Hz mode off (sometimes called "game-show mode"), and the cable TV image looked no better than on an equivalent 1080p TV, in the same way that a 1080p TV with 120Hz mode looks clearer than a 1080p TV without it. The 65-inch, 55-inch, and 50-inch models were no better than their 1080p equivalents. The 85-inch, $25K model might have shown a small difference on native content, but they wouldn't turn 120Hz mode off for me to know for sure. Not that it matters, unless you're okay with watching the same 10 pieces of content it ships with forever.

History shows, every time, that the format with the content is the successful one. With two formats, the one with access to a greater number of lower-priced titles is the winner, even if it is technically inferior. This is another MiniDisc right now, or a Betamax, or an HD DVD. Sony has shown itself capable of selling early adopters expensive kit and then abandoning them; it is too big to support a small niche market. And before anyone mentions the benefits of looking at your own pictures on a 4K TV, that is an example of a small niche market.

I've heard they are already testing 8K broadcasts in Japan and are about to start testing here in the US. Perhaps the next consumer format will be 8K and will skip over 4K altogether? If the technology already exists, in a world of fixed-pixel displays, why invest in the one that has already been superseded? I would advise anyone thinking of being an early adopter to take a trip to the Sony store, ask to see the TV without 120Hz mode, and, if possible, also look at a 120Hz 1080p set to make sure you can see enough difference to justify a $5K-$25K TV purchase over the bargain 1080p sets that will practically be given away. The guy in the Sony store said the only 4K device he would consider is the 4K projector (also $25K), which is real 4K and might make a visible difference.

 

· Registered · 7,007 Posts
It'd probably be easier to rename the commercial format and leave 4K for the marketing folks who love how it sounds.

Let's be honest: 4K just sounds so much cooler.
 

· Registered · 116 Posts
Just because a one-terabyte hard drive formats down to about 930 gigabytes does not mean that it stops being a terabyte hard drive.
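(For what it's worth, that gap is just decimal versus binary units; a quick check:)

# Drive makers count 1 TB as 10^12 bytes; most operating systems report binary gibibytes.
print(f"{1e12 / 2**30:.0f}")  # ~931, roughly the "930 GB" people see after formatting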

Also:

4K-PRO adds only a single syllable to the 4K name, which is similar to appending "CAM" to BETA.

BETACAM was the PRO format.
BETAMAX or BETA was the consumer format.

No problem.
 