AVS › AVS Forum › Display Devices › Flat Panels General and OLED Technology › 8K by 4K or Octo HD - the real SUHDTV technology

8K by 4K or Octo HD - the real SUHDTV technology - Page 5

post #121 of 670
Octo, outside all the other arguments, is an ugly name. I reject it on that principle alone.

Now a really good name would be EightcHD ;)
post #122 of 670
Quote:
Originally Posted by irkuck View Post

Now one can see your confusion comes from math knowledge limited to basic arithmetic. It is obvious Octo K does not make sense since there is no reference to HD; it is the limitation of basic arithmetic to think it should be 8 x HD. Octo HD simply refers both to HD AND to the number 8 - 'AND' means logical conjunction here, not multiplication. It is in the same style as Quad HD, with the exception that Quad can also refer to the area of 4xHD. So try to go beyond basic math and include the logical AND operation.

This is laughable.

It's pretty clear how they are naming the display resolutions:

Ultra High Definition is either 4k or 8k.

UHDTV 4k is also known as 2160p
UHDTV 8k is also known as 4320p

Those are the standards, and I hope you're not disputing that!

OK - 2160p has been referred to as Quad HD only because it contains 4 times the number of pixels of HD. Think of it as a nickname referring to the standard.
So - you could call 4320p Sexe HD or Hexa HD, since it has 16 times the number of pixels of HD, but Octa HD is WRONG.
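For anyone who wants to check the multiples themselves, the pixel arithmetic is easy to verify (a quick Python sketch; the format names and dimensions are the standard ones):

```python
# Pixel counts for the common consumer formats (width x height).
formats = {
    "720p":  (1280, 720),
    "1080p": (1920, 1080),
    "2160p": (3840, 2160),  # UHDTV "4K"
    "4320p": (7680, 4320),  # UHDTV "8K"
}

pixels = {name: w * h for name, (w, h) in formats.items()}

# 2160p has 4x the pixels of 1080p (hence "Quad HD"); 4320p has 16x,
# not 8x - which is the whole argument against an "Octo/Octa" prefix.
print(pixels["2160p"] // pixels["1080p"])  # 4
print(pixels["4320p"] // pixels["1080p"])  # 16
```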
post #123 of 670
Quote:
Originally Posted by reconlabtech View Post

Octo, outside all the other arguments, is an ugly name. I reject it on that principle alone.
Now a really good name would be EightcHD ;)

How about . . .

Super HD
Boss HD
A-1 HD

And my favorite . . . .

Apex HD

:P
post #124 of 670
Quote:
Originally Posted by Lee Stewart View Post

How about . . .
Super HD
Boss HD
A-1 HD
And my favorite . . . .
Apex HD
:P

While you are correct that Octa HD is crazy - I disagree it needs a new name.

HD covers more than one definition.

720p, 1080i and 1080p are all under the umbrella of HD. That's why they are referred to as 720p HD and 1080p HD.

So both of these are under the umbrella of UHD. One is 2160p and the other is 4320p.

So we should just leave it as it is defined.

UHD 2160p or UHD 4320p.

Everything else is unnecessary nicknames.
post #125 of 670
Quote:
Originally Posted by ebernazz View Post

While you are correct that Octa HD is crazy - I disagree it needs a new name.
HD covers more than one definition.
720p, 1080i and 1080p are all under the umbrella of HD. That's why they are referred to as 720p HD and 1080p HD.
So both of these are under the umbrella of UHD. One is 2160p and the other is 4320p.
So we should just leave it as it is defined.
UHD 2160p or UHD 4320p.
Everything else is unnecessary nicknames.

What! You don't like my Apex HD? :D :P

LOL - I do agree with you. UHD 2160P and UHD 4320P are perfect monikers.
post #126 of 670
The next standard must be complete, with zero need to ever expand it as display technology improves. That means it should not be defined by what current displays can do in bit depth and color space, but rather by an open-ended resolution and bit depth and the full color space the typical human eye can see. Then create the appropriate standards to convert from this space down to whatever the current display is capable of, possibly with a reasonable set of fixed profiles: for example, resolutions limited to 2K, 4K, and 8K; fixed refresh rates from 24p up to at least 144p (3D x 3*24p); and color spaces of Rec. 709, some expanded color space, and the complete color space, with support for up to 16-bit depth. I think the key to the new standard is that it meet or exceed what is physically practical for home viewing, even if current displays are nowhere near that capability.

Ideally they would also completely separate the display from the broadcast/encoding standards, such that I could hook up any new system to my display and it would just work. We had this in the old analog days, when many projectors could handle HDTV long before the standard even came out, because the display protocol (analog RGB) was independent of the HDTV standard. Then they broke it by saying that if your device can't talk HDMI with the specific standards, your display is not allowed. So we run into the problem we have right now, where any time a new standard comes out the display itself becomes obsolete, like Blu-ray 3D. If all you have is a 3D disc, you must have a new 3D-capable player and a 3D display or you cannot use the disc. It would have been far superior to plan ahead so it worked in 2D players, and at the very least in 3D players connected to 2D displays, as the "conversion" is as simple as displaying only the left eye's image. Same problem with Deep Color. Don't require my display to talk the latest transmission protocol or it cannot be used!
post #127 of 670

Seems to me that the simplest descriptor would be the way cameras do it:  # megapixels = MP (i.e., 1080p is then 2 MP, etc.)
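The camera-style arithmetic is simple enough - pixel count divided by a million (a Python sketch; rounding to one decimal is my choice, not any standard):

```python
# Camera-style "megapixel" labels for display formats (1 MP = 1,000,000 px).
def megapixels(width, height):
    return width * height / 1_000_000

print(round(megapixels(1920, 1080), 1))  # 2.1 - i.e. 1080p is the "2 MP" format
print(round(megapixels(3840, 2160), 1))  # 8.3 - UHDTV "4K"
print(round(megapixels(7680, 4320), 1))  # 33.2 - UHDTV "8K"
```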

post #128 of 670
Quote:
Originally Posted by Lee Stewart View Post

What! You don't like my Apex HD? :D :P
LOL - I do agree with you. UHD 2160P and UHD 4320P are perfect monikers.
Except they are missing the frame rate
post #129 of 670
Quote:
Originally Posted by Chronoptimist View Post

I think the point is that 8K should look good at any size or viewing distance. At larger screen sizes, 4K is not going to look any better than current sub-60″ 1080p displays. Having more resolution than you need is never a bad thing, and I am definitely looking forward to 8K displays. I can’t imagine them being released any time in the near future though.

I think it depends on what you want to use it for. If TV only, then you have a point. If I'd like to use it as a desktop for my computer, then 4K is absolutely beneficial, as anything over 32" at 1080p, IMO, starts to get hard on the eyes for reading text. Would love to be able to work from a zero gravity chair with a wall mounted 4K+ computer screen looking down at me. :-)
post #130 of 670
Quote:
Originally Posted by Joe Bloggs View Post

Except they are missing the frame rate

There will be multiple frame rates - everything from 24 up to 120. But that doesn't change the description. We all recognize HD 1080P even though it can mean 1080/24P, 1080/30P or 1080/60P.

Even John Q. Public recognizes 1080P . . . Full HD
Edited by Lee Stewart - 9/19/12 at 1:42pm
post #131 of 670
Quote:
Originally Posted by Chuck Anstey View Post

The next standard must be complete, with zero need to ever expand it as display technology improves. That means it should not be defined by what current displays can do in bit depth and color space, but rather by an open-ended resolution and bit depth and the full color space the typical human eye can see. Then create the appropriate standards to convert from this space down to whatever the current display is capable of, possibly with a reasonable set of fixed profiles: for example, resolutions limited to 2K, 4K, and 8K; fixed refresh rates from 24p up to at least 144p (3D x 3*24p); and color spaces of Rec. 709, some expanded color space, and the complete color space, with support for up to 16-bit depth. I think the key to the new standard is that it meet or exceed what is physically practical for home viewing, even if current displays are nowhere near that capability.

The standards are written for both professional and consumer needs. Just like today for HD.
Quote:
Ideally they would also completely separate the display from the broadcast/encoding standards, such that I could hook up any new system to my display and it would just work. We had this in the old analog days, when many projectors could handle HDTV long before the standard even came out, because the display protocol (analog RGB) was independent of the HDTV standard. Then they broke it by saying that if your device can't talk HDMI with the specific standards, your display is not allowed. So we run into the problem we have right now, where any time a new standard comes out the display itself becomes obsolete, like Blu-ray 3D. If all you have is a 3D disc, you must have a new 3D-capable player and a 3D display or you cannot use the disc. It would have been far superior to plan ahead so it worked in 2D players, and at the very least in 3D players connected to 2D displays, as the "conversion" is as simple as displaying only the left eye's image. Same problem with Deep Color. Don't require my display to talk the latest transmission protocol or it cannot be used!

HDMI was used to strengthen DRM. It won't be any different for UHDTV.

CEMs don't want you to buy a television that will last you 10 or 15 years. They want you to buy a new TV every 5 years. That's why they keep adding bells, lights and whistles to them every year or so.
post #132 of 670
Quote:
Originally Posted by Lee Stewart View Post

There will be multiple frame rates - everything from 24 up to 120. But that doesn't change the description. We all recognize HD 1080P even though it can mean 1080/24P, 1080/30P or 1080/60P.
Even John Q. Public recognizes 1080P . . . Full HD
That's why we need to specify the frame rate(s) when talking about a particular video format. There's a big difference between 1080p50 and 1080p24. Just saying "1080p" or something like "when will broadcasters broadcast in 1080p" or "this is capable of 1080p" or "2160p" doesn't tell enough about the format. You need the frame rate(s) added. If you look at the broadcast documents (e.g. by the EBU), they usually add the frame rate after the p, e.g. 1080p/50, 2160p/50.
Edited by Joe Bloggs - 9/19/12 at 2:16pm
post #133 of 670
The Far East hates the number 4.

Maybe that's why they want to skip it already.
post #134 of 670
Quote:
Originally Posted by Joe Bloggs View Post

That's why we need to specify the frame rate(s) when talking about a particular video format. There's a big difference between 1080p50 and 1080p24. Just saying "1080p" or something like "when will broadcasters broadcast in 1080p" or "this is capable of 1080p" or "2160p" doesn't tell enough about the format. You need the frame rate(s) added. If you look at the broadcast documents (e.g. by the EBU), they usually add the frame rate after the p, e.g. 1080p/50, 2160p/50.

All you are doing is complicating a simple moniker. Listing the vertical resolution as the general name of the format is SOP, and has been for decades - ever since we went from TV's 480i to PC's 480P. I am sure you have seen those distance-to-display seating charts, right? All they list is the vertical resolution.

Frame rates are not as important as they used to be because TVs can manipulate frame rates so easily.

If the world was nothing but AV geeks, I would agree with you. But in most cases the KISS principle works very well. And THIS is one of those cases.
post #135 of 670
Quote:
Originally Posted by defdog99 View Post

The Far East hates the number 4.
Maybe that's why they want to skip it already.

Though that is true, they want to avoid what the USA is famous for . . . baby step improvements. They know that for the time and money it will take to put together a 4k infrastructure, they can wait just a few years and spend all their budgets on getting 8K into the field. Make a giant leap . . . not a simple step up.
post #136 of 670
Quote:
Originally Posted by Chuck Anstey View Post

The next standard must be complete, with zero need to ever expand it as display technology improves. That means it should not be defined by what current displays can do in bit depth and color space, but rather by an open-ended resolution and bit depth and the full color space the typical human eye can see. Then create the appropriate standards to convert from this space down to whatever the current display is capable of, possibly with a reasonable set of fixed profiles: for example, resolutions limited to 2K, 4K, and 8K; fixed refresh rates from 24p up to at least 144p (3D x 3*24p); and color spaces of Rec. 709, some expanded color space, and the complete color space, with support for up to 16-bit depth. I think the key to the new standard is that it meet or exceed what is physically practical for home viewing, even if current displays are nowhere near that capability.
Ideally they would also completely separate the display from the broadcast/encoding standards, such that I could hook up any new system to my display and it would just work. We had this in the old analog days, when many projectors could handle HDTV long before the standard even came out, because the display protocol (analog RGB) was independent of the HDTV standard. Then they broke it by saying that if your device can't talk HDMI with the specific standards, your display is not allowed. So we run into the problem we have right now, where any time a new standard comes out the display itself becomes obsolete, like Blu-ray 3D. If all you have is a 3D disc, you must have a new 3D-capable player and a 3D display or you cannot use the disc. It would have been far superior to plan ahead so it worked in 2D players, and at the very least in 3D players connected to 2D displays, as the "conversion" is as simple as displaying only the left eye's image. Same problem with Deep Color. Don't require my display to talk the latest transmission protocol or it cannot be used!

I agree with your desire, but this is impossible not just because technology evolves, but because technology evolves in unpredictable ways: the questions that we're trying to answer and the problems that we're trying to solve with current video paradigms won't necessarily be the same questions/problems to address in 10, 20, or 30 years.

For instance, right now we take for granted that video media captures and reproduces flat images (3D being defined as two flat stereo images in parallel) in the shape of a rectangle (16x9, 21x9, 1.33:1, etc. - there are a few variations, but they're all flat quadrant-shaped boxes). Given those assumptions, we then have more assumptions, such as how images are recorded in a matrix of square-shaped pixels measured by sampling/quantizing light along a neat array of grid-like sample points.

From there, we argue about what resolution density and color-depth is necessary for those grid-arrayed pixels to make the images appear life-like from a given viewing distance, and about how many such frame captures we need per second to represent natural looking motion.

What if 20 years from now we abandon this "linear sampling" methodology in favor of fractal algorithms that represent the original image and can be rendered to a theoretically infinite output resolution? What if we abandon the whole idea of "flat rectangle image capture" and go with holographic capture that measures a depth axis in addition to other image parameters, so we can recreate real holographic playback viewed from any angle in true 3D? Or what if we develop cameras that can record in 360 degrees horizontally and vertically, so we have "dome shaped images" for the ultimate immersion experience? Imagine the video game or nature documentary in dome-video. :-)

Whatever the next generation of cutting edge displays can deliver, it won't be long before ultra-realism breaks out of the flat 16x9 box.
post #137 of 670
Quote:
Originally Posted by Chuck Anstey View Post

The next standard must be complete, with zero need to ever expand it as display technology improves. That means it should not be defined by what current displays can do in bit depth and color space, but rather by an open-ended resolution and bit depth and the full color space the typical human eye can see. Then create the appropriate standards to convert from this space down to whatever the current display is capable of, possibly with a reasonable set of fixed profiles: for example, resolutions limited to 2K, 4K, and 8K; fixed refresh rates from 24p up to at least 144p (3D x 3*24p); and color spaces of Rec. 709, some expanded color space, and the complete color space, with support for up to 16-bit depth. I think the key to the new standard is that it meet or exceed what is physically practical for home viewing, even if current displays are nowhere near that capability.
Ideally they would also completely separate the display from the broadcast/encoding standards, such that I could hook up any new system to my display and it would just work. We had this in the old analog days, when many projectors could handle HDTV long before the standard even came out, because the display protocol (analog RGB) was independent of the HDTV standard. Then they broke it by saying that if your device can't talk HDMI with the specific standards, your display is not allowed. So we run into the problem we have right now, where any time a new standard comes out the display itself becomes obsolete, like Blu-ray 3D. If all you have is a 3D disc, you must have a new 3D-capable player and a 3D display or you cannot use the disc. It would have been far superior to plan ahead so it worked in 2D players, and at the very least in 3D players connected to 2D displays, as the "conversion" is as simple as displaying only the left eye's image. Same problem with Deep Color. Don't require my display to talk the latest transmission protocol or it cannot be used!

I agree with you in principle, but the industry will not, for various reasons and excuses. The main problem is that the TV / display market has fallen in love with making ****** standards that need upgrading, because that encourages people to buy new displays. What you said is true of SDTV, VGA and DVI in computers. Look at the higher-resolution, higher-refresh-rate displays on computers: almost everyone is driving them with dual-link DVI, which is two standards behind the curve. The newest standard, HDMI, was garbage, and IMO it was purposely made like that to push people to buy new stuff. It used to be you could buy an amp and sit on it for decades; now receivers need updating every couple of years just to support connections like HDMI, 3D etc. And the industry loves this. They will keep doing this to milk us dry for the next 25 years; they are thinking ahead, and the last thing they want is for us to be able to buy something and be satisfied for 10+ years.

Look at how many people are repurchasing movies etc. for Blu-ray to upgrade their collection. You get to resell people the same content over and over with constant standard upgrades.
post #138 of 670
Good news for me. I was about to plunk down some change for the JVC RS65 4K projector.....NOT!! Not till there are 4k HD movie discs on the market. "BlurayPLUS". :)

8k displays?? What's the point if the human eye can allegedly only resolve up to 4k? Unless it's for commercial theaters or film mastering studios.
post #139 of 670
Quote:
Originally Posted by DaViD Boulet View Post

Quote:
Originally Posted by Chuck Anstey View Post

The next standard must be complete, with zero need to ever expand it as display technology improves. That means it should not be defined by what current displays can do in bit depth and color space, but rather by an open-ended resolution and bit depth and the full color space the typical human eye can see. Then create the appropriate standards to convert from this space down to whatever the current display is capable of, possibly with a reasonable set of fixed profiles: for example, resolutions limited to 2K, 4K, and 8K; fixed refresh rates from 24p up to at least 144p (3D x 3*24p); and color spaces of Rec. 709, some expanded color space, and the complete color space, with support for up to 16-bit depth. I think the key to the new standard is that it meet or exceed what is physically practical for home viewing, even if current displays are nowhere near that capability.
Ideally they would also completely separate the display from the broadcast/encoding standards, such that I could hook up any new system to my display and it would just work. We had this in the old analog days, when many projectors could handle HDTV long before the standard even came out, because the display protocol (analog RGB) was independent of the HDTV standard. Then they broke it by saying that if your device can't talk HDMI with the specific standards, your display is not allowed. So we run into the problem we have right now, where any time a new standard comes out the display itself becomes obsolete, like Blu-ray 3D. If all you have is a 3D disc, you must have a new 3D-capable player and a 3D display or you cannot use the disc. It would have been far superior to plan ahead so it worked in 2D players, and at the very least in 3D players connected to 2D displays, as the "conversion" is as simple as displaying only the left eye's image. Same problem with Deep Color. Don't require my display to talk the latest transmission protocol or it cannot be used!

I agree with your desire, but this is impossible not just because technology evolves, but because technology evolves in unpredictable ways: the questions that we're trying to answer and the problems that we're trying to solve with current video paradigms won't necessarily be the same questions/problems to address in 10, 20, or 30 years.

For instance, right now we take for granted that video media captures and reproduces flat images (3D being defined as two flat stereo images in parallel) in the shape of a rectangle (16x9, 21x9, 1.33:1, etc. - there are a few variations, but they're all flat quadrant-shaped boxes). Given those assumptions, we then have more assumptions, such as how images are recorded in a matrix of square-shaped pixels measured by sampling/quantizing light along a neat array of grid-like sample points.

From there, we argue about what resolution density and color-depth is necessary for those grid-arrayed pixels to make the images appear life-like from a given viewing distance, and about how many such frame captures we need per second to represent natural looking motion.

What if 20 years from now we abandon this "linear sampling" methodology in favor of fractal algorithms that represent the original image and can be rendered to a theoretically infinite output resolution? What if we abandon the whole idea of "flat rectangle image capture" and go with holographic capture that measures a depth axis in addition to other image parameters, so we can recreate real holographic playback viewed from any angle in true 3D? Or what if we develop cameras that can record in 360 degrees horizontally and vertically, so we have "dome shaped images" for the ultimate immersion experience? Imagine the video game or nature documentary in dome-video. :-)

Whatever the next generation of cutting edge displays can deliver, it won't be long before ultra-realism breaks out of the flat 16x9 box.

We have had 2D images by way of TV for over 60 years, and I fully expect that to continue for a very long time. People have been talking about holographic TVs and flying cars for a very long time, yet here we are in our 2D and simulated-3D viewing world with cars that drive on roads. Besides, the standards are for the notion of a "flat" display. When a completely new type of display is created, it will need its own standard and a whole new way of recording material. We would also still have over 100 years of 2D and simulated-3D material that won't benefit from a new type of holographic / domed display.

Most of your statements are why I said they have to break the connection between the encoding and the display. So what if there is a new compression technique that can theoretically be infinite in resolution? That is why the most important standard is the one to convert from the open-ended standard down to some very capable display standard. Secondly we still cannot record at infinite resolution nor can we see in infinite resolution. We could theoretically produce artificial images of any resolution but not record them.
post #140 of 670
Quote:
Originally Posted by BDD888 View Post

8k displays?? What's the point if the human eye can allegedly only resolve up to 4k? Unless it's for commercial theaters or film mastering studios.

If you put two 84" displays next to each other, one showing 4K and the other showing 8k and sat 6 feet away from them, you would have to be literally blind not to see the difference in the images.

The real point of increased resolution is to let the viewer sit closer to the display - have the images fill more of their view without any visible image structure. If you are sitting 14 feet away from a 60" HDTV and change it to a 60" 4K TV - no - you probably will not see any difference in the image quality. And this presents a problem for 4K TV getting a foothold in the mass market. Are regular consumers going to change their living rooms to maximize the installation of a 4K TV? Or will they simply put it where their HDTV resides now - the wall opposite the couch - and if it's 14 feet away, that's where it will be. And then they will proclaim, as they have in the past about HD . . . "it doesn't look any different."
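The 60"/14-foot example can be sanity-checked with the usual acuity arithmetic. A sketch under the common (and debatable) assumption that 20/20 vision resolves about 1 arcminute; the function name is mine:

```python
import math

# Distance beyond which a 16:9 display's individual pixels can no longer
# be resolved, assuming ~1 arcminute of visual acuity (20/20 vision).
def max_useful_distance_ft(diagonal_in, vertical_pixels):
    height_in = diagonal_in * 9 / math.hypot(16, 9)  # screen height in inches
    pixel_pitch = height_in / vertical_pixels        # one pixel, in inches
    one_arcmin = math.radians(1 / 60)
    return pixel_pitch / math.tan(one_arcmin) / 12   # convert to feet

# 60" 1080p: pixels blend together somewhere around 8 feet,
# so from 14 feet a 60" 4K panel can't show any extra detail.
print(max_useful_distance_ft(60, 1080))  # ~7.8
print(max_useful_distance_ft(60, 2160))  # ~3.9
```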
post #141 of 670
People sit "close enough" already in rooms typically not optimized for general TV watching or HT enjoyment. Most people just don't have the space. The average home owner barely has room for 5.1 speakers, often taking the WAF into account, let alone having the luxury of a dedicated HT room (sound treated, right dimensions...etc.). So, while you say the point of increased res is so we can sit closer...I think the real benefit for most people who can afford a 4k display would be to see more detail. Maybe in a movie disc format that will also have more dynamic range for more color variety and shadow detail.

I'm already happy with 1080p on my Pioneer Elite plasma (not ISF calibrated...yet) and what the Bluray medium is delivering for the most part. And even supposing we do get 4k Blurays and 4k Bluray players in the near future, the quality of the discs will always come down to the mastering ability of the film studio. That's what it's always come down to, no matter the resolution or movie format.

Having said that, I'd still welcome the 4k format (movie discs/players/displays). I'd love to see what it will deliver to my "human eyes" - how much better the picture will be vs what we're getting with 1080p Blurays.
Edited by BDD888 - 9/19/12 at 5:31pm
post #142 of 670
Quote:
Originally Posted by BDD888 View Post

People sit "close enough" already in rooms typically not optimized for general TV watching or HT enjoyment. Most people just don't have the space. The average home owner barely has room for 5.1 speakers, often taking the WAF into account, let alone having the luxury of a dedicated HT room (sound treated, right dimensions...etc.). So, while you say the point of increased res is so we can sit closer...I think the real benefit for most people who can afford a 4k display would be to see more detail. Maybe in a movie disc format that will also have more dynamic range for more color variety and shadow detail.
I'm already happy with 1080p on my Pioneer Elite plasma (not ISF calibrated...yet) and what the Bluray medium is delivering for the most part. And even supposing we do get 4k Blurays and 4k Bluray players in the near future, the quality of the discs will always come down to the mastering ability of the film studio. That's what it's always come down to, no matter the resolution or movie format.
Having said that, I'd still welcome the 4k format (movie discs/players/displays). I'd love to see what it will deliver to my "human eyes" - how much better the picture will be vs what we're getting with 1080p Blurays.

With a larger display, your viewing angle is increased even if your sofa doesn't move.

Somewhere between moving furniture and getting a larger sized TV, 4K, and even 8K, will benefit many videophile viewers.

Just think... with a 4K display you could watch 4 1080p HD channels *simultaneously* on the same screen... one image in each quadrant... and all images would be full 1080p HD resolution, allowing the viewer to pick the image that supplies the audio.

Why would you want to do that? Well... let's imagine you're watching a live sporting event that's multi-cast on more than one HD station, each with their own cameras and unique shooting angles. This way you could watch up to four at once and watch the game "multi-angle". Or imagine being able to watch one show with one set of headphones while your spouse watches another with another set of headphones (similar to what can be done with active-shutter glasses to show two different programs on the same screen). Or maybe you want to watch one show while waiting to switch over to another... it's a full 1080p PIP feature.
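The quadrant claim is exact, not approximate - UHD "4K" is precisely a 2x2 tiling of 1080p (trivial to check in Python):

```python
# A 3840x2160 panel tiles exactly four full 1920x1080 images.
uhd_w, uhd_h = 3840, 2160
hd_w, hd_h = 1920, 1080

assert uhd_w % hd_w == 0 and uhd_h % hd_h == 0  # clean integer tiling
print((uhd_w // hd_w) * (uhd_h // hd_h))  # 4 quadrants, each full 1080p
```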

Also, imagine being able to easily read fine text from your PC or notebook desktop sent to your 4K display, using your living room display as your PC/tablet monitor for an image to share with others. Finer resolution allows for finer edge-contouring, which means smoother text and textures even if upscaling is used (at even-integer upscaling, the math is very clean).

Passive 3D now has no loss of resolution from the 1920 x 1080 source.

Lots of viable reasons why 4K and up can meet a need for consumers who aren't necessarily thinking along the lines of a dedicated HT room.
post #143 of 670
Dave, you raise some good points for the use of a 4k display even before we get 4k media. It would be nice to be able to view two 1080p programs simultaneously if that's your thing. But would someone willing to foot the bill for a 4k display (e.g. me, an HT enthusiast) be content? Would we not have a dedicated HT room? Or would the average Joe who won a lottery and bought a Sony 84" 4k display be content with the ability to watch two 1080p programs? I have PiP but almost never use it - I'd say I use it once or twice a year. For me it would have to be about the media (e.g. properly mastered 4k movies or concerts with reference quality audio).

So I'll continue to wait for the 4k format to continue to grow. Bring on the 4k movies!!

8k? Maybe we'll be able to watch 2 1080p movies, CNN in pseudo-1080i and Sports Center in 1080i. :)
post #144 of 670
Quote:
Originally Posted by Joe Bloggs View Post

Except they are missing the frame rate

I can swallow that. Each element further clarifies the UHDTV details.

At 24, 25, 50, 60, and 120 fps, and 2160p and 4320p, there would be 10 possibilities, all under UHDTV.

Doesn't mean the 2 I listed are wrong, just less informative.
post #145 of 670
post #146 of 670
Why aren't more mass market consumers buying 50+" HDTVs today? Why is that segment of the market only 16%? Why isn't it 25%? Or 30%?
post #147 of 670
Quote:
Originally Posted by Lee Stewart View Post

Why aren't more mass market consumers buying 50+" HDTVs today? Why is that segment of the market only 16%? Why isn't it 25%? Or 30%?

What percentage of the population actually buys quality, versus those buying on price? If it has colour and it's cheap - throw it away when it breaks - I believe these are the people the manufacturers are catering to: push the volumes! If they can buy a 32" for $298 but have to pay $498 for a 42", they'll buy the 32" - it's cheaper to throw away, and by then you can buy the 42" for $298!
Besides this, when was the last time you actually saw a BAD TV? In my day, it wasn't all that uncommon to go to a friend's or relative's house and sit and watch a green or purple picture.
post #148 of 670
Thread Starter 
Quote:
Originally Posted by ebernazz View Post

While you are correct that Octa HD is crazy - I disagree it needs a new name.
HD is defined as more than 1 definition.
720p, 1080i and 1080p are all under the umbrella of HD. Thats why they are referred to by 720p HD, 1080p HD.
So both are under the umberlla of UHD. One is 2160p and the other is 4320p.
So we should just leave it as it is defined.
UHD 2160p or UHD 4320p.
Everything else is unnecessary nicknames.

This is absurd, as nobody uses, and nobody will ever use, UHD 2160p or 4320p - not least because they are obviously p.
The use of the vertical-resolution description comes from the two HD standards, 720 and 1080.

Everybody switched to the horizontal pixel counts of 4K and 8K, since they describe the essence in the briefest of ways.

These descriptions have one deficiency though: they have no reference to HD, which is the universally accepted term for better TV. What is thus needed is an acronym referring both to the numbers AND to HD in a short form.

This is why Quad HD is now widely accepted and Octo HD is becoming popular.
post #149 of 670
Quote:
Originally Posted by irkuck View Post

This is absurd, as nobody uses, and nobody will ever use, UHD 2160p or 4320p - not least because they are obviously p.
The use of the vertical-resolution description comes from the two HD standards, 720 and 1080.

LOL - so they have used the HD terms 720 and 1080 for years, but they won't use 2160 and 4320? :rolleyes:
Quote:
Everybody switched to the horizontal pixel counts of 4K and 8K, since they describe the essence in the briefest of ways.

They are switching back. Didn't you know? Too much confusion. It has to do with the fact that neither 4K nor 8K is the actual horizontal resolution of its respective format: 4K is really 3840, while 8K is really 7680. Lawyers are very concerned this falls under false advertising.
Quote:
These descriptions have one deficiency though: they have no reference to HD, which is the universally accepted term for better TV. What is thus needed is an acronym referring both to the numbers AND to HD in a short form.
This is why Quad HD is now widely accepted and Octo HD is becoming popular.

ROTF LMAO!

The term "Octo HD" is becoming popular?

You are truly a legend in your own mind.

Well - based on your thoughts, I can say with confidence . . . no question . . . that you believed HD DVD was a much better moniker than Blu-ray. Absolutely, hands down, right?
post #150 of 670
Quote:
Originally Posted by p5browne View Post

What percentage of the population actually buys quality, versus those buying on price? If it has colour and it's cheap - throw it away when it breaks - I believe these are the people the manufacturers are catering to: push the volumes! If they can buy a 32" for $298 but have to pay $498 for a 42", they'll buy the 32" - it's cheaper to throw away, and by then you can buy the 42" for $298!

So how are you going to sell a 4K TV or 4K BD to these people?
Quote:
Besides this, when was the last time you actually saw a BAD TV? In my day, it wasn't all that uncommon to go to a friend's or relative's house and sit and watch a green or purple picture.

I ALWAYS bring a pair of sunglasses (neutral density lenses of course) when I go to someone else's home to watch TV. All too often, they have left the TV in "Torch Mode" where all the controls are set at max level.