AVS › AVS Forum › Display Devices › Flat Panels General and OLED Technology › 8K by 4K or Octo HD - the real SUHDTV technology

8K by 4K or Octo HD - the real SUHDTV technology - Page 20

post #571 of 670
Thread Starter 
Quote:
Originally Posted by millerwill View Post

Looks like we are again talking about different things, you about general purpose tv and me more about an HT setting. For the latter, 2 PH, or even less, is not uncommon at all.

There is a ton of confusion about viewing scenarios here, so it is always good to mention which scenario is being considered. For HT, the viewing distance can be adjusted for the highest impact and/or the most comfortable view. Even 0.75 PH (around a 100-degree field of view) for 8K might be fine in HT conditions. But this has nothing to do with the normal living room.
post #572 of 670
Thread Starter 
Japan official: Ultra HD public broadcast in July 2014, Octo HD public broadcast for the 2020 Olympic Games. Trial runs of Octo HD from 2016.
post #573 of 670

OMG that country drives me nuts. Knock it off Japan.
post #574 of 670
What! They didn't use the term "Octo HD" once in the article. Shame on them.
You must be so disappointed!
post #575 of 670
Most of this was "news" back in January. They have added 8K broadcast plans for 2020.
http://www.zdnet.com/japan-to-launch-4k-tv-broadcast-in-july-2014-7000010386/

So the only news here is that Japan plans 8K broadcasts in 2020. That's seven years from now - timetables/aims like these tend to slip.

Even though they plan 8K tests in 2016 and 8K broadcasts in 2020, the ministry aims to spur demand for new televisions that offer four times the pixels of the current "full high vision" standard (the Japanese term for 1080p HDTV) - an odd emphasis given the 8K plans.
Edited by 8mile13 - 6/6/13 at 5:23am
post #576 of 670
Thread Starter 
^But it means the first OHD panels might hit the streets about three years from now, the trend accelerated by the fact that dense LCDs are in fashion. UHD panels will thus have an extremely short life span on the bleeding edge of development.
post #577 of 670
Great. I can't imagine them being released any time in the near future, though.
post #578 of 670
Thread Starter 
^Try liberating your imagination by analogy: a couple of years ago you could not imagine the coming release of UHD TVs.
post #579 of 670
Thread Starter 
post #580 of 670

 

Huh?  What?  Where did all the naysaying "but you can't see resolutions that high" 4K people go?

 

If I'm really realllllly quiet..................I hear crickets.

 

Are they hiding?

post #581 of 670
There's no point in repeating that invisibility mantra. Everybody who cares knows it but the manufacturers don't care. They just want new revenue streams.
post #582 of 670
Quote:
Originally Posted by tgm1024 View Post

Huh?  What?  Where did all the naysaying "but you can't see resolutions that high" 4K people go?
If I'm really realllllly quiet..................I hear crickets.
Are they hiding?
Maybe they've actually been out and seen some of the 4K displays now.

It says right there that "8K-resolution will offer 16 times the pixels of current HD televisions"
post #583 of 670
Quote:
Originally Posted by Chronoptimist View Post
 
Quote:
Originally Posted by tgm1024 View Post

Huh?  What?  Where did all the naysaying "but you can't see resolutions that high" 4K people go?
If I'm really realllllly quiet..................I hear crickets.
Are they hiding?
Maybe they've actually been out and seen some of the 4K displays now.

 

Well, in case there's any argument whatsoever, please refer to this guide the next time someone needs clarification on what resolution is too high for what distance.

 


Edited by tgm1024 - 9/18/13 at 2:31pm
post #584 of 670
heh, I did a doubletake when you posted that, but then I looked a little closer and saw what you did dere.

Don't shoot me, but I am still highly skeptical of the perceived benefits of 4K on a 65" panel at 10 feet out. Further, I would not accept a tradeoff for more resolution versus an inferior contrast ratio (particularly on the lower end).
post #585 of 670
Quote:
Originally Posted by tgm1024 View Post

Well, in case there's any argument whatsoever, please refer to this guide the next time someone needs clarification on what resolution is too high for what distance.
http://www.avsforum.com/content/type/61/id/278616/flags/LL
That's great. I almost skipped over it too.
Quote:
Originally Posted by vinnie97 View Post

Don't shoot me, but I am still highly skeptical of the perceived benefits of 4K on a 65" panel at 10 feet out. Further, I would not accept a tradeoff for more resolution versus an inferior contrast ratio (particularly on the lower end).
Well hopefully the 4K upgrade will also include OLED by the time most of us get around to it. I certainly won't be moving from local dimming LED to an edge-lit 4K LCD, and I doubt most plasma owners would be making that change either.
Edited by Chronoptimist - 9/18/13 at 5:18pm
post #586 of 670
Quote:
Originally Posted by vinnie97 View Post

heh, I did a doubletake when you posted that, but then I looked a little closer and saw what you did dere.

Don't shoot me, but I am still highly skeptical of the perceived benefits of 4K on a 65" panel at 10 feet out. Further, I would not accept a tradeoff for more resolution versus an inferior contrast ratio (particularly on the lower end).

I think you're absolutely correct. Resolution-wise, one needs to sit no further than ~1.0 screen widths (SW) to have full benefit of 4K. So at 10 ft away, this means a projector and a 10 ft wide screen. At 2 SW I think one would see no benefit of 4K.
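As a sanity check, the ~1.0 SW figure falls straight out of the usual 1-arcminute (20/20) visual acuity rule of thumb; here is a quick sketch (the acuity value is an assumption, not a measurement):

```python
import math

def max_distance_screen_widths(h_pixels, acuity_arcmin=1.0):
    """Distance (in screen widths) at which adjacent pixels of a display
    h_pixels wide subtend acuity_arcmin of visual angle (20/20 ~ 1 arcmin)."""
    pixel_angle = math.radians(acuity_arcmin / 60.0)  # angle of one pixel
    # Solve (screen_width / h_pixels) / d = pixel_angle for d, with the
    # screen width as the unit of distance: d = 1 / (h_pixels * tan(angle)).
    return 1.0 / (h_pixels * math.tan(pixel_angle))

print(round(max_distance_screen_widths(3840), 2))  # 4K: 0.9 SW
print(round(max_distance_screen_widths(1920), 2))  # 1080p: 1.79 SW
```

At ~0.9 SW a 4K panel's pixel pitch matches the acuity limit, and by ~1.8 SW even 1080p pixels are below it - which is roughly the 2 SW point where 4K stops helping.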
post #587 of 670
Quote:
Originally Posted by tgm1024 View Post

Well, in case there's any argument whatsoever, please refer to this guide the next time someone needs clarification on what resolution is too high for what distance.


Here's a fun exercise that gets many people thinking when I show it to them:

Double or triple these distances when doing video games instead of video, since computer graphics are far clearer and you run into aliasing issues that are easily seen indirectly.

I can see aliasing artifacts on a 50 inch SEIKI 4K HDTV from more than 10 feet away, using this test pattern.
Proof: TestUFO: Aliasing Visibility Web Test

This test allows you to determine how far away you can get from your display before you can no longer see the benefits of its resolution. This is useful for determining the distance from a 4K display, for virtually worst-case computer graphics animation, at which you no longer see the benefits of 4K. Computer graphics (lots of sharp lines) are far clearer than video, pushing the limits of human visual acuity, including via indirect effects such as shimmering caused by aliasing, even when individual pixels are too small to be resolved individually by the human eye.

Don't believe me? Connect a computer to a 4K HDTV and then click TestUFO: Aliasing Visibility Web Test.

If you are a videogame player who plays "Borderlands 2" or other games involving very thin, fine lines similar to this (e.g. rotoscoped videogame graphics), and you don't want to turn on anti-aliasing due to framerate issues (most graphics cards cannot do 3D graphics with antialiasing quickly at 4K, so you have to turn antialiasing off), then this _definitely_ becomes visible in real-world video game material. Obviously, it doesn't happen if you antialias everything properly, or you're watching movies, but it goes to show that finite-resolution displays create macroscopic artifacts far, far, far beyond the human eye's resolution-resolving abilities. The chart you quoted is a good guideline, but quite narrow in scope, and does not accommodate other visible artifacts (e.g. aliasing, moire) that are still caused by finite-resolution displays.

The ability to detect these artifacts can also be very gamma dependent. A mis-adjustment of the display's gamma will make aliasing artifacts more visible, and can show up as beads of brightness modulation along a 1-pixel-thick antialiased line. It just goes to show that human vision interacts with so many display variables that artifacts are still being found (detracting from Holodeck perfection) even far beyond what one naturally expects.

Vision research is tons of fun, eh?
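The gamma interaction is easy to demonstrate numerically: an antialiased 1-pixel line splits its coverage between two adjacent pixel rows, and if the display gamma doesn't match the encoding gamma, the total emitted light varies with sub-pixel position - those are the "beads". A minimal sketch (the 2.2 encoding gamma is an assumption):

```python
def line_luminance(coverage, display_gamma, encoding_gamma=2.2):
    """Total linear light from a 1-px antialiased line whose coverage is
    split `coverage` / (1 - coverage) between two adjacent pixel rows.
    Pixel values are gamma-encoded at encoding_gamma and decoded by the
    display at display_gamma."""
    def out(c):
        encoded = c ** (1.0 / encoding_gamma)   # antialiased pixel value
        return encoded ** display_gamma         # light the display emits
    return out(coverage) + out(1.0 - coverage)

# Correctly calibrated display: total light is constant along the line.
print([round(line_luminance(c, 2.2), 3) for c in (0.0, 0.25, 0.5)])
# -> [1.0, 1.0, 1.0]
# Mis-calibrated display (gamma 1.8): brightness varies with sub-pixel
# position, showing up as periodic "beads" along the line.
print([round(line_luminance(c, 1.8), 3) for c in (0.0, 0.25, 0.5)])
# -> [1.0, 1.112, 1.134]
```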
Edited by Mark Rejhon - 9/20/13 at 2:19pm
post #588 of 670
Quote:
Originally Posted by Mark Rejhon View Post
 
Here's a fun exercise that gets many people thinking when I show it to them: double or triple these distances when doing video games instead of video...

[snip]

Vision research is tons of fun, eh?

 

I haven't checked out your test yet (and will!), but what I was hoping to do was to supply two images side by side (probably 540 lines high to make the later math simple), one at a simulated nearest-neighbor (NN) half resolution of the other.  The common metric would be PH (easily applied at home) and can easily be applied to my notebook screen or any TV.

 

The idea would not be to look for aliasing, but to see how far back you had to be to not see a discernible difference between the two.  Pretty simple really, but you stole my thunder.
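For what it's worth, the NN half-resolution simulation is only a few lines of code; a sketch on a toy grayscale image (plain nested lists, no image library assumed):

```python
def half_res_nn(img):
    """Simulate nearest-neighbour (NN) half resolution: sample every second
    pixel, then duplicate each one as a 2x2 block so the output keeps the
    same pixel dimensions as the input."""
    h, w = len(img), len(img[0])
    return [[img[(y // 2) * 2][(x // 2) * 2] for x in range(w)]
            for y in range(h)]

# 4x4 gradient test image: after the round trip, 2x2 blocks are uniform.
img = [[x + 4 * y for x in range(4)] for y in range(4)]
print(half_res_nn(img))
# -> [[0, 0, 2, 2], [0, 0, 2, 2], [8, 8, 10, 10], [8, 8, 10, 10]]
```

Run it on the 540-line image, paste the two versions side by side, and you have the comparison pair.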

post #589 of 670
Quote:
Originally Posted by tgm1024 View Post

I haven't checked out your test yet (and will!), but what I was hoping to do was to supply two images side by side (probably 540 lines high to make the later math simple), one at a simulated nearest-neighbor (NN) half resolution of the other.  The common metric would be PH (easily applied at home) and can easily be applied to my notebook screen or any TV.
For photographic tests, it becomes harder to see the difference.
-- If you create a photographic test, make sure you use source material at least twice the resolution of your display (e.g. for 540 lines, use at least 1080p downconverted to 540p, and for the second photograph a re-enlarged 270p downconvert, basically 1080p->270p->540p).
-- That ensures you are already using very sharp source material that maxes out the sharpness at each respective resolution. (For the same reason, 4K downconverted to 1080p often looks sharper than native 1080p video material.)
-- Also, aliasing in video is easier to see than aliasing in photos, because faint aliasing artifacts are easier to detect in motion (e.g. shifting moire artifacts, shifting aliasing artifacts).
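That 1080p->270p->540p chain can be sketched in one dimension, with a box filter for the downscale and pixel doubling for the upscale (both filter choices are simplifying assumptions - a real test should use a high-quality resampler):

```python
def downscale(row, factor):
    """Box-filter downscale: average each group of `factor` samples."""
    return [sum(row[i:i + factor]) / factor
            for i in range(0, len(row), factor)]

def upscale(row, factor):
    """Nearest-neighbour upscale: repeat each sample `factor` times."""
    return [v for v in row for _ in range(factor)]

row_1080 = [float(i % 8) for i in range(1080)]   # stand-in for one column
sharp_540 = downscale(row_1080, 2)               # 1080 -> 540
soft_540 = upscale(downscale(row_1080, 4), 2)    # 1080 -> 270 -> 540
print(len(sharp_540), len(soft_540))             # -> 540 540
```

Both outputs have identical dimensions, so any visible difference between them is purely the lost detail.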
Quote:
The idea would not be to look for aliasing, but to see how far back you had to be to not see a discernible difference between the two.  Pretty simple really, but you stole my thunder.
Oh, but you can still do it --
My test is just a moving white line on a black background. (It's not photographic based.)
The TestUFO one is more of a test case for computer graphics and computer games, not videos/photos.
Edited by Mark Rejhon - 9/20/13 at 4:34pm
post #590 of 670
Quote:
Originally Posted by tgm1024 View Post

I haven't checked out your test yet (and will!), but what I was hoping to do was to supply two images side by side (probably 540 lines high to make the later math simple), one at a simulated nearest-neighbor (NN) half resolution of the other.  The common metric would be PH (easily applied at home) and can easily be applied to my notebook screen or any TV.
The idea would not be to look for aliasing, but to see how far back you had to be to not see a discernible difference between the two.  Pretty simple really, but you stole my thunder.
I'm sure that I posted this test somewhere on here before. You can get back way farther than the Carlton Bale charts suggest, before there stops being a difference between them. And that's a static image - aliasing is far more noticeable in motion, as Mark's test illustrates.
post #591 of 670
Way farther back? Sure, if you have eagle eyes. (I guess "way" has to be defined here.)
post #592 of 670
Thread Starter 
post #593 of 670
Quote:
Originally Posted by irkuck View Post

It's brewing!

 

I'm a little concerned though when they say things like "possibly with light compression".....it's a little too open ended a get-out-of-jail-free card.

 

Also, I'm not sure why they say that "HDMI can do color channel compression".  (?)  How does this matter at all?  HDMI sends bits.  How the data is interpreted is hardly a monumental feat in itself.

post #594 of 670
Quote:
Originally Posted by tgm1024 View Post

I'm a little concerned though when they say things like "possibly with light compression".....it's a little too open ended a get-out-of-jail-free card.

Also, I'm not sure why they say that "HDMI can do color channel compression".  (?)  How does this matter at all?
DisplayPort 1.3 will be capable of 8K at 30 fps, but only using either 8-bit 4:4:4 or 12-bit 4:2:2. DisplayPort supports chroma subsampling, but I have never heard of anything using it, since chroma subsampling is quite apparent on a computer monitor. I think adding 8K support was mainly done for the free publicity, but the higher bandwidth might be useful for 4K computer monitors.
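The arithmetic behind those two format options is simple: both work out to 24 bits per pixel, which is presumably why they are the two choices that fit. A sketch (ignoring blanking/encoding overhead):

```python
def gbps(w, h, fps, bits_per_sample, samples_per_px):
    """Approximate video payload in Gb/s, ignoring blanking intervals
    and link-encoding overhead."""
    return w * h * fps * bits_per_sample * samples_per_px / 1e9

# The two 8K30 formats mentioned above:
print(round(gbps(7680, 4320, 30, 8, 3), 1))   # 8-bit 4:4:4, 3 samples/px
print(round(gbps(7680, 4320, 30, 12, 2), 1))  # 12-bit 4:2:2, 2 samples/px avg
```

Both come out to ~23.9 Gb/s - the same 24 bits/px budget spent either on full chroma at 8 bits or on subsampled chroma at 12 bits.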
post #595 of 670
Thread Starter 
Quote:
Originally Posted by tgm1024 View Post

I'm a little concerned though when they say things like "possibly with light compression".....it's a little too open ended a get-out-of-jail-free card.

Also, I'm not sure why they say that "HDMI can do color channel compression".  (?)  How does this matter at all?  HDMI sends bits.  How the data is interpreted is hardly a monumental feat in itself.

In the case of DP 1.3, they are most likely talking about on-the-fly lossless compression. For HDMI it is more cryptic; it could mean that one can reduce the color bit depth on the fly in the HDMI data stream.
post #596 of 670
Quote:
Originally Posted by irkuck View Post
 
Quote:
Originally Posted by tgm1024 View Post

I'm a little concerned though when they say things like "possibly with light compression".....it's a little too open ended a get-out-of-jail-free card.

Also, I'm not sure why they say that "HDMI can do color channel compression".  (?)  How does this matter at all?  HDMI sends bits.  How the data is interpreted is hardly a monumental feat in itself.

In the case of DP 1.3, they are most likely talking about on-the-fly lossless compression. For HDMI it is more cryptic; it could mean that one can reduce the color bit depth on the fly in the HDMI data stream.

 

Speaking as a software engineer/architect, though, I gotta say that talking about a network connection intrinsically supporting anything regarding compression is ......... just odd.  Where the rubber meets the road here is the bit rate.  Period, end of chapter, end of story.  Having DisplayPort "support mumblemumble subsampling and hoozitz compression" doesn't make much intrinsic sense to me.  That's up to the TV and source, and should have nothing to do with the connection spec.  Perhaps I'm a bit of a purist, but saying that it supports compression schemes is a little like saying that my internet connection supports games.

 

One is a transport spec.  The other is a usage spec.  The former constrains the effectiveness of the latter, but they're entirely different.  In the analog days this concept made sense, because there was all kinds of line-discipline-specific encoding going on in analog.  But now?

post #597 of 670
I assumed the compression comment was just a typical marketing mumbo-jumbo mis-statement, implying that the highest resolutions and framerates might still exceed the planned maximum datarate.
post #598 of 670
Thread Starter 
Quote:
Originally Posted by tgm1024 View Post

Speaking as a software engineer/architect though, I gotta say that talking about a network connection intrinsically supporting anything regarding compression is ......... just odd.  Where the rubber meets the road here is the bit rate.  Period, end of chapter, end of story.  Having display port "support mumblemumble subsampling and hoozitz compression" doesn't make much intrinsic sense to me.  That's up to the TV and source, and should have nothing to do with the connection spec.  Perhaps I'm a bit of a purist, but saying that it supports compression schemes is a little like saying that my internet connection supports games.

One is a transport spec.  The other is a usage spec.  The former mitigates the effectiveness of the latter, but they're entirely different.  In the analog days, this concept made sense, because there were all kinds of line discipline specific encoding going on in analog.  But now?

You are wrong, since you are not a signal processing/video engineer. There is sense and necessity for DP 1.3 to support compression. It is simply the enormous bit rate (~70 Gb/s) needed to transfer 8K that makes it next to impossible to make cheap connectors and cables. But this problem can be solved by putting simple lossless compression codecs on both ends - these are really simple and save bandwidth 2-4x with no loss by exploiting signal redundancy. One thus gets close to the bit rate needed for DP 1.2, and the problem is solved. BTW, DP is really point-to-point cabling; it is a bit much to call it a network connection.
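To put numbers on that (assuming 8K60 at 10-bit RGB and ignoring blanking intervals):

```python
def video_gbps(width, height, fps, bits_per_px):
    """Uncompressed video bit rate in Gb/s (no blanking or line overhead)."""
    return width * height * fps * bits_per_px / 1e9

raw = video_gbps(7680, 4320, 60, 30)   # 8K60, 10-bit RGB = 30 bits/px
print(round(raw, 1))                    # -> 59.7
for ratio in (2, 4):                    # the claimed 2-4x lossless range
    print(round(raw / ratio, 1))        # -> 29.9, then 14.9
```

At 4:1 the ~59.7 Gb/s raw stream drops to ~14.9 Gb/s, under DP 1.2's ~17.28 Gb/s payload rate; the 70 Gb/s figure above presumably includes blanking and transport overhead.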
post #599 of 670
Quote:
Originally Posted by irkuck View Post

You are wrong, since you are not a signal processing/video engineer. There is sense and necessity for DP 1.3 to support compression. It is simply the enormous bit rate (~70 Gb/s) needed to transfer 8K that makes it next to impossible to make cheap connectors and cables. But this problem can be solved by putting simple lossless compression codecs on both ends - these are really simple and save bandwidth 2-4x with no loss by exploiting signal redundancy. One thus gets close to the bit rate needed for DP 1.2, and the problem is solved. BTW, DP is really point-to-point cabling; it is a bit much to call it a network connection.
Instead of having lossless compression codecs on both ends (isn't that going to be too costly/take too much processing power to do in realtime with 8K at high fps?), won't they be more likely to send something like 4:2:0 instead of something higher (like 4:2:2 or 4:4:4) - if it's for 8K TV?
Edited by Joe Bloggs - 12/9/13 at 1:09am
post #600 of 670
Quote:
Originally Posted by irkuck View Post

There is sense and necessity for the DP 1.3 to support compression.
For 8K at 120 fps that would be very expensive, and there is the possibility that the author of the article got confused about chroma subsampling.

Quote:
Originally Posted by irkuck View Post

It is simply the enormous bitrate (70 Gb/s) needed to transfer 8K which makes it next to impossible to make cheap connectors and cables. But this problem can be solved when one puts simple lossless compression codecs on both ends - these are really simple and save bandwidth 2-4 times with no loss using signal redundancy.
While VESA is planning to make a standard for visually lossless compression, they have yet to release one. The only information I can find was a vague press release from CES (that didn't mention resolution, frame rate, bit depth, or how the video would be compressed), and it looks like nothing has been released since then.