
8K by 4K or Octo HD - the real SUHDTV technology - Page 17

post #481 of 670
Thread Starter 
Quote:
Originally Posted by vtms View Post

8K has 16x the resolution of 2K, not 4x.

No. Resolution is measured in a single direction, horizontal being the most commonly used.
Quote:
Originally Posted by vtms View Post

Anyway, from what I read, 25Mbps for 4K HEVC will be enough; so 100Mbps for 8K.

Such simple recalculations are only very rough measures.
25 Mb/s for 4K will be enough to roughly match current 2K broadcast PQ, which calls the point of 4K into question. Remember
that H.264 2K broadcast runs at about 10 Mb/s with real-time compression, while Blu-ray uses more than 25 Mb/s with multipass encoding.
4K should be aiming for Blu-ray PQ to make sense, but that will eat bandwidth.
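
To put rough numbers on the 4x-vs-16x point and on the naive bitrate extrapolation, here is a back-of-the-envelope sketch (my own illustration; the 10 Mb/s baseline is the broadcast figure mentioned above, and scaling bits linearly with pixel count is only a crude upper bound, since real encoders do not scale that way):

```python
# Back-of-the-envelope comparison of pixel counts and naive bitrate scaling.
# Real HEVC/H.264 bitrates do not scale linearly with pixels; this just
# shows where the "x16 pixels" vs "x4 horizontal resolution" figures come from.

formats = {
    "2K/1080p": (1920, 1080),
    "4K/UHD":   (3840, 2160),
    "8K/UHD-2": (7680, 4320),
}

base_w, base_h = formats["2K/1080p"]
base_pixels = base_w * base_h
base_bitrate_mbps = 10  # rough H.264 2K broadcast rate, per the post above

for name, (w, h) in formats.items():
    pixels = w * h
    print(f"{name:9s} {w}x{h}: {pixels // base_pixels:2d}x the pixels, "
          f"{w // base_w}x the horizontal resolution, "
          f"~{base_bitrate_mbps * pixels // base_pixels} Mb/s if bits scaled with pixels")
```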
post #482 of 670
Quote:
Originally Posted by irkuck 
...4K should be aiming for Blu-ray PQ to make sense but that will eat bandwidth
so that ain't gonna happen...
post #483 of 670
Quote:
Originally Posted by irkuck View Post

No. Resolution is measured in a single direction, horizontal being the most commonly used.

No. Resolution is always measured in total numbers of pixels.
post #484 of 670
Quote:
Originally Posted by irkuck View Post

No. Resolution is measured in a single direction, horizontal being the most commonly used.

We've been through this WOT battle before. It's a historical collision of usages.
post #485 of 670
Quote:
Originally Posted by vtms View Post

8K has 16x the resolution of 2K, not 4x. Anyway, from what I read, 25Mbps for 4K HEVC will be enough; so 100Mbps for 8K.

VTMS, it should read "4K" in my post, not 8K....

Clearly a typo, which I'm going to correct.
post #486 of 670
One of the most common fallacies concerning 8K and 4K is that the greater the screen size, the greater the detail visible at a given viewing distance, but experience shows us that is simply not so.
We all know that we do not see more detail watching a DVD on a 120" HT projector screen than we do on a 50" TV screen, given similar brightness, contrast settings etc. In fact it is often the case that less detail is visible as sharpness declines. And the same applies to Blu-ray content when we see 1080P content at the movies, including IMAX movies. Bigger screens add to the sense of immersion; they do not by themselves make more details visible.
What both 8K and 4K can do is enable us to expand that image to a larger screen, like in a movie theatre, without losing definition or detail. We are all familiar with taking a 640x480 video from the net and expanding it to full screen size on our monitors and seeing it start to pixelate and fall to bits before our eyes. Higher definition content helps overcome that, as we already know from watching 1080P content from the net and expanding it to full screen size.
Whilst brightness, contrast, black levels, motion handling, refresh rates, lens sharpness (both for cameras and projectors) all affect how much detail is visible, as does the resolution of the content and viewing distances, bigger screens are just as likely to reduce the level of detail we see, not increase it.
This is especially so for screens above a certain size for content of a given resolution. Just go into any store selling TVs and watch the same content on a 50" screen compared with the 70" or 80" screen nearby. Usually the detail is less, not more.
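
To put some rough numbers on the screen-size trade-off, here is a back-of-the-envelope geometry sketch (my own illustration; the 16:9 aspect ratio and 9-foot viewing distance are just example assumptions). At a fixed distance and fixed source resolution, a bigger screen spreads the same pixels over a wider viewing angle, so the pixels-per-degree figure drops:

```python
import math

# Rough geometry sketch: same source pixels spread over a wider angle on a
# bigger screen means fewer pixels per degree of viewing angle.
# Assumes 16:9 screens and a 9-foot viewing distance (example numbers only).

def pixels_per_degree(diagonal_in, horizontal_pixels, distance_in):
    width_in = diagonal_in * 16 / math.hypot(16, 9)           # 16:9 screen width
    angle_deg = 2 * math.degrees(math.atan(width_in / (2 * distance_in)))
    return horizontal_pixels / angle_deg

distance_in = 9 * 12  # 9 feet, in inches
for diag in (50, 70, 120):
    for label, h_px in (("DVD (720 wide)", 720), ("1080p", 1920), ("4K", 3840)):
        ppd = pixels_per_degree(diag, h_px, distance_in)
        print(f'{diag}" screen, {label:14s}: ~{ppd:5.1f} pixels per degree')
```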
post #487 of 670
Quote:
Originally Posted by 8mile13 View Post


so that ain't gonna happen...

It will almost definitely happen, the question is when.
post #488 of 670
Quote:
Originally Posted by Ken Ross 
It will almost definitely happen, the question is when.
blu-ray pq 'that will eat bandwidth' on cable and satellite?
post #489 of 670
Quote:
Originally Posted by 8mile13 View Post

Quote:
Originally Posted by Ken Ross 
It will almost definitely happen, the question is when.
blu-ray pq 'that will eat bandwidth' on cable and satellite?

Put yourself back in time and look at the broadcast & internet setup just before HD was adopted. Remove the expectations that it's definitely possible, and try to think in terms of all the things in the way back then. Just a guess, but you might see many myopic arguments about things "never happening".
post #490 of 670
Quote:
Originally Posted by 8mile13 View Post

blu-ray pq 'that will eat bandwidth' on cable and satellite?

With newer compression schemes, yes. How long? I have no idea.
post #491 of 670
Quote:
Originally Posted by tgm1024 
Put yourself back in time and look at the broadcast & internet setup just before HD was adopted. Remove the expectations that it's definitely possible, and try to think in terms of all the things in the way back then. Just a guess, but you might see many myopic arguments about things "never happening".
I have satellite HD channels from several countries in Europe on my decent sat receiver. The quality is not good enough for movies, so I am forced to buy Blu-ray movies (I am not happy with that quality either, but it's the best we've got). I also owned a ->decent<- HD cable receiver for one year; the SD-HD quality came across as fake-ish. I sold this receiver and will never watch cable stuff again.

Cable and satellite are only capable of delivering slightly beyond 'fast food' quality because profits have to be made (yep, that's where newer compression schemes come in handy). 'Great HD quality' on cable and satellite has never happened, AFAIK.

There will definitely be 4K cable/satellite TV, but the quality we expect will never happen, IMO. 'Great UHD quality' on cable/satellite will never happen.
post #492 of 670
Quote:
Originally Posted by catonic View Post

One of the most common fallacies concerning 8K and 4K is that the greater the screen size, the greater the detail visible at a given viewing distance, but experience shows us that is simply not so.

Actually it is. It's physics. And human biology at work.

Quote:
Originally Posted by catonic View Post

We all know that we do not see more detail watching a DVD on a 120" HT projector screen than we do on a 50" TV screen, given similar brightness, contrast settings etc.

I disagree. One of the immediate results I saw when I set up a projector in my room, watching several very familiar DVDs, was how much more detail I became aware of in the image. It wasn't just the size; it was the number of details that had previously been too small to notice, so it was like seeing most of these discs "for the first time" again. Yes, much of it I could also see in a smaller image, but a much larger image makes the detail easier to apprehend.

Quote:
Originally Posted by catonic View Post

In fact it is often the case that less detail is visible as sharpness declines. And the same applies to Blu-ray content when we see 1080P content at the movies, including IMAX movies. Bigger screens add to the sense of immersion; they do not by themselves make more details visible.

Yes they certainly do, insofar as they increase the viewing angle and make the image bigger relative to the viewer. For instance, small writing, distant signs etc that are unreadable from an average TV viewing angle become legible at larger sizes - which goes for such smaller details in general. This should hardly be surprising - it's what happens when you effectively get closer to something with details you want to see.

Quote:
Originally Posted by catonic View Post

This is especially so for screens above a certain size for content of a given resolution. Just go into any store selling TVs and watch the same content on a 50" screen compared with the 70" or 80" screen nearby. Usually the detail is less, not more.

Just the opposite: doing comparisons similar to this shows just how important image size is to apprehending detail.

I've viewed the Sony 4K display numerous times. One thing that I've noticed is that from a distance that gives a viewing angle similar to the smaller 1080p screens nearby, the Sony's 4K resolution doesn't stand out as better or more detailed, because you just can't make out the added 4K detail from too far away. It takes standing much closer to the 4K display - i.e. expanding its size relative to viewing angle - to perceive the extra details it conveys over the nearby 1080p screens.
So screen size/viewing distance is indeed quite critical to fully apprehending the new level of resolution of 4K.
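
For anyone who wants a rough number on that, here is a back-of-the-envelope sketch using the common 1-arcminute-per-pixel rule of thumb (my own illustration; it's a crude acuity model that ignores contrast, motion and content quality):

```python
import math

# Rule-of-thumb distance at which individual pixels subtend ~1 arcminute
# (roughly the limit of normal 20/20 acuity). Closer than this, the extra
# resolution is resolvable; farther away, 4K and 1080p start to look alike.
# 16:9 screens assumed.

ONE_ARCMIN_RAD = math.radians(1 / 60)

def max_useful_distance_ft(diagonal_in, horizontal_pixels):
    width_in = diagonal_in * 16 / math.hypot(16, 9)   # 16:9 screen width
    pixel_pitch_in = width_in / horizontal_pixels
    return (pixel_pitch_in / ONE_ARCMIN_RAD) / 12     # small-angle approx., feet

for diag in (55, 65, 84):
    print(f'{diag}": 1080p resolvable within ~{max_useful_distance_ft(diag, 1920):.1f} ft, '
          f'4K within ~{max_useful_distance_ft(diag, 3840):.1f} ft')
```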

But then, this has been stated countless times by tech-writers and consumers who have seen the new 4K displays.
post #493 of 670
Rich, I agree with your first three replies, but only up to a point and it is certainly my fault for not making my post clearer on this issue.
Yes, a bigger screen will enable more detail to be seen if the starting size is "too" small, at a given resolution. But beyond a certain size the still bigger screens will result in less detail as the content becomes too stretched to remain sharp and clear, hence my reference to how a 640x480 video falls apart when expanded "too" much.
One of the many benefits of going to 4k and then 8k is that for even large home theatre projector screen sizes (that very few of us have) such as 160"+ there will be no decline in detail, as would happen if we tried to show 640x480 etc content on such big screens.
On the extreme other hand, there has been mention on this forum of there eventually being 16k content, for commercial movie theatres, presumably IMAX size etc.
One reason for that is because someone has worked out or believes that even 8k content will not be enough (in terms of resolution, detail, sharpness etc) for such large screens.
Of course it remains to be seen if 16k content is ever actually created and projected.

In relation to your second point, my experience is just the opposite. I have a JVC HD550 projector with an average quality anamorphic lens and a 120" anamorphic screen. I also own a 50" Panasonic mid-level plasma, and watching a Blu-ray of Serenity (a movie I am very familiar with), there is more detail visible on the smaller plasma than on the larger projector screen. Again, these issues are influenced by what content we are watching and the relative size of the screens being used, as well as viewing distances, contrast levels etc.

As to your last point, there are at least two issues that I would like to comment on to try and explain things better.
Yes, for 4K content at the screen size you saw, there was little difference with 1080P content, because we have to go very large for 1080P to start to deteriorate. That is why I chose 640x480, because just about every user of the net is familiar with what happens to video at that resolution when we try to play it at full screen size on screens that are "too" big, as all modern desktop computer monitors are.
Secondly, of course, as you say:

"So screen size/viewing distance is indeed quite critical to fully apprehending the new level of resolution of 4K ...." but we don't have to go to a large screen size and close viewing distance to "fully" apprehend 4k to make it (or 8k for that matter) worthwhile. All we have to do is to see that there is a worthwhile improvement, for us as an individual, over 1080P at the screen size we choose at the viewing distance we choose. Of course, this is going to be our choice and no general chart or table of figures based on someone else's experience can really help us here, especially when we have so many different tv technologies (LCD, LED, plasma, perhaps OLED) and projector types (LCD, LCOS, DLP, LED, perhaps laser) available which will affect our decision.

To put it very crudely, there is an important difference here between "fully" and "worthwhile".

I hope this clarifies my original post, at least a bit anyway. smile.gif
Edited by catonic - 2/19/13 at 3:11am
post #494 of 670
DigInfo - Docomo demos H.265

Edited by Randomoneh - 2/21/13 at 1:13am
post #495 of 670
65, you mean.
post #496 of 670
Interesting that their "world first" 60fps 4K video is being demoed at 10Mbps or less.
post #497 of 670
Quote:
Originally Posted by Chronoptimist View Post

Interesting that their "world first" 60fps 4K video is being demoed at 10mbps or less.

And it's that aspect (4K @60fps) that scares me with the current and imminent 4K displays. They can't display 4K at that frame rate and it seems to me they'll be obsolete as soon as the first 4K 60fps material becomes available.

I don't think we even know what would happen if 4K 60p were presented to one of the current 4K displays. Would their HDMI 1.4 connection renegotiate down to 2K @60p or would we simply see an 'out of range' indication on the display?

That's about the only thing keeping me from jumping in to 4K on my next display without any hesitation. I certainly hope for current 4K owners that they'd at least be able to watch the higher frame rate albeit at a lower, 2K resolution. But who really knows?
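
For what it's worth, the raw link-rate arithmetic shows why 4K 60p doesn't fit over HDMI 1.4 at 8-bit RGB (a rough sketch using the standard CTA-861 pixel clocks; lower bit depths or chroma-subsampled modes would need less than this):

```python
# Rough link-rate check for 4K at 30 vs 60 Hz over HDMI, assuming 8-bit RGB
# (4:4:4). Pixel clocks already include blanking; TMDS coding puts 10 bits on
# the wire for every 8 bits of video data.

def tmds_rate_gbps(pixel_clock_mhz, bits_per_pixel=24):
    # 3 TMDS channels, 8b/10b coded, so 24 bits/pixel become 30 on the wire.
    return pixel_clock_mhz * 1e6 * bits_per_pixel * 10 / 8 / 1e9

HDMI_1_4_MAX_GBPS = tmds_rate_gbps(340)   # 340 MHz max TMDS clock -> ~10.2 Gbps
HDMI_2_0_MAX_GBPS = 18.0

for label, pixel_clock_mhz in (("3840x2160 @ 30 Hz", 297.0),
                               ("3840x2160 @ 60 Hz", 594.0)):
    rate = tmds_rate_gbps(pixel_clock_mhz)
    verdict = "fits within" if rate <= HDMI_1_4_MAX_GBPS else "exceeds"
    print(f"{label}: ~{rate:.1f} Gbps on the wire, {verdict} HDMI 1.4's "
          f"~{HDMI_1_4_MAX_GBPS:.1f} Gbps (HDMI 2.0 allows {HDMI_2_0_MAX_GBPS:.0f} Gbps)")
```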
post #498 of 670
Quote:
Originally Posted by Ken Ross View Post

And it's that aspect (4K @60fps) that scares me with the current and imminent 4K displays. They can't display 4K at that frame rate and it seems to me they'll be obsolete as soon as the first 4K 60fps material becomes available.

I don't think we even know what would happen if 4K 60p were presented to one of the current 4K displays. Would their HDMI 1.4 connection renegotiate down to 2K @60p or would we simply see an 'out of range' indication on the display?

That's about the only thing keeping me from jumping in to 4K on my next display without any hesitation. I certainly hope for current 4K owners that they'd at least be able to watch the higher frame rate albeit at a lower, 2K resolution. But who really knows?
Do none of the current displays support 4K at 60Hz yet? Video cards have been equipped with 3GHz HDMI for over a year now.
post #499 of 670
Off topic, I'm sorry, but I have to respond. "CD quality" is hardly an audio goal worth aiming for. Minimum 24 bits. Better yet, 32bit/64khz stereo, or 5.1 channel losslessly compressed, would be 100% transparent. Just my opinion. I can hear the difference between "CD quality" 16bit/44.1khz and a good quality analogue record. I can even hear the limitations imposed by 24bit, compared to the best analogue recordings. A Cat Stevens record sounds much better as an analogue record compared to the equivalent CD, for example. There will always be engineers who will claim that I am not right, and they will cite their mathematical reasons. All I know is that I trust my ears and my eyes.
BTW, I would be happy with a 4k, 10bit color standard for cinema. That would still be better than what we have today.
post #500 of 670
Will we ever see deep color encoded on Blu-ray with Blu-ray enabled displays?
post #501 of 670
Quote:
Originally Posted by Chronoptimist View Post

Do none of the current displays support 4K at 60Hz yet? Video cards have been equipped with 3GHz HDMI for over a year now.

Not a one. But it's not surprising since they're equipped with HDMI 1.4 which isn't capable of 4K @60. You'll have to wait for next gen 4K displays with HDMI 2.0 which will have that capability.
post #502 of 670
Quote:
Originally Posted by Tazishere View Post

Off topic, I'm sorry, but I have to respond. "CD quality" is hardly an audio goal worth aiming for. Minimum 24 bits. Better yet, 32bit/64khz stereo, or 5.1 channel losslessly compressed, would be 100% transparent. Just my opinion. I can hear the difference between "CD quality" 16bit/44.1khz and a good quality analogue record. I can even hear the limitations imposed by 24bit, compared to the best analogue recordings. A Cat Stevens record sounds much better as an analogue record compared to the equivalent CD, for example. There will always be engineers who will claim that I am not right, and they will cite their mathematical reasons. All I know is that I trust my ears and my eyes.
16/44 is already beyond the limits of human hearing. 16-bits gives you about 96dB of dynamic range, and beyond 120dB if you use noise shaping. Most music probably has less than a third of that dynamic range. The only reason to go beyond 16-bit is if you are using a digital volume control to avoid the channel imbalance that analog pots have. That's something you do inside your DAC/audio player though, it doesn't make any difference whether the file is 16/24/32/64-bit. All you're doing is padding the file with zeros.

And records (vinyl) only have a dynamic range of about 12 bits, assuming you have a very high end player.

44.1kHz can reproduce signals up to 22.05kHz perfectly, and the upper threshold of human hearing is 20kHz.
Going beyond 44.1kHz might have made sense back when we were using crappy analog anti-aliasing filters, but we use digital filters now that don't have any problem with a very steep cutoff above 21kHz.
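
For reference, the figures above fall out of the standard textbook formulas (a quick sketch; the roughly-6-dB-per-bit rule is for an ideal quantizer and ignores the dither and noise shaping mentioned above):

```python
import math

# Standard back-of-the-envelope audio formulas:
# an ideal N-bit quantizer spans ~20*log10(2**N) ~= 6.02*N dB between full
# scale and one LSB, and a sample rate fs captures content up to fs/2.

def dynamic_range_db(bits):
    return 20 * math.log10(2 ** bits)    # ~6.02 dB per bit

for bits in (12, 16, 24):
    print(f"{bits:2d}-bit: ~{dynamic_range_db(bits):.0f} dB dynamic range")

for fs_hz in (44_100, 48_000, 96_000):
    print(f"{fs_hz / 1000:g} kHz sampling: content up to {fs_hz / 2000:g} kHz")
```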

If you think 16/44 digital audio sounds bad, the problem is your DAC. Buy a receiver or DAC using one of the latest ESS Sabre DAC chips. They work in 32-bit internally, so you don't have any noise problems if you use a digital volume control (it actually outperforms any analogue control to date, except at very low levels); they completely eliminate jitter, and they avoid the variable background noise problem of typical sigma-delta DAC chips.
Quote:
Originally Posted by Tazishere View Post

BTW, I would be happy with a 4k, 10bit color standard for cinema. That would still be better than what we have today.
I assume you mean as a home format, because theatres are already using more than 10-bit.

It would be nice, but I expect we will be stuck with 8-bits for home formats for a while yet. I would love to be proven wrong though.
Quote:
Originally Posted by Ken Ross View Post

Not a one. But it's not surprising since they're equipped with HDMI 1.4 which isn't capable of 4K @60. You'll have to wait for next gen 4K displays with HDMI 2.0 which will have that capability.
The video cards with 3GHz HDMI are technically only HDMI 1.4 as well, but they support 4K at 60Hz. "HDMI 1.4" doesn't guarantee the display won't support 4K60.
post #503 of 670
Quote:
Originally Posted by Chronoptimist View Post


The video cards with 3GHz HDMI are technically only HDMI 1.4 as well, but they support 4K at 60Hz. "HDMI 1.4" doesn't guarantee the display won't support 4K60.

Everything I've read, and I mean 'everything', has said the current crop will simply not support 4K 60p. I think it's important that we understand the limitations of the current crop of 4K displays and not give the impression they can accept all manner of 4K inputs. In fact, I'm quite sure that all references I've seen to HDMI 1.4 have said we'll need HDMI 2.0 before it will pass 4K 60p. If you've seen something different as it relates to TV displays, I'd love to see it.

This seems to be affirmed by the fact that no current 4K display is claiming it can accept this type of signal. The Sonys, for example, are very specific that they only accept up to 4K @30p.

I'm not saying that this is a deal breaker for everyone, but it sure is important that people realize this limitation before buying.
post #504 of 670
Ken Ross: I don't see how 4K will ever be able to be broadcast. Do you think they could broadcast it without compressing its brains out?

Would COMPRESSED 4K look like what real UNCOMPRESSED 1080p is supposed to look like?

Realistically in the next ten years what's the best we can hope for from broadcast? Maybe 1080p/60? Seems to me like 1080p/72 would be best for movies.

Wouldn't 72 frames per second eliminate flicker for most people?
post #505 of 670
Artwood, we will find out when we get there. Until then, nobody knows.
post #506 of 670
Quote:
Originally Posted by Tazishere View Post

Off topic, I'm sorry, but I have to respond. "CD quality" is hardly an audio goal worth aiming for. Minimum 24 bits. Better yet, 32bit/64khz stereo, or 5.1 channel losslessly compressed, would be 100% transparent. Just my opinion. I can hear the difference between "CD quality" 16bit/44.1khz and a good quality analogue record. I can even hear the limitations imposed by 24bit, compared to the best analogue recordings. A Cat Stevens record sounds much better as an analogue record compared to the equivalent CD, for example. There will always be engineers who will claim that I am not right, and they will cite their mathematical reasons. All I know is that I trust my ears and my eyes.
BTW, I would be happy with a 4k, 10bit color standard for cinema. That would still be better than what we have today.
That is so filled with cart-before-the-horse reasoning, now I have to respond. Actually, no I don't.
post #507 of 670
Sorry you feel that way.
post #508 of 670
If you have a spare 20 minutes, I highly recommend looking over this video if you have an interest in digital audio (and why we don't need more than 16-bit, 44.1kHz).

https://www.xiph.org/video/vid2.shtml
post #509 of 670
Good watching at xiph.org, but he lost me when he said that dither didn't matter above 14 bits. I wish I could find the link that demonstrates sounds at different bit depths, because that is what convinced me that bit depth is very important to the quality of the audio. The demonstration stopped at 16 bits, but with each increase in bit depth up to that upper limit, an audible improvement resulted. Dithering mitigates the lack of bit depth to some extent, but some of us can hear the truncation of the lowest bit. I know I can.
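
For anyone curious about the raw numbers behind quantizing with and without dither, here is a small toy experiment (my own sketch using NumPy and TPDF dither; it only measures the error signal and says nothing about what anyone can or cannot hear):

```python
import numpy as np

# Toy experiment: quantize a quiet 1 kHz tone to various bit depths, with and
# without TPDF dither, and report the RMS error. Dither trades signal-correlated
# quantization distortion for a steady, slightly higher noise floor.

fs = 44_100
t = np.arange(fs) / fs
signal = 0.01 * np.sin(2 * np.pi * 1000 * t)        # a -40 dBFS sine

def quantize(x, bits, dither=False):
    step = 2.0 / (2 ** bits)                         # LSB size for [-1, 1)
    if dither:
        # TPDF dither: sum of two uniform random values, +/-1 LSB peak
        x = x + (np.random.uniform(-0.5, 0.5, x.shape)
                 + np.random.uniform(-0.5, 0.5, x.shape)) * step
    return np.round(x / step) * step

for bits in (8, 12, 16):
    for dithered in (False, True):
        err = quantize(signal, bits, dithered) - signal
        rms_dbfs = 20 * np.log10(np.sqrt(np.mean(err ** 2)))
        print(f"{bits:2d}-bit, dither={dithered!s:5s}: RMS error {rms_dbfs:6.1f} dBFS")
```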
post #510 of 670
Thread Starter 