8K by 4K or Octo HD - the real SUHDTV technology - Page 27 - AVS Forum | Home Theater Discussions And Reviews


post #781 of 806 Old 06-15-2015, 12:02 AM
AVS Special Member
 
wco81's Avatar
 
Join Date: May 2001
Posts: 4,832
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 363 Post(s)
Liked: 128
Sony introduced a couple of cameras this past week with 4K capabilities: a compact with a 1-inch sensor and a full-frame DSLR.

But I believe they top out at 30fps, and I'm not sure they have HDMI 2.0 or support for the HDR that display manufacturers are moving towards, so they may not be much of a source for the 4K HDR TVs which are coming.

Too bad ...
wco81 is online now  
post #782 of 806 Old 06-15-2015, 12:19 AM
AVS Special Member
 
Joe Bloggs's Avatar
 
Join Date: Apr 2007
Location: UK
Posts: 2,868
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 314 Post(s)
Liked: 92
Quote:
Originally Posted by wco81 View Post
Sony introduced a couple of cameras this past week with 4K capabilities: a compact with a 1-inch sensor and a full-frame DSLR.

But I believe they top out at 30fps, and I'm not sure they have HDMI 2.0 or support for the HDR that display manufacturers are moving towards, so they may not be much of a source for the 4K HDR TVs which are coming.

Too bad ...
Though this is the "8K by 4K" thread (7680x4320 - four times the pixels of a "4K" (3840x2160) TV), so those cameras won't be the best for 8K (7.68K) TVs/content. They could still be upscaled, or used to do something similar to what the other person did (i.e. shoot in portrait mode and composite the shots to make a higher-res image - though there could still be issues, e.g. if things are moving).
Joe Bloggs is offline  
post #783 of 806 Old 06-15-2015, 06:38 AM
AVS Addicted Member
 
Ken Ross's Avatar
 
Join Date: Nov 2000
Location: N.Y.
Posts: 27,444
Mentioned: 5 Post(s)
Tagged: 0 Thread(s)
Quoted: 3280 Post(s)
Liked: 3130
Quote:
Originally Posted by wco81 View Post
Sony introduced a couple of cameras this past week with 4K capabilities: a compact with a 1-inch sensor and a full-frame DSLR.

But I believe they top out at 30fps, and I'm not sure they have HDMI 2.0 or support for the HDR that display manufacturers are moving towards, so they may not be much of a source for the 4K HDR TVs which are coming.

Too bad ...
I've already owned several 4K cameras and although they don't do HDR, they are superb sources for 4K and can produce output that rivals the demos you see on these UHD TVs. They are fully compatible with any of the 2015 UHD TVs and yes, they all shoot 30fps (or 24fps if that's your choice).
Ken Ross is offline  
post #784 of 806 Old 06-15-2015, 06:42 AM
AVS Special Member
 
NuSoardGraphite's Avatar
 
Join Date: Jul 2007
Location: Tucson AZ
Posts: 1,627
Mentioned: 0 Post(s)
Tagged: 1 Thread(s)
Quoted: 138 Post(s)
Liked: 158
8K isn't at all necessary for the home. You need a HUGE display to see any benefit from the resolution increase. I think 4K is as high as home cinema needs to go (maybe 8K for projectors, but not for flat panel displays 85" or smaller).

Other technological advances (expanded color, HDR, quantum dots, OLED) will be necessary to enhance PQ on a flat panel from here on out.

Stand tall and shake the heavens...
NuSoardGraphite is offline  
post #785 of 806 Old 06-15-2015, 07:22 AM
AVS Special Member
 
Chronoptimist's Avatar
 
Join Date: Sep 2009
Posts: 2,785
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 1 Post(s)
Liked: 350
Quote:
Originally Posted by NuSoardGraphite View Post
8K isn't at all necessary for the home. You need a HUGE display to see any benefit from the resolution increase.
Depends on your intended use. I desperately want a 44″ 8K panel.
That's 200 pixels per inch, and an ideal "retina display" to use as a PC monitor.

4K means you have to choose between a large workspace or retina-quality text and images. 8K gets you both.
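
For reference, those figures follow directly from the pixel counts and the diagonal; a quick back-of-the-envelope sketch in Python (square pixels assumed, ppi() is just a throwaway helper):

Code:
import math

# PPI = diagonal pixel count / diagonal size in inches (square pixels assumed)
def ppi(h_pixels, v_pixels, diagonal_inches):
    return math.hypot(h_pixels, v_pixels) / diagonal_inches

print(round(ppi(7680, 4320, 44)))   # ~200 PPI for the 44" 8K panel described above
print(round(ppi(3840, 2160, 28)))   # ~157 PPI for a 28" 4K monitor
print(round(ppi(5120, 2880, 27)))   # ~218 PPI for a 27" 5K monitor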
dsinger likes this.
Chronoptimist is offline  
post #786 of 806 Old 06-15-2015, 07:32 AM
AVS Special Member
 
NuSoardGraphite's Avatar
 
Join Date: Jul 2007
Location: Tucson AZ
Posts: 1,627
Mentioned: 0 Post(s)
Tagged: 1 Thread(s)
Quoted: 138 Post(s)
Liked: 158
Quote:
Originally Posted by Chronoptimist View Post
Depends on your intended use. I desperately want a 44″ 8K panel.
That's 200 pixels per inch, and an ideal "retina display" to use as a PC monitor.

4K means you have to choose between a large workspace or retina-quality text and images. 8K gets you both.

I suppose so, though I am hard-pressed to identify individual pixels on a 65" 2160p screen. I can imagine that 4K on a 44" screen would be pretty good.


Why do you need such a tight PPI count?

Stand tall and shake the heavens...
NuSoardGraphite is offline  
post #787 of 806 Old 06-15-2015, 07:35 AM
AVS Addicted Member
 
Ken Ross's Avatar
 
Join Date: Nov 2000
Location: N.Y.
Posts: 27,444
Mentioned: 5 Post(s)
Tagged: 0 Thread(s)
Quoted: 3280 Post(s)
Liked: 3130
Although it's only 28", seeing pixels on my 4K monitor is tough to impossible.

I certainly don't see the advantages of 8K for home displays, but we all know it will come...eventually. They need to sell us something.

I can actually see more benefit in 8K camera recording. That would allow me to crop and pan an 8K image for a final 4K project. That would be very nice. Right now you can do that with a 4K recording, but you wind up with HD.
Ken Ross is offline  
post #788 of 806 Old 06-15-2015, 08:21 AM
AVS Special Member
 
Chronoptimist's Avatar
 
Join Date: Sep 2009
Posts: 2,785
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 1 Post(s)
Liked: 350
Quote:
Originally Posted by Ken Ross View Post
Although it's only 28", seeing pixels on my 4K monitor is tough to impossible.
I have no difficulty seeing pixels on a 160 PPI monitor.
That's closer to a non-retina notebook/iPad (130-140 PPI) than a retina display.

All of Apple's retina displays are >200 PPI:
  • iMac: 218
  • MacBook Pro: 220/227
  • MacBook: 226
  • iPad: 264

A 28″ 4K monitor gives you a non-retina 3840x2160 workspace, or a retina 1920x1080 workspace.
It's really too small to use as a non-retina display (should be 44″ for 100 PPI)
And too big to use as a retina display (should be 22″ for 200 PPI)

I was never a fan of when they tried changing 1080p monitors from being 22″ to something more like 27/28″ as a low-cost alternative to 2560x1440 screens.
100 PPI or thereabouts had been standard for PC displays as far back as I can remember.


On Windows with a 28″ 4K screen you can use 1.5× scaling, but non-integer scaling is far from ideal. At least you have the option though, unlike OS X.
I personally wouldn't buy anything other than ~100 PPI or ~200 PPI to use at 1× or 2× scale.
Non-integer scaling is too much of a compromise.
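
To put rough numbers on the scaling trade-off, the effective workspace is just the native resolution divided by the UI scale factor; a quick sketch (my own arithmetic, nothing vendor-specific):

Code:
# Effective workspace on a 3840x2160 panel at common UI scale factors (a rough sketch)
for scale in (1.0, 1.5, 2.0):
    print(f"{scale}x scaling -> {int(3840 / scale)}x{int(2160 / scale)} effective workspace")
# 1.0x -> 3840x2160 (non-retina), 1.5x -> 2560x1440 (non-integer), 2.0x -> 1920x1080 (retina)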

The new 5K monitors are nice though. 5120x2880 at 27″ is 218 PPI which is almost perfect. Just a little bit smaller than I'd like. (ideally 29″)
That gives you a 2560x1440 workspace when used as a retina display.
The problem is that they are very expensive right now, are non-optimal for displaying videos (though great for editing) and are currently all tiled displays which are limited to 60Hz.
If I'm buying a monitor like that I'd be waiting for DisplayPort 1.3 to have a single non-tiled image, using a single cable to connect it to the PC, and hopefully with a refresh rate above 60Hz.

But what I'd really like is a massive 3840x2160 workspace and retina-quality rendering.
A 5K monitor is nice, but it's the same workspace as the 2560x1440 displays we've had for 5+ years at this point. (well 2560x1600 before some idiot decided 16:9 was better than 16:10 in a monitor)
That requires an 8K panel, and 44″ is the ideal size - which is significantly smaller than the “100″ or larger” displays that many people claim are required for 8K to be worthwhile.

Seiki's 40" monitor is an example of what that would be like, only it is a "non-retina" 110 PPI.

Quote:
Originally Posted by Ken Ross View Post
I certainly don't see the advantages of 8K for home displays, but we all know it will come...eventually. They need to sell us something.
For televisions to sell to the mass market, you could be right.
I'm not convinced yet though. I've mentioned it on the forums before but a few people I know have updated their displays from smaller <55″ 1080p screens to larger 65″+ 4K ones and were not that impressed by the resolution increase.
The increase in size was enough that pixels are still quite visible to them. Not quite as bad as their older 1080p sets, but they didn't disappear either.
Chronoptimist is offline  
post #789 of 806 Old 06-15-2015, 08:33 AM
Member
 
Join Date: Jul 2014
Posts: 131
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 104 Post(s)
Liked: 18
What difference will 8K make? The industry will probably be stupid enough to try to fit it onto multi-layer Blu-ray discs - same as with 4K. Along with that, online streaming services will say they are streaming in high quality when in fact it is too compressed to fully enjoy the resolution and audio. Heck, bring back the 12" LaserDisc with today's modern technology and have it hold 1TB per side!!
STIGUY2014 is offline  
post #790 of 806 Old 06-15-2015, 08:45 AM
AVS Addicted Member
 
Ken Ross's Avatar
 
Join Date: Nov 2000
Location: N.Y.
Posts: 27,444
Mentioned: 5 Post(s)
Tagged: 0 Thread(s)
Quoted: 3280 Post(s)
Liked: 3130
Quote:
Originally Posted by Chronoptimist View Post
I have no difficulty seeing pixels on a 160 PPI monitor.
That's closer to a non-retina notebook/iPad (130-140 PPI) than a retina display.
Well your eyes are better than mine then.
Quote:
Originally Posted by Chronoptimist View Post
All of Apple's retina displays are >200 PPI:
  • iMac: 218
  • MacBook Pro: 220/227
  • MacBook: 226
  • iPad: 264

A 28″ 4K monitor gives you a non-retina 3840x2160 workspace, or a retina 1920x1080 workspace.
It's really too small to use as a non-retina display (should be 44″ for 100 PPI)
And too big to use as a retina display (should be 22″ for 200 PPI)

I was never a fan of when they tried changing 1080p monitors from being 22″ to something more like 27/28″ as a low-cost alternative to 2560x1440 screens.
100 PPI or thereabouts had been standard for PC displays as far back as I can remember.
<snip>
On Windows with a 28″ 4K screen you can use 1.5× scaling, but non-integer scaling is far from ideal. At least you have the option though, unlike OS X.
I personally wouldn't buy anything other than ~100 PPI or ~200 PPI to use at 1× or 2× scale.
Non-integer scaling is too much of a compromise.
I actually find the Windows scaling to be quite good for most apps that adhere to it. I use the monitor for 4K editing and it works well for me. I don't use Apple because they don't support my editing program and I won't buy an Apple computer just to run it in dual-boot mode. More trouble than it's worth. Plus, Apple machines have issues with new codecs: they are not receptive to H265, whereas I have a program that ingests these 4K H265 clips natively.

Quote:
Originally Posted by Chronoptimist View Post
For televisions to sell to the mass market, you could be right.
I'm not convinced yet though. I've mentioned it on the forums before but a few people I know have updated their displays from smaller <55″ 1080p screens to larger 65″+ 4K ones and were not that impressed by the resolution increase.
The increase in size was enough that pixels are still quite visible to them. Not quite as bad as their older 1080p sets, but they didn't disappear either.
Of course their ability to see pixels on that UHD display all hinges on their visual acuity and viewing distance. To me, at the distance I'd be at from a 65" UHD display, it's a total non-issue. Frankly I've seen very very few people complain of this.
Ken Ross is offline  
post #791 of 806 Old 06-15-2015, 09:25 AM
AVS Special Member
 
Chronoptimist's Avatar
 
Join Date: Sep 2009
Posts: 2,785
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 1 Post(s)
Liked: 350
Quote:
Originally Posted by STIGUY2014 View Post
What difference will 8K make? The industry will probably be stupid enough to try to fit it onto multi-layer Blu-ray discs - same as with 4K. Along with that, online streaming services will say they are streaming in high quality when in fact it is too compressed to fully enjoy the resolution and audio. Heck, bring back the 12" LaserDisc with today's modern technology and have it hold 1TB per side!!
You don't need an 8K source to benefit from an 8K display. Same thing applies to 4K TVs.
Chronoptimist is offline  
post #792 of 806 Old 06-15-2015, 09:59 AM
Advanced Member
 
Luke M's Avatar
 
Join Date: Apr 2006
Posts: 561
Mentioned: 1 Post(s)
Tagged: 0 Thread(s)
Quoted: 120 Post(s)
Liked: 41
Quote:
Originally Posted by Chronoptimist View Post
On Windows with a 28″ 4K screen you can use 1.5× scaling, but non-integer scaling is far from ideal. At least you have the option though, unlike OS X.
I personally wouldn't buy anything other than ~100 PPI or ~200 PPI to use at 1× or 2× scale. Non-integer scaling is too much of a compromise.
What needs to be scaled? Icons? Bah, who cares.

To me it's all about whether small text looks smooth or not.
Luke M is offline  
post #793 of 806 Old 06-15-2015, 10:04 AM
AVS Special Member
 
jogiba's Avatar
 
Join Date: Sep 2006
Posts: 2,330
Mentioned: 1 Post(s)
Tagged: 0 Thread(s)
Quoted: 190 Post(s)
Liked: 99
8K 4320p videos are on YouTube today!

Marques Brownlee has been shooting 4K videos for his YouTube channel MKBHD with his 4K RED camera and has the 8K version on order.

jogiba is offline  
post #794 of 806 Old 06-19-2015, 07:33 AM - Thread Starter
AVS Special Member
 
irkuck's Avatar
 
Join Date: Dec 2001
Location: cyberspace
Posts: 3,630
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 121 Post(s)
Liked: 95
Quote:
Originally Posted by Joe Bloggs View Post
Though this is the "8K by 4K" thread (7680x4320 - four times the pixels of a "4K" (3840x2160) TV), so those cameras won't be the best for 8K (7.68K) TVs/content. They could still be upscaled, or used to do something similar to what the other person did (i.e. shoot in portrait mode and composite the shots to make a higher-res image - though there could still be issues, e.g. if things are moving).
The Sony A7R II has a resolution of 7,952 x 4,472 pixels, which exceeds the 8K format, so one could make 8K still photos (or video at 5 frames/sec) by cutting the excess rows and columns. Moreover, the camera does 4K video recording in full-frame mode by taking 8K frames and downscaling them to 4K. This means 8K recording would be a piece of cake for this camera; most likely it is not implemented simply for marketing reasons, since there is no way of watching 8K (and the camera's other technical parameters are bordering on scary anyway). Thus 8K video is technically no problem anymore.
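
For a rough idea of how small that trim is, here is a quick sketch using the numbers from the post (the centered window is just an illustration, not necessarily how the camera would crop):

Code:
# Trimming the 7,952 x 4,472 sensor readout down to a 7680x4320 "8K" frame
src_w, src_h = 7952, 4472
dst_w, dst_h = 7680, 4320
left, top = (src_w - dst_w) // 2, (src_h - dst_h) // 2
print(f"discard {src_w - dst_w} columns and {src_h - dst_h} rows")         # 272 columns, 152 rows
print(f"e.g. a centered crop at x={left}, y={top}, size {dst_w}x{dst_h}")  # x=136, y=76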
irkuck is offline  
post #795 of 806 Old 07-05-2015, 07:46 AM
Member
 
Join Date: Nov 2014
Posts: 50
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 30 Post(s)
Liked: 20
Upcoming NY Yankees game to be shot in 8K.

http://4k.com/news/japanese-broadcas...me-in-8k-7926/
AB4OLED is offline  
post #796 of 806 Old 07-22-2015, 03:28 AM - Thread Starter
AVS Special Member
 
irkuck's Avatar
 
Join Date: Dec 2001
Location: cyberspace
Posts: 3,630
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 121 Post(s)
Liked: 95
irkuck is offline  
post #797 of 806 Old Today, 02:44 AM
KOF
Advanced Member
 
KOF's Avatar
 
Join Date: Apr 2006
Posts: 982
Mentioned: 2 Post(s)
Tagged: 0 Thread(s)
Quoted: 146 Post(s)
Liked: 146
I've always believed the point of going to a higher resolution is to take advantage of a higher bitrate. 2160p demos in Best Buy showrooms look unimpressive to me not because the resolution jump isn't enough, but because the bitrate increase cannot keep up with the increased resolution. I've tried the very same demos Samsung/LG/Sony were using on my 1080p Panasonic plasma, and my plasma did a rather admirable job once they were downscaled. However, when I tried the ChangHong demo that runs in excess of 100 Mbps, that was when I realized my plasma can no longer keep up with 2160p TVs. If I purchased something like the Sony F65 and watched RAW 4:4:4 on any 2160p TV, I would never, ever say I can't distinguish it from a 1080p TV at 10~14 feet of viewing distance.

With that said, that's also precisely the reason current UHDTVs are severely limited. I can't be shooting with the Sony F65 forever. Reality will kick in when I'm greeted with ATSC 3.0's pitiful 27 Mbps HEVC bitrate, which will be downgraded even further when multicast. Netflix UHD is a joke currently, and UHD Blu-ray will be lucky to get 15% of the Blu-ray market, let alone the entire physical media market. NHK's 8K will not be impressive either with 85 Mbps of bitrate, but the more pressing question is: can any American broadcast company get a bitrate anywhere near that in 10 years?
KOF is offline  
post #798 of 806 Old Today, 04:27 AM
AVS Special Member
 
Joe Bloggs's Avatar
 
Join Date: Apr 2007
Location: UK
Posts: 2,868
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 314 Post(s)
Liked: 92
Quote:
Originally Posted by KOF View Post
I've always believed the point of going to a higher resolution is to take advantage of a higher bitrate. 2160p demos in Best Buy showrooms look unimpressive to me not because the resolution jump isn't enough, but because the bitrate increase cannot keep up with the increased resolution. I've tried the very same demos Samsung/LG/Sony were using on my 1080p Panasonic plasma, and my plasma did a rather admirable job once they were downscaled. However, when I tried the ChangHong demo that runs in excess of 100 Mbps, that was when I realized my plasma can no longer keep up with 2160p TVs. If I purchased something like the Sony F65 and watched RAW 4:4:4 on any 2160p TV, I would never, ever say I can't distinguish it from a 1080p TV at 10~14 feet of viewing distance.
Though isn't that demo H264 rather than HEVC/H265? So, depending on the source, HEVC may be up to 50% or so more efficient.
Quote:
With that said, that's also precisely the reason current UHDTVs are severely limited. I can't be shooting with the Sony F65 forever. Reality will kick in when I'm greeted with ATSC 3.0's pitiful 27 Mbps HEVC bitrate, which will be downgraded even further when multicast. Netflix UHD is a joke currently, and UHD Blu-ray will be lucky to get 15% of the Blu-ray market, let alone the entire physical media market. NHK's 8K will not be impressive either with 85 Mbps of bitrate, but the more pressing question is: can any American broadcast company get a bitrate anywhere near that in 10 years?
85 Mbps with HEVC was one of their recent tests. Are you sure their broadcasts will be that? Also (not factoring differences in resolution, etc.), that 85 Mbps, depending on the source, could have needed up to around twice that (so around 170 Mbps - a lot more than the ChangHong demo you mention) if H264 was used - though it probably also depends on the encoder (eg. realtime may not be as good PQ as non-realtime). Also NHK's "8K" (7.68K) broadcasts should be 120 (119.88?) fps, whereas UHD BD will have a max of 60 (59.94) - so that could also improve the PQ of "8K" broadcasts in comparison (though really they'd be better with more than 120)
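
One rough way to compare those figures is bits per pixel per frame. A quick sketch (the 4K/60 pairing for the 27 Mbps ATSC 3.0 figure is an assumption; the 8K/120 pairing is as described above):

Code:
# Rough bits-per-pixel comparison for the bitrates discussed above
def bits_per_pixel(mbps, width, height, fps):
    return mbps * 1e6 / (width * height * fps)

print(f"27 Mbps at 4K/60 (ATSC 3.0-style): {bits_per_pixel(27, 3840, 2160, 60):.3f} bpp")   # ~0.054
print(f"85 Mbps at 8K/120 (NHK test):      {bits_per_pixel(85, 7680, 4320, 120):.3f} bpp")  # ~0.021

Per pixel per frame, the 8K figure works out to well under half the 4K one, which is the squeeze being discussed - HEVC efficiency and encoder quality have to make up the difference.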
Joe Bloggs is offline  
post #799 of 806 Old Today, 05:26 AM
KOF
Advanced Member
 
KOF's Avatar
 
Join Date: Apr 2006
Posts: 982
Mentioned: 2 Post(s)
Tagged: 0 Thread(s)
Quoted: 146 Post(s)
Liked: 146
Quote:
Originally Posted by Joe Bloggs View Post
Though isn't that demo H264 rather than HEVC/H265? So, depending on the source, HEVC may be up to 50% or so more efficient.
Yes, I do have several Samsung demos that are encoded in H265/HEVC with an average 50 Mbps bitrate, and I still prefer the 100 Mbps H264 ChangHong demo and that Happy New Year Countdown Taiwanese clip. Maybe the reason is that HEVC doesn't always bring a 50% bitrate improvement. Also, all of the Samsung HEVC clips are in 10-bit, so that eats up about 25% of the bitrate, while the Taiwanese demos are only encoded in 8-bit. My plasma can obviously only accept 8-bit, so the improved dynamic range and banding reduction of the 10-bit demos will not be felt. (Though Samsung's Dubai demo is impressive on my plasma. It completely kicks HD Blu-rays into oblivion. This means even HD Blu-ray is nowhere near saturating 1080p.)

Quote:
Originally Posted by Joe Bloggs View Post
85 Mbps with HEVC was one of their recent tests. Are you sure their broadcasts will be that? Also (not factoring differences in resolution, etc.), that 85 Mbps, depending on the source, could have needed up to around twice that (so around 170 Mbps - a lot more than the ChangHong demo you mention) if H264 was used - though it probably also depends on the encoder (eg. realtime may not be as good PQ as non-realtime). Also NHK's "8K" (7.68K) broadcasts should be 120 (119.88?) fps, whereas UHD BD will have a max of 60 (59.94) - so that could also improve the PQ of "8K" broadcasts in comparison (though really they'd be better with more than 120)
What I mean is, even NHK's 8K broadcast has a lower bitrate than the '4K' UHD Blu-rays. So, even if NHK's 85 Mbps is much more than the ChangHong demo, remember that the ChangHong demo is designed to be played at 2160p while the NHK one will be designed for 4320p, four times the resolution of 2160p. Of course NHK's 8K broadcast would be spectacular if played on 2160p TVs. I'm also interested in how much bitrate is required when pushing 120 fps. Good call about the tested bitrate versus the actual broadcast bitrate, I was forgetting that.

Last edited by KOF; Today at 05:32 AM.
KOF is offline  
post #800 of 806 Old Today, 09:16 AM
AVS Special Member
 
Joe Bloggs's Avatar
 
Join Date: Apr 2007
Location: UK
Posts: 2,868
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 314 Post(s)
Liked: 92
Quote:
Originally Posted by KOF View Post
Yes, I do have several Samsung demos that are encoded in H265/HEVC with an average 50 Mbps bitrate, and I still prefer the 100 Mbps H264 ChangHong demo and that Happy New Year Countdown Taiwanese clip. Maybe the reason is that HEVC doesn't always bring a 50% bitrate improvement. Also, all of the Samsung HEVC clips are in 10-bit, so that eats up about 25% of the bitrate, while the Taiwanese demos are only encoded in 8-bit. My plasma can obviously only accept 8-bit, so the improved dynamic range and banding reduction of the 10-bit demos will not be felt. (Though Samsung's Dubai demo is impressive on my plasma. It completely kicks HD Blu-rays into oblivion. This means even HD Blu-ray is nowhere near saturating 1080p.)
According to some, encoding in 10-bit may actually be more efficient than encoding in 8-bit, if the original source was 10-bit. If you encode 10-bit source content at 8-bit and you dither (as they probably do on most Blu-ray releases), you lose compression efficiency due to the dither. Encoding 10-bit content as 10-bit without dither should gain efficiency (I don't know whether that includes HDR content). For an 8-bit display they could dither down from 10-bit after decompressing it - and so gain on efficiency.
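
A tiny illustration of why the dither costs bits (my own sketch, not from any of the sources linked here): quantizing a smooth 10-bit ramp straight to 8-bit leaves long flat runs, while dithering turns the bands into LSB noise that the encoder then has to either spend bits on or quantize away.

Code:
import numpy as np

rng = np.random.default_rng(0)
ramp10 = np.linspace(0, 1023, 4096)                     # smooth 10-bit gradient

# plain 10->8 bit rounding: long flat runs (banding, but cheap to encode)
rounded = np.clip(np.round(ramp10 / 4), 0, 255).astype(np.uint8)
# 10->8 bit with random dither: bands become noise (looks better, costs bits)
dithered = np.clip(np.floor(ramp10 / 4 + rng.random(ramp10.size)), 0, 255).astype(np.uint8)

print("level changes, rounded :", np.count_nonzero(np.diff(rounded)))    # ~255
print("level changes, dithered:", np.count_nonzero(np.diff(dithered)))   # far more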

Last edited by Joe Bloggs; Today at 09:20 AM.
Joe Bloggs is offline  
post #801 of 806 Old Today, 09:28 AM
AVS Special Member
 
tgm1024's Avatar
 
Join Date: Dec 2010
Posts: 8,177
Mentioned: 4 Post(s)
Tagged: 0 Thread(s)
Quoted: 1432 Post(s)
Liked: 1400
Quote:
Originally Posted by Joe Bloggs View Post
According to some, encoding in 10-bit may actually be more efficient than encoding in 8-bit, if the original source was 10-bit. If you encode 10-bit source content at 8-bit and you dither (as they probably do on most Blu-ray releases)
Is that true though? Why do you think they convert from 10 bits native to 8 bit+dither on blu-ray releases (in the source)? I've never heard that at all. The only dithering algorithms I've heard of employed have been in the display itself.

What do you call a Harley that doesn't leak oil?
Out of oil.
tgm1024 is online now  
post #802 of 806 Old Today, 09:35 AM
AVS Special Member
 
Joe Bloggs's Avatar
 
Join Date: Apr 2007
Location: UK
Posts: 2,868
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 314 Post(s)
Liked: 92
Quote:
Originally Posted by tgm1024 View Post
Is that true though? Why do you think they convert from 10 bits native to 8 bit+dither on blu-ray releases (in the source)? I've never heard that at all. The only dithering algorithms I've heard of employed have been in the display itself.
I don't know the exact number of Blu-rays that have used dithering, but when they dither, they do it to reduce banding etc. (when converting from a 10-bit source to 8-bit).

This is what Stacey Spears said recently when talking about his dithering algorithm:
Quote:
Originally Posted by sspears
A lot of Blu-rays have been encoded using this algorithm. I provided it to Deluxe, GDMX, Technicolor, and many others back in 2007 while working on the VC-1 encoder for HD DVD and Blu-ray
SpectraCal on HDR

Here's a link which talks about 8 bit vs 10 bit encoding and 10 bit being more efficient - though it doesn't mention dithering. http://x264.nl/x264/10bit_02-ateme-w..._bandwidth.pdf

Last edited by Joe Bloggs; Today at 10:16 AM.
Joe Bloggs is offline  
post #803 of 806 Old Today, 10:34 AM
AVS Special Member
 
tgm1024's Avatar
 
Join Date: Dec 2010
Posts: 8,177
Mentioned: 4 Post(s)
Tagged: 0 Thread(s)
Quoted: 1432 Post(s)
Liked: 1400
Quote:
Originally Posted by Joe Bloggs View Post
This is what Stacey Spears said recently when talking about his dithering algorithm:
Quote:
Originally Posted by sspears
A lot of Blu-rays have been encoded using this algorithm. I provided it to Deluxe, GDMX, Technicolor, and many others back in 2007 while working on the VC-1 encoder for HD DVD and Blu-ray

SpectraCal on HDR
Except he's referring to Floyd-Steinberg error diffusion in that quote. I haven't implemented that algorithm in EONS, but I remember some parts about it. And specifically, he's talking about the R250 algorithm, which is simply a random number generator----sounds like he's using it to defeat a repeat pattern from emerging from FS error accumulation. But I'm having a hard time believing that error diffusion is used in blu-ray. He's not one to doubt though, so I'll take it.

Here's what he fully said:

Spears: "The final image is using noise shaping / error diffusion (Floyd-Steinberg) to go from 10-bit down to ~2-bit. Because noise shaping can result in some fixed pattern, we use the same R250 algorithm to randomize the error diffusion. A lot of Blu-rays have been encoded using this algorithm. I provided it to Deluxe, GDMX, Technicolor, and many others back in 2007 while working on the VC-1 encoder for HD DVD and Blu-ray."
Certainly interesting. And certainly someone to pay attention to.

What do you call a Harley that doesn't leak oil?
Out of oil.
tgm1024 is online now  
post #804 of 806 Old Today, 11:36 AM
AVS Special Member
 
sspears's Avatar
 
Join Date: Feb 1999
Location: Sammamish, WA, USA
Posts: 5,295
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 57 Post(s)
Liked: 96
Quote:
Originally Posted by tgm1024
Except he's referring to Floyd-Steinberg error diffusion in that quote. I haven't implemented that algorithm in EONS, but I remember some parts about it. And specifically, he's talking about the R250 algorithm, which is simply a random number generator----sounds like he's using it to defeat a repeat pattern from emerging from FS error accumulation. But I'm having a hard time believing that error diffusion is used in Blu-ray.
Deluxe has been using it for Blu-ray encoding since 2007/2008. We gave them a DMO that we called xScaler with the algorithm(s) in it along with linear light scaling, and they built an app around it called Parascaler. It's been used on many Blu-ray discs. We gave the DMO to several companies back then. Not sure how many are using it today.

Pixar provided their own dither algorithm to go from 16-bit to 8-bit that is used on their titles.

Yes, the R250 is used as you describe with the FS.
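
For anyone curious what that looks like in practice, here is a bare-bones sketch of Floyd-Steinberg error diffusion from 10-bit to 8-bit with the diffused error jittered to break up fixed patterns. To be clear, this is not the xScaler/Parascaler code - a plain NumPy generator stands in for R250, and the jitter scheme is just an illustration.

Code:
import numpy as np

def fs_dither_10_to_8(img10, rng=None):
    """Floyd-Steinberg error diffusion from 10-bit (0-1023) to 8-bit (0-255),
    with the diffused error randomized slightly to break up fixed patterns.
    Illustration only."""
    if rng is None:
        rng = np.random.default_rng(0)
    h, w = img10.shape
    work = img10.astype(np.float64)
    out = np.zeros((h, w), dtype=np.uint8)
    for y in range(h):
        for x in range(w):
            old = work[y, x]
            new = int(np.clip(np.round(old / 4.0), 0, 255))        # quantize to an 8-bit level
            out[y, x] = new
            err = (old - new * 4.0) * (0.9 + 0.2 * rng.random())   # jittered quantization error
            # classic Floyd-Steinberg weights: 7/16, 3/16, 5/16, 1/16
            if x + 1 < w:
                work[y, x + 1] += err * 7 / 16
            if y + 1 < h:
                if x > 0:
                    work[y + 1, x - 1] += err * 3 / 16
                work[y + 1, x] += err * 5 / 16
                if x + 1 < w:
                    work[y + 1, x + 1] += err * 1 / 16
    return out

# A smooth 10-bit gradient comes out without the banding that plain rounding would show.
gradient = np.tile(np.linspace(0, 1023, 512), (64, 1))
print(fs_dither_10_to_8(gradient).shape)   # (64, 512)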

Last edited by sspears; Today at 11:48 AM.
sspears is online now  
post #805 of 806 Old Today, 11:57 AM
AVS Special Member
 
tgm1024's Avatar
 
Join Date: Dec 2010
Posts: 8,177
Mentioned: 4 Post(s)
Tagged: 0 Thread(s)
Quoted: 1432 Post(s)
Liked: 1400
Quote:
Originally Posted by sspears View Post
Deluxe has been using it for Blu-ray encoding since 2007/2008. We gave them a DMO that we called xScaler with the algorithm(s) in it along with linear light scaling, and they built an app around it called Parascaler. It's been used on many Blu-ray discs. We gave the DMO to several companies back then. Not sure how many are using it today.

Pixar provided their own dither algorithm to go from 16-bit to 8-bit that is used on their titles.

Yes, the R250 is used as you describe with the FS.
Thanks for the reply. Indeed interesting.

What do you call a Harley that doesn't leak oil?
Out of oil.
tgm1024 is online now  
post #806 of 806 Old Today, 07:16 PM
AVS Special Member
 
sspears's Avatar
 
Join Date: Feb 1999
Location: Sammamish, WA, USA
Posts: 5,295
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 57 Post(s)
Liked: 96
Quote:
Originally Posted by tmg1024
Thanks for the reply. Indeed interesting.
Not sure if anyone is interested, but here is some backstory on the how it came to be.

I was unhappy with the quality of the WMV HD titles that had been done. There was some pretty ugly banding in the content and I was not sure if it was the pre-processing (10-bit to 8-bit) or the compression. This was 2003. As a side project, I wanted to fix this. I was working on Media Foundation at the time and I hated it. Bad design, bad leadership, etc... So I re-focused some resources onto the problem and got into a lot of trouble for it. We had just started looking at image scaling, where we were testing linear light vs. perceptual space. We were also looking at better RGB / 4:4:4 to 4:2:0 conversion (than what was implemented in the encoder), and then something better than rounding or truncation. We started off with pure random dither. R250 was chosen at the time because it was fast and did not repeat often. While we were happy with the dither, I wanted to look at error diffusion. It turned out a lot better than we expected, except for the fixed patterns. We were going to try serpentine, but then we thought about randomizing the error first, which worked well and was faster than serpentine as I recall, and so that is what we stuck with. Management thought this was a waste of resources at the time and it had a negative impact on my annual performance review. The CODEC team really liked what I had done, so they offered me a job and I moved over and spent 2005-2007 working on the VC-1 encoder for HD DVD. I convinced my bosses that we should also support Blu-ray, which the HD DVD team HATED, but we did it, which turned out to be a wise decision because when the format war ended, the HD DVD team was cut and we focused our efforts on Blu-ray 100%. Eventually that was cut and I ended up working on smooth streaming for the 1080p instant-on video that was announced for Xbox 360.

The algorithm has not really changed since we first did it back in 2003. It's all done using double precision floating point. We have added more ways to go to / from linear light (different EOTF functions). I use it to process all of the Red footage I shoot. Recently I have been looking at deconvolution, and I posted a sample of it on some Red footage in the other thread. I started shooting test footage so I can work on a new downscaling algorithm optimized for Red footage. What I mean by optimized is that since Red uses such a strong OLPF in front of the sensor, I will tune the moiré reduction for it and not for a still image, which is much sharper by default. The scaling is anti-ringing and a 2D EWA based algorithm. It's on hold while I work on the dcon stuff, which shows a lot of promise. The first project that will use both will either be the HDR montage on the UHD / HDR S&M disc (2017) or the documentary I start shooting next year, which I am shooting at 7680x4320 on the Red Weapon, as soon as I get the sensor upgrade. I plan to finish the doc at 7680x4320, but deliver at 3840x2160. I want to color grade and do VFX at the higher resolution just because I can.

I will say that dither before encoding has always been tricky because you don't want the encoder to quantize out the dither. I assume this is why you thought no one would be doing it for Blu-ray. It does make sense to also dither on playback, because color conversion has rounding errors of its own. We used dither on one version of our SMPTE bars pattern to hide the rounding error. Color conversion of SMPTE 709 color bars results in magenta rounding one direction and cyan the other, which makes it look like there is a tint error (i.e. tint not set correctly). For our default color pattern, we chose values that would round or truncate to the same value for all colors. 2020 has the same rounding problem.

xScaler stood for extreme scaling, and we considered it a cost-no-object scaler, where the cost was time. That was our mantra. It also generated four patents, which was cool at the time. I believe MS donated some of the patents so that SMPTE could use them for something (linear light related).

Last edited by sspears; Today at 07:25 PM.
sspears is online now  