4k by 2k or Quad HD...lots of rumors? thoughts? - Page 25 - AVS Forum
post #721 of 3692 Old 02-03-2012, 01:45 AM
irkuck (AVS Special Member)
Quote:
Originally Posted by Chronoptimist View Post

The D4 is Nikon's high-speed, high-sensitivity "sports" camera, and is a full 1/3 higher in resolution than its predecessor, the 12MP D3. Unlike shooting movies, which is almost never done in available light, DSLRs need to balance resolution, sensitivity and speed (higher resolution means more data to be read, and fewer images that can be buffered).

Their "flagship" camera is still the D3x which has less sensitivity at higher ISO and uses a 24.5MP, 35mm sensor.

Nikon has always lagged behind Canon when it comes to resolution; Canon had a 16.7MP full-frame camera back in 2004, the 1Ds Mark II.

Nikon uses Sony sensors in their cameras. Sony's latest camera/sensor is the NEX-7, with a 24MP APS-C sensor. This has a photosite density equivalent to a 56.4MP full-frame sensor. And when resolution really matters, such as for fashion and product photography, medium format cameras are used, which have much larger sensors, 80+ MP resolution, with no antialiasing filter over the sensor at all, and images are shot under studio lighting.
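For reference, the "equivalent full-frame" figure quoted above is just pixel density scaled by sensor area; a rough sketch of the arithmetic (the sensor dimensions below are approximate illustrative values, not taken from the post):

Code:
# Approximate sensor dimensions in millimetres (illustrative values).
apsc_area = 23.5 * 15.6        # Sony APS-C, NEX-7 class
full_frame_area = 36.0 * 24.0  # 35mm "full frame"

apsc_megapixels = 24.0
equivalent_ff_megapixels = apsc_megapixels * full_frame_area / apsc_area
print(f"~{equivalent_ff_megapixels:.1f} MP full-frame equivalent")  # roughly 56-57 MP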

This only shows how shallow your understanding of the issues is. You focus narrow-mindedly on resolution alone, while I am showing that there are many more aspects to the overall PQ, with the viewing scenario being the primary one. Thus the 16MP Nikon is not lagging but is optimally positioned for its scenario, and there is no aliasing issue with it. In the same way, 80MP medium format is good for fashion since light is plentiful and the targets include foldouts in glossy magazines. By your reasoning, everybody should use the 80 megapixels. That would be unproductive and detrimental once one takes the usage scenarios into account.

Quote:
Originally Posted by Chronoptimist View Post

None of this really has anything to do with my point that shooting for a target of 1080p with a camera that only has a 2K sensor is going to result in drastically lower detail than shooting with at least a 4K camera, and a far greater chance of exhibiting aliasing/moiré. All of these cameras already have drastically more resolution than the 2MP you deem to be sufficient.
10-bit panels have made a huge improvement to gradation on LCDs.
I think I may have to just start ignoring your posts from this point onwards. It's now clear that you don't have any background in digital imaging, and are simply trying to find or manipulate facts that suit your agenda, rather than considering the available information objectively.

Heh, ignorance is evident from the brilliance of your arguments: yes, 4K has more res and more detail than 2K. The problem is that those details are invisible. Equally well, one could advise everybody to attach telescopes and microscopes to their eyes, since there is so much detail around.

What I am saying is that res is useful and needed whenever the usage scenario justifies it. I have used 2K monitors exclusively since the days they cost thousands, I strongly back 4K computer monitors here and hope manufacturers will offer them sooner rather than later, and I am saying the 3K display for the iPad is fully justified. But 4K for TV in the standard TV viewing scenario is just like the "gradation improvement" of 10-bit panels: a pure PR improvement which nobody ever noticed.

irkuck
 
post #722 of 3692 Old 02-03-2012, 05:37 AM
dsinger (AVS Special Member)
Quote:
Originally Posted by specuvestor View Post

When my son says he is getting taller, I don't expect him to hit 10'. We all agree there must be a limit; we don't agree it is 2K. No one has yet produced a report showing that 2K is the acuity limit and that this is why the HD industry chose 2K. That's not the history I know.

The anecdotal evidence is simple:
1) Huge 70-90" TVs are possible and happening.
2) The difference between 4K and 2K is PERCEIVABLE (not as much as VCD to DVD, but still perceivable... that is the proof of the pudding, not some lab reports, because there are, as discussed, many factors in perceived resolution) and can push improvements from panel bit depth to gamut to color depth etc.
3) Blu-ray fans on the Blu-ray forum have been saying a 4K scan is superior to a 2K scan, ON AN EXISTING BLU-RAY FORMAT. I'm assuming most of them are right.

Regarding #3, Joe Kane's DVE HD Basics calibration disc has several takes of the restaurant demonstration material, including a 1080p scan shown at 1080p and a 4K scan shown at 1080p. The woman in the scene ages by 3-5 years IMO due to the increased visibility of fine lines etc. in the 4K scan. She looks in her mid-40s in 4K, about 40 at 1080p, and about 35 on DVD.
post #723 of 3692 Old 02-04-2012, 07:11 AM
irkuck (AVS Special Member)
Quote:
Originally Posted by specuvestor View Post

When my son says he is getting taller, I don't expect him to hit 10'. We all agree there must be a limit; we don't agree it is 2K. No one has yet produced a report showing that 2K is the acuity limit and that this is why the HD industry chose 2K. That's not the history I know.

It is absolutely so: that was done by the ATSC committee based on research at MIT, and it was also preceded by the Japanese Hi-Vision analog HD, which had 1150 lines. 2K was not pulled out of a hat; it came after very detailed studies.

Quote:
Originally Posted by specuvestor View Post

The anecdotal evidence is simple:
1) Huge 70-90" TVs are possible and happening.
2) The difference between 4K and 2K is PERCEIVABLE (not as much as VCD to DVD, but still perceivable... that is the proof of the pudding, not some lab reports, because there are, as discussed, many factors in perceived resolution) and can push improvements from panel bit depth to gamut to color depth etc.
3) Blu-ray fans on the Blu-ray forum have been saying a 4K scan is superior to a 2K scan, ON AN EXISTING BLU-RAY FORMAT. I'm assuming most of them are right.

This does not hold water against what CNET rightly said: 4K is stupid. Take 1): huge is possible. But instead of saying that 4K is needed for it, it would be much more productive to talk about increasing the bit rate of compressed sources, eliminating prefiltering and increasing the frame rate - meaning providing 2K with balls, not vasectomized 2K. Take 2) and 3): nobody has proved this in objective testing. Quite the opposite: research shows 4K is better only at less than 2.5 picture heights, and then only with extremely high-quality content.
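To put the picture-height figures in context, here is a rough sketch of how many pixels per arcminute of visual angle 1080p and 2160p deliver at various viewing distances, using the textbook 1-arcminute acuity figure for 20/20 vision (the distances are just example values):

Code:
import math

def pixels_per_arcminute(vertical_px, distance_in_picture_heights):
    # Visual angle subtended by the full screen height, in arcminutes.
    screen_arcmin = math.degrees(2 * math.atan(0.5 / distance_in_picture_heights)) * 60
    return vertical_px / screen_arcmin

for dist in (1.5, 2.5, 3.0, 5.0):
    p2k = pixels_per_arcminute(1080, dist)
    p4k = pixels_per_arcminute(2160, dist)
    print(f"{dist} picture heights: 1080p = {p2k:.2f}, 2160p = {p4k:.2f} px/arcmin")
# Values above ~1 px/arcmin are finer than the classic 20/20 acuity limit,
# which is roughly where extra resolution stops being visible.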

Quote:
Originally Posted by dsinger View Post

Regarding #3, Joe Kane's DVE HD Basics calibration disc has several takes of the restaurant demonstration material, including a 1080p scan shown at 1080p and a 4K scan shown at 1080p. The woman in the scene ages by 3-5 years IMO due to the increased visibility of fine lines etc. in the 4K scan. She looks in her mid-40s in 4K, about 40 at 1080p, and about 35 on DVD.

Yeah, 4K is rejuvenating - a brilliant marketing argument for females. For guys, it reduces Viagra demand by 2K.

Seriously, one would have to know the precise details of how those scans were made. What you describe indicates that either the 1080p scan was not done at full resolution or the 4K scan was not properly band-limited before downscaling.

irkuck
post #724 of 3692 Old 02-04-2012, 01:11 PM
dsinger (AVS Special Member)
^April 15, 1912: "That's not an iceberg, Captain, it's low-res fog."
post #725 of 3692 Old 02-05-2012, 06:47 AM
irkuck (AVS Special Member)
BTW, when talking about res and PQ one can consult the Digital Cinema standard. It defines 2K and 4K formats. Most important, though, is that digital cinema content is full 12-bit RGB and the compression is intraframe JPEG 2000. In addition, there is a specified 2K profile at 48fps. These guys really knew where the weak points of present digital video are. In the same way, instead of talking about 4K for the home it would be logical to talk first about addressing the very same aspects - e.g. a new Blu-ray at 10-bit RGB, intraframe, 48fps. That would solve any PQ problems and blow away any need for mediocre 4K.

irkuck
post #726 of 3692 Old 02-05-2012, 09:39 AM
Chronoptimist (AVS Special Member)
Quote:
Originally Posted by irkuck View Post

BTW, when talking about res and PQ one can consult the Digital Cinema standard. It defines 2K and 4K formats. Most important, though, is that digital cinema content is full 12-bit RGB and the compression is intraframe JPEG 2000. In addition, there is a specified 2K profile at 48fps. These guys really knew where the weak points of present digital video are. In the same way, instead of talking about 4K for the home it would be logical to talk first about addressing the very same aspects - e.g. a new Blu-ray at 10-bit RGB, intraframe, 48fps. That would solve any PQ problems and blow away any need for mediocre 4K.

Haha. Just enough knowledge to be dangerous!

2K DCI footage is in the region of 200-300GB for a film at 24p, so you're looking at 400-600GB at 48fps. (and probably 3D too?)
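Those figures line up with a quick back-of-envelope check against the DCI maximum image bit rate of 250 Mbit/s (the runtime below is just an assumed 2-hour feature):

Code:
max_image_rate_bits_per_s = 250e6   # DCI ceiling for the JPEG 2000 image stream
runtime_s = 2 * 60 * 60             # an assumed 2-hour feature

size_gb = max_image_rate_bits_per_s / 8 * runtime_s / 1e9
print(f"~{size_gb:.0f} GB of image data at the 250 Mbit/s ceiling")  # ~225 GB
# Doubling the frame count for 48fps at the same per-frame quality roughly
# doubles this, which is where the 400-600GB estimate comes from.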


A big part of this is, as you say, because each frame is stored as its own JPEG 2000 image. This means it almost approaches lossless compression: you can take any still frame from a film and it will be fully detailed, with almost no compression artefacts. Do that in a fast-moving scene on Blu-ray and you will see compression artefacts there.

Except the compression formats used on Blu-ray take motion into consideration. Neither the displays we have nor the human visual system requires each frame to be perfect, especially with a lot of motion.

As for 10 or 12-bit data, there are virtually no consumer displays available today that are even transparent to 8-bit. There can be benefits to sending greater than 8-bit data from the player, and with 10-bit panels, but with the exception of animated content there's likely to be very little, if any, visual benefit to using 10-bit at the source - something which inflates the data 4x (8-bit = 256 levels per channel, 10-bit = 1024).

For animated content, using high bitrates on Blu-ray seems to be sufficient as it is. Increasing the bit-depth allows heavier compression to be used with animated content before macroblocking starts to become visible, but as far as I know this is only done by people who want to rip a disc they own and compress it further to save storage space.

DCI colour also has a much wider gamut than the BT.709 standard for HDTV, and the data is stored in an X'Y'Z' format, which is a big part of why they use 12-bit data.



Assuming that both formats have the same amount of space to work with, there will be far more benefit to 4K Blu-ray at 8-bit than to 1080p Blu-ray with 10-bit colour.

You keep saying that there are image quality problems with Blu-ray that using less compression and moving to 10-bit colour would fix (if anything, compression may need to be heavier, but let's ignore that for now). Can you give some examples of this?


Increasing the bit-depth, increasing the colour gamut (this would be the first thing on my list after 4K), moving to 4:4:4 chroma (or an X'Y'Z' format) and reducing compression are all nice things to have, but you have to be realistic: as far as image quality is concerned, the thing which will have the most benefit right now is moving to 4K (by which I mean 3840x2160 Blu-ray, not the cinema format).
post #727 of 3692 Old 02-05-2012, 11:43 AM
David_B (Advanced Member)
There's zero doubt that 4k is coming.

There's zero doubt that wall size screens are coming.

Saying otherwise is to believe technology is going to stop improving.


post #728 of 3692 Old 02-05-2012, 12:04 PM
irkuck (AVS Special Member)
Quote:
Originally Posted by Chronoptimist View Post

You keep saying that there are image quality problems with Blu-ray that using less compression and moving to 10-bit colour would fix (if anything, compression may need to be heavier, but let's ignore that for now). Can you give some examples of this?

....
Increasing the bit-depth, increasing the colour gamut (this would be the first thing on my list after 4K), moving to 4:4:4 chroma (or an X'Y'Z' format) and reducing compression are all nice things to have, but you have to be realistic: as far as image quality is concerned, the thing which will have the most benefit right now is moving to 4K (by which I mean 3840x2160 Blu-ray, not the cinema format).

This shows precisely the errors in your thinking. IF 4K were an absolute must, as you imply, THEN digital cinema would have made it indispensable priority No. 1. But it is not so - the 4K profile is intended for the biggest screens; for the rest, 2K is enough if the other problems are solved. They focus on the real issues to provide the highest PQ, not on 4K PR.

What I am saying is the same: if one takes Blu-ray and looks to further improve its otherwise decent PQ, the right way is NOT 4K but those other aspects. Going to 4K while carrying over the existing bag of issues with compression, color space and frame rate is patently nonsense.

Hopefully and practically, 4K will not make it, due to consumer indifference. Consumers nowadays do not long for PQ but for good-enough, handy tech. This is why lossless audio CD lost to compressed iPod formats and why edge-lit LCD tech beat out local dimming. In the current state of things we may witness completely brain-damaged offerings in the form of 4K edge-lit LCD sets, but it is unlikely consumers will take them. For you the 4K edge-lit will be fine, but technically it will be a surreal grotesque.

irkuck
post #729 of 3692 Old 02-05-2012, 12:19 PM
8mile13 (AVS Special Member)
Quote:
Originally Posted by David_B View Post


Haha.
post #730 of 3692 Old 02-05-2012, 12:22 PM
Joe Bloggs (AVS Special Member)
Quote:
Originally Posted by Chronoptimist View Post

As for 10 or 12-bit data, there are virtually no consumer displays available today that are even transparent to 8-bit. There can be benefits to sending greater than 8-bit data from the player, and with 10-bit panels, but with the exception of animated content there's likely to be very little, if any, visual benefit to using 10-bit at the source - something which inflates the data 4x (8-bit = 256 levels per channel, 10-bit = 1024).

It doesn't necessarily increase the data for compressed video (e.g. the MPEG formats); for some content it could even take less data to store at 10 bit instead of 8. When converting from 10 or more bits to 8 bit they use dithering, which is hard to compress with MPEG formats (e.g. H.264/AVC). Having an option of 10 bit (and/or 16 bit pcc) on Blu-ray should mean they don't need to dither, so more efficient compression, as well as more accurate colour.

Also, while 10 bit pcc colour would allow 4x as many colours per channel as 8 bit, it wouldn't inflate the data storage requirements 4x even uncompressed. Storing uncompressed 16 bits per channel (trillions of colours) would only double the data storage requirements of uncompressed 8 bits per channel (about 16.7 million colours).
post #731 of 3692 Old 02-05-2012, 12:28 PM
Joe Bloggs (AVS Special Member)
Quote:
Originally Posted by irkuck View Post

BTW, when talking about res and PQ one can consult the Digital Cinema standard. It defines 2K and 4K formats. Most important, though, is that digital cinema content is full 12-bit RGB and the compression is intraframe JPEG 2000. In addition, there is a specified 2K profile at 48fps. These guys really knew where the weak points of present digital video are. In the same way, instead of talking about 4K for the home it would be logical to talk first about addressing the very same aspects - e.g. a new Blu-ray at 10-bit RGB, intraframe, 48fps. That would solve any PQ problems and blow away any need for mediocre 4K.

I don't think they did. It took a lot of persuasion/work to get rates higher than 48 fps added into the DCI standards. And 48 fps is a bit of an incompatible standard (apart from being double 24 fps): it's not very compatible with the TV/video world and doesn't give the most accurate motion quality (there's already 60 fps in the video world and also in 3D), and their standard could originally only do 48 fps in 2D (maybe 48 fps in 2D was only possible because they wanted 3D at 24 fps). Also, their 48 fps only applies to 2K; 4K is also part of the standard, but 4K needs 48 fps (and higher) just as much, if not more.
post #732 of 3692 Old 02-05-2012, 08:02 PM
specuvestor (AVS Special Member)
Quote:
Originally Posted by irkuck View Post

It is absolutely so: that was done by the ATSC committee based on research at MIT, and it was also preceded by the Japanese Hi-Vision analog HD, which had 1150 lines. 2K was not pulled out of a hat; it came after very detailed studies.

This does not hold water against what CNET rightly said: 4K is stupid. Take 1): huge is possible. But instead of saying that 4K is needed for it, it would be much more productive to talk about increasing the bit rate of compressed sources, eliminating prefiltering and increasing the frame rate - meaning providing 2K with balls, not vasectomized 2K. Take 2) and 3): nobody has proved this in objective testing. Quite the opposite: research shows 4K is better only at less than 2.5 picture heights, and then only with extremely high-quality content.

Yeah, 4K is rejuvenating - a brilliant marketing argument for females. For guys, it reduces Viagra demand by 2K.

Seriously, one would have to know the precise details of how those scans were made. What you describe indicates that either the 1080p scan was not done at full resolution or the 4K scan was not properly band-limited before downscaling.

Please quote the research. As discussed way back, AFAIK the Japanese HD specs ranged between 900 and 1150 lines, depending on which Japanese system. So they were confused about acuity limits as well?

So you are saying these people essentially said: screw the existing format, we will choose 1080 16:9 because that's the visual acuity limit, irrespective of existing formats. You seriously belong in academia, and a government-funded one, because no private enterprise would support something so idealistic and impractical.

Or, more plausibly, they compromised on the resolution spec with regard to existing PAL/NTSC SD, aspect ratio and transmission constraints, so that technology could progress rather than leap in a single bound and fall to its death? As for 4K, whoever even suggested that 30 years ago should just have shut up and continued playing MUD games on the Internet, which was pretty much what the Internet was useful for back then, if you weren't around yet.

Skipping the copy and paste reiterations.

I suggest you ask in the Blu-ray forum why a 4K scan is better, since according to you 2K is absolutely sufficient. They are probably more intelligent, less misinformed and more patient than us, and can explain it to you in detail. Studio scanning is well understood by the veterans there.
post #733 of 3692 Old 02-05-2012, 08:04 PM
specuvestor (AVS Special Member)
Quote:
Originally Posted by Joe Bloggs View Post

It doesn't necessarily increase the data for compressed video (e.g. the MPEG formats); for some content it could even take less data to store at 10 bit instead of 8. When converting from 10 or more bits to 8 bit they use dithering, which is hard to compress with MPEG formats (e.g. H.264/AVC). Having an option of 10 bit (and/or 16 bit pcc) on Blu-ray should mean they don't need to dither, so more efficient compression, as well as more accurate colour.

Also, while 10 bit pcc colour would allow 4x as many colours per channel as 8 bit, it wouldn't inflate the data storage requirements 4x even uncompressed. Storing uncompressed 16 bits per channel (trillions of colours) would only double the data storage requirements of uncompressed 8 bits per channel (about 16.7 million colours).

This is an interesting comment. Isn't dithering an algorithmic implementation rather than something stored? Can you provide links?
post #734 of 3692 Old 02-05-2012, 08:26 PM
Joe Bloggs (AVS Special Member)
Quote:
Originally Posted by specuvestor View Post

This is an interesting comment. Isn't dithering an algorithmic implementation rather than something stored? Can you provide links?

Here's a link that shows some examples:
http://www.avsforum.com/avs-vb/showt...7#post13460907

Yes, dithering is an algorithm, but they use dithering when they reduce colour from more than 8 bits per channel to 8 bits per channel, otherwise they can get banding. Since dithering can look 'random' to the MPEG encoder, it is more difficult to compress than a source without dithering - a bit like how adding random noise/grain to video makes it harder to compress.
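A minimal numpy sketch of the effect being described: truncating a shallow 10-bit ramp straight to 8 bits leaves long flat runs (visible bands), while adding half an LSB of random dither before rounding trades the bands for fine pixel-to-pixel noise, which is exactly the kind of signal an MPEG-style encoder finds expensive (the run-length measure below is only a crude compressibility proxy):

Code:
import numpy as np

rng = np.random.default_rng(0)

# A shallow 10-bit ramp (think of a dark sky): values 100..108 across one scanline.
ramp_10bit = np.linspace(100, 108, 1920)

# Straight truncation to 8 bits: long flat runs, i.e. visible bands.
truncated = np.floor(ramp_10bit / 4).astype(np.uint8)

# Dithered conversion: add +/-0.5 LSB of noise before rounding.
dithered = np.round(ramp_10bit / 4 + rng.uniform(-0.5, 0.5, ramp_10bit.shape))
dithered = dithered.clip(0, 255).astype(np.uint8)

def longest_flat_run(line):
    # Longest run of identical neighbouring pixels.
    change_points = np.flatnonzero(np.diff(line) != 0)
    edges = np.concatenate(([-1], change_points, [line.size - 1]))
    return int(np.diff(edges).max())

print("longest flat run, truncated:", longest_flat_run(truncated))  # hundreds of pixels
print("longest flat run, dithered: ", longest_flat_run(dithered))   # a handful of pixels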

This is what Amir said:
Quote:
10 bits would be desirable because without it, conversion to 8 bits involves dither and that makes the encoder's job more difficult. In some sense then, we may be able to encode 10 bits for the same bit budget we have for 8-bit converted from 10 bits!

http://www.avsforum.com/avs-vb/showt...0#post20494450
post #735 of 3692 Old 02-05-2012, 11:47 PM
irkuck (AVS Special Member)
Quote:
Originally Posted by Joe Bloggs View Post

I don't think they did. It took a lot of persuasion/work to get rates higher than 48 fps added into the DCI standards. And 48 fps is a bit of an incompatible standard (apart from being double 24 fps): it's not very compatible with the TV/video world and doesn't give the most accurate motion quality (there's already 60 fps in the video world and also in 3D), and their standard could originally only do 48 fps in 2D (maybe 48 fps in 2D was only possible because they wanted 3D at 24 fps). Also, their 48 fps only applies to 2K; 4K is also part of the standard, but 4K needs 48 fps (and higher) just as much, if not more.

Nice illustration of upside-down logic. These were cinema guys, for whom 24fps is sacrosanct (film is still used and admired by many in this area), as is the distinction from TV/video. Saying they could use another frame rate is like spitting in their faces. The major work they did was 2K at the highest PQ; 4K was added as another format for IMAX-like cinemas. The fact that they have not added 48 fps to 4K means they have not seen any need for it.


Quote:
Originally Posted by specuvestor View Post

please quote research. As discussed way back AFAIK the Japanese HD spec was between 900-1150, depending on which Japanese. So they were confused about acuity limits as well?

Research was done at MIT under the auspices of the Advisory Committee on ATSC. Note that the Japanese had analog and digital systems, all of them around 1000 lines of resolution; it is not so critical whether it is 1080 or 1125. In fact, there is 720p nowadays. 1080 was selected with visual acuity in mind, 720 with progressive scan and motion rendering in mind.

Quote:
Originally Posted by Joe Bloggs View Post

So you are saying these people essentially said: screw the existing format, we will choose 1080 16:9 because that's the visual acuity limit, irrespective of existing formats. You seriously belong in academia, and a government-funded one, because no private enterprise would support something so idealistic and impractical.

It was not like that: there was a drive to develop a next-gen TV system, which started in Japan and later in the US. The Japanese experimented with several formats until they settled on 16:9; the precise number 1080 came from considering it to be 'compatible' with other video numbers like 360.

Quote:
Originally Posted by specuvestor View Post

As for 4K, whoever even suggested that 30 years ago should just have shut up and continued playing MUD games on the Internet, which was pretty much what the Internet was useful for back then, if you weren't around yet.

Indeed 4K was sci-fi then, but the research on resolution was independent of it. They simply used color slides and a projector to establish the required resolution; 2K was also a fantasy then.

Quote:
Originally Posted by specuvestor View Post

I suggest you ask in the Blu-ray forum why a 4K scan is better, since according to you 2K is absolutely sufficient. They are probably more intelligent, less misinformed and more patient than us, and can explain it to you in detail. Studio scanning is well understood by the veterans there.

There is a lot of 4K propaganda around, and you seem to be a victim of it too. I am not denying that a 4K scan is better than a 2K scan. What I am saying is that it looks improbable that a 4K scan downscaled to 2K is significantly better than an original 2K scan, and that this would have visual impact in the TV viewing scenario.

To figure out what is behind the 4K vs. 2K claims people make, one would have to do an extremely detailed analysis of the processing chains. For example, it is typical for 2K systems that there is prefiltering to 1440 resolution, and it may happen in the scan, in the camera or in the digital stage. As professionals say, this reduces PQ very little. But if you compare it to a 4K scan (which may itself be prefiltered to e.g. 3K) and watch from close up, you'll see a difference.

What I am saying here is: before moving to 4K, let's make real, full 2K without compromises. Unfortunately, given the current state of things, manufacturers will be trying to push 4K onto ignorant consumers and we will see edge-lit 42-inchers with proud "Full-4K" badges.

irkuck
post #736 of 3692 Old 02-06-2012, 12:03 AM
Joe Bloggs (AVS Special Member)
I think you've got the quotes mixed up a bit; I didn't write the second thing you quote me as saying.

Quote:
These were cinema guys, for whom 24fps is sacrosanct

And the person who did a lot of work trying to change it (and got an award from the British Society of Cinematographers (BSC) for it) was also a "cinema guy", working for the European Federation of Cinematographers: http://www.imago.org/

Quote:
The fact that they have not added 48 fps to 4K means they have not seen any need for it.

Well, that shows they're not very good at seeing what needs to be done to improve quality, since if 2K needs it, 4K needs it even more, the image being larger/more detailed. I'm sure a big reason was the bandwidth their formats required at the time. James Cameron has said "the 4K/24 image will judder miserably during a panning shot", which shows 4K at 24 fps is insufficient for quality motion.
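The judder point is easy to put in numbers; a rough sketch (the 5-second full-width pan is just an assumed example speed):

Code:
def pan_step_px(width_px, fps, pan_duration_s=5.0):
    # Pixels the image shifts between consecutive frames during a pan
    # that crosses the full frame width in pan_duration_s seconds.
    return width_px / (fps * pan_duration_s)

for width, label in ((1920, "2K"), (3840, "4K")):
    for fps in (24, 48, 60):
        print(f"{label} @ {fps} fps: {pan_step_px(width, fps):5.1f} px jump per frame")
# At 24 fps the 4K image jumps twice as many of its own pixels per frame as 2K,
# which is why higher resolution tends to make low-frame-rate judder more obvious.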
post #737 of 3692 Old 02-06-2012, 01:30 AM
specuvestor (AVS Special Member)
Quote:
Originally Posted by irkuck View Post

Research was done at MIT under the auspices of the Advisory Committee on ATSC. Note that the Japanese had analog and digital systems, all of them around 1000 lines of resolution; it is not so critical whether it is 1080 or 1125. In fact, there is 720p nowadays. 1080 was selected with visual acuity in mind, 720 with progressive scan and motion rendering in mind.

It was not like that: there was a drive to develop a next-gen TV system, which started in Japan and later in the US. The Japanese experimented with several formats until they settled on 16:9; the precise number 1080 came from considering it to be 'compatible' with other video numbers like 360.

All the more reason to be skeptical of your credibility on the subject. The Japanese were pioneers of HD, but they did not set the format for the industry, for a very simple and obvious reason: content.
post #738 of 3692 Old 02-06-2012, 02:11 AM
Nielo TM (AVS Special Member)
Quote:
Originally Posted by specuvestor View Post

This is an interesting comment. Isn't dithering an algorithmic implementation rather than something stored? Can you provide links?

Dithering data isn't stored on BD. No need.

It's done at the hardware level; in the case of BD, either by the source or the display.


post #739 of 3692 Old 02-06-2012, 02:16 AM
Nielo TM (AVS Special Member)
Quote:
Originally Posted by Joe Bloggs View Post

It doesn't necessarily increase the data for compressed video (e.g. the MPEG formats); for some content it could even take less data to store at 10 bit instead of 8. When converting from 10 or more bits to 8 bit they use dithering, which is hard to compress with MPEG formats (e.g. H.264/AVC). Having an option of 10 bit (and/or 16 bit pcc) on Blu-ray should mean they don't need to dither, so more efficient compression, as well as more accurate colour.

That doesn't make any sense.

Increasing color depth increases the bit rate. You can cut the size by compressing the video harder, but that causes image artifacts. It is better to keep the video at the highest bit depth possible; even a drop below 8-bit can lead to visual artifacts.



Quote:
Originally Posted by Joe Bloggs View Post

Also, while 10 bit pcc colour would allow 4x as many colours per channel as 8 bit, it wouldn't inflate the data storage requirements 4x even uncompressed. Storing uncompressed 16 bits per channel (trillions of colours) would only double the data storage requirements of uncompressed 8 bits per channel (about 16.7 million colours).

Not sure I follow.

Could you elaborate?


post #740 of 3692 Old 02-06-2012, 06:57 AM
Joe Bloggs (AVS Special Member)
Quote:
Originally Posted by Nielo TM View Post

That doesn't make any sense.

Increasing color depth increases the bit rate. You can cut the size by compressing the video harder, but that causes image artifacts. It is better to keep the video at the highest bit depth possible; even a drop below 8-bit can lead to visual artifacts.

It doesn't necessarily increase the bitrate if you no longer need to dither when converting content from 10 bit to 8 bit. See what people like Amirm said; Penton-Man has also said on his forum that it wouldn't require dithering. Dithered content is much harder to compress (i.e. it can need a higher bitrate to reach the same visual quality as non-dithered content, because the MPEG encoder has trouble encoding it, especially in motion).

Quote:
Originally Posted by Amirm View Post

10 bits would be desirable because without it, conversion to 8 bits involves dither and that makes the encoder's job more difficult. In some sense then, we may be able to encode 10 bits for the same bit budget we have for 8-bit converted from 10 bits!

Quote:
Originally Posted by Joe Bloggs View Post

Also, while 10 bit pcc colour would allow 4x as many colours per channel as 8 bit, it wouldn't inflate the data storage requirements 4x even uncompressed. Storing uncompressed 16 bits per channel (trillions of colours) would only double the data storage requirements of uncompressed 8 bits per channel (about 16.7 million colours).

Quote:
Not sure I follow.

Could you elaborate?

It was a reply to a post that said 10 bit would increase "the data by 4x" (assuming he meant the data rate/amount of storage used). I'm saying that even going from 8 bit colour per channel to 16 bit colour per channel (uncompressed) would only double it. It's the number of possible values per channel that increases 4x (256 levels per channel to 1024 levels per channel) going from 8 bit pcc to 10 bit pcc, not the amount of data storage required.

Example comparing 8 bit pcc to 16 bit pcc:
* 320x240 tiff, 8 bit colour per channel (16.7 million colours), uncompressed=225 KB
* 320x240 tiff, 16 bit colour per channel (trillions of colours), uncompressed=450 KB (twice the data rate - because twice as many bits per colour channel).
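The arithmetic behind those TIFF figures, as a quick sketch (raw picture data only, headers ignored; note that 10-bit files are often padded to 16 bits per channel in practice):

Code:
def uncompressed_kb(width, height, bits_per_channel, channels=3):
    # Raw RGB picture data size in kilobytes, ignoring any file header.
    return width * height * channels * bits_per_channel / 8 / 1024

for bpc in (8, 10, 16):
    print(f"320x240 RGB at {bpc:>2} bits per channel: {uncompressed_kb(320, 240, bpc):6.1f} KB")
# 8-bit -> 225 KB, 16-bit -> 450 KB: doubling the bits per channel doubles the
# storage, while the number of representable levels per channel grows 256-fold.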
post #741 of 3692 Old 02-06-2012, 07:48 AM
Chronoptimist (AVS Special Member)
Quote:
Originally Posted by Joe Bloggs View Post

It was a reply to a post that said 10 bit would increase "the data by 4x" (assuming he meant the data rate/amount of storage used). I'm saying that even going from 8 bit colour per channel to 16 bit colour per channel (uncompressed) would only double it. It's the number of possible values per channel that increases 4x (256 levels per channel to 1024 levels per channel) going from 8 bit pcc to 10 bit pcc, not the amount of data storage required.

Sorry for not being clearer. I didn't mean to imply that it would require 4x the storage capacity, just that there could be as much as 4x more variation within the picture. I mean, if that were the case, DCI's 12-bit format would be more like 800GB than 200-300GB, and wouldn't look any better than Blu-ray, as 12-bit is 16x more information (4,096 vs 256 shades).

Your point about dithering is interesting, though I'm sure you will still need higher bitrates for 10-bit information, especially as film grain/dither is often added intentionally when grading the image, particularly if it's a composited shot. I think you may end up reducing dithering for technical reasons, but there will still be film grain/dither in the final shot.

Even if bitrates are only inflated 25% or so by using 10-bit, I doubt there will be much, if any, benefit on today's displays, which are not even transparent to 8-bit (i.e. they will show banding with an 8-bit signal even when there is none in the source).
post #742 of 3692 Old 02-06-2012, 08:47 AM
Joe Bloggs (AVS Special Member)
Quote:
Originally Posted by Chronoptimist View Post

Your point about dithering is interesting, though I'm sure you will still need higher bitrates for 10-bit information, especially as film grain/dither is often added intentionally when grading the image, particularly if it's a composited shot. I think you may end up reducing dithering for technical reasons, but there will still be film grain/dither in the final shot.

Maybe 'films' that are CGI animation would be the ones most likely to be able to use more than 8-bit colour without needing higher bitrates when compressed (it might even take less bitrate for better quality), along with other 'real' (no-effects) titles. And with film being replaced by digital, they might use fewer 'film look'/grain effects in future (unless a composited effects shot needs grain to look convincing).
Quote:


I doubt there will be much, if any, benefit on today's displays, which are not even transparent to 8-bit (i.e. they will show banding with an 8-bit signal even when there is none in the source).

Since today's TVs do a lot of picture processing, starting from a better-than-8-bit source should give a better end result - i.e. less precision lost through all the picture processing (though I wonder, if the TV isn't transparent to 8-bit even when no processing is required, whether the TV itself would need to add dither; I think that's sort of what plasmas do/did?). Also, picture processing in the TV (e.g. motion interpolation) might be easier if it isn't working on dithered content.
post #743 of 3692 Old 02-06-2012, 09:50 AM
Nielo TM (AVS Special Member)
Quote:
Originally Posted by Chronoptimist View Post


Even if bitrates are only inflated 25% or so by using 10-bit, I doubt there will be much, if any, benefit on today's displays, which are not even transparent to 8-bit (i.e. they will show banding with an 8-bit signal even when there is none in the source).

Displays with 10-bit processing or higher do display smooth transitions. I've tested a few LCDs that yielded a band-free image.


post #744 of 3692 Old 02-06-2012, 10:29 AM
Chronoptimist (AVS Special Member)
Quote:
Originally Posted by Nielo TM View Post

Displays with 10-bit processing or higher do display smooth transitions. I've tested a few LCDs that yielded a band-free image.

Could you give any examples? My HX900 is very close, but still has some banding in the image that shouldn't be there (and going back to active vs passive 3D, the bit-depth drops considerably when you switch to 3D). I've yet to see anything that can reproduce a gradient as smoothly as a CRT does.

And I personally don't consider a gradient to be enough for testing. Where it's really tough is down below maybe 20% grey, avoiding any kind of discolouration/posterisation in the shadows. Skin tone in low light seems very tough to reproduce accurately.
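A sketch of why the shadows are where problems show first: through a roughly 2.2-gamma display, each 8-bit code step near black is a much larger relative jump in luminance than a step near white (the gamma value and the often-quoted ~1% visibility threshold are textbook approximations, not measurements of any particular set):

Code:
gamma = 2.2  # assumed pure power-law display response

def relative_luminance(code, max_code=255):
    return (code / max_code) ** gamma

for code in (20, 50, 120, 230):
    step = relative_luminance(code + 1) - relative_luminance(code)
    weber_fraction = step / relative_luminance(code)
    print(f"code {code:3d}: {100 * relative_luminance(code):6.2f}% luminance, "
          f"next code is +{100 * weber_fraction:4.1f}% brighter")
# Near black a single 8-bit step is a ~10% relative jump, an order of magnitude
# above the roughly 1% contrast the eye can notice, so shadows band first.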

We're at a point now where I find the best displays tolerable in that regard, but I've yet to find anything that I'd consider transparent to 8-bit, and even if something were, I'd be surprised to see much, if any, benefit from going to 10-bit at the source.
post #745 of 3692 Old 02-06-2012, 10:36 AM
Nielo TM (AVS Special Member)
Sony uses a third-party LSI and panel, so they have very little control.

The following models yielded a band-free image (even during motion):

Toshiba 40RV753 (http://www.hdtvtest.co.uk/news/toshi...0110118924.htm)
LG 47LW550T (http://www.hdtvtest.co.uk/news/lg-47...1107081247.htm)
Sharp LC46LE700E (http://www.hdtvtest.co.uk/news/sharp...0091117160.htm)


post #746 of 3692 Old 02-06-2012, 11:17 AM
Chronoptimist (AVS Special Member)
Quote:
Originally Posted by Nielo TM View Post

Sony uses a third-party LSI and panel, so they have very little control.

Right, but if I'm reading things correctly, the EMMA3TL2 uses 15-bit processing, and the Sharp UV2A panel used (LK460D3LB1S) is 10-bit. (I'm fairly certain all UV2A panels are)

Quote:
Originally Posted by http://am.renesas.com/media/products/soc/assp/av/visual/enc_dec/tv_decode/emma3tl2/E3TL2_PB_V10_20100428.pdf View Post

15-bit signal processing picture quality adjustment controller (Picture Quality Controller) supporting xvYCC
LTI/CTI, 2D aperture, sharpness, APL adaptive black extension/white peak correction, contrast, hue correction,
Liner - , MATRIX conversion, V-T correction, LCD overdrive, FRC, etc.
Dual Link LVDS output supporting Full HD (for main video, 1920H x 1080 V x 60 Hz, 30-bit GBR)

post #747 of 3692 Old 02-06-2012, 11:32 AM
Nielo TM (AVS Special Member)
I wonder if Sony implemented the firmware correctly or limited the performance for unknown reasons. Or maybe the LSI just isn't good enough.

As for the UV2A, I'm not sure if it's a true 10-bit panel. I'm not aware of any true 10-bit consumer panel, as it has a negative impact on pixel response.


post #748 of 3692 Old 02-06-2012, 11:38 AM
Chronoptimist (AVS Special Member)
Quote:
Originally Posted by Nielo TM View Post

I wonder if Sony implemented the firmware correctly or limited the performance for unknown reasons. Or maybe the LSI just isn't good enough.

As for the UV2A, I'm not sure if it's a true 10-bit panel. I'm not aware of any true 10-bit consumer panel, as it has a negative impact on pixel response.

I think most people would look at the set and say that gradients looked smooth, but I personally would not call it transparent to 8-bit. (nor would I say that of any consumer LCD I've seen to date)

Gradations do look smooth at a distance, without the obvious banding you can see on many other screens, but it's just not quite perfect when you get up close. And as I said in my previous post, I don't think a greyscale ramp is a particularly taxing test for gradation; it's the first step in looking for problems.

When you consider the gradation performance of the majority of screens out there, I would say there's only a handful of TVs that actually look pretty good, and none that I'd consider to be transparent to 8-bit. I think moving to 10-bit at the source is the least of our problems there, when displays with 10-bit panels are failing to produce 8-bit transparency.
post #749 of 3692 Old 02-06-2012, 03:32 PM
Nielo TM (AVS Special Member)
I do test them at close distance (I look for every flaw, within reason). I use a CGI-generated ramp along with another pattern (see attached).
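For anyone who wants to reproduce this sort of check, a minimal sketch of generating a full-HD 8-bit grey ramp (a generic pattern, not the specific attachment referred to above; Pillow is assumed for writing the file):

Code:
import numpy as np
from PIL import Image  # assumes Pillow is installed

WIDTH, HEIGHT = 1920, 1080

# Horizontal grey ramp, 0..255 left to right, repeated down the frame.
row = np.linspace(0, 255, WIDTH).round().astype(np.uint8)
ramp = np.tile(row, (HEIGHT, 1))

Image.fromarray(ramp, mode="L").save("grey_ramp_8bit.png")
# On a display that is genuinely transparent to 8-bit, the 256 steps should
# blend smoothly; distinct vertical bands indicate posterisation somewhere
# in the chain (source, processing or panel).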


post #750 of 3692 Old 02-06-2012, 05:14 PM
Chronoptimist (AVS Special Member)
Quote:
Originally Posted by Nielo TM View Post

I do test them at close distance (I look for every flaw, within reason). I use a CGI-generated ramp along with another pattern (see attached).

Thanks. I do see banding on my display, but it appears to be part of the image. There doesn't seem to be any dither used at all (and for what it's worth, Photoshop's dither is fairly useless in this regard), so there appear to be bands roughly 7px wide that are the same colour; with 8-bit you must dither gradients.

Analysis: [image]
For comparison, here's CalMAN's generated gradient (a gradient which should exhibit banding on any display).
Source: [image]
Analysis: [image]
And madVR's 16-bit (internal) dithered gradient, about as smooth as you can hope to get with 8-bit.
Source: [image]
Analysis: [image]
Using Photoshop's spatter tool, which is the best way I know to "dither" a pre-existing gradient, you can almost eliminate any banding from your pattern.
Source: [image]
Edit: [image]
Analysis: [image]
This image is almost smooth on my display.