
4k by 2k or Quad HD...lots of rumors? thoughts? - Page 107

post #3181 of 3670
Quote:
Originally Posted by Chronoptimist View Post

I think you are ignoring or unaware of the fact that most TVs offer a color management system these days.

So when a source asks for "100% red" it displays 100% red in the BT.709 colorspace when the CMS is active.
If the CMS is disabled or set to "wide mode" then 100% red will be as saturated as the display allows for, which makes everything look very unnatural.

With 4K, what's likely to happen is that the source will either tell the display whether it is sending BT.2020 or BT.709 content, or perform the transformation itself so that it only outputs BT.2020 while BT.709 content still looks correct.

And this is a compelling point. And one I've heard before in a number of ways.

This isn't the first time I've gone head-to-head with color guys about what the mapping of a data model (which is what it is) to results is. In fact, of the legion of color guys arguing with me about this very concept, I've only convinced 2 that I was in any way right. One guy even said that when he "got" what I was saying, I made him "very depressed." smile.gif But again, I'm against the storm of wisdom here, so I'll have to lose.

I'll state simply the variants of all counters to similar arguments:
  • An amplifier receiving a lower line level can only amp it so far. A broader line level can be amped higher. True. But amplifiers are already able to be too loud, and the limit is at the amp. And yep: our reds are already "red enough".
  • An amplifier with markings going to 11 also does nothing.
  • I can measure my driveway just as accurately with a 1000-foot ruler as I can with a 2000-foot ruler.
  • In fact, I can measure my driveway with a distorted and warped ruler so long as I know the rules of it first.


Further:
  • Almost no one is going to care about a better color space. Everything can get too everything already and will need calibration.
  • Almost no one here will believe anything I said above is in any way valid in this discussion.


Eh. smile.gif Be well. Let the waterboarding begin....in 5....4....3....2.....
Edited by tgm1024 - 6/11/13 at 9:08am
post #3182 of 3670
Thunderbolt or HDBaseT connector technology doesn't seem to have any disadvantages compared to HDMI. They can transmit enough data to be future-proof. They can also transmit up to 100 watts of power. Truly a single wire from source to monitor. Now that would be something special.

http://forums.audioholics.com/forums/televisions-displays/85613-hdmi-2-0-4k-uhd-2160p-resolutions-new-hdtvs.html

http://www.hdbaset.org/
Edited by Tazishere - 6/11/13 at 9:34am
post #3183 of 3670
Quote:
Originally Posted by tgm1024 View Post

Almost no one is going to care about a better color space. Everything can get too everything already and will need calibration.
I'm not sure that you understand why color management exists though.

To keep these examples simple, let's say that BT.2020 covers 100% of the colors we can see, BT.709 covers only 50% of the colors we can see, and your current HDTV's gamut can cover 80% of the colors that we can see.

And to make things even simpler, let's assume that this is a "standard" HDTV where the color management engine is simple, and you have a "Color Space: Normal" or "Color Space: Wide" option in the advanced picture settings.
"Color Space: Normal" is BT.709, "Color Space: Wide" is unrestricted. (the 80% gamut)


When the TV is in the "Color Space: Normal" mode, when you give it an input of 100% red, it treats that as if it were a 62.5% red signal.
This is because the manufacturer knows that 62.5% of the display's wide gamut (50% ÷ 80%) is the same saturation as 100% in the BT.709 colorspace.
The same applies to every single input it gets when in the "Normal" mode - it's scaled down to 62.5% of its original value so that it is displayed as it is supposed to look, when viewed on this oversaturated display.


If you were to send this TV a BT.709 signal in the "Color Space: Wide" mode, everything you try to show on this display is now 60% more saturated than it should be (80% coverage vs. 50%). So when you are displaying BT.709 content, the fact that the TV has a wider gamut than that is pointless.
It is worse than pointless actually - because you have to scale all inputs to 62.5% of their original value, you have effectively lost 37.5% of the display's gradation capability. This is part of the reason why even a 10-bit LCD may struggle to display gradients as smoothly as a CRT being sent an 8-bit signal.
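
A tiny numeric sketch of that gradation loss, using the toy 50%/80% numbers from this example (an illustration only, not anything from an actual TV's firmware):

Code:
# Toy model: BT.709 covers 50% of visible colors, this panel covers 80%.
# In "Color Space: Normal" the CMS rescales every 8-bit input by 50/80 = 0.625.
SCALE = 0.5 / 0.8  # 0.625

def cms_normal(code_8bit):
    """Map an 8-bit BT.709 input code onto the wide-gamut panel."""
    return round(code_8bit * SCALE)

print(cms_normal(255))  # 159 -- "100% red" lands at ~62.5% panel drive

# Gradation cost: 256 input codes collapse onto only 160 distinct panel codes,
# i.e. ~37.5% of the steps are gone.
print(len({cms_normal(c) for c in range(256)}))  # 160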


So why not create content that is designed to use all of this display's saturation capabilities? After all, that's 60% more color!
Well, this display might have 60% more saturation, but another might only have 15% more, or something else might be 50% more saturated than BT.709.

So you can't tailor the content to the display, you need to tailor the content to a standard - and that standard is BT.2020.
As long as content is mastered to the BT.2020 standard, you will be able to use the full range of saturation that your display is capable of, up to 100% of the BT.2020 standard, and the display CMS will handle it so that color will still look correct.

But this requires displays - or a separate video processor in between your player and the display - that know what BT.2020 is, and can handle the gamut mapping properly.
Simply sending a BT.2020 input to a display which has a Normal/Wide option will not look correct with BT.2020 content, even in the Wide mode.
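
A sketch of the gamut-mapping logic being described here, again with the toy coverage numbers from this example; a real CMS works per-primary in a proper color model rather than with a single scale factor:

Code:
# Toy coverages (fraction of visible colors) from the example above.
SOURCE_COVERAGE = {"BT.709": 0.50, "BT.2020": 1.00}
DISPLAY_COVERAGE = 0.80  # this hypothetical wide-gamut panel

def map_to_panel(signal, source_space):
    """Scale a 0..1 signal so its absolute saturation is reproduced correctly,
    clipping whatever the panel physically cannot show."""
    absolute = signal * SOURCE_COVERAGE[source_space]  # saturation vs. visible gamut
    return min(absolute / DISPLAY_COVERAGE, 1.0)       # panel drive level

print(map_to_panel(1.0, "BT.709"))   # 0.625 -> BT.709 red still looks like BT.709 red
print(map_to_panel(0.5, "BT.2020"))  # 0.625 -> same color, just a different container
print(map_to_panel(1.0, "BT.2020"))  # 1.0   -> uses the panel's full 80%, the rest clips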


An exception to this is if you are using the display as a PC monitor.
Digital cameras can shoot images in the Adobe RGB colorspace, which is wider than BT.709, or even capture colors beyond that if you are shooting raw images. (the "ProPhoto" or "ROMM RGB" colorspace is often used there)
If you use a hardware & software package that can create an ICC profile for the display, color-managed software such as Photoshop is capable of using the full range of saturation that your display offers while still displaying accurate color. The issue is that when you are viewing applications that are not color managed, such as most web browsers (Firefox being the exception), everything in them will appear oversaturated because they are designed for the sRGB colorspace (which uses the same gamut as BT.709).
post #3184 of 3670
Quote:
Originally Posted by tgm1024 View Post

I do mostly understand this point, but I suspect we're getting a little too close to the trees to discern what really matters.
Larger color spaces are used for digital movies and professional photography. Here is a link to an article on Super Hi-Vision which has a chart comparing various color spaces. Also here is a link to a document about the development of Rec. 2020 and section 3.2.6 is about the color space.

Quote:
Originally Posted by tgm1024 View Post

I'll state simply the variants of all counters to similar arguments:
  • An amplifier receiving a lower line level can only amp it so far. A broader line level can be amped higher. True. But amplifiers are already able to be too loud, and the limit is at the amp. And yep: our reds are already "red enough".
  • An amplifier with markings going to 11 also does nothing.
  • I can measure my driveway just as accurately with a 1000-foot ruler as I can with a 2000-foot ruler.
  • In fact, I can measure my driveway with a distorted and warped ruler so long as I know the rules of it first.
Reds aren't red enough with the Rec. 709 color space. Color scientists have mapped the range of colors the human eye can see, and the Rec. 709 color space covers only 35.9% of it.

Also there is no way to accurately predict color outside of a color space. That would be like if a stranger measured the height of all the trees in a forest, sent you a note listing the height of each tree up to 15' (but no higher since that was as far as the measuring tape could reach), the note mentions that the trees were up to 30' high, and that based only on the information given in the note you are told to accurately predict the height of all the trees in that forest. Could you do that? No, and neither could a computer since there isn't enough information. If the note said that a tree was 15' high you would have no idea if the tree was 15' high, 17' high, or 30' high.
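
The same point in code form: once values are clipped at the limit of the measuring range (or gamut), the information is gone and nothing downstream can recover it. (A toy sketch of the analogy only, not a real video pipeline.)

Code:
TAPE_LIMIT = 15  # feet -- the longest height the note (or the narrow gamut) can record

def record(height_ft):
    return min(height_ft, TAPE_LIMIT)  # clip at the limit

forest = [12, 15, 17, 22, 30]
notes = [record(h) for h in forest]
print(notes)  # [12, 15, 15, 15, 15] -- 15, 17, 22 and 30 are now indistinguishable
# No algorithm can reconstruct 17, 22 or 30 from these notes; likewise, colors
# outside the encoded gamut cannot be "predicted" back from clipped values.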
post #3185 of 3670
post #3186 of 3670
Quote:

Yeah, 4K may fare better after huge advertising campaigns under the heading: 4K is better than 3D since no glasses are needed biggrin.gif.
post #3187 of 3670
Quote:
Originally Posted by Chronoptimist View Post

Quote:
Originally Posted by tgm1024 View Post

Almost no one is going to care about a better color space. Everything can get too everything already and will need calibration.
I'm not sure that you understand why color management exists though.
 

 

You're correct about the enlarged color spaces, I'm sorry.  I'm confusing not only two arguments, but two categories of arguments.  Models vs. spaces.

 

The position I was taking is akin to "if you stretch the red, and the faces are too red as a result, then the faces were sent too red to begin with", which is a red herring to this issue.

post #3188 of 3670
post #3189 of 3670
Quote:
Originally Posted by irkuck 
Testing 4K streaming to STB
NTT West (Nippon Telegraph and Telephone) is a former government-owned corporation. The Japanese government still owns roughly 1/3 of NTT's shares. So there are the NHK (100% government-owned) 4K tests, and now NTT West (33% government-owned) is doing some 4K streaming tests. No 4K tests from private companies in Japan yet smile.gif



https://en.wikipedia.org/wiki/Nippon_Telegraph_and_Telephone
post #3190 of 3670
^Indeed, 4K/8K in Japan is being developed by government organizations. This has very positive aspects in avoiding the chaos and cart-before-the-horse riding of the market-driven development elsewhere, where there are displays but no decent connectors, no content format, no distribution, no production equipment and no content. That makes consumers suspicious about the whole idea.
post #3191 of 3670
If they change the color space to BT.2020, will the Alabama Crimson Tide look better than they do on the current BT.709?

I think they look great no matter what the colorspace is.
post #3192 of 3670
Just for the record: RedRay, the world's first 4K media player, is shipping. cool.gif
post #3193 of 3670
Quote:
Originally Posted by coolscan View Post

Just for the record: RedRay, the world's first 4K media player, is shipping. cool.gif

 

Are they able to handle Rec. 2020 if content shows up with it?

post #3194 of 3670
Quote:
Originally Posted by tgm1024 View Post

Are they able to handle Rec. 2020 if content shows up with it?
The player is, but it will always be hampered by whatever HDMI can do.
This is based on Red staff's claim, when this was discussed a year ago, that the player integrated into the eventual future RedRay laser projector doesn't have such limits.
I would guess that by the time Rec. 2020 or another cinema standard is in regular use, we will be on one of the next generations of the RedRay player.
post #3195 of 3670
post #3196 of 3670

 

When I saw the first theoretical upper limits of the new CF cards, the first comment I saw was "WHOA!  That's like 700 years of porn!"

 

LOL.....

post #3197 of 3670
post #3198 of 3670
Currently, the obsession is for ever higher pixel counts, an approach that disregards how we actually see moving images. If broadcasters have their way, we could be on course for some ridiculous format decisions.

Yes guys, let's stop cheating ourselves with brilliance of highly compressed 4K@30Hz mad.gif.
Edited by irkuck - 6/26/13 at 12:14am
post #3199 of 3670
Quote:
Originally Posted by irkuck View Post

Currently, the obsession is for ever higher pixel counts, an approach that disregards how we actually see moving images. If broadcasters have their way, we could be on course for some ridiculous format decisions.

Yes guys, let's stop cheating ourselves with brilliance of highly compressed 4K@30Hz mad.gif.

 

I'll take native 1080p120 content over 2160p30 any day, and they have equivalent data demands.
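
A quick sanity check on the "equivalent data demands" claim, counting raw pixels per second and ignoring compression:

Code:
def pixel_rate(width, height, fps):
    """Uncompressed pixel throughput in pixels per second."""
    return width * height * fps

print(pixel_rate(1920, 1080, 120))  # 248832000 (~249 million pixels/sec)
print(pixel_rate(3840, 2160, 30))   # 248832000 -- identical raw pixel rate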

 

This is so ungodly depressing.

 

Increasing native frame rate is by far the most important thing we could accomplish in movie/video, and sometimes I just don't think we ever will.  We've discussed this ad infinitum, and everyone knows the problem, and it feels to me like we're locked in the trunk of Thelma & Louise's high resolution car as those two high resolution idiots decide to hold hands and drive off the @#$%ing high resolution cliff as if it were somehow inspiring.

 

In a jittery way of course.

 

From the content side of things, thank goodness for folks like Cameron who are actually in the position of saying "x FPS or else no movie" and having people listen.  BUT it seems that any word he breathes about that gets relegated to mere on-and-off rumor.  So now I'd love to know what his current stance on releasing Avatar 2 really is:

 

What's he saying now about Avatar 2's possible release formats, does anyone know?

  • 60 FPS only
  • 60 and 48 only
  • 60 and 48 and 24
  • 48 only
  • 48 and 24
  • Or....24 all by itself is okidoki?

 

.......and any hope of the first 5 options affecting video formats?


Edited by tgm1024 - 6/26/13 at 2:37pm
post #3200 of 3670
48fps doesn't get much traction in the movie industry, as technical excellence and development isn't something movie makers or the studios are much interested in anymore, like they were when they developed CinemaScope and 70mm film to compete with TV back in the day. Same reason 4K material is few and far between in features.

Only Peter Jackson has shot movies in HFR, which pressed cinema owners into a $10,000 server upgrade.
I doubt they will agree to another such upgrade for projecting in 60fps if no one produces HFR movies except Jackson and Cameron.
But much can happen between now and 2016, when the second Avatar movie is supposedly going to be released.
Before that, Cameron has to stop di(ck)ving around and actually start shooting the movies.
post #3201 of 3670
Quote:
Originally Posted by coolscan View Post

48fps doesn't get much traction in the movie industry, as technical excellence and development isn't something movie makers or the studios are much interested in anymore, like they were when they developed CinemaScope and 70mm film to compete with TV back in the day. Same reason 4K material is few and far between in features.

Only Peter Jackson has shot movies in HFR, which pressed cinema owners into a $10,000 server upgrade.

 

Did that projector upgrade bring them to 48 FPS only, or are many of them able to go higher?

post #3202 of 3670
Quote:
Originally Posted by tgm1024 View Post

Did that projector upgrade bring them to 48 FPS only, or are many of them able to go higher?
I'm not sure. The information hasn't exactly been clear; it has often been mixed in with 4K capability.

This is the Christie IMB; other IMBs might have other specs.
For 3D it should really be 4K HFR for each eye to really make an impact, but do they dare go for such high data rates in post-production?
Quote:
These are the frame rates supported by the Christie IMB:
  • Without the HFR upgrade ‐ 2K cinema content up to 60 frames per second in 2D, or up to 30 frames per second per eye in 3D
  • With HFR upgrade ‐ 3D support up to 60 frames per second per eye for 2K content
  • With the 4K upgrade ‐ 4K 2D up to 30 frames per second
post #3203 of 3670
It really seems like a stupid decision to move to 48fps. It's going to require new hardware anyway (at least for film production) so why not move to 50fps or 60fps - both of which are current standards that many televisions already support. (though in NTSC regions, 50Hz support is lacking)

Or if you still want to separate film from other mediums, go beyond 60fps and move to 120fps or higher.
I seem to recall that a move to ~600fps is required for things to actually be perceived as looking real rather than filmed.
post #3204 of 3670
Quote:
Originally Posted by Chronoptimist View Post

I seem to recall that a move to ~600fps is required for things to actually be perceived as looking real rather than filmed.
Depends on motion speed.
I can still see motion blur at 1/700sec strobes (LightBoost=10%) -- 700Hz equivalence.

For motion going 2000 pixels per second, there is 2 pixels of sample-and-hold motion blur on a theoretical flicker-free 1000fps@1000Hz display. If you're sitting at close computer monitor distances doing fast-flick 180 degree turns in first-person shooter video games (so the whole screen pans past you at 2000 pixels per second), and running an ultra GPU that can do 1000fps (VSYNC ON for perfect fps=Hz motion, so that any micro motion blur becomes noticeable), then the motion blur on a 1000fps@1000Hz can even become barely noticeable to the human eye -- 2 pixels of motion blur -- meaning even 1000fps@1000Hz (real time) isn't even yet the ultimate frontier. Alas! Yes, tiny, tiny. Yes, 600fps@600Hz probably is "good enough" to be the Ultimate Frontier since even most sports motion isn't moving at 2000 pixels per second while you're tracking your eyes across it at very close (computer monitor) distances.

There are certainly major points of diminishing returns beyond a certain point.
Also, as always, you need framerate=Hz for the point of minimized motion blur.
And this assumes no source-based motion blur (e.g. video game graphics, or video shot with a high-speed shutter on each frame).

For non-flicker displays (sample-and-hold, when pixel transition speed is a non-issue):
60fps@60Hz = baseline (16 pixels of motion blur at 960 pixels/sec)
120fps@120Hz = 50% less motion blur (8 pixels of motion blur at 960 pixels/sec)
240fps@240Hz = 75% less motion blur (4 pixels of motion blur at 960 pixels/sec)
480fps@480Hz = 87.5% less motion blur (2 pixels of motion blur at 960 pixels/sec)
960fps@960Hz = 93.75% less motion blur (1 pixel of motion blur at 960 pixels/sec)

For flicker displays (any refresh rate, as long as framerate=Hz above flicker fusion threshold):
1/60th sec strobes = baseline (16 pixels of motion blur at 960 pixels/sec)
1/120th sec strobes = 50% less motion blur (8 pixels of motion blur at 960 pixels/sec)
1/240th sec strobes = 75% less motion blur (4 pixels of motion blur at 960 pixels/sec)
1/480th sec strobes = 87.5% less motion blur (2 pixels of motion blur at 960 pixels/sec)
1/960th sec strobes = 93.75% less motion blur (1 pixel of motion blur at 960 pixels/sec)

Strobes = on-to-off cycle (LightBoost flashes, brightest cluster of plasma subfield flickers, CRT phosphor illuminate-and-decay, etc). The cleaner the on-to-off cycle, the more accurate the motion blur math becomes. The jump from 60Hz->120Hz is dramatic, but each step beyond is fairly incremental. That said, 120Hz->960Hz removes almost as much motion blur trail length as the step from 60Hz->120Hz does. This all assumes very clean strobes (e.g. LightBoost or CRT phosphor). Mathematically it all fudges around when you start factoring in pixel transitions (LCD), phosphor decay asymmetries (plasma / CRT), repeat refreshes (e.g. plasma subfields), etc. Most displays aren't as efficient as their "attempted Hz equivalence" ratings.

That said, certain displays are highly efficient at reducing or eliminating motion blur where the strobe length very, very, very mathematically accurately translates to motion blur trail length (e.g. Sony Trimaster OLED flicker mode, Sony interpolation-free "Motionflow Impulse" mode, LightBoost strobe backlight displays, and other clean-flicker displays, and even CRT's accurately translate, assuming you measure to roughly the 90% phosphor decay cutoff point)
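
For reference, all of the figures above follow from one relationship: blur trail (pixels) ≈ eye-tracking speed (pixels/sec) × persistence (sec), where persistence is 1/refresh for an ideal sample-and-hold display or the strobe length for an impulse display. A minimal sketch restating that math (assuming clean strobes and framerate=Hz, as above):

Code:
def blur_px(speed_px_per_sec, persistence_sec):
    """Motion blur trail length = tracking speed * persistence."""
    return speed_px_per_sec * persistence_sec

# Sample-and-hold: persistence = 1/Hz (reproduces the table above)
for hz in (60, 120, 240, 480, 960):
    print(hz, blur_px(960, 1.0 / hz))  # 16, 8, 4, 2, 1 pixels

# Strobed: persistence = strobe length
print(blur_px(1920, 1 / 400))   # 4.8 px at a 1/400s strobe
print(blur_px(1920, 1 / 700))   # ~2.7 px at a 1/700s strobe (LightBoost=10%)
print(blur_px(2000, 1 / 1000))  # 2 px even on a flicker-free 1000fps@1000Hz display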
Edited by Mark Rejhon - 6/26/13 at 9:14pm
post #3205 of 3670
Quote:
Originally Posted by Mark Rejhon View Post

Quote:
Originally Posted by Chronoptimist View Post

I seem to recall that a move to ~600fps is required for things to actually be perceived as looking real rather than filmed.
Depends on motion speed.
I can still see motion blur at 1/700sec strobes (LightBoost=10%) -- 700Hz equivalence.

For motion going 2000 pixels per second, there is 2 pixels of sample-and-hold motion blur on a theoretical flicker-free 1000fps@1000Hz display. If you're sitting at close computer monitor distances doing fast-flick 180 degree turns in first-person shooter video games (so the whole screen pans past you at 2000 pixels per second), and running an ultra GPU that can do 1000fps (VSYNC ON for perfect fps=Hz motion, so that any micro motion blur becomes noticeable), then the motion blur on a 1000fps@1000Hz can even become barely noticeable to the human eye -- 2 pixels of motion blur -- meaning even 1000fps@1000Hz (real time) isn't even yet the ultimate frontier. Alas! Yes, tiny, tiny. Yes, 600fps@600Hz probably is "good enough" to be the Ultimate Frontier since even most sports motion isn't moving at 2000 pixels per second while you're tracking your eyes across it at very close (computer monitor) distances.

 

In screen space, a traversal of 2000 pixels in a second is roughly the width of your screen.  Turn your TV off, hold your hand in front of you and move it left to right across the angular width of the screen in precisely a second.  Even with strict following of your eyes, I'm not convinced you're tracking to within the angular size that 2 pixels would subtend if your hand were rendered back on your TV.  That's pretty freaking small, as you alluded to.

 

We're not given the luxury of a lightboost'd world either.  So at 2 pixels blur on a high screen-speed object?  Perhaps we're gold-plating the tire irons and exceeding real life.

 

But in fairness and in argument with myself, there's another issue in the mix here.  When you move your hand, the thing receiving the information about it (the brain) is precisely the same thing invoking its movement (the brain)....it's not engaged in a visual pursuit.  It's counter-intuitive, but things don't move very predictably on the screen.  Even when a ball is thrown from left to right in a scene, even when you know this ahead of time, part of that is followed by the camera (not predictable by you) and part of it is forced to traverse screen coordinates.  Movement in screen space is difficult for the eye.  So perhaps we need better than real life on screen?

post #3206 of 3670
I've done tests -- as did dozens of others via my Blog -- it is surprising that it's very clearly detectable, under ideal conditions. 8-bit pixel art (high-contrast perfect boundaries between pixels), moving at motions at 1920 pixels per second -- the blur difference of 1/400sec strobe (4.8 pixels of blur) versus 1/700sec strobes (2.7 pixels of blur) becomes clearly noticeable for people who are sitting in front of a computer monitor at 1:1 viewing distance to screen width.

In general video situations (source based static blur & source based motion blur & compression based blur), this will be impossible to notice.

My point is, 600fps@600Hz isn't even the ultimate frontier.
Yes, most people do not care.
Yes, it won't usually be noticed in video material (except perfect pans with ultra-fast shutter speeds, and when you're sitting very close)
But the point is: running motion tests (such as PixPerAn, or the upcoming Blur Busters Motion Tests) under _ideal_ conditions (1:1 view distance, 8-bit pixel art, framerate=Hz, perfect motion moving at exact pixel steps per frame), 2-millisecond motion blur differences are _definitely_ noticeable by >50% of the human population. If you turn your head while tracking the moving object (to improve eye tracking accuracy further), it becomes even easier to see the blur differences. But I don't even need to turn my head to notice the difference -- eye tracking alone is good enough to detect 1ms blur differences under ideal conditions.

Useful tracking accuracy test: Hold up a magazine steadily with both of your hands outstretched. Stand up. Now spin while reading the magazine. Spin at ~30 degrees per second while reading the magazine -- that's spinning one revolution every ~10 seconds -- the rough tracking speed needed for 2000 pixels/sec at 1:1 view distance from a 27" monitor. More than half of human population is still able to accurately read the magazine text while spinning at this speed. Maybe your reading speed slows down, but you won't completely be unable to read the magazine. These humans will see motion blur at 1ms differences under ideal conditions (1:1 view distance from 27" 1080p monitor, 8-bit pixel art in existing motion test software). Once you're trained to see the differences (like learning to detect judder, or learning to detect DLP rainbows), it's quite easy to see. You do get spoiled by motion clarity once you're used to it (fast panning motions as perfectly clear as stationary motion).

I have users of my Blog who told me they wouldn't return to LightBoost=100% (400fps@400Hz motion blur equivalence) after becoming used to LightBoost=10% (700fps@700Hz motion blur equivalence). And that's in regular video games (not the ideal scenario of motion tests). A small, picky segment of the population, but not a non-existent one. I can personally vouch for seeing the motion blur difference of 1/400sec strobes versus 1/700sec strobes in actual video games when playing VSYNC ON at framerate=Hz, while strafing sideways (via arrow keys) at high speeds in front of high-detail wall textures (e.g. posters on virtual walls). The motion blur difference is less noticeable (or almost unnoticeable) with VSYNC OFF at fluctuating framerates (judder) but immediately reveals itself when the motion becomes more perfectly matched with the frame rate (e.g. playing old Source-engine games at 120fps on GeForce 600-series or 700-series cards). It becomes even easier to see when you use ultra-high-contrast boundaries.

Good example video game test case: Borderlands 2 video game (released September 18th 2012). Cartoony rotoscoped style graphics with lots of thin pixel-thick black lines and super-sharp contrasts. My trained eye can instantly see 0.5 millisecond blur differences (while turning or strafing) in THAT game during solo play when under these conditions: VSYNC=ON, framerate=Hz, on a powerful GPU, using a controller/mouse that introduces no microstutters (and in game options, slightly reduce view render distance until you get consistent framerate=Hz with no framedrops), playing using a gaming-caliber laser mouse (precise enough to eliminate visible microstutters). LightBoost=10% versus 50% versus 100% is very clearly distinguishable to my eyes (1.4ms vs 1.9ms vs 2.4ms, and yes, I measured using my oscilloscope -- which I did for my high speed video during LightBoost=100% at 2.4ms). The differences are not night and day (like 60Hz vs 120Hz), but easy to identify instantly during a smooth in-game turn or strafe.

Bottom line: During motion under ideal conditions, 600fps@600Hz (sample-and-hold method), or the use of 1/600sec strobes (flicker method), is certainly NOT the final frontier in motion blur for five-sigma population. (e.g. Finding a Hz or short strobe length that 99.999% population, when _fully_ trained-and-pointed-out-to-do-so, wouldn't be able to detect any display-enforced motion blur in 'perfect' blur free source material). Even 1/700sec refresh samples doesn't exceed a single sigma when pre-training the humans, and when using _ideal_ test case (motion test app). Sure, most population won't notice just like most population won't see 3:2 pulldown judder (until they are trained to see it), but once you are trained to see the blur (and use VSYNC=ON, framerate=Hz, for the 'perfect synchronized motion effect' -- ala Nintendo 60fps "Super Mario Brothers" silky smooth pan effect on CRT as the benchmark of motion perfection), it becomes easy to see 0.5ms differences in motion blur at typical computer monitor viewing distances. Yes, most modern games like Crysis 3 won't permit you the perfect synchronized motion effect, but many not-too-old games do at reduced detail settings or on Titan/700-series cards (e.g. Borderlands 2, Bioshock, Portal 2, etc) -- once you configure a game to do perfect refresh-synchronized motion, it becomes easy to see tiny differences in motion blur once you're trained to do so.

Displays are getting bigger, and many home theater users are viewing displays at 1:1 view distances -- the view distances necessary to easily (after pre-trained) see minor 0.5ms differences in motion blur (in perfect-sharp source material such as video games). More and more are using home theaters to play games too, so there's some overlap there. Video games are an extreme test case of seeing display based motion blur. Mind you, first-person-shooters and racing video games often cause people to track eyes faster than when watching most hockey/football material. Enough semi-new and many older PC video games exists today that are capable of being configured (VSYNC=ON, framerate=Hz, when using a sufficiently powerful GPU, and using a controller that introduces no visible stutters) to approach ideal test case scenarios necessary for humans to see (after pre-trained) 0.5ms differences in motion blur. Sometimes motion blur is good, natural and intended, but there are use cases where zero motion blur is strived for.

Future display tech such as OLED's (flicker based), blue-phase LCD's (microsecond speed LCD's), and strobe-backlight in regular LCD's (backlight strobes bypassing LCD pixel speed limitations), all can theoretically eventually gain motion resolution necessary to completely eliminate human-perceptible motion blur for five-sigma pre-trained population, possibly using 0.2ms strobes (a guessed number, based on my ability to easily tell apart motion blur of 1.4ms strobes versus 1.9ms strobes) -- similar to a CRT with shorter-persistence phosphor versus a CRT with a medium-persistence phosphor (another excellent example of humans seeing millisecond differences in motion blur/phosphor ghosting). At this point, 2000 pixels/sec would have less than half a pixel motion blur. Manufacturers may not go for it this decade, but the potential is there to successfully pull this off on a flat panel technology.
Edited by Mark Rejhon - 6/27/13 at 12:06am
post #3207 of 3670
600Hz looks technically over the top, but the 120Hz the Japanese intend for 8K sounds very reasonable. However, this is no reason for bashing 24Hz cinema, which is an extremely clever artistic medium proven innumerable times by brilliant work. A higher frame rate would provide another medium; claiming that it would be better sounds like saying, e.g., that the Impressionists would have produced better art if they had had computer graphics tools biggrin.gif.
post #3208 of 3670
Quote:
Originally Posted by Mark Rejhon View Post

I've done tests -- as did dozens of others via my Blog -- it is surprising that it's very clearly detectable, under ideal conditions. 8-bit pixel art (high-contrast perfect boundaries between pixels), moving at motions at 1920 pixels per second -- the blur difference of 1/400sec strobe (4.8 pixels of blur) versus 1/700sec strobes (2.7 pixels of blur) becomes clearly noticeable for people who are sitting in front of a computer monitor at 1:1 viewing distance to screen width.

 

I don't doubt that a bit.  Especially when people are looking for it.  What I question is whether we're taking into account two things: 1. how little we're bothered by motion blur in real-life vision, and 2. (even though the test above is viewer-driven) that camera tests are not optical tests.

 

 

Quote:
Originally Posted by Mark Rejhon View Post

Useful tracking accuracy test: Hold up a magazine steadily with both of your hands outstretched. Stand up. Now spin while reading the magazine. Spin at ~30 degrees per second while reading the magazine -- that's spinning one revolution every ~10 seconds -- the rough tracking speed needed for 2000 pixels/sec at 1:1 view distance from a 27" monitor. More than half of human population is still able to accurately read the magazine text while spinning at this speed.

 

Clarify: do you mean spin WITH the magazine?  I don't understand.  There's no additional eye tracking involved if the magazine is moving with your eyes.

post #3209 of 3670
Quote:
Originally Posted by irkuck View Post

600Hz looks technically over the top, but the 120Hz the Japanese intend for 8K sounds very reasonable. However, this is no reason for bashing 24Hz cinema, which is an extremely clever artistic medium proven innumerable times by brilliant work. A higher frame rate would provide another medium; claiming that it would be better sounds like saying, e.g., that the Impressionists would have produced better art if they had had computer graphics tools biggrin.gif.

 

I don't see it as "proven innumerable times" to be an "extremely clever artistic medium".

 

I see it as artists creating something the best they can given a medium that they know and can predict.

post #3210 of 3670
There is no need to go over 300fps, as that will eliminate "motion judder" even on fast-moving objects. Double-refreshing that to 600Hz seems unnecessary.
The BBC white paper on the tests they did some years ago to determine the frame rate that eliminates motion judder (PDF): http://downloads.bbc.co.uk/rd/pubs/whp/whp-pdf-files/WHP169.pdf

What should really happen is that recording separate frames becomes unnecessary, replaced by continuous recording of pixel changes in the scene in front of the camera.
The sensors are fast enough to record a pixel change within a thousandth of a second.
The problem currently is that a hold time is needed to move the data off the sensor and store it in the buffer/recorder.
When they manage to make faster processors that can fit into a camera, this should be rather easy.

Before broadcast moves to a new higher-resolution system, they need to create a global broadcast system that eliminates the difference between PAL and NTSC, which causes a huge amount of transcoding when programming from one system is sold to broadcasters using the other.
This work is ongoing, but the question will be whether they can agree on something fast enough not to delay the rollout of the higher-resolution UHD-1 and UHD-2 broadcasts.

As for frame rates in such a system, the most logical choice, and the most backwards-compatible with quality historical footage, would be to use a multiple of 24fps.
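
As a quick arithmetic aside (an illustrative calculation, not from any standards document): a master rate that can be divided cleanly down to all the legacy rates must be a common multiple of them, and the numbers work out as follows.

Code:
# Requires Python 3.9+ for math.lcm with multiple arguments.
from math import lcm

print(lcm(24, 30, 60))          # 120 -- covers film plus the NTSC-derived rates
print(lcm(24, 25, 30, 50, 60))  # 600 -- the smallest rate that also covers the PAL rates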
