Will OLED have better motion resolution and input lag than LCD and Plasma Tv's? - Page 3 - AVS Forum
post #61 of 89 Old 07-25-2013, 06:10 PM
Senior Member
 
rurifan's Avatar
 
Join Date: May 2013
Posts: 204
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 1 Post(s)
Liked: 12
Quote:
Originally Posted by borf View Post

Since I'm interested in this set, just asking...what was the back light set to?

Maximum, obviously. :)

Even in a dark room it was too dim for me to seriously consider using. (Frustrating tradeoff. Fortunately(?) the W802A also has major backlight uniformity issues, so I got rid of it.)
rurifan is offline  
post #62 of 89 Old 07-25-2013, 09:56 PM
AVS Special Member
 
xrox's Avatar
 
Join Date: Feb 2003
Posts: 3,169
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 50
Quote:
Originally Posted by Mark Rejhon View Post

Some flickers are brighter than others. It's often a sequence where many flickers are imperceptible, but a few flickers in certain colors (e.g. the last few subfield pulses of a refresh, the brightest ones) might be visible enough to be dominant, and in certain cases one can see the temporal displacement of the subfield flickers.
I'm sorry but I disagree that individual subfields could ever be noticed by anyone. Temporal integration of light by the human eye prevents this. Furthermore, the phosphor decay rates fill the dark time between subfields anyway. This is all in the Panasonic literature as well as literature from other companies.
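To put that temporal-integration point in equation form (the Talbot-Plateau law, written in my own notation):

```latex
L_{\text{perceived}} \;\approx\; \frac{1}{T}\int_{0}^{T} L(t)\,dt ,
\qquad \text{once the flicker frequency is above the fusion threshold.}
```

In other words, above the fusion threshold the eye reports only the time-averaged luminance of the subfield pulse train, not the individual pulses.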
Quote:
Originally Posted by Mark Rejhon View Post

The last few flickers of a pulsed sequence can be detected by the human eye as separate distinct flickers
Not sure how you came to this conclusion but I don't agree at all.
Quote:
Originally Posted by Mark Rejhon View Post

since subfield flickers often reveal themselves as a distinct artifact of banding (approximately PWM-style) in the motion blur of things during fast motion.
The banding is due to our eyes integrating subfield pulses along the motion vector. It is not subfields revealing themselves, or flicker, or whatever. It is a motion artifact. Most PDPs have fixed this issue AFAIK.
Quote:
Originally Posted by Mark Rejhon View Post

This also varies with color, e.g. mixed colors and dark shades (which can appear as a multiple-flicker-per-refresh effect if you're sensitive enough) versus solid whites (which resemble a clean one-flicker-per-refresh effect). For black/white solids (e.g. a moving solid white object on a solid black background), several plasmas make all flickers dim and only the last flicker very bright (it shows on a high speed camera), since in that case the plasma doesn't need to add colors via the temporal dithering effect. But for certain colors there are obvious multiple flickers, and I can see (with the human eye) motion artifacts caused by the multiple visible subfield flickers, e.g. blending into artifacts in fast-moving murky colors, along sharp versus fuzzy edges. The flickers can do a kind of temporal dithering effect (very noticeable in dark shades), since the flickers of some pixels are offset from the flickers of other pixels, in order to create intermediate shades.
High speed cameras capture snapshots in time during the compilation of subfields into a frame, as well as the leading blue rise of phosphor and trailing green. Please do not confuse this with how the human eye sees a PDP. Totally different.

Quote:
Originally Posted by Mark Rejhon View Post

Plasma subfields are very complex -- so complex that neither you, nor I, nor Chronoptimist can fully understand them (especially when you also throw in extra things like motion interpolation into the subfield refreshes -- which the Panasonic VT50 definitely does).
All of my knowledge and information is based on studying PDP literature and patent information over the last 10 years. I'd like to think that I have a very strong understanding of PDP driving, materials, and principles.
Quote:
Originally Posted by Mark Rejhon View Post

These noticeable flickers average out to a single flicker to the human eye. Isn't this exactly the same thing we're talking about? We're just interpreting plasma differently, but we're singing to the same choir with similar intentions. It's often just a matter of different terminology / semantics, which is creating misunderstandings here.
Just being honest in saying that my interpretation is directly from the literature. Yours is obviously not and is mostly incorrect IMO. I would not say that the flickers average out. Maybe I'm just misinterpreting terminology?
Quote:
Originally Posted by Mark Rejhon View Post

Again, I am very sensitive to plasma flicker, and when I walk up to the plasma I can, on occasion, see the temporal displacement of different subfield flickers
IMO when you walk up close to a plasma you see halftoning (spatial and temporal dithering and error diffusion), not flickering subfields.

Quote:
Originally Posted by Mark Rejhon View Post

Also, under a high speed camera (I have a budget 1000fps camera which I've pointed at a few displays), the multiple flickers are often temporally different colors, which indicates a lot of plasmas take advantage of temporal and spatial displacement to do both temporal dithering and spatial dithering simultaneously.
Again, high speed camera only captures a moment in time during compilation of the PDP frame. Colors are most likely due to the very significant difference in rise and fall times for the three phosphors.
Quote:
Originally Posted by Mark Rejhon View Post

I appreciate the plasma engineer's complex technical decisions that are made to create a plasma that creates nicely solid colors. I'm again, saying single flicker per refresh is "not that simple".
I personally find that you are looking to make this more complex than it is. Due to the ramping of subfield brightness we see one dark and one light period per refresh (i.e. flicker equal to the refresh rate). It is that simple.
Quote:
Originally Posted by Mark Rejhon View Post

But yes, we agree, the temporal displacements of the multiple pulses essentially merge into one nice looking, colorful flicker per refresh.
It is not the temporal displacement, it is the weighted sequence over time (ramping up and down, dark to bright).
Quote:
Originally Posted by Mark Rejhon View Post

Perhaps I am using the wrong terminology. I'm worthy of blame for that. I've just chosen the terminology that most accurately describes what I've seen with the camera and eyes. I'm viewing plasma without knowing the plasma engineering terminology, but with a technological appreciation of the temporal complexity of a plasma display, and observing with my eyes (which are sensitive to artifacts caused by various temporal effects).
I think you have an understanding but are not quite clear on how PDP actually works.

Over thinking, over analyzing separates the body from the mind
xrox is offline  
post #63 of 89 Old 07-26-2013, 01:49 AM
AVS Special Member
 
Mark Rejhon's Avatar
 
Join Date: Feb 1999
Location: North America
Posts: 8,124
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 2 Post(s)
Liked: 96
Quote:
Originally Posted by xrox View Post

I'm sorry but I disagree that individual subfields could ever be noticed by anyone. Temporal integration of light by the human eye prevents this.
You cannot detect flicker directly but you can detect it indirectly via the stroboscopic effect.
I present you this scientific proof, from a different industry (lighting industry):

[chart omitted: stroboscopic-effect visibility thresholds versus flicker frequency, from the lighting paper referenced below]

Some papers like this one use "flicker" to mean anything that turns ON/OFF, even beyond a human's ability to detect it directly, or even indirectly.
This may differ from the terminology that plasma engineers use. But as you can see from this chart, 10,000 Hz flicker can be indirectly detectable via the stroboscopic effect (page 6 of "Flicker Parameters for Reducing Stroboscopic Effects from Solid-state Lighting Systems") -- which is partly why new electronic ballasts for fluorescent lights have moved to 20,000 Hz, to get well beyond stroboscopic detection thresholds.
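A rough back-of-the-envelope sketch of why stroboscopic detection extends so far beyond direct flicker detection (the eye-speed figure and the code are my own illustration, not numbers from the paper):

```python
# During a fast eye movement, a strobing light leaves discrete images on the
# retina separated by (eye speed) / (flicker frequency); wide separations stay
# detectable as a stroboscopic / phantom-array effect long after direct
# flicker visibility is gone.
def phantom_array_spacing_deg(eye_speed_deg_per_s, flicker_hz):
    return eye_speed_deg_per_s / flicker_hz

for hz in (120, 1000, 10000, 20000):
    spacing = phantom_array_spacing_deg(300.0, hz)  # ~300 deg/s saccade (assumed figure)
    print(f"{hz} Hz -> {spacing:.3f} deg between strobe images")
```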
Quote:
The banding is due our eyes integrating subfield pulses along the motion vector. It is not subfields revealing themselves or flicker or whatever. It is a motion artifact. Most PDPs have fixed this issue AFAIK.
That's exactly what I'm talking about. The flicker is the motion artifact, and the motion artifact is the flicker revealing itself; the boundary gets blurred when we go from directly detectable flicker (the ~75 Hz flicker fusion threshold, approximately) into indirect stroboscopic detection. If you strictly define "flicker" as "directly detectable flicker", then you're correct -- nobody can detect flicker *directly* beyond roughly a high-double-digit Hz range (it varies by person; some people have high flicker thresholds, others much lower, and it depends on ambient lighting, the flicker duty cycle, etc.), and past that point it arguably should no longer be called "flicker". One may still detect 72 Hz flicker (e.g. CRT) via peripheral vision in a bright room, while the same person can't see 48 Hz flicker in a totally dark room with a different flicker duty cycle while staring directly at the screen (e.g. a movie theater). But that's not what I was talking about in my posts. The problem is that we had different definitions of the word "flicker".

Examples:
(1) "flicker" at the directly detected level; (e.g. staring directly and see flicker directly)
(2) "flicker" at the indirectly detected level; (e.g. banding artifacts, or stroboscopic / phantom array effect, or color-separation artifacts like DLP rainbows or phosphor color shift, etc)
(3) "flicker" at the essentially undetectable level; (e.g. 20,000 Hz flicker) *in some scientific contexts, this is a non-sequitur
(4) "flicker" at the localized/pixel level; (e.g. DLP noise, plasma noise, temporal dithering effects, flickering pixels, flickering areas, etc)
(5) "flicker" at the full screen level; (e.g. averaged/integrated flicker, screen flicker seen from sofa-viewing distance)
etc.

The plasma engineering papers may be defining flicker as strictly (1) and (5), and not describing (2) or (3) with the terminology "flicker".
However, some other engineering communities use a different definition of flicker (e.g. (2) and (3) are called "flicker").
As a blogger, it's always a challenge to convert confusing scientific material into something the average layman can consume, and that is one of the Blur Busters Blog's jobs.
Quote:
High speed cameras capture snapshots in time during the compilation of subfields into a frame, as well as the leading blue rise of phosphor and trailing green. Please do not confuse this with how the human eye sees a PDP. Totally different.
Not quite.
Here's where I have a gotcha for you:
When you run software that averages the high speed video frames together (stacking/merging frames totalling 1/60 sec into one frame), you actually end up with a good, colorful image.
The software integrated the video frames (i.e. averaged them together -- stacked them and produced a resulting merged frame).
The integrating/averaging similarity is quite remarkable.
Quote:
All of my knowledge and information is based on studying PDP literature and patent information over the last 10 years.
I believe you, we're just using different terminology.
Quote:
Just being honest in saying that my interpretation is directly from the literature. Yours is obviously not and is mostly incorrect IMO.
Not if you re-interpret my words. I equated detection of flicker with motion artifacts in certain contexts. However, this is possibly inaccurate under scientific definitions of flicker that require the flicker to be directly detectable (e.g. staring at stationary flicker).
Quote:
I would not say that the flickers average out.
For the purposes of display temporal effects, averaged=integrated
(These words are not the same -- however, for the purposes of this context, when staring at things that are static relative to the human eye -- staring at a static frame, or tracking a specific point in a panning frame (pursuit) -- averaging and integrating produce remarkably similar results, whether via the human eye or via video.)

When you run a computer program that averages together 1/60th of a second worth of high speed camera frames (16 or 17 high speed 1000fps video frames, of a plasma or DLP) into one image, the averaged image is actually a more accurately colorful image of the whole frame. (Assuming full exposures per frame -- 1/1000 sec exposure per frame on 1000fps video -- so totalling all frames together essentially behaves like a ~16/1000 sec or ~17/1000 sec exposure; something close to 1/60 sec worth.) It's the sum of all light captured for each major color component, whether by photoreceptors or by CMOS sensor pixels. Obviously, RGB camera sensor primaries don't perfectly match human eye primaries, but let's ignore that technicality for now -- what we observe is that the colors now dramatically resemble what the human eye saw when you integrate the video frames together. All the temporal effects are then integrated: phosphor decay, phosphor color shift during decay, DLP pixel PWM, temporal dithering, whatever -- it doesn't matter which temporal effects of which display -- if you stack the video frames (integrate them), it results in the same colorfulness the human eye tends to see. You could even stack multiple refreshes (e.g. about 33 high speed 1000fps frames, covering about two 60Hz refresh cycles), or more, to get a stronger and clearer integration (clearer/more stable color, more closely resembling what the human eye saw statically).

The mathematical integration of video frame stacking (adding the frames together and averaging the pixel values) produces results remarkably similar to human vision integration in many ways, i.e. the resulting image is surprisingly similar to what the human eye saw statically. Also, if you're lucky enough to have an expensive ultra-high-framerate 10,000fps camera, you'd stack together about 167 frames to equal one 1/60 sec refresh. The eyes certainly work differently, as the eyes do not operate on the basis of discrete frames. However, the averaging/integrating works remarkably similarly over longer timescales. Although cameras are only an approximate facsimile of what the human eye saw, the effect of the frame stacking (integration) produces a remarkable resemblance to what the human eye saw over a longer timescale (e.g. a full refresh). If you do it yourself, it's rather interesting to observe.
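If you want to try it, the stacking step is nothing fancier than a per-pixel mean over the frames spanning one refresh -- a minimal sketch using numpy (the frame loader is hypothetical; use whatever your capture software exports):

```python
import numpy as np

def stack_refresh(frames):
    """Average high speed camera frames (H x W x 3 uint8 arrays) spanning one
    display refresh into a single integrated image via a per-pixel mean."""
    acc = np.zeros(frames[0].shape, dtype=np.float64)
    for frame in frames:
        acc += frame.astype(np.float64)
    return (acc / len(frames)).astype(np.uint8)

# 1000fps camera on a 60Hz display: ~17 frames span one refresh.
# frames = [load_frame(i) for i in range(17)]   # load_frame() is hypothetical
# integrated = stack_refresh(frames)
```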

There are some minor psychophysics effects for human vision (e.g. the scientific paper "Flicker Induced Colors and Forms"), and of course camera color artifact quirks for cameras. However, these factors are insignificant at DLP/plasma frequencies when averaged/integrated over the timescale of a refresh. They don't meaningfully factor into modern DLP and plasma displays, and integrating video frames of a static image over a timescale produces results remarkably similar to what the human eye integrated when staring statically at that static image. (The same thing also applies during pursuit: staring at a specific point within a panning image then additionally integrates the effects of sample-and-hold motion blur -- as you can see from my pursuit camera work and the resulting pursuit camera photographs, which remarkably resemble what the human eye saw.)
Quote:
I think you have an understanding but are not quite clear on how PDP actually works.
No disagreement, but I still think you're misinterpreting me, again.
We're using different terminologies/definitions, I think. You may have terminology more suited to the scientific community, some of which I may not be familiar with. However, it's not even always consistent. Certain engineering communities sometimes come up with a different set of terminologies than others do (example: plasma engineers versus lighting engineers -- they define the word "flicker" differently).

Remember, some in this forum cater to video, and I often cater to computer/game users (I have enormously popular threads on certain gaming forum sites). I should point out, again, that it is far easier to see temporal display defects in moving computer imagery than in video (e.g. sharp boundaries, high-contrast edges, the ability to run high framerates matching the refresh rate, no compression artifacts, no compression softness, no video filtering, no softening during fast motion, etc.). Video has a natural softness to it, even on 1080p/24 Blu-ray, which essentially gives it anti-aliasing. And for the framerate=Hz requirement, lots of 60fps broadcasts are either 1080i/60 or 720p/60, both of which soften on 1080p displays (1080i due to deinterlacing, and 720p due to scaling) -- and a 20Mbps bitrate isn't enough to preserve perfect 1-pixel-level details at 960 pixels/sec during fast pans in sports video. In video, you may not see 1 pixel of motion blur at 960 pixels/sec even for sports motion. However, one often sits closer when using a computer (there are thousands of computer users on HardForum/OCN using TVs as monitors, sitting only 3-4 feet away), and in computer graphics (e.g. playing an FPS game), motion blur differences of 1 pixel at 960 pixels/sec can become noticeable to someone sensitive to motion blur, because the source material has no limiting factor hiding the display's motion blur. 1 pixel at 960 pixels/sec equals 1/960 sec of sample-and-hold. The ability to detect the motion clarity limitations of a display is higher for computer graphics/video games than for video. Video is full of gradients and details, provides its own natural anti-aliasing, and often the camera focus is too soft, or 4:2:2 chroma subsampling softens the 1-pixel-thick chroma details, and compression artifacts often prevent perfectly sharp non-antialiased edges during fast pans, etc.
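The arithmetic behind that 960 pixels/sec example is simple enough to write down (a minimal sketch; the numbers are just the ones used above):

```python
def motion_blur_px(persistence_s, speed_px_per_s):
    """Perceived blur width when the eye tracks motion on a display that keeps
    each frame illuminated for persistence_s seconds."""
    return persistence_s * speed_px_per_s

print(motion_blur_px(1 / 960, 960))   # 1.0 px  -- 1/960 sec of persistence
print(motion_blur_px(1 / 60, 960))    # 16.0 px -- full 60Hz sample-and-hold
```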

So I again point out that some papers, like this one, use "flicker" to mean anything that turns ON/OFF, even beyond a human's ability to detect it directly. And when staring at plasma noise in very dark shades of color at very close distances or under a magnifying glass, you're witnessing the "flickering" of individual pixels illuminating at different temporal offsets relative to adjacent pixels. However, when you step further back, it's all integrated/averaged (the noise blends into a solid shade). Seeing banding (caused by subfields) is simply the indirect detection of the flicker caused by the subfields, at least within the spatial area that the banding appeared in. Perhaps it should not be given the word "flicker", but it's still a stroboscopic effect, which is defined as "flicker" in some contexts (although perhaps not by the plasma engineers in the scientific papers you have read).

Again, this may or may not be the correct use of the terminology "flicker", but I think you can certainly understand what I'm getting at. Do a mental search-and-replace: change "flicker" (as used in human-vision contexts) into "the various temporal artifacts and side effects caused by the non-continuity of light at each point of the display", or some other proper substitute for the word "flicker", to describe the effects caused by the high-speed flashes of plasma cells. I believe that upon re-reading what I wrote, a lot of it is suddenly more in sync with what you've already written about. The problem is that the word "flicker" has sometimes been used in some papers to describe anything that turns on/off even at rates beyond direct human detectability (but still detectable stroboscopically), which is possibly the chief cause of the misunderstanding of my posts.

And, please, before replying, go re-read my original posts with that redefined meaning of the word flicker, and you'll see a lot of what I said is far more in sync with what you already said
(even if different communities still have conflicting meanings for the precise definition of "flicker"). Perhaps from now on the terminology around the word "flicker" needs to be clarified.

In fact, you posted an image that obviously shows we are talking about the same thing now:
Quote:
Originally Posted by xrox 
effectivedutycycle.jpg
PlasmaPhoshpordecay-1.jpg
As you already explained, but translated into the terminology I'm currently using -- the subfields flicker at increasing intensities (at a frequency beyond direct human detectability), shown as the black line in this image, and the eye averages/integrates them. The dotted line is really only an approximation of what the human eye actually perceives directly; in reality it varies from human to human. Again, this post is probably not using the same definition of "flicker" that plasma engineers use. The dotted line behaves like a moving average over a fraction of a second. The formula may not be exactly a plain average (e.g. calculus may be involved), but the dotted line is remarkably similar to an integrated moving average, just over a tiny timescale. Also, the flicker doesn't have to go all the way to full-off; fluorescent lamps often "flicker" unseen (they go dim-bright-dim-bright in undulations, often beyond human detectability). So again, we're just using different terminology to describe exactly the same thing.
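Here's a toy illustration of that "moving average" reading of the chart (the subfield weights, timings, and integration window are all made-up numbers, purely to show the shape):

```python
import numpy as np

dt_ms = 0.1
t = np.arange(0.0, 16.7, dt_ms)               # one 60Hz refresh
light = np.zeros_like(t)
weights = [1, 2, 4, 8, 16, 32, 64, 128]       # idealized binary subfield weights
starts = np.linspace(0.0, 14.0, len(weights)) # assumed pulse start times (ms)
for w, s in zip(weights, starts):
    light[(t >= s) & (t < s + 1.0)] = w       # ~1 ms of sustain per subfield

window = int(5.0 / dt_ms)                     # ~5 ms integration window (assumed)
perceived = np.convolve(light, np.ones(window) / window, mode="same")
# 'light' is the ramped pulse train; 'perceived' smooths into roughly one
# bright/dark cycle per refresh, like the dotted line in the chart.
```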

You are very welcome to correct my terminology, to make sure that we are in sync on the terms we use. What I already wrote in this thread is correct based on real-world observations, but it may be terminologically incorrect from the perspective of the language used by, say, plasma engineers. It only reads incorrectly because of the different terminology languages we are using. It is in the Blur Busters Blog's interest to carefully improve terminology. (For example, based on discussions on AVS Forum and with forum member tgm1024, I have now changed certain terminology -- e.g. I stopped using the phrase "pixel persistence" where a phrase such as "LCD panel's pixel transition speed" is more appropriate, since that phrasing is independent of whether the backlight is on or off.)

P.S. I still saw banding in certain motion tests in Panasonic VT50 at 1:1 view distance. So not all modern plasmas have 100% fixed this. And computer/game motion can reveal far more artifacts than video motion, due to the 'perfectly sharp' nature of computer graphics (no source based limitation that 'hides' display temporal artifacts; i.e. whatever is seen is definitely caused by the display and not source material).

Thanks,
Mark Rejhon

www.BlurBusters.com

BlurBusters Blog -- Eliminating Motion Blur by 90%+ on LCD for games and computers

Rooting for upcoming low-persistence rolling-scan OLEDs too!

Mark Rejhon is offline  
post #64 of 89 Old 07-26-2013, 02:55 AM - Thread Starter
Member
 
hcps-sulemanma's Avatar
 
Join Date: Dec 2008
Posts: 49
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 10
No one has commented on this yet so I'll post it here again:
Quote:
Originally Posted by hcps-sulemanma View Post

Also because of their sample and hold nature, wouldn't 24fps and 30 fps signals have no frame-doubling judder (and no uneven frame judder for 24 fps) in a 240 Hz LCD (or even 120 Hz)?
Compare that to plasma TVs (like the Panasonic VT60), which WOULD have frame-doubling judder for 30fps at a 60Hz refresh rate, and frame-doubling judder but no uneven-frame judder for 24fps at a 48Hz or 96Hz refresh rate?

That would mean a plasma can only properly display without frame-doubling or uneven frame judder if it only receives 60 fps signals. Why can't plasma have multiple refresh rates of 24 Hz, 30 Hz, and 60 Hz to cover all bases in terms of both types of judder?

I'll also ask: is it possible to improve plasma's motion resolution without having frame-doubling judder? Or, because of the subfield technology and phosphor decay rates, is it only possible to increase motion resolution at the expense of having one frame appear, then disappear, and then appear again (frame-doubling judder)?
hcps-sulemanma is offline  
post #65 of 89 Old 07-28-2013, 12:18 AM
AVS Special Member
 
Mark Rejhon's Avatar
 
Join Date: Feb 1999
Location: North America
Posts: 8,124
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 2 Post(s)
Liked: 96
Quote:
Originally Posted by hcps-sulemanma View Post

No one has commented on this yet so I'll post it here again:
I'll also ask: is it possible to improve plasma's motion resolution without having frame-doubling judder? Or, because of the subfield technology and phosphor decay rates, is it only possible to increase motion resolution at the expense of having one frame appear, then disappear, and then appear again (frame-doubling judder)?
Yes. Motion-interpolated subfields. The Panasonic VT50 does that.

Here is a scientific paper combining motion interpolation with subfield refreshes:
http://www.es.ele.tue.nl/~dehaan/pdf/54_Sid_pdp_mc.pdf

It works very well for sports material, but it does not work well for Game Mode, because Game Mode has to minimize input lag. So if you're using a low-input-lag plasma mode, the subfield-created motion artifacts can come back during gaming and computer use. Getting enough video data to calculate motion vectors mandates added input lag. In Game Mode you cannot look ahead a frame, so you can no longer do lookahead/lookbehind logic to calculate motion vectors, and without vectors you cannot do proper motion compensation/interpolation.

Thanks,
Mark Rejhon

www.BlurBusters.com

BlurBusters Blog -- Eliminating Motion Blur by 90%+ on LCD for games and computers

Rooting for upcoming low-persistence rolling-scan OLEDs too!

Mark Rejhon is offline  
post #66 of 89 Old 07-28-2013, 12:32 AM
AVS Special Member
 
Chronoptimist's Avatar
 
Join Date: Sep 2009
Posts: 2,559
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 1 Post(s)
Liked: 203
Looks like we have our answer for OLED image persistence:

mKhbiKK.jpg

http://www.journalofvision.org/content/13/7/6.full
Chronoptimist is offline  
post #67 of 89 Old 07-28-2013, 12:35 AM
AVS Special Member
 
Mark Rejhon's Avatar
 
Join Date: Feb 1999
Location: North America
Posts: 8,124
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 2 Post(s)
Liked: 96
Quote:
Originally Posted by Chronoptimist View Post

Looks like we have our answer for OLED image persistence:

mKhbiKK.jpg

http://www.journalofvision.org/content/13/7/6.full
Good new images.
However, old news...
...I already wrote on HardForum in June 2013 that the Sony OLED has 7.5ms of motion blur

This isn't OLED in general, but specific to one OLED -- the same panel used in the Sony Trimaster displays. Different OLEDs can use different strobe lengths, so this graph is not representative of OLED in general. The Trimaster uses a rolling scan technique: a row of pixels is lit up, and 7.5ms later that row of pixels is turned off.

The Samsung Galaxy S3 and PS Vita have 16.7ms of motion blur (at full brightness), the Sony Trimaster OLED has 7.5ms of motion blur, and several other OLED's come up with dramatically different values. Most passive matrix OLED's (PMOLED) have less than 1ms of motion blur -- mainly tiny MP3/music player displays -- but they aren't used for large full-color OLED displays. Those tiny OLED's have less motion blur than CRT. Sadly, it's unobtainium for large-size OLED at the moment. Presently, we're still waiting for numbers for LG's new OLED displays. The active matrix (transistors) puts some performance bottleneck on how quickly the OLED can be turned off immediately after it's turned on, but it can be made faster than 7.5ms in the future.

For other readers, see Blur Buster's article: Why Do Some OLED's Have Motion Blur?, an article I wrote half a year ago.

Thanks,
Mark Rejhon

www.BlurBusters.com

BlurBusters Blog -- Eliminating Motion Blur by 90%+ on LCD for games and computers

Rooting for upcoming low-persistence rolling-scan OLEDs too!

Mark Rejhon is offline  
post #68 of 89 Old 07-28-2013, 05:19 PM
AVS Special Member
 
Wizziwig's Avatar
 
Join Date: Jul 2001
Location: SoCal, USA
Posts: 1,161
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 81 Post(s)
Liked: 94
If you read that entire article, you will see some more comparisons to CRT. The OLED has a much slower rise time (as seen in the right graph above) but a generally faster fall time. This means that it does not suffer from the phosphor-decay afterglow/trail that you see on a CRT when displaying a moving white object on a black background. You also see in the above gray-to-gray graphs that there is no overdrive overshoot error like you see on LCDs, and zero pixel persistence/bleed into the next frame. Overall, it's a very promising start.
Wizziwig is offline  
post #69 of 89 Old 07-28-2013, 10:01 PM
AVS Special Member
 
xrox's Avatar
 
Join Date: Feb 2003
Posts: 3,169
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 50
Quote:
Originally Posted by Mark Rejhon View Post

Not quite.
Here's where I have a gotcha for you:
When you run software that averages the high speed video frames together (stacking/merging frames totalling 1/60 sec into one frame), you actually end up with a good, colorful image.
The software integrated the video frames (i.e. averaged them together -- stacked them and produced a resulting merged frame).
The integrating/averaging similarity is quite remarkable.
Mark, I'm not sure what point you are trying to make with that extremely long post. What I posted about high speed cameras is correct. If the camera frame rate is higher than the refresh rate of the PDP then you will capture only partially compiled PDP frames. When you play back this video it will look like it is flickering, due to repeated patterns of partial captures. Even when the camera's frame time is longer, you will see blue leading flashes and green trailing flashes due to the PDP and camera being out of sync and the phosphors reacting at different speeds.
Saying that software can integrate the high speed frames to prove me wrong is pointless. If you are just trying to say that a very high speed camera could capture enough snapshots of one compiled frame to show you how each frame is compiled, then you could have said that in 25 words or less. :D
Quote:
Originally Posted by Mark Rejhon View Post

staring at plasma noise in very dark shades of colors at very close distances or under magnifying glass, you're witnessing the "flickering" of individual pixels illuminating at different temporal offsets relative to adjacent pixels.
This is temporal dithering, not subfields. Spatial dithering is using adjacent pixels and temporal dithering is using multiple frames to generate a gray level. In other words the frequency you see is below 60Hz.
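A toy example of that, with made-up levels, just to show why the dither repeats below the refresh rate:

```python
def dithered_sequence(levels, n_frames):
    """Alternate a pixel between achievable levels on successive 60Hz frames."""
    return [levels[i % len(levels)] for i in range(n_frames)]

seq = dithered_sequence([32, 48], 8)     # target shade ~40
print(seq, sum(seq) / len(seq))          # averages to 40.0, but the pattern repeats at 30Hz
```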
Quote:
Originally Posted by Mark Rejhon View Post

Seeing banding (caused by subfields) is simply the indirect detection of the flicker caused by the subfields, at least within the spatial area that the banding appeared in. Perhaps it should not be defined as the word "flicker", but it's still a stroboscopic effect, which is defined as "flicker" in some contexts (although, perhaps not by the plasma engineers within the scientific papers you have read, however).
Again the banding caused by subfields is due to our eyes generating an incorrectly perceived gray level along the motion vector. Nothing to do with flicker.
Quote:
Originally Posted by Mark Rejhon View Post

P.S. I still saw banding in certain motion tests in Panasonic VT50 at 1:1 view distance. So not all modern plasmas have 100% fixed this. And computer/game motion can reveal far more artifacts than video motion, due to the 'perfectly sharp' nature of computer graphics (no source based limitation that 'hides' display temporal artifacts; i.e. whatever is seen is definitely caused by the display and not source material).
Panasonic maybe not, but Pioneer used contiguous subfields, meaning they were free from dynamic false contours.
Also, IMO what you are seeing is color separation due to the huge differences in phosphor rise and fall times. This will create blue and green banding when eye-tracking on any plasma (both static and moving images).

Over thinking, over analyzing separates the body from the mind
xrox is offline  
post #70 of 89 Old 07-28-2013, 10:19 PM
AVS Special Member
 
xrox's Avatar
 
Join Date: Feb 2003
Posts: 3,169
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 50
Quote:
Originally Posted by hcps-sulemanma View Post

No one has commented on this yet so I'll post it here again:
I'll also ask: is it possible to improve plasma's motion resolution without having frame-doubling judder? Or, because of the subfield technology and phosphor decay rates, is it only possible to increase motion resolution at the expense of having one frame appear, then disappear, and then appear again (frame-doubling judder)?
Increasing motion resolution on a PDP follows the same basic principle as on any other display: reduce the hold time (light emission time per unique frame). This includes interpolation, BFI, or just plain duty cycle compression.

However, PDP frames have a built-in dark and light period and therefore will always have to appear and disappear in a repeat-frame situation. As far as duty cycle goes, PDP effective duty cycle is shortened by reducing the phosphor decay (new fast phosphors) and by limiting the time available for bright subfields to emit light on moving portions of the screen. Another way is to reverse the subfield sequence, which lets the phosphor decay further before another refresh starts.
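As a rough rule of thumb (my notation, not a formula from the PDP literature), the tracked blur width scales with that hold time:

```latex
\text{blur width (px)} \;\approx\; t_{\text{hold}} \times v ,
\qquad t_{\text{hold}} \approx \text{duty cycle} \times \frac{1}{\text{refresh rate}},
\quad v = \text{panning speed (px/s)} .
```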

Over thinking, over analyzing separates the body from the mind
xrox is offline  
post #71 of 89 Old 07-28-2013, 10:35 PM
AVS Special Member
 
xrox's Avatar
 
Join Date: Feb 2003
Posts: 3,169
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 50
Quote:
Originally Posted by Mark Rejhon View Post

Yes. Motion-interpolated subfields. The Panasonic VT50 does that.

Here is a scientific paper combining motion interpolation with subfield refreshes:
http://www.es.ele.tue.nl/~dehaan/pdf/54_Sid_pdp_mc.pdf

It does work very well for sports material, but does not work well for Game Mode, as you now have to reduce input lag. So if you're using a lower-input-lag plasma, that can cause subfield-created motion artifacts to come back, during gaming and computer use. Getting enough video data to calculate motion vectors, mandates added input lag. In Game Mode, you cannot lookahead a frame, so you can no longer do lookahead/lookbehind logic to calculate motion vectors. And without vectors, you cannot do proper motion compensation/interpolation.
This response tells me you are not understanding PDP operation or that paper you cited. It sounds like you are confusing interpolated frames and interpolated subfields.

That paper is describing dynamic false contours, which are incorrect gray levels perceived by our eyes integrating non-contiguous subfields. This can be fixed by anticipating the contour effect and rearranging the subfields to compensate.

Another way to eliminate this issue is to use contiguous subfields like Pioneer does/did.

Over thinking, over analyzing separates the body from the mind
xrox is offline  
post #72 of 89 Old 07-29-2013, 01:13 PM
AVS Special Member
 
Mark Rejhon's Avatar
 
Join Date: Feb 1999
Location: North America
Posts: 8,124
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 2 Post(s)
Liked: 96
Quote:
Originally Posted by xrox View Post

This response tells me you are not understanding PDP operation or that paper you cited. It sounds like you are confusing interpolated frames and interpolated subfields.
I thought of that, but I read more closely and realized that the two are actually more intertwined than one expects. When I went to Best Buy a few months ago to do several motion tests from a laptop (using an early beta of TestUFO), it confirmed that on the VT50 *something* appears to be done in some of them to keep motion blur down, even in the temporally-displaced pulses of visible pixels; and the only explanation I could come up with was that motion vectors were being computed and interpolated -- and that's essentially motion interpolation.

I need to go back to the store and see what exactly was being done with some of the plasmas -- and capture some evidence within the next month or two, so that the plasma guys can explain what's being done -- if it's not motion interpolation.

Thanks,
Mark Rejhon

www.BlurBusters.com

BlurBusters Blog -- Eliminating Motion Blur by 90%+ on LCD for games and computers

Rooting for upcoming low-persistence rolling-scan OLEDs too!

Mark Rejhon is offline  
post #73 of 89 Old 08-01-2013, 02:27 AM - Thread Starter
Member
 
hcps-sulemanma's Avatar
 
Join Date: Dec 2008
Posts: 49
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 10
I have been away for a while, and coming back to this thread I think it's safe to say that the best technology for high motion resolution, judder-free, least-flicker video (adjustable to whatever the consumer can tolerate) would be one that can control both its sample-and-hold effect AND black frame insertion (or strobing) to any setting we like. It seems to me that in the coming future 240 Hz LCDs will be able to give us these settings. But I am not sure if plasmas can, because of their subfield-weighting method of delivering frames. Maybe OLEDs can in the future?
hcps-sulemanma is offline  
post #74 of 89 Old 08-01-2013, 02:36 AM - Thread Starter
Member
 
hcps-sulemanma's Avatar
 
Join Date: Dec 2008
Posts: 49
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 10
Quote:
Originally Posted by xrox View Post

Increasing motion resolution on PDP is the same basic principle as any other display. Reduce the hold time (light emission time per unique frame). This includes interpolating, BFI, or just plain duty cycle compression.

However, PDP frames have a built in dark and light period and therefore will always have to appear and disappear in a repeat frame situation. As far as duty cycle, PDP effective duty cycle is shortened by reducing the phosphor decay (new fast phosphors), and limiting the time available for bright subfields to emit light on moving portion of the screen. Also, another way is to reverse the subfield sequence which enables the phosphor to decay further before another refresh starts.

How does the plasma TV know what part of the frame is "moving" without adding input lag/motion calculations? It has to be done through motion interpolation, as Mark said, I am guessing. And why doesn't it just limit the bright subfields' duration for the entire frame at all times -- why only the parts of the frame that are "moving"? To me that would seem to create a non-uniform frame/video.

Also, for 24 fps and 30 fps content, the weighting of the subfields has to change compared to 60 fps content if we don't want frame-doubling judder. Can plasmas in the near future control this subfield weighting based on the fps (24, 30, 60) we are feeding them?
hcps-sulemanma is offline  
post #75 of 89 Old 08-01-2013, 07:57 AM
AVS Special Member
 
xrox's Avatar
 
Join Date: Feb 2003
Posts: 3,169
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 50
Quote:
Originally Posted by hcps-sulemanma View Post

How does the Plasma TV know what part of the frame is "moving" without adding input lag/motion calculations? It has to be done through motion interpolation as Mark said, I am guessing. And why doesn't it just limit the bright subfields time duration for the entire frame at all times, why only parts of the frame that are "moving"? To me that would seem to not create an uniform frame/video.
Two or more frames are analyzed for motion and processed accordingly before being displayed. This will of course take up some time.

Limiting the entire frame to only 1 subfield would limit the grayscale capability dramatically as all gray levels would be created by halftoning using one subfield weight.
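For readers following along, the usual binary-weighted scheme looks roughly like this (a toy sketch; real sequences use redundant, non-binary weights, but the principle is the same):

```python
WEIGHTS = [1, 2, 4, 8, 16, 32, 64, 128]   # idealized binary subfield weights

def subfields_for_gray(level):
    """Which subfields must fire during the frame to build an 8-bit gray level."""
    return [w for w in WEIGHTS if level & w]

print(subfields_for_gray(200))        # [8, 64, 128]
print(sum(subfields_for_gray(200)))   # 200
```

Drop to a single weight and the only levels a cell can hit directly are 0 and that weight; everything in between has to come from halftoning, which is the grayscale limitation described above.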
Quote:
Originally Posted by hcps-sulemanma View Post

Also for 24 fps and 30 fps content, the weighing of the subfields have to change to compared to 60 fps content, if we don't want frame doubling judder. Can plasma's in the near future control this subfield weighing based on the fps (24, 30, 60) we are feeding it?
The only situation where subfield weights have to change is when the panel refresh rate changes. This is because the time available for generating subfields changes.

Over thinking, over analyzing separates the body from the mind
xrox is offline  
post #76 of 89 Old 08-01-2013, 12:28 PM
AVS Special Member
 
Mark Rejhon's Avatar
 
Join Date: Feb 1999
Location: North America
Posts: 8,124
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 2 Post(s)
Liked: 96
Side note:

Regardless of technology (plasma, LCD, OLED, etc) the lag of motion processing goes down when interpolating a higher native input framerate. For example, if you have a native 240fps video source at the input level, you only need about 2/240ths of a second of buffering lag (less than 10ms) to do high quality predictive motion processing of any kind (anything that requires motion vectors). e.g. using interpolation from 240fps to 960fps.
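The arithmetic behind that (a minimal sketch; the two-frame buffer is the assumption stated above):

```python
def interpolation_buffer_lag_ms(native_fps, frames_buffered=2):
    """Added latency from buffering enough native input frames to compute motion vectors."""
    return frames_buffered / native_fps * 1000.0

print(interpolation_buffer_lag_ms(60))    # ~33.3 ms at 60fps input
print(interpolation_buffer_lag_ms(240))   # ~8.3 ms at 240fps input
```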

But since native 240fps is not going to be standardized in the foreseeable future...

Thanks,
Mark Rejhon

www.BlurBusters.com

BlurBusters Blog -- Eliminating Motion Blur by 90%+ on LCD for games and computers

Rooting for upcoming low-persistence rolling-scan OLEDs too!

Mark Rejhon is offline  
post #77 of 89 Old 08-01-2013, 01:30 PM
Senior Member
 
rurifan's Avatar
 
Join Date: May 2013
Posts: 204
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 1 Post(s)
Liked: 12
Quote:
Originally Posted by Mark Rejhon View Post

Regardless of technology (plasma, LCD, OLED, etc) the lag of motion processing goes down when interpolating a higher native input framerate. For example, if you have a native 240fps video source at the input level, you only need about 2/240ths of a second of buffering lag (less than 10ms) to do high quality predictive motion processing of any kind (anything that requires motion vectors). e.g. using interpolation from 240fps to 960fps.

Assuming the actual processing can be accomplished that quickly, which seems equally unlikely at present.

Especially as the TV makers focus on higher resolutions instead of better frame rates in content. :(
rurifan is offline  
post #78 of 89 Old 08-01-2013, 03:30 PM - Thread Starter
Member
 
hcps-sulemanma's Avatar
 
Join Date: Dec 2008
Posts: 49
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 10
Quote:
Originally Posted by xrox View Post

Two or more frames are analyzed for motion and processed accordingly before being displayed. This will of course take up some time.

Limiting the entire frame to only 1 subfield would limit the grayscale capability dramatically as all gray levels would be created by halftoning using one subfield weight.
The only situation where subfield weights have to change is when the panel refresh rate changes. This is because the time available for generating subfields changes.

I didn't say it clearly, but that is what I meant: is it possible for one plasma panel to offer all three refresh rates -- 24, 30, and 60 Hz?
We don't want any of the other refresh rates the Panasonic plasmas have, like 48 Hz and 96 Hz, because they will introduce frame-doubling judder.

Also, I am not saying that we need to limit the entire frame to 1 subfield, but we need to create subfields that last an even smaller amount of time than they currently do. Of course this would make the "heavily weighted subfield flicker" more obvious, but it would also make the individual "smaller subfield flicker" less obvious.
hcps-sulemanma is offline  
post #79 of 89 Old 08-12-2013, 07:08 AM
AVS Special Member
 
xrox's Avatar
 
Join Date: Feb 2003
Posts: 3,169
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 50
Quote:
Originally Posted by hcps-sulemanma View Post

I didn't say it clearly but that is what I meant, can or is it possible for one plasma panel to have the three refresh rates: 24, 30, 60 ?
We don't want any of the other refresh rates like the panasonic plasmas have, like the 48 Hz and the 96 Hz, because they will introduce frame doubling judder.
Yes, it is possible, but not practical. A PDP needs time to compile an image. If the refresh rate is too high (e.g. 120 Hz or more) the time available is too short, and thus the PDP will be too dim and have fewer subfields. If the refresh rate is too low (e.g. 48 Hz or less) there will be severe flicker.
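A rough time-budget sketch of why that is (the per-subfield costs below are assumptions for illustration, not Panasonic's real figures):

```python
def max_subfields(refresh_hz, address_ms=1.2, min_sustain_ms=0.3):
    """Every subfield needs an addressing pass plus some sustain (emission) time,
    so shorter frame times leave room for fewer subfields."""
    frame_ms = 1000.0 / refresh_hz
    return int(frame_ms // (address_ms + min_sustain_ms))

for hz in (48, 60, 120):
    print(f"{hz} Hz -> about {max_subfields(hz)} subfields fit per refresh")
```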
Quote:
Originally Posted by hcps-sulemanma View Post

Also I am not saying that we need to limit the entire frame into 1 subfield, but we need to create subfields that last even a smaller amount of time than they currently do now. Of course this would make the "big weighed subfields flicker" more obvious but it will also make the individual "smaller subfield flicker" less obvious.
In limiting the subfield time you are also limiting the brightness. It is a tradeoff.

Over thinking, over analyzing separates the body from the mind
xrox is offline  
post #80 of 89 Old 11-08-2013, 12:19 AM
Senior Member
 
ChadThunder's Avatar
 
Join Date: Aug 2013
Posts: 280
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 40 Post(s)
Liked: 49
Quote:
Originally Posted by xrox View Post

Yes, it is possible, but not practical. A PDP needs time to compile an image. If the refresh rate is too high (e.g. 120 Hz or more) the time available is too short, and thus the PDP will be too dim and have fewer subfields. If the refresh rate is too low (e.g. 48 Hz or less) there will be severe flicker.
In limiting the subfield time you are also limiting the brightness. It is a tradeoff.
My Panasonic 30 series plasma monitor runs at 120 and 125 Hz with no observable drop in luminance, bit-depth or image quality. Looking at the blank time in the refresh cycle leads me to believe Panasonic could have taken plasma to 180 Hz and well beyond :(

If you think plasma is dim, below is a video I made with contrast locked to the very lowest setting

(Stream LQ)
(Download HQ)

ChadThunder is online now  
post #81 of 89 Old 11-08-2013, 06:22 AM
AVS Special Member
 
xrox's Avatar
 
Join Date: Feb 2003
Posts: 3,169
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 50
Quote:
Originally Posted by ChadThunder View Post

My Panasonic 30 series plasma monitor runs at 120 and 125hz with no observable drop in luminance, bit-depth or image quality, looking at the blank time in the refresh cycle leads me to believe Panasonic could have taken plasma to 180hz and well beyond frown.gif
IIRC to prevent the luminance drop the panel runs with half the subfields it would run with at 60Hz. In Panasonic's case that is only 5 or 6 subfields per refresh. Reducing the addressing time and discharge delay may have improved this since I last checked but the limitation will always be there in a PWM based display.

180Hz and beyond would limit time even further.
ChadThunder likes this.

Over thinking, over analyzing separates the body from the mind
xrox is offline  
post #82 of 89 Old 11-08-2013, 07:23 AM
Senior Member
 
ChadThunder's Avatar
 
Join Date: Aug 2013
Posts: 280
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 40 Post(s)
Liked: 49
Quote:
Originally Posted by xrox View Post

IIRC to prevent the luminance drop the panel runs with half the subfields it would run with at 60Hz. In Panasonic's case that is only 5 or 6 subfields per refresh. Reducing the addressing time and discharge delay may have improved this since I last checked but the limitation will always be there in a PWM based display.
Well, perhaps the driving method was optimized for 3D on 14th-generation panels. I'm not lying when I say the still image is identical at 60 Hz and 120 Hz.

Also, nothing to do with refresh rate, but my theory is that modes like Dynamic and Normal use fewer sub-fields to achieve greater brightness at the expense of some detail (gradations, bit-depth, etc.). Do you know anything about that?
ChadThunder is online now  
post #83 of 89 Old 11-08-2013, 07:40 AM
AVS Special Member
 
xrox's Avatar
 
Join Date: Feb 2003
Posts: 3,169
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 50
Quote:
Originally Posted by ChadThunder View Post

Well perhaps the driving method was optimized for 3D on 14th generation panels. I'm not lying when I say the still image is identical at 60Hz and 120Hz

Also nothing to do with refresh rate but my theory is modes like Dynamic and Normal are using fewer sub-fields to achieve a greater brightness at the expense of some detail (gradations, bit-depth, etc) confused.gif do you know about that?
Not sure about the picture modes. It is certainly possible. As you say, fewer subfields enable higher brightness.
ChadThunder likes this.

Over thinking, over analyzing separates the body from the mind
xrox is offline  
post #84 of 89 Old 11-09-2013, 08:33 PM
Newbie
 
willkm79's Avatar
 
Join Date: Nov 2013
Posts: 8
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 10
Quote:
Originally Posted by Wizziwig View Post

You should really read this thread just a few posts below yours:

http://www.avsforum.com/t/1433254/lcd-motion-blur-eye-tracking-now-dominant-cause-of-motion-blur-not-pixel-persistence

Basically, in order to perceive a blur-free image, the display needs to strobe light in some way. While this may cause some visible flicker for some, it is how CRT achieved the unmatched motion quality.

As far as we know, all upcoming OLED TVs do not use strobing. They rely on motion interpolation to reduce blur. This will likely cause some input lag. Blur should be slightly less than an LCD because the pixels can switch states much faster.

Also, the Lightboost strobing is not compatible with consoles. It's not even fully compatible with most PC games that can't run reliably at 120+ fps.


Anybody know if the SED TV technology strobed light? Anybody aware of how SED TVs handled motion resolution?

 

Sounds like OLED is going to be more like LCD & Plasma than CRT.

willkm79 is offline  
post #85 of 89 Old 11-10-2013, 08:57 AM
AVS Special Member
 
Chronoptimist's Avatar
 
Join Date: Sep 2009
Posts: 2,559
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 1 Post(s)
Liked: 203
Quote:
Originally Posted by willkm79 View Post

Anybody know if the SED TV technology strobed light? Anybody aware of how SED TVs handled motion resolution?
Sounds like OLED is going to be more like LCD & Plasma than CRT.
As I understand it, SED used PWM, so they would have a plasma-like image. OLED is more like an LCD than anything else. (sample & hold image, with discrete gradation)
CRTs combined discrete gradation with a scanning image, so an OLED or LCD which scans the image can produce a very CRT-like image.
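A toy model of what a rolling-scan (CRT-like) drive would look like on such a panel (illustrative only, not any specific product):

```python
def lit_rows(t_ms, rows=1080, refresh_hz=60, persistence_ms=2.0):
    """Rows are refreshed top-to-bottom each frame and each stays lit for
    persistence_ms, so only a horizontal band of the panel emits at any instant."""
    frame_ms = 1000.0 / refresh_hz
    scan_row = int((t_ms % frame_ms) / frame_ms * rows)   # row being refreshed now
    band = int(rows * persistence_ms / frame_ms)          # rows still within persistence
    return [(scan_row - i) % rows for i in range(band)]

print(len(lit_rows(5.0)))   # ~129 of 1080 rows lit with 2 ms persistence
```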
Chronoptimist is offline  
post #86 of 89 Old 11-11-2013, 07:33 AM
AVS Special Member
 
tgm1024's Avatar
 
Join Date: Dec 2010
Posts: 6,078
Mentioned: 1 Post(s)
Tagged: 0 Thread(s)
Quoted: 236 Post(s)
Liked: 642
Quote:
Originally Posted by Chronoptimist View Post
 
Quote:
Originally Posted by willkm79 View Post

Anybody know if the SED TV technology strobed light? Anybody aware of how SED TVs handled motion resolution?
Sounds like OLED is going to be more like LCD & Plasma than CRT.
As I understand it, SED used PWM, so they would have a plasma-like image.

 

I'm glad you brought this up. It was previously my understanding that while FED required PWM, because of the voltage range differences SED did not strictly need it. That was key to my "rooting for it". I think I'm now mistaken. When I look around at varying sources (I don't use Wikipedia if I can help it), it seems that it's a requirement. I am now no longer in any way "hopeful" that SED succeeds, because PWM is just not a technology I'm impressed with, though I understand the need for it with the all-or-nothing emission characteristic of plasma.


WARNING: You have now entered a no @#$%tard zone. Please gather your anti-vaccine propaganda nonsense and slowly back out the way you came in.
tgm1024 is online now  
post #87 of 89 Old 11-11-2013, 11:14 AM
AVS Special Member
 
Chronoptimist's Avatar
 
Join Date: Sep 2009
Posts: 2,559
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 1 Post(s)
Liked: 203
Quote:
Originally Posted by tgm1024 View Post

I'm glad you brought this up. It was previously my understanding that while FED required PWM, because of the voltage range differences SED did not strictly need it. That was key to my "rooting for it". I think I'm now mistaken. When I look around at varying sources (I don't use Wikipedia if I can help it), it seems that it's a requirement. I am now no longer in any way "hopeful" that SED succeeds, because PWM is just not a technology I'm impressed with, though I understand the need for it with the all-or-nothing emission characteristic of plasma.
When they were talking about each pixel being its own "mini CRT" I was under that impression as well, but the sources I'm finding now all seem to indicate that they were both using PWM - which means I have no interest in them. I hate the way plasma builds up its image over multiple subfields.
Chronoptimist is offline  
post #88 of 89 Old 11-11-2013, 12:30 PM
AVS Special Member
 
xrox's Avatar
 
Join Date: Feb 2003
Posts: 3,169
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 50
Quote:
Originally Posted by Chronoptimist View Post

When they were talking about each pixel being its own "mini CRT" I was under that impression as well, but the sources I'm finding now all seem to indicate that they were both using PWM - which means I have no interest in them. I hate the way plasma builds up its image over multiple subfields.
The papers I have from Toshiba point towards a pure PWM. No subfields. SED was never the ultimate display it was hyped to be. IIRC power consumption, burn-in, viewing angle, and brightness were all questionable to some degree.

Over thinking, over analyzing separates the body from the mind
xrox is offline  
post #89 of 89 Old 11-11-2013, 01:05 PM
AVS Special Member
 
Chronoptimist's Avatar
 
Join Date: Sep 2009
Posts: 2,559
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 1 Post(s)
Liked: 203
Quote:
Originally Posted by xrox View Post

The papers I have from Toshiba point towards a pure PWM. No subfields. SED was never the ultimate display it was hyped to be. IIRC power consumption, burn-in, viewing angle, and brightness were all questionable to some degree.
Well that would be a step up at least, but yes SED/FED seemed quite overhyped.
Chronoptimist is offline  