HDMI 3.0 -- Adopt Variable Refresh Rates aka G-SYNC? (Good for video, too!) - AVS Forum
post #1 of 38 Old 10-25-2013, 09:33 PM - Thread Starter
AVS Special Member
 
Mark Rejhon's Avatar
 
Join Date: Feb 1999
Location: North America
Posts: 8,124
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 2 Post(s)
Liked: 102
Some of you may have heard of NVIDIA's G-SYNC, a new variable refresh rate technology that just got announced by NVIDIA:
http://www.geforce.com/whats-new/articles/introducing-nvidia-g-sync-revolutionary-ultra-smooth-stutter-free-gaming
It allows software to drive the timing of each refresh, asynchronously. Every time the graphics card finishes a frame, it's *immediately* displayed on the monitor. No scheduled discrete refresh intervals! Your software decides exactly when to display a frame -- right down to the microsecond.

This has great applications for home theater.

Another useful application of variable refresh rate technology (such as G-SYNC), when operated at maximum cable bandwidth, is low-latency fixed refresh rates. Since G-SYNC has the additional advantage of faster frame delivery and faster on-screen scanout, you can get 60fps@60Hz with cable delivery of only 1/144sec and screen scanout of only 1/144sec per refresh (or whatever maximum bandwidth the G-SYNC monitor uses). Even theoretical future 240Hz G-SYNC monitors using a future DisplayPort 2.0 would be able to do 60fps@60Hz, 77.5fps@77.5Hz, or 187fps@187Hz, all with just 1/240sec frame delivery / scanout latency!

In the past, 60Hz monitors took 1/60sec to scanout (16.7ms).
High speed video of CRT: http://www.youtube.com/watch?v=zVS6QewZsi4
High speed video of LCD: http://www.youtube.com/watch?v=nCHgmCxGEzY

But with G-SYNC, frame delivery and scanout are decoupled from the refresh rate. You can choose to do 60fps@60Hz with a lot less input lag than any 60Hz monitor -- even less input lag than a 60Hz CRT, because a CRT still takes a finite amount of time to scan from top to bottom.

Also, fixed refresh rate is great for G-SYNC too, as the software completely seamlessly drives refresh rate:
- Play movies 24fps@24Hz (or 48Hz, 72Hz, 96Hz)
- Play videos 30fps@30Hz
- Play movies 48fps@48Hz
- Play television 60fps@60Hz
- Play television 59.94fps@59.94Hz
- Play old silent films at original theater speed 18fps@18Hz (or 36Hz, 72Hz)
- Variable frame rate video files, director chooses their own frame rate.
- Future movie formats (e.g. 72fps, 96fps)
- Play mixed TV material and dynamically detect 24p, 30p, and 60p material, dynamically rate-adapt to new fixed rate (with zero mode-change flicker)
- Play games 60fps@60Hz with lower input lag (taking advantage of 1/144sec frame delivery times)
- Future consoles could support variable refresh rate to eliminate stutters too (the raison d'être of G-SYNC)
- Clear path to higher frame rates (e.g. NHK 8K 120Hz)
- You can avoid stutters even if one frame takes a bit longer to render (e.g. 16.8ms instead of 1/60sec = 16.7ms) for any innocuous reason such as processing, background apps or error correction; you just delay that specific refresh by an unnoticeable 0.1 millisecond rather than being forced to wait until the next refresh cycle. Makes video playback even smoother.
- etc.

Observe that HDMI 2.0 has the bandwidth to transmit an individual 1080p frame in less than 1/240th of a second. Cutting frame-delivery latency from 16.7ms all the way down to roughly 4.2ms is a major reduction.
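For a rough back-of-envelope check of that claim (a sketch only -- the 18 Gbit/s aggregate TMDS rate and 8b/10b encoding overhead are my assumed figures, and blanking/packet overhead is ignored):

```python
# Approximate time to push one uncompressed 1080p frame over an HDMI 2.0 link.
raw_rate_bps       = 18e9                   # assumed HDMI 2.0 aggregate TMDS bit rate
effective_rate_bps = raw_rate_bps * 8 / 10  # after 8b/10b encoding overhead
bits_per_frame     = 1920 * 1080 * 24       # one 1080p frame at 24 bits per pixel

delivery_ms = bits_per_frame / effective_rate_bps * 1e3
print(f"Frame delivery time: {delivery_ms:.2f} ms")   # roughly 3.5 ms
print(f"1/240 sec          : {1000 / 240:.2f} ms")    # about 4.17 ms
print(f"1/60 sec           : {1000 / 60:.2f} ms")     # about 16.7 ms
```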

So it has great applications for movies and home theater. Variable refresh rate capability and faster frame delivery belong in HDMI 3.0, in my humble opinion: less input lag for receivers, less input lag for sports, future game consoles (XBoxTwo, PS5), less broadcast latency thanks to sped-up frame delivery between set-top box and TV, less input lag everywhere, future-proof frame rates, faster frame delivery times from one home theater device to another....

Conclusion --
This may be 5 years, 10 years, or maybe until after patents expire, but I think this is an important innovation step, for TWO very, very major reasons:
1. Faster frame delivery time to the display; for less latency even at low frame rates; and
2. Eliminate humankind's dependence on discrete refresh rates. One small step (of many) towards the Holodeck.

Thanks,
Mark Rejhon


BlurBusters Blog -- Eliminating Motion Blur by 90%+ on LCD for games and computers

Rooting for upcoming low-persistence rolling-scan OLEDs too!

Mark Rejhon is offline  
post #2 of 38 Old 10-26-2013, 02:31 AM
AVS Special Member
 
Chronoptimist's Avatar
 
Join Date: Sep 2009
Posts: 2,587
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 1 Post(s)
Liked: 228
It's been talked about for years, but I'm glad that Nvidia have finally done this. The problem is that it's only going to show up in G-Sync monitors for the foreseeable future. Hopefully they'll push to get it in televisions, or someone will be interested in licensing it.
It will be interesting to see how variable framerates end up though. I'm sure it will be better than our current options (v-sync, triple-buffering, or screen tearing) but I'm having a hard time believing that a variable framerate is going to look smooth on a low persistence display.
Chronoptimist is offline  
post #3 of 38 Old 10-26-2013, 08:01 AM - Thread Starter
AVS Special Member
 
Mark Rejhon's Avatar
 
Join Date: Feb 1999
Location: North America
Posts: 8,124
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 2 Post(s)
Liked: 102
Quote:
Originally Posted by Chronoptimist View Post

but I'm having a hard time believing that a variable framerate is going to look smooth on a low persistence display.
There is more than one approach:

(1) ....You can create low persistence simply by using ultrahigh framerates. This works on displays without light modulation (no PWM, no plasma subfields, no DLP modulation, no phosphor decay, no strobe backlight). Continuous-light 1000fps@1000Hz is a persistence of 1ms. Variable frame rates open the path to ultrahigh framerates, eliminating the need for low-persistence-by-light-modulation (e.g. strobing). One step towards the Holodeck (real life has no framerate; or, equivalently, an infinite frame rate, depending on how you look at it).
....With ultrashort frame lengths (1ms per frame at 1000fps@1000Hz), low persistence is achieved without the need for any light modulation. There would be absolutely no flicker under a high speed camera; a high speed video of the screen would look the same as a high speed video of real life.
....You get variable motion blur; the higher the framerate, the less motion blur. Tomorrow's "Holodeck" content creator (director) can even control the amount of motion blur via the framerate, e.g. 500fps has twice the motion clarity of 250fps, and 1000fps has twice the motion clarity of 500fps. Motion blur generated by persistence is proportional to the distance of movement between individual frames -- the length of persistence itself. The motion blur is the same as the equivalent photographic camera shutter speed (e.g. 1/500sec persistence creates the same motion blur as a 1/500sec camera shutter). Obviously, the faster the motion, the more potential for motion blur, until objects move too fast for human eyes to track.

-or-

(2) I've mathematically determined that it is technically possible to have flicker-free, persistence-lowering light modulation with variable frame rates. You keep persistence high at low framerates, but gradually shorten persistence for progressively higher framerates above the flicker fusion threshold (e.g. begin modulating light towards one shorter strobe/peak/illumination per refresh cycle). Basically, you blend from PWM-free/ultrahigh-frequency PWM at low refresh rates (high persistence, sample-and-hold at low framerates) gradually to full strobing (light modulation) at higher frequencies, while maintaining a constant trailing-average brightness at all times. The result is a blended, flicker-free, variable-rate strobing/pulsing algorithm that is low-persistence at high framerates but high-persistence and flicker-free at low frame rates (a toy sketch follows below). This is easier to control with pulse technologies where you can ultra-precisely control persistence (e.g. DLP, OLED, and all-at-once strobe-backlight LCD) than with phosphor-based technologies (where you can't precisely control the speed of phosphor decay), or segmented scanning backlights (where light leakage between segments complicates precise persistence control). Fortunately, the world is heading towards OLED, and the persistence of OLED is controllable via adjustable pulse lengths, so this would solve the persistence problem without needing to go to ultrahigh framerates yet.
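To make the blending idea concrete, here is a minimal toy sketch (my own illustration, not any shipping algorithm). It assumes a backlight whose pulse width and pulse amplitude can be traded against each other, so the trailing-average brightness stays constant as the instantaneous refresh rate changes:

```python
# Blend from sample-and-hold (flicker-free) at low refresh rates to a short
# strobe at high refresh rates, keeping average brightness constant.
FLICKER_FUSION_HZ = 60.0    # below this, stay flicker-free (full duty cycle)
FULL_STROBE_HZ    = 120.0   # at or above this, use the shortest strobe
MIN_DUTY          = 0.15    # shortest pulse as a fraction of the refresh period
TARGET_BRIGHTNESS = 1.0     # desired trailing-average luminance (relative)

def backlight_pulse(refresh_hz):
    """Return (duty_cycle, amplitude) for one refresh at this rate."""
    if refresh_hz <= FLICKER_FUSION_HZ:
        duty = 1.0                                   # sample-and-hold, no flicker
    elif refresh_hz >= FULL_STROBE_HZ:
        duty = MIN_DUTY                              # full low-persistence strobe
    else:                                            # blend linearly in between
        t = (refresh_hz - FLICKER_FUSION_HZ) / (FULL_STROBE_HZ - FLICKER_FUSION_HZ)
        duty = 1.0 + t * (MIN_DUTY - 1.0)
    amplitude = TARGET_BRIGHTNESS / duty             # keep average brightness constant
    return duty, amplitude

for hz in (30, 60, 85, 100, 120, 144):
    d, a = backlight_pulse(hz)
    print(f"{hz:>3} Hz: duty {d:.2f}, amplitude {a:.2f}, average {d * a:.2f}")
```

The exact blend curve, thresholds, and minimum duty cycle are tuning parameters; the one hard constraint is that duty x amplitude stays constant, so there is no brightness flicker during rate transitions (which also assumes the backlight can be over-driven during short pulses).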
____

Note: Somebody PM'd me about
-- alternate framerate-less video technologies. Good discussion material, but probably too far in the future for now.
-- technologies that use eye tracking to increase framerates only in the portion of the image you are looking at
-- game framerates/refresh rates eventually getting high enough to tolerate interpolation without lag (e.g. games running at 250fps can be interpolated to 1000fps and only add 4ms of extra input lag)

Thanks,
Mark Rejhon


BlurBusters Blog -- Eliminating Motion Blur by 90%+ on LCD for games and computers

Rooting for upcoming low-persistence rolling-scan OLEDs too!

Mark Rejhon is offline  
post #4 of 38 Old 10-26-2013, 08:22 AM
AVS Special Member
 
Chronoptimist's Avatar
 
Join Date: Sep 2009
Posts: 2,587
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 1 Post(s)
Liked: 228
Well G-Sync exists because high framerates are not attainable. If they were, we would simply be locked to 120fps at 120Hz and there would be no issues using V-Sync as it already exists.

I'm not sure what you describe in #2 does anything to help with the judder caused by uneven framerates.
G-Sync should have less judder than games which are running at an unlocked framerate on a regular display, because that requires frames to be repeated, but I don't see how an uneven framerate will ever produce completely smooth motion.
Chronoptimist is offline  
post #5 of 38 Old 10-26-2013, 08:48 AM - Thread Starter
AVS Special Member
 
Mark Rejhon's Avatar
 
Join Date: Feb 1999
Location: North America
Posts: 8,124
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 2 Post(s)
Liked: 102
Quote:
Originally Posted by Chronoptimist View Post

Well G-Sync exists because high framerates are not attainable. If they were, we would simply be locked to 120fps at 120Hz and there would be no issues using V-Sync as it already exists.
Humans would still be able to see stutters on a 500Hz display: a single 1/500sec frame drop manifests itself as a sudden momentary doubling of persistence (doubling of motion blur). Eye-tracking 2000 pixels/sec motion, you have 4 pixels of movement per 500Hz refresh, and a single stutter at that rate creates a sudden off-by-4-pixel discontinuity. It's still noticeable in the modern world: as we get to retina/4K/8K/IMAX (easier to see blur/stutters), put displays closer to our eyes for more vision coverage (easier to see blur/stutters with longer eye tracking), and add faster motion (e.g. virtual reality, head turning) and sharper graphics (e.g. computers instead of video), persistence-related issues become that much easier to detect.

Yes, it's confirmed. Vpixx has a 500Hz projector used for vision research, and with it:
-- motion blur is still human-visible at 500fps@500Hz (e.g. 2 pixels of blurring in marquee text at 1000 pix/sec)
-- single frame drops are still human-detectable at 499fps@500Hz (e.g. panning full-screen patterns show a minor 'tick')
Despite diminishing returns, these artifacts haven't disappeared yet at 500fps.
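The arithmetic behind those figures is simple enough to check (a quick sketch using the numbers above):

```python
# Stutter and blur arithmetic for a tracked object on a sample-and-hold display.
eye_tracking_speed = 2000      # pixels per second of tracked motion
refresh_hz         = 500

pixels_per_refresh = eye_tracking_speed / refresh_hz
print(f"Motion per refresh at {refresh_hz} Hz: {pixels_per_refresh:.1f} px")

# A single dropped frame doubles the hold time of the previous frame, so the
# tracked object lands that many pixels away from where the eye expects it.
print(f"Discontinuity from one dropped frame: {pixels_per_refresh:.1f} px")

# Persistence-based motion blur is roughly tracking speed x frame hold time.
for hz in (60, 120, 500, 1000):
    blur_px = eye_tracking_speed / hz
    print(f"{hz:>4} Hz sample-and-hold: ~{blur_px:.1f} px of tracking blur")
```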
Quote:
I'm not sure what you describe in #2 does anything to help with the judder caused by uneven framerates.
Adding strobing won't add additional judder, if you do a blended algorithm similar to this one:

[Diagram: napkin sketch of a variable-rate strobing waveform]

Before you reply regarding this diagram, read this section closely to understand the situation better. Also, I've built a test electronic circuit -- my Arduino variable-LED-flickering tests show that strobe-rate transitions are possible without being human-noticeable. You just have to keep light modulations quick, to make sure that the trailing brightness average (over human flicker fusion threshold timescales) remains as constant as possible. It's far more challenging than a constant-rate flicker. Look at blackbody radiation, which has lots of noise (random high-frequency brightness modulations), and we see continuous steady brightness anyway -- the brightness modulation is at too high a frequency to be noticed. It is simply an engineering problem of precise, mathematically controlled light modulation: keep the average photon output per frame constant, keep photon-volume changes above flicker fusion thresholds, and prevent noticeable flicker during rapid photon-volume changes. At least for large percentiles of the human population.
Quote:
G-Sync should have less judder than games which are running at an unlocked framerate on a regular display, because that requires frames to be repeated, but I don't see how an uneven framerate will ever produce completely smooth motion.
Well, it definitely does. Uneven framerates actually produce shockingly smooth motion (as confirmed by many of us who saw G-SYNC), especially at framerates >60fps. See How does G-SYNC fix stutters?

Yes, I know, this is a hard concept to wrap your head around. But here's a diagram that helps explain why continuously variable framerates (as long as the variability is at a sufficiently high frequency) look perfectly smooth, provided the variable-framerate capture/recording/rendering is perfectly in sync with the variable-refresh-rate output (on a relative time basis). Explaining from an eye-tracking perspective:

Traditional Fixed Refresh Displays

[Diagram: frame rate vs refresh rate on a traditional fixed-refresh display]

Variable Frame Rates sync'd on Variable Refresh Rate Displays

[Diagram: variable frame rates synced to a variable-refresh-rate display]

It's amazing, but true. Zero erratic stutters.

This, of course, assumes that object positions in the variable-frame-rate content correspond to the delivery time to the human eye. Once you do that, erratic stutters are eliminated, and you don't see any erraticness during framerate transitions! (Yes, I was impressed that this was possible.)

Yes, there's a side effect -- see the earlier diagram of motion blur (smearing/ghosting).
The side effect of variable framerates is simply variable motion blurring on steady-light-output displays (e.g. sample-and-hold LCD).
During ~120fps variable-frame-rate output (fluctuating between 100fps and 150fps), the framerate varies so rapidly that the variable motion blurring blends into an average, constant motion blur.

Gamers lucky enough to have owned a 200Hz-capable CRT at one time (e.g. a 2048x1536 DiamondTron, capable of 200Hz@640x480) can easily see a one-frame stutter (e.g. 199fps@200Hz) -- and I know it would not stop there, because there are plenty of opportunities for a stutter to show up. (At 2000 pixels/sec eye tracking, a single stutter at 500Hz creates a 4-pixel misalignment.) In a world of higher pixel densities, displays closer to our retinas (e.g. VR), and faster, sharper motion (computer graphics), sensitivity to these artifacts only goes up.

People who have seen www.testufo.com (especially on a traditional 120Hz LCD computer monitor) are familiar with the relationship between motion blur and frame rate on non-light-modulated (sample-and-hold) displays. On such displays, the 120fps object has half the motion blur of 60fps, 60fps has half the motion blur of 30fps, and 120fps has one-quarter the motion blur of 30fps (though 30fps movement looks so 'shaky' that the motion blur becomes visible vibration instead, and the amplitude of that shaking is twice as much as at 60fps). Now G-SYNC makes all framerates look like framerate=Hz, so you've got smoother motion with no erratic stutters. Yes, you'll get "regular stutter" at low framerates (like 24fps@24Hz or 30fps@30Hz), just as you already see today at 30fps@30Hz or 30fps@60Hz, but beyond a certain framerate the regular stutters become so high-frequency that they look like motion blur instead. Plus, the important thing -- no erratic stutter during framerate changes. Everything always looks framerate=Hz at all times, even through varying framerates.

There are situations where we definitely do not want to add motion blur to the original source material or at the display. For things like virtual reality or video games, there are many use cases where we want motion blur to be 100% natural, completely generated by the human brain, with no additional blur forced upon us by the content/display. Motion blur is beneficial artistically, but it shouldn't be a guaranteed/forced bottleneck. We (directors, content creators, users) should be able to choose to go into a zero-motion-blur mode during certain times when we need to. The chain from the director/content to the human eyeballs should not have any motion-blur-adding bottlenecks, or Holodeck displays will be impossible.

*everyone* who saw G-SYNC in operation says there are no erratic stutters (sampling of G-SYNC news in mainstream media).

[Apologies if I've opened a Pandora's Box of multiple different topics at the same time -- but this is fascinating technology, and very fascinating stuff to display researchers like me.]

Thanks,
Mark Rejhon


BlurBusters Blog -- Eliminating Motion Blur by 90%+ on LCD for games and computers

Rooting for upcoming low-persistence rolling-scan OLEDs too!

Mark Rejhon is offline  
post #6 of 38 Old 10-26-2013, 08:54 AM
Senior Member
 
esdwa's Avatar
 
Join Date: Apr 2008
Location: United States
Posts: 358
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 1 Post(s)
Liked: 14
Quote:
@Mark Rejhon

24fps@24Hz
30fps@30Hz
60fps@60Hz
...etc

Are you sure you have clear understanding of what fps and Hz mean?
esdwa is offline  
post #7 of 38 Old 10-26-2013, 09:30 AM - Thread Starter
AVS Special Member
 
Mark Rejhon's Avatar
 
Join Date: Feb 1999
Location: North America
Posts: 8,124
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 2 Post(s)
Liked: 102
Quote:
Originally Posted by esdwa View Post

Are you sure you have clear understanding of what fps and Hz mean?
I do, yes. I'm one of the display experts here on AVSFORUM. I'm also the former moderator of the Home Theater Computers forum here, and I worked in the home theater industry for a number of years. I also invented the world's first open-source 3:2 pulldown deinterlacing algorithm, which was used in dScaler more than 10 years ago, back when video processors and line doublers still cost a lot of money. (more info)

If I misphrased something or one of my terminologies is incorrect, point it out, and I can explain, or fix a terminology error that confused you.
I am the creator of www.testufo.com too, in addition to being Chief Blur Buster at www.blurbusters.com .

Animations
www.testufo.com/eyetracking -- motion blur caused by persistence
www.testufo.com/blackframes -- reduction of motion blur by black frames (works better on a 120Hz computer monitor)
www.testufo.com/framerates -- 30fps versus 60fps. (If you have 120Hz, this compares 30fps vs 60fps vs 120fps).
(And several other tests selectable at the top-right corner).

For motion fully synchronized to refresh rate, make sure your web browser supports full VSYNC synchronization to refresh. System requirements www.testufo.com/browser.html -- Recent system containing good GPU like AMD, nVidia, or recent Intel graphics -- and GPU accelerated browser such as Chrome. It also works best if you're not running anything else, when running the web-based motion test.

Higher persistence creates more motion blur/ghosting effects. Persistence is not the same thing as pixel transitions (GtG). For more information, read Why Do Some OLED's Have Motion Blur?, as well as the scientific references that explain sample-and-hold (persistence). John Carmack and Michael Abrash have been talking a lot lately about this as well. I am excited about better OLED displays too, though at the moment high-efficiency all-at-once strobe-backlight LCDs have less motion blur, at least until OLED improves. TFTCentral has a good explanation of strobe backlights, which are more efficient than scanning backlights, and allow some LCDs to have less motion blur than some CRTs. For those not aware -- Blur Busters is the blog that helped make LightBoost popular (a low-persistence strobe backlight for LCDs), generating media coverage that refers to Blur Busters, rave reviews ("It's like a CRT") from high-end gamers, and YouTube high speed video proof that LCD pixel transitions can be bypassed via LightBoost. In fact, John Carmack, plus someone from NVIDIA, confirmed that an optional strobe backlight feature is now an official part of G-SYNC monitors.

Also, I wrote Electronics Hacking: Creating A Strobe Backlight, excellent background information for engineers who want to experiment with strobe backlight technologies.

I believe this eliminates any doubts that I know my frame rates (fps) and refresh rates (Hz). ;)

Thanks,
Mark Rejhon


BlurBusters Blog -- Eliminating Motion Blur by 90%+ on LCD for games and computers

Rooting for upcoming low-persistence rolling-scan OLEDs too!

Mark Rejhon is offline  
post #8 of 38 Old 10-26-2013, 10:06 AM
AVS Special Member
 
Chronoptimist's Avatar
 
Join Date: Sep 2009
Posts: 2,587
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 1 Post(s)
Liked: 228
Quote:
Originally Posted by Mark Rejhon View Post

Variable Frame Rates sync'd on Variable Refresh Rate Displays

http://www.blurbusters.com/wp-content/uploads/2013/10/fps-vs-hz-gsync.png
This is exactly what I'm talking about. When the object position is no longer moving at a consistent rate, how is that perceived as smooth motion?
One of the causes of judder is due to frame repeats, which G-Sync addresses, but it does not address this.
Quote:
Originally Posted by Mark Rejhon View Post

*everyone* who saw G-SYNC in operation says there are no erratic stutters (sampling of G-SYNC news in mainstream media).
Nvidia's demo was primarily focused on fixed framerates which did not sync up with the display. E.g. 40fps at 60Hz, rather than fluctuating framerates. It was part of the demo, but honestly, I don't trust much of the tech press on this.
Chronoptimist is offline  
post #9 of 38 Old 10-26-2013, 10:19 AM - Thread Starter
AVS Special Member
 
Mark Rejhon's Avatar
 
Join Date: Feb 1999
Location: North America
Posts: 8,124
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 2 Post(s)
Liked: 102
Quote:
Originally Posted by Chronoptimist View Post

This is exactly what I'm talking about. When the object position is no longer moving at a consistent rate
For the game use case, you need to view both sides of the equation: the source (game) and the destination (eyeballs). The timing of the object position inside the frame is now consistent with the timing of the eye-tracking position. Games can do that. It needs to be seen in person to be believed.
Quote:
how is that perceived as smooth motion?
It is much smoother motion because, although you may see variances in edge-strobing/motion-blurring, the object trajectory stays in far superior sync with the relative eye-tracking position.

You must, however, have frame capture/generation times correspond to frame presentation times.
e.g. frame captured/generated for T+1.3ms presented to the human eye at T+1.3ms
frame captured/generated for T+7.4ms presented to the human eye at T+7.4ms
frame captured/generated for T+11.9ms presented to the human eye at T+11.9ms
frame captured/generated for T+21.7ms presented to the human eye at T+21.7ms
frame captured/generated for T+30.5ms presented to the human eye at T+30.5ms
(etc.)
To eliminate erratic stutters, you must keep the object positions inside the frame, to correspond to the presentation time of the frame. Yes, this only works for games, and not for prerendered content (movies, etc).

As long as the intervals between the frames are sufficiently small, and the number of frames sufficiently high, it's perceived as smooth motion. The key is to make sure that rate changes occur at a sufficiently high frequency that the smoothness averages out to correspond with the average framerate; thus 60-100fps averages out to look like smooth 80fps@80Hz motion. For on-the-fly rendered content (games), a random 60-100fps on a variable-framerate display looks much better than 79fps@80Hz (one stutter per second).
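A minimal game-loop sketch of that principle (the function names here are hypothetical, not any real engine or driver API): the simulation is advanced to the time the frame is expected to reach the eye, so a late or early frame still shows the object where it belongs at that instant.

```python
import time

SPEED_PX_PER_S = 2000.0          # horizontal speed of a tracked object

def predicted_present_time(now, render_estimate_s=0.003, scanout_s=1/144):
    # In a real engine this would come from the swapchain / display driver;
    # here it is just "now + estimated render time + estimated scanout time".
    return now + render_estimate_s + scanout_s

def render_frame(present_t, start_t):
    # Place the object where it should be at the moment the frame is seen.
    return SPEED_PX_PER_S * (present_t - start_t)

start = time.perf_counter()
while time.perf_counter() - start < 0.05:        # run for ~50 ms as a demo
    now = time.perf_counter()
    t_present = predicted_present_time(now)
    x = render_frame(t_present, start)
    # On a variable-refresh display the frame is scanned out as soon as it is
    # finished, so (ideally) the actual presentation time equals t_present.
    print(f"frame simulated for t+{(t_present - start) * 1e3:6.1f} ms, x = {x:7.1f} px")
    time.sleep(0.008)                            # pretend rendering took ~8 ms
```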
Quote:
Nvidia's demo was primarily focused on fixed framerates which did not sync up with the display. E.g. 40fps at 60Hz, rather than fluctuating framerates. It was part of the demo, but honestly, I don't trust much of the tech press on this.
There is an artificial stutter-injector feature in some of their demos (not shown to everyone, but to reputable people). Erratic stutters didn't become visible in the animations until they were grossly dramatic (e.g. >1/30sec between frames). This is confirmed. Yes, 30fps@30Hz isn't as smooth looking as 60fps@60Hz. But a fluctuating 57-63fps now looks as perfectly smooth as 60fps@60Hz (assuming object positions inside the frames are adjusted to correspond to the frame fluctuations -- not a problem for realtime-generated computer graphics).

With video games dynamically adjusting object positions in each frame based on how early/late a frame gets presented;
random fluctuating video game framerate 25-35fps now looks as smooth as 30fps@30Hz
random fluctuating video game framerate 50-70fps now looks as smooth as 60fps@60Hz
random fluctuating video game framerate 80-140fps now looks as smooth as 110fps@110Hz
(etc.)
In these situations, the display motion blur varies only by about ~20% (proportional to the average variance of the interval between frames), which is not noticeable when the blur-trail length modulates at very high frequencies (e.g. a 110Hz average); it averages out to a fixed 110Hz-like motion blur looking darn near identical to 110fps@110Hz. Motion blur size modulations are FAR LESS noticeable than stutters. When the time between frames randomizes this quickly, the motion blur size modulates at rates above the flicker fusion threshold, so the motion blur size stays visually constant. Constant perceived smoothness, constant perceived blur size. Even when stutters varied a lot more (>20% variance), the visibility of motion blur modulations was less noticeable than the visibility of stutters.

Most PC videogames already do this (they adjust object positions based on when they think the frame will be presented to the screen). That timing is, however, bottlenecked/distorted by the forced granular refresh rate of traditional fixed-refresh-rate displays. Games do this anyway to be refresh-rate-independent, and to allow accurate object positions to keep VSYNC OFF smoother than otherwise (e.g. frame rates beyond the refresh rate). G-SYNC simply eliminates the granular discrete refresh rates (the last frame-timing weak link), finally making possible perfect synchrony between object positions (in the computer) and the frame hitting human eye retinas (at least within G-SYNC's native framerate range, currently 30fps to 144fps in the first upcoming monitors, i.e. intervals between frames varying between 1/30sec and 1/144sec), independently of when the frames are created.
The source frame timing equals the destination frame timing. Zero stutters during variable frame rates (above a threshold, ~60fps). Confirmed in demos.

This is an amazing new area of exploration for vision researchers. Without G-SYNC, even a single frame drop is generally noticed during consistent-speed motion tests (59fps@60Hz). With G-SYNC, it is impressive that random framerate variations within 10%-20% are generally not noticeable at all during 60fps+ situations, as long as the source (timing of object positions) stays in sync with the destination (timing of presentation to eyeballs). Variable refresh rate monitors allow framerates to vary much more before the variances are noticed. Researchers of the future will study questions such as: how much do framerates need to vary before the variances become noticeable to more than 50% of the population? It's all amazing new territory to explore, in the new reality of variable-frame-rate displays.

If game framerates were never variable, G-SYNC would not be necessary. But playing games like Crysis 3, we've got framerate variances all the way from 40fps to 144fps depending on which parts of the game we are playing on our 120Hz/144Hz computer monitors. G-SYNC is a godsend for those scenarios. Goodbye unnecessary 30fps caps, if this gets implemented in HDMI 3.0 and tomorrow's consoles take advantage of it.

Now, for pregenerated content, variable framerate has different advantages/purposes (see my original post above; re-read the first post). The advantages for prerendered content (movies/video) are different from the stutter-elimination advantages for video games, since games can adjust object positions in real time based on knowledge of when that specific frame will be presented to the human eye -- something you cannot do for prerendered content like movies/video.

Thanks,
Mark Rejhon


BlurBusters Blog -- Eliminating Motion Blur by 90%+ on LCD for games and computers

Rooting for upcoming low-persistence rolling-scan OLEDs too!

Mark Rejhon is offline  
post #10 of 38 Old 10-27-2013, 10:34 AM
Newbie
 
Alkaizer's Avatar
 
Join Date: Jul 2007
Posts: 10
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 10
so which tech is better for motion blur? this g-sync or lightboost?

also our TVs is causing problem.

dont u think lightboost display with G-sync combined would make best and beat crt
Alkaizer is offline  
post #11 of 38 Old 10-27-2013, 10:36 AM
Advanced Member
 
tory40's Avatar
 
Join Date: Apr 2011
Posts: 749
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 16
Great idea, but do you think there is a way to implement this into HDMI 2.0? I would bet that 3.0 is a long way off, IMO anyway. I wonder if there is a way to do this now with 2.0 using a firmware change. Surely the process that starts a refresh is controlled by code? Maybe not, but... perhaps if the process is started each time by re-writable code, then it could be timed? If it's stoppable by re-writable code, it could be delayed?

Makes me wonder what, if anything, in all the fancy new display features could possibly have caused TV manufacturers to build in software-controlled refresh.


tory40 is offline  
post #12 of 38 Old 10-28-2013, 01:15 PM - Thread Starter
AVS Special Member
 
Mark Rejhon's Avatar
 
Join Date: Feb 1999
Location: North America
Posts: 8,124
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 2 Post(s)
Liked: 102
Quote:
Originally Posted by Alkaizer View Post

so which tech is better for motion blur? this g-sync or lightboost?
G-SYNC monitors are better than LightBoost; they include a sequel to LightBoost.

LightBoost
-- Came out in late 2011 for 3D Vision.
-- Unofficially became popular for 2D motion blur elimination since early 2013. (Google "lightboost")
-- Degrades color somewhat.
-- Not officially sanctioned by NVIDIA for 2D usage.

G-SYNC
-- Hits the market early 2014.
-- G-SYNC is not LightBoost, but G-SYNC monitors includes an optional mode that's a LightBoost sequel that is "superior". (citation)
-- Should have better color than LightBoost
-- Officially sanctioned by NVIDIA for 2D usage.
-- Has other benefits such as variable refresh rate to eliminate tearing/stutters (although you can only choose between variable-refresh mode or strobing-mode, but you get both options in any G-SYNC monitor)
Quote:
also our TVs is causing problem.
Well, the problem isn't as big for televisions/movies as for video games, because video games create sharper & faster motion, which makes motion blur easier to see. A lot of elite gamers who pay top dollar for their hardware hate artificial external motion blur (either source-based or display-based) being added on top of the fully natural motion blur generated by our brains; we often don't want the display/game to be the motion blur bottleneck.

Blur Busters also exists because there's enough of these types of gamers, in addition to people like me....
Even I, running only a single-monitor, single-GPU setup, dislike input lag and motion blur too.
Input lag haters, motion blur haters. Different priorities than for video.

However, variable refresh rates and low latency intrinsically have so many applications in the near future. Other, more exotic technologies (e.g. direct brain interfaces, lasers into the retina, eye tracking to raise refresh rates only where the eye is pointing) will take far too long to arrive at consumer prices, so we're stuck with a dependence on traditional pixel-matrix technologies (LCD, OLED, etc.) for the foreseeable future.

LCD, LCoS, DLP, OLED, and discrete-pixel LED can all fairly easily be made variable-refresh-rate (rate-adaptive to the frame rate, at dynamically high speeds) without visible flickering, while with CRT / plasma it is, alas, more complex to eliminate the flickering caused by variable refresh rates (though you can use internal display electronics to choose the closest-matching refresh rate or refresh rate multiple, and then convert the incoming variability into that).
Quote:
dont u think lightboost display with G-sync combined would make best and beat crt
Theoretically, yes.

Combining G-SYNC and LightBoost is very appealing. With G-SYNC, you have the ability to do a faster scanout, and thus have less input lag than the bottom edge of a CRT (a CRT has fully zero signal lag only at the top edge of the image; it still takes a finite amount of time to "scan" from the top edge of the screen to the bottom edge). So an LCD with less input lag than a CRT is possible even for fully buffered refreshes (VSYNC ON, G-SYNC, etc.), because the entire refresh scans out faster. However, if you add strobing as well, the display has to wait for a refresh to completely finish before strobing. Even so, that doesn't stop a LightBoost+G-SYNC display from having less average input lag (including at the bottom edge of the screen) than a CRT, due to the more instantaneous full-screen presentation of images, instead of the old-fashioned CRT scanning way...
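Rough numbers for that scanout comparison (a simple illustration with my own assumed figures, ignoring pixel response time and any strobe wait):

```python
# Latency of the top, centre, and bottom of the screen relative to the start
# of scanout, for a 60 Hz CRT sweep versus a panel scanned out in 1/144 sec.
def scanout_latency_ms(scan_time_s):
    return 0.0, scan_time_s / 2 * 1e3, scan_time_s * 1e3   # top, centre, bottom

for name, scan_s in (("60 Hz CRT sweep", 1 / 60), ("1/144 sec fast scanout", 1 / 144)):
    top, centre, bottom = scanout_latency_ms(scan_s)
    print(f"{name:>22}: top {top:.1f} ms, centre {centre:.1f} ms, bottom {bottom:.1f} ms")
```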

Thanks,
Mark Rejhon


BlurBusters Blog -- Eliminating Motion Blur by 90%+ on LCD for games and computers

Rooting for upcoming low-persistence rolling-scan OLEDs too!

Mark Rejhon is offline  
post #13 of 38 Old 10-28-2013, 02:35 PM - Thread Starter
AVS Special Member
 
Mark Rejhon's Avatar
 
Join Date: Feb 1999
Location: North America
Posts: 8,124
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 2 Post(s)
Liked: 102
Quote:
Originally Posted by tory40 View Post

Great idea, but do you think there is a way to implement this into HDMI 2.0? I would bet that 3.0 is a long ways off, IMO anyway. I wonder if there is a way to do this now with 2.0 using a firmware change. Surely the process that starts a refresh is controlled by code? Maybe not, but.. perhaps If the process is started each time by re-writable code, then i could be timed? If its stoppable by re-writable code, it could be delayed?
Yes, HDMI 2.0 could technically be upgraded to variable refresh rates. A specification would be needed. Might as well call it HDMI 2.5 or something, to prevent confusion, but without waiting for HDMI 3.0. Variable refresh rates can be as simple as using dynamically-resized blanking intervals -- something that can be done over any traditional signal with a synchronization interval (which includes VGA, HDMI, DVI, etc.). The problem is in hardware that can output it, and displays that can accept it. There are major complexities in having a display become truly variable-refresh-rate without artifacts during refresh rate transitions. G-SYNC monitors can change refresh rates more than 100 times a second (every single frame!) without refresh-rate-transition artifacts. LCD, LCoS, DLP, OLED, and discrete-pixel LED can all fairly easily be made variable-refresh-rate (rate-adaptive to frame rate, at dynamically high speeds) without visible flickering, while CRT / plasma is much harder (flicker caused by variable refresh).
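Here is a sketch of how the "dynamically resized blanking interval" approach could work, with illustrative timing numbers of my own (roughly a 1080p reduced-blanking mode, not any real product's timings): the source keeps the pixel clock and line timing fixed and simply keeps emitting blank lines until the next frame is ready.

```python
# Variable refresh by stretching the vertical blanking interval.
PIXEL_CLOCK_HZ = 333_000_000   # assumed fixed pixel clock (~1080p 144 Hz reduced blanking)
H_TOTAL        = 2080          # pixels per line, including horizontal blanking
V_ACTIVE       = 1080          # visible lines per frame
V_BLANK_MIN    = 30            # minimum vertical blanking lines

line_time_s   = H_TOTAL / PIXEL_CLOCK_HZ
min_refresh_s = (V_ACTIVE + V_BLANK_MIN) * line_time_s   # fastest possible refresh

def blank_lines_for_interval(frame_interval_s):
    """Extra blank lines needed so one refresh lasts frame_interval_s."""
    total_lines = frame_interval_s / line_time_s
    return max(V_BLANK_MIN, round(total_lines) - V_ACTIVE)

print(f"Fastest refresh: {min_refresh_s * 1e3:.2f} ms ({1 / min_refresh_s:.0f} Hz)")
for fps in (144, 100, 75, 48, 30):
    print(f"{fps:>3} fps -> {blank_lines_for_interval(1 / fps):>5} blanking lines per refresh")
```

In practice the monitor also has to tolerate the stretched blanking without flicker or brightness shifts, which is where the display-side complexity comes in.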

Thanks,
Mark Rejhon


BlurBusters Blog -- Eliminating Motion Blur by 90%+ on LCD for games and computers

Rooting for upcoming low-persistence rolling-scan OLEDs too!

Mark Rejhon is offline  
post #14 of 38 Old 10-28-2013, 10:15 PM
Advanced Member
 
tory40's Avatar
 
Join Date: Apr 2011
Posts: 749
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 16
Where's the petition page? Seems like they could really fill a sizable niche at little cost by providing "A+ gaming certified" TVs with a G-Sync-like solution, low input lag, fast pixel response times for 3D, and 120Hz+ motion enhancements. They could add it to just a single line of TVs.


tory40 is offline  
post #15 of 38 Old 10-29-2013, 05:41 AM
AVS Special Member
 
tgm1024's Avatar
 
Join Date: Dec 2010
Posts: 6,421
Mentioned: 1 Post(s)
Tagged: 0 Thread(s)
Quoted: 430 Post(s)
Liked: 797
Quote:
Originally Posted by Mark Rejhon View Post

However, that doesn't stop a LightBoost+G-SYNC display having less average input lag (including bottom edge of screen) than a CRT, due to more instantaneous full-screen presentation of images, instead of the old-fashioned CRT scanning way...

 

There certainly is no vertical blanking interval any longer, but a monitor still needs to perform a fetch from a backing store someplace, and that's done one at a time unless the memory is multiported or partitioned for parallel fetches, no?  Either top to bottom, or in bands.

 

We're not yet at the pie in the sky era of having every pixel latched to a memory location and having the two change asynchronously to everything else.


Beware the statistical correlations that sound like they're indicative of something. Drowning deaths are tightly correlated to ice cream consumption. In fact, be wary of any statistic that is stated as if it comes with a self-evident conclusion: there is no such thing.
tgm1024 is offline  
post #16 of 38 Old 10-29-2013, 02:31 PM - Thread Starter
AVS Special Member
 
Mark Rejhon's Avatar
 
Join Date: Feb 1999
Location: North America
Posts: 8,124
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 2 Post(s)
Liked: 102
Quote:
Originally Posted by tgm1024 View Post

There certainly is no vertical blanking interval any longer, but a monitor still needs to perform a fetch from a backing store someplace, and that's done one at a time unless the memory is multiported or partitioned for parallel fetches, no?  Either top to bottom, or in bands.
On the display, nope...
On the cable, yes....
The blanking interval still exists in DisplayPort, HDMI, DVI.

My two gaming monitors (one 120Hz, one 144Hz) scan the LCD in real time, directly from the cable, without any framebuffering whatsoever, in regular gaming mode. My high speed video showed this, and I also measured 2.8 milliseconds of input lag for the top edge of the screen, between the computer side and the pixels reaching the 50% midpoint of their transition (on the photodiode). Both measurements show the realtime scanout nature.

From what I know now, G-SYNC behaves the same way; the scanout is done on the fly as the bits come in from the cable. But now the scanouts are done on demand rather than at a regularly scheduled interval (there might be a backing store of a single scanline for some processing, but definitely not full frame buffering). What frame buffering does exist is in the form of history (past) framebuffers to help with realtime, on-the-fly LCD overdrive calculations.

Band-scanning is currently discouraged, as it creates tear artifacts during fast horizontal motion. Even a clean sweep scan still creates skew artifacts (e.g. www.testufo.com/blurtrail with "Height" = "Full Screen" creates a tilt on 60Hz CRTs and 60Hz LCDs, including on iPads in landscape mode -- try it!), while zone/band scanning creates stationary tear artifacts. A good old 1990's paper about the artifacts of band scanning: http://www.poynton.com/PDFs/Motion_portrayal.pdf (see page 5)
Also, the Sony Crystal LED prototype (not OLED) from a year ago had band-scanning artifacts during fast horizontal motion, which was noticed in fast pans.

A rep from NVIDIA did say they use the variable-blanking-interval method to achieve variable refresh rates. Although the display itself (e.g. LCD) does not really need blanking intervals, they are still used on the cable, and this legacy feature is carried over all the way to DisplayPort. You've seen the timing numbers in ToastyX Custom Resolution Utility and NVIDIA's custom resolution utility (both modern equivalents to PowerStrip), and they still allow you to adjust the blanking intervals and porch timings, etc. Even though displays have moved on from needing them, they are still a legacy part of the signal. Yes, that means roughly 10% of the bandwidth for transmitting refreshes over a cable is wasted in blanking intervals. Reduced blanking intervals are used to achieve 144Hz, using roughly the same bandwidth as 120Hz.
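For a rough sense of that overhead, compare two well-known 1920x1080 timings (totals approximate; the exact percentage depends on the timing formula used):

```python
# Fraction of transmitted pixels spent in blanking for two 1080p timings.
ACTIVE = 1920 * 1080

for name, h_total, v_total in (("CEA-861 1080p (standard blanking)", 2200, 1125),
                               ("CVT reduced blanking", 2000, 1111)):
    overhead = 1 - ACTIVE / (h_total * v_total)
    print(f"{name:>34}: {overhead * 100:4.1f}% of the signal is blanking")
```

That roughly brackets the ~10% figure: around 16% with classic blanking, under 7% with reduced blanking.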

Thanks,
Mark Rejhon


BlurBusters Blog -- Eliminating Motion Blur by 90%+ on LCD for games and computers

Rooting for upcoming low-persistence rolling-scan OLEDs too!

Mark Rejhon is offline  
post #17 of 38 Old 10-29-2013, 02:53 PM
AVS Special Member
 
tgm1024's Avatar
 
Join Date: Dec 2010
Posts: 6,421
Mentioned: 1 Post(s)
Tagged: 0 Thread(s)
Quoted: 430 Post(s)
Liked: 797
Quote:
Originally Posted by Mark Rejhon View Post
 
Quote:
Originally Posted by tgm1024 View Post

There certainly is no vertical blanking interval any longer, but a monitor still needs to perform a fetch from a backing store someplace, and that's done one at a time unless the memory is multiported or partitioned for parallel fetches, no?  Either top to bottom, or in bands.
On the display, nope...
On the cable, yes....
The blanking interval still exists in DisplayPort, HDMI, DVI.

My two gaming monitors (one 120Hz, one 144Hz) scans the LCD's, real-time directly from the cable, without any framebuffering whatsoever, in regular gaming mode.


Imagine that the computer drawing needs to slow down a moment or two, either because of some complexity limit server side, or because there's a new monitor concept of updating only regions and parts of it are stagnant.  (Disparate issues, disparate display tech, with the same problem).  You'll then have a case where the monitor cannot reflash the frame on its own (to defeat the strobing) unless the entire frame is present within the display.  It needs to reflash the frame at a minimum interval of the persistence of vision.


Beware the statistical correlations that sound like they're indicative of something. Drowning deaths are tightly correlated to ice cream consumption. In fact, be wary of any statistic that is stated as if it comes with a self-evident conclusion: there is no such thing.
tgm1024 is offline  
post #18 of 38 Old 10-29-2013, 03:29 PM - Thread Starter
AVS Special Member
 
Mark Rejhon's Avatar
 
Join Date: Feb 1999
Location: North America
Posts: 8,124
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 2 Post(s)
Liked: 102
Quote:
Originally Posted by tgm1024 View Post

Imagine that the computer drawing needs to slow down a moment or two, either because of some complexity limit server side, or because there's a new monitor concept of updating only regions and parts of it are stagnant.  (Disparate issues, disparate display tech, with the same problem).  You'll then have a case where the monitor cannot reflash the frame on its own (to defeat the strobing) unless the entire frame is present within the display.  It needs to reflash the frame at a minimum interval of the persistence of vision.
Right, this is more of a consideration for impulse displays.
Currently, G-SYNC monitors are sample-and-hold. This would thus, not be a problem/consideration for LCD, OLED, or DLP, all of which can be technically made variable-refresh-rate.

For strobe backlights in gaming monitors, I drew diagrams here on how you can blend PWM-free (at low frame rates) and strobing (at high frame rates), to get flickerfree at low refresh rates, and strobing at high refresh rates:
http://www.blurbusters.com/faq/creating-strobe-backlight/#variablerefresh
The diagram there shows a flicker-free variable-rate strobing algorithm for 120Hz video game monitors; I'd love to see NVIDIA attempt to combine G-SYNC and strobing, and I've heard NVIDIA is reportedly already working on this. When 120Hz gets standardized for consumers, hopefully within ten years (e.g. NHK 8K 120Hz), there will be more practical possibilities in regard to strobing, since it's less problematic to do in an interpolation-free way at 120Hz than at 60Hz...

But this, clearly, is a separate topic altogether, as eventually displays may migrate to low-persistence with zero light modulation (e.g. ultrahigh frame rates or other exotic technologies).

Thanks,
Mark Rejhon


BlurBusters Blog -- Eliminating Motion Blur by 90%+ on LCD for games and computers

Rooting for upcoming low-persistence rolling-scan OLEDs too!

Mark Rejhon is offline  
post #19 of 38 Old 10-29-2013, 04:25 PM
AVS Special Member
 
tgm1024's Avatar
 
Join Date: Dec 2010
Posts: 6,421
Mentioned: 1 Post(s)
Tagged: 0 Thread(s)
Quoted: 430 Post(s)
Liked: 797
Like 8,000,000-ported memory where each pixel is updated instantly and asynchronously from the others.... :)

Beware the statistical correlations that sound like they're indicative of something. Drowning deaths are tightly correlated to ice cream consumption. In fact, be wary of any statistic that is stated as if it comes with a self-evident conclusion: there is no such thing.
tgm1024 is offline  
post #20 of 38 Old 11-18-2013, 09:06 AM - Thread Starter
AVS Special Member
 
Mark Rejhon's Avatar
 
Join Date: Feb 1999
Location: North America
Posts: 8,124
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 2 Post(s)
Liked: 102
I'm pretty curious what single-cable standard NHK 8K 120Hz would go over. It's going to require essentially 8 times the bandwidth of HDMI 2.0. Does anyone know what they're discussing?

On such a cable, over 1000fps realtime at 1080p should be easily possible. (Or any form of display technology that creates <1ms of flickerfree/non-light-modulated persistence)

Thanks,
Mark Rejhon


BlurBusters Blog -- Eliminating Motion Blur by 90%+ on LCD for games and computers

Rooting for upcoming low-persistence rolling-scan OLEDs too!

Mark Rejhon is offline  
post #21 of 38 Old 11-18-2013, 09:48 AM
AVS Special Member
 
irkuck's Avatar
 
Join Date: Dec 2001
Location: cyberspace
Posts: 3,502
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 35 Post(s)
Liked: 63
Quote:
Originally Posted by Mark Rejhon View Post

I'm pretty curious what single-cable standard NHK 8K 120Hz would go over. It's going to require essentially 8 times the bandwidth of HDMI 2.0. Does anyone know what they're discussing?

On such a cable, over 1000fps realtime at 1080p should be easily possible. (Or any form of display technology that creates <1ms of flickerfree/non-light-modulated persistence)

Wait till ~2020. This is a rather hard problem for today's tech.

irkuck
irkuck is offline  
post #22 of 38 Old 11-18-2013, 10:26 AM
AVS Special Member
 
tgm1024's Avatar
 
Join Date: Dec 2010
Posts: 6,421
Mentioned: 1 Post(s)
Tagged: 0 Thread(s)
Quoted: 430 Post(s)
Liked: 797
Every year is "the wrong year to buy a tv"...

Beware the statistical correlations that sound like they're indicative of something. Drowning deaths are tightly correlated to ice cream consumption. In fact, be wary of any statistic that is stated as if it comes with a self-evident conclusion: there is no such thing.
tgm1024 is offline  
post #23 of 38 Old 12-13-2013, 08:10 AM - Thread Starter
AVS Special Member
 
Mark Rejhon's Avatar
 
Join Date: Feb 1999
Location: North America
Posts: 8,124
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 2 Post(s)
Liked: 102
Incidentally, I just received a G-SYNC display, and have written my findings about G-SYNC.

I've confirmed that the motion behaves exactly as I have described. You do get the "low framerate feel" of lower frame rates (some call it "regular stutter", others call it "edge strobing", and yet others "stop-motion feel"). However, there are zero random stutters, and zero framerate-transition-caused stutters.

Thanks,
Mark Rejhon


BlurBusters Blog -- Eliminating Motion Blur by 90%+ on LCD for games and computers

Rooting for upcoming low-persistence rolling-scan OLEDs too!

Mark Rejhon is offline  
post #24 of 38 Old 01-02-2014, 09:53 AM
AVS Special Member
 
tgm1024's Avatar
 
Join Date: Dec 2010
Posts: 6,421
Mentioned: 1 Post(s)
Tagged: 0 Thread(s)
Quoted: 430 Post(s)
Liked: 797

Mark, are you aware of any rumors regarding variable refresh rate for CES14?

 

This could really be interesting if it maintains momentum.


Beware the statistical correlations that sound like they're indicative of something. Drowning deaths are tightly correlated to ice cream consumption. In fact, be wary of any statistic that is stated as if it comes with a self-evident conclusion: there is no such thing.
tgm1024 is offline  
post #25 of 38 Old 01-02-2014, 12:31 PM
AVS Special Member
 
tgm1024's Avatar
 
Join Date: Dec 2010
Posts: 6,421
Mentioned: 1 Post(s)
Tagged: 0 Thread(s)
Quoted: 430 Post(s)
Liked: 797
Quote:
Originally Posted by tgm1024 View Post
 

Mark, are you aware of any rumors regarding variable refresh rate for CES14?

 

This could really be interesting if it maintains momentum.

 

Given that within moments of posting this I received a PM from someone busting on me for even asking, I'm assuming not.  LOL!


Beware the statistical correlations that sound like they're indicative of something. Drowning deaths are tightly correlated to ice cream consumption. In fact, be wary of any statistic that is stated as if it comes with a self-evident conclusion: there is no such thing.
tgm1024 is offline  
post #26 of 38 Old 01-07-2014, 03:14 PM - Thread Starter
AVS Special Member
 
Mark Rejhon's Avatar
 
Join Date: Feb 1999
Location: North America
Posts: 8,124
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 2 Post(s)
Liked: 102
I'm curious who sent you a PM. (it wasn't me).

I bet you're joking about me not being aware of it. Ha.
Yes, I am aware of the developments at CES 2014.
I posted about it already on BlurBusters.

I don't post much on here, since I've been focussing on the newly-launched Blur Busters Forum which is taking off rapidly in so short a time period.

So ontopic... Yes, AMD FreeSync is rather interesting!
This could be a huge step towards an open VRR technology that might someday migrate into HDMI.
Meanwhile, Oculus just showed off a low-persistence OLED prototype VR goggles, too!
And a new 2560x1440 G-SYNC monitor got announced, so we finally have VRR and strobing at 1440p.

Thanks,
Mark Rejhon


BlurBusters Blog -- Eliminating Motion Blur by 90%+ on LCD for games and computers

Rooting for upcoming low-persistence rolling-scan OLEDs too!

Mark Rejhon is offline  
post #27 of 38 Old 01-07-2014, 03:48 PM
Advanced Member
 
RLBURNSIDE's Avatar
 
Join Date: Jan 2007
Posts: 797
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 57 Post(s)
Liked: 135
Quote:
Originally Posted by Mark Rejhon View Post

I'm curious who sent you a PM. (it wasn't me).

I bet you're joking about me not being aware of it. Ha.
Yes, I am aware of the developments at CES 2014.
I posted about it already on BlurBusters.

I don't post much on here, since I've been focussing on the newly-launched Blur Busters Forum which is taking off rapidly in so short a time period.

So ontopic... Yes, AMD FreeSync is rather interesting!
This could be a huge step towards an open VRR technology that might someday migrate into HDMI.
Meanwhile, Oculus just showed off a low-persistence OLED prototype VR goggles, too!
And a new 2560x1440p GSYNC monitor got announced, so we finally have VRR and strobing in QFHD.

Very exciting stuff, especially for the two AMD-driven next-gen consoles, which might even have support for variable VBLANK in hardware; if they do, all that remains is figuring out whether it's possible to run Freesync on them. If so, it will not only result in smoother games, but much better visual quality as well, since it won't matter if you don't quite hit that 60 FPS target -- anything between 30 and 60 should seem smooth.

I wonder if a firmware update and an API update on the consoles could be enough to make it work, but that's only if you can detect via EDID whether a display can support VBLANK, because currently the TV resolution is hidden from game developers (I am one). You make the game in 720p or 1080p, and the console TV settings do scaling up or down (or not at all), from there. We'd need a similar thing in drivers for the consoles to detect if your TV can support this mode, since consoles need to "just work". Or perhaps add an extra checkbox, or check the EDID, or just the model name and number from the HDMI signal of the attached TV (yes, I know that doesn't work if you have a receiver in between).

Let's hope AMD and Sony can enable that; then MS will rush to catch up and support it in DX11 (or vice versa). That's why competition is great. I can even see Steamboxes with either G-SYNC or Freesync pushing each other forward (ideally with Freesync winning out, since it's unlikely AMD will ever license a proprietary blanking / signaling tech when the capability is already built into the VESA spec).

So, questions that need answering, once Catalyst drivers enable Freesync to the public :
1) Can HDMI 1.3 / 1.4 or 2.0 all output a suitable signal for variable refresh, with or without an update to the spec or to the firmwares of input ports or output ports? If it's just a matter of altering the signal slightly, it shouldn't require hardware changes in the actual HDMI ports. If HDMI cannot, it will have extremely limited use, although according to articles I read about G-Sync, there's no real reason why this should only work with DisplayPort 1.2 and above and not HDMI.
2) Since most TVs could support VBLANK, at least with a firmware update (according to AMD's CEO, at least), we will need to compile a list of TVs or monitors or projectors that can actually listen to, and correctly interpret, variable refresh timings. That's assuming AMD releases their drivers to the public, which they should, in response to G-Sync. If it's something that some, or many HDMI displays can support, even without a firmware update, it should be only a matter of time before manufacturers update their current, or at least future, TVs to support Freesync. At that point, you'll see many game developers rejoicing because they can increase the quality levels in games so that they don't need to target 60 FPS minimum, they can target between 30 and 60 and it should look very smooth regardless of variance in frame rendering time.

Mark, do you know anything about the HDMI EDID data that can tell us whether a monitor supports variable VBLANK? That would be the first step, to compile a list of those that do. Perhaps someone with those Toshiba laptops from the CES 2014 Freesync demo referred to over at Anandtech can rip out their display's EDID data and we can analyze it. Once we know whether it can be used to distinguish if a display supports variable VBLANK, then it's just a matter of combing the net for all of them, and encouraging manufacturers to update their display firmwares. I personally would jump up and down if I could get BenQ to update my W1070 projector to support Freesync over HDMI; that would be incredible. BenQ has been pretty good about adding new 3D formats, and is one of the companies putting G-SYNC into LCD panels this year or next, so it would seem short-sighted for them (and other manufacturers) to not support both approaches.
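For what it's worth, the basic refresh-range information in a plain 128-byte EDID block is easy to read (a rough sketch only; actual FreeSync/VRR capability reporting is more involved and largely lives in extension blocks, so this alone won't answer the variable-VBLANK question):

```python
# Look for the EDID "Display Range Limits" descriptor (tag 0xFD), which lists
# the panel's supported vertical and horizontal rate ranges.
def find_range_limits(edid: bytes):
    for offset in (54, 72, 90, 108):            # the four 18-byte descriptor slots
        d = edid[offset:offset + 18]
        if d[0:3] == b"\x00\x00\x00" and d[3] == 0xFD:
            v_min, v_max, h_min_khz, h_max_khz = d[5], d[6], d[7], d[8]
            return v_min, v_max, h_min_khz, h_max_khz
    return None

# Hypothetical usage on Linux, where raw EDIDs appear under /sys/class/drm:
#   with open("/sys/class/drm/card0-HDMI-A-1/edid", "rb") as f:
#       print(find_range_limits(f.read(128)))   # e.g. (56, 75, 30, 83)
```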

Even if G-Sync ends up being slightly better than Freesync (one frame less lag, perhaps, depending on whether the third buffer acts as a plain back buffer or adds latency), this is all terrific news for videogames. Hopefully we can figure out these issues. As soon as someone on the net gets their hands on a Radeon Catalyst driver with Freesync enabled, it's off to the races to figure out whether it works on commonplace HDMI TVs and monitors, or even just the rare one here and there. Because once that happens, you can compare firmwares and try to haxx0r it into different models from the same manufacturer. Yeah, it's much better to wait for the manufacturers to do it themselves, but I love H/W haxx0ring like you guys do at Blur Busters, keep up the good work! If I still gamed on puny TVs or monitors I'd use your stuff, but I can't get over the superiority of my 100 inch 3D DLP projector, it kicks ass.

I'm considering trying to get 1400 x 900 working in 120 Hz 3D on my BenQ using some of those tweak programs; that would be a good compromise. Or maybe even a 2.35:1 resolution at 120 Hz, which would be killer for 3D. It's too bad AMD's support for stereo 3D in games is so poor; I was about to buy a Maxwell GPU, but now I'll wait to see how this Freesync news shakes out. Should be an interesting couple of months.
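For what it's worth, a back-of-the-envelope estimate (assuming roughly 25% blanking overhead, 8-bit RGB, and frame-sequential 3D so 120 Hz is the total signal rate; real CVT timings will differ) suggests the HDMI link bandwidth itself isn't the obstacle:

Code:
# Rough feasibility check for 1400x900 @ 120 Hz over an HDMI 1.4 link.
H_ACTIVE, V_ACTIVE, REFRESH = 1400, 900, 120
BLANKING_OVERHEAD = 1.25        # rough guess for total/active pixel ratio

pixel_clock_mhz = H_ACTIVE * V_ACTIVE * REFRESH * BLANKING_OVERHEAD / 1e6
print(f"estimated pixel clock: {pixel_clock_mhz:.0f} MHz")   # ~189 MHz

# HDMI 1.4 "high speed" tops out around a 340 MHz TMDS clock, so on paper the
# link has headroom; whether the projector's scaler accepts such a mode is a
# separate question.
print("fits HDMI 1.4 high-speed link:", pixel_clock_mhz < 340)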
RLBURNSIDE is offline  
post #28 of 38 Old 01-07-2014, 04:29 PM
AVS Special Member
 
tgm1024's Avatar
 
Join Date: Dec 2010
Posts: 6,421
Mentioned: 1 Post(s)
Tagged: 0 Thread(s)
Quoted: 430 Post(s)
Liked: 797
Quote:
Originally Posted by Mark Rejhon View Post

I'm curious who sent you a PM. (it wasn't me).

I bet you're joking about me not being aware of it. Ha.
Yes, I am aware of the developments at CES 2014.

 

No, I wasn't quizzing you about whether you knew of any rumors that I already knew existed. I was asking whether there were any rumors about stuff to watch for. :)

 

I'm glad this tech seems to be gaining at least a little traction.


Beware the statistical correlations that sound like they're indicative of something. Drowning deaths are tightly correlated to ice cream consumption. In fact, be wary of any statistic that is stated as if it comes with a self-evident conclusion: there is no such thing.
tgm1024 is offline  
post #29 of 38 Old 01-07-2014, 04:36 PM
AVS Special Member
 
Chronoptimist's Avatar
 
Join Date: Sep 2009
Posts: 2,587
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 1 Post(s)
Liked: 228
Quote:
Originally Posted by RLBURNSIDE View Post

Even if G-sync ends up being slightly better (1 frame less lag, perhaps, depending on how the third buffer is a backbuffer or adds more latency)
This is my main concern with Freesync.

G-Sync fixes both stuttering/tearing problems, and latency.
It sounds like Freesync is triple-buffered, which means that you have two additional frames of latency compared to G-Sync.

If it becomes a VESA Standard, it's a lot more likely to be adopted by television manufacturers though, so in that regard it's a step forward, as it would apply to more than just PC gaming monitors. (which are tiny, and use poor quality panels)

I think people are being overly optimistic about it being implemented in consoles just because they're using AMD hardware.
Console games rarely use triple buffering to eliminate tearing, and that optimism assumes this is something Sony/Microsoft could implement, or would have an interest in implementing. Sony seems like the likelier candidate since they sell both consoles and displays, but unless it's possible via a firmware update on their 2014 displays, which seems unlikely, it's probably at least a year away.
Chronoptimist is offline  
post #30 of 38 Old 01-07-2014, 07:37 PM
Advanced Member
 
RLBURNSIDE's Avatar
 
Join Date: Jan 2007
Posts: 797
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 57 Post(s)
Liked: 135
Yeah, but I'm very skeptical that G-Sync has less latency than that Toshiba laptop does, so the question is whether it necessarily has less latency than a good Freesync implementation with a discrete graphics card and a non-integrated display.

Don't forget, triple buffering doesn't mean three frames queued back to back; it means one front buffer and two back buffers, with the GPU merely alternating which back buffer it writes to in order to avoid lock stalls (due to v-sync being on). Naive triple buffering would simply add lag without solving tearing, which would be pointless. The entire benefit of triple buffering is that you lock only the buffer the GPU is writing to, so the other one is immediately free to send to the front buffer and down the wire. Then the question becomes: how can NVIDIA use only two front (or back) buffers, one to write to and the other to snapshot down the wire directly, while Freesync can't? I'm skeptical that the CEO knows what he's talking about here, frankly.
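In other words, the swap logic looks roughly like this (a toy sketch to illustrate the scheme described above, not any real driver's code):

Code:
# Toy model: one front buffer being scanned out, two back buffers the GPU
# ping-pongs between, so the newest completed frame is always free to flip
# at the next refresh without the renderer ever stalling on v-sync.
class TripleBuffer:
    def __init__(self):
        self.front = None          # buffer currently being scanned out
        self.back = [None, None]   # two back buffers the GPU alternates between
        self.write_idx = 0         # which back buffer the GPU renders into
        self.latest_done = None    # index of the newest completed back buffer

    def render(self, frame_id):
        # GPU writes into the back buffer that is NOT the latest completed one,
        # so it never has to wait for a flip before starting the next frame.
        self.back[self.write_idx] = frame_id
        self.latest_done = self.write_idx
        self.write_idx ^= 1        # alternate to the other back buffer

    def flip_on_refresh(self):
        # At v-blank, promote the newest completed frame. Older completed but
        # never-shown frames are simply dropped, which is how this scheme
        # avoids queuing up extra frames of latency.
        if self.latest_done is not None:
            self.front = self.back[self.latest_done]
        return self.front

# Render frames 1..3 between two refreshes: 1 and 2 are dropped, 3 is shown.
tb = TripleBuffer()
for f in (1, 2, 3):
    tb.render(f)
print("shown at refresh:", tb.flip_on_refresh())   # -> 3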

In any case, V-SYNC is forced on for next-gen consoles by fiat from on high, and it's not up to the end user to disable it (much to the chagrin of some of my gamer buddies, some of whom prefer more framerate and less lag over being tear-free, while others hate tearing with a passion). I just don't see what's so magical about G-Sync that variable VBLANK couldn't be implemented with exactly the same latency in every respect.

The question about the HDMI spec is whether fixed v-blank intervals are baked into the assumptions of the video signal itself. I don't think so; it's more a question of the HDMI ports, the video card, and the display firmware. If AMD's engineers are as good as we think, with some forethought they'd have shipped the Xbox One and PS4 with the same capability to do Freesync.

I'm not too worried about PCs getting this tech (though it'd certainly be better if both companies used the same Freesync, which I'm sure will end up being the case eventually), and we can simply let the gamer-centric manufacturers support G-Sync. I mean, if it's part of the VESA spec and they can implement it themselves, it's kind of absurd not to do so.
RLBURNSIDE is offline  