HDMI 3.0 -- Adopt Variable Refresh Rates aka G-SYNC? (Good for video, too!) - Page 2 - AVS Forum
Old 01-07-2014, 09:37 PM - Thread Starter
AVS Special Member
 
Mark Rejhon
 
Join Date: Feb 1999
Location: North America
Posts: 8,125
Quote:
Originally Posted by RLBURNSIDE

Very exciting stuff, especially for the two AMD-driven next gen consoles, which might even have hardware support for variable VBLANK, and if they do, all that remains is figuring out whether it's possible to run Freesync on them. If they do, it will not only result in smoother games, but much better visual quality as well, since it won't matter if you don't quite hit that 60 FPS target; anything between 30 and 60 should seem smooth.

I wonder if a firmware update and an API update on the consoles could be enough to make it work, but only if you can detect via EDID whether a display can support variable VBLANK, because currently the TV resolution is hidden from game developers (I am one). You make the game in 720p or 1080p, and from there the console's TV settings do the scaling up or down (or not at all). We'd need a similar thing in the console drivers to detect whether your TV can support this mode, since consoles need to "just work". Or perhaps add an extra checkbox, or check the EDID, or just the model name and number from the HDMI signal of the attached TV (yes, I know that doesn't work if you have a receiver in between).

Let's hope AMD and Sony can enable that; then MS will rush to catch up and support it in DX11 (or vice versa). That's why competition is great. I can even see Steamboxes with either GSYNC or Freesync pushing each other forward (ideally with Freesync winning out, since it's unlikely AMD will ever license a proprietary blanking / signaling tech when the capability is already built into the VESA spec).

So, questions that need answering once Catalyst drivers enable Freesync for the public:
1) Can HDMI 1.3 / 1.4 or 2.0 all output a suitable signal for variable refresh, with or without an update to the spec or to the firmware of the input and output ports? If it's just a matter of altering the signal slightly, it shouldn't require hardware changes in the actual HDMI ports. If HDMI cannot, this will have extremely limited use, although according to articles I read about G-Sync, there's no real reason why this should only work with DisplayPort 1.2 and above and not HDMI.
2) Since most TVs could support variable VBLANK, at least with a firmware update (according to AMD's CEO, at least), we will need to compile a list of TVs, monitors, and projectors that can actually listen to, and correctly interpret, variable refresh timings. That's assuming AMD releases their drivers to the public, which they should, in response to G-Sync. If it's something that some or many HDMI displays can support, even without a firmware update, it should be only a matter of time before manufacturers update their current, or at least future, TVs to support Freesync. At that point you'll see many game developers rejoicing, because they can increase the quality levels in games: they no longer need to target a 60 FPS minimum, they can target between 30 and 60 and it should look very smooth regardless of variance in frame rendering time.

Mark, do you know anything about the HDMI EDID data that can tell us whether a monitor supports variable VBLANK? That would be the first step: compiling a list of those that do. Perhaps someone with one of those Toshiba laptops from the CES 2014 Freesync demo referred to over at Anandtech can rip out their display's EDID data for us to analyze. Once we know whether it can be used to distinguish whether a display supports variable VBLANK, it's just a matter of combing the net for all of them and encouraging manufacturers to update their display firmwares. I personally would jump up and down if I could get BenQ to update my W1070 projector to support Freesync over HDMI; that would be incredible. BenQ has been pretty good about adding new 3D formats, and is one of the companies putting G-SYNC into LCD panels this year or next, so it would seem short-sighted for them (and other manufacturers) not to support both approaches.

Even if G-sync ends up being slightly better than Freesync (one frame less lag, perhaps, depending on whether the third buffer is a backbuffer or adds more latency), this is all terrific news for videogames. Hopefully we can all figure out these issues. As soon as someone on the net gets their hands on a Radeon Catalyst driver with Freesync enabled, it's off to the races to figure out whether it works on commonplace HDMI TVs or monitors, or even on the rare one here and there. Because once that happens, you can compare firmwares and try to haxx0r it into different models from the same manufacturer. Yeah, it's much better to wait for the manufacturers to do it themselves, but I love H/W haxx0ring like you guys do at Blur Busters, keep up the good work! If I still gamed on puny TVs or monitors, I'd use your stuff, but I can't get over the superiority of my 100 inch 3D DLP projector, it kicks ass.

I'm considering trying to get 1400 x 900 working in 120Hz 3D on my BenQ using some of those tweak programs; that would be a good compromise. Or maybe even a 2.35:1 resolution at 120Hz, which would be killer for 3D. It's too bad AMD sucks at supporting stereo 3D in games; I was about to buy a Maxwell GPU, but now I'll have to wait and see how this Freesync news shakes out. Should be an interesting couple of months.
With your permission, may I crosspost your message in the GSYNC subforum on Blur Busters Forums?
There are some technical/engineer types visiting now who might be able to elaborate more on GSYNC versus FreeSync.

I am not sure what the latency differences are, but just a few days ago I ran high-speed 1000fps camera tests on my GSYNC monitor with a modified mouse (LED indicator hardwired to the mouse button) for button-to-pixels latency measurements. These high-speed camera tests are for my as-yet-unpublished GSYNC Article Part #2.
I've found virtually negligible differences between VSYNC OFF and GSYNC (~1-2ms) at sub-144fps framerates. The monitor's scanout is unbuffered and real-time, so the first pixels start transitioning at the top edge of the screen within 2-3 milliseconds of Direct3D Present() in a very simple blank framebuffer-output test. That said, we have to keep in mind the complicated whole-chain input latency (often approaching 100ms), as seen in AnandTech's Exploring Input Lag Inside and Out.

....versus....

An older game such as Counter-Strike: GO, running at hundreds of frames per second with VSYNC OFF, can have less than 20ms button-to-photons latency (actual high speed camera test), while Battlefield 4, running at low framerates and low physics tickrates (10 ticks per second), can have well over 60ms button-to-photons latency (even on a GeForce Titan, VSYNC OFF, outputting to a 144Hz monitor).
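
To make the arithmetic concrete: a minimal sketch (hypothetical frame numbers and function name) of how the latency falls out of the footage, once you've located the camera frame where the hardwired LED lights and the frame where the first pixels change:

Code:
# At 1000 fps, each camera frame is exactly 1 ms of real time, so latency
# is just the frame-index difference between two events in the footage.
CAMERA_FPS = 1000
FRAME_MS = 1000 / CAMERA_FPS   # 1.0 ms per camera frame

def button_to_photons_ms(led_on_frame: int, first_pixel_frame: int) -> float:
    """Latency from button press (LED lights) to first pixel transition."""
    return (first_pixel_frame - led_on_frame) * FRAME_MS

# Hypothetical example: LED at camera frame 5120, pixels change at frame 5138
print(button_to_photons_ms(5120, 5138))   # -> 18.0 ms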
RLBURNSIDE and Chronoptimist like this.

Thanks,
Mark Rejhon

www.BlurBusters.com

BlurBusters Blog -- Eliminating Motion Blur by 90%+ on LCD for games and computers

Rooting for upcoming low-persistence rolling-scan OLEDs too!

Old 01-07-2014, 09:42 PM - Thread Starter
AVS Special Member
 
Mark Rejhon
 
Join Date: Feb 1999
Location: North America
Posts: 8,125
Quote:
Originally Posted by RLBURNSIDE

1) Can HDMI 1.3 / 1.4 or 2.0 all output a suitable signal for variable refresh, with or without an update to the spec or to the firmware of the input and output ports? If it's just a matter of altering the signal slightly, it shouldn't require hardware changes in the actual HDMI ports. If HDMI cannot, this will have extremely limited use, although according to articles I read about G-Sync, there's no real reason why this should only work with DisplayPort 1.2 and above and not HDMI.
Oh -- I should add:

Intrinsically, I don't think it requires much modification: just dynamically add/remove scanlines in the blanking intervals, to pad the time until the next refresh. You could actually do this with PowerStrip in the old days, slowly incrementing/decrementing the values on the fly without any image disruption. So VRR can actually be achieved this way in a pretty simple manner; the trick is modifying the firmware on both ends to handle a dynamically resizeable VBLANK. It would be a very simple change to the video signal.

The catch is that you need support along the whole chain: the video source (graphics drivers, graphics hardware, HDMI transceiver) and the display (HDMI transceiver, display drivers, TCON, etc.) all have to cooperate for it to work properly. Small refresh rate variations (+/- 1Hz) have long been tolerated by displays, ever since old VCRs, which often varied slightly in their signal rate, forced displays to adapt. Today, we're asking for gargantuan variability in refresh rate (including blanking intervals bigger than the actual picture), putting that variability to a different useful purpose.
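
To make the scanline-padding arithmetic concrete, here's a small illustrative sketch using standard 1080p60 signal timings (148.5 MHz pixel clock, 2200 pixels per line, 1125 total lines). It computes how many total scanlines a refresh would need so that the next refresh lands exactly when the frame is done; a sketch of the idea, not anyone's actual firmware:

Code:
# Each scanline takes H_TOTAL / PIXEL_CLOCK seconds to transmit, so stretching
# the blanking interval by N lines delays the next refresh by N line-times.
PIXEL_CLOCK_HZ = 148_500_000   # standard 1080p60 pixel clock
H_TOTAL = 2200                 # pixels per scanline, incl. horizontal blanking
V_TOTAL_60HZ = 1125            # scanlines per refresh at 60 Hz, incl. blanking

LINE_TIME_S = H_TOTAL / PIXEL_CLOCK_HZ    # ~14.8 microseconds per line

def padded_vertical_total(frame_render_time_s: float) -> int:
    """Total scanlines (active + blanking) so one refresh spans the render time."""
    total_lines = round(frame_render_time_s / LINE_TIME_S)
    return max(total_lines, V_TOTAL_60HZ)  # never refresh faster than the 60 Hz base

# A frame that took 22 ms (~45 fps) instead of 16.7 ms:
print(padded_vertical_total(0.022))   # -> 1485 lines, i.e. ~360 extra blanking lines

The signal format itself never changes; only the number of idle lines between pictures does, which is why this looks like a small change at the signal level.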

Thanks,
Mark Rejhon

www.BlurBusters.com

BlurBusters Blog -- Eliminating Motion Blur by 90%+ on LCD for games and computers

Rooting for upcoming low-persistence rolling-scan OLEDs too!

Old 01-08-2014, 05:39 AM
AVS Special Member
 
Chronoptimist
 
Join Date: Sep 2009
Posts: 2,644
Quote:
Originally Posted by RLBURNSIDE

Yeah, but I'm very skeptical that G-Sync has less latency than that Toshiba laptop, so the question is whether it necessarily has less latency than a good Freesync implementation with a discrete graphics card and a non-integrated display.
G-Sync has 1ms latency on top of the time it takes to render a frame.
Quote:
Originally Posted by RLBURNSIDE

Don't forget, triple buffering doesn't mean three buffers back to back; it means one front buffer and two back buffers, with the GPU merely alternating which backbuffer it writes to in order to avoid lock stalls (due to v-sync being on).
That's true: real triple-buffering should not add another frame of latency. However, most implementations today do.
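
A toy model of that "real" triple buffering (hypothetical names, Python purely for illustration): the GPU ping-pongs between two back buffers without ever blocking, and each vsync shows the newest completed frame, dropping stale ones instead of queueing them:

Code:
from dataclasses import dataclass

@dataclass
class Buffer:
    name: str
    frame_id: int = -1   # which rendered frame this buffer currently holds

front = Buffer("front")                      # buffer being scanned out
back = [Buffer("backA"), Buffer("backB")]    # GPU ping-pongs between these
draw_target = 0                              # which back buffer the GPU writes next
latest_complete = None                       # most recently finished back buffer

def gpu_render(frame_id: int) -> None:
    """GPU finishes a frame into the current back buffer, then swaps targets."""
    global draw_target, latest_complete
    back[draw_target].frame_id = frame_id
    latest_complete = back[draw_target]
    draw_target = 1 - draw_target            # ping-pong; never waits on display

def vsync_flip() -> None:
    """At vsync, show the newest completed frame; stale frames are skipped."""
    if latest_complete is not None:
        front.frame_id = latest_complete.frame_id

# GPU renders frames 1..3 between two vsyncs; only frame 3 is displayed.
for f in (1, 2, 3):
    gpu_render(f)
vsync_flip()
print(front.frame_id)   # -> 3 (frames 1 and 2 dropped; no queued latency)

The queued variant most games ship instead holds every rendered frame until it is displayed, which is where the extra frame of latency comes from.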
Quote:
Originally Posted by RLBURNSIDE

Then the question becomes: how can NVidia use only two front (or back) buffers, one to write to and the other to snapshot down the wire directly, when Freesync can't?
Because Nvidia is using hardware that polls every 1ms looking for updates. As soon as a frame is rendered, it's sent to the display. G-Sync replaces V-Sync; you're not even double-buffering the video.
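
Roughly this behaviour, sketched with hypothetical frame_ready / send_to_panel callbacks standing in for the real hardware; the point is that the only latency the module itself adds is the polling interval:

Code:
import time
from collections import deque

# Toy frame source: pretend three frames finish at irregular times.
completed = deque([{"id": 1}, {"id": 2}, {"id": 3}])

def frame_ready():
    """Hypothetical check: returns a finished frame, or None."""
    return completed.popleft() if completed else None

def send_to_panel(frame):
    """Hypothetical scanout: with G-Sync-style VRR this starts a refresh now."""
    print("scanout of frame", frame["id"])

def poll_for_frames(poll_interval_s=0.001):
    """Poll ~every 1 ms; a ready frame is scanned out immediately, so the
    module adds at most ~1 ms on top of the frame's render time."""
    while completed:
        frame = frame_ready()
        if frame is not None:
            send_to_panel(frame)
        time.sleep(poll_interval_s)

poll_for_frames()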
Old 01-09-2014, 01:42 PM
Advanced Member
 
RLBURNSIDE
 
Join Date: Jan 2007
Posts: 933
Mark, post away! I would LOVE for this to work on next gen consoles; we need the entire industry to push forward on this. Game devs such as myself are massively hindered by the 60Hz-or-die restriction. We optimize as much as we can, but then if a frame takes 17ms instead of 15, it not only stutters but results in 15ms of extra lag (one frame behind). And then you get frame lag pileups, and everybody notices this no matter how good your GPU is, even on the best PC hardware, not just on consoles. Freesync would be a game changer for next gen if it can be made to work, and hopefully Steamboxes and the rivalry between MS and Sony can push them to allow AMD to update their graphics drivers, and even their own HDMI firmware if necessary.
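
The deadline math here is easy to show in a few lines (illustrative, assuming a fixed 60 Hz vsync versus an idealized VRR display):

Code:
import math

# With fixed 60 Hz vsync, a frame that misses the 16.7 ms budget waits for
# the *next* vblank; with VRR it is displayed the moment rendering completes.
REFRESH_S = 1 / 60   # 16.67 ms budget per refresh at 60 Hz

def display_time_fixed_vsync(render_s: float) -> float:
    """Frame appears at the first vblank after rendering completes."""
    return math.ceil(render_s / REFRESH_S) * REFRESH_S

def display_time_vrr(render_s: float) -> float:
    """With variable refresh, the frame appears as soon as it's ready."""
    return render_s

for render_ms in (15, 17):
    fixed_ms = display_time_fixed_vsync(render_ms / 1000) * 1000
    vrr_ms = display_time_vrr(render_ms / 1000) * 1000
    print(f"{render_ms} ms frame -> vsync: {fixed_ms:.1f} ms, VRR: {vrr_ms:.1f} ms")
# 15 ms frame -> vsync: 16.7 ms, VRR: 15.0 ms
# 17 ms frame -> vsync: 33.3 ms, VRR: 17.0 ms   (a whole extra refresh of lag)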

On the user end, getting Freesync to work on commercially available displays and TVs, and compiling a list of those, could lend weight to the movement for smoother graphics on consoles. I'm sure it'll happen on PCs if we push it; then the question is, since AMD GPUs are in both NG consoles, why not enable it there too, if it's technically feasible? One thing's for sure: if Sony releases it on PS4, MS will do so on Xb1, and vice versa. Especially Xb1, which is behind in terms of raw GPU power and would have a much harder time maintaining 60fps at 1080p (probably impossible) while using some form of FXAA and quality levels similar to PS4 builds.
Old 01-09-2014, 01:48 PM
Advanced Member
 
RLBURNSIDE
 
Join Date: Jan 2007
Posts: 933
Quote:
Originally Posted by Chronoptimist

Quote:
Originally Posted by RLBURNSIDE

Then the question becomes: how can NVidia use only two front (or back) buffers, one to write to and the other to snapshot down the wire directly, when Freesync can't?
Because Nvidia is using hardware that polls every 1ms looking for updates. As soon as a frame is rendered, it's sent to the display. G-Sync replaces V-Sync; you're not even double-buffering the video.

But that still doesn't explain why AMD's solution must be one frame behind NVidia's. Why would they add extra waiting time for nothing? I mean, their Freesync tech has been supported in laptop hardware for three generations already. If the frame is ready, send it as soon as you finish drawing it. It doesn't make much sense to me that they couldn't or wouldn't have done the same thing as NVidia. I guess it's a matter of hardware implementation; most AMD chips did it as a power-saving measure, with performance as a secondary concern.

If current AMD GPUs (and the next-gen consoles) add one frame of extra lag, that's still better than what we have now, and future AMD PC cards could probably be refocused to reach G-Sync levels of lag. Maybe G-Sync has less latency due to less display-side buffering, rather than GPU-side?
Old 01-09-2014, 02:01 PM
Advanced Member
 
RLBURNSIDE
 
Join Date: Jan 2007
Posts: 933
Quote:
Originally Posted by Mark Rejhon

Quote:
Originally Posted by RLBURNSIDE

1) Can HDMI 1.3 / 1.4 or 2.0 all output a suitable signal for variable refresh, with or without an update to the spec or to the firmware of the input and output ports? If it's just a matter of altering the signal slightly, it shouldn't require hardware changes in the actual HDMI ports. If HDMI cannot, this will have extremely limited use, although according to articles I read about G-Sync, there's no real reason why this should only work with DisplayPort 1.2 and above and not HDMI.
Oh -- I should add:

Intrinsically, I don't think it requires much modification: just dynamically add/remove scanlines in the blanking intervals, to pad the time until the next refresh. You could actually do this with PowerStrip in the old days, slowly incrementing/decrementing the values on the fly without any image disruption. So VRR can actually be achieved this way in a pretty simple manner; the trick is modifying the firmware on both ends to handle a dynamically resizeable VBLANK. It would be a very simple change to the video signal.

The catch is that you need support along the whole chain: the video source (graphics drivers, graphics hardware, HDMI transceiver) and the display (HDMI transceiver, display drivers, TCON, etc.) all have to cooperate for it to work properly. Small refresh rate variations (+/- 1Hz) have long been tolerated by displays, ever since old VCRs, which often varied slightly in their signal rate, forced displays to adapt. Today, we're asking for gargantuan variability in refresh rate (including blanking intervals bigger than the actual picture), putting that variability to a different useful purpose.

Thanks for the response, very interesting! If there are even a few displays that support it, gamers with AMD chips might gravitate towards them, and display manufacturers would have a financial incentive to support not only G-Sync but VRR / Freesync as well; otherwise they'd be addressing only half their potential market (or whatever the AMD / Nvidia split is these days). It would be great if at least the video-signal side, at the NG console end, supported Freesync; then it's just a matter of gamers demanding with their buying dollars a "Super Gaming Mode" that in fact supports VRR over HDMI 1.4, if your source also supports it. That's why I was wondering about VBLANK capabilities being passed via EDID; that would be a requirement even current displays would have to meet, so that the console or PC could detect whether the display supports VRR before outputting such a signal (see the sketch below). Without EDID, I think it's DOA for NG consoles, unless it's a special mode that you can checkbox ON in the console video settings. Many of my colleagues are also skeptical that VRR / Freesync can ever be supported unless there is a fully automatic way of guaranteeing that your console can detect the display's ability to support and properly interpret such a signal.
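
On the EDID question: the base EDID block does carry a refresh range, even though it has no explicit VRR flag. Here's a hedged sketch (standard EDID 1.3/1.4 layout; the file path is hypothetical) that pulls the advertised vertical refresh range out of the Display Range Limits descriptor. A wide reported range is at best a hint about VRR tolerance, which is exactly the detection gap described above:

Code:
def vertical_range_from_edid(edid: bytes):
    """Return (min_hz, max_hz) from the Range Limits descriptor, or None."""
    # The base EDID block has four 18-byte descriptors at offsets 54/72/90/108.
    for offset in (54, 72, 90, 108):
        d = edid[offset:offset + 18]
        # Display descriptors begin 00 00 00 <tag>; tag 0xFD = range limits,
        # whose bytes 5 and 6 hold the min/max vertical rate in Hz.
        if len(d) == 18 and d[0:3] == b"\x00\x00\x00" and d[3] == 0xFD:
            return d[5], d[6]
    return None

# Usage (hypothetical path; EDIDs are exposed by the OS or i2c dump tools):
# with open("/sys/class/drm/card0-HDMI-A-1/edid", "rb") as f:
#     print(vertical_range_from_edid(f.read()))   # e.g. (24, 75)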

I'd personally be fine if AMD allowed it via a checkbox, but it does have ramifications in game engine design and graphics decision-making too. Ideally in a few years most TVs would have this "Super game mode" and we could all live in gaming bliss!

Consoles can do 1080p, just not at 60Hz, and certainly not at a consistent minimum of 60Hz. PCs can't consistently do that either, unless the game is extremely simple (or old).
Old 05-13-2014, 01:11 AM - Thread Starter
AVS Special Member
 
Mark Rejhon
 
Join Date: Feb 1999
Location: North America
Posts: 8,125
Big news for eventual variable refresh rate standardization!

VESA has adopted variable refresh rate in DisplayPort standard.
http://www.blurbusters.com/vesa-standardizes-variable-refresh-rate-similiar-to-gsync/

If VESA can do it, why not HDMI too?

Thanks,
Mark Rejhon

www.BlurBusters.com

BlurBusters Blog -- Eliminating Motion Blur by 90%+ on LCD for games and computers

Rooting for upcoming low-persistence rolling-scan OLEDs too!

Old 05-13-2014, 07:28 AM
AVS Special Member
 
tgm1024
 
Join Date: Dec 2010
Posts: 6,861
Quote:
Originally Posted by Mark Rejhon

Big news for eventual variable refresh rate standardization!

VESA has adopted variable refresh rate in DisplayPort standard.
http://www.blurbusters.com/vesa-standardizes-variable-refresh-rate-similiar-to-gsync/

If VESA can do it, why not HDMI too?

 

I've mostly given up on HDMI ever catching up. It reeks of political machinery with far too much inertia slowing forward improvements.


Java developers: when I saw what has been placed into Java 8, I was immediately reminded of how I've spent so much of my life trying to protect engineers from themselves. Lambda expressions are a horrible idea. Gentlemen: the goal isn't to make code readable for a competent mid-level engineer. The goal is to make code readable for a competent mid-level engineer who is exhausted and hopped up on caffeine at 3 am. What a disaster Java 8 is!
 
