Nvidia G-Sync - AVS Forum

post #1 of 58 - 10-18-2013, 12:16 PM - Thread Starter
LexInVA (AVS Special Member)
Yes, it's Nvidia's solution to the V-Sync problem. Basically, it's a memory buffer and a controller inside the monitor that let the display's refresh sync tightly to the GPU's frame rate, instead of the GPU being forced to match a fixed refresh. It only works over DisplayPort.

http://anandtech.com/show/7436/nvidias-gsync-attempting-to-revolutionize-gaming-via-smoothness

post #2 of 58 - 10-18-2013, 02:39 PM
Yrd (AVS Special Member)
Slightly shaky cam video demo here.

http://www.engadget.com/2013/10/18/nvidia-g-sync/#continued

Looks pretty cool. Glad I haven't updated my monitor yet. Looks like a new monitor for me next year then.

post #3 of 58 - 10-18-2013, 03:00 PM
bd2003 (AVS Addicted Member)
Quote:
Originally Posted by Yrd View Post

Slightly shaky cam video demo here.

http://www.engadget.com/2013/10/18/nvidia-g-sync/#continued

Looks pretty cool. Glad I haven't updated my monitor yet. Looks like a new monitor for me next year then.

 

Fortunately I have that ASUS monitor you can mod....I'm totally getting the kit as soon as I can.  This looks awesome.


post #4 of 58 - 10-18-2013, 03:13 PM - Thread Starter
LexInVA (AVS Special Member)
I will definitely consider buying a monitor with this once the bugs are worked out and it comes in larger sizes. If Asus integrated this tech into their 39-inch 4K monitor, I would certainly pay a premium for it, but I've gotta have it in at least a 27-inch monitor.

post #5 of 58 - 10-18-2013, 03:22 PM
bd2003 (AVS Addicted Member)

The only thing I'm worried about is that I'm going to get way too used to it, and the chance that this comes to TVs anytime in the next decade is about zero.


post #6 of 58 - 10-18-2013, 03:23 PM
DaveFi (AVS Addicted Member)
I hope Asus puts this into their upcoming 4K 39" display; I see no reason why they wouldn't. I already have a 27" Dell, so I'm not going to upgrade to a different display with G-Sync unless it's a 4K display at this point.

post #7 of 58 - 10-18-2013, 03:30 PM
Yrd (AVS Special Member)
I hope they don't focus only on the gaming panels that do 120Hz. This thing is really a gamer's product, so I see it as a definite possibility that that's exactly what will happen, at least at the start.

I'd like to get one with higher res than 1080p.

post #8 of 58 - 10-18-2013, 04:23 PM
bd2003 (AVS Addicted Member)

Actually, come to think of it, there's a chance this might eventually make its way into a projector. I'm sure it's a profound improvement....the stuttering in games drives me crazy, and to entirely avoid it in most games you have to back off the settings a ton.


post #9 of 58 - 10-18-2013, 10:00 PM
durack (Senior Member)
I would hold your horses on 4K

From the Montreal event:

http://www.anandtech.com/show/7437/tim-sweeney-johan-andersson-john-carmack-nv-montreal-live-blog

Quote:
02:46PM EDT - Sweeney: when you get 20 TFLOPS in a single GPU, that's when you want to go 4K

02:46PM EDT - Sweeney: I see 4K as a workstation application for the next few years, we'll use those monitors for building games, but I don't think it's the right output resolution for rendering until you get significantly faster GPUs for rendering

02:46PM EDT - NV: it's a good thing that it doesn't have to be either or...

02:45PM EDT - Carmack: for a conventional desktop monitor, I want to see this absolutely ubiquitous

02:45PM EDT - Everyone answers G-Sync

02:45PM EDT - 4K vs. G-Sync what is more important to gaming?


20 TFLOPS - we are talking about Quad SLI Titans.

Now, Tim Sweeney obviously knows what he is talking about, although you could probably pull off decent framerates on a 4K monitor with GTX 780 in SLI. In other words, forget about gaming at 4K for a few years unless you want to blow some serious cash.
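
[Editor's aside: rough arithmetic behind those numbers, using approximate figures that are assumptions, not from the post - a GTX Titan is on the order of 4.5 TFLOPS FP32, and 4K pushes four times the pixels of 1080p.]

```python
# Back-of-envelope: why "20 TFLOPS" lands around quad-SLI Titan territory,
# and why 4K is ~4x the pixel load of 1080p. The Titan FP32 figure is approximate.
titan_tflops = 4.5               # GTX Titan, roughly (assumed figure)
print(4 * titan_tflops)          # ~18 TFLOPS for four of them

pixels_1080p = 1920 * 1080       # 2,073,600
pixels_4k    = 3840 * 2160       # 8,294,400
print(pixels_4k / pixels_1080p)  # 4.0x the pixels to shade per frame
```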

post #10 of 58 - 10-19-2013, 02:08 AM
Mark Rejhon (AVS Special Member)
Crosspost, because people are asking about combining LightBoost with G-Sync:
Quote:
Originally Posted by Pocatello
Mark,
What is best right now for FPS gaming? Lightboost or G-sync?
Depends.
-- If you get variable framerates: use G-Sync.
-- If you get a constant 120fps@120Hz: use LightBoost (a blur-eliminating strobe backlight).

G-Sync is still limited by 144Hz motion blur; it would take G-Sync at 400fps@400Hz to achieve flicker-free LightBoost/CRT motion clarity (a 2.5ms sample-and-hold length), based on existing motion blur math directly correlating strobe length with motion blur. Frame durations would have to match today's LightBoost strobe lengths: at 2.5ms, frames last only 1/400th of a second, which requires 400fps@400Hz with nVidia G-Sync. If you want the clarity of LightBoost=10%, you need about 700fps@700Hz (1.4ms frame duration) and upwards. That's not feasible, so we still need strobing, at least until we have a 1000Hz LCD (to get flicker-free CRT motion clarity). It's surprising how human eyes still see indirect display-induced motion blur (the sample-and-hold effect) even at 240Hz, 480Hz, and beyond; human eyes can tell apart LightBoost=10% (~1 to 1.4ms frame duration) from LightBoost=100% (~2.5ms frame duration) during motion tests such as www.testufo.com/photo#photo=toronto-map.png&pps=1440. Display motion blur is directly proportional to visible frame duration.
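
[Editor's aside: a worked example of that last sentence. The tracking speed and persistence values are illustrative, not from Mark's post; on a sample-and-hold display, perceived blur width is roughly eye-tracking speed multiplied by how long each frame stays visible.]

```python
# Rough sample-and-hold blur estimate:
#   blur width (px) ~= eye-tracking speed (px/s) * visible frame persistence (s)
speed_px_per_s = 1000  # e.g. a fast pan you track with your eyes (illustrative)

for label, persistence_ms in [("60 Hz sample-and-hold", 16.7),
                              ("144 Hz sample-and-hold", 6.9),
                              ("2.5 ms strobe (LightBoost=100%)", 2.5),
                              ("1.4 ms strobe (LightBoost=10%)", 1.4)]:
    blur_px = speed_px_per_s * persistence_ms / 1000.0
    print(f"{label:32s} -> ~{blur_px:4.1f} px of smear")

# And why flicker-free 2.5 ms persistence implies ~400 Hz:
print(1 / 0.0025, "Hz needed if every frame is visible for only 2.5 ms")
```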

LightBoost (a blur-eliminating strobe backlight) is great at triple-digit framerates. However, we already know LightBoost becomes terrible at lower framerates, turning very stuttery below triple digits. G-Sync stays smooth during varying framerates; LightBoost does not. The solution is to combine LightBoost AND G-Sync. This solves the problem, but new problems occur with variable-rate strobing. Fortunately, I've come up with a solution.

I've found a way to combine the two. John Carmack did say it was possible in his twitch.tv video, but it didn't appear that anyone had come up with the idea of blending PWM-free operation with LightBoost strobing, an enhancement that I have just come up with:

I have quickly invented a new idea of combining PWM-free with LightBoost, while having G-Sync:
New Section Added to "Electronics Hacking: Creating a Strobe Backlight"

To the best of my knowledge, no patents exist on this, and not even John Carmack appears to have mentioned this in his twitch.tv video when he mentioned combining LightBoost with G-Sync. So I'm declaring it as my idea of a further improvement to nVidia G-Sync:
Quote:
From: New Section in "Electronics Hacking: Creating a Strobe Backlight"

With nVidia’s G-Sync announcement, variable refresh rate displays are now a reality today. Refresh rates can now dynamically vary with frame rates, and it is highly likely that nVidia has several patents on this already. If you are a monitor manufacturer, contact nVidia to license this technology, as they deserve kudos for this step towards tomorrow’s perfect Holodeck display.

However, one additional idea that Mark Rejhon of Blur Busters has come up with is a new, creative PWM-free-to-strobing dynamic backlight curve manipulation algorithm that allows variable-rate backlight strobing without creating flicker at lower frame rates.

It is obvious to a scientist/engineer/vision researcher that to maintain constant perceived brightness during variable-rate strobing, you must keep the strobing duty cycle percentage constant as the strobe rate varies. This requires careful and precise strobe-length control during variable refresh, as the display now refreshes dynamically on demand rather than at discrete scheduled intervals. However, a problem occurs at lower framerates: strobing at low refresh rates causes uncomfortable flicker.
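
[Editor's aside: a quick illustration of the constant-duty-cycle point; the 25% duty figure is just an example, not a value from the article.]

```python
# Constant perceived brightness under variable-rate strobing:
# keep the duty cycle fixed, so the pulse width scales with the refresh period.
duty = 0.25  # example duty cycle (25%), purely illustrative

for hz in (60, 85, 100, 120, 144):
    period_ms = 1000.0 / hz
    pulse_ms = duty * period_ms   # strobe length needed at this refresh rate
    print(f"{hz:3d} Hz: period {period_ms:5.2f} ms -> strobe {pulse_ms:4.2f} ms")
```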

Mark Rejhon has invented a solution: dynamic shaping of the strobe curve, from PWM-free mode at low framerates all the way to square-wave strobing at high framerates. The monitor backlight runs in PWM-free mode at low refresh rates (e.g. 30fps@30Hz, 45fps@45Hz), gradually shifts to soft gaussian/sine-wave undulations in backlight brightness (bright-dim-bright-dim) at 60fps@60Hz, and the curves become sharper (fullbright-off-fullbright-off) as you head toward 120fps@120Hz. At the monitor's maximum framerate, the strobing more closely resembles a square wave with large totally-black gaps between strobes.

Example:
10fps@10Hz — PWM-free backlight
30fps@30Hz — PWM-free backlight
45fps@45Hz — PWM-free backlight
60fps@60Hz — Minor backlight brightness undulations (bright / dim / bright / dim)
80fps@80Hz — Sharper backlight brightness undulations (very bright / very dim)
100fps@100Hz — Starts to resemble rounded-square-wave (fullbright / fulloff)
120fps@120Hz and up — Nearly square-wave strobing like original LightBoost

This would be a dynamically variable continuum all the way in between too, much like an automobile CVT instead of discrete gears in a transmission. You avoid flicker at lower frame rates, and you get the full strobing benefits at higher frame rates.

Simpler algorithm variations are also possible (e.g. keeping a square wave and using only pulse-width / pulse-height manipulation to achieve the blending effect, without curve-softening). This is included as part of my general idea of blending from PWM-free at lower refresh rates to strobing at higher refresh rates. The trigger framerates may be different from the example above (or may even be adjustable via a user flicker-threshold setting), but the concept is the same.
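
[Editor's aside: a sketch of the blending idea described above. The blend thresholds, duty cycle and waveform math are illustrative assumptions, not Mark's specification, and a real implementation would also rescale amplitude to keep average brightness constant, as described earlier.]

```python
import math

def backlight_level(phase, refresh_hz, low_hz=45.0, high_hz=120.0, duty=0.25):
    """Backlight brightness (0..1) at a point in the refresh cycle (phase in 0..1).

    Below low_hz: flat, PWM-free output. Above high_hz: LightBoost-style square strobe.
    In between: a deepening bright/dim undulation that cross-fades into the square wave.
    All thresholds and the 25% duty cycle are illustrative choices.
    """
    blend = min(max((refresh_hz - low_hz) / (high_hz - low_hz), 0.0), 1.0)
    soft = 1.0 - blend * 0.5 * (1 - math.cos(2 * math.pi * phase))  # flat -> undulating
    square = 1.0 if phase < duty else 0.0                           # hard strobe pulse
    return (1 - blend) * soft + blend * square                      # cross-fade between them

# Peek at one refresh cycle (10 samples) at a few refresh rates:
for hz in (30, 60, 90, 120):
    print(hz, [round(backlight_level(i / 10, hz), 2) for i in range(10)])
```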

If nVidia or any monitor manufacturer uses this idea (and no patent application dated before October 19, 2013 covers it), please give Mark Rejhon / Blur Busters appropriate credit. I realize nVidia has several patents, but none of them appears to cover this additional improvement to combining strobing with variable refresh rates. As of this writing, research is being done on prior art to determine whether anyone has considered dynamically blending from PWM-free operation to square-wave strobing. If anyone else already came up with this idea and documented it in a patent application prior to October 19, 2013, please let me know and due credit will be given here.

(For those living under a rock: LightBoost is a strobe backlight that was designed to eliminate crosstalk for 3D, but it also eliminates motion blur in a CRT-style fashion in 2D, which is now its more popular use. Just google "lightboost". See the "It's like a CRT" testimonials, the LightBoost media coverage (AnandTech, ArsTechnica, TFTCentral, etc), the improved Battlefield 3 scores from LightBoost, the photos of 60Hz vs 120Hz vs LightBoost, the science behind strobe backlights, and the LightBoost instructions for existing LightBoost-compatible 120Hz monitors. It is truly an amazing technology that allows an LCD to have less motion blur than plasma/CRT. John Carmack uses a LightBoost monitor, and Valve Software has talked about strobing solutions too. Now you're no longer living under a rock!)

Thanks,
Mark Rejhon


post #11 of 58 - 10-19-2013, 08:45 AM
MSmith83 (AVS Special Member)
This is fantastic. Now if only this would be incorporated in quality big screen 1080p TVs and not just smallish monitors.

post #12 of 58 - 10-19-2013, 09:35 AM
bd2003 (AVS Addicted Member)
Quote:
Originally Posted by MSmith83 View Post

This is fantastic. Now if only this would be incorporated in quality big screen 1080p TVs and not just smallish monitors.

I just don't see it. All film and video content is fixed frame rate and has no use for this. They'd need to convince HDTV manufacturers to install an expensive module that's relevant to only a fraction of a fraction of a percent of their users.

Even if the consoles supported it I think it'd be unlikely, but since they're all AMD...just give up the dream.

post #13 of 58 - 10-19-2013, 01:24 PM
Yrd (AVS Special Member)
Well the biggest issue is that it requires an Nvidia GPU to work. When they can make this work for any GPU, it will be more widespread.

I'm just hoping this will be implemented on a high res IPS panel.

post #14 of 58 - 10-19-2013, 03:46 PM
bd2003 (AVS Addicted Member)
Quote:
Originally Posted by Yrd View Post

Well the biggest issue is that it requires an Nvidia GPU to work. When they can make this work for any GPU, it will be more widespread.

I'm just hoping this will be implemented on a high res IPS panel.

 

Perhaps, and there's a good chance nvidia will license the tech if they can retrofit it into the consoles. If they can make that happen, then HDTVs featuring it at least become plausible. This is arguably a WAY bigger deal for consoles anyway. Right now, if a game can handle 60 most of the time but frequently drops to 40, they have to lock it at 30fps if they want v-sync. You'd be surprised how many console games are like that. With g-sync, you'd be able to maintain a much higher level of performance without having to sacrifice the visuals. If the consoles were in, I'd immediately run out and buy a TV that featured it.
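
[Editor's aside: an illustration of that 60-but-drops-to-40 scenario. The frame times are made up, and the model ignores scanout and the GPU stalling on a full back buffer.]

```python
# On a fixed 60 Hz display with v-sync, a frame that misses the 16.7 ms deadline
# is held until the next refresh, so "mostly 60, sometimes 40" plays back as an
# uneven mix of 16.7 ms and 33.3 ms frames -- hence devs locking to 30 instead.
# With variable refresh, each frame is simply shown when it is done.
refresh_ms = 1000 / 60
frame_times_ms = [15, 16, 24, 25, 15, 16, 26, 15]   # hypothetical render times

vsync_display = [refresh_ms * -(-t // refresh_ms) for t in frame_times_ms]  # round up to a refresh
gsync_display = frame_times_ms                                              # shown as-is

print("v-sync frame-to-frame (ms):", [round(t, 1) for t in vsync_display])
print("g-sync frame-to-frame (ms):", [round(t, 1) for t in gsync_display])
```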

 

G-Sync alone is going to look vastly superior to the three bad options we have now: no vsync (tearing + stuttering), double-buffered vsync (more input lag + drops to half frame rate) or triple-buffered vsync (more input lag + stutters). I think what this really needs to seal the deal, though, is a game that's aware of g-sync and has variable-strength motion blur. The amount of motion blur should be tied to the current frame rate - the lower the frame rate, the more motion blur should be applied, since there's a longer time period between frames. Done right, drops in performance would be experienced as little more than an increase in motion blur, but it should still appear smooth to the eye, even at much lower frame rates than 60. You probably wouldn't even notice a momentary drop to 40 unless you were looking for it. G-sync alone will be great, but a low frame rate is still a low frame rate. Just the right amount of in-game motion blur is going to make a constantly shifting frame rate much more palatable.
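
[Editor's aside: a sketch of that frame-rate-aware motion blur idea. The "shutter fraction" knob and the speeds are assumptions for illustration, not something from the post or from any shipping engine.]

```python
# Frame-rate-adaptive motion blur: scale the blur (a camera-style "shutter")
# with the time the frame will actually be on screen, so slow frames smear a
# little more and fast frames stay crisp.
def blur_length_px(object_speed_px_per_s, frame_time_ms, shutter_fraction=0.5):
    exposure_s = shutter_fraction * frame_time_ms / 1000.0   # blur over half the frame time
    return object_speed_px_per_s * exposure_s

for fps in (120, 60, 40, 30):
    frame_ms = 1000.0 / fps
    print(f"{fps:3d} fps ({frame_ms:4.1f} ms/frame): blur ~{blur_length_px(800, frame_ms):.1f} px")
```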

 

Call me crazy, but I think this is the biggest deal in PC gaming since the pixel shader.


post #15 of 58 - 10-19-2013, 04:11 PM
joeblow (AVS Addicted Member)
Eh, I'm not so sure. The hoops you have to jump through to take advantage of something only a niche audience within a niche audience would greatly appreciate are a pretty strong barrier:

- Only works with nVidia GPUs
- Worthless to HTPC gamers
- Doesn't work with the vast majority of existing monitors
- costs around $200 on top of the cost of a new monitor that you might need
- Elite power gamers (i.e., the ones with the most $$$ to burn) who worship the highest frame rates are better off with LightBoost

That's a ton of hurdles to expect the masses in PC gaming to leap over, in order to get what? Stutter-free gaming without tearing and no input lag? One can address each of these problems with currently available options.

I literally experienced and described all three of these problems during my first-time setup of Skyrim a few weeks ago in this post, which happened prior to the G-Sync announcement. Using different methods, I eliminated stuttering and tearing, and GREATLY reduced input lag.

Sure, G-Sync looks to do all of that effortlessly with no loss in frame rates and with zero input lag. I get that. But the fact that a quality gaming experience can be had without perfection in each of these areas takes some of the appeal of the device away IMHO. If trying alternative methods means not jumping through all of the hoops listed above (while saving the money to go towards a better GPU perhaps?), I personally will go that route until the case is made crystal clear that this is must-have tech.

Of all of the cool new stuff coming into the PC space in the next twelve months or so, I'd say Oculus Rift is the one that excites me the most. It at least advances gameplay in new directions, unlike G-Sync (or SteamBox, Mantle, etc.).

post #16 of 58 - 10-19-2013, 04:49 PM
bd2003 (AVS Addicted Member)
Quote:
Originally Posted by joeblow View Post

Eh, I'm not so sure. The hoops you have to jump through to take advantage of something only a niche audience within a niche audience would greatly appreciate are a pretty strong barrier:

- Only works with nVidia GPUs
- Worthless to HTPC gamers
- Doesn't work with the vast majority of existing monitors
- costs around $200 on top of the cost of a new monitor that you might need
- Elite power gamers (i.e., the ones with the most $$$ to burn) who worship the highest frame rates are better off with LightBoost

That's a ton of hurdles to expect the masses in PC gaming to leap over, in order to get what? Stutter-free gaming without tearing and no input lag? One can address each of these problems with currently available options.

I literally experienced and described all three of these problems during my first-time setup of Skyrim a few weeks ago in this post, which happened prior to the G-Sync announcement. Using different methods, I eliminated stuttering and tearing, and GREATLY reduced input lag.

Sure, G-Sync looks to do all of that effortlessly with no loss in frame rates and with zero input lag. I get that. But the fact that a quality gaming experience can be had without perfection in each of these areas takes some of the appeal of the device away IMHO. If trying alternative methods means not jumping through all of the hoops listed above (while saving the money to go towards a better GPU perhaps?), I personally will go that route until the case is made crystal clear that this is must-have tech.

Of all of the cool new stuff coming into the PC space in the next twelve months or so, I'd say Oculus Rift is the one that excites me the most. It at least advances gameplay in new directions, unlike G-Sync (or SteamBox, Mantle, etc.).

 

Whether or not people adopt it en masse is hard to say; like you point out, there are a lot of hurdles to jump. But the theory is perfectly sound, and arguably nvidia has more to gain by licensing it than keeping it proprietary...it's going to be such a profound difference that anyone who's experienced it will not be able to go back. It should appear to the game the same as no v-sync, so at least there are no compatibility hurdles to jump. Nvidia doesn't need to convince devs to support it like PhysX; they just need to convince gamers to buy in. And it's really got nothing to do with lightboost, which solves a completely different problem....they're totally unrelated. If they can get lightboost/strobing to work properly with g-sync, it'll help cut out pixel persistence just the same. The elite gamers are going to be the ones all over this, because now they'll actually be able to see those frame rates instead of losing them to v-sync.

 

Sure, you've got plenty of options to mitigate stuttering and lag, but there's ALWAYS a compromise right now. Any form of v-sync by definition has to have a frame more of lag than g-sync or v-sync off. V-sync off gets you good lag, but it also gets you tearing on every frame, and your graphics card is pounding away on useless pixels that never get displayed. You can get it stutter-free by v-syncing to 60, but then you have to leave a lot of performance (and by extension, image quality) on the table, and now you've got that extra frame of lag. And v-sync on or off, double or triple buffering, if a single frame takes longer than 16.6ms to render, you get a stutter, period. You can fiddle all day to find just the right settings, or just click a button to enable g-sync and enjoy the game at high settings without having to deal with any of that or make any compromises. All that fiddling is way more hoops to jump through than just buying the right monitor and an nvidia card. I hate to say it, but they just trumped AMD and mantle SO HARD. Mantle boosts performance, but with g-sync you can get away with a much less powerful video card and still get a superior image. Going forward there's no way I'd ever buy a monitor/video card that didn't feature this...I'd take g-sync in a heartbeat over whatever performance boost is to be had from mantle.
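
[Editor's aside: a back-of-envelope on the lag point, using a simplified model that ignores scanout, pixel response and engine-side buffering.]

```python
# Extra display-side wait added on top of render time, in a simplified model:
#   v-sync off             -> none (scanned out immediately, with tearing)
#   double-buffered v-sync -> a finished frame waits for the next 60 Hz tick (0..16.7 ms)
#   g-sync                 -> the refresh starts when the frame arrives, so ~no wait
import random
random.seed(0)

refresh_ms = 1000 / 60
finish_points = [random.uniform(0, refresh_ms) for _ in range(100000)]  # where in the cycle frames finish
avg_wait = sum(refresh_ms - p for p in finish_points) / len(finish_points)
print(f"double-buffered v-sync: ~{avg_wait:.1f} ms extra on average, up to {refresh_ms:.1f} ms worst case")
print("v-sync off / g-sync: ~0 ms extra in this model")
```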

 

I'm all for OR too, but this tech is arguably a necessity to make VR work optimally. I have zero doubt that variable refresh rate will be ubiquitous for "gaming" monitors before too long...this is that big of a deal. As long as they can get the circuitry small enough, it should benefit mobile devices (including OR) too.  Really the biggest open question is how long it takes nvidia to license it out, and whether or not HDTVs and consoles can/will adopt it. 


post #17 of 58 - 10-19-2013, 07:00 PM
DaveFi (AVS Addicted Member)
Quote:
Originally Posted by durack View Post

I would hold your horses on 4K

From the Montreal event:

http://www.anandtech.com/show/7437/tim-sweeney-johan-andersson-john-carmack-nv-montreal-live-blog
20 TFLOPS - we are talking about Quad SLI Titans.

Now, Tim Sweeney obviously knows what he is talking about, although you could probably pull off decent framerates on a 4K monitor with GTX 780 in SLI. In other words, forget about gaming at 4K for a few years unless you want to blow some serious cash.

GTX 780 SI coming out next month. Asus is coming out with a 39" 4k/60Hz display that should be <$1500 early 2014. It's doable.

post #18 of 58 - 10-19-2013, 11:01 PM
joeblow (AVS Addicted Member)
I didn't expand on the LightBoost comment because it was explained above. Affluent users with uber systems can't use nVidia's tech if their rig is too powerful, but that introduces the problems LightBoost addresses. So they are better off continuing to go beyond G-Sync's capabilities and using LightBoost instead for those different issues.

As for the stutter / tearing / input-lag situation, as I said above, stutter and tearing are eliminated, while mouse-look lag is almost undetectable for me after employing the techniques I described in that post. I repeat: there is no stuttering. Believe me, I was not going to play one quest in the game until I fixed that problem. Limiting the frame rate with a free third-party program to sync the GPU with my HDTV instantly solved it. That introduced tearing, which V-Sync completely eliminated. V-Sync in turn introduced crazy mouse lag, which triple buffering severely reduced.

Like many gamers who would be interested in G-Sync, I am sensitive to all of these problems. So the fact that they have been solved for the most part in my case means that I am less interested in a solution that costs money and takes me off of my comfy couch to play on a smaller screen.

I admit my method loses some frames (even though my careful mod installation choices still have me at 60 fps in Skyrim), and the mouse lag still isn't absolute zero. But after 42 hours so far the game experience remains highly playable with that method, so as of now the new nVidia tech isn't wowing me. It's more of a "that'd be nice to have" kinda thing.

post #19 of 58 - 10-19-2013, 11:37 PM
durack (Senior Member)
Quote:
Originally Posted by DaveFi View Post

GTX 780 SI coming out next month. Asus is coming out with a 39" 4k/60Hz display that should be <$1500 early 2014. It's doable.

You mean the GTX 780 Ti, which will most likely be a rebranded Titan with less VRAM and some other tweaks. Anand has some benches on his site; if I remember correctly it took him three Titans to run Metro 2033 at 4K and hold a guaranteed minimum of 40 fps. So if you plan to play games at native 4K on that Asus monitor, you'll need to spend some serious money.

Personally, I don't see myself buying a 4K display for at least 3-4 years - not until there is a single GPU card that can run games with acceptable framerates at that resolution (not a fan of multiple GPU setups).

post #20 of 58 - 10-20-2013, 06:42 AM
bd2003 (AVS Addicted Member)
Quote:
Originally Posted by joeblow View Post

I didn't expand on the LightBoost comment because it was explained above. Affluent users with uber systems can't use nVidia's tech if their rig is too powerful, but that introduces the problems LightBoost addresses. So they are better off continuing to go beyond G-Sync's capabilities and using LightBoost instead for those different issues.

As for the stutter / tearing / input-lag situation, as I said above, stutter and tearing are eliminated, while mouse-look lag is almost undetectable for me after employing the techniques I described in that post. I repeat: there is no stuttering. Believe me, I was not going to play one quest in the game until I fixed that problem. Limiting the frame rate with a free third-party program to sync the GPU with my HDTV instantly solved it. That introduced tearing, which V-Sync completely eliminated. V-Sync in turn introduced crazy mouse lag, which triple buffering severely reduced.

Like many gamers who would be interested in G-Sync, I am sensitive to all of these problems. So the fact that they have been solved for the most part in my case means that I am less interested in a solution that costs money and takes me off of my comfy couch to play on a smaller screen.

I admit my method loses some frames (even though my careful mod installation choices still have me at 60 fps in Skyrim), and the mouse lag still isn't absolute zero. But after 42 hours so far the game experience remains highly playable with that method, so as of now the new nVidia tech isn't wowing me. It's more of a "that'd be nice to have" kinda thing.

 

Why can't affluent users use g-sync? Stuttering is a problem even at high frame rates....on a 120Hz monitor, it actually looks a lot worse to have the frame rate hover from 100-120 instead of being locked to 60. It sounds crazy, you'd think more frames would look better no matter what, but the juddering/stuttering really throws off the fluidity of the image.

 

I'm not dismissing that you're getting a satisfactory experience with Skyrim; it's not like all gaming has been horrible up until this point. Although something must be broken in that game if limiting it to 59fps isn't causing some minor stuttering, since you should be dropping a frame every second. If you're satisfied with the way it is now, the main thing g-sync would do for you is free you up from being so careful about which mods you install, because 60fps is no longer some sort of magic number. You could install many more mods that would drop you from 60 to 45 on a constant basis, and you'd barely notice it.

 

V-sync only causes one frame of lag; it's not the end of the world, especially in a game like Skyrim. If you turned v-sync off, though, and measured it, I guarantee you there'd be even less lag. On the other hand, playing a competitive game like CS:GO, that single frame of lag is a big deal. V-sync at 120fps still has a frame of lag, but each frame is half as long as at 60fps...and that 8ms difference is instantly noticeable when you mouselook.

 

I'll admit the thing that bothers me most about this is that right now it's just limited to desktop monitors. I don't want to get too used to not having to make these compromises at my desk, but still having to tweak and fiddle in my theater.      


post #21 of 58 - 10-20-2013, 09:44 AM
joeblow (AVS Addicted Member)
John Carmack himself confirms G-Sync doesn't work with LightBoost. You have to choose one or the other. Could that change in the future? And if so, would that require a G-Sync 2.0 purchase? I dunno.

Choosing to limit my frame rate to 59 fps (as opposed to 60 fps) is what was recommended in several places to keep the GPU and display in sync - it works. Movement in all directions (especially strafing facing walls, which was what made stuttering very evident) is now silky smooth. I spent plenty of time loading and reloading the intro segment of Skyrim solely to test for stutter in various ways as I tried different methods until the problem was completely fixed.

post #22 of 58 - 10-20-2013, 11:00 AM
bd2003 (AVS Addicted Member)
Quote:
Originally Posted by joeblow View Post

John Carmack himself confirms G-Sync doesn't work with LightBoost. You have to choose one or the other. Could that change in the future? And if so, would that require a G-Sync 2.0 purchase? I dunno.

Choosing to limit my frame rate to 59 fps (as opposed to 60 fps) is what was recommended in several places to keep the GPU and display in sync - it works. Movement in all directions (especially strafing facing walls, which was what made stuttering very evident) is now silky smooth. I spent plenty of time loading and reloading the intro segment of Skyrim solely to test for stutter in various ways as I tried different methods until the problem was completely fixed.

 

I suspect it would; in order for the brightness to appear constant, they'd have to also dynamically vary the brightness/timing of the pulse...which is probably much easier said than done.

 

It's an easy choice for me, at least. Lightboost in its current incarnation is a total mess. Getting it to work in 2D mode essentially involves tricking your display into thinking it's in 3D mode. While that works, it also degrades the 2D image quality in a major way. And it only works properly at 100/120fps right now; at lower frame rates you get image doubling, which may or may not be worse depending on your perspective. I tried gaming in 2D with lightboost for a while and the downsides outweighed the upside. Maybe when it works properly in 2D at 60Hz without any driver hacks, it'll be ready for the masses.


post #23 of 58 - 10-20-2013, 12:17 PM
DaverJ (AVS Special Member)
Quote:
Originally Posted by joeblow View Post

Choosing to limit my frame rate to 59 fps (as opposed to 60 fps) is what was recommended in several places to keep the GPU and display in sync - it works. Movement in all directions (especially strafing facing walls, which was what made stuttering very evident) is now silky smooth. I spent plenty of time loading and reloading the intro segment of Skyrim solely to test for stutter in various ways as I tried different methods until the problem was completely fixed.

So Joe, let me see if I understand this. You are saying that, based on your testing, when playing on a TV, it's best to set the resolution at 1080p 59hz for v-sync to be smooth/non-stuttering.

Is that correct?

post #24 of 58 - 10-20-2013, 01:17 PM
joeblow (AVS Addicted Member)
With some research, I learned that the stutter I was experiencing could be addressed.

My HDTV displays at 60Hz. Before I added any mods to Skyrim, my GPU was rendering the visuals at a much higher rate than 60 fps, which of course my set can't display. The disparity between the GPU and the display had to be forced into sync. There are a few ways to do it apparently, but the trial version of Dxtory (which allows unlimited use of the frame limiter) worked perfectly for me once I set it up. As a bonus, like Fraps, it can display the frame rate of what you are doing if you wish.

Limiting the frame rate for my Radeon 6970 introduced screen tearing, so I turned off Skyrim's supposed VSync setting in the SkyrimPrefs.ini file (iPresentInterval=0) and enabled VSync in RadeonPro with a Skyrim profile I created - that took two minutes. Tearing completely cleared up, but the mouse lag VSync introduced was ridiculous. So I turned off mouse acceleration in the .ini file (bMouseAcceleration=0) and turned on Triple Buffering in RadeonPro. All is now well.
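
[Editor's aside: for reference, the two .ini tweaks mentioned above look roughly like this. The section names are the commonly documented ones and worth double-checking against your own SkyrimPrefs.ini/Skyrim.ini; the ~59fps cap and triple buffering live in Dxtory and RadeonPro, not in this file.]

```ini
; Skyrim .ini tweaks described in the post (section placement assumed from common guides)

[Display]
; disable the game's built-in v-sync so the RadeonPro profile controls it
iPresentInterval=0

[Controls]
; remove mouse acceleration / smoothing
bMouseAcceleration=0
```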

post #25 of 58 - 10-20-2013, 01:43 PM
bd2003 (AVS Addicted Member)
Quote:
Originally Posted by DaverJ View Post


So Joe, let me see if I understand this. You are saying that, based on your testing, when playing on a TV, it's best to set the resolution at 1080p 59hz for v-sync to be smooth/non-stuttering.

Is that correct?

 

I'm pretty sure that 59Hz option is technically 59.94Hz, which IIRC is the standard frame rate for video. I'm not sure whether monitors/TVs accept a true 60.00Hz signal, convert it to 59.94, etc...but either way it shouldn't be the cause of any stuttering, at least not more than a hiccup every few hundred frames.
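
[Editor's aside: the exact video rate is 60000/1001, so a 60.00 vs 59.94 mismatch only costs a frame roughly every thousand frames.]

```python
# NTSC-derived "59 Hz" video rate, and how often a 60.00 vs 59.94 mismatch bites:
rate = 60000 / 1001
print(rate)              # 59.9400599... Hz
print(60 / (60 - rate))  # ~1001 frames between dropped/repeated frames (~17 s)
```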

 

For whatever reason, Bethesda games are really screwy when it comes to v-sync. It's really tough to get them to output a fully stable 60fps; even when Fraps reads 60 I can still see minor hitches every now and then.

post #26 of 58 - 10-21-2013, 09:55 PM
bd2003 (AVS Addicted Member)

http://www.pcper.com/news/Graphics-Cards/PCPer-Live-NVIDIA-G-Sync-Discussion-Tom-Petersen-QA

 

Pretty epic demo and Q&A session about g-sync.  Obviously they're unable to fully demonstrate what g-sync looks like, since you're going to be watching it on a fixed refresh rate monitor.  But they do go into great detail and demonstrate the issues with v-sync.

 

I haven't finished watching it entirely, but there are some good nuggets in there. From what he said, the g-sync module will have three modes: g-sync, 3D Vision, and low persistence. I'm pretty sure the 3D Vision mode is still normal v-sync, since I don't believe the glasses can vary their timing (but I could be wrong). The low persistence mode sounds exactly like a lightboost toggle though, so it'll be nice to enable that without any hacks...but that mode doesn't utilize g-sync.

 

Hopefully they get those upgrade kits out soon; I'm gonna try to grab one as soon as it's available.

post #27 of 58 - 10-22-2013, 08:34 AM
joeblow (AVS Addicted Member)
Here is a very interesting article about the new tech which highlights a few more advantages it could potentially give to the gaming experience. It could be cooler than I thought, and I definitely want to see it in action.

post #28 of 58 - 10-22-2013, 09:17 AM
bd2003 (AVS Addicted Member)
Quote:
Originally Posted by joeblow View Post

Here is a very interesting article about the new tech which highlights a few more advantages it could potentially give to the gaming experience. It could be cooler than I thought, and I definitely want to see it in action.

Heh, I was just about to post that before I refreshed the thread.

I think it's going to take a while for it to sink in for most people how much of a revolution this is in computer graphics. We've been making major compromises due to v-sync for so long that it's hard to comprehend what a world without it would be like. Poor nvidia, they keep developing really great tech that's impossible to demonstrate properly over the internet (TXAA, 3D Vision, now G-Sync).

It's such a killer app for nvidia and the PC in general that it's almost got me reconsidering purchasing any non-exclusive console games.

post #29 of 58 - 10-22-2013, 09:35 AM
MSmith83 (AVS Special Member)
This is definitely a big deal and is the kind of solution I have always wanted, as I've had to settle for the tradeoffs of triple-buffered vsync in the latest games that tax the GPU.

I won't be gaming on a small monitor to take advantage of G-Sync, but it's a huge start that opens up all sorts of possibilities.

Oh, and something tells me that the crazy Jen-Hsun Huang wanted to name this tech N-Sync until the legal team strongly advised him not to.

post #30 of 58 - 10-22-2013, 10:07 AM
DaverJ (AVS Special Member)
The more I hear about this, the more intrigued I am. I'm also not sure about giving up gaming on my comfy chair with a controller for sitting at a desk with mouse+keyboard, but this G-sync thing has me considering it. And with multi-monitor capability, it could be a pretty sweet setup!

I'm hoping someone announces a 27" G-sync monitor soon!