Nvidia G-Sync - Page 2 - AVS Forum | Home Theater Discussions And Reviews
Old 10-22-2013, 10:12 AM
Yrd
AVS Special Member
Join Date: Mar 2008
Location: Elkton, MD
Posts: 3,207
Yeah, 27" at 1440p and I'm sold. I've been holding off on upgrading my Dell 2407 for quite a while.

XBL Gamertag- Yrd
PSN - Yerd

Steam - Yrd

Yrd is offline  
Old 10-22-2013, 11:40 AM
Mark Rejhon
AVS Special Member
Join Date: Feb 1999
Location: North America
Posts: 8,127
G-SYNC includes a LightBoost sequel that's superior to LightBoost. When AndyBNV of nVidia was asked on NeoGAF whether LightBoost could be combined with G-SYNC, he confirmed:
Quote:
Originally Posted by AndyBNV 
“We have a superior, low-persistence mode that should outperform that unofficial [LightBoost] implementation, and importantly, it will be available on every G-SYNC monitor. Details will be available at a later date.”

This confirms strobing is used, as a matter of vision physics: there is no other way to achieve a LightBoost-matching low-persistence (~1ms) mode without ultra-high refresh rates (e.g. 1000fps@1000Hz) or frame interpolation (e.g. 200fps->1000fps). Since both are unlikely with nVidia G-SYNC, this effectively confirms backlight strobing is used to keep visible frame display times short (aka persistence). In addition, John Carmack confirmed on Twitter that a better backlight strobe driver is included:
Quote:
Originally Posted by John Carmack (@ID_AA_Carmack) on Twitter 
“@GuerillaDawg the didn’t talk about it, but this includes an improved lightboost driver, but it is currently a choice — gsync or flashed.”

Both statements, from Andy and John, confirm that official backlight strobing (a LightBoost successor) is part of G-SYNC: 2D motion blur elimination, finally officially sanctioned by nVidia.

In short, all G-SYNC monitors include a LightBoost sequel:
-- G-SYNC mode: Better for variable framerates (fewer stutters, but more motion blur)
-- Strobe mode: Better at max framerates, i.e. 120fps@120Hz (zero motion blur)

It can be better than LightBoost because LightBoost was hard to enable, had poor color quality, and they can make the new mode run at any refresh rate. G-SYNC monitors have an 85Hz refresh rate available, and John Carmack reportedly said it flickers a bit more at 85Hz than at 120Hz, which suggests we've finally got 85Hz LightBoost-style strobing available: a good balance between GPU power and flicker.
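To illustrate the vision-physics point above, here is a rough back-of-the-envelope sketch (my own illustrative numbers, not nVidia's specs): a sample-and-hold panel holds each frame for the full refresh period, so reaching ~1ms of persistence without strobing would take roughly a 1000Hz refresh rate, while a strobed backlight gets there at 120Hz simply by lighting each frame for about 1ms.

Code:
# Back-of-the-envelope persistence math (illustrative only).
# Persistence = how long each frame stays lit and visible to the eye.

def sample_and_hold_persistence_ms(refresh_hz):
    # A sample-and-hold display shows each frame for the whole refresh period.
    return 1000.0 / refresh_hz

def strobed_persistence_ms(flash_ms):
    # A strobed backlight only shows each frame while the backlight flashes.
    return flash_ms

for hz in (60, 120, 144, 1000):
    print(f"{hz:>4} Hz sample-and-hold: {sample_and_hold_persistence_ms(hz):.2f} ms persistence")

# A 120 Hz strobed backlight with a ~1 ms flash reaches the same ~1 ms persistence:
print(f" 120 Hz strobed, 1 ms flash: {strobed_persistence_ms(1.0):.2f} ms persistence")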

Thanks,
Mark Rejhon

www.BlurBusters.com

BlurBusters Blog -- Eliminating Motion Blur by 90%+ on LCD for games and computers

Rooting for upcoming low-persistence rolling-scan OLEDs too!

Mark Rejhon is offline  
Old 10-22-2013, 01:16 PM
bd2003
AVS Addicted Member
Join Date: Jun 2004
Location: Long Island, NY
Posts: 11,332
Quote:
Originally Posted by Mark Rejhon View Post

G-SYNC includes a LightBoost sequel that's superior to LightBoost. [...]

In short, all G-SYNC monitors include a LightBoost sequel:
-- G-SYNC mode: Better for variable framerates (fewer stutters, but more motion blur)
-- Strobe mode: Better at max framerates, i.e. 120fps@120Hz (zero motion blur)

It would be awesome if we finally had LightBoost with a simple in-monitor menu setting, although I don't think I'd ever choose it over G-Sync unless it was an old game that never dropped a frame. I'm hoping they can make it work at 60Hz though; 85Hz can be problematic for some games.

Steam/PSN/Xbox Live: Darius510
bd2003 is offline  
Old 10-22-2013, 03:41 PM
Mark Rejhon
AVS Special Member
Join Date: Feb 1999
Location: North America
Posts: 8,127
Quote:
Originally Posted by bd2003 View Post

It would be awesome if we finally had LightBoost with a simple in-monitor menu setting, although I don't think I'd ever choose it over G-Sync unless it was an old game that never dropped a frame. I'm hoping they can make it work at 60Hz though; 85Hz can be problematic for some games.
I've been able to glean enough information to confirm G-SYNC strobing appears to be available for 85Hz and 144Hz.

Thanks,
Mark Rejhon

www.BlurBusters.com

BlurBusters Blog -- Eliminating Motion Blur by 90%+ on LCD for games and computers

Rooting for upcoming low-persistence rolling-scan OLEDs too!

Mark Rejhon is offline  
Old 10-22-2013, 03:42 PM
bd2003
AVS Addicted Member
Join Date: Jun 2004
Location: Long Island, NY
Posts: 11,332
Quote:
Originally Posted by Mark Rejhon View Post

I've been able to glean enough information to confirm G-SYNC strobing appears to be available for 85Hz and 144Hz.

I'm assuming in addition to 100 and 120? Any chance of black frame insertion being used?

Steam/PSN/Xbox Live: Darius510
bd2003 is offline  
Old 10-22-2013, 06:31 PM
Hawkwing
Advanced Member
Join Date: Jan 2004
Posts: 580
So what's the skinny on the displays? Who is gonna make them and what sizes and resolutions will be available? Is there a projected date of availability?

I won't even pretend I understand how this works, but how well will this technology work if your PC gives you crappy frame rates? Is there gonna be a minimum system requirement on the PC to go along with whatever resolution monitor you would get??
Hawkwing is offline  
Old 10-22-2013, 09:28 PM
Mark Rejhon
AVS Special Member
Join Date: Feb 1999
Location: North America
Posts: 8,127
The good news is for people who want "LightBoost"-style strobing at other refresh rates, whether to reduce GPU requirements (85fps @ 85Hz) or to reduce input lag (144fps @ 144Hz). G-SYNC's optional sequel to LightBoost (a fixed-rate strobe mode) appears to support strobing at 85Hz and at 144Hz (at least), in addition to the existing LightBoost rates (100Hz and 120Hz). Here are the clues:

- The G-SYNC upgrade datasheet has 85Hz added.
- AndyBNV suggested on NeoGAF that the low-persistence mode is superior to LightBoost.
- The YouTube video of John Carmack at the G-SYNC launch was very suggestive.
- Many articles mention 85Hz as the CRT refresh rate at which flicker stops bothering most people.
- The pcper.com livestream suggests a very high fixed refresh rate in low-persistence mode.

Upon analysis, both 85Hz and 144Hz appear to be available strobed modes with G-SYNC, in addition to 100Hz and 120Hz. 60Hz has too much flicker, so that mode isn't confirmable, and anything beyond 144Hz exceeds the ASUS VG248QE's bandwidth. Other strobed modes might also be available.
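To make the "GPU power versus flicker/lag" trade-off concrete, here is a tiny illustrative calculation using the refresh rates discussed above (nothing here comes from nVidia):

Code:
# Frame-time budget and strobe period at the candidate strobe rates above.
# Lower rates give the GPU more time per frame; higher rates flicker less and lag less.

for hz in (60, 85, 100, 120, 144):
    frame_budget_ms = 1000.0 / hz  # time to render each frame, and the gap between strobes
    print(f"{hz:>3} Hz: {frame_budget_ms:5.2f} ms per frame")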

You heard it first from me. :)

Thanks,
Mark Rejhon

www.BlurBusters.com

BlurBusters Blog -- Eliminating Motion Blur by 90%+ on LCD for games and computers

Rooting for upcoming low-persistence rolling-scan OLEDs too!

Mark Rejhon is offline  
Old 10-22-2013, 09:40 PM
Mark Rejhon
AVS Special Member
Join Date: Feb 1999
Location: North America
Posts: 8,127
Quote:
Originally Posted by bd2003 View Post

I'm assuming in addition to 100 and 120?
Yes.
Quote:
Originally Posted by bd2003 View Post

Any chance of black frame insertion being used?
Your question reads as a non sequitur to me -- can you rephrase?
For persistence purposes, strobing is mathematically identical to black frame insertion.
Turning OFF a backlight/edgelight creates black frame insertion.

50% blackness of any kind + 50% image flash = 50% motion blur reduction.
75% blackness of any kind + 25% image flash = 75% motion blur reduction.
90% blackness of any kind + 10% image flash = 90% motion blur reduction.

Some more background information:
-- Persistence (static pixel state, sample-and-hold) is not GtG (pixel transitions)
-- Most motion blur on modern LCDs is caused by persistence (pixel staticness) rather than by transitions (pixel GtG).
-- Persistence is equal to backlight strobe length, one strobe per refresh, in full-panel strobe backlights.
-- See www.testufo.com/eyetracking for an example of motion blur caused by persistence, not GtG
-- Motion blur is directly proportional to persistence (especially squarewave persistence such as strobed backlight, strobed OLED, BFI on DLP)
-- Simple persistence math: 1ms of persistence = 1 pixel of motion blur during 1000 pixels/second (assumes squarewave persistence)
-- Almost all 60Hz LCDs have 1/60 sec (16.7ms) of persistence, so they all create 16.7ms of motion blur.
-- Common black frame insertion is 50%:50% (e.g. 60fps at 120Hz with black frames inserted); this reduces persistence from 16.7ms to 8.3ms.
-- Strobe backlights are a more efficient way of creating blackness than black frame insertion; much shorter flashes are easier to achieve (see the sketch below).
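Here is a minimal sketch of the duty-cycle arithmetic above, using the rule of thumb that 1ms of persistence equals 1 pixel of blur at 1000 pixels/second (the duty-cycle values are my own illustrative picks):

Code:
# Motion blur from persistence, per the rule of thumb above:
# blur (pixels) = persistence (ms) x eye-tracking speed (pixels per ms).

def persistence_ms(refresh_hz, visible_fraction):
    # visible_fraction: 1.0 = sample-and-hold, 0.5 = 50% black frames, ~0.12 = short strobe
    return (1000.0 / refresh_hz) * visible_fraction

def motion_blur_px(persistence, speed_px_per_sec):
    return persistence * speed_px_per_sec / 1000.0

speed = 1000.0  # pixels per second of eye-tracked motion
for label, hz, frac in [("60Hz sample-and-hold", 60, 1.0),
                        ("120Hz + 50% black frames", 120, 0.5),
                        ("120Hz strobed, ~1ms flash", 120, 0.12)]:
    p = persistence_ms(hz, frac)
    print(f"{label:>26}: {p:5.2f} ms persistence -> {motion_blur_px(p, speed):5.2f} px of blur")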

Thanks,
Mark Rejhon

www.BlurBusters.com

BlurBusters Blog -- Eliminating Motion Blur by 90%+ on LCD for games and computers

Rooting for upcoming low-persistence rolling-scan OLEDs too!

Mark Rejhon is offline  
Old 10-23-2013, 03:42 AM
bd2003
AVS Addicted Member
Join Date: Jun 2004
Location: Long Island, NY
Posts: 11,332
What I mean to say is: if the reason they're not exposing a 60Hz low-persistence mode is that it's too far outside the 100-120Hz range that LightBoost strobing was originally designed for, there might be an alternative way to implement that mode.

Even though that might flicker a bit like 60Hz CRTs in the good ol' days, and it's a relatively low refresh rate by current standards... it's still the refresh rate the vast majority of games were designed for, and the easiest to v-sync.

Steam/PSN/Xbox Live: Darius510
bd2003 is offline  
Old 10-23-2013, 06:00 AM
Mark Rejhon
AVS Special Member
Join Date: Feb 1999
Location: North America
Posts: 8,127
Quote:
Originally Posted by bd2003 View Post

What I mean to say is: if the reason they're not exposing a 60Hz low-persistence mode is that it's too far outside the 100-120Hz range that LightBoost strobing was originally designed for, there might be an alternative way to implement that mode.
There's no reason why they can't do 60Hz. Lowering the strobe frequency is relatively easy, because you have more time for the LCD pixels to finish transitioning -- it's just not a confirmable mode at this time based on the information I've analyzed. They may have already done 60Hz strobing.

But bad publicity from flicker may be a reason not to allow strobing at 60Hz, so I'm only calling out 85Hz and 144Hz at this time, based on evidence -- they may have other strobe modes (unconfirmed).

Thanks,
Mark Rejhon

www.BlurBusters.com

BlurBusters Blog -- Eliminating Motion Blur by 90%+ on LCD for games and computers

Rooting for upcoming low-persistence rolling-scan OLEDs too!

Mark Rejhon is offline  
Old 10-23-2013, 02:36 PM - Thread Starter
LexInVA
AVS Special Member
Join Date: Jun 2007
Posts: 2,174
Quote:
Originally Posted by Hawkwing View Post

So what's the skinny on the displays? Who is gonna make them and what sizes and resolutions will be available? Is there a projected date of availability?

I won't even pretend I understand how this works, but how well will this technology work if your PC gives you crappy frame rates? Is there gonna be a minimum system requirement on the PC to go along with whatever resolution monitor you would get??

Monitors come out next year, but you can do an upgrade on a specific ASUS monitor if you have one. The only system requirements are a Kepler GPU (GTX 650 Ti Boost or better) and the latest GPU drivers. I think DisplayPort is also a requirement, but I've read conflicting information.
LexInVA is offline  
Old 10-25-2013, 10:08 AM
Ericthemidget
AVS Special Member
Join Date: Jan 2008
Posts: 1,290
I have a pretty nice system (3570K, GTX 770) with a 144Hz monitor, and I really don't see the value in G-Sync since it's proprietary, and with such a high refresh rate I don't see motion blur or tearing. If this were open to AMD as well and compatible with 120Hz/240Hz sets, then I could get behind it.
Ericthemidget is offline  
Old 10-25-2013, 11:37 AM - Thread Starter
LexInVA
AVS Special Member
Join Date: Jun 2007
Posts: 2,174
I'm sure they could open it up to AMD at the behest of the monitor makers, but that would take some serious pressure on the monitor makers. I can see it happening in three years if the G-Sync tech takes off, but not before then, as Nvidia will want to keep it to themselves for that long to win over the gamer market.
LexInVA is offline  
Old 10-25-2013, 12:10 PM
bd2003
AVS Addicted Member
Join Date: Jun 2004
Location: Long Island, NY
Posts: 11,332
Quote:
Originally Posted by LexInVA View Post

I'm sure they could open it up to AMD at the behest of the monitor makers, but that would take some serious pressure on the monitor makers. I can see it happening in three years if the G-Sync tech takes off, but not before then, as Nvidia will want to keep it to themselves for that long to win over the gamer market.

I think there's a solid chance someone reverse engineers it to work with AMD long before then.

Steam/PSN/Xbox Live: Darius510
bd2003 is offline  
Old 10-25-2013, 09:05 PM
Mark Rejhon
AVS Special Member
Join Date: Feb 1999
Location: North America
Posts: 8,127
Another useful application of G-SYNC is a low-latency fixed refresh rate.
Since G-SYNC has the additional advantage of faster frame delivery and faster on-screen scanout, you can get 60fps@60Hz with cable delivery of only 1/144 sec and screen scanout of only 1/144 sec per refresh (or whatever the G-SYNC monitor's maximum bandwidth allows -- even theoretical future 240Hz G-SYNC monitors using DisplayPort 2.0 would be able to do 60fps@60Hz, 77.5fps@77.5Hz, or 187fps@187Hz, all with just 1/240 sec frame delivery / scanout latency).

In the past, 60Hz monitors took 1/60 sec (16.7ms) to scan out.
High speed video of CRT: http://www.youtube.com/watch?v=zVS6QewZsi4
High speed video of LCD: http://www.youtube.com/watch?v=nCHgmCxGEzY

But with G-SYNC, delivery and scanout are decoupled from the refresh rate. You can choose to do 60fps@60Hz with a lot less input lag than any ordinary 60Hz monitor -- even less than a 60Hz CRT, because CRTs take a finite amount of time to scan from top to bottom.
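As a quick illustration of that decoupling (using the 1/144 sec figure from this post; the 240Hz row is hypothetical):

Code:
# Time to deliver/scan out one full frame at a given link rate.
# With G-SYNC, a 60fps signal can still be transmitted and scanned at the panel's
# maximum rate instead of being spread across the whole 1/60 sec refresh.

def scanout_ms(rate_hz):
    return 1000.0 / rate_hz

print(f"Conventional 60Hz panel  : {scanout_ms(60):5.1f} ms per frame")   # ~16.7 ms
print(f"144Hz-class G-SYNC panel : {scanout_ms(144):5.1f} ms per frame")  # ~6.9 ms
print(f"Hypothetical 240Hz panel : {scanout_ms(240):5.1f} ms per frame")  # ~4.2 ms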

Also, G-SYNC is great for fixed refresh rates, since software can seamlessly drive the refresh rate:
- Play movies 24fps@24Hz (or 48Hz, 72Hz, 96Hz)
- Play videos 30fps@30Hz
- Play movies 48fps@48Hz
- Play television 60fps@60Hz
- Play television 59.94fps@59.94Hz
- Play old silent films at original theater speed 18fps@18Hz (or 36Hz, 72Hz)
- Variable frame rate video files
- Future movie formats (e.g. 72fps, 96fps)
- Play mixed TV material and dynamically detect 24p, 30p, and 60p material, dynamically rate-adapt to new fixed rate (with zero mode-change flicker)
- Play emulators 60fps@60Hz with lower input lag (taking advantage of 1/144sec frame delivery times)
- You can avoid stutters even if one frame takes a bit longer to render (e.g. 16.8ms instead of 1/60 sec = 16.7ms) for any innocuous reason such as processing, background apps, or error correction: you just delay that specific refresh by an unnoticeable 0.1ms rather than being forced to wait for the next refresh cycle. This makes video playback even smoother (see the sketch after this list).
- etc.
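And a tiny sketch of the stutter-avoidance point in the last item, with made-up render times (the middle frame just misses the 16.7ms deadline):

Code:
import math

# With a fixed 60Hz refresh, a frame that misses the 16.7ms deadline waits for the
# next vsync (a 33.3ms hitch). With variable refresh, the display just waits the
# extra fraction of a millisecond.

REFRESH_MS = 1000.0 / 60.0               # 16.67 ms
render_times_ms = [16.5, 16.8, 16.6]     # hypothetical frame render times

for t in render_times_ms:
    fixed_vsync = math.ceil(t / REFRESH_MS) * REFRESH_MS  # next fixed refresh boundary
    variable = t                                          # refresh fires when the frame is ready
    print(f"render {t:5.2f} ms -> fixed 60Hz shows it after {fixed_vsync:5.2f} ms, "
          f"variable refresh after {variable:5.2f} ms")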

So it apparently has great applications for movies and home theater. Variable refresh rate capability and faster frame delivery belong in HDMI 3.0, in my humble opinion: less input lag for receivers, less input lag for sports, for future game consoles (XBox Two, PS5), less broadcasting latency thanks to sped-up frame delivery between set-top box and TV, less input lag everywhere, future-proof frame rates, faster frame delivery from one home theater device to another...

Thanks,
Mark Rejhon

www.BlurBusters.com

BlurBusters Blog -- Eliminating Motion Blur by 90%+ on LCD for games and computers

Rooting for upcoming low-persistence rolling-scan OLEDs too!

Mark Rejhon is offline  
Old 04-09-2014, 04:31 AM - Thread Starter
LexInVA
AVS Special Member
Join Date: Jun 2007
Posts: 2,174
Scuttlebutt says that AMD has gotten VESA to adopt FreeSync as part of the current DisplayPort 1.2a standard as an optional feature so it looks like G-Sync is dead in the water unless they come up with something above and beyond.
LexInVA is offline  
Old 04-09-2014, 09:45 AM
barrelbelly
AVS Special Member
Join Date: Nov 2007
Posts: 1,953
I have a couple of questions.
  1. Aren't G-Sync and FreeSync pretty much the same? Or work the same way?
  2. Which is a better fit for the potentially looming VR standard that will soon be necessary?
  3. Which will SteamOS be using? Or will it accommodate both?
barrelbelly is offline  
Old 04-09-2014, 10:15 AM
jhoff80
AVS Special Member
Join Date: May 2005
Posts: 4,402
Quote:
Originally Posted by LexInVA View Post

Scuttlebutt says that AMD has gotten VESA to adopt FreeSync as part of the current DisplayPort 1.2a standard as an optional feature so it looks like G-Sync is dead in the water unless they come up with something above and beyond.

When has a standards-based solution ever stopped Nvidia from pushing ahead with their proprietary version though? :D

XBL/Steam: JHoff80
jhoff80 is offline  
Old 04-09-2014, 11:26 AM
bd2003
AVS Addicted Member
Join Date: Jun 2004
Location: Long Island, NY
Posts: 11,332
Quote:
Originally Posted by jhoff80 View Post

When has a standards-based solution ever stopped Nvidia from pushing ahead with their proprietary version though? :D

True. Either way it looks like you'll be able to buy a g-sync monitor long before a freesync monitor.

I don't care what they call it; all I know is I want it (especially in a PJ).

Steam/PSN/Xbox Live: Darius510
bd2003 is offline  
Old 04-10-2014, 06:27 AM - Thread Starter
LexInVA
AVS Special Member
Join Date: Jun 2007
Posts: 2,174
Quote:
Originally Posted by barrelbelly View Post

I have a couple of questions.
  1. Aren't G-Sync and FreeSync pretty much the same? Or work the same way?
  2. Which is a better fit for the potentially looming VR standard that will soon be necessary?
  3. Which will SteamOS be using? Or will it accommodate both?

1. Same end more or less. Different means.

2. FreeSync looks better on paper and would be easier to implement, but of course there's no concrete data on either method as applied to VR outside of whatever is being done in the labs.

3. It's a driver thing. As long as it's in the driver, it can be used, and SteamOS will have support for it. FreeSync is easier to implement at this point, though, since it's now part of the DisplayPort standard and the spec has been made public, whereas G-Sync requires proprietary hardware and software from Nvidia that will be inaccessible to most Linux devs, especially since most Nvidia cards used in Linux setups don't have DisplayPort because they're low-end/integrated/ION parts of pre-GeForce 600 vintage.
LexInVA is offline  
Old 04-10-2014, 06:37 AM - Thread Starter
LexInVA
AVS Special Member
Join Date: Jun 2007
Posts: 2,174
Quote:
Originally Posted by jhoff80 View Post

When has a standards-based solution ever stopped Nvidia from pushing ahead with their proprietary version though? :D

Did they have any proprietary hardware and software tech that was already done in an open standard other than CUDA? I can't recall any on the consumer side of things.
LexInVA is offline  
Old 04-10-2014, 08:06 AM
bd2003
AVS Addicted Member
Join Date: Jun 2004
Location: Long Island, NY
Posts: 11,332
Quote:
Originally Posted by LexInVA View Post

Did they have any proprietary hardware and software tech that was already done in an open standard other than CUDA? I can't recall any on the consumer side of things.

PhysX and 3D vision come to mind.

Steam/PSN/Xbox Live: Darius510
bd2003 is offline  
Old 04-10-2014, 09:38 AM
jhoff80
AVS Special Member
Join Date: May 2005
Posts: 4,402
Quote:
Originally Posted by bd2003 View Post

PhysX and 3D vision come to mind.

Those were the big ones I had in mind as well.

XBL/Steam: JHoff80
jhoff80 is offline  
Old 04-10-2014, 09:45 AM
bd2003
AVS Addicted Member
Join Date: Jun 2004
Location: Long Island, NY
Posts: 11,332
Quote:
Originally Posted by jhoff80 View Post

Those were the big ones I had in mind as well.

Although to be fair, none of CUDA, PhysX, 3D Vision, or G-Sync came after an open standard. I'm not a fan of this proprietary stuff either, but they should get some credit for innovation. They weren't technically the first with SLI (that would be 3dfx), but they were the first to do it with fully integrated graphics cards, plus PhysX is still the first and only example of load balancing without SLI. I could be wrong, but I think triple-monitor surround first came from nvidia as well. I even think they were first with MSAA way back in the day, and they also led the charge on post-process AA with FXAA.

I can hardly think of anything feature-wise that AMD has innovated in the GPU space, even if their cards keep up with performance.

Steam/PSN/Xbox Live: Darius510
bd2003 is offline  
Old 04-10-2014, 09:56 AM
Yrd
AVS Special Member
Join Date: Mar 2008
Location: Elkton, MD
Posts: 3,207
I think Matrox did the multiple monitor thing first.

And wasn't Physx a company selling a dedicated physics card before it went to nvidia?

XBL Gamertag- Yrd
PSN - Yerd

Steam - Yrd

Yrd is offline  
Old 04-10-2014, 10:30 AM
bd2003
AVS Addicted Member
Join Date: Jun 2004
Location: Long Island, NY
Posts: 11,332
Quote:
Originally Posted by Yrd View Post

I think Matrox did the multiple monitor thing first.

And wasn't Physx a company selling a dedicated physics card before it went to nvidia?

Yeah, the company was called Ageia; nvidia ported everything to the GPU.

Matrox... that's a name I haven't heard in years. Their cards were killer back in the S3 ViRGE days. They may have had the first multi-monitor card, but I don't think they stretched 3D-accelerated games across them.

Just so I don't sound like a fanboy, I fully recognize AMD is just as innovative as intel on the CPU side.

Steam/PSN/Xbox Live: Darius510
bd2003 is offline  
Old 04-10-2014, 12:21 PM
Yrd
AVS Special Member
Join Date: Mar 2008
Location: Elkton, MD
Posts: 3,207
I seem to remember them having something able to span screens way before the others, but it wasn't very powerful. I could be wrong.

I tried googling, but I gave up.

XBL Gamertag- Yrd
PSN - Yerd

Steam - Yrd

Yrd is offline  
Old 09-28-2014, 06:11 PM
David Lannan
Newbie
Join Date: Sep 2014
Posts: 1
Quote:
Originally Posted by Yrd View Post
I seem to remember them having something able to span screens way before the others, but it wasn't very powerful. I could be wrong.

I tried googling, but I gave up.
Matrox used to do DualHead2Go and TripleHead2Go hardware solutions for video output. Matrox, however, only supported smallish resolutions, so we only used them on desktop setups (no good for projection systems; that's where the NV gear was used). NVidia had G-Sync display cards (not to be confused with these new monitors) back in 2008 that allowed synchronizing multiple monitors/projectors (we used them for a couple of sims). This was the only option available back then, and it is basically where these G-Sync monitors have come from.

Imho, it's a good thing to have a strong company look after these sorts of new technologies. So many previously defined "standards" have become unsupported and haven't evolved. At least with this approach you can bet that if NVidia makes money from it, it will be around for another 5+ years. They tend to invest in technologies for the long term; PhysX is a great example.

And on PhysX: yes, Ageia used to sell PhysX cards (I had one in my cupboard). That was around 2008 too; we deployed one in a car simulator.
Has anyone got a G-Sync monitor? All the reviews are quite positive. This could be a big help in the simulation business: one of the main problems with rendering large frames is that we always run with VSync on (because we can't have tears, given the size of the artefacts), so any frame drop looks like a horrible jitter. If this G-Sync solution works well, it will be a hugely cheaper option than their G-Sync cards (those cost around 1K USD per channel, plus the cost of a Quadro).

Cheers,
Dave.
David Lannan is offline  
Old 10-02-2014, 01:52 PM
RLBURNSIDE
AVS Special Member
Join Date: Jan 2007
Posts: 1,786
I'm about to get a GTX 970 to use with G-Sync; what's a good G-Sync monitor to pair with it? Something on the cheaper end of the price range.

I'd rather get a FreeSync card, but AMD apparently isn't coming out with the R9 390X until 1H 2015, and I'm itching to upgrade now (I've had this card for like 8 years; it's well past its due date).

I bet there will be adapters from G-Sync to FreeSync unless NVidia can update the firmware on their GPUs to support Freesync natively from existing hardware.

Apparently the GPUs in the XB1 and PS4 are all capable of FreeSync, but it would need to be added to the HDMI spec, and then someone would have to convince Sony or MS to update their HDMI output firmware. That way you could get smoother graphics from the consoles just by internally turning V-sync off whenever a FreeSync display is detected. That would really be the best outcome. Having to target 30 or 60Hz in game engines because of consoles is really holding back progress: if you need to hit that 60fps target all the time and never dip below it, that puts a huge amount of pressure on developers to reduce quality, instead of optimising everything you can while targeting a more realistic 30-60Hz variable range. It would allow, for example, 1080p to become viable on the Xbox One for more complex games than the limited cases that currently ship with 1080p native rendering.

I remember some guy at Microsoft a while back saying you could do your final render at 1080p, 900p, or 800p depending on scene complexity and then simply use the scaler; e.g. 800p during fast action scenes, and 1080p when the camera is moving slowly or there's more detail in the scene. I'll have to try coding that myself to see if the idea has merit.
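For what it's worth, here's a minimal sketch of that kind of dynamic-resolution heuristic (everything below is hypothetical; a real engine would key off measured GPU frame time and then hand the result to the hardware scaler):

Code:
# Hypothetical dynamic-resolution heuristic: pick the internal render height for the
# next frame from recent GPU frame times, then let the hardware scaler upscale to 1080p.

TARGET_FRAME_MS = 1000.0 / 60.0       # 60fps budget
RENDER_HEIGHTS = [1080, 900, 800]     # candidate internal resolutions

def choose_render_height(recent_gpu_ms, current_height):
    # Drop resolution when we're near/over budget, climb back when there's headroom.
    avg = sum(recent_gpu_ms) / len(recent_gpu_ms)
    idx = RENDER_HEIGHTS.index(current_height)
    if avg > TARGET_FRAME_MS * 0.95 and idx < len(RENDER_HEIGHTS) - 1:
        return RENDER_HEIGHTS[idx + 1]   # scene got heavier: render smaller
    if avg < TARGET_FRAME_MS * 0.75 and idx > 0:
        return RENDER_HEIGHTS[idx - 1]   # plenty of headroom: render bigger
    return current_height

# Example: a fast action scene pushes frame times up, then calms down again.
height = 1080
for window in ([15.0, 15.5, 16.0], [17.5, 18.0, 17.8], [12.0, 11.5, 12.2]):
    height = choose_render_height(window, height)
    print(f"avg {sum(window) / 3.0:5.2f} ms -> render at {height}p, scale to 1080p")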
RLBURNSIDE is online now  