Nvidia G-Sync - Page 2 - AVS Forum
post #31 of 57 Old 10-22-2013, 10:12 AM
Yrd
AVS Special Member
 
Yrd's Avatar
 
Join Date: Mar 2008
Location: Elkton, MD
Posts: 2,810
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 71 Post(s)
Liked: 164
Yeah 27" 1440p res and I'm sold. I've been holding off upgrading my dell 2407 for quite a while.

XBL Gamertag- Yrd
PSN - Yerd

Steam - Yrd

Yrd is offline  
post #32 of 57 Old 10-22-2013, 11:40 AM
AVS Special Member
 
Mark Rejhon's Avatar
 
Join Date: Feb 1999
Location: North America
Posts: 8,124
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 2 Post(s)
Liked: 96
G-SYNC includes a LightBoost sequel that's superior to LightBoost. When asked whether LightBoost could be combined with G-SYNC, AndyBNV of nVidia confirmed on NeoGAF:
Quote:
Originally Posted by AndyBNV 
“We have a superior, low-persistence mode that should outperform that unofficial {LightBoost} implementation, and importantly, it will be available on every G-SYNC monitor. Details will be available at a later date.”

This confirms strobing is used, simply from the physics of vision: there is no other way to achieve a LightBoost-matching low-persistence (1ms) mode without ultrahigh refresh rates (e.g. 1000fps@1000Hz) or frame interpolation (e.g. 200fps->1000fps). Since both are unlikely with nVidia G-SYNC, this effectively confirms backlight strobing is used to keep visible frame display times short (aka low persistence). In addition, John Carmack confirmed on Twitter that a better backlight strobe driver is included:
Quote:
Originally Posted by John Carmack (@ID_AA_Carmack) on Twitter 
“@GuerillaDawg the didn’t talk about it, but this includes an improved lightboost driver, but it is currently a choice — gsync or flashed.”

Both statements, Andy's and John's, confirm that official backlight strobing (a LightBoost successor) is part of G-SYNC: 2D motion blur elimination, finally officially sanctioned by nVidia.

In short, all G-SYNC monitors include a LightBoost sequel:
-- G-SYNC mode: Better for variable framerates (less stutters but more blur)
-- Strobe mode: Better for max framerates 120fps@120Hz (zero motion blur)

It can be better than LightBoost because LightBoost was hard to enable, LightBoost had poor color quality, and the new mode can run at more refresh rates. G-SYNC monitors have an 85Hz refresh rate available, and John Carmack reportedly said it flickers a bit more at 85Hz than at 120Hz, which means we've finally got 85Hz LightBoost available -- a good balance between GPU power and flicker.
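
To make the persistence argument above concrete, here is a minimal Python sketch of the arithmetic (my own illustration, not anything from nVidia; the function names are made up). It shows why a sample-and-hold panel would need roughly 1000fps@1000Hz to match a 1ms-persistence strobe, while a strobed backlight reaches 1ms at any refresh rate simply by keeping the flash short.

```python
# Illustrative persistence arithmetic only -- assumes square-wave persistence
# (the frame is either fully visible or fully dark) and one strobe per refresh.

def sample_and_hold_persistence_ms(refresh_hz: float) -> float:
    """Sample-and-hold: each frame stays lit for the entire refresh period."""
    return 1000.0 / refresh_hz

def refresh_needed_for_persistence_hz(target_persistence_ms: float) -> float:
    """Refresh rate a sample-and-hold panel would need to reach a persistence target."""
    return 1000.0 / target_persistence_ms

def strobed_persistence_ms(flash_length_ms: float) -> float:
    """Strobed backlight: persistence equals the flash length, regardless of refresh rate."""
    return flash_length_ms

print(sample_and_hold_persistence_ms(120))     # ~8.3 ms -- 120 Hz sample-and-hold is still blurry
print(refresh_needed_for_persistence_hz(1.0))  # 1000.0 Hz needed to match 1 ms without strobing
print(strobed_persistence_ms(1.0))             # 1.0 ms with a 1 ms flash, even at 120 Hz
```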

Thanks,
Mark Rejhon

www.BlurBusters.com

BlurBusters Blog -- Eliminating Motion Blur by 90%+ on LCD for games and computers

Rooting for upcoming low-persistence rolling-scan OLEDs too!

Mark Rejhon is offline  
post #33 of 57 Old 10-22-2013, 01:16 PM
AVS Addicted Member
 
bd2003's Avatar
 
Join Date: Jun 2004
Location: Long Island, NY
Posts: 10,289
Mentioned: 1 Post(s)
Tagged: 0 Thread(s)
Quoted: 124 Post(s)
Liked: 1343
Quote:
Originally Posted by Mark Rejhon View Post

G-SYNC includes a LightBoost sequel that's superior to LightBoost. When asked whether LightBoost could be combined with G-SYNC, AndyBNV of nVidia confirmed on NeoGAF:
This confirms strobing is used, simply from the physics of vision: there is no other way to achieve a LightBoost-matching low-persistence (1ms) mode without ultrahigh refresh rates (e.g. 1000fps@1000Hz) or frame interpolation (e.g. 200fps->1000fps). Since both are unlikely with nVidia G-SYNC, this effectively confirms backlight strobing is used to keep visible frame display times short (aka low persistence). In addition, John Carmack confirmed on Twitter that a better backlight strobe driver is included:
Both statements, Andy's and John's, confirm that official backlight strobing (a LightBoost successor) is part of G-SYNC: 2D motion blur elimination, finally officially sanctioned by nVidia.

In short, all G-SYNC monitors include a LightBoost sequel:
-- G-SYNC mode: Better for variable framerates (less stutters but more blur)
-- Strobe mode: Better for max framerates 120fps@120Hz (zero motion blur)

It can be better than LightBoost because LightBoost was hard to enable, LightBoost had poor color quality, and the new mode can run at more refresh rates. G-SYNC monitors have an 85Hz refresh rate available, and John Carmack reportedly said it flickers a bit more at 85Hz than at 120Hz, which means we've finally got 85Hz LightBoost available -- a good balance between GPU power and flicker.

It would be awesome if we finally had LightBoost with a simple in-monitor menu setting, although I don't think I'd ever choose it over G-Sync unless it was an old game that never dropped a frame. I'm hoping they can make it work at 60Hz though; 85Hz can be problematic for some games.

Steam/PSN/Xbox Live: Darius510
bd2003 is offline  
post #34 of 57 Old 10-22-2013, 03:41 PM
AVS Special Member
 
Mark Rejhon's Avatar
 
Join Date: Feb 1999
Location: North America
Posts: 8,124
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 2 Post(s)
Liked: 96
Quote:
Originally Posted by bd2003 View Post

It would be awesome if we finally had LightBoost with a simple in-monitor menu setting, although I don't think I'd ever choose it over G-Sync unless it was an old game that never dropped a frame. I'm hoping they can make it work at 60Hz though; 85Hz can be problematic for some games.
I've been able to glean enough information to confirm G-SYNC strobing appears to be available for 85Hz and 144Hz.

Thanks,
Mark Rejhon

www.BlurBusters.com

BlurBusters Blog -- Eliminating Motion Blur by 90%+ on LCD for games and computers

Rooting for upcoming low-persistence rolling-scan OLEDs too!

Mark Rejhon is offline  
post #35 of 57 Old 10-22-2013, 03:42 PM
AVS Addicted Member
 
bd2003's Avatar
 
Join Date: Jun 2004
Location: Long Island, NY
Posts: 10,289
Mentioned: 1 Post(s)
Tagged: 0 Thread(s)
Quoted: 124 Post(s)
Liked: 1343
Quote:
Originally Posted by Mark Rejhon View Post

I've been able to glean enough information to confirm G-SYNC strobing appears to be available for 85Hz and 144Hz.

I'm assuming in addition to 100 and 120? Any chance of black frame insertion being used?

Steam/PSN/Xbox Live: Darius510
bd2003 is offline  
post #36 of 57 Old 10-22-2013, 06:31 PM
Advanced Member
 
Hawkwing's Avatar
 
Join Date: Jan 2004
Posts: 565
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 35
So what's the skinny on the displays? Who is gonna make them and what sizes and resolutions will be available? Is there a projected date of availability?

I won't even pretend I understand how this works, but how well will this technology work if your PC gives you crappy frame rates? Is there gonna be a minimum system requirement on the PC to go along with whatever resolution monitor you would get??
Hawkwing is offline  
post #37 of 57 Old 10-22-2013, 09:28 PM
AVS Special Member
 
Mark Rejhon's Avatar
 
Join Date: Feb 1999
Location: North America
Posts: 8,124
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 2 Post(s)
Liked: 96
The good news is for people who want "LightBoost"-style strobing at other refresh rates, whether to reduce GPU requirements (85fps @ 85Hz) or to reduce input lag (144fps @ 144Hz): G-SYNC's superior sequel to LightBoost (an optional fixed-rate strobe mode) actually supports strobing at 85Hz and at 144Hz (at least), in addition to the existing LightBoost rates (100Hz and 120Hz). Here are the clues:

- The G-SYNC upgrade datasheet has 85Hz added.
- AndyBNV suggested on NeoGAF the low-persistence mode is superior to LightBoost.
- The YouTube video of John Carmack at the G-SYNC launch was very suggestive.
- Many articles mention 85Hz as a CRT frequency at which flicker stops being visible for many people.
- The pcper.com livestream suggests a very high fixed refresh rate in low-persistence mode.

Upon analysis, both 85Hz and 144Hz are available strobed modes with G-SYNC, in addition to 100Hz and 120Hz. 60Hz has too much flicker, so that mode isn't fully confirmable. Anything beyond 144Hz exceeds the ASUS VG248QE's bandwidth. Other strobed modes might also be available.

You heard it first from me. :)

Thanks,
Mark Rejhon

www.BlurBusters.com

BlurBusters Blog -- Eliminating Motion Blur by 90%+ on LCD for games and computers

Rooting for upcoming low-persistence rolling-scan OLEDs too!

Mark Rejhon is offline  
post #38 of 57 Old 10-22-2013, 09:40 PM
AVS Special Member
 
Mark Rejhon's Avatar
 
Join Date: Feb 1999
Location: North America
Posts: 8,124
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 2 Post(s)
Liked: 96
Quote:
Originally Posted by bd2003 View Post

I'm assuming in addition to 100 and 120?
Yes.
Quote:
Originally Posted by bd2003 View Post

Any chance of black frame insertion being used?
Your question is a non sequitur to me -- can you rephrase?
For persistence purposes, strobing is mathematically identical to black frame insertion.
Turning OFF a backlight/edgelight creates black frame insertion.

50% blackness of any kind + 50% image flash = 50% motion blur reduction.
75% blackness of any kind + 25% image flash = 75% motion blur reduction.
90% blackness of any kind + 10% image flash = 90% motion blur reduction.

Some more background information:
-- Persistence (static pixel state, sample-and-hold) is not GtG (pixel transitions)
-- Most motion blur on modern LCD's is caused by persistence (pixel staticness) rather than from transitions (pixel GtG).
-- Persistence is equal to backlight strobe length, one strobe per refresh, in full-panel strobe backlights.
-- See www.testufo.com/eyetracking for an example of motion blur caused by persistence, not GtG
-- Motion blur is directly proportional to persistence (especially squarewave persistence such as strobed backlight, strobed OLED, BFI on DLP)
-- Simple persistence math: 1ms of persistence = 1 pixel of motion blur at 1000 pixels/second (assumes squarewave persistence); see the sketch below this list.
-- Almost all 60Hz LCDs have a full 1/60sec (16.7ms) of persistence, so they all create 16.7ms worth of motion blur.
-- A common black frame insertion ratio is 50%:50% (e.g. 60fps at 120Hz with black frames inserted); this reduces persistence from 16.7ms to 8.3ms.
-- Strobe backlights are a more efficient way of creating blackness than black frame insertion: you can flash for much shorter periods more easily.
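
As a worked example of the math above, here is a short Python sketch (an illustration only, under the same square-wave persistence assumption; the helper names are made up):

```python
# Sketch of the persistence math above: blur scales with persistence and motion speed,
# and blackness per refresh (strobe or black frame) removes blur in the same proportion.

def motion_blur_px(persistence_ms: float, speed_px_per_s: float) -> float:
    """1 ms of persistence = 1 pixel of blur at 1000 pixels/second (square-wave persistence)."""
    return persistence_ms / 1000.0 * speed_px_per_s

def blur_reduction(dark_fraction_per_refresh: float) -> float:
    """X% blackness of any kind per refresh = X% motion blur reduction."""
    return dark_fraction_per_refresh

print(motion_blur_px(16.7, 1000))   # ~16.7 px: plain 60 Hz sample-and-hold LCD
print(motion_blur_px(8.3, 1000))    # ~8.3 px: 50%:50% black frame insertion (60 fps at 120 Hz)
print(motion_blur_px(1.0, 1000))    # ~1 px: 1 ms backlight strobe (LightBoost-class)
print(blur_reduction(0.90))         # 0.9 -> 90% dark time gives a 90% blur reduction
```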

Thanks,
Mark Rejhon

www.BlurBusters.com

BlurBusters Blog -- Eliminating Motion Blur by 90%+ on LCD for games and computers

Rooting for upcoming low-persistence rolling-scan OLEDs too!

Mark Rejhon is offline  
post #39 of 57 Old 10-23-2013, 03:42 AM
AVS Addicted Member
 
bd2003's Avatar
 
Join Date: Jun 2004
Location: Long Island, NY
Posts: 10,289
Mentioned: 1 Post(s)
Tagged: 0 Thread(s)
Quoted: 124 Post(s)
Liked: 1343
What I mean to say is: if the reason they're not exposing a 60Hz low-persistence mode is that it's too far outside the 100-120Hz range that the LightBoost strobing was originally designed for, then there might be an alternative way to implement that mode.

Even though that might flicker a bit like 60Hz CRTs in the good ol' days, and it's a relatively low refresh rate by current standards... it's still the refresh rate the vast majority of games were designed for, and the easiest to v-sync.

Steam/PSN/Xbox Live: Darius510
bd2003 is offline  
post #40 of 57 Old 10-23-2013, 06:00 AM
AVS Special Member
 
Mark Rejhon's Avatar
 
Join Date: Feb 1999
Location: North America
Posts: 8,124
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 2 Post(s)
Liked: 96
Quote:
Originally Posted by bd2003 View Post

What I mean to say is: if the reason they're not exposing a 60Hz low-persistence mode is that it's too far outside the 100-120Hz range that the LightBoost strobing was originally designed for, then there might be an alternative way to implement that mode.
There's no reason why they can't do 60Hz. Lowering the strobe frequency is relatively simple, because you have more time for the LCD pixels to finish transitioning -- it's just not a confirmable mode at this time based on the information I've analyzed. They may have already done 60Hz strobing.

But bad publicity from flicker may be a reason not to allow strobing at 60Hz.
So I'm only calling out 85Hz and 144Hz at this time, based on the evidence -- there may be other strobe modes (unconfirmed).

Thanks,
Mark Rejhon

www.BlurBusters.com

BlurBusters Blog -- Eliminating Motion Blur by 90%+ on LCD for games and computers

Rooting for upcoming low-persistence rolling-scan OLEDs too!

Mark Rejhon is offline  
post #41 of 57 Old 10-23-2013, 02:36 PM - Thread Starter
AVS Special Member
 
LexInVA's Avatar
 
Join Date: Jun 2007
Posts: 1,871
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 7 Post(s)
Liked: 39
Quote:
Originally Posted by Hawkwing View Post

So what's the skinny on the displays? Who is gonna make them and what sizes and resolutions will be available? Is there a projected date of availability?

I won't even pretend I understand how this works, but how well will this technology work if your PC gives you crappy frame rates? Is there gonna be a minimum system requirement on the PC to go along with whatever resolution monitor you would get??

Monitors come out next year, but you can do an upgrade on a specific ASUS monitor if you already have one. The only system requirements are a Kepler GPU (GTX 650 Ti Boost or greater) and the latest GPU drivers. I think DisplayPort is also a requirement, but I've read conflicting information.
LexInVA is offline  
post #42 of 57 Old 10-25-2013, 10:08 AM
AVS Special Member
 
Ericthemidget's Avatar
 
Join Date: Jan 2008
Posts: 1,289
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 1 Post(s)
Liked: 11
I have a pretty nice system (3570K, GTX 770) with a 144Hz monitor, and I really don't see the value in G-Sync: it's proprietary, and with such a high refresh rate I don't see motion blur or tearing. If this were open to AMD as well and compatible with 120Hz/240Hz sets, then I could get behind it.
Ericthemidget is offline  
post #43 of 57 Old 10-25-2013, 11:37 AM - Thread Starter
AVS Special Member
 
LexInVA's Avatar
 
Join Date: Jun 2007
Posts: 1,871
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 7 Post(s)
Liked: 39
I'm sure they might eventually open it up to AMD at the behest of the monitor makers, but that would take some serious pressure from the monitor makers. I can see it happening in three years if the G-Sync tech takes off, but not before then, as Nvidia will want to keep it to themselves that long to win over the gamer market.
LexInVA is offline  
post #44 of 57 Old 10-25-2013, 12:10 PM
AVS Addicted Member
 
bd2003's Avatar
 
Join Date: Jun 2004
Location: Long Island, NY
Posts: 10,289
Mentioned: 1 Post(s)
Tagged: 0 Thread(s)
Quoted: 124 Post(s)
Liked: 1343
Quote:
Originally Posted by LexInVA View Post

I'm sure they might eventually open it up to AMD at the behest of the monitor makers, but that would take some serious pressure from the monitor makers. I can see it happening in three years if the G-Sync tech takes off, but not before then, as Nvidia will want to keep it to themselves that long to win over the gamer market.

I think there's a solid chance someone reverse engineers it to work with AMD long before then.

Steam/PSN/Xbox Live: Darius510
bd2003 is offline  
post #45 of 57 Old 10-25-2013, 09:05 PM
AVS Special Member
 
Mark Rejhon's Avatar
 
Join Date: Feb 1999
Location: North America
Posts: 8,124
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 2 Post(s)
Liked: 96
Another useful use of G-SYNC is low-latency fixed refresh rate.
Since G-SYNC has the additional advantage of faster frame delivery and faster on-screen scanout, you can get 60fps@60Hz with cable delivery of only 1/144sec and screen scanout of only 1/144sec per refresh (or whatever the G-SYNC monitor's maximum bandwidth allows -- even theoretical future 240Hz G-SYNC monitors using DisplayPort 2.0 would be able to do 60fps@60Hz, 77.5fps@77.5Hz, or 187fps@187Hz, all with just 1/240sec frame delivery/scanout latency).

In the past, 60Hz monitors took 1/60sec to scanout (16.7ms).
High speed video of CRT: http://www.youtube.com/watch?v=zVS6QewZsi4
High speed video of LCD: http://www.youtube.com/watch?v=nCHgmCxGEzY

But with G-SYNC, delivery and scanout are decoupled from the refresh rate. You can choose to do 60fps@60Hz with a lot less input lag than any 60Hz monitor -- even less than a 60Hz CRT, because CRTs take a finite amount of time to scan from top to bottom.
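
Here is a rough Python sketch of that latency arithmetic (illustrative numbers only, not a spec; the function names are made up). The point is that a variable-refresh link can deliver and scan out every frame at the link's maximum rate, even when the content runs at 60fps.

```python
# Illustrative only: compares how long one frame takes to reach the screen
# on a classic fixed-refresh display versus a display that always scans out
# at the link's maximum rate and then simply waits for the next frame.

def fixed_refresh_scanout_ms(refresh_hz: float) -> float:
    """Classic display: scanout spans the whole refresh period."""
    return 1000.0 / refresh_hz

def max_rate_scanout_ms(max_link_hz: float) -> float:
    """Decoupled delivery/scanout: always as fast as the link allows, at any frame rate."""
    return 1000.0 / max_link_hz

print(fixed_refresh_scanout_ms(60))   # ~16.7 ms per frame on an ordinary 60 Hz monitor
print(max_rate_scanout_ms(144))       # ~6.9 ms per frame, even for 60 fps content
print(max_rate_scanout_ms(240))       # ~4.2 ms on a hypothetical future 240 Hz panel
```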

Also, fixed refresh rate is great for G-SYNC too, as the software completely seamlessly drives refresh rate:
- Play movies 24fps@24Hz (or 48Hz, 72Hz, 96Hz) -- see the refresh-multiple sketch after this list
- Play videos 30fps@30Hz
- Play movies 48fps@48Hz
- Play television 60fps@60Hz
- Play television 59.94fps@59.94Hz
- Play old silent films at original theater speed 18fps@18Hz (or 36Hz, 72Hz)
- Variable frame rate video files
- Future movie formats (e.g. 72fps, 96fps)
- Play mixed TV material and dynamically detect 24p, 30p, and 60p material, dynamically rate-adapt to new fixed rate (with zero mode-change flicker)
- Play emulators 60fps@60Hz with lower input lag (taking advantage of 1/144sec frame delivery times)
- You can avoid stutters even if one frame takes a bit longer to render (e.g. 16.8ms instead of 1/60sec = 16.7ms) for any innocuous reason such as processing, background apps, or error correction: you just delay that specific refresh by an unnoticeable 0.1ms rather than being forced to wait until the next refresh cycle. Makes video playback even smoother.
- etc.
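
For the playback cases above, the rate-matching logic can be as simple as driving the display at an integer multiple of the content frame rate. A small hypothetical Python sketch (my own illustration; the function name and the 48Hz/144Hz bounds are assumptions, not monitor specs):

```python
# Hypothetical helper: pick the smallest integer multiple of the content frame rate
# that falls within the display's supported range, so cadence stays perfect
# (every source frame is shown for the same whole number of refreshes).

def pick_refresh_hz(content_fps: float, floor_hz: float = 48.0, ceiling_hz: float = 144.0) -> float:
    multiple = 1
    while content_fps * multiple < floor_hz:
        multiple += 1
    refresh_hz = content_fps * multiple
    if refresh_hz > ceiling_hz:
        raise ValueError("content frame rate cannot be matched within the display's range")
    return refresh_hz

print(pick_refresh_hz(24))      # 48.0  (24 x 2; 72 or 96 also work as higher multiples)
print(pick_refresh_hz(18))      # 54.0  (18 x 3)
print(pick_refresh_hz(59.94))   # 59.94 (already above the floor, shown 1:1)
```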

So it apparently has great applications for movies and home theater. Variable refresh rate capability and faster frame delivery belong in HDMI 3.0, in my humble opinion: less input lag for receivers, less input lag for sports, future game consoles (XboxTwo, PS5), less broadcasting latency thanks to sped-up frame delivery between set-top box and TV, less input lag everywhere, future-proof frame rates, faster frame delivery times from one home theater device to another....

Thanks,
Mark Rejhon

www.BlurBusters.com

BlurBusters Blog -- Eliminating Motion Blur by 90%+ on LCD for games and computers

Rooting for upcoming low-persistence rolling-scan OLEDs too!

Mark Rejhon is offline  
post #46 of 57 Old 04-09-2014, 04:31 AM - Thread Starter
AVS Special Member
 
LexInVA's Avatar
 
Join Date: Jun 2007
Posts: 1,871
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 7 Post(s)
Liked: 39
Scuttlebutt says that AMD has gotten VESA to adopt FreeSync as part of the current DisplayPort 1.2a standard as an optional feature so it looks like G-Sync is dead in the water unless they come up with something above and beyond.
LexInVA is offline  
post #47 of 57 Old 04-09-2014, 09:45 AM
AVS Special Member
 
barrelbelly's Avatar
 
Join Date: Nov 2007
Posts: 1,649
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 40 Post(s)
Liked: 221
I have a couple of questions.
  1. Aren't G-Sync and FreeSync pretty much the same? Or work the same way?
  2. Which is a better fit for the potentially looming VR standard that will soon be necessary?
  3. Which will SteamOS be using? Or will it accommodate both?
barrelbelly is offline  
post #48 of 57 Old 04-09-2014, 10:15 AM
AVS Special Member
 
jhoff80's Avatar
 
Join Date: May 2005
Posts: 3,943
Mentioned: 1 Post(s)
Tagged: 0 Thread(s)
Quoted: 17 Post(s)
Liked: 138
Quote:
Originally Posted by LexInVA View Post

Scuttlebutt says that AMD has gotten VESA to adopt FreeSync as part of the current DisplayPort 1.2a standard as an optional feature so it looks like G-Sync is dead in the water unless they come up with something above and beyond.

When has a standards-based solution ever stopped Nvidia from pushing ahead with their proprietary version though? :D

XBL/Steam: JHoff80
jhoff80 is offline  
post #49 of 57 Old 04-09-2014, 11:26 AM
AVS Addicted Member
 
bd2003's Avatar
 
Join Date: Jun 2004
Location: Long Island, NY
Posts: 10,289
Mentioned: 1 Post(s)
Tagged: 0 Thread(s)
Quoted: 124 Post(s)
Liked: 1343
Quote:
Originally Posted by jhoff80 View Post

When has a standards-based solution ever stopped Nvidia from pushing ahead with their proprietary version though? :D

True. Either way it looks like you'll be able to buy a G-Sync monitor long before a FreeSync monitor.

I don't care what they call it; all I know is I want it (especially in a PJ).

Steam/PSN/Xbox Live: Darius510
bd2003 is offline  
post #50 of 57 Old 04-10-2014, 06:27 AM - Thread Starter
AVS Special Member
 
LexInVA's Avatar
 
Join Date: Jun 2007
Posts: 1,871
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 7 Post(s)
Liked: 39
Quote:
Originally Posted by barrelbelly View Post

I have a couple of questions.
  1. Aren't G-Sync and FreeSync pretty much the same? Or work the same way?
  2. Which is a better fit for the potentially looming VR standard that will soon be necessary?
  3. Which will SteamOS be using? Or will it accommodate both?

1. Same end more or less. Different means.

2. FreeSync looks better on paper and would be easier to implement but of course there is no concrete data on either method when applied to VR tech outside of whatever is being done in the labs.

3. It's a driver thing. As long as it's in the driver, it can be used, and SteamOS will have support for it. FreeSync is easier to implement at this point, though, since it's now part of the DisplayPort standard with the spec made public, while G-Sync requires proprietary hardware and software from Nvidia that will be inaccessible to most Linux devs, especially since most Nvidia cards used in Linux setups don't have DisplayPort because they are low-end/integrated/ION parts of pre-GeForce 600 vintage.
barrelbelly likes this.
LexInVA is offline  
post #51 of 57 Old 04-10-2014, 06:37 AM - Thread Starter
AVS Special Member
 
LexInVA's Avatar
 
Join Date: Jun 2007
Posts: 1,871
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 7 Post(s)
Liked: 39
Quote:
Originally Posted by jhoff80 View Post

When has a standards-based solution ever stopped Nvidia from pushing ahead with their proprietary version though? :D

Have they had any proprietary hardware or software tech that had already been done as an open standard, other than CUDA? I can't recall any on the consumer side of things.
LexInVA is offline  
post #52 of 57 Old 04-10-2014, 08:06 AM
AVS Addicted Member
 
bd2003's Avatar
 
Join Date: Jun 2004
Location: Long Island, NY
Posts: 10,289
Mentioned: 1 Post(s)
Tagged: 0 Thread(s)
Quoted: 124 Post(s)
Liked: 1343
Quote:
Originally Posted by LexInVA View Post

Have they had any proprietary hardware or software tech that had already been done as an open standard, other than CUDA? I can't recall any on the consumer side of things.

PhysX and 3D vision come to mind.
jhoff80 likes this.

Steam/PSN/Xbox Live: Darius510
bd2003 is offline  
post #53 of 57 Old 04-10-2014, 09:38 AM
AVS Special Member
 
jhoff80's Avatar
 
Join Date: May 2005
Posts: 3,943
Mentioned: 1 Post(s)
Tagged: 0 Thread(s)
Quoted: 17 Post(s)
Liked: 138
Quote:
Originally Posted by bd2003 View Post

PhysX and 3D vision come to mind.

Those were the big ones I had in mind as well.

XBL/Steam: JHoff80
jhoff80 is offline  
post #54 of 57 Old 04-10-2014, 09:45 AM
AVS Addicted Member
 
bd2003's Avatar
 
Join Date: Jun 2004
Location: Long Island, NY
Posts: 10,289
Mentioned: 1 Post(s)
Tagged: 0 Thread(s)
Quoted: 124 Post(s)
Liked: 1343
Quote:
Originally Posted by jhoff80 View Post

Those were the big ones I had in mind as well.

Although to be fair, none of CUDA, PhysX, 3D Vision or G-Sync came after an open standard. I'm not a fan of this proprietary stuff either, but they should get some credit for innovation. They weren't technically the first with SLI (that would be 3dfx), but they were the first to do it with fully integrated graphics cards, plus PhysX is still the first and only example of load balancing without SLI. I could be wrong, but I think triple-monitor surround first came from Nvidia as well. I even think they were first with MSAA way back in the day, and they also led the charge on post-process AA with FXAA.

I can hardly think of anything feature-wise that AMD has innovated in the GPU space, even if their cards keep up with performance.

Steam/PSN/Xbox Live: Darius510
bd2003 is offline  
post #55 of 57 Old 04-10-2014, 09:56 AM
Yrd
AVS Special Member
 
Yrd's Avatar
 
Join Date: Mar 2008
Location: Elkton, MD
Posts: 2,810
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 71 Post(s)
Liked: 164
I think Matrox did the multiple monitor thing first.

And wasn't Physx a company selling a dedicated physics card before it went to nvidia?

XBL Gamertag- Yrd
PSN - Yerd

Steam - Yrd

Yrd is offline  
post #56 of 57 Old 04-10-2014, 10:30 AM
AVS Addicted Member
 
bd2003's Avatar
 
Join Date: Jun 2004
Location: Long Island, NY
Posts: 10,289
Mentioned: 1 Post(s)
Tagged: 0 Thread(s)
Quoted: 124 Post(s)
Liked: 1343
Quote:
Originally Posted by Yrd View Post

I think Matrox did the multiple monitor thing first.

And wasn't Physx a company selling a dedicated physics card before it went to nvidia?

Yeah, the company was called Ageia. Nvidia ported everything to the GPU.

Matrox... that's a name I haven't heard in years. Their cards were killer back in the S3 ViRGE days. They may have had the first multi-monitor card, but I don't think they stretched 3D-accelerated games across them.

Just so I don't sound like a fanboy, I fully recognize AMD is just as innovative as Intel on the CPU side.

Steam/PSN/Xbox Live: Darius510
bd2003 is offline  
post #57 of 57 Old 04-10-2014, 12:21 PM
Yrd
AVS Special Member
 
Yrd's Avatar
 
Join Date: Mar 2008
Location: Elkton, MD
Posts: 2,810
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 71 Post(s)
Liked: 164
I seem to remember them having something able to span screens way before the others, but it wasn't very powerful. I could be wrong.

I tried googling, but I gave up.

XBL Gamertag- Yrd
PSN - Yerd

Steam - Yrd

Yrd is offline  