AMD R7 250 vs nVidia GTX 750 - madVR - AVS Forum
post #1 of 9 Old 07-12-2014, 04:19 PM - Thread Starter
pubare (Newbie)
AMD R7 250 vs nVidia GTX 750 - madVR

Title says it pretty well... I am looking for a replacement card for madVR performance ONLY (no gaming), and I am not willing to use anything that draws more than 65W (thermal / noise / space considerations), which pretty much limits me to the R7 250 or GTX 750 series. Which one typically does better with madVR? Primary use is 720p DVD / 1080p Blu-ray rips (Handbrake CQ = 18) to MKV. CPU is an Intel i5-4570 on an HM-87 motherboard with 8GB of low-latency RAM.
post #2 of 9 Old 07-12-2014, 04:32 PM
techmattr (Advanced Member)
Pretty sure @renethx and @MlNDBOMB could answer that off the top of their heads.
post #3 of 9 Old 07-12-2014, 09:20 PM
renethx (AVS Club Gold)
I tested only R7 250X vs. GTX 750 Ti. They are almost equivalent in madVR performance.

R7 250: 384sp
R7 250X: 640sp
GTX 750: 512sp
GTX 750 Ti: 640sp

So obviously GTX 750 is better than R7 250.
post #4 of 9 Old 07-13-2014, 01:17 AM
Dark_Slayer (AVS Special Member)
Quote:
Originally Posted by renethx View Post
I tested only R7 250X vs. GTX 750 Ti. They are almost equivalent in madVR performance. […] So obviously GTX 750 is better than R7 250.
What about all this then?
Quote:
Originally Posted by Mfusick View Post
AMD does madVR a little better because of how they handle OpenCL
Quote:
Originally Posted by renethx View Post
Not "a little better" but "twice better".
In a brief search I did just now, I see both the 750 Ti (MSI) and 260X (MSI) for about $100 AR (though the TechBargains link for the 750 Ti has an additional savings code).

Both 2GB.

How much difference is there between the 250X 2GB and 260X 2GB? (They seem to be priced about the same.)
post #5 of 9 Old 07-13-2014, 02:06 AM
renethx (AVS Club Gold)
GTX 750 Ti is equivalent to R7 260X (1971 GFLOPS) in pure shader unit performance (as seen in various gaming benchmarks). But because of poor OpenCL support, it's only equivalent to R7 250X (1280 GFLOPS) in madVR NNEDI3.

260X vs. 250X = 1971 vs. 1280.
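For reference, those GFLOPS figures follow from the usual spec-sheet formula (2 FLOPs per shader per clock, counting a fused multiply-add as two operations). A minimal sketch, assuming reference clocks of 1.1 GHz for the R7 260X (896 shaders) and 1.0 GHz for the R7 250X (640 shaders):

```python
# Peak single-precision throughput from GPU specs:
# GFLOPS = 2 (FLOPs per FMA) x shader units x clock in GHz.
def gflops(shaders, clock_ghz):
    return 2 * shaders * clock_ghz

r7_260x = gflops(896, 1.1)  # -> the ~1971 GFLOPS figure quoted above
r7_250x = gflops(640, 1.0)  # -> 1280 GFLOPS

print(f"R7 260X: {r7_260x:.0f} GFLOPS")  # R7 260X: 1971 GFLOPS
print(f"R7 250X: {r7_250x:.0f} GFLOPS")  # R7 250X: 1280 GFLOPS
print(f"ratio: {r7_260x / r7_250x:.2f}")  # ratio: 1.54
```

Note this is peak throughput only; as the post says, actual madVR NNEDI3 performance also depends on how good the OpenCL driver stack is.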
post #6 of 9 Old 07-14-2014, 06:48 AM
pylor (Member)
I don't think you'll be able to do NNEDI3 image doubling of 720p material with either of those (I could be wrong), which is the new "standard" for madVR, at least from what I've read/seen. I have a 270X and can only do 32-neuron image doubling for 720p material.
post #7 of 9 Old 07-14-2014, 07:31 AM
renethx (AVS Club Gold)
Rendering time per frame to upscale 1280x720 to 2560x1440 (x2) by NNEDI3 with 32 neurons, then downscale to 1920x1080 (x0.75) by CRARLL (BC75AR for chroma upscaling), is (according to my own tests):

- R7 250X: ~29ms
- R7 260X: ~19ms
- R7 270X: ~15ms (all with the 13.12 driver)
- GTX 750 Ti: ~28ms

All of them are comfortably shorter than 41.7ms = 1/(23.976Hz), but none is sufficiently shorter than 16.7ms = 1/(59.94Hz). In other words, every card above can upscale 720p24 to 1080p at 32 neurons without dropped frames, but every card will have to drop frames upscaling 720p60 to 1080p.

Comparing GTX 750 (512sp) with GTX 750 Ti (640sp), I can assure you the GTX 750 is fast enough to upscale 720p24 to 1080p at 32 neurons (~28ms x 640/512 = 35ms).

On the other hand, comparing R7 250 (384sp) with R7 250X (640sp), the R7 250 can't handle 720p24 at 32 neurons (~29ms x 640/384 = 48ms > 41.7ms). If you reduce the neurons to 16, then

- R7 250X: 23.3ms (again from my own test)
- R7 250: 38.8ms = 23.3ms x 640/384

so the R7 250 can just barely upscale 720p24 to 1080p at 16 neurons. Using the 13.12 driver is critical in this case.
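The arithmetic above can be sketched as follows: compare each card's render time against the per-frame budget at the video's frame rate, and extrapolate untested cards linearly by shader-unit count, as the post does. The measured times are from the post; the linear extrapolation is an assumption, not a benchmark.

```python
# madVR drops frames when the per-frame render time exceeds the
# time available per frame at the video's frame rate.
def frame_budget_ms(fps):
    return 1000.0 / fps

budget_24 = frame_budget_ms(23.976)  # ~41.7 ms for 24p material
budget_60 = frame_budget_ms(59.94)   # ~16.7 ms for 60p material

# Measured NNEDI3 32-neuron render times (ms) from the post above.
measured = {"R7 250X": 29, "R7 260X": 19, "R7 270X": 15, "GTX 750 Ti": 28}
for card, t in measured.items():
    print(f"{card}: {t}ms -> 24p {'ok' if t < budget_24 else 'drops'}")

# Extrapolate an untested card linearly by shader-unit count:
# R7 250 (384sp) from the R7 250X (640sp) at 16 neurons (23.3 ms measured).
r7_250_n16 = 23.3 * 640 / 384
print(f"R7 250 @ N16: ~{r7_250_n16:.1f}ms")  # ~38.8 ms, just under 41.7
```

Linear scaling by shader count is a rough upper-bound estimate; memory bandwidth and clock differences can move the real number either way.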

post #8 of 9 Old 07-14-2014, 07:52 AM
danbez (Member)
Quote:
Originally Posted by renethx View Post
Rendering time per frame to upscale 1280x720 to 2560x1440 (x2) by NNEDI3 neurons 32, then downscale to 1920x1080 (x0.75) by CRARLL (BC75AR for chroma upscaling) is (according to my own test) […] so that R7 250 can barely upscale 720p24 to 1080p by N16. Using 13.12 driver is critical in this case.
renethx's comments are my main source of education these days :-)

One follow-up question: do you see any difference between 16 and 32 neurons when watching movies? I have it set to 16 today (that's all my card can afford - a GTX 650), and I wonder if I should upgrade the GPU just to get more neurons (film only, 720p and DVD -> 1080p).
post #9 of 9 Old 07-14-2014, 08:09 AM
renethx (AVS Club Gold)
Generally I see fewer artifacts with 32 neurons than with 16 for DVD upscaling. It depends on the scene, of course. For 720p, the difference between 16 and 32 tends to be even smaller.