Cheapest videocard for MadVR? - Page 4 - AVS Forum
post #91 of 112 Old 09-14-2012, 07:02 AM
AVS Special Member
 
Sammy2's Avatar
 
Join Date: Mar 2011
Location: Right next to Wineville, CA
Posts: 9,754
Quote:
Originally Posted by madshi View Post

Just to let you guys know: It seems that NVidia has finally fixed the presentation glitch problem in their latest WHQL drivers:
http://www.geforce.com/drivers/results/48847#
It seems madVR with default settings now runs fine without any presentation glitches (except at the very start of the video) with these new NVidia drivers. Thanks a bunch to NVidia for fixing this!!!

I suppose I'll be installing that this weekend. Thanks for the heads-up. I'm wondering if this improves the 29/59 issue with LiveTV playback too?

Sammy2 is online now  
post #92 of 112 Old 09-14-2012, 07:29 AM
Advanced Member
 
Ruiner's Avatar
 
Join Date: Dec 2002
Posts: 527
Mentions of the GTX 650 at Tech Report with the 660 review, but no benches. It will have GDDR5 and run 10 bucks more than a 640. Pricey for an HTPC card, but it will probably be a decent gamer.

Any theories on i3-3225/HD4000 with new madVR?


edit: GTX 650 review at Tom's. Games a tad slower than the 7750. Prices around 110 to 120 at Newegg.
Ruiner is offline  
post #93 of 112 Old 09-14-2012, 01:37 PM
Senior Member
 
trooper11's Avatar
 
Join Date: Sep 2004
Posts: 442
Quote:
Originally Posted by Ruiner View Post

Mentions of the GTX 650 at Tech Report with the 660 review, but no benches. It will have GDDR5 and run 10 bucks more than a 640. Pricey for an HTPC card, but it will probably be a decent gamer.
Any theories on i3-3225/HD4000 with new madVR?
edit: GTX 650 review at Tom's. Games a tad slower than the 7750. Prices around 110 to 120 at Newegg.


+1 about the i3-3225/HD4000

I'm holding off on my build until I can get an idea of how well madVR works with that CPU. It also sounds like anything less than an i3 (anything Intel, that is) would not be sufficient to run madVR.

I had basically ignored the AMD line, but I don't have anything against using their CPUs, since my build isn't about high CPU performance.
trooper11 is offline  
post #94 of 112 Old 09-15-2012, 07:09 AM
AVS Addicted Member
 
Mfusick's Avatar
 
Join Date: Aug 2002
Location: Western MA
Posts: 22,355
Quote:
Originally Posted by trooper11 View Post

+1 about the i3-3225/HD4000
I'm holding off on my build until I can get an idea of how well madVR works with that CPU. It also sounds like anything less than an i3 (anything Intel, that is) would not be sufficient to run madVR.
I had basically ignored the AMD line, but I don't have anything against using their CPUs, since my build isn't about high CPU performance.

Anyone have luck with an i5-3570 and no graphics card?

-

"Too much is almost enough. Anything in life worth doing is worth overdoing. Moderation is for cowards."
Mfusick is online now  
post #95 of 112 Old 09-15-2012, 10:52 AM
Advanced Member
 
Ruiner's Avatar
 
Join Date: Dec 2002
Posts: 527
Quote:
Originally Posted by trooper11 View Post

+1 about the i3-3225/HD4000
I'm holding off on my build until I can get an idea of how well madVR works with that CPU. It also sounds like anything less than an i3 (anything Intel, that is) would not be sufficient to run madVR.
I had basically ignored the AMD line, but I don't have anything against using their CPUs, since my build isn't about high CPU performance.

From the X-bit labs review, it looks like the HD 4000 (for the i5 at least) sits between the 5530 and 5550 of the A6 and A8 respectively, or somewhere in GT 430 territory. LAV can use QuickSync. This should be enough for the current madVR, but what about the new build madshi is referencing? Not bad for 55W TDP total.
Ruiner is offline  
post #96 of 112 Old 10-26-2012, 01:16 PM
Member
 
RamGuy's Avatar
 
Join Date: Dec 2008
Posts: 65
I'm in the market for building an HTPC myself, aiming for optimum madVR + LAV using Windows 8 + Windows Media Center with Media Browser, running Media Player Classic Home Cinema with madVR + LAV.

I have pretty much nailed down every part of the build besides the case (plus PSU, which depends on the case) and the graphics card. If the Intel Core i7-3770T with its Intel HD 4000 is going to be sufficient, I will build everything into a compact Streacom FC8B EVO Sort case. If I have to go dedicated, and the Radeon HD 7750 isn't recommended while the GT 640's GDDR3 memory is too slow, that would mean going full profile instead of low profile, forcing me over to a bigger build with the Silverstone Sugo 08 case.


I don't mind having dedicated graphics; that would help with emulating old gaming consoles, e.g. with Project64, as its plugins tend to love NVIDIA graphics and pretty much hate everything else. But I hate being forced into a bigger case, and it would also result in a much higher TDP, more noise and whatnot. If the Intel HD 4000 manages to cut it even for the most intensive madVR + LAV workloads, that would be terrific.
RamGuy is offline  
post #97 of 112 Old 10-26-2012, 01:46 PM
AVS Special Member
 
madshi's Avatar
 
Join Date: May 2005
Posts: 5,443
From what I've heard the Intel HD 4000 does fine as long as you don't want to use Jinc upscaling. For Jinc upscaling it seems that depending on movie framerate and source and output resolution it might be too slow. Of course that's just today. I don't know how much performance future madVR algorithms will cost. There might be more or better algorithms coming in the future which use even more power than Jinc upscaling alone. So my recommendation has always been to get the fastest card that you can afford and which fits into the thermal requirements of your HTPC. I understand the attractiveness of not using a dedicated card. And you can use madVR that way. But you might have to compromise a bit on the scaling and maybe future post processing algorithms that way. That said, Jinc upscaling is pretty new. A couple of weeks ago the Intel HD 4000 could do everything madVR offered. So the question is how much you lose by not using Jinc. Only your eyes can decide that...
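madshi's advice above amounts to a frame-budget check: a card keeps up only if its per-frame rendering time (madVR's statistics OSD reports this) stays under the frame interval of the content. A minimal sketch in Python, with made-up render times for illustration:

```python
# Frame-budget check: a GPU keeps up with madVR only if its average
# rendering time per frame stays under the frame interval of the video.
# The 30 ms render time below is a made-up example; measure real ones
# with madVR's statistics OSD.

def frame_budget_ms(fps: float) -> float:
    """Milliseconds available to render one frame."""
    return 1000.0 / fps

def keeps_up(render_ms: float, fps: float, headroom: float = 0.9) -> bool:
    """True if rendering fits the budget with a 10% safety margin."""
    return render_ms <= frame_budget_ms(fps) * headroom

print(f"{frame_budget_ms(23.976):.1f} ms budget at 23.976 fps")  # ~41.7 ms
print(f"{frame_budget_ms(59.94):.1f} ms budget at 59.94 fps")    # ~16.7 ms

# A hypothetical card needing 30 ms/frame for Jinc upscaling:
print(keeps_up(30.0, 23.976))  # True  -- fine for 24p film
print(keeps_up(30.0, 59.94))   # False -- too slow for 60 fps video
```

The same arithmetic explains why Jinc can be fine for 24p movies on a given card yet fail on 60fps material.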
madshi is offline  
post #98 of 112 Old 10-26-2012, 10:10 PM
AVS Special Member
 
Tong Chia's Avatar
 
Join Date: Nov 2002
Posts: 1,140
Quote:
Originally Posted by RamGuy View Post

I'm in the market for building an HTPC myself, aiming for optimum madVR + LAV using Windows 8 + Windows Media Center with Media Browser, running Media Player Classic Home Cinema with madVR + LAV.
I have pretty much nailed down every part of the build besides the case (plus PSU, which depends on the case) and the graphics card. If the Intel Core i7-3770T with its Intel HD 4000 is going to be sufficient, I will build everything into a compact Streacom FC8B EVO Sort case. If I have to go dedicated, and the Radeon HD 7750 isn't recommended while the GT 640's GDDR3 memory is too slow, that would mean going full profile instead of low profile, forcing me over to a bigger build with the Silverstone Sugo 08 case.
I don't mind having dedicated graphics; that would help with emulating old gaming consoles, e.g. with Project64, as its plugins tend to love NVIDIA graphics and pretty much hate everything else. But I hate being forced into a bigger case, and it would also result in a much higher TDP, more noise and whatnot. If the Intel HD 4000 manages to cut it even for the most intensive madVR + LAV workloads, that would be terrific.

You would be better off with a dual core instead of a quad, as most of the heavy lifting is done by the GPU. The GPU is a very hot-running part relative to the CPU; at full load the GPU will overheat the CPU and cause it to slow down due to thermal throttling. Quad cores make the problem worse, with the additional cores adding to the heat load. They are on the same piece of silicon.

As an experiment, I ran madVR with 8-tap Jinc, the anti-ringing filter and SVP, sat back, and watched the CPU clock drop as it hit 90+ degrees.
At that point the fan in the case was like a vacuum cleaner, not too pleasant for an HTPC.

To make matters worse, Intel cheaped out on the CPU packaging: the heat spreader is no longer soldered to the CPU, they use cheap heat sink paste instead. This makes heat removal difficult, leading to overheating.

On Sandy Bridge the metal cover is soldered to the chip:
http://www.geek.com/articles/chips/ivy-bridge-chips-run-hot-due-to-intels-thermal-paste-choice-20120514/

Reports of overheating when the GPU is stressed. One poster reported 95+ degrees:
http://communities.intel.com/message/157887
Tong Chia is offline  
post #99 of 112 Old 10-26-2012, 10:26 PM
AVS Special Member
 
jdcrox's Avatar
 
Join Date: Jan 2003
Location: Cape Cod, MA
Posts: 2,005
Quote:
Originally Posted by Tong Chia View Post

You would be better off with a dual core instead of a quad, as most of the heavy lifting is done by the GPU. The GPU is a very hot-running part relative to the CPU; at full load the GPU will overheat the CPU and cause it to slow down due to thermal throttling. Quad cores make the problem worse, with the additional cores adding to the heat load. They are on the same piece of silicon.
As an experiment, I ran madVR with 8-tap Jinc, the anti-ringing filter and SVP, sat back, and watched the CPU clock drop as it hit 90+ degrees.
At that point the fan in the case was like a vacuum cleaner, not too pleasant for an HTPC.
To make matters worse, Intel cheaped out on the CPU packaging: the heat spreader is no longer soldered to the CPU, they use cheap heat sink paste instead. This makes heat removal difficult, leading to overheating.
On Sandy Bridge the metal cover is soldered to the chip:
http://www.geek.com/articles/chips/ivy-bridge-chips-run-hot-due-to-intels-thermal-paste-choice-20120514/
Reports of overheating when the GPU is stressed. One poster reported 95+ degrees:
http://communities.intel.com/message/157887
I would be interested to know how they soldered the metal heat spreader to a silicon chip. I wonder if the Google translation got the terminology right?
Removing the heat spreader and running direct to the chip used to be done often on AMD chips; it gives much better cooling than using a heat spreader. But you have to be careful, as it is very easy to chip the core itself.
jdcrox is online now  
post #100 of 112 Old 10-26-2012, 10:36 PM
AVS Special Member
 
Tong Chia's Avatar
 
Join Date: Nov 2002
Posts: 1,140
Quote:
Originally Posted by jdcrox View Post

I would be interested to know how they soldered the metal heat spreader to a silicon chip. I wonder if the Google translation got the terminology right?
Removing the heat spreader and running direct to the chip used to be done often on AMD chips; it gives much better cooling than using a heat spreader. But you have to be careful, as it is very easy to chip the core itself.

Fluxless indium-tin solder melts at about 157°C:
http://ieeexplore.ieee.org/xpl/login.jsp?tp=&arnumber=4810761&url=http%3A%2F%2Fieeexplore.ieee.org%2Fxpls%2Fabs_all.jsp%3Farnumber%3D4810761

Youtube video on how to pull the soldered top off an Intel CPU
http://youtu.be/eItYq6nxfJ4
http://youtu.be/JdPuneT30_o
Tong Chia is offline  
post #101 of 112 Old 10-27-2012, 01:09 AM
AVS Club Gold
 
renethx's Avatar
 
Join Date: Jan 2006
Posts: 16,082
Actually, IVB's TIM (thermal interface material) problem is no problem in most cases, including video playback with madVR, unless you overclock a Core i5/i7 "K" processor. Here are my own test results:

- Core i5-3570K / Intel HD Graphics 4000 (both @default clock)
- Scythe Shuriken with the stock fan (100mm 2200rpm PWM) @1300rpm (50%; close to silent); the cooler is very modest, Big Shuriken (for ML03/GD04/GD05) or Geminii S524 is much better.

Running Prime95 (CPU stress test):

The average core temperature reaches 65°C and never goes up, with CPU clock 3.6GHz, GPU clock 0.35GHz (no throttle down)

Running FurMark (GPU stress test):

The average core temperature reaches 45°C and never goes up, with CPU clock 2.1GHz, GPU clock 1.15GHz (no throttle down)

Running Prime95 and FurMark simultaneously:

The average core temperature reaches 75°C and never goes up, with CPU clock 3.6GHz, GPU clock 1.15GHz (no throttle down)

Remember that IVB's TJ Max (maximum operating temperature before throttling) is 105°C.

There is a huge discrepancy between "CPU" temperature (the temperature of the center of the heat spreader) and "Core" temperature; the former could be as much as 30°C lower than the latter, perhaps because of not so good TIM of IVB. When configuring SpeedFan, always trust only "Core" temperature.
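The "trust only Core temperature" point reduces to a simple margin check against TJ Max using the hottest core reading. A tiny sketch (the readings are hypothetical; the 105°C TJ Max is the Ivy Bridge figure quoted above):

```python
# Throttle-margin check: compare the hottest per-core ("Core") reading
# against TJ Max. The package "CPU" sensor can read up to ~30 C lower,
# so it is the wrong input for fan control.

TJ_MAX_C = 105  # Ivy Bridge maximum junction temperature

def throttle_margin(core_temps_c, tj_max_c=TJ_MAX_C):
    """Degrees C left before thermal throttling, based on the hottest core."""
    return tj_max_c - max(core_temps_c)

# Hypothetical readings under combined CPU + GPU stress:
print(throttle_margin([71, 75, 73, 74]))    # 30 -- still comfortable
print(throttle_margin([98, 101, 99, 100]))  # 4  -- about to throttle
```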
renethx is offline  
post #102 of 112 Old 10-27-2012, 01:26 AM
AVS Special Member
 
Tong Chia's Avatar
 
Join Date: Nov 2002
Posts: 1,140
Quote:
Originally Posted by renethx View Post

Actually, IVB's TIM (thermal interface material) problem is no problem in most cases, including video playback with madVR, unless you overclock a Core i5/i7 "K" processor. Here are my own test results:
- Core i5-3570K / Intel HD Graphics 4000 (both @default clock)
- Scythe Shuriken with the stock fan (100mm 2200rpm PWM) @1300rpm (50%; close to silent)
Running Prime95 (CPU stress test):
The average core temperature reaches 68°C and never goes up, with CPU clock 3.6GHz, GPU clock 3.5GHz (no throttle down)
Running FurMark (GPU stress test):
The average core temperature reaches 45°C and never goes up, with CPU clock 2.1GHz, GPU clock 1.15GHz (no throttle down)
Running Prime95 and FurMark simultaneously:
The average core temperature reaches 75°C and never goes up, with CPU clock 3.6GHz, GPU clock 1.15GHz (no throttle down)
Remember that IVB's TJ Max (maximum operating temperature before throttling) is 105°C.
There is a huge discrepancy between "CPU" temperature and "Core" temperature; the former could be as much as 30°C lower than the latter, perhaps because of the not-so-good TIM of IVB. When configuring SpeedFan, always trust only "Core" temperature.

Do you have any FurMark numbers for FurMark + Prime95 with the CPU @ 3.6GHz and GPU @ 3.5GHz?

Have you tried an i7 running at the same 3.5GHz speed (i7-3770K or similar)?
Tong Chia is offline  
post #103 of 112 Old 10-27-2012, 01:48 AM
AVS Club Gold
 
renethx's Avatar
 
Join Date: Jan 2006
Posts: 16,082
Quote:
Originally Posted by Tong Chia View Post

Do you have any Furmark Numbers for Furmark + Prime 95 with the CPU @ 3.6 GHz and GPU @ 3.5Ghz ?

Have you tried an i7 running at the same 3.5GHz speed (i7-3770K or similar)?
You mean the GPU overclocked @3.5GHz? I don't know.

I haven't tested an IVB i7, but basically the thermal envelope of the i7 is close to the i5 (only ~6W difference; see e.g. the X-bit labs test). The Core i7 is physically identical to the i5, just with additional logical cores enabled (Hyper-Threading).
renethx is offline  
post #104 of 112 Old 10-27-2012, 02:21 AM
AVS Special Member
 
Tong Chia's Avatar
 
Join Date: Nov 2002
Posts: 1,140
Quote:
Originally Posted by renethx View Post

You mean the GPU overclocked @3.5GHz? I don't know.
I haven't tested an IVB i7, but basically the thermal envelope of the i7 is close to the i5 (only ~6W difference; see e.g. the X-bit labs test). The Core i7 is physically identical to the i5, just with additional logical cores enabled (Hyper-Threading).

The max non-overclocked frequency of the HD 4000 is 1.2GHz; was the 3.5GHz a typo? I thought you had overclocked yours.

I did my test on the i7-3770K, retail Intel heatsink, no overclock. I started at about 80-85 degrees and moved north, probably because of the cramped HTPC case.
Given the TDP number you mentioned, that is not too far off at the start (75 + 6 = 81).

I also noticed a larger-than-expected difference between core and case temps, and that is how I started digging and found out about the TIM being used instead of solder.

I am also running OpenCL as part of SVP; the GPU has to do more shader work than fits in cache, so it hits the cache hard, with no downtime to save power. It looks like OpenCL is efficient.


The other difference is DXVA copy-back for LAV. I'm not sure if it made a difference, but the CPU/memory interface gets pretty busy from all the copying, so the DRAM line drivers never really shut down on the chip.

Did your Prime95 test include the large-memory tests (Blend mode)?
Tong Chia is offline  
post #105 of 112 Old 10-27-2012, 03:03 AM
AVS Club Gold
 
renethx's Avatar
 
Join Date: Jan 2006
Posts: 16,082
I see, it's a typo: 3.5GHz -> 0.35GHz (the default idle clock).

IMO using the stock cooler for i5/i7 (except for "T" versions) is a horrendous idea. It's loud at load (precisely speaking, it has to run at the full speed, hence is very loud, to keep the temperature below TJ Max) and even then you may see CPU throttling. Try Big Shuriken 2 (for a low-profile case) or Cooler Master GeminII S524; the difference is like night and day.

Prime95 Blend is not so CPU-intensive. The max core temperature at Prime95 Blend + FurMark is only 70°C.
renethx is offline  
post #106 of 112 Old 10-27-2012, 12:12 PM
AVS Special Member
 
Tong Chia's Avatar
 
Join Date: Nov 2002
Posts: 1,140
Quote:
Originally Posted by renethx View Post

I see, it's a typo: 3.5GHz -> 0.35GHz (the default idle clock).
IMO using the stock cooler for i5/i7 (except for "T" versions) is a horrendous idea. It's loud at load (precisely speaking, it has to run at the full speed, hence is very loud, to keep the temperature below TJ Max) and even then you may see CPU throttling. Try Big Shuriken 2 (for a low-profile case) or Cooler Master GeminII S524; the difference is like night and day.
Prime95 Blend is not so CPU-intensive. The max core temperature at Prime95 Blend + FurMark is only 70°C.

The Big Shuriken 2 looks nice, I will take a look, thanks.

With the current madVR, using the Jinc scaler and anti-ringing filter at 1080p, the HD 4000 drops too many frames to be watchable (60fps -> 45fps), and that is without SVP running.

On HTPCs I keep things stock; turning off the HD 4000 and using an external GPU made most of the issues, like the loud fan noise, go away.
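The 60fps to 45fps drop can be read as a render-time problem: at a steady 45fps the GPU is effectively spending about 22ms per frame against a 16.7ms budget. A back-of-envelope sketch (assumes a constant render time, which real playback won't quite have):

```python
# Inferring per-frame cost from an achieved framerate, and the resulting
# drop rate. Purely illustrative arithmetic for the 60 -> 45 fps case.

def implied_render_ms(achieved_fps: float) -> float:
    """Effective render time per frame if the GPU tops out at this rate."""
    return 1000.0 / achieved_fps

def drops_per_second(source_fps: float, achieved_fps: float) -> float:
    """Frames that cannot be presented each second."""
    return source_fps - achieved_fps

print(f"{implied_render_ms(45):.1f} ms/frame vs a {1000/60:.1f} ms budget")
print(drops_per_second(60, 45))  # 15 frames dropped every second
```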
Tong Chia is offline  
post #107 of 112 Old 10-28-2012, 03:41 AM
AVS Special Member
 
Mike99's Avatar
 
Join Date: Sep 2004
Location: Illinois
Posts: 2,987
Quote:
Originally Posted by renethx View Post

There is a huge discrepancy between "CPU" temperature (the temperature of the center of the heat spreader) and "Core" temperature; the former could be as much as 30°C lower than the latter, perhaps because of not so good TIM of IVB. When configuring SpeedFan, always trust only "Core" temperature.

I’m looking at my AMD desktop temperatures using HWMonitor.

Under Temperatures > CPU it’s 36 degrees C.
Under Temperatures > Mainboard it's 33 C.

Under AMD Athlon all the cores are 20 C.

My CPU temperature is higher than the core temperatures. Am I looking at things wrong or is HWMonitor not to be trusted?
Mike99 is offline  
post #108 of 112 Old 10-30-2012, 02:26 PM
Member
 
RamGuy's Avatar
 
Join Date: Dec 2008
Posts: 65
How do AMD and NVIDIA compare in terms of madVR quality and performance? I've read that you get some nice advantages using NVIDIA due to LAV deinterlacing through CUDA, or something along those lines (?)

The reason I'm asking is that you can grab a Radeon HD 7750 with GDDR5 in low profile, while there doesn't seem to be anything low profile from NVIDIA at the same performance level: the GT 640 is only GDDR3 and should be slower, and I can't find any low-profile GTX 650 or GTX 650 Ti.


Would you choose the Radeon HD 7750 1GB GDDR5 low profile or the GeForce GT 640 1GB GDDR3 low profile for optimum performance and quality?
RamGuy is offline  
post #109 of 112 Old 10-30-2012, 11:52 PM
AVS Club Gold
 
renethx's Avatar
 
Join Date: Jan 2006
Posts: 16,082
I wrote a summary here.

HD 7750 is somewhat moot because it can't handle SD video (not film) with Jinc 3/AR. Basically HD 7750 = Trinity A8/A10 in terms of madVR.

madVR uses the deinterlacing algorithm in the GPU's driver. If you select the NVIDIA CUVID decoder in LAV Video Decoder (NVIDIA GPUs only, of course), then you can do deinterlacing first at the decoding stage, or postpone it until madVR; in either case, the same algorithm in the driver is used and the quality is identical.
renethx is offline  
post #110 of 112 Old 11-01-2012, 12:01 AM
AVS Club Gold
 
renethx's Avatar
 
Join Date: Jan 2006
Posts: 16,082
I found a way out - overclocking! The HD 7750 with a 900MHz core clock (up from the default 800MHz) can handle all video formats just fine under madVR chroma M-N / luma Jinc3+AR. 950MHz is better.

There are several overclocking utilities: AMD OverDrive in the AMD Vision Engine Control Center (up to 900MHz), Sapphire TriXX (up to 1200MHz), MSI Afterburner (up to 900MHz), etc. The latter two also support custom smart fan control.

To overclock the core at video playback, hardware decode acceleration must be turned off (otherwise the "high performance" mode won't kick in). In LAV Video Decoder settings, select

- None, or
- Intel QuickSync (for Intel SNB/IVB users)

in "Hardware Decoder to use".
renethx is offline  
post #111 of 112 Old 11-01-2012, 01:13 AM
Member
 
RamGuy's Avatar
 
Join Date: Dec 2008
Posts: 65
So basically it makes much more sense to go with an AMD Trinity build instead of dealing with a dedicated HD 7750? The only problem I've had with AMD Trinity here in Norway is the complete lack of Mini-ITX boards, forcing me to go Micro-ATX and making the build much larger in the first place. So I have to choose between a smaller but thicker Intel Ivy Bridge + HD 7750 / GT 630 GDDR5 / GT 640 GDDR3 low-profile build, or a larger but thinner AMD Trinity build.


Or I could simply go flat out and get something like this:

- Asus P8Z77-I Deluxe
- Intel Core i7-3770T
- 8GB (2x 4GB) 1600MHz @ CL7 DDR3
- Gigabyte GeForce GTX 660 Ti 3GB GDDR5 "WindForce 3"
- Corsair H70 with SP120 Quiet Fans
- Seasonic X-400 Passive 400W PSU
- Lian Li PC-Q08 Mini-ITX case


That should do the trick? The only question is whether the added size, power consumption, noise and heat are all worth it, considering my library consists primarily of:

- 720p / 1080p Blu-ray rips in H.264 / VC-1 / MPEG-2 @ 23.976 fps @ High Profile 4.1 @ Dolby TrueHD / DTS-HD Master Audio @ ≈ 20000 - 35000 kbps
- 720p / 1080i / 1080p TV-series rips in H.264 / MPEG-4 / MPEG-2 @ 23.976 fps @ High Profile 4.1 @ Dolby Digital / DTS @ ≈ 2500 - 5000 kbps


And some very few:

- 480i / 704x400 (i.e. SD) TV-series rips (both PAL and NTSC) in H.264 / MPEG-4 / MPEG-2 / XviD / DivX / AVC @ 23.976 fps @ Stereo / Dolby Digital / DTS @ ≈ 750 - 1500 kbps
- 480i / 704x400 DVD rips (both PAL and NTSC) in H.264 / MPEG-4 / MPEG-2 / XviD / DivX / AVC @ 23.976 fps @ Stereo / Dolby Digital / DTS @ ≈ 750 - 1500 kbps
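As a sanity check on the bitrates listed above: average bitrate times runtime gives the approximate file size. A quick sketch (the runtimes are assumptions, not from the post):

```python
# Approximate file size from average bitrate: size = kbps * seconds / 8.
# The bitrates come from the list above; the runtimes are assumed.

def size_gib(avg_kbps: float, hours: float) -> float:
    """Approximate size in GiB for a stream at the given average bitrate."""
    bits = avg_kbps * 1000 * hours * 3600
    return bits / 8 / (1024 ** 3)

print(f"{size_gib(30000, 2.0):.1f} GiB")  # ~25 GiB: a 2 h Blu-ray rip at 30 Mbps
print(f"{size_gib(4000, 0.75):.1f} GiB")  # ~1.3 GiB: a 45 min episode at 4 Mbps
```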
RamGuy is offline  
post #112 of 112 Old 01-04-2014, 02:43 PM
Newbie
 
DJ-1's Avatar
 
Join Date: Jul 2008
Posts: 3
Hi, currently I'm thinking about what to do about an HTPC in the near future... I want to be able to use madVR with old DVD / SD content, old SD TV shows, etc.

I want to go for Haswell (due to its almost perfect 23.976 reproduction) and tie that up with a CUDA-capable GPU.

Is an i5 the minimum?... Or is most stuff offloaded to the GPU once madVR is set up?
When looking for a CPU for use with madVR, should I be looking for a high-clock-speed CPU, or would a lower clock + a better iGPU be better?


My current HTPC is an i3-2105 + ASRock Z77E-ITX, SSD + OpenELEC, but I suspect a dead mobo...

Cheers.
DJ-1 is online now  