Guide: Building a 4K HTPC for madVR - Page 2
post #31 of 499 - 10-14-2016, 10:22 AM - Onkyoman (Thread Starter, Senior Member)
Quote:
Originally Posted by OzHDHT View Post
You give this advice again, as this thread's creator, yet we still don't know what display you base your comments on. Both the poster and I have JVC RS-series projectors with 4K e-shift, for example. Can you clarify this for us? It makes a much bigger difference whether someone is giving advice based on owning a 65" 4K panel or, say, an 11.5 ft CIH setup with a 4K projector. If, as billqs suggests, you are indeed Warner who wrote the Kodi guide, then I'm a bit perplexed as to why you'd write off NNEDI3 here and not modify the guide to reflect your thoughts more accurately - so perhaps you aren't? Also, I'm wondering whether @madshi himself endorses the setup guide on the Kodi forums? I've also not heard him state that NNEDI3 is a waste of time in any of his posts in the projector threads I'm involved in, btw.
For me, with the new JVC Z1/4500 on the way with its full 4K panel, I'll definitely be doing some even more critical testing and tweaking with madVR, which I look forward to.
I'm not saying NNEDI3 is a waste of time. I'm saying it is overkill with chroma upscaling and you aren't likely to notice the difference.

For 1080p -> 4K upscaling, image doubling with SuperRes will likely produce the best result. This image doubling could come in the form of super-xbr or NNEDI3. NNEDI3 is the better of the two, but the difference will be small.

If you have unlimited processing resources, set everything to NNEDI3. If not, don't feel like you are missing out on image quality by using lesser settings.
post #32 of 499 - 10-14-2016, 10:27 AM - Onkyoman (Thread Starter, Senior Member)
Quote:
Originally Posted by billqs View Post
Thanks for your help. I went with your settings from the Kodi guide for a high-end card. I did lower chroma upscaling to super-xbr, but with NNEDI3 64 for both Image Doubling and Image Quadrupling plus SuperRes 3 I was getting around 55 ms render times. I had to drop down to NNEDI3 32, and I think I also had to lower Image Quadrupling to a lighter NNEDI3 setting. I followed the guide's advice and moved image downscaling from Jinc to Bicubic 150, and I finally got the render time dependably around 30-34 ms and quit dropping frames. I disabled chroma doubling as the Kodi guide said to.

I ended up with a very detailed but very overcooked-looking picture. Maybe SuperRes on both doubling and quadrupling was too much? I'm sure I can continue to work with the settings to improve it.
Image doubling is more important than image quadrupling. I'd dump image quadrupling and stick with image doubling.

Your eyes are the best guide to what looks good. Try lowering the SuperRes value to 1 or 2.
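As a quick sanity check on those numbers (simple arithmetic, not anything from the guide): madVR's average render time has to stay under the source's frame interval or frames start dropping, which is why 55 ms couldn't keep up while 30-34 ms could. A minimal sketch, assuming 23.976 fps material on a matched refresh rate:

Code:
# Rough frame-time budget: madVR's render time per frame must stay below
# the frame interval of the content, or frames get dropped/repeated.
def frame_budget_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (23.976, 29.97, 59.94):
    print(f"{fps:>6} fps -> {frame_budget_ms(fps):.1f} ms per frame")

# 23.976 fps allows ~41.7 ms per frame, so 55 ms render times drop frames,
# while 30-34 ms leaves comfortable headroom.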
post #33 of 499 - 10-14-2016, 10:40 AM - MisterXDTV (Member)
I have a question, guys: I'm looking to upgrade my graphics card for HEVC 10-bit hardware decoding.

Right now I have a 1080p monitor/TV, but let's say I will have a 4K screen in the future. Is it true that 2GB of VRAM is not enough for true 4K 10-bit playback? I read this somewhere.

Do I need 4GB of VRAM?

I'm waiting for the GTX 1050 (Ti) and I want to know if more VRAM makes a difference.
post #34 of 499 - 10-14-2016, 12:53 PM - billqs (Advanced Member)
One thing I'm doing that may be a mistake: I haven't gotten around to setting up all the different profiles; I've just configured the best settings my card can bear for each option in madVR. So, under Image Doubling, I have both Image Doubling and Image Quadrupling checked - Image Doubling set to activate at 2x or greater and Image Quadrupling at 3x or greater. Is this correct?

My understanding is that with a 1080p source, the Image Doubling logic kicks in and doubles the image to my 4K display. I assumed that with a 720p source, Image Quadrupling would be selected under the 3x-or-higher rule. But technically, isn't a 720p source also 2x or greater, which would trigger Image Doubling as well?

That might explain some of the reason the card seems to be slightly under-delivering.
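The scale factors behind that question are easy to check with a little arithmetic. Assuming madVR's checkboxes compare against the luma scaling factor (a sketch only, not a statement of madVR's internal logic; the reply below covers which options to keep enabled), a 720p source satisfies both the 2x and the 3x conditions, while 1080p only satisfies the 2x one:

Code:
# Luma scaling factor from common source heights to a 2160p display.
target = 2160
for src in (720, 1080):
    factor = target / src
    print(f"{src}p -> {target}p: {factor:.2f}x  "
          f"(>=2x: {factor >= 2}, >=3x: {factor >= 3})")

# 1080p -> 2.00x (doubling threshold only); 720p -> 3.00x (meets both thresholds).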

JVC RS500, Denon 7200WA, 7.2.4 Atmos/DTSX dedicated Theater. 133" Dalite 1.3 screen. M&K S150 + K7 ear level, 4 Tannoy DC overheads.
post #35 of 499 - 10-14-2016, 01:13 PM - Onkyoman (Thread Starter, Senior Member)
Quote:
Originally Posted by billqs View Post
Make a profile for each resolution and possibly different source fps. This is covered in the guide.

2160p: NNEDI3 chroma upscaling with sharpen edges + AR + AB (image enhancements)

720p/1080p: super-xbr chroma upscaling with NNEDI3 luma doubling and SuperRes 1-3 (upscaling refinement).

It sounds like you need to experiment with SuperRes if the image is too harsh/detailed.
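For readers who haven't used them, madVR switches between such profiles automatically via profile rules typed into the profile group in the settings dialog. The sketch below is from memory and only illustrative - the exact variable names and syntax are listed in madVR's own rule editor, and the profile names must match profiles you have actually created:

Code:
if (srcHeight <= 576) "SD"
else if ((srcHeight <= 720) and (fps > 30)) "720p60"
else if (srcHeight <= 720) "720p"
else if ((srcHeight <= 1080) and (fps > 30)) "1080p60"
else if (srcHeight <= 1080) "1080p"
else "2160p"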
post #36 of 499 - 10-14-2016, 01:23 PM - Onkyoman (Thread Starter, Senior Member)
Quote:
Originally Posted by MisterXDTV View Post
I believe 2GB is enough VRAM because of the low frame rates involved (24-30 fps). I would go with 4GB in case high-frame-rate content (50-60 fps) becomes available.
post #37 of 499 - 10-14-2016, 01:42 PM - MisterXDTV (Member)
Quote:
Originally Posted by Onkyoman View Post
I am under the belief 2GB is enough VRAM because of the low fps (24-30). I would go with 4GB in the event high fps content becomes available (50-60 fps).
That was my idea too, but you never know - maybe it's worth the small difference in price to get a 4GB card. Of course I would have preferred to save the money, as I'm not getting the card for gaming.

The non-Ti version is more than enough in power for me, but I don't know about the VRAM.
post #38 of 499 - 10-14-2016, 05:27 PM - LexInVA (AVS Forum Special Member)
VRAM has little bearing on video decoding these days, because decoding is usually done by the dedicated video decoder ASIC rather than by CUDA/shader decoding like in the old days; Nvidia killed that off several years ago. Encoding/transcoding, on the other hand, will see performance increases with more VRAM, and madVR is software that re-renders video frames to your liking, so it too does better with larger amounts of memory. IMO, there is no reason not to buy the upcoming GTX 1050 Ti, unless you simply cannot afford it or it won't fit in your case. I am waiting for the compact, single-slot version that was pictured in the leaks, yet so far we only see longer, two-slot cards with twin cooling for overclocking enthusiasts, so I am not sure when we'll get what was leaked.
post #39 of 499 - 10-15-2016, 03:07 AM - OzHDHT (AVS Forum Special Member)
Quote:
Originally Posted by Onkyoman View Post
I'm not saying NNEDI3 is a waste of time. I'm saying it is overkill with chroma upscaling and you aren't likely to notice the difference.

For 1080p -> 4K upscaling, image doubling with SuperRes will likely produce the best result. This image doubling could come in the form of super-xbr or NNEDI3. NNEDI3 is the better of the two, but the difference will be small.

If you have unlimited processing resources, set everything to NNEDI3. If not, don't feel like you are missing out on image quality by using lesser settings.
I need to once again reiterate that if I'm sitting, say, just 11 feet from my 11.5-foot-wide screen, I'm going to have a much better chance of perceiving differences between NNEDI3 and super-xbr for chroma upscaling than I ever would in my living room on my 65" Z9 panel - which is why I don't even use the spare Nvidia-equipped Gigabyte Brix mini PC (left over from another location) there and just use Plex. The large-format 4K projector screen viewing scenario is of major relevance to myself and billqs, and it's why we are both pushing our 1070s to the highest possible settings. I've watched madVR evolve now from close to its beginnings and have seen the steady increases in PQ that the new processing techniques have brought along the way.
post #40 of 499 - 10-28-2016, 06:38 AM - baniels (AVS Forum Special Member)
Noticed something recently and would like some input.

Full HTPC in my signature, but in summary: Win10 Pro, Jriver, i5-4670k, GTX-1060 SC Gaming (6GB). I have a Vizio M70-d3, and I'm outputting video directly from the GPU (while audio goes from the iGPU to my old receiver).

I set up madVR using the Kodi forum guide linked in @Onkyoman's signature and the OP. I used the part of the guide several posts down, starting with "Let's repeat this process, this time assuming the display resolution is 3840 x 2160p (4K UHD). Two graphics cards will be used for reference. A Medium-level card such as the GTX 960, and a High-level card similar to a GTX 1080."

For the most part I used the GTX 960 settings. At this point I can't remember if I made any substantial changes toward the 1080 settings, but I think in an effort to solve this issue I am pretty much back at the 960 settings.

On certain content types, namely TV shows that are 30 (or 29.97) fps, whether 1080p or 720p, I get dropped frames constantly - at least a few per second. Using a little Android remote system monitor, I could see that while viewing this content the GPU core was running at 96-99%. For 1080p24 content it hovered in the range of 85-92%. Clearly I'm taxing the GPU.

I tried every manner of knocking back settings with no luck, until last night. I unchecked SuperRes from upscaling refinement. Boom - problem went away, GPU usage % dropped to mid 80's and all was good.

Previously I had only basic profiles in madVR: SD, 720p, 1080p, 2160p. I created another profile, as the guide suggests, for 1080p60, and one for 720p60, and unchecked SuperRes for those.

My questions are...
  1. Does this make sense?
  2. Should my 1060 have this sort of issue with settings proposed for a 960?
  3. Am I better off sacrificing something other than SuperRes to bring the processing time down?


Here is some MediaInfo data for one of the files in question:

Spoiler!


Thanks!
post #41 of 499 - 10-28-2016, 10:56 AM - Onkyoman (Thread Starter, Senior Member)
Quote:
Originally Posted by baniels View Post
Those settings are guesses, not hard, tested numbers. It may be that they are too aggressive for some content (> 24 fps), as 24 fps is the example frame rate in the guide. It's news to me that a 1060 would struggle with these settings.

Try setting the 30 fps profiles to ordered dithering and Bicubic chroma upscaling. That might make room for SuperRes.
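A back-of-envelope way to see why 30 fps material is the tipping point (simple arithmetic, assuming the display refresh is matched to the source): at 23.976 fps the render budget is 1000 / 23.976 ≈ 41.7 ms per frame, while at 29.97 fps it is only 1000 / 29.97 ≈ 33.4 ms - roughly 20% less headroom for exactly the same settings, which is why SuperRes fits at 24p but pushes the render time past the budget at 30p.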
post #42 of 499 - 10-30-2016, 09:56 AM - tecram3 (Newbie)
New graphics card for madVR

Hello.

Recently I had to replace my HTPC due to a broken CPU. In the past, I struggled with my fanless AMD HD 6570 whenever I switched to madVR.

My new config is an i5 6400, a Z170 micro-ATX mobo, 8GB of DDR4 RAM, a 64GB SSD and an HDR4400 sat card. The motherboard's HDMI port is connected to a 1080p Pioneer plasma. I'm considering buying a 4K UHD OLED in the future.

I like SVP (even at 120Hz output), and with the new config I can play 4K content on the i5 6400; media is smooth with an appropriate SVP config. The problem is when I switch to madVR again - the Intel HD 530 iGPU is not enough.

I would like to buy a new card I can trust with madVR. My doubt is AMD or Nvidia. I'm considering the RX 470, GTX 1050 Ti or GTX 1060 (3GB or 6GB?). I've always had AMD configs, but now with Intel I'm considering Nvidia (I never used CUDA features with madVR). I have some doubts:

1) How accurate is the 23.976 refresh rate on Nvidia? I have read about some refresh-rate problems with Nvidia. The Intel iGPU is fine on that point, and AMD gets good opinions there too. Is it really important if I use SVP? Or is it also a problem at 60Hz?

2) I currently have a 1080p panel; if I play 4K content, does madVR downscale it to 1080p? In the future I'm considering buying a 4K panel - which card is more appropriate for that?

3) Would I have enough power with the i5 6400 and one of these cards to play with SVP + madVR?

Thanks.
post #43 of 499 - 10-30-2016, 10:57 AM - Onkyoman (Thread Starter, Senior Member)
Quote:
Originally Posted by tecram3 View Post
1) 23.976 is not that accurate with Nvidia cards; I get one frame drop/repeat every three and a half minutes. SVP doesn't matter in this case because it is outputting at 60 Hz, not 24p.

2) 1080p panels don't require a high-end GPU; I am using a GTX 750 Ti with success. However, downscaling 4K to 1080p requires a more powerful card if you want aggressive settings. Given the lack of 4K media, this should not matter much today, but you should invest in a card that can hardware-decode HEVC for the future.

3) SVP + madVR at 1080p would be easy with anything above a GTX 750 Ti. Again, to be future-proof, I would invest in a new card with HEVC decoding.
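That one-drop-every-few-minutes figure follows from a small clock mismatch between the source frame rate and the actual display refresh. A minimal sketch of the arithmetic - the refresh value used here is purely illustrative, not a measured number:

Code:
# If the display really refreshes at r Hz while the source runs at f fps,
# a frame must be dropped or repeated roughly every 1 / |r - f| seconds.
f = 23.976            # source frame rate
r = 23.9808           # hypothetical "23.976 Hz" mode as actually driven
interval_s = 1.0 / abs(r - f)
print(f"one drop/repeat every ~{interval_s:.0f} s (~{interval_s / 60:.1f} min)")

# With these example numbers: every ~208 s, i.e. about 3.5 minutes.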
post #44 of 499 - 10-30-2016, 11:15 AM - tecram3 (Newbie)
Quote:
Originally Posted by Onkyoman View Post
1) 23.976 is not that accurate with Nvidia cards. I get one frame drop/repeat every three and a half minutes. SVP doesn't matter in this case because it is outputting at 60 Hz not 24p.
Would you recommend an AMD card for that aspect? Is there any reason to prefer an Nvidia card even so?

Quote:
Originally Posted by Onkyoman View Post
3) SVP + madVR at 1080p would be easy with anything above a GTX 750 Ti. Again, to be future-proof, I would invest in a new card with HEVC decoding.
Am I right that the GTX 1060 6GB is more powerful than the GTX 1060 3GB, which is more powerful than the RX 470, and then the GTX 1050 Ti? Is there any special feature that makes Nvidia superior to AMD (in madVR terms), or vice versa?
post #45 of 499 - 10-31-2016, 11:10 AM - Onkyoman (Thread Starter, Senior Member)
Quote:
Originally Posted by tecram3 View Post
I don't know anything about AMD. Nor do I know which GPU is the most powerful. You'll have to look at reviews and benchmarks to determine this.
post #46 of 499 - 10-31-2016, 06:28 PM - JeffR1 (AVS Forum Special Member)
Quote:
Originally Posted by tecram3 View Post
Basically, the more cores the better - more cores means more processing power to run madVR.
The RX 470 has 2048, for example.
https://www.amd.com/en-gb/products/g.../radeon-rx-470

While the GTX 1060 (3G) (three gigabytes of memory) has 1152 cores.
http://www.geforce.com/hardware/10se...force-gtx-1060.

To my knowledge there isn't a card yet that will run madVR at max settings.
Also know that the more cores there are, the more expensive the cards get, generally speaking.

madVR doesn't care whether you use AMD or NVIDIA.

Chroma upscaling is the one that eats up the cards, and the more resolution you have, the more pixels there are to deal with.

I have a now-older GTX 980 upscaling to 4K, running madVR and DmitriRender.
It's running at around 75 to 80% of its capacity - it has 2048 cores, to give you some idea of a reference number.

post #47 of 499 - 11-03-2016, 11:16 AM - gorman42 (Advanced Member)
On Doom9 a guy has compiled a spreadsheet comparing madVR run at different levels with different cards. The indicator to look for is TFLOPS, not cores (core performance depends on the overall architecture).
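To make that concrete: single-precision throughput is roughly 2 x shader cores x clock speed, so a card with fewer cores can still come out ahead on clock. A rough sketch using approximate reference boost clocks (ballpark numbers, not measurements - check reviews for your exact card):

Code:
# FP32 TFLOPS ~= 2 (FMA) * shader cores * clock in GHz / 1000
cards = {
    "RX 470":       (2048, 1.21),   # cores, approx. reference boost clock (GHz)
    "GTX 1060 3GB": (1152, 1.71),
    "GTX 1060 6GB": (1280, 1.71),
    "GTX 1070":     (1920, 1.68),
}
for name, (cores, ghz) in cards.items():
    print(f"{name:<13} ~{2 * cores * ghz / 1000:.1f} TFLOPS")

Raw TFLOPS still isn't the whole story across vendors, though - the RX 480 vs GTX 1070 comparisons mentioned later in this thread are a case in point.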

Seriously. AVS is a place where you go to learn to be unhappy. - Bear5k
post #48 of 499 - 11-11-2016, 11:39 AM - RigorousXChris (Senior Member)
I currently have a computer with an i7 and GTX 980 Ti SLI, and wanted to get people's opinions on what they believe looks better.
I have a 120'' screen and a 4K e-shift JVC projector (X750).

Would a 1080p Blu-ray upscaled to 4K using madVR look better than the UHD 4K Blu-ray version?

JVC RS500 | Sony HW40ES | Pioneer SC-95 | Panamax MB1500
Klipsch RP-280FA | RP-450CA | RP-260F | RP-250S | RP-140SA | R-115SW x2
4k MadVR HTPC | Panasonic DMP-UB900 | PS4 Pro | DirecTV Genie
post #49 of 499 - 11-11-2016, 03:09 PM - dwaleke (AVS Forum Special Member)
Quote:
Originally Posted by RigorousXChris View Post
Most UHD 4K Blu-rays are 2K upscales.

However, they use a wider color gamut (Rec. 2020) and HDR.

So it'll depend on the movie. But HDR adds more than the resolution bump, IMO.

Real 4K material would look better.

Movies with poor HDR and colors are not much better than the 1080p Blu-ray, so madVR-upscaled 1080p wins there.

madVR for both will be great once the copy protection is cracked and we can play the discs through the PC.
post #50 of 499 - 11-17-2016, 06:58 AM - tecram3 (Newbie)
Quote:
Originally Posted by JeffR1 View Post
Basically the more cores the better and the more processing power there is to run Madvr.
The rx470 has 2048 cores

While the GTX1060 (3G) (Three gigabytes of memory) has 1152 cores.

According to that, the RX 470 (2048 cores) outperforms the GTX 1070 (1920 cores) with madVR? And the RX 480 (2304 cores) is near the GTX 1080 (2560 cores)? Always speaking in terms of madVR performance...

If AMD outputs 23.976 more exactly and has better performance for the money, is it a better bet?

gorman42: do you remember the link? I can't find it.
post #51 of 499 - 11-17-2016, 08:25 AM - sonichart (Advanced Member)
Jumping into the fray here. I'm looking at picking up a GTX 1080 card, but there are a ton of different flavors, manufacturers, clock speeds, etc. It's a bit overwhelming.

Is there one in particular that will do best with madVR and NNEDI3 (or the new NGU that madshi is developing)? I plan on posting over at Doom9, but there's a 5-day wait for new users to post.

I have been eyeing this one in particular:

https://www.amazon.com/dp/B01K5F8MJK...LCPXKZWH&psc=1

JVC RS600 ¤ 130" Wide Seymour XD 2.35:1
HTPC GTX1080 (MadVr) ¤ Zidoo X8 ¤ Panny UB-900 ¤ HDFury Linker
Denon x4200 ¤ Onkyo M5010 ¤ iNuke 3000 ¤ 7.2.2
post #52 of 499 - 11-17-2016, 09:31 AM - dwaleke (AVS Forum Special Member)
Quote:
Originally Posted by tecram3 View Post
Dollar for dollar (cost and power usage), I don't think there is a single AMD card that can perform as well as its Nvidia counterpart for madVR right now.

If you look at the madVR thread at Doom9, people were posting comparisons of the RX 480 and GTX 1070. The 1070 outperformed it, and did so without generating as much heat and noise. Things may have changed since I last looked, so go over there and see what's current.
post #53 of 499 - 11-17-2016, 10:45 AM - JeffR1 (AVS Forum Special Member)
Quote:
Originally Posted by tecram3 View Post
Someone corrected me on this too, saying that the more teraflops a card has the better, for madVR.
Teraflops are the number of floating-point calculations per second, and the way I see it, the more cores there are, the more calculations per second can be carried out.
Prices also increase with how much video memory is on the card and how fast the cores run.
The real increase in cost is how many cores it has.

Someone please correct me if I'm wrong about this: when it comes to madVR, video memory and core speed are not as important as how many cores there are.

Not that this has anything to do with madVR, but when it comes to frame interpolation (DmitriRender) the more cores the better.
I had a GTX 760 being pushed to its limits (running at 90 degrees) and now my GTX 980 runs at around 65 to 70 degrees.
The amount of memory and the clock speed hadn't changed that much, but the number of cores increased.

I can't say whether one card performs better when it comes to AMD vs NVIDIA, but AMD has always had lower costs overall.
post #54 of 499 - 11-17-2016, 10:49 AM - mightyhuhn (AVS Forum Special Member)
You need 4GB of VRAM to display UHD; frame rate has nothing to do with it.
With 2GB and a GPU queue of 4 you can still use madVR, but deinterlacing doesn't work in that configuration and other settings may break down too.

I have an RX 480 and I'm pretty sure it can't beat the 1070 in madVR.
Nvidia currently has the better-working hardware decoder, but an RX 480 4GB is cheaper than a 6GB 1060, and the 1060 3GB can get into VRAM trouble at UHD.
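A back-of-envelope look at where that VRAM goes (a rough sketch only, not madVR's actual allocation - real buffer formats and queue sizes vary): madVR keeps decode, upload, render and present queues plus scaler buffers around, and at UHD each high-bit-depth frame is already tens of megabytes, so a 2GB card fills up quickly once the desktop and decoder are added:

Code:
# One 3840x2160 frame at 16 bits per channel, 4 channels (RGBA)
w, h = 3840, 2160
bytes_per_pixel = 4 * 2
frame_mb = w * h * bytes_per_pixel / 2**20
print(f"one frame ~{frame_mb:.0f} MB")                       # ~63 MB
print(f"32 buffered frames ~{32 * frame_mb / 1024:.2f} GB")  # ~1.98 GB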
post #55 of 499 - 11-17-2016, 01:08 PM - tecram3 (Newbie)
I can't find the comparisons at Doom9. People just talk about their own card... the best card or the shortest card. I insist: looking at cores, the RX 480 beats the GTX 1070 - or am I wrong?

Speaking of heat and temperature: would you consider liquid cooling for GPUs, like the NZXT Kraken or similar, a valid solution for silent HTPCs?

Thanks.
post #56 of 499 - 11-17-2016, 02:08 PM - mightyhuhn (AVS Forum Special Member)
You still have to cool the radiator, and there's extra noise from the pump.
Water cooling mostly improves the temps, not the noise level.

A GTX 1070 is a far better card than an RX 480; the RX 480 shouldn't stand a chance.

A passive 1050/1050 Ti could show up sooner rather than later.
Passively cooling a CPU is boring and a non-issue with Intel, of course. AMD doesn't exist in the CPU market for now...
post #57 of 499 - 11-19-2016, 11:05 AM - Nemxwasp (Newbie)
I don't game, and my machine's sole purpose is as an HTPC.

I had the EVGA GTX 960, but as soon as I upgraded from a 1080p TV to a 4K UHD TV I had to swap it with the EVGA GTX 970 in my other machine because it wasn't able to keep up. I tried out the EVGA GTX 1050 Ti today but immediately put it back in the box to return, because it can't keep up with how I configured madVR for the GTX 970.

I'm debating getting the EVGA GTX 1060 or just sticking with my 970 for now. I don't want to get the 1060 and be disappointed again.

I'd love to get a 1070 or even a 1080, but that's currently way too expensive for an HTPC.

I'm definitely snobby when it comes to audio/video quality, lol.

Any thoughts on whether the 1060 will outperform the 970 with madVR and MPC-HC, or should I not waste my time?

Thanks in advance.
post #58 of 499 - 11-19-2016, 12:14 PM - dwaleke (AVS Forum Special Member)
970 and 1060 will not be much different. You'd need a 1070 or better for an upgrade.
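Using the same rough TFLOPS estimate sketched earlier in the thread (2 x cores x approximate reference boost clock), a GTX 970 lands around 2 x 1664 x 1.18 ≈ 3.9 TFLOPS, a GTX 1060 6GB around 2 x 1280 x 1.71 ≈ 4.4 TFLOPS, and a GTX 1070 around 2 x 1920 x 1.68 ≈ 6.5 TFLOPS - so on paper the 1060 is only a 10-15% step up from the 970, while the 1070 is roughly 65% faster, which lines up with the advice above.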
post #59 of 499 - 11-19-2016, 12:24 PM - Nemxwasp (Newbie)
Thanks, that's what I was afraid of. I'm thinking I might look for a used 980 for now, or just wait.
post #60 of 499 - 11-19-2016, 04:39 PM - jtscribe (Member)
Any low-profile 4K cards out yet?