
·
Registered
Joined
·
4 Posts
Discussion Starter · #1 ·
I'm looking to build a new HTPC/media server and want to make sure it's able to meet my needs. I read through some of the other threads on future-proofing for HEVC and 4K, but I found that the OP often had different goals, so some questions remain.

My Goals:

- Play 2D 1080p24/60 10-bit HEVC + HD audio (passthrough) content - let's assume it's super high bitrate; I don't want to run into some demanding 1080p HEVC content and not be able to play it perfectly.

- Some console video game emulators

- Media management - the occasional re-encode/remux, downloading, storage, and streaming FROM the HTPC to other players - basically, I need to be able to saturate the Wi-Fi bandwidth without affecting simultaneous playback.

- 4K is a "would be nice" but low priority - an option to upgrade to 2160p60 would be sufficient.

- I don't need any of it to fit into a small form factor case or mobo.

Basically, I need to figure out what CPU/GPU will support my needs - it sounds like Skylake or Carrizo will support proper hardware decoding of HEVC (I can wait if I have to), but is Haswell or Broadwell (socketed desktop chips expected in June) enough to do it all with software/hybrid decoding, and if so, how low can I go?

Additionally, I haven't dug into madVR yet, but I'm constantly working on improving the quality of my home theater setup and I'm pretty sure I'll want to be able to handle some of this. How much more CPU/GPU is needed to run some fancy madVR post-processing on top of the 1080p HEVC decode? Does proper hardware decoding of HEVC via Skylake or Carrizo leave more headroom (or less?) for madVR, thus negating the need for a discrete GPU? Or do I need a discrete GPU to get a lot out of madVR, making the wait for Skylake/Carrizo a non-issue?

I don't have a lot of HEVC content yet, so if it means adding a discrete GPU later to handle the HEVC decoding, I could do that. I also have a 3770k that could be repurposed (a good excuse to upgrade my gaming rig), but I heard there were some refresh-rate issues with the older Intel chips, and I wouldn't want to use that without a discrete GPU (I think).

The rest of the hardware I either have or can figure out just fine, it's the CPU/GPU requirements that are giving me trouble.

Thanks!
 

·
Registered
Joined
·
2,488 Posts
The only full hardware solution for HEVC at the moment is the GTX 960 (someone correct me if I'm mistaken). Broadwell iGPUs do "hybrid" HEVC decode, as do various other Nvidia GPUs and AMD's Tonga (R9 285 - AMD's only partial-HEVC solution).


Skylake is expected to have HEVC support in its iGPUs, but that remains to be seen. Similarly, Carrizo APUs should have HEVC support as well, but you'll have to wait for release to find out.


madVR I don't know too much about, but from what I do know, you're probably going to want a dGPU for that if you wish to use some of the intensive features. I wouldn't imagine any iGPU other than Iris Pro-class or the top AMD APU will do too much for you. The only thing is, again, the only choice you have there right now is the GTX 960, and it's only about as good as an R7 260X in madVR (from what I've read here). So your best bet is probably to wait for new GPUs.
 

·
Registered
Joined
·
23,131 Posts
- Play 2D 1080p24/60 10-bit HEVC + HD audio (passthrough) content - let's assume it's super high bitrate; I don't want to run into some demanding 1080p HEVC content and not be able to play it perfectly.
I guess the first question is why? There's no commercial source for HEVC content that I'm aware of, and if you're re-encoding yourself, I don't know why you'd be worried about "super high bitrate" since the only reason to transcode to HEVC at the moment is to make things as small as possible.

I wouldn't worry about HEVC until it's time to start dealing with Ultra HD Blu-ray, which should be the first source to really bring HEVC to the public.
 

·
Registered
Joined
·
4 Posts
Discussion Starter · #4 ·
I guess the first question is why? There's no commercial source for HEVC content that I'm aware of
Future proofing - but I'd like to focus more on finding out what it takes instead of why I want to do it.


and if you're re-encoding yourself, I don't know why you'd be worried about "super high bitrate" since the only reason to transcode to HEVC at the moment is to make things as small as possible.
I prefer my video to be as compressed as possible without sacrificing a lot of quality - not necessarily lossless, but near-lossless, which usually results in large file sizes. I want to be sure the hardware can handle it, but maybe "super high bitrate" is not the correct term - perhaps I should be saying low compression? I'm assuming that it takes a lot of resources to play back a 20GB two-hour movie, but maybe it takes more resources to decode a 2GB one that's been highly compressed?

@ES_Revenge -
Thanks for the insight. I think you've helped me to narrow down my questions:
- I'll plan to buy a discrete GPU if I choose to pursue madVR, and likely once more HEVC-capable ones are available.
- For now, I'll assume that Carrizo/Skylake would be sufficient to meet my goals (to be confirmed once they are available).
- Unless I'm missing something, I don't think it matters to me whether it's software-, hybrid-, or hardware-decoded, as long as it can be done.


And so I'm left with this question:
What is the minimum amount of CPU required to play back 2D 1080p60 10-bit HEVC + HD audio (passthrough) content on available hardware? I've found some data (sorry, can't post links yet) that claims the 4770k can software-decode HEVC 2160p60 (I'll assume 8-bit and a stock 4770k) without stuttering - but this source does not explain where the cutoff is (e.g., can the 4690k do it too?), and does not take into account 10-bit or 1080p60 (only 1080p24).


According to recent driver updates that announced hybrid decoding support for some 4th- and 5th-gen Intel Core processors, Haswell does NOT have hybrid decoding for 10-bit HEVC, so it would have to be entirely software-decoded - is an i3-4130 enough? Is a 4690k or an OC'd 4790k even enough? Somewhere in between?
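
If it helps frame the question, this is roughly the kind of check I have in mind - just a sketch, assuming ffmpeg is on the PATH and you supply your own sample clip (the file name below is a placeholder): let the software decoder run flat out with no renderer attached and see whether it stays above real time.

```python
# Rough decode benchmark sketch: run ffmpeg's software HEVC decoder as fast as
# it can and check whether it exceeds 1.0x real time on this CPU.
# Assumes ffmpeg is installed; "sample_10bit_1080p60.mkv" is a placeholder clip.
import re
import subprocess

SAMPLE = "sample_10bit_1080p60.mkv"  # hypothetical test file

# "-benchmark" prints CPU-time stats at the end; "-f null -" discards decoded
# frames, so the run measures pure decode speed with no renderer involved.
result = subprocess.run(
    ["ffmpeg", "-benchmark", "-i", SAMPLE, "-an", "-f", "null", "-"],
    capture_output=True, text=True,
)

# ffmpeg reports progress like "speed=1.37x" on stderr; anything comfortably
# above 1.0x means the CPU keeps up, with some headroom for audio and the OS.
speeds = re.findall(r"speed=\s*([\d.]+)x", result.stderr)
if speeds:
    print(f"final decode speed: {speeds[-1]}x real time")
else:
    print("no speed reported - check ffmpeg output:", result.stderr[-500:])
```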
 

·
Registered
Joined
·
23,131 Posts
I prefer my video to be as compressed as possible without sacrificing a lot of quality - not necessarily lossless, but near-lossless, which usually results in large file sizes. I want to be sure the hardware can handle it, but maybe "super high bitrate" is not the correct term - perhaps I should be saying low compression? I'm assuming that it takes a lot of resources to play back a 20GB two-hour movie, but maybe it takes more resources to decode a 2GB one that's been highly compressed?
It generally takes more power for higher bitrates, but I'm not sure where you're going with this. As home end users, we have absolutely no access to uncompressed content. Blu-ray uses H.264 (and sometimes VC-1). You're unlikely, for a long time, to run into high-bitrate 1080p H.265 content. H.265 will first be deployed for streaming, to reduce bandwidth further (going against your low-compression goal), and for Ultra HD Blu-ray, which is obviously Ultra HD (2160p).

If you're thinking of recompressing: well, if your goal is to minimize compression, then you are by far best off not recompressing at all. Just leave what you get on your Blu-ray alone. If you want to reduce size, then for the next while you're still better off sticking with H.264. The last I'd heard, x264 was still actually more efficient (i.e., makes smaller files at the same quality) than x265, which is still in development. By the time x265 is significantly more efficient than x264, you'll have no trouble getting hardware capable of dealing with high-bitrate H.265.

And so I'm left with this question:
What is the minimum amount of CPU required to play back 2D 1080p60 10-bit HEVC + HD audio (passthrough) content on available hardware? I've found some data (sorry, can't post links yet) that claims the 4770k can software-decode HEVC 2160p60 (I'll assume 8-bit and a stock 4770k) without stuttering - but this source does not explain where the cutoff is (e.g., can the 4690k do it too?), and does not take into account 10-bit or 1080p60 (only 1080p24).
The simple reality is that no one knows yet. There's not enough content, or enough interested people, to have built up a data set or rule of thumb for what is needed. If I were you I would just wait and see. If you really want to be future-proof, you should probably encode yourself some 4K content (either native or upconverted, it doesn't really matter) at about 100-120Mbps and try it yourself. That should cover you for Ultra HD Blu-ray.
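
Something along these lines would do for a test clip - a rough sketch only, assuming an ffmpeg build with 10-bit libx265 support; the file names and settings are placeholders, not a recommended workflow.

```python
# Sketch: upconvert a 1080p rip to 2160p and encode it with x265 at roughly
# Ultra HD Blu-ray bitrates, producing a worst-case playback test clip.
# Assumes ffmpeg with a 10-bit-capable libx265 build on the PATH.
import subprocess

SOURCE = "bluray_rip_1080p.mkv"   # hypothetical 1080p source file
OUTPUT = "uhd_test_100mbps.mkv"   # hypothetical high-bitrate test clip

subprocess.run([
    "ffmpeg", "-i", SOURCE,
    "-vf", "scale=3840:2160:flags=lanczos",  # upscale to 2160p
    "-c:v", "libx265",
    "-pix_fmt", "yuv420p10le",               # 10-bit output, to match Main 10
    "-b:v", "100M", "-maxrate", "120M", "-bufsize", "240M",
    "-c:a", "copy",                          # keep original audio for passthrough tests
    "-t", "120",                             # a two-minute clip is enough to judge playback
    OUTPUT,
], check=True)
```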
 

·
Registered
Joined
·
3,714 Posts
There are cheap Chinese ARM boxes doing varying degrees of hardware 4K HEVC decoding.

I don't recommend these, but just saying: 2015 will likely see a big-name Android box released with the capability. Roku might jump in as well? The Shield console will have some support for this.

It's all so early, I'd say just meet all of your other needs up to 1080p on the cheap, with 1080p HEVC, and ignore 4K HEVC for now. When "real" 4K content becomes mainstream (Blu-ray), replace the cheap box with the next cheap box.

I predict you won't see any savings going the custom route. I can't say for sure that next-gen cards (with 4K hopefully more stabilized by then) will still use PCIe. Nvidia, for example, has said that Pascal is going to stomp Maxwell (a 10x performance increase), but they may move to NVLink instead of PCIe only. That means a new motherboard, which means a new CPU, and that basically means a new HTPC. At that point you're also tempted to do DDR4, essentially reusing only the case, SSD, and PSU.

You can also do a cheap custom build instead of Android and get very good 1080p results. A cheap mATX board in a Milo3 (instead of a Milo4) with a G1610 and a picoPSU can be put together for somewhere in the neighborhood of $150. I know first-hand that a G1610 can do most 1080p HEVC (I've used it to play back a few HEVC titles I already have, encoded from H.264, ~10GB each).
 

·
Registered
Joined
·
322 Posts
Future proofing - but I'd like to focus more on finding out what it takes instead of why I want to do it.

And so I'm left with this question:
What is the minimum amount of CPU required to play back 2D 1080p60 10-bit HEVC + HD audio (passthrough) content on available hardware? I've found some data (sorry, can't post links yet) that claims the 4770k can software-decode HEVC 2160p60 (I'll assume 8-bit and a stock 4770k) without stuttering - but this source does not explain where the cutoff is (e.g., can the 4690k do it too?), and does not take into account 10-bit or 1080p60 (only 1080p24).


According to recent driver updates that announced hybrid decoding support for some 4th- and 5th-gen Intel Core processors, Haswell does NOT have hybrid decoding for 10-bit HEVC, so it would have to be entirely software-decoded - is an i3-4130 enough? Is a 4690k or an OC'd 4790k even enough? Somewhere in between?
I don't understand why you are so hell-bent on 'future-proofing' a PC build for something that is still essentially in its infancy. Like a couple of others have pointed out, buy what works now, and when 4K HEVC is common, buy what works then. That's kind of the beauty of a PC-based setup. Instead of paying top dollar for far more powerful hardware than you need now (think heat, power consumption, etc.) simply to get one of the first devices out of the gate to support a new standard, especially if it's only partial support, why not get something reasonably suited to what you do today, and then, when HEVC actually comes into play and is supported by just about everything, simply upgrade the video card? That way you maintain a more efficient, better-suited system both now and down the road, for significantly less money in the end than trying to 'do it all' up front before its prime.


In a nutshell, in my eyes you are talking about overclocked i7s, a GTX 970, etc. just to do HEVC (think expense, heat, noise, power consumption), when in a year or two, when/if HEVC really comes into play, even a cheap Intel dual-core using its IGP will likely hardware-decode it just as well as, if not better than, the hot, comparatively loud, power-hungry, high-dollar system you want to build today. I'd just go with an i3 since you want to do some transcoding; it should be enough to handle about anything you throw at it with common methods. Maybe bump it up to an i5 or even an i7 if shaving some time off your transcodes is critical enough to be worth the cost increase. Stick with the IGP for now if it covers your current use, or the lowest-cost video card that covers your needs if you are using madVR or something, then simply upgrade (or add) a video card down the road when HEVC is common and just about everything supports it.

Remember, transcoding and ripping will use up any raw power available, so more power = shorter times. Software decoding does too, up to a point, so there's not much reason to go beyond what's required for what you want to do. Hardware decoding is just that and doesn't impact raw performance much; for something fully hardware-decoded, the lowest, cheapest CPU/GPU around will do just as well as the most expensive. You just have to find the balance: something powerful enough to fit your needs, but without too much wasteful, unnecessary overhead. Building that powerful a system just to get hardware decoding of something that isn't widely available yet nets you a costly and inefficient system, best case, by the time it does become the norm.
 

·
Registered
Joined
·
3,567 Posts
2016/2017 is when you want to think about building a 4K HTPC. The hardware (GPUs and displays) will be there, as will the software for playback. HEVC is supposed to be in all the next-gen GPUs, but it remains to be seen whether Google's VP9 codec - which is going to be used by YouTube and others - will make it into any upcoming GPU releases. Google finally got hardware IP to license at the end of last year, so they are playing catch-up with HEVC, which is in the fine-tuning/optimization phase. The only hardware that is 100% sure to have VP9 this year and next is the embedded-device GPUs for phones, tablets, and media boxes, which have hardware-accelerated decoding of VP9 and HEVC.
 

·
Registered
Joined
·
4 Posts
Discussion Starter · #9 ·
Again, I'd like to focus more on what it takes to do it than why I want to do it. If it makes it easier, pretend I have a fancy video camera and home movies encoded with x265 - and I'll have to reiterate: I am not concerned with 4K; that's a "would be nice".

What I'm reaching for:
Let's say, for example, that an [email protected] DOES NOT have enough power to properly decode 2D 1080p60 10-bit HEVC, but an [email protected] DOES - I would then be able to decide whether paying the extra $50 for a better processor is worth it to me to have that capability now and, even more importantly, to be able to wait LONGER (more options) to buy a discrete GPU for hardware decoding, madVR, and maybe 4K support later.


I think Dark Slayer gets the idea
... meet all of your other needs up to 1080p on the cheap, with 1080p HEVC, and ignore 4K HEVC for now.

... I know first-hand that a G1610 can do most 1080p HEVC (I've used it to play back a few HEVC titles I already have, encoded from H.264, ~10GB each)
This is exactly what I'm looking for, or at least it's another data point - thanks. Have you ever tried to play back 10-bit HEVC content with that G1610? I've heard it takes more resources, but I have no idea how much.


The simple reality is no one knows yet. There's not enough content, or people interested yet to have built up a data set or rule of thumb for what is needed. If I were you I would just wait and see.
Thanks here too; I'll take "we just don't know" as an answer as well, but I have seen SOME data out there, just not exactly what I'm looking for. One test I read shows average CPU loads while (software) decoding HEVC 1080p24 - 27% for a 4330 and 11% for a 4770k. This just doesn't account for 10-bit and 1080p60, and doesn't show anything in between (higher-end i3, i5, etc.).
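
For anyone who wants to collect comparable numbers on their own hardware, something like this would do - just a sketch, assuming the third-party psutil package (pip install psutil): start it before playback, let the movie run, and stop it with Ctrl+C to get the average system-wide CPU load.

```python
# Sketch: sample overall CPU utilization once a second while a player is
# decoding, then print the average when interrupted with Ctrl+C.
# Assumes the third-party psutil package is installed.
import psutil

samples = []
try:
    while True:
        # cpu_percent(interval=1) blocks for one second and returns the
        # system-wide CPU load over that interval as a percentage.
        samples.append(psutil.cpu_percent(interval=1))
except KeyboardInterrupt:
    if samples:
        avg = sum(samples) / len(samples)
        print(f"average CPU load over {len(samples)}s: {avg:.1f}%")
```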
 

·
Registered
Joined
·
3,714 Posts
I'd be happy to try a sample if you can drop a clip into a public share.

I missed the 10-bit part. I have no 10-bit content, but I think we can both guess the G1610 will fall on its face trying that. I can also try it on a 3570k, which surely "should" do it, but I'll check utilization to see how much is left in the tank.
 

·
Registered
Joined
·
3 Posts
Update?

Just out of interest, what path did you end up taking? With my current six-year-old system coming to the end of its life, I can understand your original question.

Unfortunately, I feel that in order to get the same amount of uninterrupted wear out of my next upgrade, I am realistically 1-2 years away. I will need to upgrade my MB, CPU, RAM and possibly GPU.
 

·
Registered
Joined
·
4 Posts
Discussion Starter · #12 ·
I basically decided to wait for Skylake. This should give me a good platform for hardware-decoding HEVC content and allow an upgrade path to 4K if I ever feel like it, and it leaves the same options open if I choose to pursue madVR any further. It also lets me wait for the best prices on components now that I know they'll fit into my system. For example, I picked up an SSD very cheap already, I'll be able to buy RAM as soon as I know whether it will be DDR3 or DDR4, and I've been collecting 4TB NAS drives when they go on sale (which I'm using in the meantime). It seems like I should be able to get away with a lower-end Skylake CPU than what I would have to pay for with Haswell or Broadwell to get the performance I'm looking for over the next five years or so. Plus, the lack of good data on HEVC with Haswell and Broadwell let me set aside that urgent feeling to build a PC right away. :)
 

·
Registered
Joined
·
2 Posts
I kind of wish I had found this thread a few months back. It would have saved me a lot of headache. The Roku 4 is the only retail HTPC I can think of that's built around 4K HEVC and VP9 right now (the specs say nothing about 10-bit), but it will likely become outdated very soon. You'd at least want a projector to justify 4K, and those displays aren't ready for 4Kp60 Main 10 yet either. Same reason you should wait for upgrades. You might even hold off until Kaby Lake processors are released, since Skylake is only planned for partial/hybrid HEVC 10-bit and VP9 acceleration.

I wouldn't worry much about PCIe becoming outdated, though. Even 4-way SLI Titans haven't fully saturated PCIe 3.0, I've been told. Most can get away with PCIe 2.0 today with no change in performance. There's also Thunderbolt 3.
 

·
Registered
Joined
·
2 Posts
Oh, and there's also the Doom9 thread on the state of HEVC decoders (thread #171219).

For 4K Blu-ray with a bitrate of 100Mbps and 10-bit depth, expect CPU decoders and hybrid decoders to be useless(?), even with Haswell-E or Xeon processors.

We are definitely going to need pure fixed-function HW decoders for 4K Blu-ray.

On the other hand, 4K Blu-ray won't appear until the 2015 winter holidays, so until then, CPU and hybrid decoders are just fine for the lower-bandwidth, lower-fps clips that HEVC is currently best suited to.
 