AVS Forum

1 - 11 of 11 Posts

Banned · 8 Posts · Discussion Starter #1
Hello

I plan on building a PC that will double as an HTPC. I decided against having a dedicated HTPC because my main desktop PC already runs 24/7. My TV is 1080p, and I will watch everything from live TV to Blu-rays.

For a good portion of the time, the PC will be used simultaneously as a regular PC and as an HTPC. My goal is to not notice any slowdowns on either the PC monitor or the TV. Video on the TV should not stutter just because I started some demanding game (while possibly also encoding a video or rendering a 3D scene; I hope not to be limited in multitasking).

What should I consider when building such a PC?

I have a couple of theories and questions, so please comment on them:

- What CPU should I use? The FX 8350 has more cores; the i7 has faster cores. Would more cores equal a smoother user experience without any glitches, even if individual programs run noticeably slower? A smooth user experience without random stuttering is more important than overall speed.
- What graphics card? Does AMD vs. Nvidia matter? I don't want to spend a ton; hopefully something in the R7 265 or 650 Ti price range will do.
- Is it possible to add an additional, cheaper graphics card for TV use only, so that video rendering doesn't take resources from the main graphics card? I know about SLI/CrossFire, but that is a bit different and also out of my budget.

Any other comments appreciated. Thanks.
 

Registered · 3,714 Posts
I kind of use this setup already. My server is available 24/7 for whatever I want to work on. I do any heavy-lifting work on it through Chrome Remote Desktop (I could just as easily go sit at it instead).

I watch my entire library (non-3D, non-VC-1, without HD-audio passthrough, though it can be decoded) on a Nexus Player, and I play my Steam/Origin games through Limelight, which makes use of Nvidia's GameStream protocol.

I have an i7-3770K with a GTX 660. I would recommend Nvidia if you want to use the GameStream part, since Limelight is already functional and works a treat. There is talk of integrating Limelight into Kodi via a GSoC project, or of rolling it into the upcoming RetroPlayer Kodi project.

AMD has a similar purpose-built video encoder, but there is no open-source project like Limelight, or even a closed-source one like Shield, built around it. The only thing that makes use of it is Steam, and I'm not sure how well they use it, to be honest.

The ADT-1 would work a little better, but it still can't handle full VC-1. The new Shield console would be best, but it won't be released until May.

If you're not moving to a two-box setup and are using just one, it shouldn't be difficult to run your dGPU for gaming/monitor/PC stuff and the iGPU for HTPC/HDTV stuff.
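For what it's worth, newer Windows 10 builds (1803+) expose a per-app GPU preference (the "Graphics settings" page), stored as per-exe values in the registry. The sketch below only builds the key/value strings so you can see the shape of it; actually applying them would need `winreg` on a Windows box, and the exe paths here are made-up examples.

```python
# Sketch: per-app GPU preference on Windows 10 1803+ (strings only; the
# paths are illustrative, and nothing is written to the registry here).
GPU_PREF_KEY = r"HKEY_CURRENT_USER\Software\Microsoft\DirectX\UserGpuPreferences"

def gpu_preference(exe_path, high_performance):
    """Return (value_name, value_data) for the UserGpuPreferences key.

    GpuPreference=1 -> power saving (usually the iGPU)
    GpuPreference=2 -> high performance (usually the dGPU)
    """
    pref = 2 if high_performance else 1
    return exe_path, f"GpuPreference={pref};"

if __name__ == "__main__":
    # Game on the dGPU, media player on the iGPU (example paths).
    for exe, fast in [(r"C:\Games\game.exe", True),
                      (r"C:\Program Files\MPC-HC\mpc-hc64.exe", False)]:
        name, data = gpu_preference(exe, fast)
        print(f'{GPU_PREF_KEY}\n  "{name}" = "{data}"')
```

In practice, simply plugging the TV into the motherboard's HDMI output and the monitor into the dGPU achieves most of the same split, since each display's work lands on the GPU driving it.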
 

Banned · 8 Posts · Discussion Starter #3
So it is possible to use dedicated and integrated GPUs simultaneously. That's great. I wonder how you go about configuring a program to use one or the other, though?

And what would the worst-case CPU usage be when using Intel's integrated GPU to decode, and potentially also deinterlace, a video? I will play Blu-rays, 720p (60 fps), 1080i/p, and SD (both p and i), all upscaled (if necessary) to 1080p.
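One way to answer this empirically is to let ffmpeg decode a sample to a null sink with and without hardware acceleration and compare CPU use in Task Manager. The helper below only builds the command lines; that your ffmpeg build includes the `dxva2`/`qsv` hwaccels is an assumption, and the sample filename is a placeholder.

```python
# Sketch: build ffmpeg decode-only benchmark commands (nothing is executed).
def decode_benchmark_cmd(src, hwaccel=None):
    """Decode `src` and discard the frames; hwaccel e.g. 'dxva2' or 'qsv'."""
    cmd = ["ffmpeg", "-hide_banner"]
    if hwaccel:
        cmd += ["-hwaccel", hwaccel]          # hardware decode on the iGPU
    cmd += ["-i", src, "-f", "null", "-"]     # decode only, no encode/output
    return cmd

software = decode_benchmark_cmd("sample-1080i.ts")
hardware = decode_benchmark_cmd("sample-1080i.ts", hwaccel="dxva2")
print(" ".join(software))
print(" ".join(hardware))
```

Running each and watching per-core load gives a direct answer for your exact content, rather than relying on generic figures.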
 

Banned · 8 Posts · Discussion Starter #4
I am a bit conflicted over whether to go the FX 8350 route or the Intel route (I would probably get an i7-4770K).

I am not sure whether twice the number of cores on the AMD side would be more beneficial than Intel's faster cores plus an integrated GPU that could be used just for the HTPC.
 

Registered · 2,486 Posts
The i7 is faster overall, but consider that a 4770K costs roughly double what an FX 8350 does. It's quite a difference in price. The FX-8350's marketing position is, realistically, against a low-3 GHz-range Ivy Bridge i5, not a Haswell i7 K processor.

Though the Vishera does have more cores, each core is 30-40% slower than each Haswell i7 core; plus, the i7 has HT. The other thing is, with Vishera, you're not really getting 8 "full" cores; you're getting four modules of two cores each. What may not be immediately obvious is that each module essentially shares a single FPU (without getting overly technical, that's how it works). The i7, of course, also has four FPUs. That said, Vishera is still pretty competent at a lot of multi-threaded tasks, like encoding/transcoding, which it does pretty well at. But then the i7 has an iGPU which, if you can use it, has QuickSync, which can speed up transcode operations a great deal.

As for doing something like playing a game and encoding at the same time, I don't know, but no matter what you do, a Haswell i7 won't be significantly slower than the Vishera FX at anything. That is, if there is anything the Vishera can do faster, the i7 will be right on its heels, and there's probably very little the Vishera is competitive or faster at. Piledriver (which Vishera falls under) is an aging architecture and wasn't even all that modern when it was released.

Make no mistake, the Haswell i7 is the superior processor. However, the question becomes whether it's worth 2x the price for your needs.
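A back-of-the-envelope comparison makes the trade-off above concrete. The per-core deficit comes from the post; the ~25% Hyper-Threading uplift and ideal scaling are illustrative assumptions, not benchmarks.

```python
# Rough relative multi-threaded throughput under ideal scaling.
# per_core_speed and smt_uplift are assumed scalars, not measurements.
def aggregate_throughput(cores, per_core_speed, smt_uplift=0.0):
    return cores * per_core_speed * (1.0 + smt_uplift)

fx_8350  = aggregate_throughput(8, per_core_speed=0.65)               # ~35% slower cores
i7_4770k = aggregate_throughput(4, per_core_speed=1.0, smt_uplift=0.25)

print(f"FX-8350  : {fx_8350:.2f}")
print(f"i7-4770K : {i7_4770k:.2f}")
# Under these assumptions the two land close together on embarrassingly
# parallel work, while the i7 wins anything lightly threaded outright.
```

This is why the FX only looks competitive in all-cores-loaded workloads; the moment a task uses four or fewer threads, the faster Haswell cores dominate.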
 

Banned · 8 Posts · Discussion Starter #6
Thanks for all the info. An i7 + dedicated GPU, while using the integrated GPU and QuickSync to decode video, sounds perfect for what I need. It would free up the CPU's and dedicated GPU's resources for other things.

But I've found very conflicting information about using both integrated and dedicated GPUs simultaneously, and about how reliable it is. Do I need a special chipset on the motherboard that supports this? Do I need the Virtu MVP software? Why do so few Haswell motherboards ship with Virtu MVP software, unlike older-generation motherboards?

There was a lot of conflicting information when I tried to Google it, so hopefully someone can clear things up.
 

Registered · 2,486 Posts
Virtu MVP is mainly a frame-rate & vsync "enhancer," from what I know. It uses the dGPU and iGPU together to prevent tearing and stuttering caused by mismatched framerates and monitor refresh rates. Initially, I believe, it was an "OEM thing" where mobo makers would include the software and license with the board; with MVP 2, I believe this changed and there weren't many OEM implementations. Instead, LucidLogix started to sell the software on their own, and I don't believe it has any overly specific hardware requirements.


However, the Virtual Vsync that Virtu MVP does will slowly be superseded by the likes of G-Sync and FreeSync. I'm guessing there are more uses for the Virtu software, but given it's a paid program and there are probably other solutions these days, you might not want to focus too much on that.


Instead, you might take a look at OBS, which allows you to do things such as use QuickSync to encode/stream gaming content while playing on a dGPU. I'm not exactly sure what your uses are, but there should be software out there to enable the use of QS regardless of having a dGPU (provided the CPU has QS, of course).
 

Banned · 8 Posts · Discussion Starter #8
I was interested in using the iGPU's QuickSync/DXVA2 for decoding/deinterlacing and scaling video, so that the dGPU's resources are free for other things. Should that be possible?

And would it even be worth it? It seemed like a good idea, but I really have no clue how taxing 1080p decoding (or upscaling 720p to 1080p) is on modern CPUs and GPUs.
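For a rough sense of how taxing these formats are relative to one another, raw pixel rate alone is a useful first-order proxy (it deliberately ignores codec complexity, so treat it as a simplification, not a benchmark):

```python
# Raw pixel throughput per format (megapixels/second of source material).
def megapixels_per_second(width, height, fps):
    return width * height * fps / 1e6

formats = {
    "480i (SD)":         megapixels_per_second(720, 480, 30),   # 60 fields/s = 30 frames' worth
    "1080p24 (Blu-ray)": megapixels_per_second(1920, 1080, 24),
    "720p60":            megapixels_per_second(1280, 720, 60),
    "1080i60":           megapixels_per_second(1920, 1080, 30), # before deinterlacing to 60 fps
}
for name, mpps in sorted(formats.items(), key=lambda kv: kv[1]):
    print(f"{name:18s} {mpps:6.1f} MP/s")
```

Notably, 1080i60 carries slightly more source pixels per second than Blu-ray 1080p24, and a good deinterlacer then has to synthesize 60 full frames from them, which is why deinterlaced 1080i is often the heaviest case despite the "i".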
 

Registered · 1,001 Posts
Games will typically start on the primary display, and the primary display can easily be managed using profiles in programs like DisplayFusion. Programs like WMC and Kodi can be swapped between the primary and secondary display without reconfiguring the primary display.

My HTPC is my main PC, and I use Intel HD 4600 graphics for the monitor and a fanless Nvidia GT 640 for the TV. With my Z87 motherboard, no extra software like Virtu MVP is required. It just works.

I have a separate PC for gaming. If it is at all possible, I would recommend the same. Regardless, I would at least have a dedicated media drive that is not going to be used for disk-intensive processes while recording or viewing shows. Multithreaded, CPU-intensive processes like Handbrake haven't affected my viewing in WMC and Kodi, and this is with an Intel i5-4670, a 4-core CPU.
 

Registered · 297 Posts
My desktop PC is what I would consider a jack of all trades. I use it for work with multitasking, casual web surfing/email, and sending a 1080p video signal via HDMI to my TV. The build is an i7-3770K processor, 12 GB of RAM, an Nvidia GTX 760 video card, and the Kodi media player. Unfortunately, I cannot afford a server, so I have several HDDs for my videos and a Corsair SSD for the OS and programs. I would say it is a bit overkill, but it performs extremely well, and I do not see myself needing to upgrade it for a very long time.

One thing to note: this used to be an issue a few years ago, and I don't know if it still is, but when purchasing a video card, make sure the card can bitstream Dolby TrueHD and DTS-HD MA out to the receiver.
 

Registered · 3,168 Posts
PC gaming is an endless treadmill of upgrades on the bleeding edge. Buy the fastest processor and the fastest video card you can, and just accept that gaming is on NVIDIA. Consider the possibility of having two NVIDIA graphics cards (not necessarily identical) so one can handle the video and the other the gaming, with the option of using both for multi-monitor gaming or blindingly fast graphics using SLI.

Do check the reviews on the graphics card to make sure it works well with your intended use. Newer NVIDIA should be fine. I would be surprised if you ran into any issues with your intended setup, BUT my impression is that the AMD and Intel graphics solutions are inferior.

Maybe you can use hardware decoding for video with essentially no processor load on your gaming, or vice versa. Look into hardware decoding before making your decision on graphics, and make sure the solution you pick for video at least includes the option of hardware decoding all your HTPC media.

For the PC, the Intel integrated, AMD, and NVIDIA graphics solutions should all be capable of hardware decoding, but the generation of the graphics determines the level of support. Newer graphics are capable of 4K hardware decoding, but the actual decoding support available varies with the generation of the technology, so check first. For NVIDIA, you can find the GPU's hardware-decoding capabilities on their website.

The processors you are investigating will have no trouble doing software decoding, and I am not sure just how much support Windows plus media applications have for hardware decoding, since I stopped using Windows years ago. The point is, if you preserve the option for hardware decoding, you potentially improve overall multitasking performance and save CPU horsepower for your game rendering; but make sure you do the hardware decoding on a separate graphics engine, or you might find yourself facing unpredictable performance or glitches.

Probably no one bothers to do much testing of blindingly fast gaming running simultaneously with hardware decoding of video on one graphics engine, but I know that under Ubuntu, mplayer can be started in multiple instances doing hardware decoding, so it cannot be that finicky to run simultaneously with other graphics tasks under Windows.

I do not recommend relying on integrated graphics for a bleeding-edge machine. There is no way to upgrade except to turn it off, and then what did you buy it for in the first place? If it comes for free, though, go for it and save yourself the cost of a second graphics card (for now, anyway).

Again, I am not a gamer or Windows user so take my advice with all the skepticism it deserves.
 