AVS Forum banner

1 - 20 of 31 Posts

Registered · 911 Posts · Discussion Starter #1
I'm getting a new TV today (Sony XBR75Z8D), so I want to be able to do 4K HDR. What does it take for an HTPC running Kodi to do this? I'm also not sure whether my Denon AVR-4520CI will pass 4K HDR through to the TV. And I have a server using FlexRAID; will it stream 4K content over the network? It's all Cat 6 gigabit...

My current HTPC mobo is an ASRock Z97M Pro4 with an i5-4460 Haswell and 16GB of DDR3 (I don't think the mobo will do 4K HDR, and I'm not sure about the i5 either).

I have no problem building a new system. My case is Micro-ATX.

Any input is greatly appreciated! Thanks!
 

Registered · 180 Posts
Your current mobo will not do 4K, according to the spec page here: "Supports HDMI with max. resolution up to 1920x1200 @ 60Hz"

You will need to upgrade to something with HDMI 2.0 to get 4K to your TV. Beyond that, there is the issue of very little 4K content that can be used on a computer. If you don't care about DRM issues, you could add a graphics card with HDMI 2.0, such as a GTX 1060 or better. I'm currently using a GTX 1070 to upscale my Blu-ray rips to 4K using madVR, with good results.
 

Registered · 3,565 Posts
Build a new system later this year. No point in building one now, when HDR isn't solid yet and new cards to go with HDR monitors will be out over the next few months.
 

Registered · 911 Posts · Discussion Starter #6
Can an HTPC do what some of the Blu-ray players are doing and have two HDMI outs, one carrying video to the TV and the other carrying audio to the AVR? That would solve the issue of AVRs not being able to pass through 4K/60Hz material. Or am I missing something?
 

Registered · 1,647 Posts
Build a new system later this year. No point in building one now, when HDR isn't solid yet and new cards to go with HDR monitors will be out over the next few months.
Do you predict that present 10-series owners will get future software updates from Nvidia that will, essentially, make them fully HDR, etc. compatible, or will a whole new crop of cards come out that one would be advised to wait for?
 

Registered · 3,565 Posts
Do you predict that present 10-series owners will get future software updates from Nvidia that will, essentially, make them fully HDR, etc. compatible, or will a whole new crop of cards come out that one would be advised to wait for?
Well, I can't really say for sure. Nvidia has been known to screw the pooch in recent times. I would hope they bring HDR to the 1000-series, because it was designed for UHD use and has HDMI 2.0b, but they have a record of being somewhat dishonest about what the past few generations of GPUs can do in regards to HDMI. With the current generation, they even pass the buck to the card makers by saying "Refer to the specifications of the video card manufacturer". AMD is pretty honest about what they are bringing to the party, and they are definitely bringing HDR and everything else.
 

Registered · 3,565 Posts
Can an HTPC do what some of the Blu-ray players are doing and have two HDMI outs, one carrying video to the TV and the other carrying audio to the AVR? That would solve the issue of AVRs not being able to pass through 4K/60Hz material. Or am I missing something?
The Windows 10 DRM chain would probably not allow such a thing. I have seen dual-HDMI cards before, but they are not reference designs, so you can only get them if you have a lot of money to burn on a premium gamer video card or get one from an overseas company that specializes in non-reference designs. In all honesty, we'll have to wait and see what HDMI 2.1 brings us in video card designs and functionality. With the rise of media boxes and the decline of low-end video cards and GPU designs in favor of integrated graphics, we may not see anything great for HTPC use below the premium cards after this current generation of cards.
 

Registered · 911 Posts · Discussion Starter #10
What video card could I throw in my current HTPC (ASRock Z97M Pro4 with an i5-4460 Haswell and 16GB of DDR3) to at least take advantage of as much 4K as my Denon 4520 will handle, until the HDR hardware comes out down the road? A GTX 1050 that's powered off the PCIe slot? Or?
 

Registered · 858 Posts
What video card could I throw in my current HTPC (ASRock Z97M Pro4 with an i5-4460 Haswell and 16GB of DDR3) to at least take advantage of as much 4K as my Denon 4520 will handle, until the HDR hardware comes out down the road? A GTX 1050 that's powered off the PCIe slot? Or?
If you don't care about gaming, the AMD RX 460 is the cheapest ticket to HEVC 10-bit hardware decoding, but be aware it won't enable UHD disc playback on your system, because of all the DRM crap.

Frankly, I would simply go with Nvidia Shield instead of upgrading the HTPC.
 

Registered · 23,130 Posts
Frankly, I would simply go with Nvidia Shield instead of upgrading the HTPC.
Exactly. Everyone seems to keep ignoring the most important part of the 4K equation: content. What content are you trying to play? If you haven't looked, it's really, really hard to get 4K content (other than some YouTube and demo files) on a PC. The best 4K source, Ultra HD Blu-ray, is completely impractical on a PC right now, requiring a relatively high-end Intel machine, and will cost far more than even a top standalone player. Then there are the streaming services, which all prefer/prioritize or generally work better on standalones or streamers.

If you really want to get into 4K, your best bet might be the Panasonic UBD900; it's currently the best UHD Blu-ray player and supports Netflix and Amazon 4K with HDR (not sure about Vudu). And it will just work.
 

Registered · 371 Posts
Protected 4K requires HDCP 2.2, which your AV receiver doesn't support. Also, HDMI 1.4a limits 4K to 30 fps. You will need a new AV receiver that supports HDCP 2.2 and HDMI 2.0a. A 100 Mb/s network can handle 4K content thanks to H.265, and even H.264 won't use all the bandwidth.
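To put rough numbers on the network claim (the bitrates below are typical published figures, not from this thread), here is a quick back-of-envelope check:

```python
# Back-of-envelope check: which 4K streams fit on a 100 Mb/s LAN?
# Bitrates are illustrative assumptions, not measured values.
LINK_MBPS = 100
HEADROOM = 0.8  # keep ~20% free for protocol overhead and other traffic

streams = {
    "Netflix 4K (HEVC)": 16,          # ~16 Mb/s average
    "1080p Blu-ray rip (AVC)": 35,
    "UHD Blu-ray remux (HEVC)": 80,   # peaks can approach ~100 Mb/s
}

for name, mbps in streams.items():
    verdict = "fits" if mbps < LINK_MBPS * HEADROOM else "marginal"
    print(f"{name}: {mbps} Mb/s -> {verdict} on a {LINK_MBPS} Mb/s link")
```

So compressed 4K streaming fits comfortably on a 100 Mb/s link, while full UHD Blu-ray remuxes can spike close to the limit; the OP's gigabit Cat 6 network has plenty of headroom either way.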

Since AMD introduced the Ryzen processors, Intel is going to introduce Cannon Lake earlier than expected, but that is not the processor to get. Ice Lake is the microarchitecture after Cannon Lake and should introduce better support for 4K; it will be coming out next year. Also, AMD's latest APU (CPU + GPU), based on the Zen microarchitecture, will be out later this year.

If you only want to play back 4K content that is not encrypted or protected, use at least a GeForce GTX 1050. A Ti version will give you some future-proofing, so you can play back 4K Netflix. The AMD Radeon RX 400 or 500 series could be used, but Netflix doesn't support those video cards yet. The 500 series from AMD is basically the same chip used in the 400 series, just with a speed bump. AMD's Vega graphics will be coming out later this year; they probably won't support HDMI 2.1, and HDR on these cards is aimed at gaming.

Both AMD and Nvidia beat around the bush about their cards' specs. For example, the AMD Radeon RX 400 series technically fakes HDR: output over HDMI is Y'CbCr 4:2:2 when using HDR, which means you are technically getting 8-bit instead of 10-bit. For the AMD Radeon RX 400 and 500 series, you have to use DisplayPort for 4K HDR. Both AMD and Nvidia cater to gamers with their desktop lines, while their workstation models cater to professional 3D artists. Anybody who wants HDR for movies has to lobby these companies to add support; at this time, HDR is for gaming, and I don't think some HDR games are working well.

The best thing to do is wait for 4K to get better support. If you want to test 4K HDR now, use a computer monitor that has 4K HDR; only computer monitors have DisplayPort. You will need at least an i5 Kaby Lake processor, a motherboard with a 200-series chipset, SGX that can be enabled in the BIOS/UEFI, a certified UHD Blu-ray drive, and Windows 10. Kodi will be able to play back 4K, but only if it is not encrypted. There is no legal way yet to rip UHD Blu-ray movies.
 

Registered · 160 Posts
The best thing to do is wait for 4K to get better support. If you want to test 4K HDR now, use a computer monitor that has 4K HDR; only computer monitors have DisplayPort. You will need at least an i5 Kaby Lake processor, a motherboard with a 200-series chipset, SGX that can be enabled in the BIOS/UEFI, a certified UHD Blu-ray drive, and Windows 10. Kodi will be able to play back 4K, but only if it is not encrypted. There is no legal way yet to rip UHD Blu-ray movies.
Exactly, and that's why, personally, I chose the interim solution of HDR at 1080p. Maybe Sony has the right outlook that people care about HDR more than 4K, which is why Sony introduced HDR on a 1080p TV at CES 2017.

For those who want to buy an HDMI 2.0x/DP 1.4 card to output 4K HDR, please be aware of the warning from @madshi and @Nevcairiel (the developers of madVR and LAV Filters): stay away from 2GB VRAM cards, because they don't have enough memory for 4K.
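A rough sketch of why 2GB cards run out of room: a single 4K frame held in a high-precision intermediate format is already large, and renderers like madVR keep a queue of them. The queue depth and RGBA FP16 format below are illustrative assumptions, not madVR's actual internals:

```python
# Why 2 GB of VRAM is tight for 4K rendering: one 4K frame held as
# RGBA with 16 bits per channel, times an assumed processing queue.
W, H = 3840, 2160
BYTES_PER_PIXEL = 4 * 2              # 4 channels x 2 bytes (FP16)

frame_bytes = W * H * BYTES_PER_PIXEL
queued_frames = 16                   # assumed depth across decode/render/present

frame_mib = frame_bytes / 2**20
queue_mib = queued_frames * frame_mib
print(f"one 4K frame: {frame_mib:.1f} MiB")                 # ~63 MiB
print(f"{queued_frames}-frame queue: {queue_mib:.0f} MiB")  # ~1 GiB
```

That is roughly half of a 2GB card gone before counting scaler textures, lookup tables, the OSD, and whatever the OS and other apps are holding.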
 

Registered · 3,565 Posts
What video card could I throw in my current HTPC (ASRock Z97M Pro4 with an i5-4460 Haswell and 16GB of DDR3) to at least take advantage of as much 4K as my Denon 4520 will handle, until the HDR hardware comes out down the road? A GTX 1050 that's powered off the PCIe slot? Or?
A video card upgrade on older hardware won't really help you in the long run. The new 4K decoding methods coming soon are very hardware-intensive, so you'd need to build a new machine to fully exploit a new video card. Imagine madVR running in the GPU itself and you'd get an idea of what is coming down the pipeline from Nvidia and AMD.

4K content on the PC is more or less limited to porn and games at this time and that won't change until iTunes or someone else rolls out 4K downloads, which won't happen any time soon. There's also a new 4K-friendly video codec in development that is supposed to be adopted by pretty much everyone, and it will be around two years before you see that being decoded by any GPU.

I've seen the UHD-BD on PC stuff with my own eyes, and it's not worth the effort or expense to bother building a computer for, let alone upgrading anything. At this point, you're better off just sticking with HD stuff on your PC and leaving the 4K stuff to an Android media box and UHD-BD player.
 

Registered · 3,565 Posts
Exactly, and that's why, personally, I chose the interim solution of HDR at 1080p. Maybe Sony has the right outlook that people care about HDR more than 4K, which is why Sony introduced HDR on a 1080p TV at CES 2017.

For those who want to buy an HDMI 2.0x/DP 1.4 card to output 4K HDR, please be aware of the warning from @madshi and @Nevcairiel (the developers of madVR and LAV Filters): stay away from 2GB VRAM cards, because they don't have enough memory for 4K.
Sony didn't do that at CES. They did it on the sly, quietly releasing those TVs in international markets - Europe and Asia - where Full HD and Blu-ray are still the mainstays but the PS4 and Sony's brand are very popular. They won't sell them in the Americas - North or South - because they want everyone to buy UHD TVs and UHD-BDs, and Sony has been eclipsed here by other brands, namely Samsung and Vizio.
 

Registered · 858 Posts
4K content on the PC is more or less limited to porn and games at this time
There is 4K porn? :eek: I finally found a great use for my new 4K-ready PC. I mean 4K games, of course. ;)

leaving the 4K stuff to an Android media box and UHD-BD player
I reached the same conclusion: an Nvidia Shield or Zidoo X9S for an Android media box. Standalone UHD-BD players like the Samsung UBD-M8500 are good.
 

Registered · 858 Posts
For example, the AMD Radeon RX 400 series technically fakes HDR: output over HDMI is Y'CbCr 4:2:2 when using HDR, which means you are technically getting 8-bit instead of 10-bit. For the AMD Radeon RX 400 and 500 series, you have to use DisplayPort for 4K HDR.
Could you share the reference for this claim? Thanks.

The HDMI 2.0 standard does not allow RGB or YCbCr 4:4:4 at 10-bit 4K@60Hz, so it is not just AMD; it is an HDMI 2.0 bandwidth issue.
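The arithmetic behind that limit, using the standard CTA-861 4K@60 timing (4400x2250 total pixels including blanking, a 594 MHz pixel clock) and HDMI 2.0's 18 Gb/s TMDS rate less 8b/10b coding overhead:

```python
# HDMI 2.0 bandwidth check for 4K@60: 4:4:4 fits at 8-bit but not at 10-bit.
PIXEL_CLOCK_HZ = 4400 * 2250 * 60   # 594 MHz, including blanking (CTA-861)
HDMI20_DATA_GBPS = 18 * 8 / 10      # 18 Gb/s TMDS -> 14.4 Gb/s payload

def required_gbps(bits_per_pixel):
    """Data rate needed at the given total bits per pixel, in Gb/s."""
    return PIXEL_CLOCK_HZ * bits_per_pixel / 1e9

print(required_gbps(24))  # 8-bit 4:4:4:  14.256 Gb/s -> fits under 14.4
print(required_gbps(30))  # 10-bit 4:4:4: 17.82 Gb/s  -> exceeds 14.4
# 12-bit 4:2:2 also packs into 24 bits/pixel, which is why HDR over
# HDMI 2.0 falls back to chroma subsampling.
```

So no vendor can do 10-bit 4:4:4 at 4K@60 over HDMI 2.0; DisplayPort 1.4's higher bandwidth is what makes full 4K HDR possible there.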
 

Registered · 23,130 Posts
I reached the same conclusion: an Nvidia Shield or Zidoo X9S for an Android media box. Standalone UHD-BD players like the Samsung UBD-M8500 are good.
Don't waste your money on the Samsung; it is not a good player and has a number of issues if you research it.
 

Registered · 858 Posts
Don't waste your money on the Samsung; it is not a good player and has a number of issues if you research it.
Really? I don't have personal experience with the new model, but I have been using the first-gen Samsung K8500 UHD disc player for a year, and I'm happy to say I've had no issues with it.
 