Originally Posted by Mount81
Why did you state that Netflix streaming quality is far better on those listed media devices than on a capable HTPC? What would be the reason for this?
And why do you think that HDMI 2.1 would be so important or so much better than HDMI 2.0a for any HTPC purpose?
There will be eARC, Variable Refresh Rate, and seamless frame-rate switching that I'm aware of; would these be the reasons? And besides, you would then also need an HDMI 2.1-capable AVR and display to gain any of these advantages.
(I'm not scoffing, just curious).
Probably what people mean is that an STB (set-top box) is better because it uses the correct color space for the content. Windows, for example, uses RGB. Windows can output luminance and chrominance (YCbCr), but this depends on the graphics driver, and the option is usually buried in the driver's settings.
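To make the color-space point concrete, here is a small illustrative sketch (my own, not from the post): a PC typically outputs full-range RGB (0-255 per channel), while most TVs and set-top boxes expect limited-range YCbCr (Y: 16-235, Cb/Cr: 16-240). The conversion below uses the standard BT.709 coefficients; if the display interprets the levels differently than the source sends them, you get the familiar washed-out or crushed-blacks picture.

```python
def rgb_to_ycbcr_bt709(r, g, b):
    """Full-range 8-bit RGB -> limited-range 8-bit YCbCr (BT.709)."""
    # Normalize to 0.0-1.0
    rn, gn, bn = r / 255.0, g / 255.0, b / 255.0
    # BT.709 luma coefficients
    y = 0.2126 * rn + 0.7152 * gn + 0.0722 * bn
    # Chroma difference signals, scaled per BT.709
    cb = (bn - y) / 1.8556
    cr = (rn - y) / 1.5748
    # Quantize to limited-range video levels
    y8 = round(16 + 219 * y)
    cb8 = round(128 + 224 * cb)
    cr8 = round(128 + 224 * cr)
    return y8, cb8, cr8

# RGB white maps to Y=235 (not 255) and black to Y=16 (not 0):
print(rgb_to_ycbcr_bt709(255, 255, 255))  # (235, 128, 128)
print(rgb_to_ycbcr_bt709(0, 0, 0))        # (16, 128, 128)
```

This is why the "RGB full vs. YCbCr limited" mismatch matters: the same pixel values mean different brightness levels depending on which convention the sink assumes.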
People don't want to do research. Here is what I have found:
* Intel processors based on Kaby Lake and newer are still hardware-limited to HDMI 1.4
* Motherboard manufacturers that advertise HDMI 2.0 on Intel boards (200-series chipset or above) use a DisplayPort-to-HDMI converter chip
* A DisplayPort-to-HDMI converter may work with most TVs
* Stagnant CPU development from both Intel and AMD
It's best to test a UHD HTPC in a lab before putting it into production. UHD HTPC is still early-adoption technology, so stop whining. I have a 4K TV and a Radeon RX 580, and I still use 1080p. I prefer DVD over Blu-ray because DVD is easier to play back. What is your problem, people?
From my experience: use what works best, not what is "the best."