Originally Posted by MinnesotaMan
HDCP has nothing to do with image or PQ. It is the link between devices that encrypts the signal to make it unbreakable; i.e. protected. This was upgraded for the 4k era.
That's not "quite" true, and therein lies the confusion/problem. I did a BUNCH more research on this since the original post.
HDCP does affect the image/PQ, insofar as if any link in the "source to sink" chain (in this case, TiVo/Roku - AVR - display) does not support HDCP 2.2, a 4k signal cannot be passed and the source will usually down-convert to 1080p, as you stated. So the difference is a 1080p image vs. a 4k image.
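To make that "weakest link" idea concrete, here's a minimal sketch (my own illustration, not anything from a spec or from the other poster) of how the chain's lowest HDCP version ends up dictating what the source will output. The device names and version numbers are just placeholders:

```python
# Illustrative sketch of the HDCP "weakest link" rule: 4k generally needs
# HDCP 2.2 on every hop from source to sink; otherwise the source typically
# falls back to 1080p. Devices and versions below are hypothetical examples.

def negotiated_output(chain):
    """chain: list of (device_name, hdcp_version) from source to sink."""
    # The effective protection level is capped by the lowest-version link.
    weakest = min(version for _, version in chain)
    return "4k" if weakest >= 2.2 else "1080p (down-converted)"

# Example: a 2.2-capable streamer and display, but an older "4k capable"
# AVR that only does HDCP 2.0 in the middle -- the AVR is the weak link.
chain = [("Roku", 2.2), ("AVR", 2.0), ("Display", 2.2)]
print(negotiated_output(chain))  # -> "1080p (down-converted)"
```

That's exactly the situation with an older "4k capable" receiver in the signal path: everything else can be 2.2-compliant and you still won't get 4k through it.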
Just a few years ago, many of us spent the extra money to "future-proof" and buy "4k capable" AVRs (which were HDCP 2.0), only to now have that equipment unable to actually pass a 4k signal.
You see, my AVR's specs show that it "passes a 4k signal" via HDCP 2.0, which, AT THE TIME it was manufactured, was technically true. HOWEVER, with the move to HDCP 2.2, that AVR (and ALL of the hardware manufactured in that "early adopter" period) is now "obsolete," since it cannot be upgraded to 2.2.
I feel like what this really represents is a "scam" by the industry to coerce people into upgrading their AVRs on a much faster cycle. My AVR is only 3-4 years old and works perfectly with this one exception, and I would otherwise have no reason to replace it. (Granted, there are a few other new things, like Atmos, as well, but that's fairly minor.)
The "justification" for this change was to appease the movie studios by "preventing" piracy, except we know that any serious pirate can easily obtain the keys to hack/break these schemes. It's all really just BS to "pretend" things are protected (hell, look at what just happened to Equifax; cybersecurity is just a joke).
But it's a bit misleading, and even a little bit of bait and switch ("here, upgrade to this new device, which really won't do what it says it will by the time you get it"), to pretend that this is all being done for some noble reason like copy protection.
As you say, 1080p is probably "adequate" for 90% of what most folks watch anyway, since there is still not a lot of 4k content available (at least for streaming, which is pretty much my only option for now).
The only real "losers" here are all the early adopters and those buying high-end stuff, only to learn that much of it is obsolete even before it's out of the box.
Since the 5040 does not support true 4k anyway, it seems pretty hard to justify spending $400-$1,500 for a new AVR just to watch one or two shows in faux-K HDR (although I have ceiling speakers that have just been waiting for Atmos/DTS:X, which makes it pretty tempting.... once the wife gets over the cost of the new PJ).