The fact is VBR ~320 kbps MP3s are absolutely indistinguishable (except perhaps by a few "golden ears") from CDs, which use roughly 4x that data rate. With something like AAC, you can get closer to 8x compression and achieve the same thing.
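To put rough numbers on that (a quick sketch: the CD rate is standard 44.1 kHz / 16-bit stereo PCM, and the ~176 kbps AAC figure is just an assumed rate that lands near 8x):

```python
# Rough audio compression ratios, assuming CD audio at 44.1 kHz, 16-bit, stereo.
cd_kbps = 44_100 * 16 * 2 / 1000          # ~1411 kbps uncompressed CD audio

mp3_kbps = 320                            # high-quality VBR MP3, average rate
aac_kbps = 176                            # assumed AAC rate that lands near 8x

print(f"CD audio:       {cd_kbps:.0f} kbps")
print(f"MP3 @ {mp3_kbps} kbps: {cd_kbps / mp3_kbps:.1f}x compression")
print(f"AAC @ {aac_kbps} kbps: {cd_kbps / aac_kbps:.1f}x compression")
```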
That comparison points out that compression efficiency is not at all uniform across codecs. And with video -- especially H.265/HEVC -- I suspect that 20x compression will be very, very hard to tell apart from the uncompressed 4K originals. But you will never see 20x compression. Today's uncompressed 2K is 1.5 Gbps, so an uncompressed 4K signal would be around 6 Gbps. Apply 20x compression and you are talking 300 Mbps -- a consumer product that will never, ever exist anywhere, period.
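The bit-rate arithmetic behind those figures, as a quick sketch (assuming 4K is simply 4x the 1.5 Gbps uncompressed 2K rate quoted above):

```python
# Bit-rate arithmetic from the paragraph above, assuming uncompressed 2K ~= 1.5 Gbps
# and that 4K carries 4x the pixel data.
uncompressed_2k_gbps = 1.5
uncompressed_4k_gbps = uncompressed_2k_gbps * 4      # ~6 Gbps

compression = 20
compressed_4k_mbps = uncompressed_4k_gbps * 1000 / compression

print(f"Uncompressed 4K:         ~{uncompressed_4k_gbps:.0f} Gbps")
print(f"At {compression}x compression: ~{compressed_4k_mbps:.0f} Mbps")   # ~300 Mbps
```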
If we get a "SuperBluRay", we will likely see a bursty bit rate around 75 Mbps, which should look freaking awesome. If broadcast is done at 20 Mbps, it will also look pretty good.
What appears likely, however, is that there will be efforts to squeeze 4K into 30 Mbps on BluRay discs and into 10 Mbps on broadcast. While these will have some attributes that exceed the quality of existing 2K images, they will also be worse in some significant ways. If that happens, actually selling 4K on quality will be nigh impossible. Marketing it and giving it away, however, will remain very plausible.
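For scale, here is what those delivery rates imply as compression ratios against a ~6 Gbps uncompressed 4K signal (a quick sketch; the scenario labels are mine):

```python
# Implied compression ratios against a ~6 Gbps uncompressed 4K signal,
# using the delivery bit rates discussed above.
uncompressed_4k_mbps = 6000

scenarios_mbps = {
    "SuperBluRay (bursty)": 75,
    "High-end broadcast":   20,
    "Squeezed BluRay":      30,
    "Squeezed broadcast":   10,
}

for name, rate in scenarios_mbps.items():
    print(f"{name:22s} {rate:3d} Mbps -> ~{uncompressed_4k_mbps / rate:.0f}x compression")
```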
(EDIT: This post originally contained a typo, caught by vtms and now corrected).
There is no difference between HDMI cables. If you can see the picture without visible dropouts or sparklies, the cable is working at 100%. No other cable will display a better version of that picture. You're simply wrong if you think there is a better digital cable than one that is already working.