The new codec is roughly 2x as efficient, but 4K has roughly 4x the pixels of 1080p, so even with H.265 you still need about twice the bits for equivalent quality. That means you cannot maintain bitrate equivalence on a dual-layer Blu-ray for 4K. It's not possible. You can bit-starve the encode and squeeze 4K onto the disc, but that will diminish the result. Realistic 4K discs need both H.265 and more layers. You can make bad 4K files in 8 GB, the same way you can get a 4 GB 1080p movie from iTunes today. But what the hell is the point of 4K then? Or even of Blu-ray? If that's the future, we ought to just push for 4 GB H.265 1080p files and upscale them to reduce aliasing. Screw ultimate quality; yes to low bandwidth.
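The capacity argument is easy to check with back-of-the-envelope numbers. A sketch, assuming figures not stated in the post: a good 1080p Blu-ray transfer averaging ~30 Mbps H.264, a 2-hour feature, and a 50 GB dual-layer BD-50:

```python
# Rough disc-capacity math for 4K on dual-layer Blu-ray.
# All constants below are assumptions for illustration, not from the post.

MBPS_1080P_AVC = 30      # assumed average bitrate of a good 1080p H.264 disc
RUNTIME_S = 2 * 60 * 60  # assumed 2-hour feature
BD50_GB = 50             # dual-layer Blu-ray capacity (decimal GB, as discs are rated)

def size_gb(mbps, seconds=RUNTIME_S):
    """Video stream size in GB for a given average bitrate."""
    return mbps * seconds / 8 / 1000  # megabits -> megabytes -> gigabytes

# 4K has ~4x the pixels of 1080p, so bitrate-equivalent H.264 needs ~4x the bits.
avc_4k_gb = size_gb(4 * MBPS_1080P_AVC)   # 120 Mbps over 2 h = 108 GB

# H.265 is ~2x as efficient, so it needs only ~2x the 1080p bitrate.
hevc_4k_gb = size_gb(2 * MBPS_1080P_AVC)  # 60 Mbps over 2 h = 54 GB

print(f"4K H.264: {avc_4k_gb:.0f} GB")    # ~2x over BD-50
print(f"4K H.265: {hevc_4k_gb:.0f} GB")   # still over BD-50, before audio/extras
print(f"BD-50 capacity: {BD50_GB} GB")
```

Even with H.265's efficiency gain, the video stream alone overshoots a BD-50, and that's before lossless audio and extras, which is why more layers are needed and not just a better codec.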
There is no difference between HDMI cables. HDMI carries digital data, so either the bits arrive intact or you get visible errors. If you can see the picture without dropouts or sparklies, the cable is working at 100%. No other cable will display a better version of that picture. You're simply wrong if you think there is a better digital cable than one that is already working.