Does anyone know of a systematic evaluation of the resolution/motion-artifact differences between the various compression ratios? Variable vs. constant bit rate? Low average with high peaks, etc.?
I will probably be buying a Sony BDPS350 Blu-ray player next week [they just came down $100 to $299! Sweet!]. I already have several Blu-ray resolution test discs and will attempt my own comparisons if no one else has.
8.8 Mbps CBR lets me get 2 hours of 1080i onto a DVD+R DL disc, and it seems pretty sharp and artifact-free, except for explosions of red fire, which break up into tiles/macroblocking (whatever you want to call it). Livable, but I'd like to know exactly where the sweet spots are in terms of quality vs. time, going all the way down to what is said to be the bare minimum for HD, 5 Mbps.
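For what it's worth, the capacity math behind that 2-hour figure is simple: disc capacity in bits divided by total bit rate gives runtime. Here's a quick back-of-the-envelope sketch in Python; the 8.5 GB DL capacity and the 0.5 Mbps audio/overhead allowance are my own assumptions, not anything official, so adjust to taste:

```python
# Rough runtime-vs-bitrate estimate for CBR video on optical media.
# Assumptions (mine, not from this thread): 8.5 GB DVD+R DL capacity,
# ~0.5 Mbps set aside for audio and container/muxing overhead.

DL_CAPACITY_GB = 8.5   # DVD+R DL, decimal gigabytes
OVERHEAD_MBPS = 0.5    # assumed audio + overhead allowance

def runtime_minutes(video_mbps: float) -> float:
    """Minutes of video that fit at a given CBR video bit rate."""
    capacity_mbits = DL_CAPACITY_GB * 8000  # GB -> megabits (decimal)
    total_mbps = video_mbps + OVERHEAD_MBPS
    return capacity_mbits / total_mbps / 60

for mbps in (5.0, 6.5, 8.8, 12.0):
    print(f"{mbps:4.1f} Mbps video -> ~{runtime_minutes(mbps):5.1f} min")
```

At 8.8 Mbps video plus that assumed overhead, it works out to roughly 122 minutes, which lines up with the 2 hours I'm getting; dropping to 5 Mbps would stretch that past 3 hours, if the quality holds up.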
In A/V reproduction accuracy, there IS no concept of "accounting for personal taste/preference". As art consumers we don't "pick" the level of bass or the tint/brightness of a scene's sky, any more than we pick the ending of a novel or the type of Mona Lisa's smile. "High fidelity" means "high truthfulness": faithful to the original artist's intent, an unmodified, neutral, accurate copy of the original master, ideally exact and with no discernible alterations, aka "transparency".