Originally Posted by hrdlywrknj
The reason I ask is because I imagine people would be willing to sacrifice image quality and go with AVCHD,
You're not necessarily going to "sacrifice" image quality by going with AVCHD. I think that's leaning a bit too far toward the other extreme. The absolute WORST that I can say about AVCHD in terms of quality is that it does tend to break up a bit more on fast pans than HDV. But then you can adjust AVCHD bitrate over much wider swings without drastically affecting quality... which is not the case with HDV. Don't forget, though, that the compression scheme is only a small part of image quality. I would say the lens plays a much more important role.
HDV also works in editors at a much faster, more efficient level. On the other hand, HDV is restricted to 1440x1080 and ties you to tape. HDV has also pretty much maxed out in terms of growth; it will not advance any further than it is now.
AVCHD is a good format... albeit a new one, so there are complications in using and editing it. Right now it is not much more than a consumer-level standard... but that MAY change in time.
I'm not at all convinced, however, that AVCHD has gained in popularity because of the codec itself. I believe the manufacturers have used the new storage methods (HDD, flash, etc.) as a vehicle to introduce AVCHD; it would not have made such a big splash without the new storage.
As far as compression goes... there is a cap on AVCHD, and that is 24 Mb/s (High Profile). It's a rather severe restriction when you consider the incredible compression levels the underlying codec (H.264) can attain. In any reputable editor you can easily render out AVC/H.264 at a Blu-ray player's maximum (40 Mb/s for video, 48 Mb/s total stream)... but then on the other hand... we have a hard enough time editing AVCHD at 24 Mb/s as it is. I should point out, though, that it's not like HDV doesn't have restrictions either.
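To put those bitrates in perspective, here's a rough back-of-the-envelope recording-time calculation (plain Python; the 16 GB card size is just an example, and it ignores audio and container overhead):

```python
# Rough recording time at a given video bitrate.
def minutes_per_card(card_gb, mbps):
    # card_gb * 8000 megabits per decimal GB (how card makers count),
    # divided by bitrate gives seconds, then /60 for minutes
    return card_gb * 8000 / mbps / 60

print(round(minutes_per_card(16, 24)))  # AVCHD's 24 Mb/s cap: ~89 min
print(round(minutes_per_card(16, 40)))  # Blu-ray's 40 Mb/s video max: ~53 min
```

So the 24 Mb/s ceiling isn't only a quality restriction; it's also what keeps AVCHD recording times practical on flash media.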
Your idea that HDV needs to be "up-converted" to 1920x1080 for Blu-ray is also wrong. While it is true that 1440x1080 works out mathematically to 4:3, HDV uses rectangular (anamorphic) pixels instead of the square pixels of 1920x1080: the pixel aspect ratio is 1.333 instead of 1.0, so 1440 stored pixels cover the same width as 1920 square ones.
Blu-ray DOES accept this and DOES display 1440x1080 as real 16:9.
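A quick sanity check on that pixel math (just arithmetic in Python, nothing format-specific):

```python
# HDV stores 1440x1080 with anamorphic pixels (pixel aspect ratio 4:3):
# each stored pixel is displayed 1.333x as wide as it is tall.
storage_width = 1440
pixel_aspect_ratio = 4 / 3

display_width = storage_width * pixel_aspect_ratio
print(display_width)          # 1920.0 - same display width as "full HD"
print(display_width / 1080)   # 1.777... i.e. 16:9
```

The player does this stretch on playback, which is why no resolution up-conversion is needed before authoring to Blu-ray.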