Originally Posted by lenkiatleong
Nevertheless, if Sony flagship for this category can only records in 1080i, i am sceptical that others are really recording in 1080p.
If they claim it, they're probably doing it. It just means they're not doing something else that the Sonys are doing - for example, face and smile detection, GPS registration, advanced electronic image stabilization, etc. Remember that all of this capture and processing happens in real time; the cams don't go back and post-process your clips to look good after the fact. So it's all about tradeoffs in computing power and memory management. Each cam has a target market, and each vendor has a philosophy about what sells.
But a question similar to yours might be: why did Sony choose not to burn CPU/GPU cycles, storage, and cost on 60p high-resolution recording, and instead put those resources into other features? Were they just dumb? Or do they have reason to believe that some things matter more than 60p to their target markets? Or do they figure this issue will simply go away in a few years as hardware improves, so they'd rather sell other kinds of features now instead of just pushing the technical limits?
Presumably all the vendors are looking for the marketing sweet spot and bending technology to fit that.
Two analogies that are also computer-based:
1. Artificial intelligence in games: I've read some great articles about how programmers do AI in real-time games. Basically, there's a budget for every cycle of computer time, and it has to be allocated among graphics, data management, I/O, and so on - including the cycles used to decide what the computer will do in the game beyond responding to user inputs. This AI budget is almost always incredibly tight, so the quality of the AI often hinges on how many cycles elsewhere can be freed up for it. When you complain about dumb AI, the root cause is often that the hardware just isn't powerful enough yet to give the AI budget enough headroom.
2. Scanning of photographs: the first photos I scanned in the early 1990s were scanned at 150 dots per inch, I believe. That was about the maximum the scanners supported, and the files already contained at least twice as much data as could actually be shown on a PC monitor. The output device lagged behind the capture device (monitors averaged around 72 dpi at the time), and printer output was similarly primitive by today's standards. So there was virtually no market for high-resolution scanners, because you couldn't do anything with the output anyway. As the dpi you could route to an output device climbed sharply, so did the resolution even an average scanner could capture, and prices kept dropping as the capabilities increased.
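The game-AI analogy can be sketched in a few lines. This is a purely hypothetical illustration - the frame budget, subsystem costs, and quality tiers are all made-up numbers, not taken from any real engine - but it shows the core point: the AI only gets whatever milliseconds the other subsystems leave behind.

```python
# Hypothetical sketch of a per-frame compute budget in a real-time game.
# All numbers are invented for illustration.

FRAME_BUDGET_MS = 16.7  # one frame at roughly 60 fps

def ai_budget(render_ms, physics_ms, io_ms):
    """Whatever is left after the fixed subsystems is all the AI gets."""
    return FRAME_BUDGET_MS - (render_ms + physics_ms + io_ms)

def ai_quality(budget_ms):
    """Crude tiers: more spare milliseconds buy deeper planning."""
    if budget_ms < 1.0:
        return "reactive only"   # canned responses, no planning
    elif budget_ms < 4.0:
        return "shallow search"  # a little lookahead
    else:
        return "deep planning"   # pathfinding plus tactics

# A graphics-heavy frame leaves the AI almost nothing:
print(ai_quality(ai_budget(render_ms=12.0, physics_ms=3.0, io_ms=1.2)))  # reactive only

# Free up rendering cycles and the same AI code can think harder:
print(ai_quality(ai_budget(render_ms=9.5, physics_ms=3.0, io_ms=1.5)))   # shallow search
```

The camcorder tradeoff is the same shape: burn the budget on 60p encoding and there's nothing left for face detection, or vice versa.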
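The scanning arithmetic is easy to check. Assuming a 4x6-inch print and a typical 640x480 VGA display of that era (illustrative figures, not from the original post), the pixel counts work out like this - and since file size also depends on bit depth, the data gap was even larger than the raw pixel ratio suggests:

```python
# Rough arithmetic behind the scanning analogy (illustrative numbers only).

def scan_pixels(width_in, height_in, dpi):
    """Total pixels produced by scanning a print of the given size at the given dpi."""
    return int(width_in * dpi) * int(height_in * dpi)

scanned = scan_pixels(4, 6, 150)  # 600 x 900 = 540,000 pixels
monitor = 640 * 480               # a typical VGA display: 307,200 pixels

print(scanned / monitor)  # ~1.76 - more pixels than the whole screen could show at once
```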
I'd bet we're in the same world with these digital camcorders - they are, in fact, digital: their storage is digital, their processing is digital, their output is digital. So the features a vendor chooses to implement today will seem primitive 10 years from now, and every vendor must have a lot of fun figuring out what to implement in what order. If you build genuinely great resolution but no one can use it, you'd better have superb salesmen, because you're really not giving people what they want or can use today. The two places where I suspect that's happening now are high-resolution 60p capture and the bitrates used. The measure of a cam, for you, is whether it does what you need and can use. You might factor in a bit of future-proofing, even, and buy HD now though you have no HD displays in your home. But there's a point where highly touted features are exactly what people claim they are, yet not worth paying for if you can't take advantage of them.
So if you buy a cam because of marketing claims, without really knowing whether they're something you can touch and feel and use, I think you've been fooled somewhat.
The bad news is that, since these cams are now largely computers, it's a good idea to dig into their specs and uses before buying, and many people don't have time for that. The other thing to watch is advancement in the sensors, their resolutions, the optics, and so forth - but I think those will come less frequently than the annual processor-based feature improvements.
The good news is that every year will yield better cams, and it won't be long before you can fix a buying mistake, if you make one, at a lower price than you paid before. It may take a few years, but cams aren't going to sit still anymore.