Originally Posted by MovieSwede
Yes because we all know how important megapixels are to PQ.
Some people do understand how important megapixels are for image quality, but some clueless people obviously don't.
The "2K is good enough" argument is like saying "70mm film 50 years ago was a useless mistake, and so is IMAX".
I don't think anybody would agree with that.
First of all, a single-chip CMOS camera with a 5K sensor doesn't have true 5K resolution, because a Bayer sensor uses only 50% of its photosites for green, 25% for red and 25% for blue. The data is then interpolated (demosaiced) to reach 5K.
But is that really an argument against 5K cameras compared to 2K cameras? By the same argument, a 2K camera ends up with closer to 1K of real resolution.
All the 5K+ cameras end up with more than 4K of measurable resolution, which is then subsampled to true 4K.
If more than four times the pixel count on screen, compared to what comes from a 2K camera, made no difference to image quality (remember, we are talking about large-screen projection), then real life would look as pixelated as the image on a movie screen.
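To make the Bayer arithmetic concrete, here is a toy sketch of the sample counts. The sensor dimensions are illustrative assumptions, not any particular camera's spec:

```python
# Toy arithmetic for a Bayer-pattern sensor: only half the photosites
# sample green, and a quarter each sample red and blue. The full pixel
# grid is then reconstructed by demosaicing (interpolation).

def bayer_samples(width, height):
    total = width * height
    return {
        "green": total // 2,  # 50% of photosites
        "red": total // 4,    # 25%
        "blue": total // 4,   # 25%
    }

# Illustrative "5K" sensor vs an illustrative "2K" sensor.
five_k = bayer_samples(5120, 2700)
two_k = bayer_samples(2048, 1080)

# The same discount applies to both cameras, so the 2K sensor holds
# far fewer real color samples than its nominal pixel count suggests.
print(five_k["green"], two_k["green"])
```

The point of the sketch is that the interpolation penalty is proportional, so it cannot favor the 2K camera.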
Also, image quality is more than just resolution. Several cameras have used CCD instead of CMOS. How do you judge PQ between different technologies? Color reproduction, signal-to-noise ratio, highlights, etc.
No camera manufacturer uses CCD any more for cameras where image quality is important. Even Sony, a long-time "CMOS basher", ditched CCD for its latest cinema camera. NHK's newest 8K broadcast camera is also CMOS.
CCD is dead in the world of high quality cameras. The reason is obvious.
As for actual shooting conditions: go out and buy a DSLR and take some pictures with it. Resolution will vary a lot from picture to picture, because it depends heavily on your lens and settings. So while you can in theory achieve 4K resolution, real-world shooting conditions can prevent you from taking full advantage of 4K.
I owned a 2-megapixel camera in the '90s and now own an 18-megapixel camera, with both film cameras and other digital cameras in between, and I have scanned a lot of photographic film. I can still access my 2-megapixel photos from the '90s.
Even if higher-megapixel photos can only be viewed downsampled to 2 MP (on a PC/TV monitor), I promise you that higher megapixels win on every parameter: color reproduction, detail, sharpness, and flexibility when you want to make adjustments to the image.
The 2 MP camera image falls apart as soon as any adjustments are applied.
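One reason the higher-megapixel file holds up better even when viewed at 2 MP is oversampling: averaging many noisy photosites down to one display pixel suppresses the noise. Here is a toy stdlib sketch of that effect; the Gaussian noise model and the 9:1 sample ratio are assumptions for illustration only:

```python
import random
import statistics

random.seed(42)

# Simulate one "true" gray value observed with per-photosite noise.
true_value = 0.5
noise_sigma = 0.1

def downsampled_pixel(samples_per_pixel):
    # Average several noisy sensor samples into one display pixel,
    # e.g. roughly 9 photosites of an 18 MP image per 2 MP pixel.
    samples = [true_value + random.gauss(0, noise_sigma)
               for _ in range(samples_per_pixel)]
    return sum(samples) / len(samples)

# Compare the noise of native pixels vs 9:1 downsampled pixels.
native = [downsampled_pixel(1) for _ in range(10000)]
averaged = [downsampled_pixel(9) for _ in range(10000)]

# Averaging 9 samples shrinks the noise by roughly sqrt(9) = 3x.
print(statistics.stdev(native), statistics.stdev(averaged))
```

That extra signal-to-noise headroom is exactly what lets the downsampled file survive aggressive adjustments that make a native 2 MP file fall apart.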
The only people still using 2 MP cameras today are the people who make images for big-screen display: movie makers and broadcasters.
Everybody else, people who usually only see their images at small sizes, from high-fashion and advertising photographers to people with point-and-shoot cameras and camera phones, shoots at higher megapixel counts than the "big-screen display" shooters.
That shows something is very wrong in the world of movie making.
So in the end you have several cameras to choose between. They all have their pros and cons. The one you should use for your project is the one that best creates the look you want; it doesn't matter whether the camera is 2K, 4K or even 16K. If it can't produce the look in 2K, it won't produce it in 4K either.
Of course, if you think 2K is the "end of the road", then why use 2K?
0.5K should be as good if that kind of argument had any basis in reality.
If you think that 2K can produce the same look as 4K, it must be because you "dumb down" the 4K look until it looks like 2K.
The versatility and flexibility to create something in 4K that you can't produce in 2K is real: the 2K image will fall apart long before the 4K look is achieved.
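One concrete flavor of that flexibility is reframing: a 4K master lets you punch in up to 2x in post and still deliver a full-resolution 2K frame, while a 2K master has no such headroom. A minimal sketch, assuming DCI container widths (4096 and 2048 px):

```python
# Maximum punch-in (crop) factor while still filling the delivery
# frame at native resolution. Widths assume DCI containers:
# 4K = 4096 px wide, 2K = 2048 px wide.

def max_crop_factor(source_width, delivery_width):
    return source_width / delivery_width

print(max_crop_factor(4096, 2048))  # 2.0 -> up to 2x reframe in post
print(max_crop_factor(2048, 2048))  # 1.0 -> no reframing headroom
```

Any crop of a 2K source forces upscaling, which is one way the 2K image "falls apart" first.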
iPhone4 screen is 329 PPI
50" HDTV is 44 PPI
50" 4K TV is 88 PPI
a 15 meter wide cinema screen in 2K (2048 px wide) is about 3.5 PPI
a 15 meter wide cinema screen in 4K (4096 px wide) is about 6.9 PPI
If that difference in pixels per inch doesn't demonstrate the easily understandable impact that resolution (MP/PPI) has on image quality, nothing will.
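The list above is simple arithmetic: PPI = pixels / inches, measured along the diagonal for the phone and TVs and along the width for the cinema screen. A quick sketch re-deriving the figures, assuming the usual resolutions (iPhone 4 at 960x640 on 3.5 in, 1080p and 2160p TVs, DCI 2K/4K widths of 2048/4096 px):

```python
import math

METERS_TO_INCHES = 39.3701

def diagonal_ppi(w_px, h_px, diagonal_in):
    # Pixels along the diagonal divided by the diagonal in inches.
    return math.hypot(w_px, h_px) / diagonal_in

def width_ppi(w_px, width_m):
    # Horizontal pixels divided by the screen width in inches.
    return w_px / (width_m * METERS_TO_INCHES)

print(round(diagonal_ppi(960, 640, 3.5)))   # 330 (iPhone 4; Apple quotes 326)
print(round(diagonal_ppi(1920, 1080, 50)))  # 44  (50" HDTV)
print(round(diagonal_ppi(3840, 2160, 50)))  # 88  (50" 4K TV)
print(round(width_ppi(2048, 15), 1))        # 3.5 (15 m screen, 2K)
print(round(width_ppi(4096, 15), 1))        # 6.9 (15 m screen, 4K)
```

The exact cinema figures shift a little with the assumed image width, but the 2x gap between 2K and 4K does not.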
Ten years from now the discussion will be all about whether 8K is good enough, and using 2K for movie making will be seen as the big historical mistake it was.
But let's not derail this thread further, this argument (2K vs. 4K) is discussed many other places on the forum.
Part of this "Hobbit" thread argues HFR vs. 24 fps.
Watch the first part of this video, and the brief split-screen demo comparing an IMAX 70mm film projection with an 11K scan of the same film downsampled to 4K and projected digitally side by side. Both are still 24 fps, but it clearly demonstrates what your brain has to "hide" in a film projection compared to the digital version at a higher refresh rate, and how this will improve further with higher frame rates.