4k by 2k or Quad HD...lots of rumors? thoughts? - Page 31 - AVS Forum
post #901 of 3692 Old 02-21-2012, 04:27 PM
AVS Addicted Member
 
rogo's Avatar
 
Join Date: Dec 1999
Location: Sequoia, CA
Posts: 30,255
Mentioned: 1 Post(s)
Tagged: 0 Thread(s)
Quoted: 200 Post(s)
Liked: 545
Quote:
Originally Posted by irkuck View Post

The only debatable point is whether 4K is a must for the cognitive algorithms. I can accept that it is a must due to some peculiarities, but I doubt it. Another issue is that, at least with present algorithms, they are not infallible, meaning that processing artefacts occasionally show up. This leads to a situation common today: there are tons of picture-processing functions in TVs, but the general advice is to turn them off for the highest PQ. Of course it is not excluded that the I-cubed processing is so refined that it is always better than non-I-cubed 4K. Then the merits of 4K would be attributable to the I-cubed rather than to 4K per se.

For people following this lengthy debate/discussion, I actually agree with you. We don't know. It might be that this is all achievable with a 2K display. I doubt it is. You think it might be. Regardless, it appears this is one of the best upgrades in apparent resolution we've seen in a long time.

The sooner we can buy it and people get to see it, the better.

There is no difference in HDMI cables. If you can see the picture without visible dropouts or sparklies, the cable is working at 100%. No other cable will display a better version of that picture. You're simply wrong if you think there is a better digital cable than one that is already working.
rogo is offline  
post #902 of 3692 Old 02-21-2012, 07:22 PM
 
Auditor55's Avatar
 
Join Date: Jan 2002
Location: Silicon Valley, CA.
Posts: 8,795
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 20
Quote:
Originally Posted by sstephen View Post

I don't know how anyone thinks they can come to any useful conclusion about 4K vs. 2K when that conclusion is based on comparing a 4K still of a girl in front of bamboo against a single shot from an action movie with a fair amount of CGI, which was then photographed off a display where the image doesn't fill the full frame, downrezzed to 1024 pixels wide, and finally judged on your computer monitor.

4K belongs in the cinema and maybe for some home front projection system. But not for the flat panel TV market.
Auditor55 is offline  
post #903 of 3692 Old 02-21-2012, 07:33 PM
AVS Special Member
 
Joe Bloggs's Avatar
 
Join Date: Apr 2007
Location: UK
Posts: 2,556
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 130 Post(s)
Liked: 56
Quote:
Originally Posted by Auditor55 View Post

4K belongs in the cinema and maybe for some home front projection system. But not for the flat panel TV market.

Couldn't it be better quality / more accurate on a flat panel or FOLED screen than a front projection screen?
Joe Bloggs is offline  
post #904 of 3692 Old 02-21-2012, 08:42 PM
AVS Special Member
 
specuvestor's Avatar
 
Join Date: Jun 2010
Posts: 2,399
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 12
Quote:
Originally Posted by Airion View Post

No argument here then, native 1080p looks much sharper on my 1080p projector than on my 720p one and I think native 4k will look perfectly sharp on a 4k display.

However, that just doesn't seem to be the case with upscaling. While my 720p projector has larger pixels, it appears sharper than feeding the same 720p image to my 1080p projector. It makes sense to me that upscaled 4K will likewise be softer.

Yes, I don't think we disagree. Finer pixel size makes images "softer" or more "analog" because there are fewer jaggies. It is an erroneous corollary to conclude that larger pixel size therefore means sharper. You may have missed my contention. I disagree with the following quote:

Quote:
Originally Posted by amirm View Post

You talked about perceptual effects. That is why I said upsampling actually makes the image softer. It does that because larger pixels tend to accentuate image sharpness. Smaller pixels do less of this and hence, serve to make the image softer. This is not related to resampling but the nature of how the pixels are presented. Remember: your video was mastered on a 1080p display. It was sharpened for that. If you go to a finer geometry, then you are seeing a softer image than the talent approved.

If you are not able to see the pixels, you don't have a problem that needs fixing with 4k.

I have also seen 1080p on a 103" plasma and it doesn't accentuate sharpness either. And I wonder who actually sees pixels on their display. If you do, either your viewing distance is too near or your source/equipment is faulty. It's hard enough to see dithering beyond 3'. Something is wrong somewhere.

Quote:
Originally Posted by Joe Bloggs View Post

The one on the left is a simulation of native on an LCD TV, and the one on the right is a simulation of the same source on an LCD TV of the same size, with 4x the number of pixels (no special scaling - each source pixel just duplicated). I think the 2nd one is probably more accurate.

Sigh... let me reiterate:
Quote:
Originally Posted by specuvestor View Post

I don't understand why in this thread we keep making strawman out of this simple logic:
VCD < DVD < DVD upsampling < Blu-ray < Blu-ray upsampling < 4K
Why is the argument always on Blu Ray upsampling will not be as good as 4k native? Is that a revelation?

That's what we're saying: it reduces jaggies, and you can improve on gradation instead of just duplicating. But native to native is still best, i.e. a 4K source on a 4K TV rather than upscaling.

But this is a really bad example anyway. Like I said... we don't even see pixels, let alone SUB pixels.
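To make the duplication-versus-gradation point concrete, here is a minimal one-dimensional sketch in Python (an illustration of the general idea, not any actual TV's scaler): plain pixel duplication keeps a hard edge at full amplitude, while linear interpolation inserts intermediate gradation levels and so reads softer.

```python
def duplicate_2x(row):
    """Nearest-neighbour 2x upscale: each source pixel is simply repeated."""
    out = []
    for p in row:
        out.extend([p, p])
    return out

def linear_2x(row):
    """Linear-interpolated 2x upscale: a new sample is placed halfway
    between each pair of neighbours, adding gradation steps."""
    out = []
    for i, p in enumerate(row):
        out.append(p)
        nxt = row[i + 1] if i + 1 < len(row) else p
        out.append((p + nxt) / 2)
    return out

edge = [0, 0, 255, 255]      # a hard one-dimensional edge
hard = duplicate_2x(edge)    # [0, 0, 0, 0, 255, 255, 255, 255]
soft = linear_2x(edge)       # [0, 0, 0, 127.5, 255, 255, 255, 255]
```

The duplicated edge keeps its full-amplitude step (maximum apparent "sharpness"), while the interpolated one trades that step for a smoother ramp: exactly the softer look discussed above.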
specuvestor is offline  
post #905 of 3692 Old 02-21-2012, 08:53 PM
AVS Special Member
 
Joe Bloggs's Avatar
 
Join Date: Apr 2007
Location: UK
Posts: 2,556
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 130 Post(s)
Liked: 56
Quote:
Originally Posted by specuvestor View Post

But native to native is still best, i.e. a 4K source on a 4K TV rather than upscaling.

When you said "native to native is always the best" I thought you meant that 1080p content on a 1080p TV would be better than 1080p content displayed on a 3840x2160 TV.
1080p content on a 3840x2160 TV would not be native
1080p content on a 1080p TV would be.

So if native 1080p content is displayed better on a 4K TV than on a native 1080p TV (which is what you are showing in your other quote), the statement "native to native is always best" is false.
i.e. "native to native is always the best" means a source of one resolution is best displayed on a TV with the same resolution, which is a false statement if 1080p content is displayed better on a 4K TV than on a 1080p TV.
Joe Bloggs is offline  
post #906 of 3692 Old 02-21-2012, 09:15 PM
AVS Special Member
 
saprano's Avatar
 
Join Date: Oct 2007
Location: Bronx NY
Posts: 3,396
Mentioned: 3 Post(s)
Tagged: 0 Thread(s)
Quoted: 99 Post(s)
Liked: 249
Quote:
Originally Posted by rogo View Post

For people following this lengthy debate/discussion, I actually agree with you. We don't know. It might be this is all achievable with a 2k display. I doubt it is. You think it might be. Regardless, it appears this is one of the best upgrades we've seen in apparent resolution in a long time.

The sooner we can buy it and people get to see it, the better.

Can you explain this further? I don't understand how you can say this when BD resolution shows how bad SD was. While I'm sure 4K will be better in some ways, we're not going to get the same effect.

And isn't the Sharp I-cubed engine, whatever the heck it's called, exclusive to them? (I think that's what you were referring to)

home theater addict
saprano is offline  
post #907 of 3692 Old 02-21-2012, 09:30 PM
AVS Special Member
 
Richard Paul's Avatar
 
Join Date: Sep 2004
Posts: 6,959
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 3 Post(s)
Liked: 29
For anyone interested, Sony released a document on 4K for cinema. Something it notes on page 4 is that NHK did research in which people could tell the difference between pictures at 312 pixels per degree and 156 pixels per degree. That is far beyond the 60 pixels per degree used as the standard for human visual acuity with 20/20 vision. Of course human visual acuity is complicated and even more so when you bring in the issue of hyperacuity. There are tests on this website from Professor Michael Bach that cover both visual acuity and hyperacuity.
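As a rough sanity check on those pixels-per-degree figures, here is a small Python sketch (my own back-of-the-envelope geometry, not from the Sony document) converting a display's horizontal resolution and viewing distance into pixels per degree of visual angle:

```python
import math

def pixels_per_degree(h_pixels, screen_width, distance):
    """Pixels subtended per degree of visual angle at the screen centre.

    screen_width and distance must be in the same units (e.g. both
    measured in screen widths, or both in metres).
    """
    # Linear width covered by one degree of visual angle at this distance.
    one_degree_width = 2 * distance * math.tan(math.radians(0.5))
    return (h_pixels / screen_width) * one_degree_width

# A 1920-pixel-wide display viewed from 3 screen widths away is already
# around 100 px/deg, above the 60 px/deg 20/20 benchmark; 3840 doubles that.
ppd_1080 = pixels_per_degree(1920, 1.0, 3.0)
ppd_4k = pixels_per_degree(3840, 1.0, 3.0)
```

This small-angle model is only valid near the screen centre, but it shows why the hyperacuity thresholds in the NHK research sit so far above ordinary acuity-based targets.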


Quote:
Originally Posted by sytech View Post

Maybe they can fix it with a firmware update?

Considering the "not supported" wording in the Onkyo TX-NR616 datasheet my guess is that it is a hardware limitation. If it was a limitation that could be fixed with a firmware update I think Onkyo would have mentioned that.
Richard Paul is offline  
post #908 of 3692 Old 02-21-2012, 11:04 PM
AVS Addicted Member
 
rogo's Avatar
 
Join Date: Dec 1999
Location: Sequoia, CA
Posts: 30,255
Mentioned: 1 Post(s)
Tagged: 0 Thread(s)
Quoted: 200 Post(s)
Liked: 545
Quote:
Originally Posted by saprano View Post

Can you explain this further? I don't understand how you can say this when BD resolution shows how bad SD was. While I'm sure 4K will be better in some ways, we're not going to get the same effect.

And isn't the Sharp I-cubed engine, whatever the heck it's called, exclusive to them? (I think that's what you were referring to)

What do you want me to explain further? I'm happy to do it.

And yes, the iCube is exclusive to Sharp. The concept, however, is not.

There is no difference in HDMI cables. If you can see the picture without visible dropouts or sparklies, the cable is working at 100%. No other cable will display a better version of that picture. You're simply wrong if you think there is a better digital cable than one that is already working.
rogo is offline  
post #909 of 3692 Old 02-22-2012, 12:28 AM
AVS Special Member
 
irkuck's Avatar
 
Join Date: Dec 2001
Location: cyberspace
Posts: 3,502
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 35 Post(s)
Liked: 63
Quote:
Originally Posted by Richard Paul View Post

For anyone interested, Sony released a document on 4K for cinema. Something it notes on page 4 is that NHK did research in which people could tell the difference between pictures at 312 pixels per degree and 156 pixels per degree. That is far beyond the 60 pixels per degree used as the standard for human visual acuity with 20/20 vision.

This article has been referenced here numerous times since it is golden proof against 4K for the standard TV viewing scenario. You may wish to pay attention not to side notes but to its essence, illustrated in the figures: it shows that 4K does not make sense below 3 picture heights (PH). This comes from the main proponents of 4K, who cannot be suspected of trying to understate the merits of 4K.

Apart from this, you rightly notice:

Quote:
Originally Posted by Richard Paul View Post

Of course human visual acuity is complicated and even more so when you bring in the issue of hyperacuity. There are tests on this website from Professor Michael Bach that cover both visual acuity and hyperacuity.

....but you do not consider the specifics of the different scenarios in which the various capabilities of human vision apply. The TV viewing scenario deals with video at a distance of 3-4 PH and about 30 degrees of coverage. Opposite to it is the paper-reading scenario, with static pictures available for prolonged inspection and position optimally adjusted by eye movements and the motor system. There is a huge difference between these scenarios, and the best illustration is the requirement for glossy magazines to get the content absolutely perfect: up to 2400 dpi, as you easily see when comparing standard newsprint to a glossy magazine. For motion pictures, on the other hand, even 60 pixels/inch is already high; 40+ is realistic.

Thus, referring to hyperacuity in the context of the TV viewing scenario is nonsense, but it is very relevant for tablets and smartphones.

Quote:
Originally Posted by coolscan View Post

Aliasing in 1080p material comes mostly from the source, which is either the digital camera or a badly done 2K scan of film. Bad compression, or just bad work by whoever does the authoring (DVD/BD), which happens frequently, makes the aliasing in the original source worse.

But a clean 1080p source almost does not exist, except from 4K or higher scans.

You contradict yourself here: on one hand you rightly show that aliasing originates from bad work. Logically, then, the cure for aliasing is correct work, not 4K. You imply that 4K is a panacea, but if 2K is done badly, then 4K can be done badly equally well.
The claim that a clean 1080p source can come only from higher-resolution scans is not justified. It depends more on the design of the complete optical, electronic, and digital processing chain than on 4K per se. High-end professional equipment can provide clean 1080p; low-end 4K will cut so many corners that it yields dirty 1080p.

irkuck
irkuck is offline  
post #910 of 3692 Old 02-22-2012, 03:14 PM
AVS Special Member
 
coolscan's Avatar
 
Join Date: Jan 2006
Posts: 1,795
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 9 Post(s)
Liked: 102
Quote:
Originally Posted by irkuck View Post

You contradict yourself here: on one hand you rightly show that aliasing originates from bad work. Logically, then, the cure for aliasing is correct work, not 4K. You imply that 4K is a panacea, but if 2K is done badly, then 4K can be done badly equally well.

Of course 4K can be done badly, it's already done all the time.

The difference is that there is a whole lot of 4K workflow education being done, especially by RED, because RED cameras will only output RAW; people mess that up all the time, and that gives the camera a bad reputation.

Another part is the recently understood need to capture at higher resolution than the intended output.
Take RED as an example: as the first 4K motion CMOS camera maker they started with a 4K sensor, but soon found out they needed higher resolution to downsample in post, so they offered a sensor upgrade.

The best 2K camera, the Arri Alexa, has a 2880x1620 CMOS sensor, downsamples 1.5x in camera, and outputs 2K ProRes files. It is not very sharp and does not fully resolve fine detail, so it is prone to artifacts if you are not very careful in post.

RED cameras have been the only 4K cameras until now, but Sony has shifted from its previous 2K CCD technology and is releasing a 20MP CMOS camera that outputs 4K preprocessed in camera.
The Sony camera's 20MP sensor has double green photosites and is rotated 45 degrees, so the conventional CMOS pixel count is about 15-17MP, just above the RED Epic-X.

When it comes to 35mm film: it is scanned for a digital intermediate (DI) and edited digitally, then printed back to film.
Until recently this scan was done in 2K, and it is still done in 2K for those who don't have the budget for 4K scans.

But even those who can afford a 4K scan (which is best done at 6K and downsampled to 4K) just convert the 4K file to 2K ProRes if the movie is not intended for 4K release.
They then edit, grade, and color correct the 2K file.
Best would be to follow the 4K RAW-type workflow that has quite recently been developed and grade the finished edit on the full 4K scanned DI.

This is important to understand, because it explains how they arrive at these results.

If all movies shot on 35mm film had equivalent sources, then all of them should look equally artifact-free on DVD or BD.
But that is not the case.
Movies are frequently released with sub-par image quality compared to other movies shot on 35mm.

Now you should start to understand the reasons for this.

Read this Arri scan article, even though it is a few years old, where they explain the quality difference between a 2K scan and a 4K scan.
http://archiv.arri.de/news/newslette...al_systems.htm

Example:
The movie Gladiator was released on BD and nobody was very happy with the quality. Then in 2010 a new version was quietly released, without any promotion of the improved quality.
The only outward difference between the two releases was a -2 at the end of the SKU number.
This new release was from a new 4K scan, regraded and authored.
Here are two crop examples, downscaled for size, showing the difference.


Here are more "mouseover" comparisons in full size. Find the # above the image to see all the grabs.
http://screenshotcomparison.com/comparison.php?id=76352
http://screenshotcomparison.com/comparison.php?id=100
http://screenshotcomparison.com/comp.../100/picture:3

In some images the difference isn't remarkable, but looking at the level of resolved grain we can see how much better the new release is.

So the quality of the source is very important, and in HD the "sins" of mastering houses and studios are even more apparent than in SD/DVD.

Quote:


Claim that clean 1080 source can come only from higher scans is not justified. It is more depending on the design of complete optical, electronic and digital processing system than just mere 4K. High-end professional equipment can provide clean 1080, low-end 4K will cut so many corners to get dirty 1080.

We have of course always assumed that high-end equipment delivers clean 1080p, and to a certain degree, from the newest scanners, that is the case. (Arri scanners now scan at 3K for 2K delivery.)

But clean 1080p from digital cameras has not been possible in the past, due to sensor technology and the aforementioned practice of shooting 2K for 2K.

The best and most used digital camera for movies has been the Sony F35, a 35mm-equivalent-size RGB CCD camera, along with the Panavision Genesis, built on the F35.
The Sony F23 has also been used quite a lot; it is a 2/3" RGB CCD camera based on the popular Sony 900F HD broadcast camera.
These are the high-end cameras.
In addition we have all the other HD cameras, many of them with 4:3 aspect ratio sensors cropped for 16:9, all of which give less quality than the cameras mentioned above.

Here is a zone chart example from a Canon 7D DSLR with its "line-skipping" CMOS video, which in this case has to represent the "lower end" HD video cameras, alongside the F35 (CCD, 2K) and a RED One-MX (4.5K), which I believe is windowed down to 2K here.

Look at the difference in resolved detail in the 100% crops. The 7D, even at 1920x1080, is just a "porridge mess". The F35 shows in its finer details exactly the problems that end up as artifacts. The RED One is the only one that resolves cleanly.


To conclude: it is always possible to "mess up" in every part of the chain in 4K too, and it will be done.
The positive is that the new and best 4K cameras will shoot at higher than 4K resolution for better 4K results, and people are starting to learn a proper digital 4K workflow.

Another fact to understand: the subject in front of a 2K camera and a 4K (or higher resolution) camera is the same.
The 2K camera cannot resolve the details; they merge together into blocks or a "soup" simply because 2K is not enough resolution to represent them. And detail is a very important part of good image quality.

Sharpness is another example: 2K has to be sharpened electronically more than 4K to appear sharp, which produces artifacts and can look "oversharpened", while a 4K camera has a more natural sharpness. (ref. image in last link)

I just recently wrote two similar posts about this, mostly comparing RED cameras and Arri Alexa. The first one has another zone plate.
http://www.avsforum.com/avs-vb/showp...0&postcount=67

The second one is more about workflow differences and some interesting links in the end about "The Girl with the Dragon Tattoo", the first movie that had a 4K DI workflow from start to end.
http://www.avsforum.com/avs-vb/showp...8&postcount=85

And here is a framegrab from a Red Epic-X test; http://www.avsforum.com/avs-vb/showp...5&postcount=62
coolscan is offline  
post #911 of 3692 Old 02-22-2012, 04:43 PM
AVS Special Member
 
Chronoptimist's Avatar
 
Join Date: Sep 2009
Posts: 2,584
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 1 Post(s)
Liked: 227
Thank you for those examples coolscan. I've said that here a few times (having to shoot at least "double" the resolution of your target and downsample) but pictures make things far clearer.

The same thing applies to display resolution: you need at least double the resolution in each direction (essentially 4x the pixels) beyond what works out to be the "limit" of the human visual system to have a completely smooth image that doesn't look like an image on a screen.



And seeing as you posted those zone plates from the camera tests, here's a 1080p zone plate off the Spears & Munsil Blu-ray disc, displayed at 1:1
http://www.mediafire.com/?3aylyly1uscs535

And here it is upscaled to 4K via madVR:
http://www.mediafire.com/?8j9oakve743pisg

Unfortunately these two images are 20MB in total, which far exceeds what any free image host accepts, so you will have to download them.

Please note how there is considerably less aliasing when upscaled to 4K, and as much as I like madVR, I doubt it's as advanced as the scaling going into 4K devices.
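For anyone who would rather experiment than download the grabs, a zone-plate test pattern like the one discussed here is easy to approximate. Below is a plain-Python sketch (my own construction, not the actual Spears & Munsil pattern): the phase grows quadratically with radius, so the local spatial frequency rises linearly toward Nyquist, which is where aliasing shows up on a display or scaler.

```python
import math

def zone_plate(size, max_freq=0.5):
    """Square zone-plate pattern as a list of rows of values in [0, 1].

    Phase grows quadratically with radius, so local spatial frequency
    grows linearly, reaching max_freq cycles/pixel (Nyquist when 0.5)
    at one half-width from the centre.
    """
    half = size // 2
    plate = []
    for yy in range(-half, size - half):
        row = []
        for xx in range(-half, size - half):
            phase = math.pi * max_freq * (xx * xx + yy * yy) / half
            row.append(0.5 + 0.5 * math.cos(phase))  # bright at phase 0
        plate.append(row)
    return plate

plate = zone_plate(256)  # 256x256 pattern, brightest at the exact centre
```

Feeding such a pattern through different scalers (nearest-neighbour, bilinear, or something like madVR) makes their aliasing behaviour directly visible in the high-frequency rings.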
Chronoptimist is offline  
post #912 of 3692 Old 02-22-2012, 09:33 PM
AVS Special Member
 
DaViD Boulet's Avatar
 
Join Date: Aug 1999
Location: Washington DC area
Posts: 6,427
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 38
Quote:


Again, if the pixels are quite large and visible then the harshness becomes a liability and resampling helps a lot with that. But in tiny amounts which is typical in our viewing distance with 1080p, it can actually be beneficial. And getting rid of it will make the image look a bit softer subjectively.

T.I. marketing suggested the same thing when the advent of 1080p was threatening their 720p DLP market... they suggested that the pixel edginess of their 720p DLP PJs made their images "sharper" than 1080p displays that lacked obtrusive pixel noise and looked smoother.

I'd rather see a "softer" image lacking pixel noise and revealing the maximum natural detail possible than an artificially sharpened one (whether by EE or pixel-edge noise).

Regardless, I doubt that the differences will be dramatic with "normal" 1.5 screen-width distances (I agree with you in that regard). But in my opinion an ideal HT would provide pixel-free viewing as close as 1 screen width for "front seat" viewers. Naturally at 2 screen widths the differences would be imperceptible considering that at 2 screen widths even 1080p and 720p can start to look disappointingly similar.

I also agree with you that other issues that reduce image accuracy in fine detail (chromatic aberration, lack of panel alignment etc.) could easily negate the benefits of 4K. Hopefully 4K chips will motivate high-end PJ designs to eliminate such detail-robbing anomalies to the best degree possible.

1080p and lossless audio. EVERY BD should have them both.
DaViD Boulet is online now  
post #913 of 3692 Old 02-22-2012, 10:53 PM
AVS Special Member
 
irkuck's Avatar
 
Join Date: Dec 2001
Location: cyberspace
Posts: 3,502
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 35 Post(s)
Liked: 63
Quote:
Originally Posted by coolscan View Post

Of course 4K can be done badly, it's already done all the time.

The difference is that there is a whole lot of 4K workflow education being done, especially by RED, because RED cameras will only output RAW; people mess that up all the time, and that gives the camera a bad reputation.

Another part is the recently understood need to capture at higher resolution than the intended output.
Take RED as an example: as the first 4K motion CMOS camera maker they started with a 4K sensor, but soon found out they needed higher resolution to downsample in post, so they offered a sensor upgrade.

The best 2K camera, the Arri Alexa, has a 2880x1620 CMOS sensor, downsamples 1.5x in camera, and outputs 2K ProRes files. It is not very sharp and does not fully resolve fine detail, so it is prone to artifacts if you are not very careful in post.

RED cameras have been the only 4K cameras until now, but Sony has shifted from its previous 2K CCD technology and is releasing a 20MP CMOS camera that outputs 4K preprocessed in camera.
Now you should start to understand the reasons for this.

What you do not appreciate enough is that RED and others use single CMOS sensor chips, essentially the Bayer scheme, while traditional high-end video cameras are 3-chip RGB. REDs are cheaper, but do not forget they cut corners compared to 3-chip RGB, especially if we are talking about 4K perfection. So they have to go the 4K route, claiming that having 4K and downsampling to 2K is better. In reality, 3-chip RGB 2K will be at least as good, though the camera is more expensive. All this is because, in the end, the market is driven not by the highest PQ but by just-good-enough PQ at a lower price.

Quote:
Originally Posted by coolscan View Post

Look at the difference in resolved detail in the 100% crops. The 7D, even at 1920x1080, is just a "porridge mess". The F35 shows in its finer details exactly the problems that end up as artifacts. The RED One is the only one that resolves cleanly.

The F35 and RED pics would have to be carefully analyzed as to what kind of processing was applied. Notice there is aliasing in the F35 image only in the vertical direction, which raises the question of what parameters the zone generator used. The RED, on the other hand, is so clean and sharp that either 1. it captures above 2K resolution and the picture is scaled, or 2. some PQ enhancement is applied; nothing wrong with that, but it may show up as artifacts with other signal structures. I would say it is rather the F35 that looks like undoctored, full-resolution 2K.

As a sobering note, those pics illustrate the craziness of 4K well. Slight artefacts, visible only at the highest frequencies in uncompressed 4:4:4 pictures of theoretical content, are held up as proof that 4K is better. This is quite far from reality.

irkuck
irkuck is offline  
post #914 of 3692 Old 02-23-2012, 04:06 AM
AVS Special Member
 
Chronoptimist's Avatar
 
Join Date: Sep 2009
Posts: 2,584
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 1 Post(s)
Liked: 227
Quote:
Originally Posted by DaViD Boulet View Post

T.I. marketing suggested the same thing when the advent of 1080p was threatening their 720p DLP market... they suggested that the pixel-edginess of their 720p DLP PJs make their images "sharper" than 1080p displays that lacked obtrusive pixel noise and looked smoother.

I'd rather see a "softer" image but one lacking pixel noise and one revealing the maximum natural detail possible than an artificially sharpened image (whether by EE or pixel edge noise).

Aliasing, or "pixel noise" as you put it, can give the impression of a sharper image, but it is a less detailed one (specifically when feeding a 720p projector a 1080p source). What most people don't seem to understand is that increasing display resolution does not necessarily increase image sharpness, but rather image detail and overall image quality. (Computer-generated content such as a PC input or games will be sharper, though.)

Playing 4K content downsampled on a 1080p display will look sharper than an equal-quality 1080p encode of the same source material, for example.

At a certain point though, resolution becomes so low that you either have to filter the image too much, or simply have pixels that are too large to appear sharp any more.

Quote:
Originally Posted by DaViD Boulet View Post

I also agree with you that other issues that reduce image accuracy in fine detail (chromatic aberration, lack of panel alignment etc.) could easily negate the benefits of 4K. Hopefully 4K chips will motivate high-end PJ designs to eliminate such detail-robbing anomalies to the best degree possible.

This is something that people overlook. Even if you have panel misalignment on a 4K projector, it is always going to be sharper than a 1080p one. When displaying a 1080p source, you can do digital panel alignment in software to correct for any misalignment without any resolution loss. (as we see today on 1080p projectors that offer sub pixel alignment such as Sony's SXRD projectors)

This should get alignment errors with a 1080p source down to ¼ of a pixel or less, rather than ½.
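The ¼-pixel figure follows from simple arithmetic; here is a toy model (my own illustration, not any manufacturer's implementation) of digital alignment by whole panel-pixel shifts, where the residual error is bounded by half the correction step, so a 4K panel (step of 0.5 source pixels for a 1080p image) halves the worst case of a 1080p panel:

```python
def residual_error(misalignment, grid_step):
    """Residual alignment error after shifting by whole panel pixels.

    misalignment: optical panel misalignment, in source (1080p) pixels.
    grid_step: the digital correction step, in source pixels
               (1.0 on a 1080p panel, 0.5 on a 4K panel).
    """
    steps = round(misalignment / grid_step)  # best whole-pixel shift
    return abs(misalignment - steps * grid_step)

# Example: 0.7 source pixels of optical misalignment.
err_1080 = residual_error(0.7, 1.0)  # 1080p panel: 0.3 px left over
err_4k = residual_error(0.7, 0.5)    # 4K panel: 0.2 px left over
```

The worst case is half the step in each case, which matches the ½-pixel versus ¼-pixel bound above.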

And while chromatic aberrations will still exist, it should not be too challenging for modern optics to resolve 4K.

Even if they do not "fully" resolve 4K, they should easily outresolve any 1080p projector.
Quote:
Originally Posted by irkuck View Post

What you do not appreciate enough is that RED & others have single CMOS sensor chips, essentially Bayer scheme. While the traditional high-end video cameras are 3 RGB chips. REDs are cheaper but do not forget they cut corners comparing to the 3RGB especially if we are talking about 4K perfection. So they have to go through the 4K claiming that it is better since one has 4K and downing to 2K is better. But in reality 3RGB 2K will be at least the same though the camera is more expensive. All this due to the fact that in the end the market is driven not by the highest PQ but by just-good-enough PQ @lower price.

3-chip or not, it really doesn't matter: you need to oversample the image to maximise detail and avoid aliasing with digital imaging.

As for RED "cutting corners", I am not aware of anything else that shoots RAW Video. This is a significant image quality advantage over anything else, 3-chip or not, and it means that the quality of any footage shot will improve over time as newer software is released. (we have seen this with stills photography in recent years as RAW processing has improved and we return to images shot on older cameras)

And I've posted it before, but modern optics can greatly outresolve 4K.
http://www.fredmiranda.com/forum/top...r=2010#8243766

These results are from a relatively inexpensive (compared to cine lenses) consumer-grade Canon SLR lens on a 7D. (which is also a "corner-cutting" single-chip camera)
Chronoptimist is offline  
post #915 of 3692 Old 02-23-2012, 11:57 AM
AVS Special Member
 
irkuck's Avatar
 
Join Date: Dec 2001
Location: cyberspace
Posts: 3,502
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 35 Post(s)
Liked: 63
Quote:
Originally Posted by Chronoptimist View Post

Playing 4K content downsampled on a 1080p display, will look sharper than an equal quality 1080p encode of the same source material, for example.

Even if they do not "fully" resolve 4K, they should easily outresolve any 1080p projector.
3-chip or not, it really doesn't matter, you need to oversample the image to maximise detail and avoid aliasing with digital imaging.

You lack the basics of digital signal processing theory, and this is why you miss the point. What is a must is proper bandlimiting; you cannot get around it with mere oversampling. Downsampled 4K and original 2K will look the same IF both are properly bandlimited. If not, the downsampled 4K may even look worse.
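The bandlimiting point can be shown in a few lines. This is a deliberately crude sketch (my own example; a 2-tap box average stands in for a real anti-alias filter): a tone above the post-decimation Nyquist frequency aliases at nearly full amplitude when decimated naively, but is strongly attenuated when bandlimited first.

```python
import math

def decimate(signal, factor=2, prefilter=True):
    """factor:1 decimation, optionally bandlimiting first.

    The prefilter is a simple moving average over `factor` samples,
    a crude stand-in for a proper anti-alias (low-pass) filter.
    """
    if prefilter:
        signal = [sum(signal[i:i + factor]) / factor
                  for i in range(len(signal) - factor + 1)]
    return signal[::factor]

# A 0.4 cycles/sample tone: above the 0.25 cycles/sample that can
# survive 2:1 decimation without aliasing.
tone = [math.sin(2 * math.pi * 0.4 * i) for i in range(64)]
aliased = decimate(tone, prefilter=False)   # aliases at nearly full amplitude
filtered = decimate(tone, prefilter=True)   # same tone, strongly attenuated
```

With a proper (sharper) low-pass filter the out-of-band tone would be removed almost entirely, which is exactly the "properly bandlimited" condition under which downsampled 4K and native 2K converge.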

The same goes for 3-chip. When talking about such esoteric differences, the image pickup is essential. The difference in detail and sharpness is visible; it can be seen by comparing pics from Bayer sensors with the Foveon sensor (which has other problems, but whose detail and sharpness are unparalleled).

Quote:
Originally Posted by Chronoptimist View Post

As for RED "cutting corners", I am not aware of anything else that shoots RAW Video. This is a significant image quality advantage over anything else, 3-chip or not, and it means that the quality of any footage shot will improve over time as newer software is released. (we have seen this with stills photography in recent years as RAW processing has improved...

Now one can understand that your knowledge is based on advertising leaflets, and RED is very good at those. RED's raw is not raw; it is the result of processing from the Bayer sensor. Real raw can be obtained only from 3-chip cameras, and every such high-end camera has a 4:4:4 SDI output which can be recorded. That is video of pure, unprocessed, perfectly aligned RGB subpixels.

But again, considering esoteric details of raw 4K and 2K borders on absurdity, since in the real world material is heavily compressed. 4K heavily compressed and 2K compressed to the same bit rate will not be visually different in the TV scenario.

irkuck
irkuck is offline  
post #916 of 3692 Old 02-23-2012, 01:26 PM
AVS Addicted Member
 
walford's Avatar
 
Join Date: May 2003
Location: Orange County, CA
Posts: 16,789
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 12
The 300x195 or 300x199 JPEG images attached to some of the above posts cannot possibly display the actual resolution/detail of the screen images from which they were downscaled.
walford is offline  
post #917 of 3692 Old 02-23-2012, 01:38 PM
AVS Special Member
 
Chronoptimist's Avatar
 
Join Date: Sep 2009
Posts: 2,584
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 1 Post(s)
Liked: 227
Quote:
Originally Posted by irkuck View Post

You lack the basics of digital signal processing theory, and this is why you miss the point. What is a must is proper bandlimiting; you cannot get around it with mere oversampling. Downsampled 4K and original 2K will look the same IF they are properly bandlimited. If not, the downsampled 4K may even look worse.

Clearly you have no experience in this area, and are just parroting some nonsense you've read somewhere. (probably an internet forum)

Quote:
Originally Posted by irkuck View Post

Same with the 3-chip. When talking about such esoteric differences, the image pickup is essential. The difference in detail and sharpness is visible, as can be seen by comparing pics from Bayer sensors and the Foveon sensor (which has other problems, but its detail and sharpness are unparalleled).

When you are oversampling, using Bayer filtering really doesn't hurt the image much at all. It's only an issue when you are capturing at the resolution you intend to broadcast.

Foveon sensors are great in theory, but in reality are lacking in resolution and suffer from high levels of noise due to the layered sensor design. Colour accuracy is greatly reduced.

Quote:
Originally Posted by irkuck View Post

Now one can understand that your knowledge is based on advertising leaflets, and RED is very good at those. Red raw is not raw: it is the result of processing from the Bayer sensor. Real raw can be obtained only from 3-chip cameras, and every such high-end camera has a 4:4:4 SDI output which can be recorded. That is video of pure, unprocessed, perfectly aligned RGB subpixels.

But again, considering esoteric details of raw 4K and 2K borders on absurdity, since in the real world material is heavily compressed. 4K heavily compressed and 2K compressed to the same bit rate will not be visually different in the TV scenario.

I don't think you understand how RAW processing works. RAW captures the sensor data unedited, before it is debayered or flattened. As image processing improves, you can go back to your RAW data and get better image quality from the same file.

A 3-chip camera with a 4:4:4 SDI output gives you a "pre-baked" image with far less headroom when editing. What you captured is it; any editing you do to the image is destructive, and you have no room for highlight/shadow recovery.

A recent example in stills photography would be Adobe's 2010 rewrite of their Camera Raw engine, which reduced noise levels by at least a stop, made the appearance of noise finer, improved fine detail while reducing artefacts in the image, extracted more shadow & highlight detail, etc. There are more improvements due later this year with a 2012 update to the RAW engine.

If you had been shooting RAW and converting to 16-bit TIF, for example, as many photographers had been doing (essentially equivalent to your 4:4:4 recording), you could not have benefited from any of these improvements.

There is a dramatic difference in what you can do to the image in terms of editing a RAW file compared to a "baked" file too.
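As a toy illustration of this headroom argument (hypothetical numbers of my own, not any real camera pipeline): push deep shadows +3 stops starting from 12-bit linear "raw" data versus from a file already "baked" down to 8 bits.

```python
import numpy as np

rng = np.random.default_rng(0)
raw = rng.integers(0, 512, 100_000)          # deep-shadow region of a 12-bit range (0..4095)
baked = (raw / 4095 * 255).astype(np.uint8)  # flattened to 8 bits up front

push = 8  # 2**3, i.e. a +3-stop exposure push
from_raw = np.clip(raw * push / 4095 * 255, 0, 255).astype(np.uint8)
from_baked = np.clip(baked.astype(int) * push, 0, 255).astype(np.uint8)

# The push from raw retains far more distinct tonal levels (less banding):
print(len(np.unique(from_raw)), len(np.unique(from_baked)))
```

The baked file collapsed the shadows into a few dozen 8-bit levels before the push, so multiplying afterwards just spreads those few levels apart (banding), while the raw-side push still has the full 12-bit gradation to draw on.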

EDIT: Not my image, but an example of Lightroom 2, using the 2003 Camera Raw engine:

And Lightroom 3, using the 2010 Camera Raw engine:

These results are impossible to achieve with a "baked" image file. The source also contains two versions of the image that were processed as a pre-baked 16-bit TIF in third-party dedicated noise reduction programs.

Quote:
Originally Posted by walford View Post

The 300x195 or 300x199 JPEG images attached to some of the above posts cannot possibly display the actual resolution/detail of the screen images from which they were downscaled.

You seem to have missed this before but you can click on the image thumbnails to view the full resolution picture.
Chronoptimist is offline  
post #918 of 3692 Old 02-23-2012, 03:45 PM
AVS Special Member
 
coolscan's Avatar
 
Join Date: Jan 2006
Posts: 1,795
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 9 Post(s)
Liked: 102
Quote:
Originally Posted by irkuck View Post

What you do not appreciate enough is that RED & others have single CMOS sensor chips, essentially a Bayer scheme.

I do appreciate CMOS sensors very much, and so do all camera manufacturers except Sigma, which uses Foveon CMOS sensors.

It seems to me that you are desperately searching "far and deep" for any argument to avoid changing either your opinion or your understanding. Or maybe you just want to be contrary for "the heck of it".
Quote:


While the traditional high-end video cameras are 3 RGB chips. REDs are cheaper but do not forget they cut corners comparing to the 3RGB especially if we are talking about 4K perfection.

RED is cheaper, with newer and better technology, because they do not charge extortionate prices for their cameras.
You just throw out claims like "RED cuts corners" without any facts to back them up whatsoever.

These two cameras were released at about the same time and cost about the same, though the Alexa needs an external $30K recorder to record RAW.
Do you see the difference in technology advancement?



The Alexa is a camera that delivers good 2K images, no doubt, but nobody disagrees that it is unfortunately a little on the soft side.
Here is a crop of the middle of the zone chart I posted in the other thread.
Alexa on the left, Epic-X on the right.



Quote:


So they have to go with the 4K claim, saying it is better since one has 4K and downsampling to 2K is better. But in reality 3-RGB 2K will be at least the same, though the camera is more expensive. All this due to the fact that in the end the market is driven not by the highest PQ but by just-good-enough PQ @ lower price.

Some years ago the Sony Motion Camera division would have agreed with you very much.
At the same time, Sony was manufacturing CMOS sensors themselves, both for their own DSLRs and for Nikon DSLRs.
Nothing new that the various Sony "fiefdoms" don't talk to each other.

The Sony Motion Camera division has been "trash talking" CMOS for years, with good help from Panavision's John Galt, who was in charge of developing the Panavision Genesis based on the F35. He was also the one who allegedly talked Sony out of building the F35 with a 4K CCD sensor and has forever argued against 4K. You and John Galt would be very good friends, I'm sure.

What you seem to have missed in my post, now that you start this strange argument for CCD cameras, a technology everybody has left, is that the Sony Motion Camera division has also left CCD behind.

Their new F65 4K camera is all CMOS, so Sony no longer agrees with you at all.
Same with their latest Sony NEX-F100 S-35/APS-C CMOS prosumer video camera.

Quote:


F35 and Red pics would have to be carefully analyzed as to what kind of processing is applied. Notice there is aliasing in the F35 only in the vertical direction, which raises the question of what parameters the zone generator used. On the other hand, the Red is so clean and sharp that: 1. maybe it goes over 2K res and the pic is scaled, 2. some PQ enhancement is applied; nothing wrong with that, but it may show up as artefacts for other signal structures. I would say it is rather the F35 which looks like undoctored full 2K res.

You are grasping at straws here.
In a CCD camera the light is split by a prism before the 3 sensors.
Sony has always claimed that their F35 was striped RGB.

Here is a close-up of another zone chart for the F35, which shows that Sony's claim of "stripes" is far from the facts.

You think this is better than CMOS? You don't think this is prone to create artifacts?




Quote:


As a sobering note, those pics show well the craziness of 4K. Slight artefacts visible at the highest frequencies in 4:4:4 uncompressed pics of theoretical content are taken as proof that 4K is better. This is quite far from reality.

Whoa, you talk about 4K as if it were some newfangled extreme tech. 4K is about 8 Mpx. That is a resolution DSLRs passed many years ago.

The only real problems for much-higher-resolution CMOS motion cameras are sensor read speed (to avoid rolling shutter), being able to process and record 24 fps and higher frame rates to the capture medium (SSD/CF), and heat management.

Canon showed a 50 Mpx APS-C CMOS in 2007 and a 120 Mpx APS-H CMOS in 2010. The 120 Mpx sensor managed a read speed of only 9.7 frames per second at the time, which is quite good compared to many of the much-lower-resolution DSLRs.

So the higher-resolution sensor technology has already been available for years; they just haven't wanted to give it to us yet.
coolscan is offline  
post #919 of 3692 Old 02-23-2012, 05:36 PM
AVS Special Member
 
specuvestor's Avatar
 
Join Date: Jun 2010
Posts: 2,399
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 12
^^ Agree, but just to chime in on the last part: it's not that they are withholding 4K from us. It's more because of infrastructure, and because capturing stills vs. capturing motion are different things, though heuristically it just sounds like 24 stills making motion.

Think how long HD or 3G mobile took to go mainstream, from existence in the lab to mass market. Contrary to common belief, technology doesn't make big leaps; it takes small steps on the shoulders of existing tech. It only looks like a big leap when you look back, just as you see with China now. 4K will come in small steps too.
specuvestor is offline  
post #920 of 3692 Old 02-23-2012, 05:50 PM
AVS Special Member
 
coolscan's Avatar
 
Join Date: Jan 2006
Posts: 1,795
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 9 Post(s)
Liked: 102
Quote:
Originally Posted by Chronoptimist View Post

And seeing as you posted those zone plates from the camera tests, here's a 1080p zone plate off the Spears & Munsil Blu-ray disc, displayed at 1:1
http://www.mediafire.com/?3aylyly1uscs535

And here it is upscaled to 4K via madVR:
http://www.mediafire.com/?8j9oakve743pisg

Unfortunately these two images are 20MB in total, which far exceeds what any free image host accepts, so you will have to download them.

Please note how there is considerably less aliasing when upscaled to 4K, and as much as I like madVR, I doubt it's as advanced as the scaling going into 4K devices.

I appreciate those very much. Good to see how much up-conversion cleans up the image.

I wish there were more of these zone charts around. I would very much like zone charts from high-resolution DSLR stills, to compare them to those motion-camera zone charts.

Quote:
Originally Posted by Chronoptimist View Post

And I've posted it before, but modern optics can greatly outresolve 4K.
http://www.fredmiranda.com/forum/top...r=2010#8243766

These results are from a relatively inexpensive (compared to cine lenses) consumer-grade Canon SLR lens on a 7D (which is also a "corner-cutting" single-chip camera).

These "doubts" that current lenses cannot resolve 4K I have seen in other places too. They come from people who have never used, or even know much about, DSLRs. Mostly DPs, which often shows how little some people who do motion-camera work for a living know about other parts of the photography craft. They sometimes display a shocking lack of general knowledge.

The "motion capture world" has some very fine and very, very expensive lenses. $10K is reckoned a low price for a cine lens.
Red, as an example (again), has a range of very fine PL-mount cine lenses priced from $3K, with an average of $5K and the most expensive at $10K. They are supposedly very good, but because of the low price compared to other cine lenses they haven't been very successful, with people doubting the quality based on the price.

Nikon just announced a 36 Mpx camera, the D800. I guess they don't worry about their lenses being out-resolved yet.
Two different versions, in fact: the D800E has a weaker AA filter. It will be interesting to see how that works out; it might create sharper images.

Now, with the possibility to mount Canon EF, Leica-M and soon Nikon lenses on Red cameras, people have started to compare DSLR lenses to expensive cine lenses on their own cameras. And that might end in something of a backlash against cine lens quality/pricing.

One Director/DP and owner of a lot of very expensive Cine glass did an extensive test of Canon lenses on an Epic vs. some of his cine glass.
Here are some quotes:
Quote:


NOTE: I matched a $369 Canon 50mm 1.4 up against a $12,000 50mm Arri Master Prime today...
Quite literally could not tell the difference in RCX... Even at 800%
Makes me want to sell a lot of my PL glass when I see **** like this,

Quote:


Bang for buck, the Canon bright 85mm f1.2L primes are as sharp as Master Primes. $1,550 vs. $8,000... other than having teeth for my Preston and follow-focus units to hang off (ARRI), the Canon does exactly what the Master Prime does...

Quote:


The 24mm 1.4L is pretty much one of my most favourite wides. Im not big on lenses under this as I've never been a fan of wides or lens distortion.
This lens how ever never looked that sharp on my 5D (retired) but for some reason, even at f1.4 its sharp as!
Way sharper than the Ziess Super speed, sharper than Arri ultra prime and a master prime, plus you get lens capable of 7K image area meaning no dark corners like you get on the lens I just spoke of.

Just an aside apropos the notion that no DSLR lenses are as good as cine glass.

The test is here; http://www.reduser.net/forum/showthr...hell-out-of-it!
coolscan is offline  
post #921 of 3692 Old 02-23-2012, 05:52 PM
AVS Special Member
 
coolscan's Avatar
 
Join Date: Jan 2006
Posts: 1,795
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 9 Post(s)
Liked: 102
Quote:
Originally Posted by specuvestor View Post

^^ agree but just to chime in on last part: it's not that they are with-holding 4k from us. It's more because of infrastructure and that capturing still vs capturing motion are different though heuristically just sounds like 24 stills to make motion.

You misunderstood me; maybe I wasn't clear. I meant that they (Canon) are withholding the 50 Mpx and 120 Mpx sensors.
coolscan is offline  
post #922 of 3692 Old 02-23-2012, 06:00 PM
AVS Special Member
 
Richard Paul's Avatar
 
Join Date: Sep 2004
Posts: 6,959
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 3 Post(s)
Liked: 29
Quote:
Originally Posted by irkuck View Post

This article has been referenced here numerous times since it is a golden proof against 4K for the standard TV viewing scenario. You may wish to pay attention not to the side notes but to its essence, illustrated in the figures: it shows that 4K does not make sense except below 3PH.

Just curious but what exactly is the "standard TV viewing scenario"? Also where are you getting this information from?


Quote:
Originally Posted by irkuck View Post

TV viewing scenario deals with video at a distance of 3-4PH and 30 deg coverage.

Those figures look like they are for HDTV. For example in the document I posted Sony gave a minimum viewing distance of 3.16 PH for HDTV.


Quote:
Originally Posted by irkuck View Post

Thus, referring to hyperacuity in the context of the TV viewing scenario is nonsense, but it is very relevant for tablets and smartphones.

Well, in my opinion, if hyperacuity is a factor with mobile devices (such as smartphones) then it is also a factor with TVs.
Richard Paul is offline  
post #923 of 3692 Old 02-23-2012, 06:41 PM
AVS Special Member
 
specuvestor's Avatar
 
Join Date: Jun 2010
Posts: 2,399
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 12
^^ I'm also of the opinion that projectors and direct-view fixed-pixel displays have different acuity characteristics.

@coolscan roger that miscommunication
specuvestor is offline  
post #924 of 3692 Old 02-23-2012, 11:22 PM
AVS Special Member
 
irkuck's Avatar
 
Join Date: Dec 2001
Location: cyberspace
Posts: 3,502
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 35 Post(s)
Liked: 63
Quote:
Originally Posted by coolscan View Post

I do appreciate CMOS sensors very much and so does all camera manufacturers except Sigma as they use Foveon CMOS sensors.

It seems to me that you desperately seeking "far and deep" for any argument to prevent to change both opinion an understanding. Or maybe you just want to be contrary for "the heck of it".

I am not sure we are on the same frequency. The point is not CMOS vs. non-CMOS. The point is how the RGB subpixels are picked up. In both 3-RGB and Foveon, the subpixels are aligned so that they are taken from precisely the same point.
In a single-chip CMOS this is not the case, as in Bayer. I hope you understand that Bayer megapixels are not the same as 3-RGB megapixels? This has an impact when one is talking about the minute aspects of 4K and 2K sensors and resolution.
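The point about Bayer megapixels can be shown with a tiny sketch (my own illustration, hypothetical 4x6 mosaic): in an RGGB Bayer colour-filter array each photosite records only one channel, so N photosites are not N full-RGB pixels.

```python
import numpy as np

# Build a toy RGGB Bayer mosaic: even rows alternate R/G, odd rows G/B.
h, w = 4, 6
cfa = np.empty((h, w), dtype="<U1")
cfa[0::2, 0::2] = "R"
cfa[0::2, 1::2] = "G"
cfa[1::2, 0::2] = "G"
cfa[1::2, 1::2] = "B"

# Fraction of photosites that actually sample each channel:
for ch in "RGB":
    print(ch, (cfa == ch).mean())  # R 0.25, G 0.5, B 0.25
```

Half the sites sample green and only a quarter each sample red and blue; the missing two channels at every site must be interpolated (demosaiced), which is why a Bayer "megapixel" carries less per-channel information than a 3-chip or Foveon pixel sampled at one point.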

Quote:
Originally Posted by coolscan View Post

RED is cheaper with newer better technology because they do not take extortion prices for their cameras.
You just throw out some claims like that "RED cuts corners" without any facts to back that up whatsoever.These two cameras was released about the same time. Cost about the same. Though the Alexa needs a external $30K recorder to record RAW. Do you see the difference in technology advancements?

Technically Red is cheaper since a single CMOS is cheaper than 3-CCD. This is the main corner which Red has cut. Since CMOS progressed at a much faster pace (due to high-end compact cameras) and its res can be very high, one can obviously say that Red is now good enough.

Quote:
Originally Posted by coolscan View Post

What you seems to have missed in my post when you now start this strange argument for CCD cameras, a technology everybody has left, is that Sony Motion Camera division also have left CCD behind.

The point is NOT about CCD vs. CMOS! The point is about 3-chip vs. single-chip. Let's say somebody makes a 4K camera based on 3 CMOS chips, one per RGB channel, thus producing pure 4:4:4 RGB with perfectly aligned subpixels and without any digital manipulation. Such a camera would blow away any Red. A Red trying to match it would need 20 Bayer megapixels, and still there would be visible res differences in close-ups. The situation is similar to comparisons between Bayer and Foveon (though Foveon has other problems).

Now, why is nobody responding to the Red challenge and making a 3-CMOS 4K film camera? Because it would be more expensive, and a single CMOS is just good enough. So the 3-chip is disappearing and the good-enough single chip rules.


Quote:
Originally Posted by Richard Paul View Post

Just curious but what exactly is the "standard TV viewing scenario"? Also where are you getting this information from?Those figures look like they are for HDTV. For example in the document I posted Sony gave a minimum viewing distance of 3.16 PH for HDTV.

The standard TV scenario is a model used in the development of HDTV. The assumption is 30-degree coverage of the visual field at a typical living-room viewing distance on the order of 10 feet, derived from observing and testing how comfortable people feel. This all translates to 3-4PH. The model was used, together with data on human visual resolution, e.g. by the ATSC to arrive at the necessary and sufficient resolution of 1000 vertical lines. The 3.16PH you refer to is given in the Sony paper as the border distance: above it one cannot see 4K, and below it the 4K information becomes visible. Obviously the 0.16 is theoretical, so practically one can say that 4K starts becoming justified below 3PH. This is outside the standard TV viewing scenario.
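The geometry behind the 3.16PH figure can be checked in a few lines (my own arithmetic, using the common ~60 pixels-per-degree / 1-arcminute acuity benchmark, which is an assumption rather than anything from the Sony paper):

```python
import math

def pixels_per_degree(lines, distance_ph):
    """Vertical pixel density in pixels per degree of visual angle for a
    display of `lines` rows viewed at `distance_ph` picture heights."""
    angle = 2 * math.degrees(math.atan(0.5 / distance_ph))  # vertical field of view
    return lines / angle

print(round(pixels_per_degree(1080, 3.16), 1))  # ~60.1: 1080p saturates ~1-arcmin acuity here
print(round(pixels_per_degree(2160, 3.16), 1))  # ~120.1: the extra 4K detail is invisible
print(round(pixels_per_degree(2160, 1.5), 1))   # ~58.6: 4K only pays off around 1.5 PH
```

At 3.16 picture heights a 1080-line display already delivers about one pixel per arcminute, which is why that distance falls out as the border between "4K invisible" and "4K visible".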

Quote:
Originally Posted by coolscan View Post

Well, in my opinion, if hyperacuity is a factor with mobile devices (such as smartphones) then it is also a factor with TVs.

Well, you do not notice that the two scenarios are totally different. If you collect more information about the visual system this will be obvious. Hyperacuity really is not relevant for moving pics. A mobile device is akin to reading information on paper, on which there is a ton of research. In paper reading there is static information which is inspected locally and sequentially, for longer, by the center of the retina, with subtle cooperation from the motoric system, which adjusts hand, head and body position for the best acquisition. Surely you see that TV watching is not like paper reading?

irkuck
irkuck is offline  
post #925 of 3692 Old 02-24-2012, 08:52 AM
AVS Special Member
 
coolscan's Avatar
 
Join Date: Jan 2006
Posts: 1,795
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 9 Post(s)
Liked: 102
Quote:
Originally Posted by irkuck View Post

I am not sure we are on the same frequency. The point is not CMOS vs. non-CMOS. The point is how the RGB subpixels are picked up. In both 3-RGB and Foveon, the subpixels are aligned so that they are taken from precisely the same point.
In a single-chip CMOS this is not the case, as in Bayer. I hope you understand that Bayer megapixels are not the same as 3-RGB megapixels? This has an impact when one is talking about the minute aspects of 4K and 2K sensors and resolution.

You just pull claims out of the "top of your hat" and make it sound like CCD is a technology superior to CMOS.

Nobody agrees with you, including 99% of all the camera manufacturers. They have all left CCD behind. Everybody tested Foveon; nobody found any advantage.
The majority of investment in sensor R&D is in CMOS, from high-end large sensors to micro sensors for smartphones.

Try to come up with some facts to back up your claims!

This is the market situation for CCD vs. CMOS now:

Quote:


iSuppli: CCD Market Share Dropped to 8% in 2011
IHS iSupply Image Sensor Market Q1 2012 report states that CMOS sensors took 73% of the revenue and 92% of the units shipped in 2011. "CMOS technology is expected to climb even higher in future years, to 90 percent of revenues for sensors and 97 percent of shipments.

CMOS sensors already dominate cameras in notebook PCs cameras with 100% penetration as a result of their lower power and cost advantages.

While CCD manufacturers are attempting to compete with price reductions due to yield improvements, we believe this will not help in the long run as CMOS continues to improve with offering technologies such as back side illumination (BSI) improving low light conditions.

Even in one of CCD's largest segments - digital still cameras - the technology is waning, projected to drop to 25 percent of CCD revenues and 27 percent of all CCD units by 2015."

Quote:
Originally Posted by irkuck View Post

Technically Red is cheaper since a single CMOS is cheaper than 3-CCD. This is the main corner which Red has cut. Since CMOS progressed at a much faster pace (due to high-end compact cameras) and its res can be very high, one can obviously say that Red is now good enough.

Why single out RED? Everybody is using CMOS. CCD has been left behind in the dust.

Quote:


The point is NOT about CCD vs. CMOS! The point is about 3-chip vs. single-chip. Let's say somebody makes a 4K camera based on 3 CMOS chips, one per RGB channel, thus producing pure 4:4:4 RGB with perfectly aligned subpixels and without any digital manipulation. Such a camera would blow away any Red. A Red trying to match it would need 20 Bayer megapixels, and still there would be visible res differences in close-ups. The situation is similar to comparisons between Bayer and Foveon (though Foveon has other problems).

Now, why is nobody responding to the Red challenge and making a 3-CMOS 4K film camera? Because it would be more expensive, and a single CMOS is just good enough. So the 3-chip is disappearing and the good-enough single chip rules.

They are responding to RED, but slowly, and when the competition has a new product, RED introduces a newly upgraded sensor for its existing camera. None of the competition does this; the competition will force you to buy a whole new camera.
And all of the competitors use CMOS sensors.

Sony introduced backside-illuminated CMOS; Fuji did too, with a random pixel array and 45-degree-rotated pixels. Fuji again recently released a camera with an organic CMOS which doesn't need an IR filter or optical low-pass filter; the CMOS tech is called X-Trans CMOS.
This enables the sensor to resolve finer, sharper details because the details aren't blurred by filters.
After some refinement of the tech, this is very promising.

You won't like this: NHK just announced an 8K 120fps CMOS. As NHK doesn't manufacture sensors, somebody must make them for them: http://techon.nikkeibp.co.jp/english...120224/205923/

And Sony's R&D team is talking about their desire to develop 8K and 16K projectors; Leading the way towards ever higher resolution

As you have not managed to back up any of your arguments and claims with any substance whatsoever, you have put yourself outside of this discussion.

I'm not going to feed you more facts, do some research yourself.

Quote:
Originally Posted by irkuck View Post

Quote:


Originally Posted by coolscan
Well in my opinion if hyperacuity is a factor with mobile devices (such as smartphones) than it is also a factor with TVs.

This quote was not written by me!
coolscan is offline  
post #926 of 3692 Old 02-24-2012, 11:31 AM
AVS Addicted Member
 
rogo's Avatar
 
Join Date: Dec 1999
Location: Sequoia, CA
Posts: 30,255
Mentioned: 1 Post(s)
Tagged: 0 Thread(s)
Quoted: 200 Post(s)
Liked: 545
I don't want to wade too deeply into this, but it's probably more fair to characterize things as follows:

When CMOS first started to displace CCD, it was cheaper, but not better. Then, it became equivalent and often better. CCD remained more expensive and wasn't developing at "semiconductor speed" while CMOS was. The reasons for using CCD grew less and less relevant, which is where we are today.

As for Foveon, everyone tested it and few found it overly compelling. But, again, the big problem was cost vs. performance. If dozens of cameras had used Foveon a bunch of years ago, the costs would have fallen and newer, better generations at low prices would've become real more quickly. Conventional image sensors continued to develop at high rates of speed while Foveon's "breakthrough" looked less impressive as time passed.

There is no difference in HDMI cables. If you can see the picture without visible dropouts or sparklies, the cable is working at 100%. No other cable will display a better version of that picture. You're simply wrong if you think there is a better digital cable than one that is already working.
rogo is offline  
post #927 of 3692 Old 02-25-2012, 07:41 AM
Senior Member
 
Russell Burrows's Avatar
 
Join Date: Oct 2007
Location: no mans land aka mexican dmz
Posts: 422
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 10
A visit to my eye doctor gave me the detail that our eyes can do 30 megapixels; that's around 16K??

So I guess that we do need 4K displays, and later 8K and then 16K displays, that then being the limit?? of what most human eyes can see??

Any eye doctors here who can chime in on this??

DIY beats store purchased.
Russell Burrows is offline  
post #928 of 3692 Old 02-25-2012, 12:54 PM
AVS Special Member
 
Joe Bloggs's Avatar
 
Join Date: Apr 2007
Location: UK
Posts: 2,556
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 130 Post(s)
Liked: 56
Quote:
Originally Posted by Russell Burrows View Post

A visit to my eye doctor gave me the detail that our eyes can do 30 megapixels thats around 16K??

So I guess that we do need 4K displays and later 8K and then 16K displays that then being the limit?? of what most human eyes can see??

Any eye doctors here that can chime in on this??

Isn't it around the resolution of Super Hi-Vision? i.e. about 8K (not 16K)?
SHV (slightly less than 8K) = 7680x4320 pixels = 33,177,600 total pixels (per frame) = about 33.18 megapixels.

Also, what our eyes can see is also determined by how far away they are from the screen.
And it would depend on the aspect ratio of the content. Studios call "8K" the number of pixels of width, no matter what the aspect ratio of the film is (2.40:1 at 8K would have fewer active pixels than 1.78:1 at 8K). Also, 1.78:1 is probably closer to our eyes' field of view than 2.40:1 (I think it's actually more like 1.6:1).
Joe Bloggs is offline  
post #929 of 3692 Old 02-25-2012, 01:19 PM
Senior Member
 
Russell Burrows's Avatar
 
Join Date: Oct 2007
Location: no mans land aka mexican dmz
Posts: 422
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 10
Yeah, it's around 8K for most folks' eyesight.
But then I also wonder if laser-corrected eyes with 20/10 vision will have me looking at 16K, or is 8K the real limit??

I am just wow! at the idea of sooner or later having movies in 2160p for use on Blu-ray at home.

DIY beats store purchased.
Russell Burrows is offline  
post #930 of 3692 Old 02-25-2012, 01:29 PM
AVS Special Member
 
saprano's Avatar
 
Join Date: Oct 2007
Location: Bronx NY
Posts: 3,396
Mentioned: 3 Post(s)
Tagged: 0 Thread(s)
Quoted: 99 Post(s)
Liked: 249
^^^^Please don't expect miracles. You're just going to end up disappointed.

home theater addict
saprano is offline  