Status
Not open for further replies.
1 - 20 of 33 Posts

· Registered · 6,092 Posts · Discussion Starter · #1
Mark,


QXGA should be 2048 wide, and I think Tom has a 1080p feed, correct? It doesn't sound like any dance moves are necessary.


Tom,


Is the 1100:1 going to be possible on any of your commercial gear?


Anyway, I am glad to hear about the success. Other than being easier to calibrate, what advantage does a 3-chip DLP have?


-Mr. Wigggles




------------------

The Mothership is now boarding.
 

· Registered · 855 Posts
MrWigggles -


I read a lot about DLP Cinema on the TI site over the last year or so, and from what I remember:

DLP Cinema projectors use 14 bits of color depth.

They ARE (not "will be") available from multiple vendors today: Barco, Christie, and another (DP?).


Tom -


Why only 10 bits of color depth? How do the blacks look? How many ft-lamberts are delivered (at the screen)? What size lamp is used? Does it use a standard film projector lamp housing?


Bummer! It is too bad someone hasn't combined a 14-bit color gamut with REALLY high resolution. I thought JVC would. Looks like each company (JVC and TI) got half of it right.


------------------


Huck
 

· Registered · 6,092 Posts · Discussion Starter · #4
Quote:
Originally posted by Huckster:
MrWigggles -


I read a lot about DLP Cinema on the TI site over the last year or so, and from what I remember:

DLP Cinema projectors use 14 bits of color depth.

They ARE (not "will be") available from multiple vendors today: Barco, Christie, and another (DP?).


Tom -


Why only 10 bits of color depth? How do the blacks look? How many ft-lamberts are delivered (at the screen)? What size lamp is used? Does it use a standard film projector lamp housing?


Bummer! It is too bad someone hasn't combined a 14-bit color gamut with REALLY high resolution. I thought JVC would. Looks like each company (JVC and TI) got half of it right.

Larry,


If you are reading, I blame this type of "misunderstanding" on you. :)


Huckster,


I would love to argue that 8-bit color depth is plenty (especially for pure colors), but my god, 10-bit is at least 1 bit more than enough. I see no benefit at all to a 14-bit "color depth" as the term is commonly understood today (i.e., non-uniform steps with a gamma between 2.2 and 2.5).


-Mr. Wigggles


------------------

The Mothership is now boarding.
 

· Registered · 774 Posts
Well,


Yesterday the moment of truth came... I finally got to see a film transfer projected using our QXGA chip-based projector!


Full resolution, 1920x1080, projected on a 25' (+/-) wide Stewart Microperf screen... contrast ratio better than 1100:1. Sources were Sony HDCAM and HDD-1000. While the projector is an engineering sample and has only 8-bit/channel color at the moment (production will have 10-bit), it was simply stunning! Anyone who says that 1280x1024 resolution is enough is simply wrong! Yes, you can put up a nice image... but the depth and detail of the higher-resolution image is breathtaking... and rock solid. It is like a virgin saying sex is nothing special...


I know there will be plenty of questions but I'm probably not going to be here much over the next few days because of the show...I'll check in as I can.


Regards,




------------------

Tom Stites

Director, Business Development

Digital Systems Division

JVC Professional Products

"My opinions do not necessarily reflect..."
 

· Registered · 4,525 Posts
Awesome, Tom! We can't wait! Finally a true HD resolution system.


What are the long-term plans for the QXGA chip? Will we see it in a product that we could conceivably put in a home within the next year or two?


Also, does this projector do the fancy DILA/HDTV progressive/interlaced dance with the 1080i feed as discussed here? Does it automatically upconvert the material to 1080p?


Finally, will we see this in Digital Cinema any time soon? (I would love to see this at a theater.) Is this what Sony will be using as their Digital Cinema offering?


------------------

My home theater: ht.markhunter.com
 

· AVS Forum Special Member · 11,140 Posts
Quote:
Originally posted by tstites:
Full resolution, 1920x1080 projected on a 25'(+-) wide Stewart Microperf screen...contrast ratio better than 1100:1. Sources were Sony HDCAM and HDD-1000.
Also delighted to see this advance. In light of this column and similar discussions regarding resolution, it appears from the HDTV source, Sony's HDCAM, that the horizontal res was limited to 1440. (Thought it was the tape system that restricted resolution.) So even if the chip has 2000-plus pixels, and even if it is designed for a hybrid interlace/progressive display, it seems there's nothing to display beyond the 1440 Sony cutoff.


Wonder if it's necessary to use 24p capture with the Sony 2000-plus (horizontal res) camera, along with uncompressed storage, to show this display system to full advantage? And even with that, it would be interesting to hear how much of the resolution, if any, is being filtered before reaching the projection screen. -- John


------------------

STOP DVI/HDCP AND DFAST




[This message has been edited by John Mason (edited 03-06-2001).]
 

· Registered · 855 Posts
Mr. Wigggles,


You believe that 8 bits of color depth is enough for everything? What a heretic. :) Next you'll be saying SACDs sound the same as regular CDs. :)


The current crop of movies being shown on DLP Cinema screens is stored as 10-bit 4:2:2. Using 10 bits in the output path would leave no space for error terms. Using a 14-bit output path gives gamma correction and color space conversion some over/underflow headroom, and should allow smoother color gradients and better black-level detail.
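The headroom argument above can be sketched numerically (my own illustration with an assumed gamma of 2.2, not TI's documented pipeline): run every 10-bit code through a gamma decode/re-encode round trip, quantizing the intermediate value to the working bit depth, and compare how many codes survive unchanged with a 10-bit versus a 14-bit intermediate.

```python
# Hypothetical sketch of processing headroom: a gamma round trip on 10-bit
# codes, with the intermediate result rounded to the working bit depth.
# Gamma 2.2 is an assumption, not a documented DLP Cinema value.

def round_trip(code, work_bits, gamma=2.2, in_bits=10):
    in_max = (1 << in_bits) - 1
    work_max = (1 << work_bits) - 1
    # decode to (quantized) linear light, then re-encode back to 10 bits
    linear = round((code / in_max) ** gamma * work_max)
    return round((linear / work_max) ** (1 / gamma) * in_max)

survive_10 = sum(round_trip(c, 10) == c for c in range(1024))
survive_14 = sum(round_trip(c, 14) == c for c in range(1024))
print(survive_10, survive_14)  # the 14-bit intermediate preserves far more codes
```

The codes that fail to round-trip in the 10-bit path are exactly the collapsed shadow levels Huckster is describing; the wider working word keeps them distinct.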


Sorry, but I don't have a misunderstanding. ;)


You asked for advantages of DLP Cinema; 14-bit color depth is one answer.


------------------


Huck
 

· Registered · 731 Posts
Guys-

I work on movies, and most matte paintings, created in Photoshop, are done in an 8-bit color space. They look fine, don't they? Adobe has no plans to increase the bit depth of the program because 8-bit color overlays provide MILLIONS of colors, more than enough for professional photography. Look through photography annuals and you'll find that thousands of the best photographers in the world are using 8-bit Photoshop to finish their work. Banding and other display anomalies have more to do with gamma calibration and display techniques than with the bit depth, so fix the other problems before you start screaming about bits.


------------------

JVC - MPAA - Are you listening?

DFAST MUST DIE!
 

· Registered · 6,092 Posts · Discussion Starter · #13
Huckster,


As one of the first reviewers of SACD said when supplied with the same source material from Sony in both the CD and SACD format: "The two are recorded so well it is hard to tell the difference."


Technology is one thing; implementation is another. If the recording engineer takes the extra time to make SACD sound better, it will. If someone takes the extra time to master the video transfer and it is shown on a $100K+ projector, it will look better.


14-bit processing is overkill in the limited contrast/light output world we live in. If movie projectors could generate 100 ft-L images down to zero ft-L (without causing the audience to blink too much), then a greater-than-8-bit color space would be necessary.


I apologize to anyone who has seen this before, but here it goes again. Here are 126,126,126 letters on a 127,127,127 background. Can you make out the difference on even a good CRT?


" www.geocities.com/mrwigggles/testletters.bmp "


There are limitations on digital projectors, but color depth isn't one of them. It is resolution, color accuracy, contrast ratio, and light output, to name the most important.


I don't like walking on someone else's thread, but it appears the proof is in the pudding. It appears as if the JVC, with its measly 8 bits of color depth, is matching or surpassing what TI has to offer.


TI would love for people to think that 14 bits is the way to go. It just gives them more time to get their resolution act together.


-Mr. Wigggles


------------------

The Mothership is now boarding.


[This message has been edited by MrWigggles (edited 03-06-2001).]
 

· Registered · 855 Posts
Mr. Wigggles,

Quote:
The two are recorded so well it is hard to tell the difference.
But a difference was there and could be repeatably found, nonetheless.

Quote:
Technology is one thing implementation is another. If the recording engineer takes the extra time to make SACD sound better it will. If someone takes the extra time to master the video transfer and it is shown on a $100K+ projector it will look better.
I agree. Much like the difference between a new anamorphic transfer for a DVD vs. a reused laserdisc transfer. A sign reading "It's the transfer, stupid" should hang above every telecine station in the world.

Quote:
I don't like walking on someone else's thread, but it appears the proof is in the pudding. It appears as if the JVC, with its measly 8 bits of color depth, is matching or surpassing what TI has to offer.


TI would love for people to think that 14 bits is the way to go. It just gives them more time to get their resolution act together.
It sounds like you have seen JVC's projector and have declared a winner. Where did you see it? :)


The proof will be in the pudding. Until I actually see something besides specs, it will be hard to tell whether a 10-bit data path is enough to process 10 bits of data without over/underflow during color calibration, color space conversion, and gamma processing. I will hold off my judgment until I see it. In the meantime, I disagree with your assessment. :)


BarkingArt,


Interesting. Another "8 is enough". ;)


Not to put you on the spot, but if 8 bits are enough, why are digital films distributed as 10 bits per component, 4:2:2, YCrCb? Lowering the bit depth would certainly reduce reproduction, processing, and distribution costs, so what is the reason?


Are feature films finished in an 8-bpp system?

Quote:
Banding and other display anomalies have more to do with gamma calibration and display techniques than with the bit depth, so fix the other problems before you start screaming about bits.
I'm happy to see you agree with me! ;)


How do you "fix" the gamma and displayed output? By using video processing. To do video processing on a 10-bit data word (the standard input from a digital film), you cannot use a 10-bit processing path without introducing other artifacts. This is the crux of my disagreement.


------------------


Huck
 

· Registered · 4,720 Posts
Tom has spoken about this before on the forum. As far as saving up for one of these QXGA projectors - see:

http://www.avsforum.com/ubb/Forum10/HTML/004124.html


where Tom states:


"Our QXGA based product is actually a little ahead of schedule but,

don't get all excited about it though, it's 7000 lumens and priced around $200K"


When Tom is talking about Digital Cinema - I believe he means "Cinema" as in "Movie Theater".


Greg
 

· Registered · 891 Posts
Quote:
Simple question: What are the letters?
The thing is, that's not a good test of the potential problem. We should really take a nice smooth color gradient, put it through some gamma correction and color correction, and look at the smoothness of the resulting gradient.


Did the least-significant bits that got truncated during processing make a discernible difference? Or were those LSBs so far below the perception/reproduction threshold that you can't tell? Until we actually have the projectors to test on, no one knows for sure.


Your 126-vs-127 letter test is good for illustrating that even if the error introduced is equivalent to a 1/256 error, it's potentially imperceptible. But is that true for all 1/256 errors? And depending on the processing done, the error could potentially be higher than that, so is it true for 1/128 or 1/64 errors?
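The gradient test Chris proposes can be mocked up in a few lines (gamma 2.2 and an 8-bit path are my assumptions): gamma-encode a smooth 0-255 ramp, decode it back, round to 8 bits at each stage, and count how many of the 256 levels survive. The missing levels are what show up on screen as bands.

```python
# Sketch of the gradient test: gamma-encode then gamma-decode a 0..255 ramp,
# rounding to 8 bits after each stage, and count the surviving distinct levels.
GAMMA = 2.2  # assumed display gamma

encoded = [round((v / 255) ** (1 / GAMMA) * 255) for v in range(256)]
decoded = [round((e / 255) ** GAMMA * 255) for e in encoded]

levels = len(set(decoded))
print(levels)  # well under 256: each lost level widens a band in the gradient
```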


[This message has been edited by Chris Carollo (edited 03-07-2001).]
 

· Registered · 731 Posts
Huck-


Good. Now for the rest of the story. Up until about 2 years ago, any digital processing (visual effects composites, mostly) was done in 8-bit color space. Up until about 4 years ago, 2k resolution was standard for output to the film recorders. When the idea of a higher rez was seriously considered (made possible by cheaper data storage, affordable higher-rez film recorders, and finer-grain film stocks developed for, among other things, digital film recording), the standard became 3k and soon after, 4k. The increased output resolution gave rise to development of higher bit-depth renders, currently 16 bit using Maya, Shake and Inferno software. Still, most matte paintings are done in 8-bit color due to the limitation of Photoshop, and are incorporated into 16-bit composites. Some elements are painted in Matador, which renders in 16-bit color, but it's not very user friendly and hasn't been updated in years.

So, we're talking about resolutions that are FAR beyond what current home theater projectors can handle, with these decisions made for archival, original negative output. 4k is fine for 70mm, and we go up to 6k or 8k for Imax and Omnimax.


Now let's get down to it. :) For our purposes, namely scaling DVD movies to the hardware resolution of our projectors, is an 8-bit video card enough? Will a higher color bit depth make a difference? I risk contradicting myself by exclaiming YES YES YES YES YES!!!!!!


8 bits are MORE than enough, IF they are the RIGHT BITS. What this means is that after the image is digitally mastered, if it is kept in its original form (no upscaling, rotation, skewing, etc.), 8 bits will provide more than enough color information. However, if the pixels are altered in any way, even by a .001 differentiation, EVERY pixel needs to be recalculated and will suffer degradation. Banding occurs when pixels are stretched (scaling), and it is there that 8-bit depth becomes deficient. If I have a low-rez image and I want to alter it, I will usually scale it up to a much larger size than my target size, do my alteration, then scale it down to the final size. This will almost always give me a more refined result than if I processed the image at the same rez. More info, better scaling. The scaling is the key. The higher the color bit depth, the greater the palette to work with when adding information that wasn't there before. If your projector's hardware resolution were exactly the same pixel height and width as the MPEG-2 image file on the DVD, 8 bits would be optimal.
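The banding-from-scaling point is easy to see in one dimension (a toy example of mine, not anything from the thread): stretch a 0-255 ramp to 400 samples with linear interpolation, round back to 8 bits, and count the flat spots where neighbouring samples land on the same code value. A smooth source becomes a staircase purely because the output grid has more samples than the 8-bit palette has codes for them.

```python
# Toy 1-D resample: stretch a smooth 8-bit ramp and round back to 8 bits.
# Flat spots (equal neighbouring samples) are the seeds of visible bands.
def resample(samples, new_len):
    out = []
    for i in range(new_len):
        pos = i * (len(samples) - 1) / (new_len - 1)  # position in the source
        lo = int(pos)
        hi = min(lo + 1, len(samples) - 1)
        frac = pos - lo
        out.append(round(samples[lo] * (1 - frac) + samples[hi] * frac))
    return out

stretched = resample(list(range(256)), 400)
flat = sum(a == b for a, b in zip(stretched, stretched[1:]))
print(flat)  # 144 flat pairs: 400 samples can only span 256 distinct codes
```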


Add on top of this that digital color space is linear color space. Using non-floating-point calculations, this puts middle gray at about 18 on a scale from 1-100. You can just imagine what linear scaling will do to a linear grayscale.
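That "middle gray at about 18" figure checks out if you read the scale as linear light (the gamma-2.2 encoding on the other side is my assumption): 18% reflectance sits at 18 on a 0-100 linear scale, but gamma encoding lifts it to roughly 46, near the middle of the code range, which is why quantizing linear light starves the shadows of codes.

```python
# Where 18% "middle gray" lands on a 0-100 scale, linear vs gamma-encoded.
GAMMA = 2.2  # assumed encoding gamma

linear_pos = 0.18 * 100                  # 18.0 on a linear-light scale
encoded_pos = 0.18 ** (1 / GAMMA) * 100  # ~46 after gamma encoding
print(round(linear_pos), round(encoded_pos))
```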


So, after the scaling or filtering, do 10 bits, 12 bits, 16 bits matter? Only if the pixel amount or position needs to be recalculated once again. Otherwise, a higher bit depth buys you nothing.


Every day I work with an incredibly detailed, subtly colored image on my 24-bit (8-bit color space) monitor using Photoshop at 1280x1024 rez. In my case, 8 is enough. In your case, however, the video card bit depth is significant.



[This message has been edited by BarkingArt (edited 03-07-2001).]
 

· Registered · 855 Posts
BarkingArt,


Thank you. This is a lot of very cool information. I really appreciate it.

Quote:
So, first of all, we're talking about resolutions that are FAR beyond what current home theater projectors can handle
I understand you are talking about archival, but these versions are used to strike digital cinema "prints", too. Right?

Quote:
Is increased bit depth noticeable using streaming MPEG-2 (8 bit) compression ...
OK, DVDs and HDTV won't use the added depth.


Now I'll get back to my question and see if I can pull an answer out of what you wrote. Please correct me if this is wrong or misrepresents what you wrote or meant. (I guess I didn't have to say that, eh?)


Q> If 8 bits are enough, why are digital films distributed as 10 bits per component, 4:2:2, YCrCb?


A> That is the archival format so movies make it to that format sooner or later anyway.


Hopefully this is a correct interpretation. If so, why wouldn't you want a digital cinema projector to use the 10 bits of information that already exist?


Perceptible or not is not the only issue. Or is it? As an artist, do you want the primary piece of display hardware for your art deciding to truncate your work because it doesn't have the data-path width to process it completely? Doesn't that idea bother you?


I still believe the statement in my first post. Each company got half the equation right. Resolution on one and color depth on the other.


I guess we will just disagree.


------------------


Huck
 