Native Resolution Digital Cinema  

post #1 of 33
Thread Starter 
Mark,

QXGA should be 2048 wide, and I think Tom has a 1080p feed, correct? It doesn't sound like any dance moves are necessary.

Tom,

Is the 1100:1 going to be possible on any of your commercial gear?

Anyway, I am glad to hear about the success. Other than being easier to calibrate, what advantage does a 3-chip DLP have?

-Mr. Wigggles



------------------
The Mothership is now boarding.
post #2 of 33
That's exciting news Tom.

I can't wait to see one myself.

I guess this means that I'll have to get a 25' wide screen now. ;)

If you need a West Coast demo location, I'll be happy to show the new QXGA D-ILA FPTV off to all of the local HT enthusiasts.

-Dean.

[This message has been edited by Dean McManis (edited 03-05-2001).]
post #3 of 33
MrWigggles -

I read a lot about DLP Cinema on the TI site in the last year or so and from what I remember:
DLP Cinema projectors use 14-bits of color depth.
They ARE (not "will be") available from multiple vendors today: Barco, Christie, and another (DP?).

Tom -

Why only 10 bits of color depth? How do the blacks look? How many ft-lamberts are delivered at the screen? What size lamp is used? Does it use a standard film projector lamp housing?

Bummer! It is too bad someone hasn't combined a 14-bit color gamut with REALLY high resolution. I thought JVC would. Looks like each company (JVC and TI) got half of it right.

------------------

Huck
post #4 of 33
Thread Starter 
Quote:
Originally posted by Huckster:
MrWigggles -

I read a lot about DLP Cinema on the TI site in the last year or so and from what I remember:
DLP Cinema projectors use 14-bits of color depth.
They ARE (not "will be") available from multiple vendors today: Barco, Christie, and another (DP?).

Tom -

Why only 10 bits of color depth? How do the blacks look? How many ft-lamberts are delivered at the screen? What size lamp is used? Does it use a standard film projector lamp housing?

Bummer! It is too bad someone hasn't combined a 14-bit color gamut with REALLY high resolution. I thought JVC would. Looks like each company (JVC and TI) got half of it right.

Larry,

If you are reading, I blame this type of "misunderstanding" on you. :)

Huckster,

I would love to argue that 8-bit color depth is plenty (especially for pure colors), but, my god, 10-bit is at least one bit more than enough. I see no benefit at all to a 14-bit "color depth" as the term is commonly understood today (i.e., non-uniform steps with a gamma between 2.2 and 2.5).
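
For anyone who wants to check the arithmetic, here is a quick Python sketch (my own back-of-envelope numbers; the 2.2 gamma and the ~1% Weber-style visibility threshold are assumptions, not anything from JVC or TI):

# Worst-case relative luminance step between adjacent codes of a
# gamma-2.2 encoded signal, ignoring steps below an assumed black floor.

def worst_step(bits, gamma=2.2, black_floor=0.01):
    levels = 2 ** bits
    worst = 0.0
    for code in range(1, levels - 1):
        y0 = (code / (levels - 1)) ** gamma
        y1 = ((code + 1) / (levels - 1)) ** gamma
        if y1 > black_floor:   # below this, real projectors can't reach anyway
            worst = max(worst, (y1 - y0) / y1)
    return worst

for bits in (8, 10, 14):
    print("%2d-bit: worst step %.2f%% (vs ~1%% visibility threshold)"
          % (bits, 100 * worst_step(bits)))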

-Mr. Wigggles

------------------
The Mothership is now boarding.
post #5 of 33
Tom
Rough guess: what price is this projector going to hang around?

Thanks Hugo
post #6 of 33
Awesome, Tom!!!

Can you please post some photos for us to see? I am going to start saving $$$ for this QXGA from now on.

Also, will this projector be shown at InfoComm 2001 this summer?

Mark Hunter,

To answer your question, read these articles about how Boeing and directors will push for digital cinema:

http://www.latimes.com/news/asection...000019012.html
http://www.latimes.com/business/2001...000019399.html

thanks,

seng



[This message has been edited by adaseng (edited 03-05-2001).]
post #7 of 33
Well,

Yesterday the moment of truth came... I finally got to see a film transfer projected using our QXGA-chip-based projector!

Full resolution, 1920x1080 projected on a 25' (+-) wide Stewart Microperf screen... contrast ratio better than 1100:1. Sources were Sony HDCAM and HDD-1000. While the projector is an engineering sample and has only 8 bit/ch color at the moment (production will have 10 bit), it was simply stunning! Anyone who says that 1280x1024 resolution is enough is simply wrong! Yes, you can put up a nice image... but the depth and detail of the higher resolution image is breathtaking... and rock solid. It is like a virgin saying sex is nothing special...

I know there will be plenty of questions but I'm probably not going to be here much over the next few days because of the show...I'll check in as I can.

Regards,



------------------
Tom Stites
Director, Business Development
Digital Systems Division
JVC Professional Products
"My opinions do not necessarily reflect..."
post #8 of 33
You gotta get this thing out there, and in the hands of us true believers (with a trade-in allowance, of course)! ;-)
post #9 of 33
Awesome, Tom! We can't wait! Finally a true HD resolution system.

What are the long-term plans for the QXGA chip? Will we see it in a product that we could conceivably put in a home within the next year or two?

Also, does this projector do the fancy DILA/HDTV progressive/interlaced dance with the 1080i feed as discussed here? Does it automatically upconvert the material to 1080p?

Finally, will we see this in Digital Cinema any time soon? (I would love to see this at a theater.) Is this what Sony will be using as their Digital Cinema offering?

------------------
My home theater: ht.markhunter.com
post #10 of 33
Quote:
Originally posted by tstites:
Full resolution, 1920x1080 projected on a 25' (+-) wide Stewart Microperf screen... contrast ratio better than 1100:1. Sources were Sony HDCAM and HDD-1000.
Also delighted to see this advance. In light of this column and similar discussions regarding resolution, it appears the HDTV source, Sony's HDCAM, limited the horizontal res to 1440. (I thought it was the tape system that restricted resolution.) So even if the chip has 2000-plus pixels, and even if it is designed for a hybrid interlace/progressive display, it seems there's nothing to display beyond the 1440 Sony cutoff.

I wonder if it's necessary to use 24p capture, with Sony's 2000-plus (horizontal res) camera and uncompressed storage, to show this display system to full advantage. And even with that, it would be interesting to hear how much of the resolution, if any, is being filtered before reaching the projection screen. -- John

------------------
STOP DVI/HDCP AND DFAST



[This message has been edited by John Mason (edited 03-06-2001).]
post #11 of 33
Mr. Wigggles,

You believe that 8 bits of color depth is enough for everything? What a heretic. :) Next you'll be saying SACDs sound the same as regular CDs. :) :)

The current crop of movies being shown on DLP Cinema screens is stored as 10-bit 4:2:2. Using 10 bits in the output path would leave no space for error terms. Using a 14-bit output path allows gamma correction and color space conversion to have some over/underflow space and should allow smoother color gradients and better black-level detail.
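
A hypothetical sketch of what I mean, in Python (an illustration of the headroom idea, not TI's actual pipeline):

# Gamma-correct a 10-bit signal into a 10-bit word vs a 14-bit word
# and count how many distinct levels survive.

def gamma_correct(code, in_bits, out_bits, gamma=1 / 2.2):
    x = code / (2 ** in_bits - 1)
    return round((x ** gamma) * (2 ** out_bits - 1))

narrow = {gamma_correct(c, 10, 10) for c in range(1024)}
wide   = {gamma_correct(c, 10, 14) for c in range(1024)}
print("10-bit output keeps", len(narrow), "of 1024 levels")  # highlights crush together
print("14-bit output keeps", len(wide),   "of 1024 levels")  # all 1024 survive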

Sorry, but I don't have a misunderstanding. ;)

You asked for advantages of DLP Cinema; 14-bit color depth is one answer.

------------------

Huck
post #12 of 33
Guys-
I work on movies, and most matte paintings, created in Photoshop, are done in an 8-bit color space. They look fine, don't they? Adobe has no plans to increase the bit depth of the program because 8-bit color overlays provide MILLIONS of colors, more than enough for professional photography. Look through photography annuals and you'll find that thousands of the best photographers in the world are using 8-bit Photoshop to finish their work. Banding and other display anomalies have more to do with gamma calibration and display techniques than with the bit depth, so fix the other problems before you start screaming about bits.

------------------
JVC - MPAA - Are you listening?
DFAST MUST DIE!
post #13 of 33
Thread Starter 
Huckster,

As one of the first reviewers of SACD said when supplied with the same source material from Sony in both the CD and SACD format: "The two are recorded so well it is hard to tell the difference."

Technology is one thing; implementation is another. If the recording engineer takes the extra time to make SACD sound better, it will. If someone takes the extra time to master the video transfer and it is shown on a $100K+ projector, it will look better.

14-bit processing is overkill in this limited contrast/light output world we live in. If movie projectors could generate 100 Ft-L images to zero Ft-L images (without causing the audience to blink too much), then a greater-than-8-bit color space would be necessary.

I apologize to anyone who has seen this before, but here it goes again. Here are 126,126,126 letters on a 127,127,127 background. Can you make out the difference on even a good CRT?

"www.geocities.com/mrwigggles/testletters.bmp"

There are limitations on digital projectors, but color depth isn't one of them. Resolution, color accuracy, contrast ratio, and light output are the most important.

I don't like walking on someone else's thread, but it appears the proof is in the pudding. It appears as if the JVC, with its measly 8 bits of color depth, is matching or surpassing what TI has to offer.

TI would love for people to think that 14 bits is the way to go. It just gives them more time to get their resolution act together.

-Mr. Wigggles

------------------
The Mothership is now boarding.

[This message has been edited by MrWigggles (edited 03-06-2001).]
post #14 of 33
Mr. Wigggles,

Quote:
The two are recorded so well it is hard to tell the difference.
But a difference was there and could be repeatedly found, nonetheless.

Quote:
Technology is one thing; implementation is another. If the recording engineer takes the extra time to make SACD sound better, it will. If someone takes the extra time to master the video transfer and it is shown on a $100K+ projector, it will look better.
I agree. Much like the difference between a new anamorphic transfer for a DVD vs. a reused laserdisc transfer. A sign with "It's the transfer, stupid" should be above every telecine station in the world.

Quote:
I don't like walking on someone else's thread, but it appears the proof is in the pudding. It appears as if the JVC, with its measly 8 bits of color depth, is matching or surpassing what TI has to offer.

TI would love for people to think that 14 bits is the way to go. It just gives them more time to get their resolution act together.
It sounds like you have seen JVC's projector and have declared a winner. Where did you see it? :)

The proof will be in the pudding. Until I actually see something besides specs, it will be hard to tell whether a 10-bit data path is enough to process 10 bits of data without over/underflow during color calibration, color space conversion, and gamma processing. I will hold off my judgement until I see it. In the meantime, I disagree with your assessment. :)

BarkingArt,

Interesting. Another "8 is enough". ;)

Not putting you on the spot, but if 8 bits are enough, why are digital films distributed in 10 bits per component, 4:2:2, YCrCb? Lowering the bit depth would certainly reduce reproduction, processing, and distribution costs, so what is the reason?

Are feature films finished in an 8bpp system?

Quote:
Banding and other display anomalies have more to do with gamma calibration and display techniques than with the bit depth, so fix the other problems before you start screaming about bits.
I'm happy to see you agree with me! ;)

How do you "fix" the gamma and displayed output? By using video processing. To do video processing on a 10 bit data word (the standard input from a digital film), you cannot use a 10 bit processing path without introducing other artifacts. This, is the crux of my disagreement.

------------------

Huck
post #15 of 33
Tom has spoken about this before on the forum. As far as
saving up for one of these QXGA projectors - see:

http://www.avsforum.com/ubb/Forum10/HTML/004124.html

where Tom states:

"Our QXGA based product is actually a little ahead of schedule but,
don't get all excited about it though, it's 7000 lumens and priced around $200K"

When Tom is talking about Digital Cinema - I believe he means
"Cinema" as in "Movie Theater".

Greg
post #16 of 33
Quote:
Simple question: What are the letters?
The thing is, that's not a good test of the potential problem. We should really take a nice smooth color gradient, then put it through some gamma correction and color correction, and look at the smoothness of the resulting gradient.

Did the least-significant bits that got truncated during processing make a discernible difference? Or were those LSBs so far below the perception/reproduction threshold that you can't tell? Until we actually have the projectors to test on, no one knows for sure.

Your 126-vs-127 letter test is good for illustrating that even if the error introduced is equivalent to a 1/256 error, it's potentially imperceptible. But is that true for all 1/256 errors? And depending on the processing done, the error could potentially be higher than that, so is it true for 1/128 or 1/64 errors?
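
Something along these lines would do it (a Python sketch; the 2.2 gamma and the depths tested are my assumptions):

# Run a smooth 8-bit ramp through a gamma round-trip at a given
# intermediate depth and count how many of the 256 levels come back distinct.

def roundtrip(code, depth, gamma=2.2):
    peak = 2 ** depth - 1
    linear = round(((code / 255) ** gamma) * peak)        # into the pipeline
    return round(((linear / peak) ** (1 / gamma)) * 255)  # back out again

for depth in (8, 10, 14):
    survivors = {roundtrip(c, depth) for c in range(256)}
    print("%2d-bit intermediate: %3d of 256 gradient levels survive"
          % (depth, len(survivors)))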

[This message has been edited by Chris Carollo (edited 03-07-2001).]
post #17 of 33
Huck-

Good. Now for the rest of the story. Up until about 2 years ago, any digital processing (visual effects composites, mostly) was done in 8-bit color space. Up until about 4 years ago, 2k resolution was standard for output to the film recorders. When the idea of a higher rez was seriously considered (made possible by cheaper data storage, affordable higher-rez film recorders, and finer-grain film stocks developed for, among other things, digital film recording), the standard became 3k and soon after, 4k. The increased output resolution gave rise to development of higher-bit-depth renders, currently 16 bit using Maya, Shake, and Inferno software. Still, most matte paintings are done in 8-bit color due to the limitation of Photoshop, and are incorporated into 16-bit composites. Some elements are painted in Matador, which renders in 16-bit color, but it's not very user friendly and hasn't been updated in years.
So, we're talking about resolutions that are FAR beyond what current home theater projectors can handle, with these decisions made for archival, original negative output. 4k is fine for 70mm, and we go up to 6k or 8k for Imax and Omnimax.

Now let's get down to it. :) For our purposes, namely scaling DVD movies to the hardware resolution of our projectors, is an 8-bit video card enough? Will a higher color bit depth make a difference? I risk contradicting myself by exclaiming YES YES YES YES YES!!!!!!

8 bits are MORE than enough, IF they are the RIGHT BITS. What this means is that after the image is digitally mastered, if it is kept in its original form (no upscaling, rotation, skewing, etc.), 8 bits will provide more than enough color information. However, if the pixels are altered in any way, even a .001 differentiation, EVERY pixel needs to be recalculated and will suffer degradation. Banding occurs when pixels are stretched (scaling), and it is there that 8-bit depth becomes deficient. If I have a low-rez image and I want to alter it, I will usually scale it up to a much larger size than my target size, do my alteration, then scale it down to the final size. This will almost always give me a more refined result than if I processed the image at the same rez. More info, better scaling. The scaling is the key. The higher the color bit depth, the greater the palette to work with when adding information that wasn't there before. If your projector's hardware resolution was exactly the same pixel height and width as the MPEG 2 image file on the DVD, 8 bits would be optimal.
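
In code form, the trick looks something like this (a Python/Pillow sketch, my tool choice rather than Photoshop; Pillow >= 9.1 assumed for the Resampling names, and frame.png is a hypothetical source):

from PIL import Image

src = Image.open("frame.png")                    # hypothetical source image
w, h = src.size

naive = src.rotate(1.5, resample=Image.Resampling.BICUBIC)   # alter at native rez

big = src.resize((w * 4, h * 4), Image.Resampling.BICUBIC)   # work 4x larger...
big = big.rotate(1.5, resample=Image.Resampling.BICUBIC)     # ...do the alteration...
refined = big.resize((w, h), Image.Resampling.LANCZOS)       # ...and filter back down

naive.save("rotated_naive.png")
refined.save("rotated_refined.png")              # smoother edges, less banding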

Add on top of this that digital color space is linear color space. Using non-floating-point calculations, this puts middle gray at about 18 on a scale from 1-100. You can just imagine what linear scaling will do to a linear grayscale.
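
The arithmetic, for the curious (assuming a 2.2 display gamma):

# 18% gray sits at code 46 of 255 in linear coding, so everything darker
# shares just 46 codes; gamma coding moves it up near mid-scale.
print(round(0.18 * 255))                # linear: 46 -- "about 18 on a 1-100 scale"
print(round(0.18 ** (1 / 2.2) * 255))   # gamma 2.2: ~117, roughly mid-range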

So, after the scaling or filtering, do 10 bits, 12 bits, 16 bits matter? Only if the pixel amount or position needs to be recalculated once again. Otherwise, a higher bit depth buys you nothing.

Every day I work with an incredibly detailed, subtly colored image on my 24-bit (8-bit color space) monitor using Photoshop @ 1280x1024 rez. In my case, 8 is enough. In your case, however, the video card bit depth is significant.


[This message has been edited by BarkingArt (edited 03-07-2001).]
post #18 of 33
bump

------------------
JVC - MPAA - Are you listening?
DFAST MUST DIE!
post #19 of 33
2 away from 100...

------------------
JVC - MPAA - Are you listening?
DFAST MUST DIE!
post #20 of 33
BarkingArt,

Thank you. This is a lot of very cool information. I really appreciate it.

Quote:
So, first of all, we're talking about resolutions that are FAR beyond what current home theater projectors can handle
I understand you are talking about archival, but these versions are used to strike digital cinema "prints", too. Right?

Quote:
Is increased bit depth noticeable using streaming MPEG 2 (8 bit) compression ...
OK, DVDs and HDTV won't use the added depth.

Now I'll get back to my question and see if I can pull an answer out of what you wrote. Please correct me if this is wrong or misrepresents what you wrote or meant. (I guess I didn't have to say that, eh?)

Q> If 8 bits are enough, why are digital films distributed in 10 bits per component, 4:2:2, YCrCb?

A> That is the archival format, so movies make it to that format sooner or later anyway.

Hopefully this is a correct interpretation. If so, why wouldn't you want a digital cinema projector to use the 10 bits of information that already exist?

Perceptible or not is not the only issue. Or is it? As an artist, do you want the primary piece of display hardware for your art deciding to truncate your work because it doesn't have the data path width to process it completely? Doesn't that idea bother you?

I still believe the statement in my first post. Each company got half the equation right. Resolution on one and color depth on the other.

I guess we will just disagree.

------------------

Huck
post #21 of 33
one more...

------------------
JVC - MPAA - Are you listening?
DFAST MUST DIE!
post #22 of 33
100 !!!!!!!

Huck-
Check my edited post above. The new info is more relevant to the matter at hand. :)

------------------
JVC - MPAA - Are you listening?
DFAST MUST DIE!
post #23 of 33
Hey, are you having any fun with that stuff yet? Whatta ya think of the quality of that paint?

As for 8 bits vs. manipulation, and the result being insufficient... please look for my thread in the CRT forum about the Nyquist theorem, GeForce cards, resolution (custom T&R's), and DVD specs and interpolation.

Here, I'll dig it up.


http://www.avsforum.com/ubb/Forum2/HTML/004334.html
------------------
goosystems.com

[This message has been edited by KBK (edited 03-07-2001).]
post #24 of 33
Hi KBK!!!

I haven't had time since I received the paint to test it out yet. We're jammin' on Reign of Fire, and our spray booth has been relocated closer but is still unusable until next week, so next week will be the screen test. I'm looking forward to it. Using a PLUS U2-1130 DLP for now. Your posts on the Nyquist theorem are not far off; our digital guru here tells me that it is most certainly taken into account when preparing digital files for compression. That's all I got from him today. I'll grill him for more info tomorrow. 1800x1200 @ 72Hz sounds like heaven. :)

Huck- As long as pixel height and width are the same from file to projected image, higher bit depth is not going to make much difference.

------------------
JVC - MPAA - Are you listening?
DFAST MUST DIE!
post #25 of 33
Thread Starter 
Huckster,

Simple question: What are the letters?

-Mr. Wigggles

------------------
The Mothership is now boarding.
post #26 of 33
This bit of info that I posted in the CRT forum is definitely relevant for folks using fixed panel projectors.

The extra color depth brought in with the use of VGA cards with 10-bit color DACs is definitely of interest to you folks. It is the only way to go if you have a 'standard' resolution panel. ATI makes the only ones in normal use (10-bit color DAC equipped VGA cards) at this time. They will have the standard panel resolutions within the driver set that is issued with the cards at this time. The JVC (DILA [ab]users) guys will have to wait, or fight with the card's parameters.. supposedly the card can do the proper numbers (for the panel dimensions) but the custom T&R capacities have yet to be activated and used by the ATI folk. There is a back door though....

This is, of course, the ATI Radeon chip-equipped VGA cards. I have begun using one, and the extra two bits are DEFINITELY worth the price of admission. I am using the 1600x1200 setting on the drivers, but it is not good enough; I am getting banding errors, where in this respect the image from the GeForce card was PERFECT... no scaling errors AT ALL. But, and this is a BIG but, I had to run the card at a custom resolution of 1800x1200, a 2.5 multiple of the DVD spec (720x480). For fixed panel users with SERIOUS complaints of loss of color depth and noticeable problems in this regard, the ATI card will make you smile. Forget about the GeForce card if you have the ability to make the ATI card fit your panel dimensions (at this time).
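
A quick Python check of why 1800x1200 works where 1600x1200 doesn't (the mode list here is mine):

from fractions import Fraction

# Which desktop resolutions are an exact common multiple of the 720x480 DVD raster?
for w, h in [(1600, 1200), (1800, 1200), (1920, 1440)]:
    rw, rh = Fraction(w, 720), Fraction(h, 480)
    verdict = "exact %s multiple" % rw if rw == rh else "mismatched axes -> rescaling"
    print("%dx%d: %s" % (w, h, verdict))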


I cannot WAIT to get the ATI card at custom Timings and Resolutions (T&R's) and really see what this puppy can do with a proper multiple of the DVD spec, with Nyquist taken into account. It should be a SNELL AND WILCOX 'SWORD-THROUGH-THE-HEART' killer for sure. This applies only to the DVD situation, of course... on a HTPC. Two things remain to be beaten out of my way on this quest: the custom timings and resolutions question, which is critical... and the filters on the card, which DO HAVE TO BE REMOVED. I see softness that was not there with the modified GeForce card.

Basically.. the irony of the situation.. is that: I am ironing out problems for fixed panel users... by calibrating for myself... and using a CRT projector to see these differences. For these purposes, the fineness of the gradations of change are easily seen in a microscopic-like manner, so the CRT projector is the tool of choice for figuring these things out.


Alan, if you want me to modify one of these cards for you, just shoot me an e-mail. BNC's, filter removal; this card is beautifully laid out and is RIPE for full blown modification.


As for the topic at hand, the QXGA-panel DILA projectors... It would be nice if they were a bit more 'real' for the average folk. I can't wait 5 years to get one; so in the meantime, I realize that quality in my home for much, much, much less money by using and modifying CRT projectors. Life is only so long...
------------------
goosystems.com

[This message has been edited by KBK (edited 03-08-2001).]
post #27 of 33
A word about our bit depth plans... the projector shown was there for one reason: to show the advantage of the higher resolution vs the existing SXGA-level chips, ours included. The driver D/A conversion was still 8-bit output with 10-bit internal processing. The ASIC nearing completion for the next-generation electronics package will feature 10-bit output and 12-bit internal processing. The plan is to fully handle 10-bit RGB 4:4:4 color, and this should be entirely sufficient for our color space.
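
For the curious, a toy model in Python of why the internal words are wider than the output (my own sketch, not the ASIC's actual math):

# Two processing stages, rounding to 10 bits after each stage,
# versus carrying 12-bit intermediates and rounding to 10 bits once.

def q(x, bits):                          # quantize a [0,1] value
    peak = 2 ** bits - 1
    return round(x * peak) / peak

def stage_a(x):                          # stand-in gamma tweak
    return x ** 0.9

def stage_b(x):                          # stand-in calibration gain
    return min(1.0, x * 1.02)

err_narrow = err_wide = 0.0
for code in range(1024):
    x = code / 1023
    exact  = stage_b(stage_a(x))
    narrow = q(stage_b(q(stage_a(x), 10)), 10)           # round every stage
    wide   = q(q(stage_b(q(stage_a(x), 12)), 12), 10)    # 12-bit inside
    err_narrow = max(err_narrow, abs(narrow - exact))
    err_wide   = max(err_wide,   abs(wide   - exact))

print("max error, all-10-bit path : %.6f" % err_narrow)
print("max error, 12-bit internals: %.6f" % err_wide)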

The current 8-bit D/A's are not sufficient for our needs now that we are achieving over 1000:1 contrast ratio... we can't get the range we need at both the low and high end; we have to set the units up for one or the other but can't manage both.

The question then arises: do we need to go to 14-bit or higher processing? The answer is no, we do not. The TI chips require this higher bit depth because of problems at the low levels. There are a number of tricks that must be played to manage the lower levels, and the extra bits are important in achieving this. I will have a new white paper describing these issues on my website in the next day or two.

Regards,

------------------
Tom Stites
Director, Business Development
Digital Systems Division
JVC Professional Products
"My opinions do not necessarily reflect..."
post #28 of 33
Thread Starter 
Tom,

I think that is kind of where I was going with this thread. These fixed panel devices need the additional resolution on the projector's final D/A's, but the actual input signal doesn't need to be 14-bit "color depth"; 10 bits is more than enough. (Where color depth means, as we normally use the term, a non-linear quantization with a gamma of 2.2 to 2.5, which is the format of all standard video signals.)

I am sure that TI is referring to their final output D/A's when referring to a 14-bit color depth (probably linearly spaced). But I am sure they are more than happy when people get confused and think "14 bits! That's 16 times more colors than competitors!"

A little bit of knowledge is a dangerous thing.

-Mr. Wigggles

------------------
The Mothership is now boarding.
post #29 of 33
That is exactly right. In conversations at various meetings with industry research and standards groups, I've heard no one express a need for greater than 10-bit, RGB, 4:4:4 material... TI needs that internal processing to deal with idiosyncrasies of their technology.

Well, maybe someone did express such a desire, but they were immediately relegated to my mental "ignore" list. :D

Regards,

------------------
Tom Stites
Director, Business Development
Digital Systems Division
JVC Professional Products
"My opinions do not necessarily reflect..."
post #30 of 33
I can confirm that TI's Digital Cinema projector doesn't take input at 14 bits/channel. The digital movies are all encoded at 10 bits/channel. The 14 bits are there to convert the 10-bit linear input to a *perceptually* linear output.

Just a little background: our perception of brightness is not linear; something that measures as twice the brightness of something else does not appear twice as bright to our eyes! We perceive light logarithmically, so equal ratios of light levels appear as equal steps. Ratios should be familiar to us forum users with the constant talk of contrast ratios. This has important benefits: a book appears the same under different lights (to a limit) because the contrast ratio of black print to white paper is the same no matter the level of illumination. Imagine if we perceived the black text of a book outside to be 10 times brighter than the white of the paper inside; how confusing would that be? (BTW, this is what you'd find if you ever measured it with an instrument rather than your eyes.) Anyway, if you ever look at a graph of perceived brightness vs. actual light levels, the graph will start out rather flat and then increase to a nearly vertical line rather rapidly.
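
The book example in rough numbers (the reflectances are my guesses):

# The measured difference changes hugely with illumination, but the
# ratio the eye keys on stays fixed.
ink, paper = 0.04, 0.80                  # assumed reflectances
for lux in (50, 500, 5000):              # dim room ... bright daylight
    print("%4d lux: ink %5.0f, paper %5.0f, ratio %.0f:1"
          % (lux, ink * lux, paper * lux, paper / ink))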

I don't have the expertise to give a detailed explanation of why you need more bits in a digital projector to account for our non-linear perception of light; suffice it to say that you need the extra bits so that the darker parts of the picture don't suffer from banding/quantization errors.
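
A small sketch of why the darks are the problem (the ~1% just-noticeable step is an assumption on my part):

# In linear coding, the jump from code c to c+1 is 1/c in ratio terms,
# so adding bits pushes the visibly-banded region deeper into black.
for bits in (10, 12, 14):
    peak = 2 ** bits - 1
    first_smooth = next(c for c in range(1, peak) if 1 / c < 0.01)  # ~code 101
    print("%2d-bit linear: steps exceed 1%% below %.2f%% of full scale"
          % (bits, 100 * first_smooth / peak))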

I suspect that Tom Stites is quite correct when he says that 12-bit gamma for a 10-bit input is enough. It should certainly be enough for the contrast ratios attainable in real-world installations.

Regards,

Kam Fung
