I have always tought 1440p makes a lot of sense for Blu-Ray. Am I wrong? - AVS Forum
post #1 of 12 Old 07-05-2009, 01:39 AM - Thread Starter
Member
Join Date: Jan 2007
Posts: 66
Please help me understand my unanswered question about 16:9 1080p displays. It might have already been asked, but I couldn't find a place where this was brought up.


Question: Have we never truly seen 1080p before?


First, most Blu-ray movies are encoded on the disc in a 16:9 frame holding a 2.4:1 picture, right? At least that's what most or all of my Blu-ray movies say on them. I'm assuming black bars are encoded on the top and bottom to fill the dead space. In any case, most or all big-screen HDTVs are 16:9, right? It doesn't take a genius to figure out that a 1080p 2.4:1 picture displayed on a 1080p 16:9 screen is only using a little over 800 lines of resolution (barely better than true 720p, or 5XXp if we applied the same logic to a 720p TV?).
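A back-of-the-envelope check of that arithmetic (a sketch; the 2.40:1 figure is approximate, since scope films range from roughly 2.35:1 to 2.40:1):

```python
# Active picture lines when a letterboxed film fills the full width of a panel.
def active_lines(panel_width, film_aspect):
    """Lines of the panel actually used by a film wider than the panel's shape."""
    return round(panel_width / film_aspect)

# 1080p panel showing a 2.40:1 film: 1920 / 2.40 = 800 lines
print(active_lines(1920, 2.40))  # -> 800

# Same logic on a 720p panel: 1280 / 2.40 -> 533 lines (the "5XXp" above)
print(active_lines(1280, 2.40))  # -> 533
```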

Second, if any of that is true, why do people freak out so much whenever 1440p comes up, saying "1080p is as high as it needs to be. HDTV signals are compressed 1080i and don't need more resolution. I have a coax cable lodged in my brain, blah blah blah..."

I have never been able to get this answered. Thanks in advance!
post #2 of 12 Old 07-05-2009, 01:51 PM
Advanced Member
Join Date: Apr 2008
Posts: 574
Quote:
Originally Posted by rekced View Post

It doesn't take a genius to figure out that a 1080p 2.4:1 picture displayed on a 1080p 16:9 screen is only going to be using a little over 800 lines of resolution (barely better than true 720p, or 5XXp if we applied the same logic to a 720p TV?).

I think Blu-ray is encoded at 1920x1080, 16:9. With a 2.4:1 movie, the actual image encoded is 16:9 including the black bars. If you displayed it at 2592x1080 for 2.4:1, or with non-square pixels at 1920x1080 anamorphic 2.4:1, you would be rescaling the image. Scaling softens the image, so it would look worse than on a normal display that matches the encoded Blu-ray resolution of 1920x1080, 16:9.

Some movies and HD TV are 16:9. But I agree Blu-ray marketing is misleading in quoting 1080p resolution when you do not get the full resolution on 2.4:1 films.

In cinemas I believe digital projection is:
1.85:1 ("flat") format: 1998x1080, often scaled down to 1588x858 in cinemas using fixed top masking.
2.39:1 ("scope") format: 2048x858, shown at 1:1 pixel mapping.
The major difference is, I think, colour depth: cinema is 15 bits per colour, and the colour/chroma resolution is the same as the black-and-white luma resolution. Blu-ray is only 8-bit colour, and the chroma is, I believe, recorded on the disc at a lower resolution than the luma, usually half the vertical and horizontal resolution, then interpolated up by the player.
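The chroma point can be made concrete. Blu-ray video uses 4:2:0 subsampling, so each colour-difference plane is stored at half the luma resolution in both directions (a sketch):

```python
# 4:2:0 chroma subsampling: each chroma plane (Cb and Cr) is stored at half
# the luma resolution both horizontally and vertically.
def plane_sizes_420(width, height):
    """Return (luma_size, chroma_size) for a 4:2:0-encoded frame."""
    luma = (width, height)
    chroma = (width // 2, height // 2)  # same size for both Cb and Cr
    return luma, chroma

luma, chroma = plane_sizes_420(1920, 1080)
print(luma)    # (1920, 1080)
print(chroma)  # (960, 540)
```

Half the resolution in each direction means each chroma plane carries only a quarter of the luma pixel count, which the player interpolates back up for display.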

A possible future format could be Japanese state broadcaster NHK's Super Hi-Vision: 7680x4320 pixels, 60 Hz progressive, with 22.2-multichannel surround sound, recorded/transmitted at 24 Gbit/s. It is being developed in co-operation with European broadcasters. Consumer displays were forecast for 2025, but that has been moved forward to 2020. It may also be high enough resolution for true three-dimensional television, with no glasses and no fixed viewer position; in fact, when you move your head you could look around foreground objects on the screen and see more of the background.
post #3 of 12 Old 07-06-2009, 10:26 AM
AVS Addicted Member
Join Date: Oct 2000
Location: Rochester, NY USA
Posts: 12,451
Quote:
Originally Posted by rekced View Post

(...) It doesn't take a genius to figure out that a 1080p 2.4:1 picture displayed on a 1080p 16:9 screen is only going to be using a little over 800 lines of resolution (...)

Remember, when people say 1080p is usually enough, they are talking about the display resolution, not the source. You are right, and this is a big issue: a 1080p Blu-ray that is 2.4:1 is not full 1080p, so you aren't using the display's full potential.
post #4 of 12 Old 07-06-2009, 08:20 PM - Thread Starter
Member
Join Date: Jan 2007
Posts: 66
Quote:
Originally Posted by dovercat View Post

(...) The major difference is I think colour depth, cinema is 15 bit per colour (...) Blu-ray is only 8-bit color and color/chroma is recorded on the disc I believe at a lower resolution than the luminance/luma (...) A possible future format could be Japanese state broadcaster NHKs Super Hi-Vision (...)


Thanks for the reply.

One thing to note: I did not say I felt it was misleading to call it 1080p when it is 850p displayed within a 2.39:1 space. I'm just trying to get a better grasp of television technology today and in the future.

After reading your response a second time, I realized you answered my question in more than one way. Now I wonder: will the quest for higher resolutions go the way of the digital camera industry, always chasing resolution and only worrying about dynamic range whenever it fits the marketing plan?

Is 15-bit colour kind of the minimum "ideal" standard for cinema? I remember reading an article a while back from Eizo (the monitor company) saying it takes a monitor with 16-bit processing to display an 8-bit source properly. I wonder what it would take in the television world to display a 15-bit colour source properly. I guess part of Eizo's reasoning is that there are many different 8-bit sources out there, and displaying all of them properly requires a much larger internal space. I'm guessing this is where their 16-bit claim came from. Blu-ray discs are probably more consistent in their colour properties than what graphics professionals encounter, so maybe televisions wouldn't need as much extra colour depth? Am I understanding any of this properly?
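Eizo's argument can be illustrated with a toy calculation: applying an adjustment (a gamma curve here) in 8-bit arithmetic merges distinct source levels, while the same maths done with 16-bit internal headroom keeps nearly all of them apart. This is only a sketch of the banding argument, not Eizo's actual pipeline:

```python
# Applying a tone curve (gamma 2.2) to 8-bit video: if the output is also
# 8-bit, many distinct input codes collapse onto the same output code
# (visible as banding); a 16-bit internal pipeline preserves almost all 256.
def gamma_map(levels_in, levels_out, gamma=2.2):
    """Map every input code through a gamma curve into the output range."""
    return {i: round(((i / (levels_in - 1)) ** gamma) * (levels_out - 1))
            for i in range(levels_in)}

codes_8bit = set(gamma_map(256, 256).values())       # 8-bit in, 8-bit math
codes_16bit = set(gamma_map(256, 65536).values())    # 8-bit in, 16-bit headroom

print(len(codes_8bit))   # well under 256 distinct outputs -> banding
print(len(codes_16bit))  # nearly all 256 source levels survive
```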
post #5 of 12 Old 07-07-2009, 03:13 AM
Member
Join Date: Aug 2006
Location: Italy
Posts: 25
Quote:
Originally Posted by dovercat View Post

(...) In cinemas I believe digital projection is:
1.85:1 format 1998x1080, often scaled down to 1588x858 with cinemas using fixed top masking. 2.39:1 2048x858 shown at 1:1 pixel mapping (...)

That's right; however, most theatres have preset zoom levels, or an additional wide/anamorphic lens, to show CinemaScope material.
So even flat material is shown at its native resolution on CIH screens.

Quote:
Originally Posted by dovercat View Post

(...) The major difference is I think colour depth, cinema is 15 bit per colour and color/chroma resolution is the same as the black and white luma image resolution. (...)

It's JPEG 2000, 4:4:4, 12 bits per channel, up to 250 Mbit/s.
post #6 of 12 Old 07-07-2009, 03:27 AM
Advanced Member
Join Date: Apr 2008
Posts: 574
The main advantages of commercial digital theaters over home cinema are, I believe:

Colour gamut, the range of colours that can be displayed. Digital theaters use xenon lamps and different filters to get a larger gamut. It is still not as large as traditional film stock, and nowhere near as large as the full range of visible colours you can see. For consumer displays, xvYCC is an attempt at increasing the colour gamut.
http://www.efilm.com/colormatrix/colormatrix.html
http://www.youtube.com/watch?v=x0-qoXOCOow

Colour bit depth, the number of colour shades that can be represented. Digital cinema in theaters has far finer gradations than home cinema, so colour gradients look smoother, with no visible banding. This becomes more of an issue the larger the colour gamut. Deep Color is an attempt at increasing consumer display bit depth and is designed to be used with the xvYCC gamut to prevent colour banding.

Thanks, Aidoru, for correcting me: commercial digital theaters use 12-bit depth. I have read that 12 bits is, in theory, the minimum needed for most people not to see banding, and it is the standard for the source.
Digital film and video for the consumer market is usually mastered and processed at 10 bits or higher, but Blu-ray uses 8-bit. Many displays cannot even handle 8 bits on a per-pixel, per-frame basis; they have to cheat with temporal or spatial dithering. I have a DarkChip3 DLP; according to Texas Instruments, DarkChip3 can display 10-bit depth, but I assume that is with a 1x-speed colour wheel using just RGB segments. With typical 4x and 5x colour wheels, it is probably having to dither even to display 8 bits. For consumer projectors, LED DLP should be capable of greater bit depth thanks to no spoke time and a more dynamic light source.
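The temporal-dithering cheat mentioned above works by alternating between the two nearest displayable levels so the average over a few frames lands on the in-between shade. A minimal sketch (the target level of 100.25 is just an illustration, not any real panel's behaviour):

```python
# Temporal dithering: a display limited to integer drive levels fakes an
# in-between shade by alternating levels frame to frame, so the running
# average tracks the target.
def dither_sequence(target, frames):
    """Frame-by-frame integer levels whose running sum tracks `target`."""
    out, err = [], 0.0
    for _ in range(frames):
        err += target
        # Emit whichever integer level keeps the cumulative sum on track.
        out.append(round(err) - sum(out))
    return out

seq = dither_sequence(100.25, 8)
print(seq)                  # mostly 100s with an occasional 101
print(sum(seq) / len(seq))  # 100.25 on average
```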
One reason commercial digital theaters use 3-chip DLP, I assume, is that it can display greater bit depth without relying on dithering. The other is greater brightness without inducing the DLP rainbow effect.

Resolution in commercial digital theaters is higher than home cinema, but not massively so: 2K (2048x1080) is 2.2 MP at 24 frames per second. The advantage is being extended with 4K digital cinema (4096x2160), or 8.85 MP at 24 frames per second. Resolution in effect sets the minimum viewing distance, or the maximum image size, before pixel structure becomes obvious, and that determines how much of your field of view the image can occupy. THX home cinema recommends the image fill 36 degrees or more of your field of view; a front row at the movie theater is about 90 degrees, the middle row about 50 degrees. The idea is that the image occupies the detail-sensitive central 20 degrees plus some of the motion-sensitive peripheral vision, so motion on the screen looks like motion in the real world, which makes the image more immersive.
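Those viewing-angle figures follow from simple trigonometry. A sketch, where the 100-inch-diagonal 16:9 screen (about 87 inches wide) and 10-foot seat are illustrative numbers:

```python
import math

# Horizontal viewing angle subtended by a screen of a given width, seen
# from a given distance (same units for both).
def viewing_angle_deg(screen_width, distance):
    return math.degrees(2 * math.atan((screen_width / 2) / distance))

# A 100" diagonal 16:9 screen is about 87" wide; from a 10 ft (120") seat
# it exceeds the THX-recommended 36 degrees.
print(round(viewing_angle_deg(87, 120)))  # -> 40
```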

Unfortunately for commercial cinemas, the biggest wow factor for picture quality, in my opinion, is perceived image depth: looking into the image rather than at it, so that, as they say, it just pops out of being a picture and appears three-dimensional. This can be done with visual cues/illusions, but it usually needs good contrast with a low black level. Consumer displays, at least those bright enough and with a high enough contrast ratio to be set up with a high gamma, can look stunning in a batcave. I am one of those who does not believe display gamma should be 2.2; I prefer 2.5 in a batcave. Commercial theaters, due to health and safety (in the UK at least), no longer go to complete blackout like they did when I was a kid, so the image these days is always compromised. I have read that commercial digital cinema's reference values are an on-screen sequential contrast of at least 2000:1 and an ANSI contrast of at least 150:1, but actual digital theaters can be as low as 1200:1 sequential and 100:1 ANSI and still be within spec. (A traditional film print has a maximum contrast ratio of about 400:1 [less on screen due to lens flare, etc.], with sequential contrast of about 1600:1 pre-1997 and nearly 4000:1 from 1997.) RealD 3D is an attempt to give commercial cinemas more image depth. I saw Monsters vs. Aliens in RealD 3D and was not impressed, but I will give it another go with Avatar.

In my opinion, the Blu-ray spec should have made a greater leap in image quality with colour gamut, bit depth, and 3D standards. These all now seem to be on the drawing board, but I cannot see them getting wide support except for 3D. After all, Superbit DVDs did not outsell the DVDs with all the extras.

In effect, Blu-ray is the same picture as DVD, just at a higher resolution. In comparison to DVD it is much better. Fine details have more contrast, since the resolution is higher and, unlike DVD, the image is not pre-smoothed during mastering (done there to reduce interlace line twitter and the amount of data to be encoded), nor possibly incorrectly de-interlaced or suffering high-frequency roll-off at the analogue output from the video DAC and filters. Colours may also be better saturated in fine detail, thanks to the higher resolution and the absence of analogue-output DACs and filters. And with a matching display resolution, you can use 1:1 pixel mapping, so there is no softening from scaling. Some Blu-ray players also have image processing to enhance detail.
I do, however, think 50 Hz output should have been part of the spec for PAL-region players, since in the UK we are not used to 24-frame-per-second flicker or 60 Hz motion judder, and we now have to rely on the display being capable of a multiple of 24 frames per second.
post #7 of 12 Old 07-13-2009, 10:30 AM
AVS Special Member
Join Date: Jul 2000
Location: Oregon
Posts: 8,217
Quote:
Originally Posted by rekced View Post

Second, if any of that is true, why do people freak out so much when 1440p becomes talked about saying "1080p is as high as it needs to be. HDTV signals are compressed 1080i and don't need more resolution. I have a coax cable lodged in my brain, bla bla bla..."

The resolution needed in the delivery format/display is not very meaningful until we know the display size and viewing distance. Using the Carlton Bale calculator, which includes a visual-acuity feature, a 1080p display turns out to be optimally viewed at a 30-degree angle of view, while for a 1440p display that increases to 40 degrees. On that basis, 1440p would be a much better match for projector-based home theaters, which tend to have viewing angles of 40+ degrees. For typical flat-panel displays there is no need for more than 1080p, as the extra resolution cannot be seen at under a 30-degree viewing size.
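The acuity reasoning behind calculators like Carlton Bale's can be sketched: with normal 20/20 vision the eye resolves roughly 1 arcminute, so the number of horizontal pixels worth having is about 60 per degree of viewing angle. This is a small-angle approximation, not the calculator's exact maths:

```python
# At ~20/20 acuity the eye resolves about 1 arcminute, so extra pixels stop
# being visible once each one subtends less than 1/60 of a degree.
def pixels_resolvable(view_angle_deg, arcmin_per_pixel=1.0):
    """Approximate horizontal pixels distinguishable across a viewing angle."""
    return view_angle_deg * 60 / arcmin_per_pixel

print(round(pixels_resolvable(30)))  # -> 1800, roughly a 1920-wide 1080p panel
print(round(pixels_resolvable(40)))  # -> 2400, closer to a 2560-wide 1440p panel
```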

When we bought some 50" plasmas for home use, we went with 720p models instead of 1080p, since at a 10' seating distance it made no difference in PQ, and we saved $1k per display. The salesman was convinced we'd miss out on HD quality, but he had no idea about the realities of human acuity.
post #8 of 12 Old 07-14-2009, 03:00 PM
Member
Join Date: Aug 2006
Posts: 65
Yes,

The Dark Knight's IMAX scenes on Blu-ray open up to full 1080p and look awesome compared to the normal aspect ratio. I wish studios would just drop the theatrical aspect ratio and encode all movies on BD at 16:9 1080p. There's not much going on at the far edges of the screen anyway.
post #9 of 12 Old 07-22-2009, 12:09 PM
 
Join Date: Jul 2009
Posts: 3
srry!!!

addi
post #10 of 12 Old 07-23-2009, 12:13 PM
Member
Join Date: Jan 2008
Posts: 35
It's seemed to me for a while that an upconverter that would expand the 2.4:1 Blu-ray image to the full 1080 vertical lines would be very worthwhile for projectors. DVD upconverters aren't perfect, but they do help, so it seems plausible that upconverting Blu-ray would be a good thing too.
post #11 of 12 Old 07-26-2009, 10:39 AM
AVS Special Member
Join Date: Jun 2008
Posts: 1,279
All of this is why I get tired of technology. We try to move ahead quicker than we need to. Does BR look better than standard DVD? Sure, but it's still not quite worth it to me yet. I keep going back and forth on this issue, but when the time comes for an upgrade on my part, I'm going for a good upscaling DVD player with tons of features. I can use all my old DVDs and buy new ones for $3-$5 at Walmart any day of the week.

When tech slows down and finally catches up with itself, then it will be time to worry about it all.

JMO, not trying to diss anyone who has followed all the trends, or step on any shoes here.
post #12 of 12 Old 07-26-2009, 07:17 PM
 
Join Date: Apr 2008
Posts: 2,155
I "tought" I saw a puddy cat!...:0)