
4k by 2k or Quad HD...lots of rumors? thoughts? - Page 118

post #3511 of 3670
Quote:
Originally Posted by Glimmie View Post
 
Quote:
Originally Posted by 8mile13 View Post

In some areas Hollywood is conservative. For instance, just a few directors are pushing for higher frame rates while most directors want to stick with (the almost 100-year-old eek.gif) 24fps.

And the reason for that is thousands of years old!

1913: Twice the frame rate, twice the film consumed, twice the cost.

2013: Twice the frame rate, twice the digital storage required, twice the cost.

What has changed?

 

What's changed is the impact of the media costs on the overall cost of the film.  What you listed above could well be true if memory were $1 a petabyte.  Twice that would be $2.  "What's changed?"

post #3512 of 3670
Doubling the frame rate, if uncompressed, should only double the storage requirement, just like with stereoscopic 3D. Going from 2K to 4K uncompressed should quadruple it. A 2K 48 fps/50 fps film should be less expensive/easier than a 3D 2K 24 fps film.
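The doubling/quadrupling claims are easy to sanity-check with back-of-envelope arithmetic. A minimal sketch, assuming uncompressed 36-bit RGB frames and a 2-hour runtime (both round-number illustrative assumptions, not production figures):

```python
# Uncompressed storage for a 2-hour feature. The 36 bits/pixel (12-bit RGB)
# figure and 2-hour runtime are illustrative assumptions.
def storage_tb(width, height, fps, hours=2, bits_per_pixel=36, eyes=1):
    frames = fps * 3600 * hours * eyes            # eyes=2 models stereo 3D
    return frames * width * height * bits_per_pixel / 8 / 1e12  # terabytes

base = storage_tb(2048, 1080, 24)                 # 2K 24 fps baseline
print(f"2K 24fps:    {base:.2f} TB")
print(f"2K 48fps:    {storage_tb(2048, 1080, 48):.2f} TB  (2x)")
print(f"2K 24fps 3D: {storage_tb(2048, 1080, 24, eyes=2):.2f} TB  (also 2x)")
print(f"4K 24fps:    {storage_tb(4096, 2160, 24):.2f} TB  (4x)")
```

Doubling fps or adding a second eye scales linearly, while doubling both dimensions scales by four, which is the point being made above.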

They've already made a 48 fps 2D film in India (Orey Nyabagam) so I'm sure Hollywood could easily afford to make them too.
Edited by Joe Bloggs - 9/29/13 at 9:28pm
post #3513 of 3670
Quote:
Originally Posted by coolscan View Post

This is the important slide, the rest is mostly JVC specific.........



As for Rec.2020, there are no displays currently being manufactured that can show 2020 even if you could feed it to them, and no material for the foreseeable future that will be produced with Rec.2020.

Let the movie content producers get their heads around 4K at all (which they are having a hard enough time with), not to mention real quality 4K, before anybody can start to dream about wider color space and higher framerates.

There is still a majority of filmmakers and broadcast people who think 1080p and Rec.709 are "good enough forever". rolleyes.gif

Thanks for posting the slide. Seems we have a case of buyer beware regarding early HDMI 2.0 "ready" equipment. Perhaps that's why there is an NDA for the 2.0 spec.
post #3514 of 3670
Quote:
Originally Posted by dsinger View Post

Seems we have a case of buyer beware regarding early HDMI 2.0 "ready" equipment. Perhaps that's why there is an NDA for the 2.0 spec.

 

No, IMO it's almost certainly pure revenue generation.  I've seen it before from time to time and it's obnoxious.  In computer science, even the C language specification (I believe "C11" was the last one?) was pay-to-read and under tight copyright, completely without "fair use" exceptions.

post #3515 of 3670
Quote:
Originally Posted by Joe Bloggs View Post

Doubling the frame rate, if uncompressed, should only double the storage requirement, just like with stereoscopic 3D. Going from 2K to 4K uncompressed should quadruple it. A 2K 48 fps/50 fps film should be less expensive/easier than a 3D 2K 24 fps film.

They've already made a 48 fps 2D film in India (Orey Nyabagam) so I'm sure Hollywood could easily afford to make them too.

And what do you do about the installed 24fps base? How do they see the feature? You are going to have to make a 24fps version, and that costs money. And it's not as simple as throwing away every other frame.
post #3516 of 3670
Informal Poll:

How many of the Hollywood technical critics here belong to SMPTE?

SMPTE is an open society, meaning that anyone can join as a student and get full access to all the publications. And despite the professional experience requirements for full membership, exceptions are frequently made for those who have the technical background yet work outside the industry.

So how many of you belong to SMPTE? Because they are the people who set the standards. Instead of bitching about it why not join and become part of the decision process*.

*of course joining a standards committee is another matter. You must have the peer-certified technical background to participate there.
Edited by Glimmie - 9/30/13 at 7:47am
post #3517 of 3670
Quote:
Originally Posted by Glimmie View Post

Quote:
Originally Posted by coolscan View Post

Yes, why indeed.
Ask the guys in Hollywood why they haven't jumped on 4K years ago,
Ahh, they did! Digital 4K has been around since 2005.
Just because four movies (according to the Sony website - sourced from IMDb) got a 4K scan of the film isn't what I call "jumped on 4K".

A total of 68 movies between 2005 and 2013 had a 4K DI..........when we know that CGI is rendered in 2K, so to get that to 4K they either have to up-convert or, most likely, do a 4K scan of the release print and call it 4K.
Then 4K just becomes a number of pixels and has nothing to do with properly resolved 4K resolution.

We have no confirmation on whether these 68 movies were released as 4K DCPs to cinemas, and no word on the post workflow to confirm that it was 4K all the way.

All in all, when we know that there are around 600 movies released in the US every year, I won't call 68 movies that claim some kind of "4K" source jumping on 4K.

Quote:
Originally Posted by Glimmie View Post

Quote:
Originally Posted by coolscan View Post

or are rolling out 8K in the cinemas before it gets to the consumers.
Show me some 8K production gear on the market I can buy and have support for. And just because you can shoot a YouTube video with nothing more than a camera does not mean you can do the same for an A-grade theatrical title. It takes a bit more than just an 8K camera.
There were no 4K cameras in Hollywood in 2005, when you argue that they started with 4K (not that they told anybody at the time); it was up to a disruptive startup to do that two years later.
So it is up to Hollywood to say that they want to be in the forefront and offer better quality in the cinema before it drizzles down to the consumers homes.

If Hollywood said that they want all big budget movies to be shown in 8K within the next two years, both the 8K cameras and the 8K projectors would be there.
But Hollywood can't even say that for 4K, where all the cameras and projectors are available, which just shows that Hollywood isn't interested in being at the forefront and competing with home viewing.
In the '50s and '60s they were afraid of the competition from TV, and then they developed CinemaScope, 35mm horizontal and 70mm.
Now Hollywood and the people working there are behind the curve in technology, resisting even what is available and leaving the development to the broadcast community, like NHK.

It doesn't help much that Sony claimed the F65 to be an 8K camera, then a year later claimed it is a "real 8K" camera after a firmware upgrade, none of which is actually true. The F65 can never be an 8K camera unless it gets a new sensor.
Quote:
Originally Posted by Glimmie View Post

Quote:
Originally Posted by coolscan View Post

Ask them why they are the last to jump on technological advances made specifically for the movie industry, and instead behave more as if they are being dragged kicking and screaming into the future.

The movie industry is incredibly conservative and slow when it comes to embracing new image technology, and the people are afraid of having to learn new ways of doing things.

People here in this forum are several steps ahead in cycles and tempo compared to what is developing in the movie industry, or among CEMs.

The movie industry has been using digital effects since the 1970s, long before there was a hint of a home PC. Digital scanning has been going on since the 1980s. Did you have a digital camera in 1984? Well, I worked with early laser scanners that year.

How about the move to non-linear editing in the late 1980s? The movie industry was editing on DOS-based PC platforms long before Final Cut Pro came about.

The Hollywood industry is far more technologically advanced than you imagine.

You don't know what you are talking about here.
Is that your response, that I don't know what I am talking about, followed by anecdotal stories from the '70s and '80s about digital VFX development, something I haven't made an argument about?

Yes, Hollywood was ahead of the development once upon a time, as I mentioned above. Hollywood is still at the forefront when it comes to the development of digital effects.
But that is not what we are talking about here; we are talking about the present and the future of higher-resolution imagery, as in the quality of how movies are shot and released.

You just come across as an apologist for Hollywood. You never present any real facts as to how Hollywood acts when it comes to driving cinema towards higher resolution.
Just check how many movies in the last few months shot on Alexa (the majority), how many shot on 35mm (#2), and how many are actually shooting on cameras that can provide proper 4K (Red/F65); then you will see how much Hollywood "jumps on 4K".

Tell us how they get all the VFX/CGI from 2K to 4K in these VFX-heavy movies that are released in so-called 4K?

We see it in all the complaints over EE in BD releases. Obviously someone down the pipeline thinks that compressed 1080p is too soft and needs excessive up-sharpening, which shows 2K is an inferior resolution.

For me, the only way I can accept a movie shot on 35mm film as real 4K is if the scan done from the negatives is used for the digital release. To rescan a release print is not real 4K for me.

The day the large Hollywood studios say that they want all their movies shot at higher than 4K resolution, and that there is a higher-than-4K DI post flow all the way for 4K digital releases, then I will trust that Hollywood really has "jumped on 4K" with both feet and might trust that all 4K movies are really 4K. Until then I will continue being critical.

Just for a cool shot of Michael Bay shooting "Bayhem". tongue.gif



Michael Bay shooting Transformers 4: Age of Extinction on a 3ality 3D rig with two Red Epics.
He also got two Red Dragons recently; in addition he is using Red Scarlets, 35mm film, IMAX Digital 3D cameras (essentially reworked Phantom 65 cameras - rumour is that they were abandoned because they didn't work out too well), Canon DSLRs and GoPros.
post #3518 of 3670
Coolscan,

There were 4K cameras in 1935! It's called Technicolor three-strip. Where do you get the idea that 4K re-scans are done from print? Yes, some elements are, where the OCN is missing or damaged, but that's why it's called restoration. You do the best you can.

How many 4K theaters were there in 2005? How many today? I think you will see the 4K releases are very much in step with the venues. It makes little financial sense to run a full 4K DI process when there are only 20 screens that can reproduce 4K. Now back around 2010, the studios started storing raw camera data on LTO tape. As post production is all computer driven today, it is an easy process to re-conform a feature in 4K. It's mostly an automatic process.

That is a very sound and wise business decision. Release 2K today but protect for 4K future. Of course the feature must be worthy of this expense so not every movie gets it.

You seem to think it's a flip of a switch to go to an all 4K or even 8K model. But where does the funding come from?

If you think this is such a slam dunk then why not go to Wall Street and start your own studio? Surely the major investment banks will agree with you.

BTW, have you applied for SMPTE membership yet? I suggested you do so months ago.
Edited by Glimmie - 9/30/13 at 7:48am
post #3519 of 3670
Quote:
Originally Posted by Glimmie View Post

And what do you do about the installed 24fps base? How do they see the feature? You are going to have to make a 24fps version, and that costs money. And it's not as simple as throwing away every other frame.
Actually, it is that simple. But I would much rather see high framerate productions move to 60fps rather than 48fps. 48fps requires new displays for support, 60fps is supported on pretty much anything, and it's a higher framerate.
post #3520 of 3670
Quote:
Originally Posted by Chronoptimist View Post

Actually, it is that simple.

Which is the anchor frame, the first or the second? Sounds petty but without a standard.....

A shot may have been set up differently owing to 24p motion issues. DPs know how to do that. Now will a 48-frame action shot look OK at 24fps? Do we just live with it? What about when the studio execs say they want it right for 24-frame, as 48 is the minority at the time?

These are all issues the armchair folks just brush away but in the real world you can't do that.

HFR will evolve based on public demand and as the costs of storage fall. Will 24fps be around forever? Probably not, but it isn't going away soon either.
post #3521 of 3670
Quote:
Originally Posted by Chronoptimist View Post

Actually, it is that simple. But I would much rather see high framerate productions move to 60fps rather than 48fps. 48fps requires new displays for support, 60fps is supported on pretty much anything, and it's a higher framerate.

Aren't most digital cinema projectors running at 144 Hz? I'm fairly certain you will find they are (and those that aren't will be running at some multiple of 24, which is one of the reasons 48 was used for HFR).
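The multiples argument can be made concrete. A small sketch (the 144 Hz triple-flash figure is from the post above; 120 Hz is added purely for comparison):

```python
# Which content frame rates divide evenly into common projector refresh
# rates? A refresh rate can show each frame a whole number of times only
# when fps divides it exactly, which is the compatibility argument above.
refresh_rates = [120, 144]
frame_rates = [24, 48, 60]

for hz in refresh_rates:
    fits = [fps for fps in frame_rates if hz % fps == 0]
    print(f"{hz} Hz evenly supports: {fits}")
# 120 Hz evenly supports: [24, 60]
# 144 Hz evenly supports: [24, 48]
```

So a 144 Hz projector handles 24 and 48 fps cleanly but would need judder or a different refresh rate for 60 fps material.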
post #3522 of 3670
Quote:
Originally Posted by Glimmie View Post

Coolscan,

There were 4K cameras in 1935! It's called Technicolor three stripe.
Yes, and later there was 35mm horizontal and 70mm, both of which were abandoned.
But what kind of argument are you trying to make here? Are you arguing that we had higher resolution sixty years ago than we have had since the early '70s?

What would the attitude of filmmakers today have been if the main format had been 70mm for the last forty years? Would filmmakers today have accepted 2K digital as an acceptable format?

Quote:
Where do you get the idea that 4K re-scans are done from print? Yes, some elements are, where the OCN is missing or damaged, but that's why it's called restoration. You do the best you can.
One thing is the restoration of old movies, where VFX was mostly done on film and the negatives are too damaged to be rescanned, but that is not what I am talking about.
New or relatively recent 35mm movies are post-produced in a 2K or 4K DI, mixed in with 2K-rendered CGI. How do they match the 4K DI with the 2K CGI? Like the Sony "Mastered in 4K" titles, which from the reports and the screengrabs look so soft that I assume they are doing the easy thing and rescanning the release print rather than rendering out 4K from the original 4K or 5K files.

You, who have the connections, should ask around about how they do the post flow on 35mm movies with VFX (and on digitally shot movies too), like the "Mastered in 4K" titles, and tell me that the workflow is truly 4K, instead of just asking me where I get the idea from. It is a rather educated "guess" on my part.
Quote:
How many 4K theaters were there in 2005?
It is you who started the 2005 argument, so don't try to turn that around on me.

Quote:
How many today?
I think you will see the 4K releases are very much in step with the venues. It makes little financial sense to run a full 4K DI process when there are only 20 screens that can reproduce 4K.
About half of US cinema screens are fitted with 4K projectors (about 15,000 4K screens out of a total of about 30,000 digital screens).
Do you think that shows the number of 4K movie releases is in step with the number of 4K venues?

Quote:
Now back around 2010, the studios started storing raw camera data on LTO tape. As post production is all computer driven today, it is an easy process to re-conform a feature in 4K. It's mostly an automatic process.
If you think it is that easy, then you know less than I thought.
You have to regrade the 4K files and re-fit the CGI/VFX; you can't just pull up the original 4K footage and re-render it, even if you can relink it to the original edit.
It is very much redoing the post work.
Quote:
That is a very sound and wise business decision. Release 2K today but protect for 4K future. Of course the feature must be worthy of this expense so not every movie gets it.
If you don't shoot for 4K, then you don't get real 4K tomorrow; it will only be an up-conversion.

How many movie producers demand that the DP choose a camera that ensures 4K protection in the future?
Seeing that the majority of movies are shot on Alexa, I think that answers itself.

And Arri, the first company that published papers arguing for 4K and even 8K, and maybe the largest provider of camera equipment to Hollywood movies, hasn't even made a camera able to shoot for 4K releases.
I think that also shows how much the Hollywood production environment cares about 4K.
Quote:
You seem to think it's a flip of a switch to go to an all 4K or even 8K model. But where does the funding come from?
If you think this is such a slam dunk then why not go to Wall Street and start your own studio? Surely the major investment banks will agree with you.
The equipment is there, and the small extra cost of rendering out a 4K movie is negligible in relation to the $100M++ cost of so many Hollywood movies. They still find the funding to buy 35mm film, pay the lab and pay for the scanning.

Maybe the investment bankers should think some more about giving the cinema public a superior experience, one that competes with the multitude of ways people can now get better-quality images on their small screens.
They managed to monetize 3D, so why not try to monetize 4K, or get a step ahead and introduce 8K? The mentality seems more towards milking "good enough" as long as possible.

Maybe the real problem is that the "bean counters" have gotten too much power in Hollywood, something we see complaints about now and then, lately with George Clooney slamming hedge fund honcho and Sony Pictures investor Daniel Loeb.

Quote:
BTW, have you applied for SMPTE membership yet? I suggested you do so months ago.
You think I could make a difference?
It doesn't matter what standards SMPTE recommends if nobody in the movie industry actually follows up and uses the recommendations.

How good can a movie shot in higher than 4K resolution look?
Even though this is a BD screengrab done by somebody, why don't all movies have this image quality? (I mean technical quality, not look.)
Just imagine this in 4K!
The Great Gatsby.


Edited by coolscan - 9/30/13 at 10:04am
post #3523 of 3670
Quote:
Originally Posted by coolscan View Post


You, who have the connections, should ask around about how they do the post flow on 35mm movies with VFX (and on digitally shot movies too), like the "Mastered in 4K" titles, and tell me that the workflow is truly 4K, instead of just asking me where I get the idea from. It is a rather educated "guess" on my part.

If you think it is that easy, then you know less than I thought.
You have to regrade the 4K files and re-fit the CGI/VFX; you can't just pull up the original 4K footage and re-render it, even if you can relink it to the original edit.
It is very much redoing the post work.

If you don't shoot for 4K, then you don't get real 4K tomorrow; it will only be an up-conversion.

You do not have to re-grade for 4K. This is a practiced technique and it works. Yes, some effects shots need to be redone based on content, but this is not a complete re-post job.

What exactly do you do for a living?

And what do those screen shots prove? They came from a Blu-ray. And you ask why all features can't look that good? Well, most A features today do look that good. And some don't, for a multitude of reasons, one being creative intent.
Edited by Glimmie - 9/30/13 at 10:17am
post #3524 of 3670
Quote:
Originally Posted by tgm1024 View Post

No, IMO it's almost certainly pure revenue generation.  I've seen it before from time to time and it's obnoxious.  In computer science, even the C language specification (I believe "C11" was the last one?) was pay-to-read and under tight copyright, completely without "fair use" exceptions.

My concern is that the revenue generation goes beyond the HDMI organization. Does the HDMI 2.0 spec allow a CEM to claim an HDMI 2.0 product if the product cannot pass ~18 Gbit/s? If the JVC slide is to be believed, as well as Sony, then something around 10 Gbit/s can be called HDMI 2.0. Sony is telling potential customers of their current 4K sets that there will be an upgrade through firmware to HDMI 2.0 by the end of the year. Assuming Panny knows something about what will be in the 4K BD spec, having all your 4K devices with 18 Gbit/s HDMI 2.0 is very important, and early adopters who thought they bought HDMI 2.0 but only got 10 Gbit/s will be very disappointed.

As an aside, I hope Panny's chip doing 12-bit 4:2:2 or 8-bit 4:4:4 is an indication of what we can expect from 4K BD.
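For reference, the ~18 Gbit/s and ~10 Gbit/s figures line up with simple TMDS arithmetic. A rough sketch, assuming the standard 4400x2250 total 4K timing and 8b/10b TMDS coding (an approximation of the link math, not a statement about any vendor's actual silicon):

```python
# Back-of-envelope HDMI TMDS bandwidth for UHD signals, modeling chroma
# subsampling as an average bits-per-pixel reduction. Assumed: CEA-861
# 4K total timing (4400x2250) and 10 link bits per 8 payload bits.
def tmds_gbps(h_total, v_total, fps, bits_per_component, chroma="4:4:4"):
    channels = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]
    payload_bits_per_sec = h_total * v_total * fps * channels * bits_per_component
    return payload_bits_per_sec * 10 / 8 / 1e9    # add 8b/10b overhead

print(f"4K60 8-bit 4:4:4: {tmds_gbps(4400, 2250, 60, 8):.1f} Gbit/s")
print(f"4K60 8-bit 4:2:0: {tmds_gbps(4400, 2250, 60, 8, '4:2:0'):.1f} Gbit/s")
print(f"4K30 8-bit 4:4:4: {tmds_gbps(4400, 2250, 30, 8):.1f} Gbit/s")
```

Under these assumptions 4K60 8-bit 4:4:4 needs roughly 17.8 Gbit/s (the full HDMI 2.0 rate), while 4K60 4:2:0 and 4K30 4:4:4 land near 8.9 Gbit/s, which is why ~10 Gbit/s silicon can still carry a 4K60 picture in a reduced format.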
post #3525 of 3670
HDMI 2.0 AND CURRENT HARDWARE

Now that HDMI has fully caught up with UHD, are current UHD displays up to the task? Yes, but with an upgrade. There are two likely paths:

1) A firmware update

2) A hardware update

Sony has promised that the UHD displays it introduced in 55- and 65-inch sizes this year will receive a firmware update for HDMI 2.0. How is this possible? Since the HDMI 2.0 signal is electrically identical to HDMI 1.4, the change will be in how fast the chip operates. We speculate that Sony and others may have planned for HDMI 2.0 and used faster chips in their UHD displays to prepare for it. This means the upgrade will be a simple firmware update that's likely downloaded from the internet.

For devices like Sony's VPL-VW1000ES projector and 84-inch UHD TV that lack faster HDMI chips, a new HDMI 2.0 input board will be required. Whether this hardware update will come free is still unknown.

Samsung uses an outboard input box it calls OneConnect. It contains HDMI 1.4 inputs as well as others, plus the electronics for its Smart functions, and comes with 2013 UHD TVs. This external box sends all source signals to its UHD TVs via a single cable and unique connector. Samsung tells HD Guru its next-generation Evolution Kit for its current UHD TVs will be an updated OneConnect box, and it will include HDMI 2.0 inputs.

Some manufacturers might not provide an HDMI 2.0 upgrade path at all. Entry-level UHD TVs from companies like Seiki and TCL appear to be locked into HDMI 1.4 and have given no indication to date of an HDMI 2.0 update path. Other UHD TV makers, including Toshiba*, Sharp, and LG, have not yet announced an upgrade path for their sets, if one even exists. Given the high prices that consumers have paid for first-generation UHD TVs, I imagine that these name-brand 4K TV makers may try to offer some way to keep those sets up to date with HDMI 2.0.


* 30-09-2013: Toshiba will allocate resources to large-screen Ultra HD (4K) LED LCD TVs, where growing demand is expected, and to differentiated functions for viewing and recording.
post #3526 of 3670
^ Thanks for posting that, I had missed it. I don't like the part about sets not being required to process the higher color bit depths, because buyers to whom it matters will have to find out whether their potential purchase can do 10 and 12 bit or not. I have seen several postings that say Sharp currently has the only 10-bit sets. No proof provided.
post #3527 of 3670
Quote:
Originally Posted by dsinger View Post
 
Quote:
Originally Posted by tgm1024 View Post

No, IMO it's almost certainly pure revenue generation.  I've seen it before from time to time and it's obnoxious.  In computer science, even the C language specification (I believe "C11" was the last one?) was pay-to-read and under tight copyright, completely without "fair use" exceptions.

My concern is that the revenue generation goes beyond the HDMI organization. Does the HDMI 2.0 spec allow a CEM to claim an HDMI 2.0 product if the product cannot pass ~18 Gbit/s?

 

That's a trademark issue.  DVD, Blu-ray, just about every "standard" you can think of is controlled as a trademark that is licensable so long as the vendor follows certain guidelines.

post #3528 of 3670
Quote:
Originally Posted by 8mile13 View Post

Sony has promised that the UHD displays it introduced in 55- and 65-inch sizes this year will receive a firmware update for HDMI 2.0. How is this possible? Since the HDMI 2.0 signal is electrically identical to HDMI 1.4, the change will be in how fast the chip operates. We speculate that Sony and others may have planned for HDMI 2.0 and used faster chips in their UHD displays to prepare for it.

 

This was never really in doubt, was it?  How HDMI 2.0 was to achieve a faster rate wasn't that much of a mystery---the line discipline isn't rocket science.  Jumping over the standards hurdle was tougher than the technological one.  The circuitry for 18 Gbps (or for whatever else you need) is something that they could build in ahead of time and wait for whatever demands are placed upon it later on when the standards are designed.

post #3529 of 3670
Just received an email announcement from Panasonic that their VIERA® 65" Class WT600 Series Ultra HD TV (64.5" Diag.)
Model: TC-L65WT600 is now available for order for $5,499.99.

4K 60p Input
HDMI (4K 60p Input)
DisplayPort™ (4K 60p Input)
4K Media Player
4K Fine Remaster Engine
4K Web Browser
4K Online Playback

http://shop.panasonic.com/shop/model/TC-L65WT600?sc_ec=epp-club_viera-televisions_4k-tv-beauty_pcec_1501471_10072013


So if you want one, be ready to 4K over a large bundle of cash to pay for it.wink.gif
post #3530 of 3670
Quote:
Originally Posted by greenland View Post

Just received an email announcement from Panasonic that their VIERA® 65" Class WT600 Series Ultra HD TV (64.5" Diag.)
Model: TC-L65WT600 is now available for order for $5,499.99.

4K 60p Input
HDMI (4K 60p Input)
DisplayPort™ (4K 60p Input)
4K Media Player
4K Fine Remaster Engine
4K Web Browser
4K Online Playback

http://shop.panasonic.com/shop/model/TC-L65WT600?sc_ec=epp-club_viera-televisions_4k-tv-beauty_pcec_1501471_10072013


So if you want one, be ready to 4K over a large bundle of cash to pay for it.wink.gif
Tempting, but still too small...
post #3531 of 3670
Quote:
Originally Posted by rightintel View Post

Tempting, but still too small...

After watching the L65WT600 run a gaming demo, I might actually go a bit smaller and start gaming again. The detail on this set is incredible. The wall of 4K was pretty impressive. The feature set sort of looks like the Sharp Elite series...hmmm
The price has to come down to a more reasonable level on these displays - and get some 4K content out there..
For a youtube video, the picture quality is excellent at 1080p.


If Panasonic could scale this panel up to 75 inches at a reasonable price point, I would think about buying one for gaming and movies...
post #3532 of 3670
I'm in for this at 75 inches or above as long as the PQ is good. Come on Panny! (or someone else)
post #3533 of 3670
Quote:
Originally Posted by myoda View Post

After watching the L65WT600 run a gaming demo, I might actually go a bit smaller and start gaming again. The detail on this set is incredible. The wall of 4K was pretty impressive. The feature set sort of looks like the Sharp Elite series...hmmm
The price has to come down to a more reasonable level on these displays - and get some 4K content out there..
For a youtube video, the picture quality is excellent at 1080p.


If Panasonic could scale this panel up to 75 inches at a reasonable price point, I would think about buying one for gaming and movies...

For the first few seconds, I thought this was an actual live car race.
post #3534 of 3670
Quote:
Originally Posted by coolscan View Post

Rec.2020 might as well stand for the year 2020 when it comes to any hope of seeing such a wide colorspace regularly used, or used at all. cool.gif
The color gamut scalability report from the last HEVC meeting mentions that several companies are planning production of displays that support the Rec.2020 color space. It may not happen next year, but I think we will see displays that support it by 2015.

Quote:
Originally Posted by dsinger View Post

As an aside, I hope Panny's chip doing 12 bit 422 or 8 bit 444 is an indication of what we can expect from 4k BD.
Just a guess but if it is released next year I think the 4K version of Blu-ray will support the Main 10 profile of HEVC (10-bit 4:2:0).
post #3535 of 3670
Food for thought...or should I say: gas on the fire...

I just finished a ~160" screen for my garage and - honest to god - at 20 feet I just cannot imagine seeing more detail. And I'm 20/15.

Put it this way: when I'm right up against the screen I obviously get screen door effect all over the place and can decipher the pixels. But it's hilarious how quickly that all goes out the window with every step you take back. I don't know for certain, but I'd bet by 15 feet MANY people would be at a loss to discern ANY of the aforementioned. I know by ~20 feet I'm HARD-PRESSED (ok, I don't think I really can, reliably). And this is a 157" screen, gang. Darn near TEN TIMES the area of a 50" display.

I suppose my experience nearly makes sense as 20/20 vision puts the "limit" of fully resolving 1080 at 20.5 feet (the THX recommendation is about 18 feet) on a screen this size, according to a number of the "experts" on what a person with "normal" vision can resolve. Yes I realize all of those on AVS with 20/10 or 20/5 can do much better. rolleyes.gif
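The 20.5-foot figure checks out against the usual small-angle arithmetic. A minimal sketch, assuming a 16:9 screen and the common 1-arcminute benchmark for 20/20 vision (individual acuity and content sharpness obviously vary):

```python
import math

# Rough "full resolution" viewing distance: how far can a viewer with
# 20/20 vision (~1 arcminute resolving power) sit before one 1080p pixel
# shrinks below what the eye can separate?
def full_resolve_distance_ft(diag_inches, rows=1080, aspect=16 / 9):
    height = diag_inches / math.sqrt(aspect**2 + 1)   # screen height, inches
    pixel_pitch = height / rows                       # one pixel, inches
    one_arcmin = math.radians(1 / 60)                 # ~0.000291 rad
    return pixel_pitch / one_arcmin / 12              # inches -> feet

print(f'157" 1080p screen: {full_resolve_distance_ft(157):.1f} ft')  # ~20.4 ft
print(f'50" 1080p screen:  {full_resolve_distance_ft(50):.1f} ft')   # ~6.5 ft
```

Because the distance scales linearly with screen size, the same math says a 50" 1080p set is "fully resolved" by about 6.5 feet, which is why the garage-screen experience at 15-20 feet tracks the published charts.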

I guess what I'm saying is that I'm more skeptical than ever about 4K and its purported benefits. I have found that the more you experience LARGE 1080 screens at even remotely reasonable viewing distances, the more inescapable these realities become.

Not that this weak iPhone pic of a motion-filled, compressed, up-scaled 720p broadcast tells much, but this was taken from about 15-17 feet...it IS fun, I'll tell ya that much. smile.gif



Just a guy in his garage with a lowly (but gigantic) 1080 screen. smile.gif

James
Edited by mastermaybe - 10/16/13 at 7:59am
post #3536 of 3670
Quote:
Originally Posted by mastermaybe View Post

Food for thought...or should I say: gas on the fire...

I just finished a ~160" screen for my garage and - honest to god - at 20 feet I just cannot imagine seeing more detail. And I'm 20/15.

Put it this way: when I'm right up against the screen I obviously get screen door effect all over the place and can decipher the pixels. But it's hilarious how quickly that all goes out the window with every step you take back. I don't know for certain, but I'd bet by 15 feet MANY people would be at a loss to discern ANY of the aforementioned. I know by ~20 feet I'm HARD-PRESSED (ok, I don't think I really can, reliably). And this is a 157" screen, gang. Darn near TEN TIMES the area of a 50" display.

I suppose my experience nearly makes sense as 20/20 vision puts the "limit" of fully resolving 1080 at 20.5 feet (the THX recommendation for such a size is about 18 feet) according to a number of the "experts" on what a person with "normal" vision can resolve. Yes I realize all of those on AVS with 20/10 or 20/5 can do much better. rolleyes.gif

I guess what I'm saying is that I'm more skeptical than ever about 4K and its purported benefits. I have found that the more you experience LARGE 1080 screens at even remotely reasonable viewing distances, the more inescapable these realities become.

Not that this weak iPhone pic tells much, but this was taken from about 15-17 feet...it IS fun, I'll tell ya that much. smile.gif


Just a guy in his garage with a lowly (but gigantic) 1080 screen. smile.gif

James

So your screen is ~ 12 ft wide, right? Quite a few people (myself included) like to sit fairly close to their screen, e.g., as close as ~ 1.0 screen width (and some even closer). So this would be ~ 12 ft in your case, much closer than the 15 or 20 ft you discussed. How do things look to you at this closer distance?
post #3537 of 3670
^ Bout 11' 4" wide. And respectfully, I don't know what either of us think "quite a few" constitutes, but I'm very confident that if I sat 20 people 11 feet from this screen at least 18 would think it was too close. There are always outliers, I realize that, but that's precisely what they are. At 18-20 feet the screen is absurdly large and completely enveloping to me. And if I were being really honest, it borders on too much, with my eyes tracking the action all the way across the screen, left to right and right to left.

I can try this tonight though...at least with a few folks, for World War Z. wink.gif I'm pretty confident that at least I can pick it apart pretty good that close, though.

Damn, lol, that is CLOSE. smile.gif

James
post #3538 of 3670
Quote:
Originally Posted by mastermaybe View Post

^ Bout 11' 4" wide. And respectfully, I don't know what either of us think "quite a few" constitutes, but I'm very confident that if I sat 20 people 11 feet from this screen at least 18 would think it was too close. There are always outliers, I realize that, but that's precisely what they are. At 18-20 feet the screen is absurdly large and completely enveloping to me. And if I were being really honest, it borders on too much, with my eyes tracking the action all the way across the screen, left to right and right to left.

I can try this tonight though...at least with a few folks, for World War Z. wink.gif I'm pretty confident that at least I can pick it apart pretty good that close, though.

Damn, lol, that is CLOSE. smile.gif

James

I recall someone saying they like a viewing distance of 0.6 SW, but that is too close for me! (I sit ~ 11 ft from my screen, which is 11.3 ft wide for 16x9 pics, and 12 ft W for 2.35).

I certainly didn't start out this close, but it does grow on one. But I agree that it is a very individual preference.
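Not to get too nerdy about it, but those screen-width ratios translate directly into horizontal viewing angles with plain geometry. A quick sketch (Python; the only assumption is an on-axis, flat screen):

```python
import math

def fov_at_screen_widths(sw):
    """Horizontal field of view (degrees) when sitting `sw` screen-widths
    from a flat screen, viewed on-axis."""
    return math.degrees(2 * math.atan(0.5 / sw))

# 0.6 SW is huge, 1.0 SW is roughly "cinema front third",
# 2.0 SW is a typical living-room TV distance.
for sw in (0.6, 1.0, 1.5, 2.0):
    print(f"{sw} SW -> {fov_at_screen_widths(sw):.1f} degrees")
```

So 0.6 SW works out to roughly an 80-degree field of view, and 1.0 SW to about 53 degrees, which is why the closer rows feel so enveloping.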
post #3539 of 3670
My fridge is off to the right of the screen (just out of my pic, about 10-12 feet from the screen) and when I need to get up (sigh, I need to do something about that) if I keep watching the screen I can't even take it all in, but I'll try it once sitting down for an entire film...if I can take it. wink.gif

Yep, it can be very subjective...lotta humans, lotta preferences. The .6 thing is nearly unbelievable, but at this point in my life I suppose I'll believe anything.

James
post #3540 of 3670
Quote:
Originally Posted by mastermaybe View Post

Put it this way: when I'm right up against the screen I obviously get screen door effect all over the place and can decipher the pixels. But it's hilarious how quickly that all goes out the window with every step you take back. I don't know for certain, but I'd bet by 15 feet MANY people would be at a loss to discern ANY of the aforementioned. I know by ~20 feet I'm HARD-PRESSED (ok, I don't think I really can, reliably). And this is a 157" screen, gang. Darn near TEN TIMES the size of a 50" display.
A few things about this:

1. What kind of projector is it? Single chip DLPs are the only sharp projectors out there. Anything else puts out a soft image due to the panel type and convergence issues. Guess which of these four projectors is a DLP. (the other three are all JVC projectors)
Cheaper projectors have softer optics than high-end ones, even when the specs otherwise look the same. Even then, a single-chip DLP is still nowhere near as sharp as a flat panel.

2. Being able to make out screen door is not necessarily the same as being able to make out individual pixels. A lot of that has to do with the panels used and how large the gaps between each pixel are. Screen door is far less than a pixel in size. Poor optics can help mask screen door at the cost of sharpness.

3. 20ft from a 160" screen is only a 32 degree viewing angle. That's a narrower viewing angle than I get from my TV. The furthest I'd want to be from that screen is 14ft, and I'd prefer 12 (a ~50 degree FoV). The screen is physically large and your eye can tell this because it's focused 20ft away, but it's only filling a small portion of your vision.
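Those viewing-angle numbers are easy to verify; a quick sketch of the trigonometry (Python, assuming 16:9 geometry and on-axis viewing):

```python
import math

def viewing_angle_deg(diagonal_in, distance_ft, aspect=16 / 9):
    """Horizontal field of view (degrees) for a flat screen viewed on-axis."""
    # Screen width from the diagonal and aspect ratio.
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    distance_in = distance_ft * 12
    return math.degrees(2 * math.atan(width_in / 2 / distance_in))

print(f"{viewing_angle_deg(160, 20):.0f} degrees")  # 160" screen at 20 ft -> ~32
print(f"{viewing_angle_deg(160, 12):.0f} degrees")  # 160" screen at 12 ft -> ~52
```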

Quote:
Originally Posted by mastermaybe View Post

I suppose my experience nearly makes sense as 20/20 vision puts the "limit" of fully resolving 1080 at 20.5 feet (the THX recommendation is about 18 feet) on a screen this size, according to a number of the "experts" on what a person with "normal" vision can resolve. Yes I realize all of those on AVS with 20/10 or 20/5 can do much better. rolleyes.gif
Most of the claims on AVS about vision and what the eye can resolve are misinformed. That awful Carlton Bale chart gets posted here all the time. Here's one that's actually sourced from research:

[attached chart: Y790Yz9s.png]
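For what it's worth, the oft-quoted "~20.5 ft for a screen that size" figure comes straight out of the simple one-arcminute-per-pixel rule of thumb for 20/20 vision, which is exactly the kind of oversimplification that chart pushes back on. A quick sketch of the arithmetic (Python; 16:9 geometry and the 1 arcmin threshold are the assumptions):

```python
import math

def max_resolving_distance_ft(diagonal_in, vertical_px=1080, aspect=16 / 9,
                              arcmin_per_px=1.0):
    """Distance beyond which one pixel subtends less than `arcmin_per_px`
    arcminutes -- the conventional (and debatable) 20/20 acuity threshold."""
    height_in = diagonal_in / math.hypot(aspect, 1)  # screen height from diagonal
    px_in = height_in / vertical_px                  # physical pixel height
    distance_in = px_in / math.tan(math.radians(arcmin_per_px / 60))
    return distance_in / 12

print(f"{max_resolving_distance_ft(157):.1f} ft")  # 157" 1080p screen -> ~20.4
```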

Quote:
Originally Posted by mastermaybe View Post

...it IS fun, I'll tell ya that much.
They certainly are. Even though there are many hassles in getting a projector set up, and they're inferior to flat panels in almost every way but size, I definitely got more enjoyment out of even my first $500 projector setup than out of a high-end flat panel with near-perfect image quality. I really miss having a projector; my current place isn't really suited to one.
Quote:
Originally Posted by mastermaybe View Post

^ Bout 11' 4" wide. And respectfully, I don't know what either of us think "quite a few" constitutes, but I'm very confident that if I sat 20 people 11 feet from this screen at least 18 would think it was too close.
With my previous setup, there was only one row of seating which was just over one screen width away and no-one ever commented on the image being too large or sitting too close. Quite the opposite in fact, even from people who had their own projection setups which were more like yours. (too small for me)

I think what helped was that I had done a proper job totally blacking out the rest of the room - you couldn't even see the side walls, or the fact that the screen was less than an inch from them on either side.
That's not recommended as it will normally kill your contrast performance, but I had velvet panels with carefully selected fabric that all but eliminated any reflections. That room would probably have felt claustrophobic if it had lighter walls and you could see that the screen was so tightly packed in there.