UHD/4K Quandary: To Buy or Not to Buy - Page 77 - AVS Forum | Home Theater Discussions And Reviews
post #2281 of 2384, 07-17-2015, 08:46 PM
sarahb75 (Senior Member)
Twilight Time Does Not Appear To Be A Company That Creates Artificial Scarcity

Quote:
Originally Posted by EdwinB View Post
Well, that is not a fair comparison, is it?

"A Man For All Seasons" did not sell more than 3,000 copies because they limited the release to 3,000 copies in the first place. These limited releases also drive up prices of the titles compared to normal releases because they become scarce rather quickly, and this may even be a conscious move by the publisher. However, due to the higher prices a lot of people will lose interest and the title only becomes interesting for the "real" collector who "loves" a specific title.

And does anyone really expect a catalog title over 40 years old to sell anywhere near the numbers of a recent blockbuster?
You certainly make an undeniable point, EdwinB: A Man For All Seasons could not sell more than the 3,000 Blu-ray copies that Twilight Time, the company manufacturing it, set for its production run. But since Twilight Time has a lot of experience manufacturing and selling catalog titles on Blu-ray, and can choose any size for a production run, its decision to produce 3,000 copies would seem to be based on its best estimate of what could be sold with any reasonable marketing effort. I'm sure that, like any other company motivated by profit, Twilight Time would want to put out the maximum number of copies it felt it could sell, to generate the maximum amount of profit.

And if your theory that the company deliberately creates scarcity to boost prices were valid, I think Twilight Time would not be selling Blu-rays of movies like A Man For All Seasons, Khartoum, and 1959's original version of Journey to the Center of the Earth, all at the relatively reasonable price of $29.95 apiece. True scarcity would produce prices like the $69.99 that Amazon is charging for just a new DVD of my favorite comedy from the 1970s, Where's Poppa, which starred George Segal (a popular actor for comedies of that era) and a hilarious Ruth Gordon. If someone were trying to create scarcity by making only 3,000 Blu-rays of a movie, and then charged just $29.95 for those Blu-rays, as Twilight Time did for AMFAS, it certainly would not be a scheme designed to bring in boatloads of money, or even to derive the maximum profit possible from such a release. People who know and appreciate AMFAS, like yours truly, don't mind paying $29.95 for such a great movie that had never been on Blu-ray, so Twilight Time would have been pretty stupid to limit production to 3,000 units if it had any reason to believe that consumer demand would generate substantially more sales than that.

Of course, your theory could yet prove correct if Twilight Time ends up doing a second production run of AMFAS and charges people who missed out on ordering from the first run a much higher price for the movie. But from what I know of Twilight Time's history, in the few cases where the company responded to consumer demand for a discontinued title by doing another production run, the price was not raised.
post #2282 of 2384, 07-17-2015, 11:08 PM
sarahb75 (Senior Member)
Quote:
Originally Posted by ray0414 View Post
my answers highlighted in red, sorry for making it so crazy looking.




when UHD blu rays hit, i definitely want to see if there's a difference in quality. but the downloads look good enough to where it may not matter. but either way, i think UHD/HDR is here to stay. the people with access to it will support it 100%. but another key issue will be remastering. older movies will not look as good as newer movies shot with better cameras. but that's okay, you still have an HDR copy with blu ray.
Thank you very much for answering my questions, ray0414. Now that I've read your comments about the specific comparisons you made between UHD/HDR movies on the internet and the Blu-ray versions, it seems even more likely to me that the upcoming UHD BD format is going to have a very tough time competing. And I'm glad that you mentioned Life of Pi. Although I did not see Life of Pi at the movies, some friends who did said that the excellent Blu-ray of it, as shown in our home theater, surprised them by how relatively close in quality it seemed to the way the movie looked at our favorite Cinemark theater. And since you find that the UHD/HDR version of Life of Pi that's available on the internet looks much better than the already excellent Blu-ray, there can't be very much room left for improvement beyond what you are seeing with those UHD/HDR files. To me, this indicates that any superiority UHD Blu-ray may be able to offer over those UHD/HDR files is very likely to be a matter of hair-splitting that only makes a difference to the most critical video enthusiasts. 99% of people will probably not notice any difference between the stunning UHD/HDR versions of movies you are talking about and movies in the UHD BD format.

Yeah, ray0414, as I continue to read more about the experiences of people like yourself, it just seems more and more certain that UHD Blu-ray has a snowball's chance in hell of being around for very long. There simply won't be enough people like me, who insist on collecting movies in disc form, to allow UHD BD to prosper. Most people I know who used to buy movies on disc have given up on the practice, and consider me to be out of step with the times.
post #2283 of 2384, 07-18-2015, 10:00 AM
losservatore (AVS Forum Special Member)
Those pictures remind me of the link below about comparing similar pictures with my VT60.

post #114 of 139

I don't have this movie so I can't do a comparison, but you get my point?

post #2284 of 2384, 07-18-2015, 01:23 PM
NetworkTV (AVS Forum Addicted Member)
Quote:
Originally Posted by sarahb75 View Post
You certainly make an undeniable point, EdwinB, that A Man For All Seasons could not sell more than the 3000 number of Blu-ray copies that the company manufacturing it, Twilight Time, set for its production run. But since Twilight Time has a lot of experience manufacturing and selling catalog titles on Blu-ray, and can easily choose to include any number of copies in a production run, it would seem that the company's decision to produce 3,000 copies would have been based on its best estimate of what could be sold with any reasonable marketing effort. I'm sure that like any other company that's motivated by a desire to make profits, Twilight Time would want to put out the maximum number of copies that they feel they could sell, to generate the maximum amount of profit.
They sell pretty much all their releases with a 3,000-copy limit. If what you said were true, that number would be larger or smaller based on researched demand for a particular film.

What I'm sure the real story is: the 3,000-copy figure is based on a blanket license they negotiate for the rights to the movies they distribute.

Quote:
And if your theory that the company deliberately creates scarcity to boost prices was valid, I think Twilight Time would not be selling Blu-rays of movies like A Man For All Seasons, Khartoum, and 1959's original version of Journey to the Center of the Earth, all at the relatively reasonable price of $29.95 apiece. True scarcity would cause prices like the $69.99 price that Amazon is charging for just a new DVD of my favorite comedy from the 1970s, Where's Poppa, which starred George Segal (a popular actor for comedies of that era) and a hilarious Ruth Gordon. If someone is trying to create scarcity by only making 3,000 Blu-rays of a movie, and then only charges a $29.95 price for those Blu-rays, like Twilight Time did for AMFAS, it certainly is not a scheme that's designed to bring in boatloads of money, or even to just derive the maximum profit possible from such a release. People that know and appreciate AMFAS, like yours truly, don't mind paying $29.95 for such a great movie, that had never been on Blu-ray, so Twilight Time would have been pretty stupid to limit production to 3,000 units if it had any reason to believe that consumer demand would be likely to generate a substantially greater number of sales than that.
$29.95 is pricey. I would never pay that for a bare-bones, non-special-edition release. For $29.95, it had better have one or more of the following:

- A second disc chock full of special features
- A DVD copy
- a 3D version

The idea that anyone, anywhere could get $69.99 for a movie-only edition these days (at least until the eBay sellers buy it out) is silly. This isn't 1986, when VHS movies sold for $100. It also isn't 2009, when BDs were $25 and up for every title. Now, $18 is about the most you'll pay for a single-disc edition of any movie on BD. The only way they could get close to $70 is to cut the number available by at least two-thirds to ensure a frenzy.

Quote:
Of course, your theory could yet prove to be correct, if Twilight Time would end up doing a 2nd production run of AMFAS, while charging people who missed out on ordering from the first run, a much higher price for the movie. But from what I know of Twilight Time's history, in the few cases where the company responded to some consumer demand for a discontinued title by doing another production run, the price was not raised.
They've already done (or are about to do) second runs of a couple of movies that were supposed to be "limited editions". Those were movies that sold out, only to end up mostly resold on eBay.
post #2285 of 2384, 07-19-2015, 05:36 AM
sarahb75 (Senior Member)

Quote:
Originally Posted by NetworkTV View Post
They sell pretty much all their releases with a 3,000-copy limit. If what you said were true, that number would be larger or smaller based on researched demand for a particular film.

What I'm sure the real story is: the 3,000-copy figure is based on a blanket license they negotiate for the rights to the movies they distribute.

$29.95 is pricey. I would never pay that for a bare-bones, non-special-edition release. For $29.95, it had better have one or more of the following:

- A second disc chock full of special features
- A DVD copy
- a 3D version

The idea that anyone, anywhere could get $69.99 for a movie-only edition these days (at least until the eBay sellers buy it out) is silly. This isn't 1986, when VHS movies sold for $100. It also isn't 2009, when BDs were $25 and up for every title. Now, $18 is about the most you'll pay for a single-disc edition of any movie on BD. The only way they could get close to $70 is to cut the number available by at least two-thirds to ensure a frenzy.

They've already done (or are about to do) second runs of a couple of movies that were supposed to be "limited editions". Those were movies that sold out, only to end up mostly resold on eBay.
NetworkTV, I hear what you're saying about how outrageous the idea is of a movie-only disc selling for $69.99. And that disc of the 1970 dark comedy Where's Poppa that's priced at $69.99 on Amazon isn't even a Blu-ray; it's a DVD. However, Where's Poppa is a real cult film, and while it's no doubt a pretty small number of people, I can assure you that there are devoted fans of the film who have no problem paying $69.99 (plus shipping) to get the movie if that's the only way they can obtain it. If I had not brought my bank account down so low by buying 143 Blu-rays in the last 9 months, I know that I'd be ordering Where's Poppa off of Amazon, even though I don't usually like to settle for buying films on DVD anymore.

In fact, after carefully going over our bills and the remaining money in the bank today, there's a better than even chance that I'm going to jump on Amazon and order that $69.99 DVD of Where's Poppa anyway. I'm crazy about that movie, and it's so outrageous that when I saw it 4 times at the movies in 1970-71, there were scenes that would suddenly crack people up so much that audience members were literally choking on their popcorn from laughing. In all the years I've gone to movies, the audience reaction to Where's Poppa is probably the loudest I've ever heard.
post #2286 of 2384, 07-20-2015, 06:06 AM
donhonk (Member)
Personally? Until Blade Runner is on UHD Blu-ray, I'm fine with 1080p. Once that's out? I'll be compelled to upgrade.
post #2287 of 2384, 08-03-2015, 09:10 AM
Rudy1 (AVS Forum Special Member)
"The race for 4K ultra HD TV household penetration and expansion in other areas is on"

http://4k.com/news/the-race-for-4k-u...as-is-on-8469/
post #2288 of 2384, 08-12-2015, 04:39 AM
daniel1967 (Newbie)
I will definitely be upgrading most of my blu-rays to the new format.
post #2289 of 2384, 08-13-2015, 04:22 AM
sarahb75 (Senior Member)
People May Be Disappointed When Many Films Display Little Improvement On UHD BD

Quote:
Originally Posted by daniel1967 View Post
I will definitely be upgrading most of my blu-rays to the new format.
You may want to upgrade a smaller proportion of your Blu-rays to UHD BD than you think now, daniel1967, once you actually compare how your upconverted 1080p Blu-rays look next to the UHD BD versions.

This month's Sound & Vision (the September issue) contains a very interesting statement in its test report on Samsung's top-of-the-line UN65JS9500 UHD TV (a 65 inch, so-called SUHD model that sells for $5,000 on Amazon and is one of the first UHD TVs with HDR capability).

That Samsung happens to be compatible with Sony's UHD hard-drive movie player, which was used to display UHD material on what the article's author called the best-performing LCD full-array local-dimming UHD TV he has ever seen.

As we all know, UHD material off of a hard drive is superior to that provided by streaming services because it suffers from far less compression. Even so, the test report's author stated that he has seen upconverted 1080p Blu-rays that look as good as, and in some cases superior to, all of the UHD material that he watched off of the hard drive.

Now, I fully realize that the UHD Blu-ray format is promised to add HDR capability, which virtually all video experts say makes a much more noticeable improvement over 1080p Blu-ray than 4K resolution does.

But there are several obstacles that could prevent HDR from helping UHD Blu-ray have a successful launch.

1. Although we are in the third year that UHD TVs have been sold to consumers, only a couple of HDR-capable models have recently hit the market, and at such high prices that few consumers buying new UHD TVs are even getting one that can handle HDR. So when UHD Blu-ray finally does hit the market, very, very few people will be able to take advantage of HDR.

2. Most of the early movie releases on UHD Blu-ray disc will not have been reprocessed for HDR anyway. For most transfers to UHD BD (especially catalog titles), scans will simply be taken from the existing digital files of the movies, without HDR reprocessing, because additional HDR processing means more expense for the studios, with a relatively tiny audience of UHD BD consumers to spread those costs over, and with most consumers not owning the top-of-the-line UHD TVs that would let them see HDR anyway.

3. The industry will have a few not-totally-compatible HDR formats competing against each other. Many observers claim Dolby Vision gives the best results, but the top TV brand, Samsung, is going with a different format that is capable of much less impressive results than Dolby Vision. (BTW, the first two movies the industry announced as being processed for HDR, The Lego Movie and Edge of Tomorrow, are being done in Dolby Vision.)

Joe Kane is usually considered the top expert on video matters. He has often said that UHD needs an improvement like HDR to really succeed, because resolution alone offers too small an improvement over 1080p unless it is viewed on large front-projection screens, or from much closer than most people will ever position their TVs from their sofas in their living rooms. (For folks with 20/20 vision, every major video expert has stated that a seating distance of 1.5 screen heights or closer is required to see all the detail that UHD provides. That's just slightly under 4 feet from the screen of a 65 inch UHD TV.)
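
For anyone who wants to check that seating-distance math, here is a quick back-of-the-envelope sketch in Python. It assumes a 16:9 panel and simply applies the 1.5-screen-height rule of thumb quoted above, so adjust the numbers for your own setup.

Code:
import math

def max_viewing_distance_ft(diagonal_in, heights=1.5, aspect=(16, 9)):
    """Distance (feet) you need to sit at to be within `heights`
    screen heights of a panel with the given diagonal."""
    w, h = aspect
    screen_height_in = diagonal_in * h / math.hypot(w, h)  # height from diagonal
    return heights * screen_height_in / 12.0

# A 65 inch 16:9 panel is about 31.9 inches tall, so 1.5 screen heights
# works out to roughly 4 feet, as stated above.
print(round(max_viewing_distance_ft(65), 2))   # -> 3.98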

And my thought is that, considering that a very tiny percentage of UHD TV owners will have HDR capability in their TVs when UHD Blu-ray hits the market, most of those folks may be very disappointed in how little improvement, if any, they notice UHD Blu-ray discs offering over the same movies on upconverted 1080p Blu-ray. And if folks really want to see a difference, they might have to be ready to set up their 65 inch UHD TVs 4 feet in front of their couches. (Yeah, like my wife would be happy with that. Then I'd be sure to hear that she had a headache at bedtime.)

Plus, we all need to keep in mind that it is highly unlikely that movies on UHD Blu-ray disc will look as amazingly pristine as the 5-minute UHD demo clips that companies like Samsung use to show off their TVs in stores. Those short video clips are produced at incredibly high bit rates that even UHD Blu-ray could never hope to maintain over a 90-minute or 2-hour movie.
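
To put rough numbers on that, here is a small Python sketch of the average bit rate a disc could sustain across a whole movie. The 66 GB and 100 GB capacities are the figures that have been reported for the UHD BD spec, so treat them as assumptions rather than gospel.

Code:
def avg_bitrate_mbps(disc_gb, runtime_min):
    """Average bit rate (Mbps) if a movie filled the whole disc."""
    bits = disc_gb * 1e9 * 8          # disc capacity in bits
    return bits / (runtime_min * 60) / 1e6

# Reported dual-layer (66 GB) and triple-layer (100 GB) UHD BD discs:
for gb in (66, 100):
    for minutes in (90, 120):
        print(f"{gb} GB disc, {minutes} min movie: "
              f"{avg_bitrate_mbps(gb, minutes):.0f} Mbps average")
# A 5-minute store demo clip only has to sustain its (much higher) rate
# for a few minutes, which is part of why it can look so pristine.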

post #2290 of 2384, 08-13-2015, 08:38 AM
hernanu (AVS Forum Special Member)
Quote:
Originally Posted by sarahb75 View Post
[...]
But there are several obstacles that could prevent HDR from helping UHD Blu-ray have a successful launch.

1. Although we are in the third year that UHD TVs have been sold to consumers, only a couple of HDR-capable models have recently hit the market, and at such high prices that few consumers buying new UHD TVs are even getting one that can handle HDR. So when UHD Blu-ray finally does hit the market, very, very few people will be able to take advantage of HDR.
I'd disagree with this. Vizio and Dolby have an agreement to support the Dolby version of HDR in Vizio's upcoming panels. Given Vizio's aggressive pricing, I think (personally) that the number of people with Dolby HDR capable TV's will outpace the market for UHD blurays.

Quote:
Originally Posted by sarahb75 View Post
2. Most of the early movie releases on UHD Blu-ray disc will not have been reprocessed for HDR anyway. For most transfers to UHD BD (especially catalog titles), scans will simply be taken from the existing digital files of the movies, without HDR reprocessing, because additional HDR processing means more expense for the studios, with a relatively tiny audience of UHD BD consumers to spread those costs over, and with most consumers not owning the top-of-the-line UHD TVs that would let them see HDR anyway.
Again, different studios have committed to the conversion, and seem to be in the process of generating these. I don't think the UHD BD buyers will be driving HDR development; rather, streaming houses like Vudu (also partnering with Dolby) will be much more of a player, and UHD BD will benefit.

Quote:
Originally Posted by sarahb75 View Post
3. The industry will have a few not-totally-compatible HDR formats competing against each other. Many observers claim Dolby Vision gives the best results, but the top TV brand, Samsung, is going with a different format that is capable of much less impressive results than Dolby Vision. (BTW, the first two movies the industry announced as being processed for HDR, The Lego Movie and Edge of Tomorrow, are being done in Dolby Vision.)
The comparison may be true, and some companies are aligned with one type of UHD/HDR vs. another, which is why it may be better to wait until the dust settles (IMO). However, when you consider that most people will buy the less expensive TV, Samsung vs. Vizio may be tilted towards Vizio.

Quote:
Originally Posted by sarahb75 View Post
Joe Kane is usually considered the top expert on video matters. He has often said that UHD needs an improvement like HDR to really succeed, because resolution alone offers too small an improvement over 1080p unless it is viewed on large front-projection screens, or from much closer than most people will ever position their TVs from their sofas in their living rooms. (For folks with 20/20 vision, every major video expert has stated that a seating distance of 1.5 screen heights or closer is required to see all the detail that UHD provides. That's just slightly under 4 feet from the screen of a 65 inch UHD TV.)
The human eye is much more capable than 1080p. The way we view things is more complex than just a single point of reference. We have two eyes, which move to construct a scene. In the reference, it is more like 6K to 8K resolution at reasonable viewing distances that saturates the eye.
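
For a rough sense of those numbers, here is a simple single-fixation sketch in Python. It assumes the usual 20/20 figure of about one arcminute per pixel and a 16:9 screen, and it deliberately ignores the eye-movement point above, so read it as a lower bound rather than a claim about what the eye can really do.

Code:
import math

def pixels_to_saturate_acuity(diagonal_in, distance_ft,
                              arcmin_per_pixel=1.0, aspect=(16, 9)):
    """Horizontal and vertical pixel counts at which one pixel spans
    about `arcmin_per_pixel` of visual angle (roughly 20/20 acuity)."""
    w, h = aspect
    diag = math.hypot(w, h)
    width_in, height_in = diagonal_in * w / diag, diagonal_in * h / diag
    dist_in = distance_ft * 12.0

    def pixels(size_in):
        angle_deg = 2 * math.degrees(math.atan(size_in / (2 * dist_in)))
        return int(angle_deg * 60 / arcmin_per_pixel)

    return pixels(width_in), pixels(height_in)

# A 65 inch screen seen from 8 feet vs. 4 feet:
print(pixels_to_saturate_acuity(65, 8))   # ~ (1970, 1130) -> about 1080p
print(pixels_to_saturate_acuity(65, 4))   # ~ (3670, 2200) -> about 4K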

Quote:
Originally Posted by sarahb75 View Post
And my thought is that, considering that a very tiny percentage of UHD TV owners will have HDR capability in their TVs when UHD Blu-ray hits the market, most of those folks may be very disappointed in how little improvement, if any, they notice UHD Blu-ray discs offering over the same movies on upconverted 1080p Blu-ray. And if folks really want to see a difference, they might have to be ready to set up their 65 inch UHD TVs 4 feet in front of their couches. (Yeah, like my wife would be happy with that. Then I'd be sure to hear that she had a headache at bedtime.)

Plus, we all need to keep in mind that it is highly unlikely that movies on UHD Blu-ray disc will look as amazingly pristine as the 5-minute UHD demo clips that companies like Samsung use to show off their TVs in stores. Those short video clips are produced at incredibly high bit rates that even UHD Blu-ray could never hope to maintain over a 90-minute or 2-hour movie.
There is always a dependency on a good transfer. Some will look great, some will come out horrible (Fifth Element in blu) and have to be reissued to be good.

We are still very early in this. I believe the game changer is not resolution but HDR capability, and the arena where that will be fought out won't be physical discs, but streaming.

The UHD blus will benefit from this, but especially when Vizio drops low priced HDR capable screens into the market with Vudu or Amazon HDR content, there will be plenty of target screens. The real question is whether the studios will want to make many UHD bluray disks if they are getting a profit from streaming.

I hope so, since I'm planning on an Oppo UHD bluray player purchase, sometime during 2017.
post #2291 of 2384, 08-13-2015, 09:41 AM
GregLee (AVS Forum Special Member)
Quote:
Originally Posted by sarahb75 View Post
Even so, the test report's author stated that he has seen upconverted 1080p Blu-rays that look as good as, and in some cases superior to, all of the UHD material that he watched off of the hard drive.
People underestimate the good effects of upconversion. I don't mean real people in general, but people here, posting in this forum. It's because they are wedded to this idea that the best picture is the one that reproduces most faithfully the video signal. According to that odd philosophy, any upconversion is a degradation, because it displays things that are not there in the video signal. It's odd, because (1) we really want the display to resemble the original scene, not the signal, and (2) it neglects the possibility that upconversion can repair damage that was done to the image of a scene by the processing that went on between camera and TV screen.

Here are two ways that 4k video enables repair work to display a superior picture: A. apparent color depth can be increased by using dither to capitalize on the extra pixels available. (I've already argued this here in other posts.) B. 4k is better than 2k at simulating bright highlights, by turning up contrast/backlight on LED sets that can show a lot of brightness. This is because real highlights are small in area, and a 4k picture can confine an area with enhanced brightness to a smaller screen area.

These advantages are due to having 4k pixels on the screen, not necessarily in the signal, so they are potentially available in upconverted displays. Well, they are, at least, if you don't get your TV calibrated by a technician who insists on setting up the TV to produce the standard peak brightness of 100 nits.
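
To make point (A) concrete, here is a toy numpy sketch. It is my own illustration, not a description of what any particular TV actually does: each source pixel becomes a 2x2 block on the 4K panel, and the rounding to 8-bit codes is spread across the four sub-pixels so the block's average can land on levels that a flat 2K rounding cannot hit.

Code:
import numpy as np

ramp_10bit = np.arange(0, 1024, dtype=np.float64)   # 10-bit source levels
target = ramp_10bit / 4.0                            # ideal value on an 8-bit scale

# Plain quantization (one pixel per source pixel): round to the nearest 8-bit code.
flat = np.round(target)

# 2x2 "dither": raise 0..4 of the four sub-pixels by one code, depending on the
# fractional part, so the block average approximates the 10-bit value.
frac = target - np.floor(target)
raised = np.round(frac * 4)                          # sub-pixels bumped up by one code
block_avg = np.floor(target) + raised / 4.0

print("max error, plain rounding:", np.abs(flat - target).max())       # 0.5 code
print("max error, 2x2 dithering :", np.abs(block_avg - target).max())  # 0.0 code

At normal seating distance the eye does the averaging over the block, which is the whole trick; whether a given set's processing actually exploits this is a separate question.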

Greg Lee
post #2292 of 2384, 08-13-2015, 02:17 PM
sarahb75 (Senior Member)
When, If Ever, Will Vizio Turn Its Dolby Vision Prototypes Into Real Products?

Quote:
Originally Posted by hernanu View Post
I'd disagree with this. Vizio and Dolby have an agreement to support the Dolby version of HDR in Vizio's upcoming panels. Given Vizio's aggressive pricing, I think (personally) that the number of people with Dolby HDR capable TV's will outpace the market for UHD blurays.



Again, different studios have committed to the conversion, and seem to be in the process of generating these. I don't think the UHD BD buyers will be driving HDR development; rather, streaming houses like Vudu (also partnering with Dolby) will be much more of a player, and UHD BD will benefit.



The comparison may be true, and some companies are aligned with one type of UHD/HDR vs. another, which is why it may be better to wait until the dust settles (IMO). However, when you consider that most people will buy the less expensive TV, Samsung vs. Vizio may be tilted towards Vizio.



The human eye is much more capable than 1080p. The way we view things is more complex than just a single point of reference. We have two eyes, which move to construct a scene. In the reference, it is more like 6K to 8K resolution at reasonable viewing distances that saturates the eye.



There is always a dependency on a good transfer. Some will look great, some will come out horrible (Fifth Element in blu) and have to be reissued to be good.

We are still very early in this. I believe the game changer is not resolution but HDR capability, and the arena where that will be fought out won't be physical discs, but streaming.

The UHD blus will benefit from this, but especially when Vizio drops low priced HDR capable screens into the market with Vudu or Amazon HDR content, there will be plenty of target screens. The real question is whether the studios will want to make many UHD bluray disks if they are getting a profit from streaming.

I hope so, since I'm planning on an Oppo UHD bluray player purchase, sometime during 2017.

There is a major problem with your contention, hernanu, that Vizio is going to be a significant factor in the early days of UHD Blu-ray by providing inexpensive UHD TVs that are Dolby Vision equipped. At present, Vizio only plans to include Dolby Vision in its 2 Reference Series models, a 65 inch and a huge (and, no doubt, extremely expensive) 120 inch model. As unlikely as it may seem, Vizio hopes to be thought of as a premium brand. (That is, if their Reference Series ever sees the light of day. The continuing delays are a shame, because the Reference Series prototypes really wowed people at the Jan. 2014 CES. But then again, it's not unusual to see a new technology exhibited at a trade show, and later find that the company that showed off the potential new product could not manage to bring it to market at a practical cost.)

These 2 Vizio Reference Series sets have been promised to be arriving soon ever since 2014. In fact it had been claimed that they would be available in Fall 2014. Vizio reps had said that they were aiming for an introductory price of $4000 for the 65 inch set, but later said that had been a premature estimate.

The vast majority of people who are willing to put down money for a new UHD TV want to spend far less than $4000, especially when you see Samsung 65 inch UHD TVs selling for as low as $1999 in the Sunday newspaper ads for stores like HH Gregg and Best Buy.

So since Vizio is having so much trouble even fielding its first Dolby Vision TVs I would not count on Dolby Vision equipped product from that company having an impact in the early days of UHD Blu-ray. And even if the company could manage to soon introduce such a 65 inch TV at $4000, not too many people will be lined up to pay that much for a Vizio. (they'd much rather buy 2 non-HDR equipped Samsung 65 inch UHD TVs for the same amount of $.)

And with people being careful with their money these days, you can just bet that for every one of those expensive HDR capable UHD TVs that Samsung is selling, the company is selling 10 of either its $2000 65 inch UHD TVs or its $1100 55 inch UHD TVs. (just saw the last price in a trip to BJ's)

So I really stand by my statement that during the early introductory period of UHD Blu-ray very, very few owners of UHD TVs will have models that can do HDR. We are at least 1-2 years away from HDR equipped UHD TVs, from respected brands, coming close to price levels that won't turn off the largest majority of even UHD TV buyers.

And, BTW, the modestly priced models most people are buying simply don't have enough brightness built into their backlighting to ever provide HDR brightness levels, so it will not be possible to use firmware updates to give most TVs HDR capability if they didn't come with it in the first place. That would be like transforming your Honda's engine that maxes out at 200 horsepower, into one that can put out 400 horsepower, simply by sending a firmware update to your car's computer.
post #2293 of 2384, 08-14-2015, 07:36 PM
hernanu (AVS Forum Special Member)
Quote:
Originally Posted by sarahb75 View Post
There is a major problem with your contention, hernanu, that Vizio is going to be a significant factor in the early days of UHD Blu-ray by providing inexpensive UHD TVs that are Dolby Vision equipped. At present, Vizio only plans to include Dolby Vision in its 2 Reference Series models, a 65 inch and a huge (and, no doubt, extremely expensive) 120 inch model. As unlikely as it may seem, Vizio hopes to be thought of as a premium brand. (That is, if their Reference Series ever sees the light of day. The continuing delays are a shame, because the Reference Series prototypes really wowed people at the Jan. 2014 CES. But then again, it's not unusual to see a new technology exhibited at a trade show, and later find that the company that showed off the potential new product could not manage to bring it to market at a practical cost.)

These 2 Vizio Reference Series sets have been promised to be arriving soon ever since 2014. In fact it had been claimed that they would be available in Fall 2014. Vizio reps had said that they were aiming for an introductory price of $4000 for the 65 inch set, but later said that had been a premature estimate.

The vast majority of people who are willing to put down money for a new UHD TV want to spend far less than $4000, especially when you see Samsung 65 inch UHD TVs selling for as low as $1999 in the Sunday newspaper ads for stores like HH Gregg and Best Buy.

So since Vizio is having so much trouble even fielding its first Dolby Vision TVs I would not count on Dolby Vision equipped product from that company having an impact in the early days of UHD Blu-ray. And even if the company could manage to soon introduce such a 65 inch TV at $4000, not too many people will be lined up to pay that much for a Vizio. (they'd much rather buy 2 non-HDR equipped Samsung 65 inch UHD TVs for the same amount of $.)

And with people being careful with their money these days, you can just bet that for every one of those expensive HDR capable UHD TVs that Samsung is selling, the company is selling 10 of either its $2000 65 inch UHD TVs or its $1100 55 inch UHD TVs. (just saw the last price in a trip to BJ's)

So I really stand by my statement that during the early introductory period of UHD Blu-ray very, very few owners of UHD TVs will have models that can do HDR. We are at least 1-2 years away from HDR equipped UHD TVs, from respected brands, coming close to price levels that won't turn off the largest majority of even UHD TV buyers.

And, BTW, the modestly priced models most people are buying simply don't have enough brightness built into their backlighting to ever provide HDR brightness levels, so it will not be possible to use firmware updates to give most TVs HDR capability if they didn't come with it in the first place. That would be like transforming your Honda's engine that maxes out at 200 horsepower, into one that can put out 400 horsepower, simply by sending a firmware update to your car's computer.

You make some very good points, well thought out. My take on it is that the R series are not going to be the only HDR capable TVs coming from Vizio. I think the P and maybe the M series will have that capability.


Just my opinion in a speculation thread. If that is indeed the case, then HDR capable TVs at many price points will be making a good amount of impact. As to the lateness of the R release, I do think it's coming at a good time if it comes within a month or so. If it doesn't materialize, then all of the new money flowing into Vizio and the hype goes right down the tube.


As for me, I have a very good 2011 Vizio FALD (150 zones) that won't be replaced until 2017 or so when Oppo comes out with their player. By then all of the hullaballoo will have quieted down and we'll have plenty of content.
post #2294 of 2384, 08-14-2015, 08:36 PM
sarahb75 (Senior Member)

Quote:
Originally Posted by hernanu View Post
You make some very good points, well thought out. My take on it is that the R series are not going to be the only HDR capable TVs coming from Vizio. I think the P and maybe the M series will have that capability.


Just my opinion in a speculation thread. If that is indeed the case, then HDR capable TVs at many price points will be making a good amount of impact. As to the lateness of the R release, I do think it's coming at a good time if it comes within a month or so. If it doesn't materialize, then all of the new money flowing into Vizio and the hype goes right down the tube.


As for me, I have a very good 2011 Vizio FALD (150 zones) that won't be replaced until 2017 or so when Oppo comes out with their player. By then all of the hullaballoo will have quieted down and we'll have plenty of content.
I hope that you are on the right track in looking for Vizio to implement HDR in its P or M series. That would be so cool, not only because it would help mainstream the major improvement provided by HDR, but also because of the great pressure it would put on Samsung, Sony, and LG to moderate the prices of their own HDR-equipped UHD TVs.
post #2295 of 2384, 08-15-2015, 03:21 PM
Z-Mad (Advanced Member)
Quote:
Originally Posted by GregLee View Post
People underestimate the good effects of upconversion. I don't mean real people in general, but people here, posting in this forum.
So these "unreal" people here would be calibrators or other industry professionals and those with understanding of the standards and art of content reproduction?

Quote:
Originally Posted by GregLee View Post
It's because they are wedded to this idea that the best picture is the one that reproduces most faithfully the video signal. According to that odd philosophy, any upconversion is a degradation, because it displays things that are not there in the video signal.
So it is odd to reproduce the content accurately and as intended for the viewer to see, but it is not odd that you would apparently like to take over the job of the director or post production and create your own version of the content, generating your own level of detail or other picture attributes? You do realize that the TV is a display device only and not a post production or editing device. So it is being calibrated to allow you, the viewer, to get the most accurate experience of the content, as its creators intended, and as it is being delivered to you (be it on a DVD, via a stream, or otherwise).

Quote:
Originally Posted by GregLee View Post
It's odd, because (1) we really want the display to resemble the original scene, not the signal, and (2) it neglects the possibility that upconversion can repair damage that was done to the image of a scene by the processing that went on between camera and TV screen.
Seems you believe the "original or undamaged scene" to mean being a bystander at a movie set and not viewing the actual scene as viewed through the camera lens by the director and processed in the studio for you to see as they, the creators of the content, intended?

Quote:
Originally Posted by GregLee View Post
Here are two ways that 4k video enables repair work to display a superior picture: A. apparent color depth can be increased by using dither to capitalize on the extra pixels available. (I've already argued this here in other posts.) B. 4k is better than 2k at simulating bright highlights, by turning up contrast/backlight on LED sets that can show a lot of brightness. This is because real highlights are small in area, and a 4k picture can confine an area with enhanced brightness to a smaller screen area.

These advantages are due to having 4k pixels on the screen, not necessarily in the signal, so they are potentially available in upconverted displays. Well, they are, at least, if you don't get your TV calibrated by a technician who insists on setting up the TV to produce the standard peak brightness of 100 nits.
These technical advances (such as UHD, HDR, etc.) will make it possible for directors and content creators to utilize them when they make their content, and that is when a display capable of showing such content will be truly utilized... And that is when these displays will again be calibrated to do nothing more than accurately display this UHD and HDR content the way the content creator decided to use the extra resolution and HDR capability within the established standard, and again not the way you think they should have, outside of the standard...

Food for thought...
post #2296 of 2384, 08-15-2015, 04:01 PM
GregLee (AVS Forum Special Member)
Quote:
Originally Posted by Z-Mad View Post
You do realize that the TV is a display device only ...
Yes, but we seem to disagree about what it displays. I want it to display a scene so that I can see it as though I were there in person. For news and documentaries, what I mostly watch on TV, it seems to me that is what most people would want. For fictional works, I suppose that the authors and director want me to think the scenes they show me are real, in some sense, but that's their business, using lighting, makeup, and other artifices to portray another reality. To think I should judge picture quality by trying to guess the intent of a director and compare what I see with what is in his head -- this just strikes me as a very strange idea. It's the story I care about, not the means the story teller uses to tell it.

Greg Lee
post #2297 of 2384, 08-15-2015, 07:31 PM
Z-Mad (Advanced Member)
Quote:
Originally Posted by GregLee View Post
Yes, but we seem to disagree about what it displays. I want it to display a scene so that I can see it as though I were there in person. For news and documentaries, what I mostly watch on TV, it seems to me that is what most people would want. For fictional works, I suppose that the authors and director want me to think the scenes they show me are real, in some sense, but that's their business, using lighting, makeup, and other artifices to portray another reality. To think I should judge picture quality by trying to guess the intent of a director and compare what I see with what is in his head -- this just strikes me as a very strange idea. It's the story I care about, not the means the story teller uses to tell it.
I don't think it is up for debate what a TV is supposed to display: it obviously displays the content as fed to it, not more and not less. The content is created utilizing specific standards, so that the only way to see it accurately is by displaying it on a device calibrated to that same standard. That's why it's not a guess what the content creator had in mind if watching his work on a display calibrated to the same standard the creator used - this is indeed the only way to actually not have to wonder or guess how the creator intended the scene to look or to be broadcasted, if you are watching a news broadcast or any content for that matter. And if one wants to witness the news or a show as viewed in person and not as broadcasted, I suggest visiting the studio... Or petitioning that studios use cameras and standards that create more lifelike broadcast. Though any broadcast is indeed most lifelike on a well calibrated TV anyway... Not sure if you ever had your TV properly calibrated... Anyway...
post #2298 of 2384, 08-16-2015, 08:59 AM
GregLee (AVS Forum Special Member)
Quote:
Originally Posted by Z-Mad View Post
I don't think it is up for debate what a TV is supposed to display: it obviously displays the content as fed to it, not more and not less.
You think wrong. This is engineer-think. As a consumer and watcher of TV, I have no interest in how well a TV displays what is fed to it. I don't even know about the signal that is fed to it, and why should I care about that? I'm interested in how well the TV depicts a scene.

Greg Lee
post #2299 of 2384, 08-16-2015, 10:15 AM
RonF (AVS Forum Special Member)
Quote:
Originally Posted by Z-Mad View Post
I don't think it is up for debate what a TV is supposed to display: it obviously displays the content as fed to it, not more and not less. The content is created utilizing specific standards, so that the only way to see it accurately is by displaying it on a device calibrated to that same standard. That's why it's not a guess what the content creator had in mind if watching his work on a display calibrated to the same standard the creator used - this is indeed the only way to actually not have to wonder or guess how the creator intended the scene to look or to be broadcasted, if you are watching a news broadcast or any content for that matter. And if one wants to witness the news or a show as viewed in person and not as broadcasted, I suggest visiting the studio... Or petitioning that studios use cameras and standards that create more lifelike broadcast. Though any broadcast is indeed most lifelike on a well calibrated TV anyway... Not sure if you ever had your TV properly calibrated... Anyway...
Fair to assume you would be a Darbee hater? If so, IMO, kind of silly when it can be used judiciously to good benefit or not so judiciously depending on viewer taste and what THEY want to see out of their images. After calibration. I too would disagree that "what a TV is supposed to display is not up for debate". Way toooo many variables all the way through the chain and TV screen capabilities vs production monitors and viewing room environments. Plus the consumer paid his money and is entitled to adjust to what makes him or her happy. Doesn't make them unclean or bad people because not your particular preference. That is not up for debate.

post #2300 of 2384, 08-16-2015, 10:19 AM
prm1138 (Member)
Well, I have had a 4k set now since January. I have watched TV (Sat), DVD, Blu-Ray (Oppo103d), streaming (Netflix) and original 4k material I have shot myself. Stunning images. HD upscaled looks fantastic. SD looks no worse than on my prior 1080 set.

My opinion is that 4k is worth it.
post #2301 of 2384, 08-16-2015, 12:18 PM
EvLee (Advanced Member)
Quote:
Originally Posted by Z-Mad View Post
I don't think it is up for debate what a TV is supposed to display: it obviously displays the content as fed to it, not more and not less. The content is created utilizing specific standards, so that the only way to see it accurately is by displaying it on a device calibrated to that same standard. That's why it's not a guess what the content creator had in mind if watching his work on a display calibrated to the same standard the creator used - this is indeed the only way to actually not have to wonder or guess how the creator intended the scene to look or to be broadcasted, if you are watching a news broadcast or any content for that matter. And if one wants to witness the news or a show as viewed in person and not as broadcasted, I suggest visiting the studio... Or petitioning that studios use cameras and standards that create more lifelike broadcast. Though any broadcast is indeed most lifelike on a well calibrated TV anyway... Not sure if you ever had your TV properly calibrated... Anyway...
That was true to a degree back when every facility was using the same model reference Sony CRT for grading and review. Those days are long gone. Now you'll find in review rooms a mix of 3-chip DLP projection, LCD reference monitors and OLED reference monitors which may share a common target but all have notable differences in black level, local contrast, pixel structure and other characteristics relevant to how the picture looks. My point being that you can get the best calibrator to setup your TV at home, but there is no way it is going to simultaneously match all the different setups it may have been viewed on at studio X, Y and Z. Not to mention each of those studios may have a different philosophy on how to best tweak their mastering process to produce a good quality experience for the home. There are many evils that consumer TV's perform in their image processing, but in most cases upscaling is not one that I feel compromises the picture any more than switching between the different display technologies I listed above.
post #2302 of 2384, 08-16-2015, 05:31 PM
Z-Mad (Advanced Member)
Quote:
Originally Posted by EvLee View Post
That was true to a degree back when every facility was using the same model reference Sony CRT for grading and review. Those days are long gone. Now you'll find in review rooms a mix of 3-chip DLP projection, LCD reference monitors and OLED reference monitors which may share a common target but all have notable differences in black level, local contrast, pixel structure and other characteristics relevant to how the picture looks. My point being that you can get the best calibrator to setup your TV at home, but there is no way it is going to simultaneously match all the different setups it may have been viewed on at studio X, Y and Z. Not to mention each of those studios may have a different philosophy on how to best tweak their mastering process to produce a good quality experience for the home. There are many evils that consumer TV's perform in their image processing, but in most cases upscaling is not one that I feel compromises the picture any more than switching between the different display technologies I listed above.
Seems there is not much understanding here for standards and what they mean and why they are being used regardless if on a DLP projector, LCD or an OLED... The notion that all the content is completely randomly mastered on various displays to the point that calibrating a TV to the standard is "useless" and everyone should come up with their own standard on their home TVs is rather absurd... Once calibrated, each of the displays will show their inherent strengths and weaknesses, but calibration brings out their best, while their weaknesses cannot be fixed with deviation from standard. A TV with poor black level will exhibit poor blacks no matter how far I deviate from the standard. Anyway, one can only "lead the horse to water" as they say. Anyone is free to mess with the content as they please, as inadvisable as it is...
post #2303 of 2384, 08-16-2015, 06:28 PM
GregLee (AVS Forum Special Member)
Quote:
Originally Posted by Z-Mad View Post
Seems there is not much understanding here for standards and what they mean and why they are being used regardless if on a DLP projector, LCD or an OLED...
I think I understand well enough your theory. I just don't believe it. You imagine being in a standards heaven where there are make-up standards, and lighting standards, and camera standards, and studio monitor standards, and standard directors who know about and rely on all those standards to get a record of their artistic vision. And then there are colorist standards, and grading standards, and lots of other standards that govern what happens to the video signal from the scene shot to your home TV. Providing your home setup complies with all relevant standards, you will see the most authentic possible picture that realizes the intentions of the director and all these standards enforcers.

Isn't that what you imagine happens? But what if that is not what really happens?

Greg Lee
post #2304 of 2384, 08-16-2015, 06:52 PM
Z-Mad (Advanced Member)
Quote:
Originally Posted by GregLee View Post
I think I understand well enough your theory. I just don't believe it. You imagine being in a standards heaven where there are make-up standards, and lighting standards, and camera standards, and studio monitor standards, and standard directors who know about and rely on all those standards to get a record of their artistic vision. And then there are colorist standards, and grading standards, and lots of other standards that govern what happens to the video signal from the scene shot to your home TV. Providing your home setup complies with all relevant standards, you will see the most authentic possible picture that realizes the intentions of the director and all these standards enforcers.

Isn't that what you imagine happens? But what if that is not what really happens?
Again, not much understanding exhibited here... As I said, by all means disregard them and create your own, since you believe everyone else, including studios that create the content for you, are clueless about them and don't use any standards... The movie theaters should probably stop calibrating their projectors to standards too, and everyone should just throw them out the window. That'll be real fun, when every channel broadcasts to their own standard, every theater shows movies to theirs, etc. etc. Hell, the TV's should probably disregard the power standards too, come up with their own voltage and plug, even if it may not fit most sockets... Who needs standards... Anyway...
post #2305 of 2384, 08-16-2015, 07:27 PM
EvLee (Advanced Member)
Quote:
Originally Posted by Z-Mad View Post
Seems there is not much understanding here for standards and what they mean and why they are being used regardless if on a DLP projector, LCD or an OLED... The notion that all the content is completely randomly mastered on various displays to the point that calibrating a TV to the standard is "useless" and everyone should come up with their own standard on their home TVs is rather absurd... Once calibrated, each of the displays will show their inherent strengths and weaknesses, but calibration brings out their best, while their weaknesses cannot be fixed with deviation from standard. A TV with poor black level will exhibit poor blacks no matter how far I deviate from the standard. Anyway, one can only "lead the horse to water" as they say. Anyone is free to mess with the content as they please, as inadvisable as it is...
You can buy the most expensive consumer display in the world and no amount of calibration will ever make it match a reference display. I've looked at enough of them and know how they fall short. They are just not designed for that sort of use. I am talking about their image processing pipeline, having sufficient internal precision to do color conversions without quantization errors, not quite hitting one primary or another, etc... You can use the standards as a starting point, but when you push on a consumer display too hard with the sole objective of matching a reference standard the image quality begins to suffer because the manufacturers have to build these things en masse and with ridiculously tight price targets. It's been shown numerous times that single-minded calibration can actually make a display look worse. You are better off accepting some compromises and optimizing for image quality. Reference level displays that can actually hit specs are cherry picked and are basically sold at a loss.

Even in the cinemas there are necessary compromises between the standards and what you see in the field. DCI calls for 14 ft-L white, but 3D projection can be as dim as 2 ft-L. Yes, really. DCI set certain tolerances on the RGB primaries, but when they found that the tolerances were too tight to hit consistently in mass deployment they were "relaxed". You think you are really getting 100% of P3 out of a xenon projector? Not likely. So yeah, there are some pretty huge deviations from the standards even in commercial installations, let alone when we start talking about the home.
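
Since the thread keeps hopping between foot-lamberts and nits, here is a tiny conversion helper (using the usual 1 ft-L ≈ 3.426 cd/m² figure) that puts those cinema numbers next to the 100-nit SDR home target mentioned earlier:

Code:
NITS_PER_FTL = 3.426   # 1 foot-lambert is about 3.426 cd/m^2 (nits)

def ftl_to_nits(ftl):
    return ftl * NITS_PER_FTL

def nits_to_ftl(nits):
    return nits / NITS_PER_FTL

print(ftl_to_nits(14))    # DCI 2D reference white: ~48 nits
print(ftl_to_nits(2))     # a dim 3D presentation:  ~7 nits
print(nits_to_ftl(100))   # the 100-nit SDR target: ~29 ft-L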
RonF likes this.
EvLee is offline  
post #2306 of 2384 Old 08-17-2015, 01:46 PM
Advanced Member
 
Z-Mad's Avatar
 
Join Date: Sep 2012
Posts: 780
Mentioned: 3 Post(s)
Tagged: 0 Thread(s)
Quoted: 330 Post(s)
Liked: 332
Quote:
Originally Posted by EvLee View Post
You can buy the most expensive consumer display in the world and no amount of calibration will ever make it match a reference display. I've looked at enough of them and know how they fall short. They are just not designed for that sort of use. I am talking about their image processing pipeline: whether it has sufficient internal precision to do color conversions without quantization errors, not quite hitting one primary or another, etc... You can use the standards as a starting point, but when you push on a consumer display too hard with the sole objective of matching a reference standard, the image quality begins to suffer, because the manufacturers have to build these things en masse and with ridiculously tight price targets. It's been shown numerous times that single-minded calibration can actually make a display look worse. You are better off accepting some compromises and optimizing for image quality. Reference-level displays that can actually hit the specs are cherry-picked and basically sold at a loss.

Even in the cinemas there are necessary compromises between the standards and what you see in the field. DCI calls for 14 ft-L white, but 3D projection can be as dim as 2 ft-L. Yes, really. DCI set certain tolerances on the RGB primaries, but when they found that the tolerances were too tight to hit consistently in mass deployment they were "relaxed". You think you are really getting 100% of P3 out of a xenon projector? Not likely. So yeah, there are some pretty huge deviations from the standards even in commercial installations, let alone when we start talking about the home.
I have calibrated my share of TVs, and stating that a consumer TV is not a reference display is stating the obvious - no one claims they are. 100% purity would be an illusion, and that's taking it way beyond the simple point that started this discussion (and certainly beyond what a consumer would know how to compensate for); the point being that, with all the variances you describe, one has to start with a standard if one is to minimize them. Sure, tweaks in light output and certain settings are needed for specific environments, but one would generally still start by calibrating to the standard and tweaking from there (if one knows how). That does not negate the existence, the need, or the application of standards in all those cases.
I think the original point was simply that artificial enhancements of the content (such as artificial sharpness on LCDs), or up-conversion (where, going from 1080p to 4K, 75% of the pixels are populated with artificial "filler" calculated by the TV's algorithm), mostly do not do the picture quality any good, especially when combined with misuse of other TV settings in search of some non-standard picture, often accompanied by a misconception of how the material was meant to look (I speak from experience in calibrating). I have yet to see a TV (with all of its inherent flaws) that didn't look immensely better (resulting in a 'wow' reaction) after standards-based calibration...
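Just to illustrate what "calibrating to the standard" actually measures, here is a rough sketch of the kind of grayscale check a calibrator runs, scored as a delta-E error against the D65 white point. The "measured" xyY numbers below are invented for the example, not readings from any particular TV, and real calibration software does far more than this.

Code:
# Rough sketch: score one grayscale reading against the D65 white point using CIE76 delta-E.
# The "measured" xyY values are made-up example numbers.
import math

D65_WHITE = (0.95047, 1.00000, 1.08883)   # reference white in XYZ (Y normalized to 1)

def xyY_to_XYZ(x, y, Y):
    return (x * Y / y, Y, (1 - x - y) * Y / y)

def XYZ_to_Lab(X, Y, Z, white=D65_WHITE):
    def f(t):
        return t ** (1/3) if t > (6/29) ** 3 else t / (3 * (6/29) ** 2) + 4/29
    fx, fy, fz = f(X / white[0]), f(Y / white[1]), f(Z / white[2])
    return (116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz))

def delta_e76(lab1, lab2):
    return math.dist(lab1, lab2)

# Hypothetical 80% stimulus reading, noticeably blue-tinted versus D65 (x=0.3127, y=0.3290)
measured  = xyY_to_XYZ(0.3040, 0.3200, 0.80)
reference = xyY_to_XYZ(0.3127, 0.3290, 0.80)

dE = delta_e76(XYZ_to_Lab(*measured), XYZ_to_Lab(*reference))
print(f"grayscale error at 80% stimulus: dE76 = {dE:.2f}")   # many calibrators aim for roughly dE < 3

The point is only that the target being compared against is the standard itself; without it there would be nothing to measure the error from.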
So while one can split hairs if one so chooses, the basic fact of governing broadcast standards remains; they are what make it possible for everyone to receive and display the same content, and arguing against them is rather pointless...
Anyway, if one chooses to ignore them and artificially create altered content, by all means... but I wouldn't call it odd to not want to alter the content with noticeably artificial effects, and to prefer experiencing it in its original form and as intended (standards being what allow you to see the intended look, or as close to it as you will get)...

Anyway, I am not against 4K and look forward to when the actual content is widely available and when 4K becomes....wait for it.... THE STANDARD
Z-Mad is offline  
post #2307 of 2384 Old 08-17-2015, 02:05 PM
AVS Forum Special Member
 
GregLee's Avatar
 
Join Date: Jul 2002
Location: Waimanalo HI
Posts: 4,183
Mentioned: 5 Post(s)
Tagged: 0 Thread(s)
Quoted: 868 Post(s)
Liked: 296
Quote:
Originally Posted by Z-Mad View Post
Anyway, if one chooses to ignore them and artificially create altered content, by all means... but I wouldn't call it odd to not want to alter the content with noticeably artificial effects, and to prefer experiencing it in its original form and as intended (standards being what allow you to see the intended look, or as close to it as you will get)...
What I called odd is this notion that the ideal you try to match is some standardized signal. That is what is artificial. The right standard is the scene itself -- real life. All those signal standards work only to the extent that they help to achieve fidelity of a picture to what it is supposed to depict (and that is not a signal).

Greg Lee
GregLee is offline  
post #2308 of 2384 Old 08-17-2015, 07:35 PM
Advanced Member
 
EvLee's Avatar
 
Join Date: Dec 2009
Posts: 594
Mentioned: 6 Post(s)
Tagged: 0 Thread(s)
Quoted: 339 Post(s)
Liked: 316
Quote:
Originally Posted by Z-Mad View Post
I have calibrated my share of TVs, and stating that a consumer TV is not a reference display is stating the obvious - no one claims they are. 100% purity would be an illusion, and that's taking it way beyond the simple point that started this discussion (and certainly beyond what a consumer would know how to compensate for); the point being that, with all the variances you describe, one has to start with a standard if one is to minimize them. Sure, tweaks in light output and certain settings are needed for specific environments, but one would generally still start by calibrating to the standard and tweaking from there (if one knows how). That does not negate the existence, the need, or the application of standards in all those cases.
I think the original point was simply that artificial enhancements of the content (such as artificial sharpness on LCDs), or up-conversion (where, going from 1080p to 4K, 75% of the pixels are populated with artificial "filler" calculated by the TV's algorithm), mostly do not do the picture quality any good, especially when combined with misuse of other TV settings in search of some non-standard picture, often accompanied by a misconception of how the material was meant to look (I speak from experience in calibrating). I have yet to see a TV (with all of its inherent flaws) that didn't look immensely better (resulting in a 'wow' reaction) after standards-based calibration...
So while one can split hairs if one so chooses, the basic fact of governing broadcast standards remains; they are what make it possible for everyone to receive and display the same content, and arguing against them is rather pointless...
Anyway, if one chooses to ignore them and artificially create altered content, by all means... but I wouldn't call it odd to not want to alter the content with noticeably artificial effects, and to prefer experiencing it in its original form and as intended (standards being what allow you to see the intended look, or as close to it as you will get)...

Anyway, I am not against 4K and look forward to when the actual content is widely available and when 4K becomes....wait for it.... THE STANDARD
Keep in mind that content is not being delivered to you as 4:4:4 uncompressed RGB. You are receiving compressed 4:2:0 YCbCr. There is nothing sacred about those pixels, and the conversion from 4:2:0 back to 4:4:4 is not standardized. Every TV out there, even at native resolution, is performing its own flavor of upscaling (which includes some level of sharpening) to reconstruct RGB from subsampled YCbCr. Even the oldest CRTs had to do this via analog circuitry that exhibited certain frequency response characteristics, including potentially boosting high frequencies (aka sharpness). Now, you can upscale an image to 4K without introducing artificial sharpness; it has the benefit of hiding the pixel structure of the display, and the increased resolution makes it easier to prevent ringing artifacts. So you can actually get a better reproduction. I'm not talking about the super-resolution techniques that try to fill in extra information, or any of the other detail enhancement tricks; I agree those can be too much. Just standard linear filtering. So I hope this makes it clear why I don't think upscaling is necessarily a "bad thing".
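
To show what I mean by plain linear filtering, here is a quick numpy sketch (my own illustration of the idea, not what any particular TV actually implements). Every output sample is a non-negative average of its neighbors, so nothing can overshoot the original values and no edges get artificially enhanced.

Code:
# Rough sketch of "plain linear filtering" upscaling: doubling resolution with
# simple averaging. No negative filter taps, so nothing is artificially sharpened.
import numpy as np

def upscale2x_linear(img):
    """Double each dimension of a 2-D plane using linear interpolation."""
    h, w = img.shape
    out = np.zeros((2 * h, 2 * w), dtype=np.float64)
    out[0::2, 0::2] = img                                    # keep the original samples
    out[0::2, 1::2] = (img + np.roll(img, -1, axis=1)) / 2   # halfway points, horizontally
    out[1::2, :] = (out[0::2, :] + np.roll(out[0::2, :], -1, axis=0)) / 2  # then vertically
    # np.roll wraps around at the right/bottom edge -- a shortcut to keep the sketch small.
    return out

# A tiny test patch; in a real chain this could be a Cb or Cr plane being taken
# from 4:2:0 back up to 4:4:4, or a full image being taken up toward 4K.
small = np.array([[10.0, 20.0],
                  [30.0, 40.0]])
print(upscale2x_linear(small))

Real scalers typically use longer filters than this two-tap average, but the point stands: straightforward resampling invents no detail and exaggerates no edges, which is a different thing from the "detail enhancement" modes.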

Last edited by EvLee; 08-17-2015 at 07:38 PM.
EvLee is offline  
post #2309 of 2384 Old 08-22-2015, 04:28 PM
Member
 
krazyboy13's Avatar
 
Join Date: Apr 2013
Posts: 36
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 18 Post(s)
Liked: 10
Quote:
Originally Posted by detroit1 View Post
I just viewed the 4 pics on my Samsung 4K monitor and, yes, there is a big difference in quality. The colors and detail are better, and you can see distant things with more clarity.
Seems to make sense.
krazyboy13 is offline  
post #2310 of 2384 Old 08-23-2015, 10:05 AM
Senior Member
 
sarahb75's Avatar
 
Join Date: Mar 2011
Location: Norton, Ohio
Posts: 450
Mentioned: 1 Post(s)
Tagged: 0 Thread(s)
Quoted: 335 Post(s)
Liked: 164
Some Say, Sadly, Quality Ain't Worth It

Quote:
Originally Posted by bass excavator View Post
Hey Sarahb75


I love your comments and your insight. Always presented in a straightforward, logical way. I agree with all you have said. I just don't feel that UHD Blu-ray will take off in any sizable way, even compared to standard Blu-ray. I have a Samsung BD player and am waiting to see what Oppo or Cambridge offer in a UHD player with analog outputs so I can finally buy my premium player. I hope there is no sizable price increase, or ANY increase for that matter, given the costs of the 105 and 752 players. My kids, nephews, and niece simply don't care for anything other than downloading and streaming. Neither does my brother. This is the way of the future, I guess. I sure hope that by the time streaming/downloading gets better and physical media (which will still last for a while longer) finally "dies" out, we have lossless audio.


With your 580 Blu-rays and over 1,000 DVDs, physical media will never become obsolete for you. I myself will never give up on physical media.
bass excavator, I appreciate your kind comments, and I also got a kick out of your noting that you use a Samsung Blu-ray player, because it reminded me of something. My post had mentioned the high-income couple with a top-performing 60-inch 1080p TV who still refuse to pay the nominal extra cost to rent Netflix Blu-rays instead of Netflix DVDs, because they think the quality improvement is not worth the additional money.

And what's really funny (and also sad) is that these folks are viewing those DVDs on the Samsung Blu-ray player they bought when their DVD player died.
cadarndjg likes this.

Last edited by sarahb75; 08-23-2015 at 02:43 PM.
sarahb75 is offline  