UHD/4K Quandary: To Buy or Not to Buy - Page 33 - AVS Forum
losservatore 11:32 AM 08-06-2014
Mrorange303 broke a quoting disagreement record ...


We all get it; we get that you have lots of love for pixels.

losservatore 11:34 AM 08-06-2014
Let's just enjoy our TVs; this thread is repeating the same stuff over and over.

Peace...
bobby2478 11:36 AM 08-06-2014
Quote:
Originally Posted by Mrorange303 View Post
No sir, I won't support members who clearly use false information. Read sage's entry. Then read how Greg Lee politely, for the 5th time, corrected his statement AGAIN.

Yet for a buyer to get fair and accurate information, I have to be the bad guy for pointing out that the 4k benefits are consistently dismissed by the non-4k members.

If you want honesty be honest. Have integrity.

Don't make a huge response as to why 4k is the wrong choice now, only to follow it up with "whoops, I missed that part."

I'm not mean. I'm simply pointing out what's going on in here.

I have not been rude. The information is not accurate and potential buyers should know.

It's their right not to read the inaccurate and misleading information given against 4k sets.

I have been honest and have praised both 4k and plasma. I think 4k sets have a lot to offer, and I do notice the extra resolution even from regular distances. So don't say I have been claiming 4k is the wrong choice right now; I have NEVER said that. All I said was that, at this point in time, they are the wrong choice for ME PERSONALLY. I still like 4k and would have gotten one in a second if it weren't for the things I personally didn't like (lack of 4k content, motion artifacts, extra cost, lighter black levels).


There have been a lot of subjective statements in this thread that make it hard for anyone to get an accurate read on whether to get 4k or not; truly impartial and objective statements are hard to find. People have the right to read the truth about 4k sets and their pros and cons, and inaccuracies and subjective statements that exaggerate the downsides of 4k don't help anyone. But you can't make the same mistake by arguing plasma is extinct and basically saying it is worthless right now. Then you aren't being impartial, just like anyone you feel is making stuff up about the downsides of 4k sets.
8mile13 12:21 PM 08-06-2014
Quote:
Originally Posted by sarahb75 View Post
First, even as an owner of 2 Panny plasmas, I agree with Mrorange303 that plasma should now be a dead issue. But OLED's future is looking brighter than ever.

Just last week it was announced that Sony and Panasonic are forming a consortium with 2 other Japanese companies, Japan Display Inc. and Innovation Network Corporation of Japan, in order to commercialize OLED.

Together these 4 Japanese companies will form a new company called JOLED Inc., which is planned for a Jan. 2015 launch. Yes folks, it appears that the Japanese are really serious about overcoming the fabrication problems that lead to low production yields and, thus, high OLED panel costs.
Unfortunately JOLED plans to focus primarily on development of medium-size OLED displays (tablets, mobile PCs, etc.). Also, Panasonic and Sony together have just a poor 10% share in JOLED.
http://www.sony.net/SonyInfo/News/Pr...1407/14-0731E/
Quote:
Originally Posted by sarahb75
Additionally, video display expert Alfred Poor has written that the cost of materials that go into an OLED panel is actually substantially lower than the cost of materials for a same-size LCD/LED panel. So he notes that once techniques are developed for efficiently fabricating OLED panels, they should eventually end up being cheaper than LCD/LEDs.
''Should eventually''. Yield is a serious problem that must be overcome first. And don't forget that the masses might not be interested in adopting this technology.
Quote:
Originally Posted by sarahb75
They said that OLED combines all the strengths of LCD and Plasma, with none of their weaknesses. (but they didn't mention anything about possible image retention or some reported problems with motion)
And several of their weaknesses.
What it comes down to is that nobody knows where OLED is heading. It could go either way. Right now OLED is a big question mark.
GregLee 12:21 PM 08-06-2014
Quote:
Originally Posted by sarahb75 View Post
But OLED's future is looking brighter than ever.
I don't think so. OLED is chasing a moving target -- every year LCD-LED technology improves and gets cheaper. OLED is doomed to get better and better, but to run and run and never catch up. Also, there's a problem with "looking brighter" literally. I think the future of consumer TV lies with HDR and Dolby Vision, and that needs bright displays, for one thing. I don't think OLED is bright enough. All the Dolby Vision demos and prototypes have been LCD-LEDs.
imagic 12:55 PM 08-06-2014
Quote:
Originally Posted by GregLee View Post
I don't think so. OLED is chasing a moving target -- every year LCD-LED technology improves and gets cheaper. OLED is doomed to get better and better, but to run and run and never catch up. Also, there's a problem with "looking brighter" literally. I think the future of consumer TV lies with HDR and Dolby Vision, and that needs bright displays, for one thing. I don't think OLED is bright enough. All the Dolby Vision demos and prototypes have been LCD-LEDs.
LCD cannot catch up to OLED image quality. Someday OLED will dominate. Dolby says OLED will do HDR, fwiw.
Mrorange303 01:09 PM 08-06-2014
Quote:
Originally Posted by imagic View Post
LCD cannot catch up to OLED image quality. Someday OLED will dominate. Dolby says OLED will do HDR, fwiw.
The inevitable.
Ken Ross 01:12 PM 08-06-2014
Quote:
Originally Posted by sage11x View Post
Actually, there was nothing remotely scientific about the 'experiment' and hdtvtest stated as much in their write up.
I said it was a 'more' scientifically controlled test showing that 4K can be identified at screen sizes and distances the charts say you can't. 'More' than the anecdotal evidence some present to prove you can't.

Is it a 'true' scientifically controlled test? Of course not. But in this thread everything is relative.
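For anyone curious, the charts people keep citing are basically just this arithmetic, assuming the usual one-arc-minute (20/20 acuity) rule; whether that rule reflects what viewers actually perceive is exactly what this thread keeps arguing about. A rough sketch with my own example numbers, not anything from hdtvtest:

Code:
import math

def max_useful_distance_ft(diagonal_in, horiz_pixels, aspect=16/9):
    # Distance (in feet) beyond which one pixel subtends less than 1 arc-minute.
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)  # screen width in inches
    pixel_pitch_in = width_in / horiz_pixels                 # size of one pixel
    one_arcmin = math.radians(1 / 60)                        # 1 arc-minute in radians
    return pixel_pitch_in / math.tan(one_arcmin) / 12        # inches -> feet

for pixels, label in [(1920, "1080p"), (3840, "4K")]:
    print(label, round(max_useful_distance_ft(65, pixels), 1), "ft")
# On a 65" screen this rule gives roughly 8.5 ft for 1080p and about 4.2 ft for 4K.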
imagic 01:15 PM 08-06-2014
Spent a whole day in the showroom of a high end AV dealer. Without genuine 4K content, the new UHDTVs are just glorified 1080p TVs. PERIOD. There is a whole lot of wishful thinking going on about the benefits of UHDTVs when it comes to 1080p upscaling.

Now OLED is another story... Even 1080p OLED looks fantastic.
Mrorange303 01:21 PM 08-06-2014
Quote:
Originally Posted by imagic View Post
Spent a whole day in the showroom of a high end AV dealer. Without genuine 4K content, the new UHDTVs are just glorified 1080p TVs. PERIOD. There is a whole lot of wishful thinking going on about the benefits of UHDTVs when it comes to 1080p upscaling.

Now OLED is another story... Even 1080p OLED looks fantastic.
Clearly you can't speak for everyone. Many see a benefit.

That being said, didn't I say current 4k sets are perfect transitional sets because they are basically super 1080p sets?

I agree the true next step is stable, dependable 4k oled. That is our new target standard.
Ken Ross 01:22 PM 08-06-2014
Quote:
Originally Posted by losservatore View Post
Let's just enjoy our TVs; this thread is repeating the same stuff over and over.

Peace...
I said that pages ago. This thread is toast and is doing nothing but creating an air of hostility, while adding nothing new.
Mrorange303 01:25 PM 08-06-2014
Quote:
Originally Posted by Ken Ross View Post
I said that pages ago. This thread is toast and is doing nothing but creating an air of hostility, while adding nothing new.
Yet we all keep posting. And the same stuff. We crazy like that.
Joe Bloggs 01:42 PM 08-06-2014
Quote:
Originally Posted by GregLee View Post
This is wrong, because 4K does add more than resolution of picture detail. It adds 64 times more colors (through dithering, using the extra pixels). That includes 4 times as many levels of brightness for any given color (including black/white). Since more gradations of brightness are available, the highlights of pictures can occupy less area, and so we can tolerate turning up the brightness more without whiting out detail in the brighter parts of the picture. (This is the basic idea behind the Dolby Vision version of High Dynamic Range.)
Surely, using this method, it would only add more colours when it isn't being used for 4K (for increased resolution), e.g. when it's being used to show 1920x1080 content.

Also, is there any TV manufacturer that is claiming to add 64 times more colours when displaying 1080p content on an 8 bit 4K set using your method? Or a scientific test showing that a set is achieving 64 times more colours when displaying 1080p content on an 8 bit 4K set? If it's only a theory about how they could do it, and not a fact about how they are actually doing it, then surely it's not valid to claim it as a fact.
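For reference, here is the arithmetic behind the "64 times" figure if you take the dithering premise at face value. It's just a restatement of the claim, not evidence that any shipping set actually works this way:

Code:
pixel_ratio = 3840 * 2160 / (1920 * 1080)   # 4 times as many pixels
extra_bits = 2                               # 4 pixels could, in theory, carry 2 extra bits per channel
levels_gain = 2 ** extra_bits                # 4 times as many levels per channel
color_gain = levels_gain ** 3                # R, G and B together: 4^3 = 64 times as many colors
print(pixel_ratio, levels_gain, color_gain)  # 4.0 4 64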
sage11x 01:43 PM 08-06-2014
Quote:
Originally Posted by GregLee View Post
This is wrong, because 4K does add more than resolution of picture detail. It adds 64 times more colors (through dithering, using the extra pixels). That includes 4 times as many levels of brightness for any given color (including black/white). Since more gradations of brightness are available, the highlights of pictures can occupy less area, and so we can tolerate turning up the brightness more without whiting out detail in the brighter parts of the picture. (This is the basic idea behind the Dolby Vision version of High Dynamic Range.)
Where is this extra color data coming from? Which of the current 4k sets interpolate extra color data, and how are they going about limiting artifacts? And I'm sorry, but how does dithering, a tool to combat false contouring, add extra color info again?

Again, we're confusing theoretical benefits of 4k with what the tech is capable of right now.
Ragnrok23 01:45 PM 08-06-2014
Ok here's my dilemma

Going to finish my walk-out basement in the fall. There will be a sliding glass door as the only source of natural light, which may or may not affect the TV lighting (not sure where the TV will be placed yet).

I was going to purchase a 60" F8500 next week for tax free weekend (even though I won't use it till Oct/Nov) mostly because I wanted to make sure I still got one, and would be able to return it if I got a defective unit

I watch mostly OTA, Netflix, Blu-rays, and hockey (stream)

So now I'm wondering if I should just wait till October.

budget would be around that $2,500 price point
Mrorange303 01:50 PM 08-06-2014
Quote:
Originally Posted by sage11x View Post
Where is this extra color data coming from? Which of the current 4k sets currently interpolate extra color data and how are they going about limiting artifacts? And I'm sorry but how does dithering, a tool to combat false contouring, add extra color info again?

Again, we're confusing theoretical benefits of 4k with what the tech is capable of right now.
No, you're confusing it.

We?

And both Sony and Samsung use extended color advertising for their 4k sets.

Have since last year.
Mrorange303 01:51 PM 08-06-2014
Quote:
Originally Posted by Ragnrok23 View Post
Ok here's my dilemma

Going to finish my walk-out basement in the fall. There will be a sliding glass door as the only source of natural light, which may or may not affect the TV lighting (not sure where the TV will be placed yet).

I was going to purchase a 60" F8500 next week for tax free weekend (even though I won't use it till Oct/Nov) mostly because I wanted to make sure I still got one, and would be able to return it if I got a defective unit

I watch mostly OTA, Netflix, Blu-rays, and hockey (stream)

So now I'm wondering if I should just wait till October.

budget would be around that $2,500 price point
The F8500 is never a wrong choice. But get one as big as you can.
sage11x 02:07 PM 08-06-2014
Quote:
Originally Posted by Mrorange303 View Post
No, you're confusing it.

We?

And both Sony and Samsung use extended color advertising for their 4k sets.

Have since last year.
Extended color (xvYCC in Sony parlance) is a proposed increase in color space and is not in any way related to the 4k spec. The problem, as always, is that nothing supports the increased color gamut except for a few PC video cards. If the extended color space is ever supported, it would be supported on suitably equipped 1080p TVs just as quickly as on suitably equipped 4k TVs.

Again, for (hopefully) the last time: you are confusing proposals and marketing for what is possible right now in the real world.
jogiba 03:11 PM 08-06-2014
Quote:
Best Buy, in concert with LG, Samsung and Sony, will kick off a comprehensive 13-week consumer awareness campaign to help drive Ultra HD TV adoption beginning this Saturday.
http://www.twice.com/news/retail/exc...campaign/49801

Most flagship smartphones will have a 4K video mode by January 2015, and 4K UHD TVs will be hot sellers in 2015, with 65" 4K UHD TVs selling for as low as $999.
GregLee 03:55 PM 08-06-2014
Quote:
Originally Posted by Joe Bloggs View Post
Surely using this method it would only add more colours when it isn't being used for 4K (for increased resolution) - eg. when it's being used to show 1920x1080 content.
It's so difficult to be clear. The extra colors are there in a 4K signal (I'm assuming), and the color information is carried by the extra pixels. That is, a video camera which can sense 10-bit color information uses dithering over the 4X as many pixels to, in effect, move the color information into local pixel patterns, so that the color depth per pixel can be reduced to 8 bits. At least, that is my inexpert idea of what goes on. It's not a 2k picture with 8 bit color; it's a 4k picture with 8 bit color. If the resolution were reduced to 2k, those 2 bits of color information would be lost. If the original content were 2k at 8 bit color depth, the extra 2 bits per pixel of color information wouldn't be there at all.
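Here's a toy sketch of the idea, assuming a simple 2x2 spatial dither. I'm not claiming any particular camera or TV implements exactly this; it's only meant to show how four 8-bit pixels could carry the two extra bits:

Code:
def dither_10bit_to_2x2_8bit(level_10bit):
    # Spread a 0-1023 level across four 8-bit pixels so the block average keeps the extra 2 bits.
    base, remainder = divmod(level_10bit, 4)
    return [base + (1 if i < remainder else 0) for i in range(4)]

def block_average(block):
    # What a viewer (or a downscaler) effectively sees when the four pixels blend together.
    return sum(block) / 4

block = dither_10bit_to_2x2_8bit(513)        # e.g. [129, 128, 128, 128]
print(block, block_average(block) * 4)       # the average recovers 513.0
# Downscaling to 2k collapses the block to one rounded 8-bit value (128), so the extra 2 bits are gone.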
losservatore 04:08 PM 08-06-2014
We already know that 4k has been pushed to consumers and that's not going back, so please understand that in this forum many 1080p users are not jumping on the 4k wagon for other reasons. I bought a plasma this year and I don't regret my purchase; I have many reasons why I chose this TV. People will still buy 1080p or 4k TVs; it's their choice. We only represent a very tiny percent of the population.

There is no winner here, just enjoy your TV.
GregLee 04:10 PM 08-06-2014
Quote:
Originally Posted by sage11x View Post
Where is this extra color data coming from? Which of the current 4k sets currently interpolate extra color data and how are they going about limiting artifacts? And I'm sorry but how does dithering, a tool to combat false contouring, add extra color info again?
Let's ignore upconverting 2k to 4k, since that confuses things. Aside from that, the extra color data is not added by the TV, but it is part of the video signal. A 4k video camera senses color to at least a 10 bit precision, then uses dithering to move some color information into pixel patterns and thus reduces the 10 bit color information to 8 bits per sub-pixel. It's my understanding that this is just the way 4k video cameras work. The TV doesn't have to do anything but display the 4k video signal.
Joe Bloggs 04:13 PM 08-06-2014
Quote:
Originally Posted by GregLee View Post
It's so difficult to be clear. The extra colors are there in a 4K signal (I'm assuming), and the color information is carried by the extra pixels. That is, a video camera which can sense 10-bit color information uses dithering over the 4X as many pixels to, in effect, move the color information into local pixel patterns, so that the color depth per pixel can be reduced to 8 bits. At least, that is my inexpert idea of what goes on. It's not a 2k picture with 8 bit color; it's a 4k picture with 8 bit color. If the resolution were reduced to 2k, those 2 bits of color information would be lost. If the original content were 2k at 8 bit color depth, the extra 2 bits per pixel of color information wouldn't be there at all.
So you are claiming the 64 times the colours is in the 1080p source, and it's not the TV that's dithering?

So, dithering reduces the actual resolution, but can give the illusion of more colours and can reduce banding.
There's almost no 4K content to watch currently. If you're talking about watching 1080p Blu-rays upscaled on a 4K TV, and the dithering is in the 1080p source (10 bit down to 8 bit), then that 10 bit info is already in content that can be watched on a 1080p TV set. So are you claiming 4K TVs are adding more (fake, invented colours) dithering to already dithered 8 bit Blu-rays? All they can do is blend the colours together a bit more; they don't have any more real info on the original 10 bit source.
sarahb75 04:23 PM 08-06-2014
GregLee brought up an interesting point by mentioning how HDR implemented with Dolby Vision could greatly improve the performance of LCD/LED panels. I wasn't at the 2014 CES, but many who were said the best looking video display demo at the show was Vizio's Reference Series prototype, which employed High Dynamic Range through its use of Dolby Vision.

However (and someone please correct me if I'm wrong), it is my understanding that to utilize Dolby Vision, the program material has to have been processed with that system. That requirement could basically render Dolby Vision pretty useless for folks like me whose film collections are heavily weighted toward releases from the 70s, 80s, and 90s. I seriously doubt that Hollywood would expend the resources to reprocess more than a few of these films in Dolby Vision. Yet, a UHD OLED would not only upconvert the 360+ Blu-rays already in our collection, but also provide a better black level and superior color saturation, without the need to duplicate movies in this collection by having to buy expensive Dolby Vision reprocessed versions (if the studios would even choose to make such versions available).
GregLee 04:52 PM 08-06-2014
Quote:
Originally Posted by Joe Bloggs View Post
So you are claiming the 64 times the colours is in the 1080p source and it's not the TV that's dithering
No, I'm claiming the extra color information is in the 4k source, not the 1080p source.

Quote:
Originally Posted by Joe Bloggs View Post
So, dithering reduces the actual resolution, but can give the illusion of more colours and can reduce banding.
There's almost no 4K content to watch currently. If you're talking about watching 1080p Blu-rays upscaled on a 4K TV, and the dithering is in the 1080p source (10 bit down to 8 bit), then that 10 bit info is already in content that can be watched on a 1080p TV set. So are you claiming 4K TVs are adding more (fake, invented colours) dithering to already dithered 8 bit Blu-rays? All they can do is blend the colours together a bit more; they don't have any more real info on the original 10 bit source.
The colors displayed by dithering aren't fake, any more than the yellow produced by lighting up both R and G subpixels is fake. The mechanism is just producing new colors by combining tiny dots of various colors.

I'm trying to explain how people can look at 4k source displayed on a 4k TV and report that the picture is better than they see on a 2k TV, while at the same time some video experts assure us that at ordinary distances, no human can tell the difference between 2k and 4k. And I believe I have explained it.

Why don't we just defer the question of what happens in upscaling, since that seems to confuse things and is a side issue anyhow.

I am not trying to offer anyone advice about whether to buy a 4k TV.
Joe Bloggs 04:58 PM 08-06-2014
Quote:
Originally Posted by GregLee View Post
No, I'm claiming the extra color information is in the 4k source, not the 1080p source.
But aren't 8 bit 1080p Blu-ray sources often dithered from 10 bit to 8 bit? So by this method aren't 8 bit 1080p TVs with 8 bit Blu-rays already showing more than 8 bit colour (more than 16.7 million colours, or fewer if you assume video levels 16-235)? How many colours are they effectively displaying by this method?

Quote:
Originally Posted by GregLee
The colors displayed by dithering aren't fake any more than the yellow is fake that is produced by lighting up both R and G subpixels
They are fake if the TV has no access to the original 10 bit source (if all it has access to is already dithered 8 bit content). If the TV is adding more dithering or blending of pixels without access to the original source, it's just guessing at what the original colours are; it doesn't know, unlike the people who actually have the original 10 bit source.

Shouldn't we really be saving this "shows many more colours" claim for when we actually get 10 bit (or higher) sources (without the use of dithering) and 10 bit displays that accept native 10 bit (or higher) sources? Having many more real colours is a reason to get the next generation of UHD TVs/discs/set top boxes. This generation of UHD only simulates higher colours using dithering (just like dithering is used on 1080p discs). They're talking about having 12 bit colour and high dynamic range for UHD (Phase 2) broadcasts around 2016-2018, so future TVs, content and broadcasts, unlike today's, really will have many more real (not fake/guessed) colours (well, real as in what was in the actual source, disregarding any errors due to compression).
GregLee 05:08 PM 08-06-2014
Quote:
Originally Posted by sarahb75 View Post
However, (and someone please correct me if I'm wrong) it is my understanding that to utilize Dolby Vision, the program material has to have been processed with that system.
With a minor reservation, you're right. Some benefit might be possible using an upconversion algorithm to guess at the additional color information that Dolby Vision requires, even though it is not present explicitly in the source video.
sage11x 05:49 PM 08-06-2014
Quote:
Originally Posted by GregLee View Post
It's so difficult to be clear. The extra colors are there in a 4K signal (I'm assuming), and the color information is carried by the extra pixels. That is, a video camera which can sense 10-bit color information uses dithering over the 4X as many pixels to, in effect, move the color information into local pixel patterns, so that the color depth per pixel can be reduced to 8 bits. At least, that is my inexpert idea of what goes on. It's not a 2k picture with 8 bit color; it's a 4k picture with 8 bit color. If the resolution were reduced to 2k, those 2 bits of color information would be lost. If the original content were 2k at 8 bit color depth, the extra 2 bits per pixel of color information wouldn't be there at all.
So you're assuming.
I'm sorry, but dithering does not equal increased color space or a wider gamut. Your idea is cool, but I'm afraid that's not how it works.

So I want to add to my previous statement about the wider xvYCC color space. After a little research I discovered that xvYCC color IS available on the few 'Mastered in 4K' Blu-rays Sony released. These discs DO have the wider color gamut baked in, so if your Blu-ray player supports it (my PS3 does) and your TV supports it (my VT60 does), you can take advantage of the wider color space.

So much for 4k's superiority over 1080p in the color department. In fact, there are far more 1080p TVs that support xv color than 4k TVs that support it; I actually had a hard time finding sets that didn't have the capability!

Either way I'm going to pick a couple of these discs up and give them a go. I'll report back with what I learn.
GregLee 06:02 PM 08-06-2014
Quote:
Originally Posted by sage11x View Post
So you're assuming.
I'm sorry, but dithering does not equal increased color space or a wider gamut. Your idea is cool, but I'm afraid that's not how it works.
Dithering and color depth concern color resolution. They have nothing to do with color space or wide gamut, nor did I ever imply there was any relationship.
Stereodude 06:19 PM 08-06-2014
Quote:
Originally Posted by GregLee View Post
It's so difficult to be clear. The extra colors are there in a 4K signal (I'm assuming), and the color information is carried by the extra pixels. That is, a video camera which can sense 10-bit color information uses dithering over the 4X as many pixels to, in effect, move the color information into local pixel patterns, so that the color depth per pixel can be reduced to 8 bits. At least, that is my inexpert idea of what goes on. It's not a 2k picture with 8 bit color; it's a 4k picture with 8 bit color. If the resolution were reduced to 2k, those 2 bits of color information would be lost. If the original content were 2k at 8 bit color depth, the extra 2 bits per pixel of color information wouldn't be there at all.
Uh, what?

The color space defines the amount of color information that can be conveyed. The resolution of the image has nothing to do with the color space. There currently is no consumer UHD spec or format that uses a wider color space than Rec.709. Bit depth also has no bearing on the color space. It defines the number of shades you can get between black and full color. 10-bit color can give you smoother gradations. However, again there is no consumer UHD spec or format that uses more than 8-bit color. So, at this time UHD has no advantage over HD in color space or bit depth.

A new UHD format could be defined with 10-bit color and a wider color space, but the UHD sets you're buying today won't support it any more than the HD sets you buy today.
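To put rough numbers on the bit depth point (an illustration only, using the standard studio-swing code ranges): 8-bit video has 219 steps from reference black to reference white and 10-bit has 876, so each gradation is about four times finer. That is where the smoother gradients come from; the color space is a separate question entirely.

Code:
# Illustrative only: count the legal luma gradations for 8-bit vs 10-bit video-level ranges.
ranges = {"8-bit": (16, 235), "10-bit": (64, 940)}
for label, (black, white) in ranges.items():
    steps = white - black                     # number of gradations between black and white
    print(label, steps, "steps, step size =", round(1 / steps, 5))
# 8-bit:  219 steps, step size = 0.00457
# 10-bit: 876 steps, step size = 0.00114  (about 4x finer, same color space either way)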