

Registered · 1,287 Posts · Discussion Starter #1
Let's be honest: the "Premium" Q90T has only about 100 dimming zones, not the 480 zones of the Q90R. They downgraded the entire Samsung line. They also redesigned the Tizen OS to be less animated and look less premium; obviously they downgraded the GPU so it can't handle the smooth OS animations anymore.

The LG CX I don't even want to discuss. I'm heartbroken; IMO it will likely be recalled. It's the first LG OLED to be a technical downgrade from the previous year's model.

Sony is laughable: the X900H should be a star, but with 30-48 dimming zones it's a joke. At least it has ATSC 3.0 and 48 Gbps HDMI 2.1 confirmed.

Vizio, meanwhile, has increased its dimming zone counts, added full 48 Gbps HDMI 2.1 across its whole product line, and even added new 55" and 65" OLED models.
 

Registered · 124 Posts
The reality is, you don't know anything about these sets yet! Companies make promises, and promises get broken or don't pan out nearly as well as claimed. Also, zone count doesn't always mean strong performance. For example, Samsung's 120 zones in the Q90T could outperform Vizio thanks to a much better algorithm.
 

Registered · 3,039 Posts
The reality is, you don't know anything about these sets yet! Companies make promises, and promises get broken or don't pan out nearly as well as claimed. Also, zone count doesn't always mean strong performance. For example, Samsung's 120 zones in the Q90T could outperform Vizio thanks to a much better algorithm.
Vizio has been using local dimming a lot longer than Samsung.

Sent from my LGMP450 using Tapatalk
 

Registered · 157 Posts
Dimming zone count means nothing by itself: the TCL Mini LED got trashed by the Q90R, which has only half the zones. The algorithm is equally important.
Also, 48 Gbps on a 4K set isn't needed; that's why even LG turned it down to 40 Gbps on their 4K sets.



The thing about the Tizen OS is hopefully a bad joke on your part? A TV only has a System on a Chip, and just because they decided to remove some animations you can't jump to the conclusion that the SoC is downgraded. I mean, wtf dude?
 

Registered · 3,039 Posts
Dimming zone count means nothing by itself: the TCL Mini LED got trashed by the Q90R, which has only half the zones. The algorithm is equally important.
Also, 48 Gbps on a 4K set isn't needed; that's why even LG turned it down to 40 Gbps on their 4K sets.



The thing about the Tizen OS is hopefully a bad joke on your part? A TV only has a System on a Chip, and just because they decided to remove some animations you can't jump to the conclusion that the SoC is downgraded. I mean, wtf dude?
48 Gbps adds 12-bit support; 4K or 8K, it doesn't matter.
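For rough numbers behind the 40 vs. 48 Gbps argument, here is a back-of-the-envelope sketch. It assumes the standard CTA-861 full-blanking 4K timing (4400 x 2250 total pixels) and HDMI 2.1 FRL's 16b/18b line coding; real links also lean on reduced-blanking timings, chroma subsampling, and DSC, which this ignores.

```python
# Back-of-the-envelope HDMI 2.1 bandwidth check (illustrative only).
# Assumes CTA-861 full-blanking 4K timing (4400 x 2250 total pixels) and
# FRL's 16b/18b line coding; reduced blanking and DSC would change the numbers.

def raw_video_gbps(h_total, v_total, refresh_hz, bits_per_component, components=3):
    """Uncompressed pixel data rate in Gbps, including blanking."""
    return h_total * v_total * refresh_hz * bits_per_component * components / 1e9

def frl_payload_gbps(link_gbps):
    """Approximate usable payload after 16b/18b FRL encoding."""
    return link_gbps * 16 / 18

for bpc in (8, 10, 12):
    need = raw_video_gbps(4400, 2250, 120, bpc)
    print(f"4K120 RGB {bpc}-bit: ~{need:.1f} Gbps "
          f"(40 Gbps link carries ~{frl_payload_gbps(40):.1f}, "
          f"48 Gbps carries ~{frl_payload_gbps(48):.1f})")
```

By these crude numbers, 12-bit RGB at 4K120 sits right at the edge of even a 48 Gbps link and 10-bit is marginal on 40 Gbps, which is why the real answer depends on blanking and chroma format as much as on the headline link rate.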

Sent from my LGMP450 using Tapatalk
 

Registered · 3,039 Posts
Does a 10-bit TV benefit from 12-bit processing? In other words, will your TV be able to output the 12 bits transmitted to it?
Yes, especially with Sony and WB content that is mastered on the Dolby Pulsar, which has a 12-bit panel.

Dolby Vision carries data from the 12-bit master that a DV-enabled display with a 10-bit panel can process.



Sent from my LGMP450 using Tapatalk
 

Registered · 919 Posts
Yes, especially with Sony and WB content that is mastered on the Dolby Pulsar, which has a 12-bit panel.

Dolby Vision carries data from the 12-bit master that a DV-enabled display with a 10-bit panel can process.



Sent from my LGMP450 using Tapatalk
It will still output it in 10-bit; that's your display's shortcoming. What is the benefit of mastering content at 12-bit color depth when your display can't utilize it?
There's no 12-bit display for consumers on the market.
 

Registered · 3,039 Posts
It will still output it in 10-bit; that's your display's shortcoming. What is the benefit of mastering content at 12-bit color depth when your display can't utilize it?

There's no 12-bit display for consumers on the market.
Enhanced detail, smoother gradation, and little to no posterization. Tone mapping with no visible artifacts when the peak nits of the master exceed the consumer display. The metadata is based on that 12-bit master.

Vizio uses 12-bit processing (maybe higher this year), Sony uses 14-bit processing, and I don't know what Samsung, LG, or Panasonic use. I wouldn't be surprised if Panasonic displays use 14-bit processing.
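As a rough illustration of what a display has to do when the master's peak brightness exceeds its own, here is a minimal tone-mapping sketch. The soft-knee curve, the 4000-nit master, and the 1000-nit display are arbitrary illustrative choices; Dolby Vision's actual curves and per-scene metadata handling are proprietary and not reproduced here.

```python
# Minimal tone-mapping sketch: compress a 4000-nit master range onto a
# 1000-nit display. The knee point and roll-off shape are arbitrary
# illustrations, not Dolby Vision's actual (proprietary) processing.

def tone_map(nits_in, master_peak=4000.0, display_peak=1000.0, knee=0.75):
    """Map scene luminance (nits) to display luminance (nits).

    Below knee * display_peak the signal passes through unchanged; above it,
    highlights are rolled off smoothly so master_peak lands at display_peak.
    """
    knee_nits = knee * display_peak
    if nits_in <= knee_nits:
        return nits_in
    # How far we are from the knee toward the master's peak (0..1).
    t = (nits_in - knee_nits) / (master_peak - knee_nits)
    # Ease the highlights into the remaining headroom above the knee.
    return knee_nits + (display_peak - knee_nits) * (1 - (1 - t) ** 2)

for nits in (100, 500, 750, 1000, 2000, 4000):
    print(f"{nits:5d} nits in the master -> {tone_map(nits):7.1f} nits on the display")
```

The dynamic metadata's job is to tell the display where the detail in those highlights actually lives, scene by scene, so a curve like this can be tailored rather than applied one-size-fits-all.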

Sent from my LGMP450 using Tapatalk
 

Registered · 919 Posts
Enhanced detail, smoother gradation, and little to no posterization. Tone mapping with no visible artifacts when the peak nits of the master exceed the consumer display. The metadata is based on that 12-bit master.

Vizio uses 12-bit processing (maybe higher this year), Sony uses 14-bit processing, and I don't know what Samsung, LG, or Panasonic use. I wouldn't be surprised if Panasonic displays use 14-bit processing.

Sent from my LGMP450 using Tapatalk
Processing and receiving are one thing; utilizing and displaying are another. A 720p TV will receive a 1080p signal but will output 720p because of the panel limitation. How many true 10-bit panels are there, and when did we start to see them?
Yes, processing will give you smoother gradation, but it's still 10- or 8-bit color depth, and sometimes it may cause banding.
It's TVs we're talking about here, not pro monitors.
 

Registered · 3,039 Posts
Processing and receiving are one thing; utilizing and displaying are another. A 720p TV will receive a 1080p signal but will output 720p because of the panel limitation. How many true 10-bit panels are there, and when did we start to see them?

Yes, processing will give you smoother gradation, but it's still 10- or 8-bit color depth, and sometimes it may cause banding.

It's TVs we're talking about here, not pro monitors.
I currently don't know of any display on the market that uses 8-bit+FRC for HDR, at least not any display released after 2015.

Even if we only get enhancements to what is a 10-bit encode, those enhancements give DV content the ability to get as close as possible to the master. That enhancement isn't possible without the 12-bit grade, dynamic metadata, and DV processing.
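For a sense of why extra precision upstream can still help a 10-bit panel, here is a toy comparison of a smooth 12-bit gradient rounded straight to 10 bits versus the same gradient dithered down to 10 bits. The dithering here is a generic signal-processing illustration, not what any particular TV or the DV pipeline actually implements.

```python
# Toy illustration: a smooth 12-bit gradient squeezed into 10 bits.
# Straight rounding collapses it into a few flat bands; adding a little noise
# before rounding (dithering) spreads the error so that, averaged over an area
# (as the eye does), the original gradient is preserved.
# Generic sketch only -- not any vendor's actual pipeline.
import numpy as np

rng = np.random.default_rng(0)

# A gentle ramp spanning 16 twelve-bit codes, i.e. only 4 ten-bit codes --
# the kind of sky or shadow gradient where banding shows up.
ramp = np.linspace(2000.0, 2016.0, 100_000)    # 12-bit code values

truncated = np.round(ramp / 4) * 4             # plain 12 -> 10 bit, back on the 12-bit scale
dither = rng.uniform(-2.0, 2.0, ramp.size)     # +/- half a 10-bit step of noise
dithered = np.round((ramp + dither) / 4) * 4

def patch_means(x, n=1000):
    """Average over patches -- roughly what the eye integrates over an area."""
    return x.reshape(-1, n).mean(axis=1)

err_t = np.abs(patch_means(truncated) - patch_means(ramp)).mean()
err_d = np.abs(patch_means(dithered) - patch_means(ramp)).mean()
print(f"distinct output levels: {np.unique(truncated).size} truncated vs {np.unique(dithered).size} dithered")
print(f"average patch error (12-bit steps): {err_t:.2f} truncated vs {err_d:.2f} dithered")
```

The dithered version tracks the original gradient far more closely when averaged over an area, which is the kind of headroom a 12-bit grade gives the encoder and the display's processing to work with.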

Sent from my LGMP450 using Tapatalk
 

Registered · 824 Posts
Let's see how they scale and deal with flaring, banding, DSE, etc. before we worry about zone count and which processor they use. Those issues have been more perceptible in actual content than the number of zones. Coding and build quality mean more than the parts used or the number of them.

When Sony and Samsung went to wide viewing angles, they discovered that very high zone counts were negated by the dispersion layer, so they decided not to waste the money. You can argue the value of dispersion vs. contrast, but that's a choice, not a downgrade.
 

Registered · 425 Posts
When Sony and Samsung went to wide viewing angles, they discovered that very high zone counts were negated by the dispersion layer, so they decided not to waste the money. You can argue the value of dispersion vs. contrast, but that's a choice, not a downgrade.
Precisely this! Off-angle viewing with high numbers of local dimming zones was the issue. Think about it: a TV has an LCD layer that creates the color, and behind that layer, possibly as much as 1/4 to 1/2 an inch back, sits the backlight, whose light gets filtered by the LCD layer in front of it to produce the color you see. This works great if you are sitting directly in front of the TV; everything lines up, and having more individual backlights controlling ever smaller groups of pixels in the LCD layer is great, since it allows ever more detailed control of luminance across the image.


Now shift the viewing position off angle. All of a sudden, the backlight zone you see behind a given pixel is no longer the one directly behind it: move off to the left side of the screen, and the light coming through that part of the LCD layer is from a zone offset to the right, not the one straight behind it. You can try to mitigate this by placing the backlights in little cone-shaped structures so off-angle viewers can't see the neighboring LED, but then the display simply looks dimmer as you go off angle, since the light is effectively collimated straight out of the display, creating an even bigger off-angle headache than you started with.


So the practical solution is to do some math: given how far back the LED layer sits from the LCD layer and the viewing angles you want to support, work out how close together the LEDs can be so that the zone controlling a pixel's luminance is almost always the one actually seen, no matter the angle it is viewed from. That yields a zone count determined by how large the TV is and by the gap between the two layers as manufactured. You can increase the number of zones by decreasing the space between the layers.
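Putting rough numbers on that geometry: the parallax between the LCD layer and the backlight at viewing angle theta is roughly gap x tan(theta), and the zone pitch has to stay comfortably larger than that shift. The 60-degree viewing angle, the factor-of-two margin, and the gap values below are all made-up illustrative numbers, not any manufacturer's figures.

```python
# Parallax-limited zone count sketch. At viewing angle theta, the backlight
# zone visible "behind" a pixel shifts sideways by gap * tan(theta); keeping
# the zone pitch larger than that shift caps how many zones fit on the panel.
# The gaps, the 60-degree angle, and the 2x margin are illustrative only.
import math

def max_zones(panel_width_mm, panel_height_mm, gap_mm, view_angle_deg=60.0):
    shift = gap_mm * math.tan(math.radians(view_angle_deg))   # parallax at that angle
    pitch = 2 * shift                                         # keep pitch well above the shift
    cols = int(panel_width_mm // pitch)
    rows = int(panel_height_mm // pitch)
    return cols * rows

# A 65" 16:9 panel has roughly 1430 x 800 mm of active area.
for gap_mm in (12.7, 6.35, 1.0):        # ~1/2", ~1/4", and a mini-LED-like gap
    print(f"gap {gap_mm:5.2f} mm -> about {max_zones(1430, 800, gap_mm):,} usable zones")
```

Shrinking the gap between the layers is the one lever that raises that cap, which is the point the next paragraph makes about mini/micro LED.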


This is what the mini/micro LED based TV tech is addressing. Samsung's "The Wall" and "The Window" eliminate the gap between the layers by removing the LCD layer entirely and using the LEDs to directly produce the color as well as the light intensity. TCL's Vidrian tech (their name for mini-LED) removes much of the gap between the LED and LCD layers. That is what will let you have more hardware backlight zones. Without it, more zones can result in a worse picture: a pixel may be trying to be bright white, but the backlight zone actually providing its light from your viewing angle may be off entirely because it is trying to show the black of space, while the white pixel next to it is supposed to be showing a star, and in the end all you see is dark gray bloom and black, with no stars...


P.S. This is why I have been so pissed that it is taking so long for micro LED TVs to hit the market. I get that a lot has happened in the roughly 4 years they have been demoing it and probably 7 years they have been developing it. We went from 1080p to 8K during that time, which meant the number of LEDs in a TV expanded at a staggering pace, meaning the size of each LED needed to decrease dramatically and the manufacturing defect rate per LED had to drop drastically as well (with more LEDs required per panel, if the defect rate per LED did not decrease, the number of defective panels would rise roughly in proportion to the extra LEDs; just going from 1080p to 4K is a 4x increase in LED count, and 1080p to 8K is 16x). Again, I get why it is taking so long, but I still just wish we would see a true consumer-class TV and not just the commercial-class advertising displays that have been produced so far.
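To make the defect-rate point concrete, here is a toy yield calculation. It assumes one LED per subpixel, independent failures, no repair or redundancy (real production lines use both), and an arbitrary 90% panel-yield target.

```python
# Rough yield math for direct-emissive (micro LED) panels: if each LED fails
# independently with probability p, a panel with n LEDs is defect-free with
# probability (1 - p) ** n. The 90% panel-yield target is an arbitrary example.

def per_led_defect_rate(n_leds, target_panel_yield=0.9):
    """Largest per-LED defect probability that still hits the target panel yield."""
    return 1 - target_panel_yield ** (1 / n_leds)

for name, pixels in (("1080p", 1920 * 1080), ("4K", 3840 * 2160), ("8K", 7680 * 4320)):
    n_leds = pixels * 3                 # one LED per red, green, and blue subpixel
    print(f"{name}: {n_leds:,} LEDs -> per-LED defect rate must stay below "
          f"{per_led_defect_rate(n_leds):.1e} for 90% panel yield")
```

Since the tolerable per-LED defect rate scales roughly as 1/n, the 16x jump in LED count from 1080p to 8K demands roughly a 16x improvement in per-LED defect rates just to hold panel yield constant.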
 

Registered · 2,446 Posts
I haven't paid much attention to the lineups this year, what with all that's going on, and also because they seem pretty boring without much changed. But isn't this what happens here every year? People tout the virtues of some great "hero" TV or manufacturer every year: "Don't buy (x), just wait until (x) comes out, it's going to be the giant killer." Then it comes out, there's a honeymoon period, more sets make it into the wild, reviews come in, issues become apparent, and it's "wait until next year." Be wary of "spec sheet" TVs, as it seems like they often can't get the basics right.
 

Registered · 225 Posts
It will still output it in 10-bit; that's your display's shortcoming. What is the benefit of mastering content at 12-bit color depth when your display can't utilize it?
There's no 12-bit display for consumers on the market.
There are benefits to mastering content at a higher precision. If you take a 4K video and downsample it to 1080p, it looks better than video shot at 1080p; similarly, Dolby Vision gets downsampled to 10-bit and looks better than plain 10-bit HDR.
 

Registered · 612 Posts
I haven't paid much attention to the lineups this year, what with all that's going on, and also because they seem pretty boring without much changed. But isn't this what happens here every year? People tout the virtues of some great "hero" TV or manufacturer every year: "Don't buy (x), just wait until (x) comes out, it's going to be the giant killer." Then it comes out, there's a honeymoon period, more sets make it into the wild, reviews come in, issues become apparent, and it's "wait until next year." Be wary of "spec sheet" TVs, as it seems like they often can't get the basics right.
Yes. "Wait until the next ..." is what happens all the time. It's also what keeps manufacturers moving ahead. If everyone were willing to accept what's currently available, there'd be no reason to make progress and no reason for makers to compete for the sale. As for what happens after a TV/device is purchased, users give reviews and other feedback that also keeps the needle moving. Without it, we'd all still be watching Sony Trinitron tube TVs. In its time, it was great, but it got bested by something else from numerous makers. Now Sony is one of many players and not the 800 lb gorilla it once was. That's good for purchasers. I mean, how many of us actually owned that $3k, 300 lb+ tube TV (I actually did)? And I've gotten better TVs every time I've purchased since then because of the "what's next" drum that keeps beating.
 

Registered · 4,432 Posts
The "wait till next year" scenario has been around forever. However, once every decade or so, that wait really is worthwhile. My question is: at this moment, what is coming up that is worth waiting for?
 

Registered · 1,032 Posts
I don't think we will see these Vizios until the fourth quarter. I'm sure that if all the other manufacturers waited until the year was almost over to release their televisions, they could also offer a lot of these features, but Vizio is the slow roller.

Plus, doesn't Sony have a number of televisions coming out in June anyway?

And Hisense has their weird dual-panel thing inbound also? That is totally new and revolutionary tech.
 