Official Sony KDL-55W900A Owners Thread - Page 187 - AVS Forum | Home Theater Discussions And Reviews



post #5581 of 5687 Old 03-28-2016, 07:50 AM
AVS Special Member
 
steve1971's Avatar
 
Join Date: Jul 2008
Location: Saint Paul, Minnesota.
Posts: 2,674
Mentioned: 2 Post(s)
Tagged: 0 Thread(s)
Quoted: 444 Post(s)
Liked: 408
Quote:
Originally Posted by helvetica bold View Post
Hello fellow W9 owners. I know we don't have much to talk about these days. However, yesterday I saw Sony's new 930D at Best Buy, and it reminded me of the first time I saw the W9 in person. The PQ is very impressive, with vivid colors and amazing contrast thanks to the HDR footage on display. While I'm not upgrading this year (most likely next year), I am excited to see the new technology mature.


Sent from my iPad using Tapatalk

I too saw a 930D this past weekend at my local BB, helvetica bold. And even though the PQ impressed me, along with the HDR footage they were running, it didn't make me want to dump my W900A for it. I asked the salesperson to let me tinker with it and got flat-out told no! All I was going to do was get it off Store/Vivid mode and adjust the settings to how it would be in my home, but again, nope! My wife wasn't too happy with him, but I told her to forget about it, because why bother? Anyway, I don't plan on upgrading to anything when it comes to TVs anytime soon; maybe in 5 years' time, and when that time comes it may be a projection system or OLED. Depends on where we are tech-wise. The only thing I plan on adding to my HT setup is a subwoofer, and after that I am done. The 930D is an impressive TV by Sony, no doubt, but to me it isn't impressive enough to make the jump from what I already have.
steve1971 is offline  
post #5582 of 5687 Old 04-08-2016, 08:21 AM
AVS Special Member
 
steve1971's Avatar
 
Join Date: Jul 2008
Location: Saint Paul, Minnesota.
Posts: 2,674
Mentioned: 2 Post(s)
Tagged: 0 Thread(s)
Quoted: 444 Post(s)
Liked: 408
I see this thread has gone to sleep, so I am going to revive it by saying a funny thing happened just the other day. We had some friends over, and one of them asked if he could turn on my set. He did, and the first thing out of his mouth was, "This is a damn nice 4K TV!" He nearly fell out of his chair when I told him it wasn't 4K but a 1080p set! The look on his face was priceless, and he owns a 4K TV himself. Made my freakin' day!
steve1971 is offline  
post #5583 of 5687 Old 04-08-2016, 11:16 PM
Senior Member
 
delfincek's Avatar
 
Join Date: Sep 2009
Location: Slovenia
Posts: 201
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 26 Post(s)
Liked: 52
A nice example, and a confirmation of the blind tests showing that for the regular user, 4K is total hype.
steve1971 likes this.
delfincek is offline  
post #5584 of 5687 Old 04-08-2016, 11:43 PM
Member
 
Didee's Avatar
 
Join Date: Aug 2013
Posts: 84
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 39 Post(s)
Liked: 39
Wait, that's not fair. 4K panels definitely DO render a much finer image.

... well, IF you have your nose as close as 50 cm (or 20") to the screen.

(Some random folks with physiological eagle eyes at 150% eyesight are exempted, but the vast majority of the population in the Western world doesn't even fully reach 100% eyesight, me included.)
steve1971 likes this.
Didee is offline  
post #5585 of 5687 Old 04-09-2016, 07:41 AM
Senior Member
 
delfincek's Avatar
 
Join Date: Sep 2009
Location: Slovenia
Posts: 201
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 26 Post(s)
Liked: 52
Yes, I agree. 4K is all about how close you sit. But I don't know anybody who sits 1.5 m (5 ft) away when watching a 55" 4K TV. They usually put it 3-5 m away. I'm sitting 3 m away from my Sony W905. And we don't even need to mention the TV programme sources, which here in Europe are still mostly SD quality. But OK, I can ignore TV watching, as I primarily use my Sony to watch HD movies, and at that it really shines when set up properly.

best, d.
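For anyone curious, the geometry behind the viewing-distance argument is easy to check in a few lines of Python (a rough sketch, using the common rule of thumb that 20/20 vision resolves about 1 arc-minute; the 55"/3 m numbers are from the posts above):

```python
import math

def pixel_arcmin(diag_in: float, h_pixels: int, distance_m: float) -> float:
    """Angular size of one pixel, in arc-minutes, for a 16:9 panel."""
    width_m = diag_in * 0.0254 * 16 / math.hypot(16, 9)  # panel width from the diagonal
    pitch_m = width_m / h_pixels                          # pixel pitch
    return math.degrees(math.atan(pitch_m / distance_m)) * 60

# 55" panel viewed from 3 m
print(pixel_arcmin(55, 1920, 3.0))  # 1080p pixel: ~0.73 arcmin
print(pixel_arcmin(55, 3840, 3.0))  # 4K pixel: ~0.36 arcmin
```

Both values are already below the ~1 arc-minute acuity limit, i.e. at 3 m a viewer with 20/20 vision can't resolve individual 1080p pixels on a 55" panel, let alone 4K ones.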
steve1971 likes this.

Last edited by delfincek; 04-09-2016 at 07:44 AM.
delfincek is offline  
post #5586 of 5687 Old 04-09-2016, 07:55 AM
AVS Special Member
 
steve1971's Avatar
 
Join Date: Jul 2008
Location: Saint Paul, Minnesota.
Posts: 2,674
Mentioned: 2 Post(s)
Tagged: 0 Thread(s)
Quoted: 444 Post(s)
Liked: 408
Quote:
Originally Posted by delfincek View Post
One of the nice examples and a confirmation of the blind test that 4K for regular user is a total hype.
I agree, D. For the regular user 4K is all hype, and to me it's pretty sad when a person who has a 4K TV comes over to my home and thinks that my W900A is a 4K set. To me that speaks volumes, and it also tells me how good my W9 truly is.

Quote:
Originally Posted by Didee View Post
Wait, that's not fair. 4k panels definetly DO make a much finer image rendering.

... well, IF you have your nose as close as 50cm (or 20") to the screen.

(some random guys with physiological eagle-eyes @150% eyesight exempted, but the vast majority of population in the western world doesn't even fully reach 100% eyesight, me too.)
Yes, the larger 4K panels do make a difference for finer image rendering, Didee; that I won't argue with at all. That's why if I ever do go 4K, it will be a projection system, a 65- to 75-inch LED TV, or OLED. Anything smaller, forget it. I know the 4K guys claim you can see the difference between a 1080p set and a 4K set at 55 inches, but if that's the case, then why didn't the guy who came over to my home know right away that my W9 wasn't 4K, when he thought it was? Hell, even the guys at BB say that to truly appreciate 4K you need a set 65 inches or larger, and some of the top-rated TV reviewers say the same thing. Can they all be wrong?
steve1971 is offline  
post #5587 of 5687 Old 04-09-2016, 09:32 AM
Advanced Member
 
helvetica bold's Avatar
 
Join Date: Oct 2004
Location: NYC
Posts: 871
Mentioned: 0 Post(s)
Tagged: 1 Thread(s)
Quoted: 273 Post(s)
Liked: 151
It's true that 4K alone doesn't make a huge difference. However, combined with HDR (10-12 bit) and wide color gamut, I think there is a difference.
I've already watched a few movies at a Dolby Cinema: The Martian, The Force Awakens, and Deadpool, and they all looked amazing. I especially thought The Force Awakens looked great. But I don't think there's a new TV out yet that would make me replace my W9.
My dream TV would be a Sony 65-inch W9 with 4K, FALD with Dolby Vision, and a 10- or 12-bit screen with IQ Color (quantum dots).
steve1971 likes this.
helvetica bold is offline  
post #5588 of 5687 Old 04-09-2016, 11:20 AM
AVS Special Member
 
steve1971's Avatar
 
Join Date: Jul 2008
Location: Saint Paul, Minnesota.
Posts: 2,674
Mentioned: 2 Post(s)
Tagged: 0 Thread(s)
Quoted: 444 Post(s)
Liked: 408
Quote:
Originally Posted by helvetica bold View Post
It's true that 4K alone doesn't make a huge difference. However, combined with HDR (10-12 bit) and wide color gamut, I think there is a difference.
I've already watched a few movies at a Dolby Cinema: The Martian, The Force Awakens, and Deadpool, and they all looked amazing. I especially thought The Force Awakens looked great. But I don't think there's a new TV out yet that would make me replace my W9.
My dream TV would be a Sony 65-inch W9 with 4K, FALD with Dolby Vision, and a 10- or 12-bit screen with IQ Color (quantum dots).

I agree with you. There isn't a TV out right now that makes me want to replace my W9 either, and until one does come out, that's the way it's going to stay for me. Second, my dream TV would also be a 65-inch W9 with 4K, FALD with Dolby Vision, and a 10- or 12-bit panel with Color IQ. That would be my ultimate TV, but who am I kidding? That's what they call a pipe dream, so I guess I'm stuck with my W9, and I don't, and never will, have a problem with that.
steve1971 is offline  
post #5589 of 5687 Old 04-09-2016, 01:24 PM
Member
 
Didee's Avatar
 
Join Date: Aug 2013
Posts: 84
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 39 Post(s)
Liked: 39
Regarding the new HDR Blu-ray format: I've seen one
(sorry, it's a review in German, but the pictures speak for themselves), and I'm wondering what-the-fu** is going on there. When fed original HDR Blu-rays, those UHD TVs run at full tilt with some 400 watts of power draw, but produce only poor brightness?!? What's going on there? Erroneous mastering? Errors in device communication about metadata? Or a misconception in the format itself?
Anyone have an idea of what IS going on there?
Didee is offline  
post #5590 of 5687 Old 04-16-2016, 08:32 PM
Member
 
elcubano1's Avatar
 
Join Date: Aug 2007
Posts: 27
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 14 Post(s)
Liked: 13
So I've messed up my W900A; it won't turn on anymore after a spark flew out of the ventilation holes in the back. I attempted to plug my video card's HDMI into an HDMI input, and that was the end of my set. Has anyone heard of this scenario before? I do have a 5-year Protection Plus warranty on it, but from this thread I keep reading that Sony discontinued the model and won't be sending the original back; most likely a different model with less stellar quality. What should I do?
elcubano1 is offline  
post #5591 of 5687 Old 04-16-2016, 09:01 PM
Advanced Member
 
helvetica bold's Avatar
 
Join Date: Oct 2004
Location: NYC
Posts: 871
Mentioned: 0 Post(s)
Tagged: 1 Thread(s)
Quoted: 273 Post(s)
Liked: 151
Didee and co., did you guys see this article? It's pretty interesting.

http://www.hdtvtest.co.uk/news/4k-vs-201604104279.htm

This quote I find interesting.
"Contrary to popular belief, the purpose of HDR (high dynamic range) mastering is to expand the available luminance range rather than elevate the overall brightness of HDR videos."


Sent from my iPhone using Tapatalk
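That quote lines up with how the PQ curve (SMPTE ST 2084, the transfer function used for HDR10) spends its code values. A quick sketch of the standard EOTF in Python shows that half of the code range covers only the first ~92 nits:

```python
import math

# SMPTE ST 2084 (PQ) EOTF constants, as defined in the standard
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_to_nits(code: float) -> float:
    """Map a normalized PQ code value (0..1) to absolute luminance in cd/m2."""
    e = code ** (1 / M2)
    return 10000 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)

for v in (0.25, 0.5, 0.75, 1.0):
    print(f"code {v:.2f} -> {pq_to_nits(v):9.1f} nits")
```

The bottom half of the signal range is devoted to roughly 0-92 nits (ordinary SDR territory); everything above is headroom reserved for highlights, which is exactly "expanding the available luminance range rather than elevating the overall brightness".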
Jorgens and steve1971 like this.
helvetica bold is offline  
post #5592 of 5687 Old 04-17-2016, 06:52 AM
Newbie
 
frankiek3's Avatar
 
Join Date: Dec 2013
Posts: 11
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 4 Post(s)
Liked: 13
Quote:
Originally Posted by elcubano1 View Post
So I've messed up my W900A; it won't turn on anymore after a spark flew out of the ventilation holes in the back. I attempted to plug my video card's HDMI into an HDMI input, and that was the end of my set. Has anyone heard of this scenario before? I do have a 5-year Protection Plus warranty on it, but from this thread I keep reading that Sony discontinued the model and won't be sending the original back; most likely a different model with less stellar quality. What should I do?
If Sony won't guarantee the same set repaired or a 'better' set (a larger 4K, like the 65/85" X950B from 2014), and you really want to keep the W900A (I wouldn't blame you), you can have it repaired by a third party, or if Sony will sell you replacement parts, you could DIY.

For comparison: last year I repaired an LG that blew out due to a lightning storm. The surge most likely came in through the HDMI port of a Mac Mini and fried components on the input board. I replaced that (~$40) and it works fine now. Troubleshooting to locate the failed components is a big part of the job. The LG had a test-picture jumper that, when shorted at power-on, displayed a test pattern; this let us know that the display and main board were working.
elcubano1 and steve1971 like this.
frankiek3 is offline  
post #5593 of 5687 Old 04-17-2016, 02:18 PM
Member
 
elcubano1's Avatar
 
Join Date: Aug 2007
Posts: 27
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 14 Post(s)
Liked: 13
Quote:
Originally Posted by frankiek3 View Post
If Sony won't guarantee the same set repaired or a 'better' set (a larger 4K, like the 65/85" X950B from 2014), and you really want to keep the W900A (I wouldn't blame you), you can have it repaired by a third party, or if Sony will sell you replacement parts, you could DIY.

For comparison: last year I repaired an LG that blew out due to a lightning storm. The surge most likely came in through the HDMI port of a Mac Mini and fried components on the input board. I replaced that (~$40) and it works fine now. Troubleshooting to locate the failed components is a big part of the job. The LG had a test-picture jumper that, when shorted at power-on, displayed a test pattern; this let us know that the display and main board were working.
Thanks for the advice. I might end up just replacing both the power supply and the main board off eBay, since the TV doesn't make that clicking sound when you plug it into an outlet. If I remove that zip tie on the back of the TV next to the power cable, will that definitely void my warranty?
elcubano1 is offline  
post #5594 of 5687 Old 04-17-2016, 03:07 PM
Newbie
 
frankiek3's Avatar
 
Join Date: Dec 2013
Posts: 11
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 4 Post(s)
Liked: 13
Quote:
Originally Posted by elcubano1 View Post
Thanks for the advice. I might end up just replacing both the power supply and the main board off eBay, since the TV doesn't make that clicking sound when you plug it into an outlet. If I remove that zip tie on the back of the TV next to the power cable, will that definitely void my warranty?
It looks like the zip tie base would push out if the back of the TV were off, like a plastic push fastener. (Ripping it off might break it.)

Also check if you purchased it with a credit card, whether the card provides a warranty, and if it covers third-party repair.

Wow, eBay has a W900A for $1000, seller-refurbished (it says authorized by Sony).
steve1971 likes this.
frankiek3 is offline  
post #5595 of 5687 Old 04-17-2016, 05:05 PM
AVS Special Member
 
steve1971's Avatar
 
Join Date: Jul 2008
Location: Saint Paul, Minnesota.
Posts: 2,674
Mentioned: 2 Post(s)
Tagged: 0 Thread(s)
Quoted: 444 Post(s)
Liked: 408
Quote:
Originally Posted by helvetica bold View Post
Didee and co., did you guys see this article? It's pretty interesting.

http://www.hdtvtest.co.uk/news/4k-vs-201604104279.htm

This quote I find interesting.
"Contrary to popular belief, the purpose of HDR (high dynamic range) mastering is to expand the available luminance range rather than elevate the overall brightness of HDR videos."


Sent from my iPhone using Tapatalk


I just re-read the article, and did I see a line saying HDR 4K Blu-ray might become a niche product? Wow!
steve1971 is offline  
post #5596 of 5687 Old 04-18-2016, 10:05 AM
Member
 
Didee's Avatar
 
Join Date: Aug 2013
Posts: 84
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 39 Post(s)
Liked: 39
I hadn't read that particular article; thanks for sharing. The point behind it is not surprising if you've read the Sony article (PDF) explaining the principle and idea behind HDR video.

However, somehow I'm still under the impression that a big part of the story simply lies in the general method of the mastering process.

Example #1: for the differences shown in the Sony paper above, you can simply take all the SDR samples and lower (darken) their gamma value, and then they look much more similar to the HDR samples.

Example #2: regarding this comparison from the HDTVTest article:

[images: SDR vs. HDR frame comparison from the article]

Ah, okay, THIS is the difference between 8-bit SDR and 10-bit HDR. Look at the sun! Yippee!

But ... wait a moment. We can see this vast difference in a simple JPEG image, which has only 8-bit resolution, and neither the JPEG nor our monitors know anything about HDR. Which means this obvious difference could have been achieved with old-school 8-bit SDR technique. The point is that the 8-bit SDR video obviously was not mastered accordingly. It could have been, but wasn't.
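Didee's point about the JPEG is easy to quantify: the banding advantage of 10-bit is just finer quantization of the same range, nothing HDR-specific. Using the standard video-range ("legal") code limits:

```python
# Video-range code limits: 8-bit luma runs 16-235, 10-bit luma runs 64-940.
steps_8bit = 235 - 16    # 219 quantization steps across the luma range
steps_10bit = 940 - 64   # 876 quantization steps across the same range
print(steps_8bit, steps_10bit, steps_10bit / steps_8bit)  # ratio is exactly 4.0
```

So a 10-bit signal has exactly 4x finer gradation steps over the same range; whether a given frame actually shows the difference depends on how it was mastered, which is Didee's point.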
Jorgens and steve1971 like this.
Didee is offline  
post #5597 of 5687 Old 04-20-2016, 08:16 AM
AVS Special Member
 
steve1971's Avatar
 
Join Date: Jul 2008
Location: Saint Paul, Minnesota.
Posts: 2,674
Mentioned: 2 Post(s)
Tagged: 0 Thread(s)
Quoted: 444 Post(s)
Liked: 408
My wife and I were at BB last night and I'm sorry, guys, but none of the new 2016 TVs that are coming in are impressing me one bit. Not even the HDR models, and I thought that at first sight the HDR models would blow me away; honestly, I was hoping in some ways they would, but they didn't. So for me, 2016 is going to come and go yet again without any new TV replacing my W9. I'm just not impressed with what I saw or am seeing. My wife says I have hit a wall, because up until 2 years ago I would be replacing a TV every year, it seemed, but when I landed my W9 that all ended.

They are adding apps galore to these new TVs and I'm not interested. I want to watch TV, and if I want to check out apps I'll go to my computer or use my Sony S790 Blu-ray player for that.

I personally feel that with DV and HDR they are changing the look of films in ways the director never intended. To me, the "director's intent" of how he or she wants the film to look has been thrown by the wayside, so to speak. I know some of you will say that Hollywood is on board with it, and I say sure they are, because they will support whatever makes them money. But I do know that some big-time Hollywood directors do NOT like TV manufacturers messing with their films' intended look; they are just never mentioned, because that might change things, and god only knows you don't want to get in the way of a big payday.

Now, I am keeping an eye on OLED, but until someone other than LG gets into the OLED game, I'm not budging. I am hearing of Sony and Panasonic getting into OLED, though, so if they do, I'll be interested. But until that day, I'm staying put for another year, and I have no problem with that.
steve1971 is offline  
post #5598 of 5687 Old 04-22-2016, 06:11 PM
Member
 
Jorgens's Avatar
 
Join Date: Jun 2002
Posts: 152
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 36 Post(s)
Liked: 55
Quote:
Originally Posted by elcubano1 View Post
So I've messed up my W900A, it won't turn on anymore after a spark flew out the ventilation holes in the back. I attempted to plug in my Video Card's HDMI through an HDMI input and that was the end of my set. Has anyone heard of this scenario before? I do have a 5 year protection plus warranty on it but from this thread I keep reading that sony discontinued the model and won't be sending the original back. Most likely a different model with less stellar quality. What should I do?
If you are covered by a 5-year warranty and the indications are the TV died, I wouldn't mess with it in an attempt to repair it, because you risk losing your warranty. Speak to your warranty company (Sony or other?) and let them send out a technician to look at it first if possible. If they declare it dead, they might not even be bothered about it being sent back during the warranty resolution process (partial refund, or replacement with an "equivalent value" new TV).

As others have reported, Sony won't replace it with a 4K set (unless you maybe make a contribution to cover the difference in value they perceive?), and their current best 2K TV doesn't match it.
steve1971 likes this.
Jorgens is offline  
post #5599 of 5687 Old 04-23-2016, 02:21 AM
Member
 
Didee's Avatar
 
Join Date: Aug 2013
Posts: 84
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 39 Post(s)
Liked: 39
Side note: you should never plug or unplug HDMI connectors on active devices. It's a common misconception that this is safe ("I've done it a thousand times without harm"), but in fact it isn't. Sure, it's annoying, but to be *really* safe, HDMI connectors should only be plugged in while both end devices are powered off.
steve1971 likes this.
Didee is offline  
post #5600 of 5687 Old 04-23-2016, 06:24 PM
AVS Special Member
 
steve1971's Avatar
 
Join Date: Jul 2008
Location: Saint Paul, Minnesota.
Posts: 2,674
Mentioned: 2 Post(s)
Tagged: 0 Thread(s)
Quoted: 444 Post(s)
Liked: 408
A question for my fellow W9 owners: have you guys ever experienced what they call "the halo effect" while watching dark night scenes on your W9s? I have noticed it from time to time, with a halo around stars, etc. I didn't think this was supposed to happen with edge-lit sets. It doesn't bother me, mind you, but I was just wondering if some of you notice it on your W9s. Maybe it's because of the local dimming feature, I don't know. But I thought this only happens with FALD sets. Maybe I'm wrong.
steve1971 is offline  
post #5601 of 5687 Old 04-24-2016, 07:38 AM
Advanced Member
 
helvetica bold's Avatar
 
Join Date: Oct 2004
Location: NYC
Posts: 871
Mentioned: 0 Post(s)
Tagged: 1 Thread(s)
Quoted: 273 Post(s)
Liked: 151
I don't notice any halo effect on my W9. What do you have the LED dynamic control set to? Maybe that's causing it.


Sent from my iPhone using Tapatalk
steve1971 likes this.
helvetica bold is offline  
post #5602 of 5687 Old 04-25-2016, 06:56 AM
Member
 
Jorgens's Avatar
 
Join Date: Jun 2002
Posts: 152
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 36 Post(s)
Liked: 55
IMHO the one main weakness our 900A has is the lack of FALD, which would provide a better dark/black gradient and hence more scenery detail in dark video scenes. Blacks are very good in our set, and it has "deep black" thanks to the panel type and added coating layer, but using FALD instead of edge-mounted backlighting would have been a significant improvement (note: even Sony's 4K model from that same year did not use FALD, so we can't really complain something was "omitted"). IIRC, in the last few years Sony has only provided FALD in its largest flagship models (76" or 80"), but those sets are $10,000+ and much too big for normal home use for most of us.

For the new 2016 models coming out from the main brands, I have only seen one interesting model so far, the Panasonic DX900. It is a FALD set in both 58" and 65" sizes, and is the premium range of the Panasonic lineup for 2016 (one of the early reviews of the Euro version is here: http://www.hdtvtest.co.uk/news/tx58d...1604174282.htm ).
And even as a premium 4K set it has some downsides:
- poor upscaling of SD to 4K (even midrange Sony 4K sets do it much better)
- it doesn't use quantum dots or similar technology, instead using a red-phosphor LED which produces much less "clean" RGB primary colors (see the spectrograph in this review: http://www.hdfever.fr/2016/03/14/tes...et-tx-65dx900/ )
- video motion flow is Sony's strong point; Panasonic might not be able to match it (from early review indications)
- poor latency in game mode
- the calibrated black level is 0.023 cd/m2 (in comparison, our Sony 900's black level is 0.042 cd/m2)
- DCI-P3 coverage is 98% (in comparison, our Sony KDL-W900A = 94.7%)
- peak brightness = 1310 cd/m2 (but this Panasonic uses 350 watts in HDR mode!). In comparison, our Sony 900A = 239 cd/m2, and most HD LCD sets are usually 120-150 cd/m2 (note: the HDR standard aims for 1000 cd/m2, but most HDR sets showing HDR video usually don't reach higher than 350 cd/m2; levels between 350 and 500 cd/m2 would be very unpleasantly bright to look at)
- panel refresh rate = 120 Hz

Still, Panasonic produced high-quality plasma sets for many years; it is good to see them making a major push for quality in LCD now that they've stopped producing plasmas. On pure image quality (going by the few initial reviews of this set that exist so far) this new DX900 outperforms the competing Sony/Samsung/LG sets of similar 55-65" screen size (because those brands either use edge dimming, or even IPS panels from LG).

Going by pure image quality for video/TV (and ruling out OLED, which is still 50% more expensive and still has other unresolved issues), I would say this Panasonic DX900/DX902 is probably the most interesting 55-60" set so far for 2016. Let's hope this competition from Panasonic pushes Sony to incorporate FALD and other premium features in its 55/60" range in the future.

I am still extremely happy with the 900A, and given that 65% of all FTA TV is still SD here, and the limited HD broadcasts are either 720p or 1080i, the only thing limiting end-user video quality for me is the poor-quality signal being sent out, NOT the fact that we don't have 4K sets. Easy confirmation: when I watch a well-mastered Blu-ray with a high bit rate, the video on our 900As is absolutely stunning.

One interesting observation on that same French website: they report that when testing new 4K video players on 2K sets, the ultra-high-quality HDR source video from the 4K player (downscaled and converted from UHD Rec. 2020 to Rec. 709 for HD) produces a BETTER image on the 2K display than the same movie mastered in 2K (without HDR) played from a normal Blu-ray player. This makes sense, because the 4K video stream on the UHD player is probably 50 Mb/s, versus 25-30 Mb/s for most "good" 2K video, and all the extra data is not just for the higher resolution. (http://www.hdfever.fr/2016/04/24/la-...-uhd-et-1080p/ )

With the push to higher 4K resolutions and HDR, it is important to remember that more image improvement is gained from the wider color gamut, higher contrast, and better peak/low brightness ratio than from the higher resolution alone. Netflix, for example, has already indicated that when bandwidth drops for their 4K broadcasts, they prefer to reduce resolution to 2K rather than reduce the extra data used for video quality (because consumers notice it less). So there might even be some trickle-down effect for good current 2K sets, even if we delay upgrading while the technology and standards mature further.

Edit: corrected the black-level measurement for the KDL-W900A, and added peak brightness.
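As a side note, the static contrast ratios implied by the figures quoted above work out like this (simple division, using the review numbers as given):

```python
# peak brightness (cd/m2) and calibrated black level (cd/m2), per the quoted reviews
sets = {
    "Panasonic DX900":  (1310, 0.023),
    "Sony KDL-55W900A": (239, 0.042),
}
for name, (peak, black) in sets.items():
    print(f"{name}: {peak / black:,.0f}:1 static contrast")
```

i.e. roughly 57,000:1 for the DX900 versus roughly 5,700:1 for the W900A; most of that gap comes from the much higher peak brightness, not from the black floor.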
helvetica bold and steve1971 like this.

Last edited by Jorgens; 04-26-2016 at 02:51 AM.
Jorgens is offline  
post #5603 of 5687 Old 04-25-2016, 08:15 AM
AVS Special Member
 
steve1971's Avatar
 
Join Date: Jul 2008
Location: Saint Paul, Minnesota.
Posts: 2,674
Mentioned: 2 Post(s)
Tagged: 0 Thread(s)
Quoted: 444 Post(s)
Liked: 408
Quote:
Originally Posted by helvetica bold View Post
I don't notice any halo effect on my W9. What do you have the LED dynamic control set to? Maybe that's causing it.


Sent from my iPhone using Tapatalk
I fixed it, helvetica. I had my dynamic control set to High by accident. Stupid me! lol

Quote:
Originally Posted by Jorgens View Post
imho the one main weakness our 900a has is the lack of FALD which would provide a better dark/black gradient and hence more scenery detail in dark video scenes. [snip]
Excellent post, Jorgens. But it's funny that back in the day when our W9s first hit, some were reporting that our sets had better black levels than some of the FALD sets of the day.
steve1971 is offline  
post #5604 of 5687 Old 04-25-2016, 09:03 PM
Member
 
Jorgens's Avatar
 
Join Date: Jun 2002
Posts: 152
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 36 Post(s)
Liked: 55
Quote:
Originally Posted by steve1971 View Post
..... it's funny that back in the day when our W9s first hit, some were reporting that our sets had better black levels than some of the FALD sets of the day.....
IIRC there are 5 elements involved here:
- how black the panel can go when it stops emitting any light (this is OLED's claim to fame, but I think this aspect is overhyped as a tech spec)
- the maximum brightness it can produce (this is where the new HDR standard comes into play)
- the ratio between max bright and max dark/black = contrast ratio
- how small a section of the screen it can control this contrast ratio in: this is where FALD (using multiple sectors of backlighting in grids) is a big factor. OLED has the biggest advantage here, being able to control it very finely, per pixel
- how fine a gradient of dark/light tones it can reproduce: OLED/plasma/LCD technology, LCD panel type (IPS/VA) and quality, contrast ratio, and FALD vs. edge lighting all come into this.
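The zone-count point (and the halo effect steve asked about earlier) can be illustrated with a toy model, purely for intuition and not any manufacturer's actual algorithm: drive each backlight zone by its brightest pixel, and assume the LCD leaks ~1% of the zone's backlight through "black" pixels.

```python
def simulate_row(targets, zone_size, leakage=0.01):
    """targets: desired pixel luminance (0..1) for one row of pixels.

    Each zone's LED is driven by its brightest pixel; dimmer pixels are
    attenuated by the LCD but can't go below backlight * leakage.
    """
    displayed = []
    for z in range(0, len(targets), zone_size):
        zone = targets[z:z + zone_size]
        backlight = max(zone)  # zone LED follows its brightest pixel
        for t in zone:
            displayed.append(max(t, backlight * leakage) if t < backlight else t)
    return displayed

# one bright "star" in an otherwise black row of 8 pixels
row = [0.0] * 8
row[2] = 1.0

fine   = simulate_row(row, zone_size=1)  # one zone per pixel (OLED-like limit)
coarse = simulate_row(row, zone_size=8)  # a single zone (edge-lit extreme)

print(fine)    # black pixels stay at 0.0: their zones are fully dimmed
print(coarse)  # black pixels lifted to the leakage floor -> halo around the star
```

With one zone per pixel, the black pixels stay at zero; with a single zone, the star forces the whole row's backlight up, lifting every black pixel to the leakage floor, which is exactly the halo.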

a Cnet reviewer discusses backlighting technologies here: http://www.cnet.com/au/news/led-loca...ing-explained/
also interesting to reread the comparison of our 900a's with the previous year's flagship 2k sony HX950 (which didn't have quantum dots, but did have FALD)
- http://www.cnet.com/products/sony-xbr-hx950/
- http://www.cnet.com/au/products/sony-kdl-55w900a/2/

it is that slight difference in shadow and dark scene detail i am referring to that is improved with FALD. our 900a's have very good "deep black", but adding FALD would have given better control over light/dark contrast in different sections of the screen, and finer/better shades of grey/black

one reason sony didn't use FALD might be the quantum dot technology used in 2013/2014, which came as "strips" of quantum dots overlying the edge led backlight. not sure if the same company (using cadmium technology) ever produced it in sheets to overlay a FALD backlight.

also both edge and FALD technology have improved significantly over the years, so FALD from a few years ago might not be as "good" as the latest edge dimming

if i had to choose between the better color reproduction of the 900a and the previous year's FALD hx950, i would definitely choose our 900a. i'd just like the best of both worlds in one package, and is it worth spending another $2500 to get that last 5% of improvement? nope! but over the next few years, as technology improves further (and this new HDR standard is established and becomes mainstream in broadcasts), FALD will be on my spec sheet when determining my next tv choice
helvetica bold and steve1971 like this.
Jorgens is offline  
post #5605 of 5687 Old 04-26-2016, 06:32 AM
Member
 
Jorgens's Avatar
 
Join Date: Jun 2002
Posts: 152
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 36 Post(s)
Liked: 55
looking at the specs for black levels and peak brightness for our 900a, i came across this interesting comparison with the then (2013 - 2014) best samsung, LG (led/lcd) and panasonic (led/lcd) models, and the very good panasonic zt60 plasma

also interesting to note the comparison with our big brother the 4k xbr-x900 model: they might have 4k but we have slightly better black levels. also pretty amazing to note the very similar peak brightness levels between those 2 models, and the very similar black levels to the previous year's 2k FALD model






source:
http://televisions.reviewed.com/cont...ew/the-science
http://televisions.reviewed.com/cont...w/science-page
Attached Thumbnails:
- Sony-Bravia-W900A vs hx950-Contrast.jpg
- sony-bravia-x900a-contrast.jpg
steve1971 likes this.

Last edited by Jorgens; 04-26-2016 at 09:33 AM.
Jorgens is offline  
post #5606 of 5687 Old 04-26-2016, 07:38 AM
AVS Special Member
 
steve1971's Avatar
 
Join Date: Jul 2008
Location: Saint Paul, Minnesota.
Posts: 2,674
Mentioned: 2 Post(s)
Tagged: 0 Thread(s)
Quoted: 444 Post(s)
Liked: 408
Quote:
Originally Posted by Jorgens View Post
looking at the specs for black levels and peak brightness for our 900a, i came across this interesting comparison with the then (2013 - 2014) best samsung, LG (led/lcd) and panasonic (led/lcd) models, and the very good panasonic zt60 plasma

also interesting to note the comparison with our big brother the 4k xbr-x900 model, they might have 4k but we have slightly better black levels. pretty amazing to note the very similar peak brightness levels, which btw are very similar to the 2014-2015 sony top 2k lcd model which was much acclaimed for its improved brightness (their then claim "3x better than most other lcd tv's" iirc ?)


Again both posts you made were spot on, and this one just shows me how good our W9's really are, and IMO still are, even among today's top models. I'll give HDR etc a few years to work out the kinks before I even think of replacing my W9.
steve1971 is offline  
post #5607 of 5687 Old 04-27-2016, 05:40 AM
Member
 
Jorgens's Avatar
 
Join Date: Jun 2002
Posts: 152
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 36 Post(s)
Liked: 55
@ Didee et al,

can you plz check my logic here for any flaws?

standard 1080 HD bluray video on the disc = YUV 4:2:0 @ 8 bit

the HD 1080p video technical standard allows for a YUV 4:4:4 format with 10 bits per color

our w900a is confirmed to be able to accept and display unaltered a 1080p RGB (or YUV) 4:4:4 signal @ 8 bit (could it accept a 10 bit signal too?)

1080p YUV 4:4:4 contains 4x the chroma samples per channel of YUV 4:2:0 (roughly 2x the total data)

UHD blurays are encoded @ 2160p (4k) and use YUV 4:2:0 10 bit (with or without added HDR)

the extra luma information of HDR requires a 10 bit (or higher) file format (which is why 4k HDR bluray uses one)

video @ 2160p (4k) YUV 4:2:0 10 bit can be converted to HD 1080p RGB 4:4:4 (or YUV 4:4:4) 10 bit (as evidenced here http://www.eoshd.com/2014/02/discove...80p-10bit-444/ ). this type of conversion maintains much more chroma information in the 1080p video than the more limited bluray disc YUV 4:2:0 format.
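the sample counting behind that conversion can be sketched in a few lines of python (subsampling factors only; a real conversion obviously also involves scaling filters and bit-depth handling):

```python
def sample_counts(width, height, subsampling):
    """Return (luma, chroma-per-channel) sample counts for one frame."""
    h_factor, v_factor = {"4:4:4": (1, 1), "4:2:2": (2, 1), "4:2:0": (2, 2)}[subsampling]
    return width * height, (width // h_factor) * (height // v_factor)

# each chroma plane of 2160p 4:2:0 is 1920x1080 -- exactly the size of a
# chroma plane in 1080p 4:4:4, so the downconversion can keep all of it
_, chroma_uhd_420 = sample_counts(3840, 2160, "4:2:0")
_, chroma_hd_444 = sample_counts(1920, 1080, "4:4:4")
print(chroma_uhd_420 == chroma_hd_444)  # True
```

which is exactly why the 4k-to-1080p downscale loses no chroma detail relative to the 4k source, while a native 1080p 4:2:0 disc starts with only a quarter of those chroma samples per channel.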

info from another avscience poster recently: the new Panasonic UHD player can be set to a specific luma setting to match the peak output of the display being used (since HDR ability on current displays is so varied). he reported it has a sliding scale setting of 1-10 (presumably representing 100-1000 nits). so if for ex your display has a max luma of 360 cd/m2 (as ours roughly does), you could use a setting of 3.5 or 4, and you would get more accurate luminance in the displayed video while maximizing your display's technical specs (rather than using the default SDR luma of 100 cd/m2). indications are you can do the same thing with the new Samsung UHD player by switching to User mode and adjusting the brightness output.
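if the 1-10 scale really does map linearly to 100-1000 nits (the poster's presumption, not a confirmed spec), the matching is simple arithmetic:

```python
# hypothetical mapping -- assumes the player's 1-10 luma scale is linear
# over 100-1000 nits, which is a guess, not a documented spec
def player_setting(display_peak_nits):
    setting = display_peak_nits / 100.0
    return min(max(setting, 1.0), 10.0)  # clamp to the player's 1-10 range

print(player_setting(360))   # 3.6, between the 3.5 and 4 suggested above
print(player_setting(1500))  # 10.0, clamped for very bright displays
```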

the downscaler in some UHD bluray players (like the samsung) is reported by some reviewers as outputting 1080p YUV 4:4:4 10 bit (maybe some HTPC software might be able to output RGB 4:4:4 @ 10 bit?). with that setup, when playing a HDR UHD bluray, they report a major improvement in contrast (luma) and color on a good 1080p display (compared to the same movie released on a standard HD bluray using YUV 4:2:0 @ 8 bit). see this interesting observation by a french reviewer of the new samsung UHD player with HDR movies http://www.hdfever.fr/2016/04/24/la-...ge-1/#comments

he is basically confirming that when converting 4k HDR to 1080p SDR it is possible to deliver a significantly improved video signal to the 1080p display, producing a much better end result (both in color and contrast)

if this is indeed correct, then we still have many miles left in our w900a's, and feeding them a good 4k video source can still give major improvements over "standard 2k" video. some of these observations are already being confirmed by other users with the new 4k video players, and it might in future even be possible to get similar "magic" from a 4k streaming service like netflix (but you might need a htpc with the correct software, or a clever little converter box that takes the 4k signal and outputs it as RGB 4:4:4 @ 10 bit).

if only we could figure out a way, or find a clever programmer to create for us, a HTPC software "feature" that can convert the 4k rec.2020 (or DCI-P3) color format to x.v.Color (eg xvYCC, or Extended-gamut YCC) that our w900a's can cope with, and we'd even get the extended color gamut over the default rec.709 from HD.
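the heart of that conversion is just a 3x3 matrix in linear light; a rough numpy sketch of the gamut-mapping step only (a real tool would also need the PQ/HLG transfer function, the YCbCr encode/decode on both ends, and the actual xvYCC signalling over HDMI):

```python
import numpy as np

# standard linear-light primaries conversion, BT.2020 RGB -> BT.709 RGB.
# xvYCC-style extended gamut works by NOT clipping to [0,1]: values
# outside that range are what carry the wide-gamut colors.
M_2020_TO_709 = np.array([
    [ 1.6605, -0.5876, -0.0728],
    [-0.1246,  1.1329, -0.0083],
    [-0.0182, -0.1006,  1.1187],
])

def rec2020_to_extended_709(rgb_linear):
    # deliberately no clip: out-of-gamut results stay out of range
    return M_2020_TO_709 @ np.asarray(rgb_linear, dtype=float)

# a fully saturated BT.2020 green falls outside the 709 cube,
# showing up as negative R and B components
print(rec2020_to_extended_709([0.0, 1.0, 0.0]))
```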

either way, even without the 4k resolution, there are several further improvements we can squeeze out of our great 1080p tv while the 4k standards mature. we just need to fiddle with some more twiddly bits to get there. but by all indications, several of the new 4k players already potentially provide significant improvements "out of the box" when playing a 4k HDR bluray.

does anybody remember if our tv can cope with a 10 bit video signal over HDMI?




helvetica bold likes this.

Last edited by Jorgens; 04-27-2016 at 05:54 AM.
Jorgens is offline  
post #5608 of 5687 Old 04-27-2016, 07:57 AM
AVS Special Member
 
steve1971's Avatar
 
Join Date: Jul 2008
Location: Saint Paul, Minnesota.
Posts: 2,674
Mentioned: 2 Post(s)
Tagged: 0 Thread(s)
Quoted: 444 Post(s)
Liked: 408
Quote:
Originally Posted by Jorgens View Post
..... can you plz check my logic here for any flaws ..... anybody remember if our tv can cope with a 10 bit video signal over HDMI ? .....

Great post Jorgens. I will say this: if I have to go out and buy a 4K Blu-ray player to get the max potential out of my W9 then so be it, I will do it. The reason is that I don't believe we have seen our W9's full potential PQ-wise, and if a 4K Blu-ray player can help then I'm all for it. Now here's something I saw yet again while watching a Blu-ray movie the other night. I hit the tv's Display button and it said "10 Bit" while the movie was playing. Which makes me wonder: is our W9's panel 8 bit or 10 bit? Or is it really 8 bit but can accept 10 bit? I'm confused, so maybe Didee can help answer this question along with the questions you posed. But again, very good post buddy!!
steve1971 is offline  
post #5609 of 5687 Old 04-27-2016, 09:00 AM
Member
 
Jorgens's Avatar
 
Join Date: Jun 2002
Posts: 152
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 36 Post(s)
Liked: 55
when i use my w900a as a htpc display, it often flashes "12 bit" on the screen when i switch to that hdmi port. i think it simply reports whatever the connection on that port tells it (my amd driver is set to 12 bit), so i dont think it is a reliable confirmation.

from everything i have read, and tested with video and image samples in the last yr or so, i believe we do have a true 10 bit panel. for ex "monitor asset manager" (a windows hardware utility designed to analyse displays) reports it as 10 bit, and i can display test video and images specifically designed to differentiate 8 and 10 bit signals (if for ex we had an 8 bit panel some of those test images/video would show "banding" because the hardware of the display cant accurately display the full smooth grey color gradient.)
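those banding test patterns are easy to generate yourself; a minimal numpy sketch of the idea (actually writing the ramp out to a file a 10-bit-capable player can read is left aside here):

```python
import numpy as np

# a horizontal grey ramp across a 1920-wide frame: its 1024 steps need a
# 10 bit chain end-to-end to stay smooth; an 8 bit panel can only resolve
# 256 of them, which shows up on screen as visible banding
width = 1920
ramp_10bit = np.linspace(0, 1023, width).round().astype(np.uint16)
ramp_8bit = (ramp_10bit >> 2).astype(np.uint8)  # what an 8 bit panel resolves

print(len(np.unique(ramp_10bit)), len(np.unique(ramp_8bit)))  # 1024 256
```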

there is however always the possibility of it being 8 bit + dithering = "10 bit", but i dont believe so. there is also the possibility of the htpc software (like madvr or the media player) adding dithering itself (where it tries to mimic/approximate what a higher bit depth panel would show).

the reason it is important for us (as w900a owners) is that the extra luma in the HDR video file needs that 10 bit format.

another uncertainty: i have read no confirmation that the "matching of max brightness from the UHD player with the brightness ability of the display" still works with video down-sampled from 4k to 2k (this was mentioned by another avscience poster as a way to still get some of the HDR info displayed on a UHD SDR display). the good news is that the observation from the french reviewer's website seems to indicate this might be the case for HD sets as well, because his main observation was the much increased contrast and better detail in black/dark scenes on a 1080p display.

all indications are that we have several good prospects for significantly further improving the video we see on our 900a's. the current real limitation has more to do with poor quality source files (for ex SD from FTA, or purposefully "quality limited" 2k bluray discs/players) and not yet having the luxury of testing down-sampled 4k video
steve1971 likes this.

Last edited by Jorgens; 04-27-2016 at 09:04 AM.
Jorgens is offline  
post #5610 of 5687 Old 04-27-2016, 09:39 AM
AVS Special Member
 
steve1971's Avatar
 
Join Date: Jul 2008
Location: Saint Paul, Minnesota.
Posts: 2,674
Mentioned: 2 Post(s)
Tagged: 0 Thread(s)
Quoted: 444 Post(s)
Liked: 408
Quote:
Originally Posted by Jorgens View Post
..... from everything i have read, and tested with video and image samples in the last yr or so, i believe we do have a true 10 bit panel .....

Jorgens, I just sent an email to Sony Support asking them if our W9's have a 10 bit panel or an 8 bit panel. If I don't hear from them I will give their tech support a call, and once I hear something I will post it.
steve1971 is offline  