
Official Sony KDL-55W900A Owners Thread - Page 32

post #931 of 4377
Quote:
Originally Posted by brentsg View Post

Those kinds of posts are so disheartening. I quit reading the forum years ago due to all the back and forth FUD regarding TV technology. Unfortunately I now need another display, so I'm reviewing some owner's threads.

Years later and it's still people going back and forth with the same old tired arguments. I just can't wait to get another display purchased so I can look the other way again.

Well if it helps, I finally decided the w900a is staying. The picture quality is not as good as a plasma, but I bought this TV for gaming and it does that extremely well. And please don't think the w900a picture quality is horrible, because it really is spectacular. I'm just so used to plasma black levels that I get frustrated when I pay this amount of money and get dark gray. It is nice having those pure bright whites, though. If you are looking for pure picture quality, plasma is still king. Basically, look at it this way: if you want an extremely bright and punchy image with vibrant colors and digital sharpness, then the w900a is for you. If you want a natural picture and are not viewing in a bright room, then plasma is for you. Obviously there is more to it, but that's the gist of it.
post #932 of 4377
Quote:
Originally Posted by fishairflow View Post

Quote:
Originally Posted by tgm1024 View Post

No, not that it's delayed---if true, that would be easily verifiable based on any early date they may have given.

I wanted to know where you got the sense that it was because Sony was "afraid that people will not notice much different in the 55W900a and the overpriced 4K TVs which will impact sale of the 4k sets". To me that part doesn't follow and I was wondering if it was speculation on your part or if this was industry news or similar. That's all.

EDIT: helvetica bold corrected me on this in a later post. It was not he who said this, it was malcolmp6. Sorry helvetica bold!

Not that I am an expert on video, and I am certainly not a videophile by any stretch of the imagination - still - I compared both the 4K and the W900 side by side with the same movie (Avengers) and I could not tell the difference. Now I understand that 4K content looks amazing, but Sony has a long list of technological advances which failed to come to fruition - and therefore I am going to ignore 4K for now.

 

You compared the w900A (2K) to the X900A (4K), both running a 2K Blu-ray movie?  You're likely looking at a simple up-convert, which for all we know is limited to nearest neighbor (1 pixel -> 2x2 pixels).

post #933 of 4377
Quote:
Originally Posted by tgm1024 View Post

You compared the w900A (2K) to the X900A (4K), both running a 2K Blu-ray movie?  You're likely looking at a simple up-convert, which for all we know is limited to nearest neighbor (1 pixel -> 2x2 pixels).

I saw a very informative video on the disaster 4K has in store for the majority of content. If I find it again I'll post it, but basically a 1080p movie will look horrible on a 4K display because of the inability to scale it even remotely correctly. Not even a super high-end scaler would solve the problem; it's just the way it maps out.
post #934 of 4377
Quote:
Originally Posted by *UFO* View Post

Well if it helps, I finally decided the w900a is staying. The picture quality is not as good as a plasma, but I bought this TV for gaming and it does that extremely well. And please don't think the w900a picture quality is horrible, because it really is spectacular. I'm just so used to plasma black levels that I get frustrated when I pay this amount of money and get dark gray. It is nice having those pure bright whites, though. If you are looking for pure picture quality, plasma is still king. Basically, look at it this way: if you want an extremely bright and punchy image with vibrant colors and digital sharpness, then the w900a is for you. If you want a natural picture and are not viewing in a bright room, then plasma is for you. Obviously there is more to it, but that's the gist of it.

Dude...you are all over the place.
post #935 of 4377
Quote:
Originally Posted by forzanerazzurri View Post

Dude...you are all over the place.

Huh? I was going to return it, then decided to keep it. How's that "all over the place"? I still stand by the fact that plasma is better, and I hate knowing I paid $2400 for PQ inferior to a larger plasma, but it was purchased as a gaming set and it does that excellently.
post #936 of 4377
Quote:
Originally Posted by forzanerazzurri View Post

Quote:
Originally Posted by *UFO* View Post

Well if it helps, I finally decided the w900a is staying. The picture quality is not as good as a plasma, but I bought this TV for gaming and it does that extremely well. And please don't think the w900a picture quality is horrible, because it really is spectacular. I'm just so used to plasma black levels that I get frustrated when I pay this amount of money and get dark gray. It is nice having those pure bright whites, though. If you are looking for pure picture quality, plasma is still king. Basically, look at it this way: if you want an extremely bright and punchy image with vibrant colors and digital sharpness, then the w900a is for you. If you want a natural picture and are not viewing in a bright room, then plasma is for you. Obviously there is more to it, but that's the gist of it.

Dude...you are all over the place.


How is he all over the place when all he did was consider returning his W900a and then decide to keep it?
Quote:
Originally Posted by *UFO* View Post

Quote:
Originally Posted by forzanerazzurri View Post

Dude...you are all over the place.

Huh? I was going to return it, then decided to keep it. How's that "all over the place"? I still stand by the fact that plasma is better, and I hate knowing I paid $2400 for PQ inferior to a larger plasma, but it was purchased as a gaming set and it does that excellently.


UFO, I like the fact that you're standing by your belief that plasma is better. In fact, after viewing the new Panasonic ZT plasmas I would have to agree with you... at least a little bit. But I will still stick with my HX850 no matter what.
post #937 of 4377
I played some Dirt 2 on the W9 this weekend and again the PQ blew me away. When gaming it never fails to impress.
I do think the W9 can reproduce color as accurately as Plasma.

Oh, and it ain't too shabby watching movies on it either. ;)
post #938 of 4377
Quote:
Originally Posted by *UFO* View Post

Cutwolf, this is not trolling. It's well known that plasma has better picture quality. Yes, LED-LCD has its place, but even for gaming plasma is better. I have had a TC-P50U50 and a TC-P60U50 since they came out and have no burn-in, and I have gamed on them, which is another thing. The input lag on the U50 is lower than the w900a's. When using the HDMI splitter it was noticeable that the U50 was faster in handling the input. Everyone has their opinion; I am just stating mine. I feel ripped off having paid over double for a 55" that can't stand up, in picture quality, to a 60" that was half the cost. My plasmas have been on 9-12 hours a day with not a single problem, but they were properly broken in.

You should check out this link. I believe you're mistaken about the UT50 having less lag than the W9;
it's the opposite: roughly double that of the W9.

http://www.hdtvtest.co.uk/news/input-lag
post #939 of 4377
Quote:
Originally Posted by helvetica bold View Post

You should check out this link. I believe you're mistaken about the UT50 having less lag than the W9;
it's the opposite: roughly double that of the W9.

http://www.hdtvtest.co.uk/news/input-lag

They are testing the UT50, which is the 3D version of the U50 that I have. The U50 is even faster than the UT50, which is already pretty fast. The w900a is around 18ms. Also, tested side-by-side with an HDMI splitter, you can tell the U50 is faster. Even a friend who is not into tech at all noticed it very slightly. Even in FPS shooters the difference will not matter. The fact that the w900a can almost reach Panasonic plasma response is amazing, because every gamer knows that the Panasonic plasmas are the holy grail of input lag. Well, used to be, that is. :D
post #940 of 4377
Quote:
Originally Posted by steve1971 View Post

UFO, I like the fact that you're standing by your belief that plasma is better. In fact, after viewing the new Panasonic ZT plasmas I would have to agree with you... at least a little bit. But I will still stick with my HX850 no matter what.

 

I'm an LCD guy, but only because I'm spooked by the brightness, IR, BI, and buzzing of plasmas that still bite a relative few people.

 

Most of the heavy-hitting calibrators and industry guys in the "flat panel general and new fp tech" forum are adamant about plasma PQ, with good reason.  Further, much of what is said about them has to do with motion processing, and of the tons of LCDs I've seen, so far the only one that came close to beating the plasmas was the Sony XBR 950.  It was so close I couldn't quite tell.

 

In any case, this LCD vs. plasma war will rage on for quite some time, and God I hope we don't see any more childish retorts here about people somehow NOT being able to say they chose plasma instead.

post #941 of 4377
Quote:
Originally Posted by *UFO* View Post

Quote:
Originally Posted by tgm1024 View Post

You compared the w900A (2K) to the X900A (4K), both running a 2K Blu-ray movie?  You're likely looking at a simple up-convert, which for all we know is limited to nearest neighbor (1 pixel -> 2x2 pixels).

I saw a very informative video on the disaster 4K has in store for the majority of content. If I find it again I'll post it, but basically a 1080p movie will look horrible on a 4K display because of the inability to scale it even remotely correctly. Not even a super high-end scaler would solve the problem; it's just the way it maps out.

 

Sorry, but that's completely incorrect.  I have no idea where you got that, but it wasn't any place reputable.

 

First, the point was that you were comparing two TVs, each showing 2K content, so you're not going to see a slam-dunk 4K effect from either of them. Your side-by-side test was not showing you what you thought.

 

Secondly (and this is important): upscaling 2K to a 3840x2160 (UHD) display using nearest-neighbor sampling is as clean an up-convert as you can get.  Do you know what that is?  Every incoming 2K pixel is duplicated into a 2x2 box.  How is this an advantage over 2K?  Technically it isn't, except that the grid between the pixels effectively becomes thinner.  That does result in a slightly better-looking image, but not by much.
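
Just to make the 1-pixel-to-2x2 duplication concrete, here's a minimal sketch in Python/NumPy (the function name and the fake test frame are purely illustrative, not anything a real TV scaler runs):

Code:
import numpy as np

def nearest_neighbor_2x(frame):
    """Upscale a frame by duplicating each source pixel into a 2x2 block."""
    # Repeat rows, then columns; works for (H, W) or (H, W, channels) arrays.
    return frame.repeat(2, axis=0).repeat(2, axis=1)

# A fake 1080p frame of 8-bit values becomes exactly a 2160p frame.
src = np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)
dst = nearest_neighbor_2x(src)
print(dst.shape)  # (2160, 3840, 3) -- no resampling, no new artifacts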

 

You don't go backwards in quality with UHD 4K.

 

And that's just NN.  If they employ edge-detection algorithms, an up-convert can improve things dramatically.

 

However, if the display were something less amenable to scaling (say DCI "4K", which was/is 4096x2160, or odder formats like 4096x3112 and others), then you can get immediate artifacts.  But those are not UHD, the consumer television format.  Nearest neighbor (1 --> 2x2) is as clean as you can get.  Double-check your sources.
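
A quick back-of-the-envelope check on why the integer ratio matters (plain arithmetic, not tied to any particular scaler):

Code:
# UHD is an exact 2x multiple of 1080p in each dimension, so every
# source pixel maps onto a whole 2x2 block with no fractional remainder.
print(3840 / 1920, 2160 / 1080)   # 2.0 2.0

# DCI 4K is not an integer multiple of 1920 horizontally, so a naive
# mapping leaves a fractional factor that forces interpolation or cropping.
print(4096 / 1920, 2160 / 1080)   # 2.1333... 2.0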


Edited by tgm1024 - 7/29/13 at 12:00pm
post #942 of 4377
Quote:
Originally Posted by tgm1024 View Post

Sorry, but that's completely incorrect.  I have no idea where you got that, but it wasn't any place reputable.

First, the point was that you were comparing two TVs, each showing 2K content, so you're not going to see a slam-dunk 4K effect from either of them. Your side-by-side test was not showing you what you thought.

Secondly (and this is important): upscaling 2K to a 3840x2160 (UHD) display using nearest-neighbor sampling is as clean an up-convert as you can get.  Do you know what that is?  Every incoming 2K pixel is duplicated into a 2x2 box.  How is this an advantage over 2K?  Technically it isn't, except that the grid between the pixels effectively becomes thinner.  That does result in a slightly better-looking image, but not by much.

You don't go backwards in quality with UHD 4K.

And that's just NN.  If they employ edge-detection algorithms, an up-convert can improve things dramatically.

However, if the display were something less amenable to scaling (say DCI "4K", which was/is 4096x2160, or odder formats like 4096x3112 and others), then you can get immediate artifacts.  But those are not UHD, the consumer television format.  Nearest neighbor (1 --> 2x2) is as clean as you can get.  Double-check your sources.

Sorry, I meant true 4K (4096 × 2160) down to consumer 4K. When they master it they either have to crop a small part off the sides of the frame or scale it down.
post #943 of 4377
My Leo Bodnar input lag tester came today and I ran some tests on my w900a. What I found was that Game-Standard and Game-Original had exactly the same amount of input lag, which was consistently 20.1 ms. When I turned on Impulse mode, it went up to 30.2 ms, which is still fantastic. Changing things like Live Colour, Smooth Gradation, Detail Enhancer, and all the other settings made no difference in input lag! This is exciting because these settings really add to the picture quality. When I set Scene Select to General and selected "Standard" with Motionflow on "Standard", it consistently read 99.3ms. :eek: If anyone would like me to test other settings and their effect on input lag, let me know and I will! :cool:
post #944 of 4377
Quote:
Originally Posted by tgm1024 View Post

I'm an LCD guy, but only because I'm spooked by the brightness, IR, BI, and buzzing of plasmas that still bite a relative few people.

Most of the heavy-hitting calibrators and industry guys in the "flat panel general and new fp tech" forum are adamant about plasma PQ, with good reason.  Further, much of what is said about them has to do with motion processing, and of the tons of LCDs I've seen, so far the only one that came close to beating the plasmas was the Sony XBR 950.  It was so close I couldn't quite tell.

In any case, this LCD vs. plasma war will rage on for quite some time, and God I hope we don't see any more childish retorts here about people somehow NOT being able to say they chose plasma instead.

What's holding me back on buying a plasma is actually their active 3D (I prefer passive) and concerns about games at lower frame rates. I haven't seen a plasma play games, of course, but I've heard their sub-field drives make games that run at a lower frame rate look full of ghostly double images. LCD/LEDs hide that due to their inherent motion blur.

Do games really look that bad in those situations?
post #945 of 4377
Quote:
Originally Posted by *UFO* View Post

My Leo Bodnar input lag tester came today and I ran some tests on my w900a. What I found was that Game-Standard and Game-Original had exactly the same amount of input lag, which was consistently 20.1 ms. When I turned on Impulse mode, it went up to 30.2 ms, which is still fantastic. Changing things like Live Colour, Smooth Gradation, Detail Enhancer, and all the other settings made no difference in input lag! This is exciting because these settings really add to the picture quality. When I set Scene Select to General and selected "Standard" with Motionflow on "Standard", it consistently read 99.3ms. :eek: If anyone would like me to test other settings and their effect on input lag, let me know and I will! :cool:
I would love to see what Cinema 1 numbers look like. You can use my settings in my signature as a template. Thanks.
post #946 of 4377
Quote:
Originally Posted by *UFO* View Post

My Leo Bodnar input lag tester came today and I ran some tests on my w900a. What I found was that Game-Standard and Game-Original had exactly the same amount of input lag, which was consistently 20.1 ms. When I turned on Impulse mode, it went up to 30.2 ms, which is still fantastic. Changing things like Live Colour, Smooth Gradation, Detail Enhancer, and all the other settings made no difference in input lag! This is exciting because these settings really add to the picture quality. When I set Scene Select to General and selected "Standard" with Motionflow on "Standard", it consistently read 99.3ms. :eek: If anyone would like me to test other settings and their effect on input lag, let me know and I will! :cool:

Could you try Game mode with LED Dynamic Control: Off, Low and Standard? Thanks!!!
post #947 of 4377
Quote:
Originally Posted by xitman View Post

I would love to see what Cinema 1 numbers look like. You can use my settings in my signature as a template. Thanks.

With your settings it reads 94.6ms
post #948 of 4377
Quote:
Originally Posted by EarlHarrison View Post

Could you try Game mode with LED Dynamic Control: Off, Low and Standard? Thanks!!!

In Game mode with LED Dynamic Control on Standard or Low it is 20.1ms, and with it off it's 19.1ms, so I'd say it's worth having it on!
post #949 of 4377
Quote:
Originally Posted by *UFO* View Post

In Game mode with LED Dynamic Control on Standard or Low it is 20.1ms, and with it off it's 19.1ms, so I'd say it's worth having it on!

Thank you!!! Perfect, I have it on Low and it makes the blacks very deep!!!
post #950 of 4377
Doing more testing, I found a mode that's even faster than Game mode. If you go to Scene Select and choose "Graphics", the default settings give you a blistering 17.4ms. :eek: One could easily use this as a monitor. Amazing!
post #951 of 4377
If you could test these settings it would be amazing; they are the Steve Withers recommended settings (http://www.avforums.com/forums/lcd-led-lcd-tvs/1786198-sony-kdl-40w905a-reviewers-recommended-best-settings.html) but for gaming, with some options off.

Scene Select: Game
Picture Mode: Game-Original
Backlight: 6 (Make sure you turn the Light Sensor off in System Settings/Eco)
Contrast: 80
Brightness: 49
Colour: 50
Hue: 0
Colour Temperature: Warm 2
Sharpness: 50
Noise Reduction: Off
MPEG Noise Reduction: Off
Dot Noise Reduction: Off
Reality Creation: Off
Smooth Gradation: Off
Motionflow: Off
Film Mode: Off
Advanced Settings
Black Corrector: Off
Adaptive Contrast Enhancer: Off
LED Dynamic Control: Low
Auto Light Limiter: Off
Clear White: Off
Live Colour: Off
White Balance: Red Gain -7, Green Gain -8, Blue Gain 0, Red Cutoff 0, Green Cutoff 0, Blue Cutoff 0
Detail Enhancer: Off
Edge Enhancer: Off

Thanks!!!!!!!
post #952 of 4377
Quote:
Originally Posted by EarlHarrison View Post

If you could test these settings it would be amazing; they are the Steve Withers recommended settings (http://www.avforums.com/forums/lcd-led-lcd-tvs/1786198-sony-kdl-40w905a-reviewers-recommended-best-settings.html) but for gaming, with some options off.

Scene Select: Game
Picture Mode: Game-Original
Backlight: 6 (Make sure you turn the Light Sensor off in System Settings/Eco)
Contrast: 80
Brightness: 49
Colour: 50
Hue: 0
Colour Temperature: Warm 2
Sharpness: 50
Noise Reduction: Off
MPEG Noise Reduction: Off
Dot Noise Reduction: Off
Reality Creation: Off
Smooth Gradation: Off
Motionflow: Off
Film Mode: Off
Advanced Settings
Black Corrector: Off
Adaptive Contrast Enhancer: Off
LED Dynamic Control: Low
Auto Light Limiter: Off
Clear White: Off
Live Colour: Off
White Balance: Red Gain -7, Green Gain -8, Blue Gain 0, Red Cutoff 0, Green Cutoff 0, Blue Cutoff 0
Detail Enhancer: Off
Edge Enhancer: Off

Thanks!!!!!!!

Still reads 20.1ms. The only settings that change input lag in Game mode are Motionflow and LED Dynamic Control.
post #953 of 4377
Great! Thanks
post #954 of 4377
Quote:
Originally Posted by *UFO* View Post

My Leo Bodnar input lag tester came today and I ran some tests on my w900a. What I found was that Game-Standard and Game-Original had exactly the same amount of input lag, which was consistently 20.1 ms. When I turned on Impulse mode, it went up to 30.2 ms, which is still fantastic.
This does not occur on all displays, but I have noticed input lag can go up/down by almost a millisecond when you change brightness with LightBoost. This is because the leading edge of the backlight pulse gets delayed. Can you try the lag tester at different brightness levels, with and without Motionflow? You may see no change without Motionflow, but you might see a minor change at different brightness levels with Motionflow Impulse.

Yes, Sony's Motionflow Impulse is indeed fantastic for 60fps video game use, especially for people using an HTPC. It approximately quadruples motion resolution (e.g. where there were 10 pixels of motion blur, you now see only 2.5 pixels of motion blur), which is a huge improvement in gaming motion resolution at 60fps@60Hz. That's a massive jump in computer/gaming motion clarity. The better-than-plasma motion clarity is especially revealing when you drag windows around on the Windows desktop of a gaming HTPC with Impulse enabled.

The low-lag interpolation-free Motionflow Impulse is the first time people can finally play video games on an LCD television *and* get better motion resolution than plasma *and* without the excess input lag of interpolation (~100ms).
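
For anyone curious, here is a rough sketch of the persistence arithmetic behind that roughly 4x figure; the pulse-width numbers are illustrative assumptions, not measurements of this particular set:

Code:
# Perceived motion blur while eye-tracking is roughly
# (tracking speed in px/s) x (time each frame stays lit in s).
def motion_blur_px(speed_px_per_s, persistence_s):
    return speed_px_per_s * persistence_s

speed = 600.0                                # example pan speed: 600 px/s
full_persistence = 1 / 60                    # ~16.7 ms sample-and-hold frame
impulse_persistence = full_persistence / 4   # assumed ~4.2 ms strobed pulse

print(motion_blur_px(speed, full_persistence))     # 10.0 px of blur
print(motion_blur_px(speed, impulse_persistence))  # 2.5 px of blur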
Edited by Mark Rejhon - 7/30/13 at 1:36pm
post #955 of 4377
Quote:
Originally Posted by Mark Rejhon View Post

This does not occur on all displays, but I have noticed input lag can go up/down by almost a millisecond when you change brightness with LightBoost. This is because the leading edge of the backlight pulse gets delayed. Can you try the lag tester at different brightness levels, with and without Motionflow? You may see no change without Motionflow, but you might see a minor change at different brightness levels with Motionflow Impulse.

Yes, Sony's Motionflow Impulse is indeed fantastic for 60fps video game use, especially for people using an HTPC. It approximately quadruples motion resolution (e.g. where there were 10 pixels of motion blur, you now see only 2.5 pixels of motion blur), which is a huge improvement in gaming motion resolution at 60fps@60Hz. That's a massive jump in computer/gaming motion clarity. The better-than-plasma motion clarity is especially revealing when you drag windows around on the Windows desktop of a gaming HTPC with Impulse enabled.

The low-lag interpolation-free Motionflow Impulse is the first time people can finally play video games on an LCD television *and* get better motion resolution than plasma *and* without the excess input lag of interpolation (~100ms).

Lag is consistently 20.1ms no matter what the brightness is without Motionflow, and consistently 30.2ms no matter what the brightness with it on Impulse. Sony obviously knew what they were doing with Game mode on this set, as they made it so that no matter what settings you change in Game mode you can't get past 30.2ms. I had dimming on in that test, so I'm going to try Impulse mode with dynamic LED dimming off. I'd expect it to be 29.2ms.
post #956 of 4377
Tested all scene select modes so here is a table of the input lags for each one:

Auto (default settings): 94.6ms
Auto 24p Sync (default settings): 94.6ms
General (default settings): 94.6ms
Cinema (default settings): 94.6ms
Sports (default settings): 94.6ms
Music (default settings): 94.6ms
Animation (default settings): 94.6ms
Photo (default settings): 77.4ms
Game (default settings): 19.1ms
Graphics (default settings): 17.4ms

These tests were done simply by going to scene select and testing with the settings that you would get out of the box.
post #957 of 4377
How low do you guys think the price of this set will go by the end of the year? Could it possibly be under $1500?
post #958 of 4377
Quote:
Originally Posted by SalvatoreOH View Post

How low do you guys think the price of this set will go by the end of the year? Could it possibly be under $1500?

Highly doubt it. It will most likely remain at $2299.99 until they are gone. Sony makes a set number of units and sells them until they are gone. From what local retailers have told me, these are selling like hot cakes. They were recently completely out of stock, and each store received just 2 more.

Edit: I just checked the Best Buy stores in my area, and within a 50-mile radius they are sold out again. Also, the return rate has been zero. Compare that to Samsung, who last time I checked had 4 open-box sets on the floor.
Edited by *UFO* - 7/30/13 at 8:27pm
post #959 of 4377
Does anyone know which Dynamic LED setting is more aggressive? I assume it's Standard?
post #960 of 4377
Quote:
Originally Posted by *UFO* View Post

Tested all scene select modes so here is a table of the input lags for each one:
Just to be clear -- which square are you measuring with the Leo Bodnar? Top, center, or bottom?