Status
Not open for further replies.
1 - 20 of 31 Posts

·
Registered
Joined
·
4,155 Posts
Discussion Starter · #1 ·
I've been sweating it lately trying to find the right upconverting/HDMI player for my SONY LCD RPTV (42WE655). My reasoning was that it makes good engineering sense to keep the entire video signal in the digital domain from disk to display. But I just remembered that the Sony converts the HDMI signal to analog before digitizing it for the native LCD pixel array (now does that make sense???).


So the question is - does it even make sense to bother with an HDMI capable player?


I'd just get one of the cheap HDMI ones and simply give it a try if I could find one that had the combination of features that I want - but I've found fault with every last one so far - so now I'm thinking that if the HDMI isn't a benefit for my display, I might as well buy a non-HDMI that has the features that I want.


Ed
 

·
Registered
Joined
·
540 Posts
Quote:
Originally Posted by ekb
But I just remembered that the Sony converts the HDMI signal to analog before digitizing it for the native LCD pixel array (now does that make sense???).
Proves once more that there are idiots everywhere.
 

·
Registered
Joined
·
6,402 Posts
Quote:
But I just remembered that the Sony converts the HDMI signal to analog before digitizing it for the native LCD pixel array (now does that make sense???)
That does not make sense. I would be very surprised if Sony actually did that (regardless of what their manual or advertising copy says). The incoming 720p or 1080i HDMI signal may be scaled, but scaling is a digital process, not an analog one.


If the 42WE655 is receiving an analog signal (composite, S-Video, component or VGA) then it converts that to digital, before it does any scaling or de-interlacing. The HDMI signal should bypass the A to D conversion and be scaled (720p) and/or de-interlaced (1080i) to its native resolution.


htpcfan -- I think the idiots are in the press room. No self respecting engineer :rolleyes: would design a digital display device that way.
 

·
Registered
Joined
·
540 Posts
Quote:
Originally Posted by CT_Wiebe
htpcfan -- I think the idiots are in the press room. No self respecting engineer :rolleyes: would design a digital display device that way.
Well the engineers are often not the ones who make the decisions, instead the MBAs in their infinite wisdom do. :)
 

·
Registered
Joined
·
4,155 Posts
Discussion Starter · #5 ·
Quote:
Originally Posted by CT_Wiebe
That does not make sense. I would be very surprised if Sony actually did that (regardless of what their manual or advertising copy says). The HDMI signal may be scaled from the HDMI 720p or 1080i signal that it is getting, but that is a digital process and not analog.


If the 42WE655 is receiving an analog signal (composite, S-Video, component or VGA) then it converts that to digital, before it does any scaling or de-interlacing. The HDMI signal should bypass the A to D conversion and be scaled (720p) and/or de-interlaced (1080i) to its native resolution.


htpcfan -- I think the idiots are in the press room. No self respecting engineer :rolleyes: would design a digital display device that way.
I hear what you're saying - that it doesn't make sense and that you seriously doubt it - but on the contrary, I believe it's a well-established fact that the signal goes through a D -> A -> D conversion. Maybe I'll poke around the TV forum or start a thread there to see what they say.


Ed
 

·
Premium Member
Joined
·
4,054 Posts
The signal sent to the actual screen has got to be digital all the way. It would make no sense for the set to convert a digital signal twice in a digital set. You have a better chance of convincing others with a schematic, or with another technology like CRT, than with an all-digital display.
 

·
Moderator
Joined
·
23,031 Posts
Quote:
Originally Posted by htpcfan
Well the engineers are often not the ones who make the decisions, instead the MBAs in their infinite wisdom do. :)
I doubt very much that the original LCD Wega was designed with a DVI input from day one. I'd bet that the DVI input was put in after the rest of the circuits and layout were designed. Most likely the simplest way to fit it into the circuitry was to convert the input to analog and treat it just like the other inputs. My guess is that the "input board" only had analog outputs, since that's all they previously had. DVI was probably a last-minute add-on for HDCP compatibility. Notice that later models remedied this. UMR has schematics for the TVs and may have more insight. And I could be completely wrong in my hypothesis, but engineering isn't always about doing things optimally with no regard to cost and/or schedules.


larry
 

·
Registered
Joined
·
78 Posts
I have no idea about this particular set, but in theory you should keep the signal digital for as much of the pipeline as possible. So even if the set does D-A-D, you're still better off using HDMI. In practice, though, this may not be the case. Scaling is all done digitally, but every time you scale you can lose information, and not all scalers are created equal.


Read this very informative sticky:
http://www.avsforum.com/avs-vb/showthread.php?t=477740


Bottom line is that your mileage may and will vary. You can get very good results from component video if you have good (and not necessarily expensive) cables and a good scaler in your TV. As a general rule, you will usually see better results through HDMI/DVI, but it will always depend on your player, your TV, and most importantly, your eyes. Test it out at the store on your TV model if they have it, or buy from somewhere that has a liberal exchange policy until you find the right one. Happy hunting!
 

·
Registered
Joined
·
7,958 Posts
Like Mr Scooper above - I suspect that if the DVI/HDMI input was an "afterthought", or an upgrade to an existing design that only had analogue inputs, then the HDMI input stage may well decode to analogue, to feed into the rest of the TV.


If the main bulk of the TV was not designed for digital connectivity - and the front end for the video processing is analogue only (it may well be that the A/D converter and scaler are integrated into a single piece of silicon with no digital inputs available, or simply that re-designing the input circuitry wasn't cost-effective) - then it may well have made sense for Sony to convert HDMI to analogue.


There IS a major reason to have HDMI or HDCP+DVI though - as it will allow HD connectivity to HD-DVD and BluRay, where analogue component may not.


(In some ways this is similar to a UK DVD recorder on sale with an integrated standard-def digital TV receiver. Even though the digital TV broadcast is in MPEG2, in a form that could be burned fairly directly to DVD, the MPEG2 is actually decoded to base-band digital video and then re-encoded to MPEG2. This reduces the picture quality - though it does let you choose a new recording quality, and know how much space a recording will take, I guess. Not quite the same - but sometimes the best "engineering quality" solution isn't the one chosen.)
 

·
Registered
Joined
·
4,155 Posts
Discussion Starter · #10 ·
Quote:
Originally Posted by PooperScooper
I doubt very much that the original LCD Wega was designed with a DVI input from day one. I'd bet that the DVI input was put in after the rest of the circuits and layout were designed. Most likely the simplest way to fit it in the circuitry was to convert the input to analog and treat it just like the other inputs. My guess is that the "input board" only had analog outputs since that's all they previously had. DVI was probably a last minute addon for HDCP compatibility. Notice later models remedied this. UMR has schematics for the TVs and may have more insight. And, I could be completely wrong in my hypothesis, but engineering isn't always about doing things optimally with no regard to cost and/or schedules.


larry
I believe that you are correct. The previous year model (42WE610) is almost identical and I believe that it did not have the HDMI input.


Also, even though the native display is (?)x768 pixels, the HDMI input only accepts 720p, 1080i and 480i/p. Although not proof, one would think that if it were all digital, it would accept a 768-line input, which would require no scaling.
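To put numbers on that point: a quick sketch (assuming a 1366x768 native panel, which is a guess here since the exact width isn't stated above) shows that none of the resolutions the HDMI input accepts maps 1:1 to the panel, so the TV's scaler is always in the path.

```python
# Scale factors the TV's scaler must apply for each supported HDMI input.
# The 1366x768 native resolution is an assumption for illustration.
NATIVE_W, NATIVE_H = 1366, 768

INPUTS = {
    "480i/p": (720, 480),
    "720p": (1280, 720),
    "1080i": (1920, 1080),
}

def scale_factors(src_w, src_h):
    """Horizontal and vertical scaling ratios needed to fill the native panel."""
    return NATIVE_W / src_w, NATIVE_H / src_h

for name, (w, h) in INPUTS.items():
    sx, sy = scale_factors(w, h)
    print(f"{name}: {sx:.3f}x horizontal, {sy:.3f}x vertical")
```

Since no accepted input is 1366x768, every HDMI signal gets rescaled regardless.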


Ed
 

·
Registered
Joined
·
4,155 Posts
Discussion Starter · #11 ·
Quote:
Originally Posted by ogbuehi
The signal sent to the actual screen has got to be digital all the way.
No it doesn't. It can be converted back and forth over and over. It just needs to be converted one final time to digital for display.

Quote:
It would make no sense for the set to convert twice a digital signal in a digital set.
Agreed - but there are reasons as discussed here.


Ed
 

·
Registered
Joined
·
15,826 Posts
One of the members checked the schematics and wrote a few months ago that the Sony LCDs definitely convert the digital HDMI signal to analog.
 

·
Registered
Joined
·
4,155 Posts
Discussion Starter · #13 ·
Quote:
Originally Posted by BillP
One of the members checked the schematics and wrote a few months ago that the Sony LCDs definitely convert the digital HDMI signal to analog.
I also started a thread over in the rear proj TV forum asking about this. The answers are coming back that indeed the Sonys do convert to analog. The supporting evidence is claimed to be examination of schematics.


So now, even though there still is a weak argument that an HDMI connection between a DVD player and my TV is better than an analog connection, I have largely lost interest in an upconverting/HDMI player. It seems like the big advantage of keeping the entire signal path in the digital domain is gone.


Also on a related note - I wonder whether the TVs that show MB problems with the Faroudja chip are those TVs that are digital all the way, and the ones that don't exhibit the problem are ones that have an analog stage.


Ed
 

·
Registered
Joined
·
2,778 Posts
Quote:
Originally Posted by ekb
Also on a related note - I wonder whether the TVs that show MB problems with the Faroudja chip are those TVs that are digital all the way, and the ones that don't exhibit the problem are ones that have an analog stage.
No, I have a Panasonic CRT that exhibits severe (to my eyes) MB with the Faroudja players.
 

·
Registered
Joined
·
540 Posts
Quote:
Originally Posted by ekb
I also started a thread over in the rear proj TV forum asking about this. The answers are coming back that indeed the Sonys do convert to analog. The supporting evidence is claimed to be examination of schematics.


So now, even though there still is a weak argument that an HDMI connection between a DVD player and my TV is better than an analog connection, I have largely lost interest in an upconverting/HDMI player. It seems like the big advantage of keeping the entire signal path in the digital domain is gone.


Also on a related note - I wonder whether the TVs that show MB problems with the Faroudja chip are those TVs that are digital all the way, and the ones that don't exhibit the problem are ones that have an analog stage.


Ed
Yet another bombshell is dropping.

I wonder how many so called experts on this forum knew all along but are silent when people discuss the merits of HDMI?
 

·
Registered
Joined
·
6,402 Posts
The only argument that makes sense is that Sony added the HDMI (or DVI) input to a prior, all-analog input circuit design. Sony obviously felt that the cost of a redesign was greater than just putting in a good D-to-A converter chip for the HDMI input. With good D/A and A/D converters, the amount of information lost can be minimized, especially if they did a good job of designing for their 1366 x 768 panels (which they seem to have done).
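A toy model (purely illustrative, not Sony's actual circuit) shows why a well-designed D -> A -> D round trip loses so little: with matched 8-bit converters and negligible analog noise, every code survives the round trip unchanged, and information is only lost when converter noise exceeds half a quantization step.

```python
import random

# Toy model of the D -> A -> D round trip: an 8-bit pixel code goes through
# a DAC, may pick up analog noise, then is re-digitized by an ADC.
# All names and parameters here are hypothetical, for illustration only.

def dac(code):
    """8-bit code -> normalized voltage in [0, 1]."""
    return code / 255

def adc(volts):
    """Normalized voltage -> nearest 8-bit code, clipped to range."""
    return min(255, max(0, round(volts * 255)))

def round_trip(code, noise=0.0):
    """Send a code through DAC, add uniform analog noise, re-digitize."""
    v = dac(code) + random.uniform(-noise, noise)
    return adc(v)

# With ideal (noiseless, matched) converters, the round trip is exact:
assert all(round_trip(c) == c for c in range(256))
```

With noise well under half an LSB (1/510 of full scale here), codes still come back unchanged; only a sloppy converter chain degrades the picture.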


BTW, since they have to rescale from 720p or 1080i (the two standard HDMI HD signals), they need a good scaler design (Sony uses 1366 x 768 in more than one of their digital display products), and one good scaler for all of them is good economics. This has nothing to do with the D/A and A/D conversion issues, though.


There is still a benefit to using HDMI with the 42WE655 or similar units, and that is less signal loss for long cable runs with smaller cables. With component cables, for instance, cable capacitance in long runs can lead to progressive loss of high-frequency information and produce softer pictures. With digital signals, a bad cable will be instantly noticeable (starting with "sparklies", then pixelization, and ultimately no usable signal). One high-quality HDMI cable is no bigger than one of the three cables in a component cable set, and is smaller than a similar-quality SVGA cable. For cables that are 12' or less this is not an issue, but if you've got to run 25' or more, then the relative characteristics become more significant.


The ultimate decision of which input to use depends on what the user perceives as the better end result - good PQ.


I think jpco's comment puts the MB issue to bed with regard to downstream analog processing. The MB issue is related to how the Faroudja circuits are implemented and the way in which a particular display reacts to this artifact.
 

·
Registered
Joined
·
7,958 Posts
Quote:
Originally Posted by ekb
Also, even though the native display is (?)x768 pixels, the HDMI input only accepts 720p, 1080i and 480i/p. Although not proof, one would think that if it was all digital, it would include the 768 input which would require no scaling.


Ed
I suspect that HDMI is being treated as a "video" rather than a "PC" interconnect (*) - and so uses the standard video resolutions of 1080/60i, 720/60p, 480/60p, 480/60i (**) rather than PC/Display resolutions of 1366x768 etc.


(*) and thus uses black=16 and white=235, rather than black=0 and white=255

(**) Also 1080/50i, 720/50p, 576/50i and 576/50p in Europe.
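The "video" vs "PC" levels footnote above can be made concrete. A minimal sketch of the two conventions: video interconnects carry 8-bit luma with black at code 16 and white at 235 (limited range), while PC interconnects use the full 0-255 range. These helpers (hypothetical names, simple linear mapping without dithering) convert between the two.

```python
# Convert 8-bit luma between PC full range (0..255) and
# video limited range (16..235). Simple linear remap for illustration.

def full_to_limited(y):
    """Map full-range 0..255 to limited-range 16..235."""
    return round(16 + y * 219 / 255)

def limited_to_full(y):
    """Map limited-range 16..235 back to full-range 0..255, clipped."""
    return min(255, max(0, round((y - 16) * 255 / 219)))

print(full_to_limited(0))    # black -> 16
print(full_to_limited(255))  # white -> 235
```

Treating a limited-range signal as full range (or vice versa) is what produces crushed blacks or washed-out grays when a display and source disagree on the convention.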
 

·
Moderator
Joined
·
23,031 Posts
sneals2000 is right, and it's not like this is anything new. Consumer "TVs" are made to be fed by consumer "players" (DVD, STB, etc). There are standards which define the resolutions these consumer devices support. Without these standards, DVD player "A" would have to "guess" the video timings of TV "B". AFAIK, there is no standard for 1366x768 (don't the newer Sony LCD RPTVs use a slightly bigger oddball res?) or 852x480, etc. A lot of AVSers have their microscopic view of things and don't understand why manufacturers can't support every feature under the sun. The common consumer doesn't buy a PC and connect it to their TV to watch DVDs. They use the computer for email or games, etc. The average consumer doesn't buy a DVD player and expect it to do (or try to make it do) 1:1 pixel mapping with their digital display. The general consumer has a hard time figuring out where to plug the cables and which ones to use. :)


larry
 

·
Registered
Joined
·
84 Posts
Quote:
Originally Posted by ekb
So now, even though there still is a weak argument that an HDMI connection between a DVD player and my TV is better than an analog connection, I have largely lost interest in an upconverting/HDMI player. It seems like the big advantage of keeping the entire signal path in the digital domain is gone.


Ed
EKB,


I have the same set as you, the 42WE655, and I have found that my upconverting Sony 975 looks much sharper at 720p than 480p. The reason is that the Sony TVs have different filter and sharpness settings for each combination of input and scan rate. Using the sharpness patterns on DVE, the 480p input over HDMI or component looks overly filtered. The 720p and 1080i inputs over HDMI have less filtering and thus look sharper. I was actually able to reduce the filtering for 480p in the service menu and get a good picture, but I decided to stick with 720p. The odd thing is they use less filtering on 480i than 480p. Feeding 480i and letting the TV deinterlace and upscale looks quite sharp.


So basically, for your TV, you can do fine without an upscaling player, but you really need to do some of the service-mode tweaks to get rid of the excessive filtering at 480p, or just run at 480i and let the TV deinterlace.
 

·
Moderator
Joined
·
23,031 Posts
Quote:
But considering the 5910, according to Kris Deering, has the best PQ of any player currently available,
Where'd he say that? While it might be true given the SOTA deinterlacer and scaler in the 5910, I don't think he'd make that statement considering he hasn't tested or seen every DVD player currently available. One of the reasons Kris is respected is that he doesn't make (blanket) statements with no data to back them up.


larry
 