
Registered · 2,736 Posts · Discussion Starter #1
Hi Guys,


Has anyone had a chance to compare the Pioneer Pro 1000 with both a DVI input card at the panel's native rate, and with an SDI input card using its internal deinterlacer and scaler?


If not, which should give better image quality?


- SDI: no losses between the source and the panel, but you're at the mercy of the display's onboard deinterlacer and scaler.


- DVI: potential losses from the source's SDI signal going through an external scaler and being converted to DVI, plus native-rate issues...


Thanks for your learned comments. :)
 

Registered · 2,736 Posts · Discussion Starter #2
To get the ball rolling: it would seem logical that the best picture would come from a display whose native rate requires no scaling of the SDI signal, only deinterlacing?
 

Registered · 925 Posts
brett - please clarify. i am not aware of any PDP with SDI input capability, so i'm not sure what you're asking folks to compare.


if such a beast existed, i'd be surprised if PQ b/w them were different - though i don't know all the specifics of the SDI interface.


doody
 

Registered · 1,570 Posts
SDI is a 480i signal, as I recall, but of course in digital form. The only native-rate panels for this type of connection would be the current 42" plasmas with a 480p native rate. No scaling would be needed, just deinterlacing by the panel.
 

Registered · 1,514 Posts
I'm with Doody on this one; no PDP exists with an SDI input. You can run a DVI input from an HTPC with the right output card, and that looks fabulous from what I hear (I don't own either). You can run an SDI input into two scalers today, the Vigatec Dune and the LEEZA (soon a third, the Rock+), and from there run RGBHV into the display. On a LEEZA with this option, you can go SDI into the LEEZA and DVI out to the PDP.


I'm not sure what you are trying to compare here, to be honest.


Cheers
 

Banned · 17,607 Posts
Ikegami and a few others did it.


Companies in Europe modify Plasmas for SDI.


This makes sense ONLY if the VIDEO PROCESSING SECTION of the PLASMA is very advanced, and its circuitry is designed for a clean tap between the analog and digital stages.


I could see problems, however, in controlling the color and hue of such modified units.


But if those conditions were met, I would like SDI in...
 

Registered · 282 Posts
I concur with Joel -- I have experimented and can get a superb picture from my PC outputting DVI into the Pio 503.


I still prefer going through the Faroudja NRS to the RGBHV of the Pio...


However, Ericbee reports that the best of all is the LEEZA at NR going into the DVI port.


Michael M
 

Registered · 2,736 Posts · Discussion Starter #8
Here's a link to a UK based outfit that has been modding Plasma Displays for SDI input:
http://www.visionery.co.uk/


There's not much info on their website, and the links to their PDF information are mostly dead... Reportedly it is very expensive - whether that's justified or not is your guess.


Apparently SDI is not always a 480i signal, as their display is for use in Europe with a 576i SDI signal. Depending upon the source signal and the plasma display's native rate, most likely an SDI signal would have to be either upconverted or downconverted in addition to onboard deinterlacing.
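

To make that concrete, here's a rough sketch (in Python; the panel resolutions are purely illustrative assumptions, not the Pro 1000's actual specs) of what an SDI feed would need before hitting the glass:

Code:
# Rough sketch of the conversion an SDI feed would need before reaching the panel.
# The panel resolutions below are illustrative assumptions only.
SDI_SOURCES = {"480i (NTSC)": 480, "576i (PAL)": 576}
PANELS = {"480p-native 42in plasma": 480, "768-line 50in plasma": 768}

for src_name, src_lines in SDI_SOURCES.items():
    for panel_name, panel_lines in PANELS.items():
        if panel_lines == src_lines:
            step = "deinterlace only"
        elif panel_lines > src_lines:
            step = "deinterlace + upconvert %.2fx" % (panel_lines / src_lines)
        else:
            step = "deinterlace + downconvert %.2fx" % (panel_lines / src_lines)
        print("%s -> %s: %s" % (src_name, panel_name, step))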


There is at least one post in this forum (Ericbee, was it you?) mentioning that oodles of input cards are being made for the Pioneer Pro 1000, including an SDI input card. That means the signal would have to be upconverted to the native rate, but all in the digital realm.


I guess what it all boils down to is whether the SDI signal is somehow "cleaner" than DVI. But whether the deinterlacer and scaler are outboard or onboard, both paths avoid an analogue stage, so it really comes down to which is better: your external scaler or the display's.


One thing I'm wondering, though, is whether going through DVI might add artifacts or color problems that wouldn't be there when feeding SDI directly into the display. Or would the display itself have to convert the SDI signal, after deinterlacing and scaling to native rate, into an internal DVI signal anyway?
 

Registered · 1,514 Posts
Brett, the SMPTE standard for SDI is interlaced only. No progressive. The DVI standard and connector have the bandwidth for progressive and HD content. But both are digital and clean.
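

For a rough sense of the bandwidth gap, a quick back-of-the-envelope calc (Python; the 1280x768 panel is just an assumed example, and blanking intervals are ignored):

Code:
# Published figures, rounded; blanking overhead ignored for simplicity.
sd_sdi_mbps = 270                            # SMPTE 259M standard-definition SDI
dvi_pixel_clock_mhz = 165                    # single-link DVI maximum pixel clock
dvi_payload_mbps = dvi_pixel_clock_mhz * 24  # 24 bits of RGB per pixel clock

print("SD-SDI payload:        %d Mbit/s" % sd_sdi_mbps)
print("Single-link DVI limit: %d Mbit/s" % dvi_payload_mbps)  # 3960 Mbit/s

# What an assumed 1280x768 progressive panel at 60 Hz would actually need:
panel_mbps = 1280 * 768 * 60 * 24 / 1e6
print("1280x768 @ 60 Hz RGB:  %d Mbit/s" % panel_mbps)        # ~1415 Mbit/s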


All things being equal, any digital end-to-end pathway will be superior to a pathway with an analog stage. If you could get SDI into a plasma, it would still have to be deinterlaced (both film and video sources), and for the moment (Fujitsu's AVM possibly the exception) no internal deinterlacer is as good as the outboard ones, which is why the best pictures have been obtained either via a PC deinterlacing the DVD signal, or an outboard LEEZA deinterlacing the digital signal and outputting DVI.
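

To illustrate why the deinterlacer matters so much, here is a minimal sketch (Python/NumPy, not any particular product's algorithm) of the two basic approaches a deinterlacer has to choose between:

Code:
import numpy as np

def weave(top_field, bottom_field):
    """Interleave two fields into one frame: full vertical detail, but combs on motion."""
    frame = np.empty((top_field.shape[0] * 2, top_field.shape[1]), top_field.dtype)
    frame[0::2] = top_field
    frame[1::2] = bottom_field
    return frame

def bob(field):
    """Line-double a single field: no combing, but only half the vertical detail."""
    return np.repeat(field, 2, axis=0)

# Film-sourced material: both fields come from the same film frame, so weave is lossless.
# Video-sourced material: fields are 1/60 s apart, so a good deinterlacer must detect
# motion and fall back to bob (or something smarter) wherever weave would comb.
top = np.zeros((240, 720), dtype=np.uint8)
bottom = np.zeros((240, 720), dtype=np.uint8)
assert weave(top, bottom).shape == (480, 720)
assert bob(top).shape == (480, 720)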


that's the best sense I can make of it.


Cheers
 

Registered · 2,736 Posts · Discussion Starter #10
Hi Joel,


Yeah, I realized that. I'll also bet that DV and DVI are entirely different animals? I was also wondering whether a plasma screen would accept an interlaced DVI signal at a rate other than its native rate (even if not as good as native rate or an SDI input). Here's why I wondered:

http://www.dv.com/magazine/2001/0201/henage0201.html


You'll guess, I was reading up on Miranda's latest gadget. :)
 

Registered · 1,681 Posts
Stating the obvious, why not both?


Leeza and a good HTPC should be able to do both.


-Steve
 

Registered · 1,570 Posts
I think that my PRO's internal deinterlacer is not good at all. I just got digital cable. The signal is very clean. It has both analog and digital channels, both good, but the digital channels are a bit superior.


Anyway, this box has only composite out. The stair-step artifacts are so obvious. On S-Video through my 7700 DVD player, I don't see anything like that. What's up with that? If the internal scaler were bad, it would show on all inputs. Component from the DVD looks incredibly nice.


I think I'll put an HTPC together with DVI out to the Pio for deinterlacing and scaling. I am waiting for the Faroudja NR to be equipped with DVI output. Maybe at CEA 2002? Hopefully.


Then again, I may change my plasma if Panasonic comes out with a new 50", complete with DVI HDCP or an expansion slot.
 

Registered · 1,514 Posts
Interesting link, Brett. I'm no pro at this, but it appears that DV rides on IEEE 1394, which is a different spec from DVI. 1394 is a bidirectional spec, which is likely to be adopted for recording devices because it allows the source to tell the recorder how many (or whether any) copies can be made and lets the recorder acknowledge it. DVI, which is one-directional, is likely to be adopted as the standard for sending an encoded signal to a display device.


The Miranda product moves SDI to IEEE 1394, and vice versa. That's different -- it's not a DVI output device and will not work with the DVI-equipped PDPs, absent some kind of FireWire-to-DVI transcoder, and I have no idea whether such a thing exists.


The LEEZA, unlike the Miranda product, takes SDI input and produces DVI output. I would think that's a better match for PDP applications than the Miranda, which is really a means of taking DV-cam or mini-DV video output (on 1394) and converting it to SDI for professional work.


Cheers
 

Banned · 17,607 Posts
As Joel said:


When someone tries an SDI input into the Fujitsu AVM circuit, that will be the true TEST. But I would want to retain color control.
 

Registered · 258 Posts
The Sony PFM510A2WU supports SDI with a Sony plug-in card, the BKM501D ($1,999 MSRP). This enables 4:2:2 SDI input.


fwiw
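

For anyone wondering where the 4:2:2 figure leads, a quick sketch using the standard BT.601 sampling rates (an assumption on my part that this card follows the usual 270 Mbit/s interface):

Code:
# Why 4:2:2 SD-SDI works out to 270 Mbit/s (ITU-R BT.601 sampling rates).
luma_rate_hz = 13_500_000        # 13.5 MHz luma sampling (525- and 625-line)
chroma_rate_hz = 2 * 6_750_000   # Cb and Cr each at half the luma rate (4:2:2)
bits_per_word = 10

total_mbps = (luma_rate_hz + chroma_rate_hz) * bits_per_word / 1e6
print("4:2:2 SDI data rate: %d Mbit/s" % total_mbps)  # 270 Mbit/s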
 