
DVI or HDMI vs. HD-SDI - Page 3

post #61 of 96
Quote:
Originally Posted by madshi View Post

I used to do SDI in the past, but now I've gone HTPC instead. SDI has one big disadvantage: It doesn't support 4:2:0, so the decoder must upsample chroma in one direction (4:2:2) and somebody else must upsample it in the other direction (4:4:4). Furthermore, proper chroma upsampling results in more than 8 bits. So in order to send the upsampled data over SDI, the decoder also has to round the upsampled data down to 8bit. I believe it's technically a better idea to decode in the native video format (usually 8bit 4:2:0) and then to do all necessary video processing (chroma upsampling, color conversion, possibly scaling, etc.) in one place, in high bitdepth, and the final processing output should be dithered down to the output bitdepth. This is what my HTPC is doing.

These are the kinds of differences we should be discussing: upsampling and rounding errors for a start. Not this silly boutique cable crap!
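A minimal Python sketch (illustrative only, not madshi's code) of that upsampling-and-rounding point: linearly interpolating a made-up row of 8-bit 4:2:0 chroma towards 4:2:2 produces values that land between 8-bit codes, so rounding straight back to 8 bits throws that precision away, while dithering would spread the error as fine noise.
Code:
import numpy as np

# Made-up 8-bit chroma samples from one row of a 4:2:0 source (half horizontal resolution)
chroma = np.array([120, 121, 124, 131], dtype=np.uint8).astype(np.float64)

# Simple 2x linear interpolation towards 4:2:2 (real decoders/renderers use longer filters)
up = np.empty(2 * len(chroma) - 1)
up[0::2] = chroma                              # original samples pass through
up[1::2] = (chroma[:-1] + chroma[1:]) / 2.0    # new samples land between 8-bit codes

print(up)                 # [120. 120.5 121. 122.5 124. 127.5 131.]  -> needs more than 8 bits
print(np.round(up))       # rounding straight back to 8 bits discards the half-LSB detail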
Edited by Glimmie - 4/23/13 at 2:43pm
post #62 of 96
Quote:
Originally Posted by madshi View Post

FWIW, the chroma upsampling and scaling algorithms of a modern HTPC are better than what e.g. a DVDO processor can do.
Not to get too off-topic, but can you say in particular which algorithms are superior? I've been using FOSS and other free programs, and am starting to buy some commercial stuff. I'd like to get to know the inner secrets of stuff like deinterlacing and frame rate conversion. It seems that the FOSS software has more options than some expensive commercial programs!
post #63 of 96
Well, of course I'm biased... tongue.gif But I find Jinc3 with anti-ringing filter pretty good. For that you'll need to use madVR, and you'll need a reasonably fast GPU. Jinc3 AR can be used for both chroma upsampling and image scaling. But I guess HTPC-specific discussions probably belong in the HTPC forum. That said, if you can do a comparison with SDI then that would make this on-topic again. Deinterlacing and frame rate conversion are quite interesting discussions, too, but would probably belong in a new thread.
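As a rough idea of what such a scaler does (Jinc3 is a 2D cylindrical filter, so this sketch uses the simpler 1D Lanczos3 kernel that madVR also offers; the sample row is made up), resampling a hard edge shows the ringing that the anti-ringing filter is there to suppress:
Code:
import numpy as np

def lanczos3(x):
    # Lanczos kernel, a = 3: sinc(x) * sinc(x/3) for |x| < 3, else 0
    x = np.asarray(x, dtype=np.float64)
    return np.where(np.abs(x) < 3.0, np.sinc(x) * np.sinc(x / 3.0), 0.0)

def upscale_row(row, factor):
    # Upscale a 1D row with 6 Lanczos3 taps per output sample (no anti-ringing step here)
    row = np.asarray(row, dtype=np.float64)
    out = np.empty(int(len(row) * factor))
    for i in range(len(out)):
        center = (i + 0.5) / factor - 0.5                  # source position of this output sample
        taps = np.arange(int(np.floor(center)) - 2, int(np.floor(center)) + 4)
        w = lanczos3(center - taps)
        out[i] = np.dot(w, row[np.clip(taps, 0, len(row) - 1)]) / w.sum()
    return out

# A hard edge: note the overshoot/undershoot (ringing) that an anti-ringing filter clamps
print(np.round(upscale_row([0, 0, 0, 255, 255, 255], 2.0)))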
post #64 of 96
Quote:
Originally Posted by madshi View Post

Well, of course I'm biased... tongue.gif But I find Jinc3 with anti-ringing filter pretty good. For that you'll need to use madVR, and you'll need a reasonably fast GPU. Jinc3 AR can be used for both chroma upsampling and image scaling. But I guess HTPC-specific discussions probably belong in the HTPC forum. That said, if you can do a comparison with SDI then that would make this on-topic again. Deinterlacing and frame rate conversion are quite interesting discussions, too, but would probably belong in a new thread.

So what is your recommended software chain for standard def DVD playback? The problem I have had with HTPC is I have yet to find a good de-interlacer for SD material. Also the upconversion? I just got a Teranex as my latest processor upgrade. My goal is to rip all my SD DVDs to a server. I have even considered building an HTPC player that outputs SD SDI via a modified ancient decoder card.

I have an Nvidia FX4000 card with HDSDI on my HD media player, which uses VLC as the MPEG4 decoder. This spits out 1080i as the files are 1080i native (old DVHS recordings). This setup is fantastic for these recordings that have been dumped to a RAID. But when I tried DVD images through VLC, they didn't look that great at all.
post #65 of 96
I'd think there would be an existing thread by now. If not, I guess I'll just have to fix that! wink.gif Thanks!
post #66 of 96
Quote:
Originally Posted by Glimmie View Post

So what is your recommended software chain for standard def DVD playback? The problem I have had with HTPC is I have yet to find a good de-interlacer for SD material. Also the upconversion? I just got a Teranex as my latest processor upgrade. My goal is to rip all my SD DVDs to a server. I have even considered building an HTPC player that outputs SD SDI via a modified ancient decoder card.

I have an Nvidia FX4000 card with HDSDI on my HD media player, which uses VLC as the MPEG4 decoder. This spits out 1080i as the files are 1080i native (old DVHS recordings). This setup is fantastic for these recordings that have been dumped to a RAID. But when I tried DVD images through VLC, they didn't look that great at all.

VLC is a nice player, but it doesn't have the highest playback quality as its primary goal. I'd suggest using madVR as the video renderer (I'm the madVR developer, so I'm biased, of course). In order to use it, you need to use a media player which supports madVR. Choices are MPC-HC, MPC-BE, ZoomPlayer, JRiver MC18 and a few more. Then configure the media player to use madVR. In the madVR settings, select Jinc3 with the anti-ringing filter enabled for both chroma and image upscaling. You'll need a reasonably fast GPU for this. If your GPU is too slow for Jinc3 AR then try Lanczos3 AR instead (or upgrade your GPU).

Make sure you configure madVR correctly (video vs. PC levels). Also make sure you configure the GPU correctly. The GPU should be set to always output RGB 0-255, and all video processing options in the GPU control panel should be turned off, except deinterlacing. Finally, for *film* content you can force madVR into film mode. That will give you a high quality IVTC algorithm with automatic cadence detection and support for all kinds of cadences (including Anime) and pretty good PAL 2:2 cadence support. For *video* content, of course, you shouldn't have madVR set to film mode. In that case DXVA2 deinterlacing is used, which is a reasonably good video mode deinterlacer. madVR supports a large number of keyboard shortcuts. E.g. you can press Ctrl+Alt+Shift+D to turn deinterlacing on/off, or Ctrl+Alt+Shift+T to switch the content type (film vs. video). Or you can press Ctrl+Alt+Shift+M to change the decoding matrix or Ctrl+Alt+Shift+P to change the primaries, etc.

One thing still missing in madVR is support for automatic video vs. film mode detection (ideally per pixel). That's still on my to-do list. For now you either have to manually switch between film and video mode, or you can tag your video files (e.g. "Gladiator [deint=film].mkv"), or you can always use "auto" mode. In that mode DXVA2 deinterlacing is used, which works reasonably well for both video and film, but it will always turn 60i into 60p, so 3:2 pulldown judder will not be removed for movies.
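For reference, a toy sketch (frame labels are hypothetical) of what 3:2 (2:3) pulldown does to 24p film and why a cadence-aware IVTC can undo it, while a plain 60i-to-60p deinterlacer leaves the judder in place:
Code:
# Four made-up film frames (24p) become ten 60i fields via the repeating 2,3,2,3 pattern
film = ["A", "B", "C", "D"]

fields = []
for i, frame in enumerate(film):
    for _ in range(2 if i % 2 == 0 else 3):
        parity = "t" if len(fields) % 2 == 0 else "b"      # alternate top/bottom fields
        fields.append(frame + parity)                       # e.g. 'At' = top field of frame A

video_frames = [fields[j] + fields[j + 1] for j in range(0, len(fields), 2)]
print(video_frames)
# ['AtAb', 'BtBb', 'BtCb', 'CtDb', 'DtDb'] -- the 3rd and 4th 60i frames mix two film
# frames ("combing"); IVTC re-pairs fields by cadence and recovers A, B, C, D exactly,
# while a 60i->60p video-mode deinterlacer keeps the 3:2 judder.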

P.S.: Just checked: the FX4000 is way too slow to run decent upscaling algorithms or good deinterlacing. It's 9 years old, which is an eternity in the computer world. Just for comparison: the FX4000 has a texture fillrate of 4.5 GT/s. Today's GPUs are so much more powerful, which is really necessary to run high quality algorithms. E.g. an NVidia 650 (which is the lowest-speed GPU I would recommend; better would be the 660) has a texture fillrate of 34 GT/s. Same situation with shader power: the FX4000 has around 100 GFLOPS. The 650 has 800 GFLOPS. And the 660 is around twice as fast as the 650. I'd suggest that you try a 650 or 660, of course with HDMI then. I would recommend having your HTPC configured to output 8bit RGB in the native resolution of your display (probably 1080p). For that, HDMI should do just fine. If you want to give madVR a try with your FX4000, set the chroma and image upscaling algorithms to "Bilinear"; that's probably the max your GPU can do. That's of course not a good idea for image quality, but it might allow you to run a first few tests. Switch your media player to fullscreen view (without any menus or buttons visible), then madVR will automatically go into fullscreen exclusive mode, which will make sure there's no tearing and that your GPU can be used to its full potential. If you want to know which cadence the IVTC algorithm detected (only if you forced film mode), you can turn on the debug OSD (Ctrl+J).
post #67 of 96
A question for Glimmie (or maybe madshi): how do you manage to get an HD-SDI card (or a regular HDMI card) to output a properly timed video signal, so an outboard processor is able to do a 100% perfect cadence reconstruction and framerate conversion? Especially if you output 1080i (Glimmie) using a software decoder like VLC?

In my experience (or at least every time I tried in the past), and no matter whether using SDI / HD-SDI or HDMI, the missing hardware clock for video playback caused irregular timings. This means an external processor is able to deinterlace 1080i movie material, but if you tried to have 1080i60 converted to 1080p24 (which works perfectly fine with 1080i coming from a media player), it would cause stutter and cadence breaks.
post #68 of 96
Quote:
Originally Posted by Fudoh View Post

how do you manage to get an HD-SDI card (or a regular HDMI card) to output a properly timed video signal, so an outboard processor is able to do a 100% perfect cadence reconstruction and framerate conversion?...

In my experience (or at least every time I tried in the past), and no matter whether using SDI / HD-SDI or HDMI, the missing hardware clock for video playback caused irregular timings. This means an external processor is able to deinterlace 1080i movie material, but if you tried to have 1080i60 converted to 1080p24 (which works perfectly fine with 1080i coming from a media player), it would cause stutter and cadence breaks.
I'd think that the video signal being put out should be complete enough to be processed without errors regardless of any reasonable timing variations. Back in the day we used time base correctors to reclock outputs that couldn't lock up (like VCRs). These days that's mostly unnecessary because of plentiful RAM and quartz crystal time bases in just about everything. Even if timing does drift, most processors also function as frame buffers, so it really shouldn't matter that their input comes from an undisciplined source. The processor's input shouldn't care as long as the source doesn't cause a buffer overflow or underrun, and a decent SDI card shouldn't.

If you're using SDI, I presume that you're in a pro/broadcast shop with things like tri-level genlock and master clocks for the house. This isn't home equipment. (If you have the kind of money to do it at home, great, but remember to do the whole plant.) If you're doing work at 24 and/or 24/1.001 fps, it naturally follows that you'll want a time base that is capable of deriving the correct frequency or frequencies for the output frame rate, and have the output of your scaler (or whatever the processor is) genlocked to the "p24" reference. It's always the output that's on a disciplined time base, and the input to the frame buffer is assumed to be an undisciplined time base. Sure, you can genlock the computer if the output card supports it, but the output of the processor will be the only time base that matters in the chain at that point. If your processing equipment doesn't support genlock/master clock, you need new processing equipment that does. Simple as that. smile.gif
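To illustrate the frame-buffer point, here is a toy sketch (not any particular product's design): the input side writes whenever frames arrive, the output side reads on its own disciplined clock, and the worst case is an occasional repeated or dropped frame rather than a broken signal.
Code:
from collections import deque

class FrameSync:
    # Toy frame synchronizer: writes happen at the source's (possibly jittery) rate,
    # reads happen on the output's own stable/genlocked clock.
    def __init__(self, depth=3):
        self.buf = deque(maxlen=depth)   # bounded: on overflow the oldest frame falls out
        self.last = None

    def write(self, frame):
        self.buf.append(frame)

    def read(self):
        if self.buf:
            self.last = self.buf.popleft()
        return self.last                 # if the input is late, repeat the previous frame

sync = FrameSync()
sync.write("frame 1"); sync.write("frame 2")
print(sync.read(), sync.read(), sync.read())   # frame 1 frame 2 frame 2  (a repeat, not an error)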

BTW, I may be biased, or hopelessly outdated, but I still like the designs of Yves Faroudja. I don't know if any of the Faroudja algorithms are available in software (that's one thing I'm trying to find out), but my first HD set has Faroudja DCDi circuitry in it and does absolutely beautiful rendering of SD content. I didn't realize just how good it was until I "upgraded" to newer sets without this technology.
Edited by Speed Daemon - 4/24/13 at 5:21am
post #69 of 96
Madshi, thanks for the info. I need to digest it all.

The old FX4000 SDI card I have just works very well with interlaced HD files using VLC. Note it does not with SD files, specifically DVDs. The newer (yet old too) FX5500 SDI card does not work as well either, even on the HD material. It may be because the FX4000 SDI was designed separately from the stock FX4000 as a dedicated SDI card. The FX5500 and newer cards use the stock GPU card and just add an HDMI-to-HDSDI converter card with some control interfacing - probably for cost reasons and the very limited HDSDI market. So they may be re-interlacing on the SDI option card when you request an interlaced format. I don't know. But the old FX4000 just works very well with my DVHS recordings. I also have newer MPEG 4 satellite recordings that work equally well on VLC. I think the reason may be that all VLC is doing is the MPEG 2/4 decoding. Give it interlaced MPEG and ask for interlaced output, and this version of VLC just sends it on through. I have tried newer VLC versions that don't look as good. I think I just stumbled on a good match here between the GPU and the VLC version.

As for genlocking, yes, some of these cards support that. I don't use it. It also varies in pro applications. Many times these systems are just free running, as the output is for display purposes only and the material is passed around on a SAN. If laying off to tape, they are usually genlocked, but that's more for timecode and audio issues. In a broadcast environment they definitely need to be genlocked. The only thing I lock is my laserdisc player, because the internal TBC reference is so poor my decoder doesn't lock to it reliably. That, BTW, was a modification to allow that. I also have 3/4-inch and 1-inch VTRs that require external lock, so I have an NTSC sync generator, but it's not a "standards" quality unit. I also have a Leader 440 HD sync/test generator, but it only does 1080i and 720p; that's why I have it, nobody wants it. cool.gif

Some may wonder why I have these expensive but old cards? Well, they were e-wasted when the systems were upgraded to the 600 series Madshi speaks of. And as we know, the cost of these new cards has come down significantly - well within hobby price points. I may need to rethink my HTPC architecture. I can easily run all HDMI if needed.
Edited by Glimmie - 4/24/13 at 11:57am
post #70 of 96
When doing SD playback with the FX4000, do you output the native format of the video (640x480i60 or 720x576i50)? If you output any other format, VLC has to deinterlace first, then scale, then reinterlace. My best guess is that deinterlacing doesn't work well and that with HD playback you're just lucky because VLC doesn't have to deinterlace but can just output natively. In any case, your current setup seems to be optimized to use the HTPC only for pure playback, without deinterlacing and scaling. The setup suggested by me is completely different in that it includes deinterlacing, color conversion and scaling, resulting in RGB output. So it's a completely different logic. If you decide to give the RGB chain a try, I'd be quite interested to hear your opinion about how image quality compares...
post #71 of 96
Quote:
Originally Posted by madshi View Post

When doing SD playback with the FX4000, do you output the native format of the video (640x480i60 or 720x576i50)? If you output any other format, VLC has to deinterlace first, then scale, then reinterlace. My best guess is that deinterlacing doesn't work well and that with HD playback you're just lucky because VLC doesn't have to deinterlace but can just output natively. In any case, your current setup seems to be optimized to use the HTPC only for pure playback, without deinterlacing and scaling. The setup suggested by me is completely different in that it includes deinterlacing, color conversion and scaling, resulting in RGB output. So it's a completely different logic. If you decide to give the RGB chain a try, I'd be quite interested to hear your opinion about how image quality compares...

Exactly. The version of VLC I use just passes interlaced material through if requested. I will definitely look into your suggestions for SD. That's the beauty of an HTPC: run VLC for some media and another player for other files. Use the best software tool for the job.
post #72 of 96
Glimmie, it just dawned on me that all this time you were talking about a video card with an SDI output, and not an actual SDI card! No wonder I couldn't find it when I searched for it. Not every NVIDIA FX4000 chipset has an SDI plug on the card. I just assumed it was a BlackMagic card because their website was down.

Another revelation is that, as a video card first and foremost, setting it up to output the proper signal is probably going to be a challenge, especially if you're not running Linux or an OS that uses the X Window System. I wouldn't know where to start in Windows. Yes, I can see how that could be a major problem, especially with Windows.

I'm not sure what role VLC plays here (aside from being able to select precisely where the output goes) since the application itself wouldn't be involved in setting the output color space, modelines etc. Yes, I'd expect that all interlaced SD material would automatically be deinterlaced as a prerequisite for display on most computer monitors.

Perhaps the old Quadro cards simply can't act as a video card and a proper SDI output at the same time. (I'm just guessing here.) What are you using as your primary computer display? Is there a way to specify SMPTE 259M-C as the standard to use on your SDI output?

In this case it just may be easier to use HDMI. After all, with HDCP it's pretty good at degrading to SD. wink.gif
post #73 of 96
Quote:
Originally Posted by Speed Daemon View Post

Speaking from an engineering perspective, I'm with you--silver is used all the time on the best quality connectors because it's the best conductor. I have plenty of BNC and other RF connectors (including N) on my best cables. And while solid silver conductors may be overkill, it's SOP in TV broadcasting and other RF equipment to silver plate conductors and waveguide surfaces for maximum performance. Using silver over copper reduces resistive and reactive losses, period.

Having said that, as long as the digital signal is making it from Point A to Point B, there's no possible way to see any difference between different conductor types, because transmission is done in the digital domain. As long as there are no net bit errors, the end product is exactly the same. I've been around too many allegedly golden-eared audiophiles who claim to hear things that they can't quite explain. That sells a lot of crap to rich people who have every right to get ripped off by snake oil salespeople.

Since HD-SDI bit error rate can be measured by using standard test equipment, I welcome a throwdown that compares the BER of one cable vs. another. Show me what the BERTScope says and I'll be a believer.

I have no problem showing you the difference between the copper and silver cables. Let me know when you are in Miami and that is a deal. It's very tiresome to try to convince you guys about what I'm witnessing. To say that I touched something or something is broken is insane to me. The Belden 1694A that I bought from Markertek is in perfect condition and the picture looks good. Bless you guys and no hard feelings, but I sure wish each and every one of you that doubts what I'm saying could see both cables in action. Again, I have no agenda because I don't get anything out of this, but some disrespectful engineer called Glimmie is telling me that I don't even know what HD-SDI is.
post #74 of 96
Quote:
Originally Posted by jiujitsu35 View Post

I have no problem showing you the difference between the copper and silver cables. Let me know when you are in Miami and that is a deal.
Thanks for the offer, but I can look at both elements right here in the privacy of my own home. No need to travel to Miami...or even across the street for that matter. Perhaps you failed to notice that I said before that I had a great many silver things. I thought it goes without saying that I also have plenty of copper too.
Quote:
It's very tiresome to try to convince you guys about what I'm witnessing.
It's tiresome on this side as well. I explained how you could convince me. Why don't you just go do it? Post your data and be done with it.
Quote:
To say that I touched something or something is broken is insane to me.
I never said any such thing! Also, I don't like people making insinuations about my alleged sanity. Please confine your discussion with me to factual things. Also, since I'm an individual, not a collective, please stop addressing me as "you guys". There's only one of me here.
Quote:
The Belden 1694A that I bought from Markertek is in perfect condition and the picture looks good.
Although I'm a Markertek customer, I get Belden 1694A bulk RG6/U cable courtesy of my local cable company. Yes, it works quite well. I can see that by looking at the BER data that my TiVo brand DVRs show me. That's one nice thing about digital signal transmission. When the BER gets to zero, there is literally no improvement that can be made to the signal chain.
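For anyone who wants to put numbers on such a BER throwdown, a rough sketch (assuming the common rule of thumb that an error-free run of about 3/BER bits gives roughly 95% confidence, and the SMPTE 292M line rate of 1.485 Gb/s):
Code:
import math

def error_free_seconds(target_ber, line_rate_bps, confidence=0.95):
    # Bits of error-free observation needed: N = -ln(1 - CL) / BER  (~3/BER at 95%)
    return (-math.log(1.0 - confidence) / target_ber) / line_rate_bps

HD_SDI_RATE = 1.485e9                     # SMPTE 292M nominal line rate, bits per second
for ber in (1e-9, 1e-12):
    t = error_free_seconds(ber, HD_SDI_RATE)
    print(f"BER {ber:.0e}: ~{t:,.0f} s of error-free running ({t / 60:.1f} min)")
# A 1e-9 claim needs only about 2 s, but a 1e-12 claim needs roughly half an hour on the
# test set -- which is why "the picture looks better" is no substitute for the measurement.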
post #75 of 96
Quote:
Originally Posted by Speed Daemon View Post

Glimmie, it just dawned on me that all this time you were talking about a video card with an SDI output, and not an actual SDI card! No wonder I couldn't find it when I searched for it. Not every NVIDIA FX4000 chipset has an SDI plug on the card. I just assumed it was a BlackMagic card because their website was down.

Another revelation is that, as a video card first and foremost, setting it up to output the proper signal is probably going to be a challenge, especially if you're not running Linux or an OS that uses the X Window System. I wouldn't know where to start in Windows. Yes, I can see how that could be a major problem, especially with Windows.

I'm not sure what role VLC plays here (aside from being able to select precisely where the output goes) since the application itself wouldn't be involved in setting the output color space, modelines etc. Yes, I'd expect that all interlaced SD material would automatically be deinterlaced as a prerequisite for display on most computer monitors.

Perhaps the old Quadro cards simply can't act as a video card and a proper SDI output at the same time. (I'm just guessing here.) What are you using as your primary computer display? Is there a way to specify SMPTE 259M-C as the standard to use on your SDI output?

In this case it just may be easier to use HDMI. After all, with HDCP it's pretty good at degrading to SD. wink.gif

That's correct. The FX4000 SDI is a dual monitor display card with SDI out. I have it set up in dual monitor mode with the standard desktop running analog VGA at 1024x768 and the second monitor running 1080i/60. I can drag the desktop to the second monitor, which is the SDI output. This card allows that under Windows XP. The FX5500 seems to have more restrictions on dual monitor mode, and furthermore, compatibility is worse under Windows 7.

This was the last of the dedicated SDI Nvidia cards. The newer cards like the FX5500 are their standard dual DVI/HDMI with a second card supplied for SDI that appears to be a simple DVI/HDMI-to-HDSDI converter. There is a small ribbon cable to the main graphics card for control, I assume. The SDI sub card does not use the PCI bus. Now, there is an FPGA in the SDI card, so who knows what they are doing? They may just be re-interlacing there when interlaced is requested.
post #76 of 96
Quote:
Originally Posted by jiujitsu35 View Post

I have no problem showing you the difference between the copper and silver cables. Let me know when you are in Miami and that is a deal. It's very tiresome to try to convince you guys about what I'm witnessing.

I don't think we doubt what you are seeing. But whatever it is, it's not due to the cable. Can't happen with HDSDI. Inferior cables produce white sparkles and intermittent dropouts. You are expecting to see analog-type characteristics with a more expensive cable, and that's what you are seeing - the placebo effect. It's quite common among non-technical audio- and videophiles. They have a mindset that anything and everything has an influence on sound and image quality.
post #77 of 96
Quote:
Originally Posted by Glimmie View Post

That's correct. The FX4000 SDI is a dual monitor display card with SDI out. I have it set up in dual monitor mode with the standard desktop running analog VGA at 1024x768 and the second monitor running 1080i/60. I can drag the desktop to the second monitor, which is the SDI output. This card allows that under Windows XP. The FX5500 seems to have more restrictions on dual monitor mode, and furthermore, compatibility is worse under Windows 7.

This was the last of the dedicated SDI Nvidia cards. The newer cards like the FX5500 are their standard dual DVI/HDMI with a second card supplied for SDI that appears to be a simple DVI/HDMI-to-HDSDI converter. There is a small ribbon cable to the main graphics card for control, I assume. The SDI sub card does not use the PCI bus. Now, there is an FPGA in the SDI card, so who knows what they are doing? They may just be re-interlacing there when interlaced is requested.
It seems like it was many, many years ago that I studied the guts of Windows. That was back before XP, when Windows had a lot more public documentation than it does today. So don't presume I actually know what I'm saying here. wink.gif

IIRC the rendering engine for the Windows NT branch, which XP and 7 are descendants of, starts with a giant virtual desktop and splits it among multiple monitors when there is more than one. As such it's bound to the scan parameters of one monitor or another. And because most computer monitors, even before LCD monitors, used progressive scan, it's a pretty good bet that whatever gets fed to your video card is going to be something vastly different from the SMPTE 259M-C 480i signal that the SDI port should be outputting. That would mean that either the kernel driver for the video card or the card itself (including the SDI circuitry) would have to be doing the conversion to SMPTE 259M-C, or else it's not being done. If it's not being done, I can say "there's your problem", which isn't too helpful.

OTOH there's this. Specifically Option "MetaModes" "CRT-0:nvidia-auto-select, DFP-1:1280x720_60.00_smpte296".

For the record, SMPTE 296M is a descriptor for 720p video over HD-SDI (SMPTE 292M). Although it doesn't say it, it does beg the question "is there a modeline argument for SMPTE 259M-C?" and "can I do this in Windows?" I don't have an answer for that. Unfortunately even the Linux NVIDIA drivers are closed-source, so I can't look for undocumented features. But it may give you the ammunition that you need to try to coax your NVIDIA cards to put out a standard SMPTE 259M-C bit stream.

Then there's VLC, which has the ability to pipe its output to other places aside from the Windows standard output. If this was a standalone card I'd look for some sort of way to get at least some precursor to SMPTE 259M-C to a named pipe interface to an SDI card. That's about as far as I get. Unfortunately most people are rushing towards HD video, and what I find on the Internet reflects this. My dead-tree books on digital video don't address this particular issue either.

I believe you said that you got these NVIDIA cards when someone was tossing them out. If I were in your shoes, I'd keep an eye peeled for any unwanted non-HD SDI cards that may be free for the taking. That, or an HDMI-to-SDI converter that supports SMPTE 259M-C. After dealing with Windows NT since version 3.5, I can't imagine an easy way to wrest control of the NVIDIA card away from Windows, or to get Windows to be more SD-SDI compatible.
post #78 of 96
I'm building my first computer and I would like your opinion on a Quadro K5000 SDI video card compared to an Nvidia 690 card for Blu-ray playback.
post #79 of 96
Maybe it would make sense to wait for Glimmie's test results. He said he might try a newer GPU with HDMI output. Personally, I don't have the hardware to compare anymore. Although my expectation would be that a good HTPC configuration directly connected to the display via HDMI should be comparable to (or even better than) an HD-SDI source connected to a(n average) video processor. But that's just what I think. Might be good to wait for somebody with experience to actually compare.
post #80 of 96
I don't see what benefit getting the opinions of people "with experience" might bring to the question "what, if any, are the advantages of HD-SDI over DVI or HDMI?". Electrical engineering isn't an experiential discipline; it's one of facts and numbers. My experience with HD-SDI is that it works. But that doesn't address the question at hand.

Speaking of the question at hand, we might want to consider the possibility that the question itself really isn't all that relevant. HD-SDI is a standard that was developed for use in supply-side video production and transmission, while HDMI was designed for home use. As such, HD-SDI has absolutely no advantages over HDMI. It's like comparing apples to oranges. If you don't already have lots of HD-SDI equipment, there's no practical reason to buy it for home theater purposes. That's not what it was made for.

One thing that I see as telling is the fact that nobody has recognized that the differences between successive versions within both standards are much greater than the difference between the two standards! They're both evolving standards, and the only way to make any direct comparison is to compare specific versions.

If the question is "which is better?" then the answer is "yes". Or "no". Each is best for its intended purpose.
post #81 of 96
@Speed Daemon,

I'm sorry, but you're oversimplifying here. The usual reason why a consumer uses (HD-)SDI in a home theater is that he wants to avoid bad processing in the source device. Often (actually very often) source devices don't simply decode and output the video stream untouched. Instead they apply all sorts of processing. Actually they have to, even with SDI, because neither SDI nor DVI nor HDMI supports 4:2:0 transport. So every source device, regardless of whether it outputs via SDI or HDMI, has to upsample chroma to 4:2:2 first.

The key advantage of SDI mods is that they minimize bad processing in the source device. As such, it is absolutely a valid question whether it's better to use an SDI GPU or an HDMI GPU, if in fact the SDI GPU allows untouched decoding output, which HDMI usually does not. When using HDMI, the GPU in a typical Windows HTPC "thinks" in RGB. So basically the video is usually converted to RGB. If you switch the GPU HDMI output to YCbCr, the GPU probably does two conversions, one from YCbCr -> RGB and then afterwards back to YCbCr. With a native SDI GPU, when using "good" drivers, a more native output might be possible. This all depends on the OS and the GPU drivers, though, and the color space conversions are done behind our backs, so it's not something we can just "know".
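To make the double-conversion point concrete, here is a sketch (illustrative only, using BT.709 coefficients; not what any particular driver actually does) that forces the intermediate RGB through 8 bits and counts how many YCbCr values fail to survive the round trip:
Code:
import numpy as np

Kr, Kb = 0.2126, 0.0722          # BT.709 luma coefficients
Kg = 1.0 - Kr - Kb

def ycbcr_to_rgb(y, cb, cr):     # 8-bit video-range YCbCr in, full-range RGB (float) out
    yy = (y - 16.0) * 255.0 / 219.0
    pb = (cb - 128.0) * 255.0 / 224.0
    pr = (cr - 128.0) * 255.0 / 224.0
    r = yy + 2.0 * (1.0 - Kr) * pr
    b = yy + 2.0 * (1.0 - Kb) * pb
    g = (yy - Kr * r - Kb * b) / Kg
    return np.array([r, g, b])

def rgb_to_ycbcr(r, g, b):       # full-range RGB in, 8-bit video-range YCbCr (float) out
    yy = Kr * r + Kg * g + Kb * b
    return np.array([yy * 219.0 / 255.0 + 16.0,
                     (b - yy) / (2.0 * (1.0 - Kb)) * 224.0 / 255.0 + 128.0,
                     (r - yy) / (2.0 * (1.0 - Kr)) * 224.0 / 255.0 + 128.0])

lost = 0
samples = [(y, c, c) for y in range(16, 236) for c in (96, 128, 160)]
for y, cb, cr in samples:
    rgb8 = np.clip(np.round(ycbcr_to_rgb(y, cb, cr)), 0, 255)     # the 8-bit RGB bottleneck
    lost += not np.array_equal(np.round(rgb_to_ycbcr(*rgb8)), [y, cb, cr])
print(f"{lost} of {len(samples)} sample YCbCr triplets changed by the 8-bit RGB round trip")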
post #82 of 96
Who's oversimplifying? rolleyes.gif

If the "video sources" such as cameras use HD-SDI connectors, it's not some vague purist thing. More expensive studio cameras use their own proprietary cables because they work in conjunction with a camera control unit. That sort of signal processing is most certainly desirable! Other less expensive cameras use SDI outputs because that's all they have. It's not because SDI has magical powers; it's because the TV station is using inexpensive ENG cameras in the studio, for example, and that's the best output jack that they have. No camera system takes the raw analog output from CCD chips and just leaves it that way because what comes out of a CCD is not fit for viewing or editing. If you don't understand the equipment, making wild suppositions isn't going to help.

Like I said there are different versions of the various formats, so claims being made about one particular version are not true for others. This constant hammering about imagined inabilities and what supposedly must be done is silly. Since it's become a common theme, I'm going to ignore it from now on. Those who want to curse the darkness of their own making can walk alone.
post #83 of 96
We're talking about home theater, Blu-Ray playback and HTPCs. None of which have anything to do with studio cameras.

It seems to me that you don't understand the key reason why SDI mods exist for home theater source devices. Please read the 3rd post of this thread (which btw is 7 years old) to get some hints about why SDI may sometimes produce better results than HDMI when talking about consumer video sources. And those are just some examples; I've seen many more things like that mentioned. Today it's 7 years later, maybe the CE companies have learned in the meantime how to do proper video output via HDMI, but I'm not sure about that...

Anyway, fact is that HTPCs are more complex than any other source device. Windows doesn't really have good support for YCbCr; Windows mostly runs graphics in RGB. But there's a lot going on in an HTPC behind the scenes, e.g. in the GPU drivers. Because of that you can't simply say for sure whether an old native SDI GPU will produce identical results to a new HDMI GPU. There might be quite big differences, depending on what the GPU drivers do exactly. That's why I said it might make sense to wait for Glimmie to actually test and compare...
post #84 of 96
Quote:
Originally Posted by madshi View Post

We're talking about home theater, Blu-Ray playback and HTPCs. None of which have anything to do with studio cameras.
How many of you are there?

If that's what you're talking about, it would help if you used appropriate terms like "playback devices". I'm not a mind reader.
Quote:
It seems to me that you don't understand the key reason why SDI mods exist for home theater source devices.
I understand well enough to know that it's snake oil. It wasn't my intention to embarrass anyone here by calling a spade a spade, but in an age where file-based workflow is the norm for both supply and consumer side video, the idea that consumer electronics with professional connectors is somehow better is laughable. I've heard the same old song and dance for decades and I'm not impressed.
Quote:
Windows mostly runs graphics in RGB. But there's a lot going on in an HTPC behind the scenes,
Yes, computers are complicated. But they're not magical. Neither are the algorithms that can switch between different color spaces.

The thing that amuses me most is that in the video production world, component RGB is ideal. That's how it comes out of the camera! Things like chroma keying rely on this type of component video. The video formats that have largely come out of color-under video recording technology aren't the bee's knees by a long shot! But a modern PC has the computing horsepower to convert between different color spaces effortlessly. And since both computer and video displays also use RGB, there's absolutely nothing wrong with sending RGB to them. By the same token, going to a lot of trouble to outfit a Windows PC with SDI connectors, only to convert to sRGB, is a bit silly.

I have a number of video playback "sources" (as you insist on calling them). Some put out the common ITU.709 video, but at least one puts out RGB. It's really not a big deal.
post #85 of 96
Quote:
Originally Posted by Speed Daemon View Post

How many of you are there?

If that's what you're talking about, it would help if you used appropriate terms like "playback devices". I'm not a mind reader.

This whole thread is about "playback devices" and external "video processors". Nobody here (except you) talked about cameras. If you post in a forum thread you're supposed to know what the topic is.

Personally, (as already mentioned) I'm no longer using SDI, and I suppose the number of SDI users is very low, even lower today than it was some years ago.

Quote:
Originally Posted by Speed Daemon View Post

I understand well enough to know that it's snake oil.

You're obviously lacking experience here. There have been many many cases where "playback devices" had either broken processing, or forced certain processing algorithms (like contrast enhancements, sharpening etc) on you with no option to turn them off. As I said, read the 3rd post of this thread to get some real life examples. The advantage of installing an SDI mod into a "playback device" is that you solder it directly to the output pins of the video decoding chip. That gives you the guarantee that no other chip afterwards can damage the video data. I'm not sure if this all still makes sense today because today there are many combined decoding + processing chips, so it might not be possible to get the video data directly after the decoding stage. But some years ago, especially for SD-DVD-players and settop boxes it was a neat trick to get around bad processing performed after decoding.

Quote:
Originally Posted by Speed Daemon View Post

It wasn't my intention to embarrass anyone here by calling a spade a spade

You can't embarrass anyone if you lack the knowledge and experience to do so.

Quote:
Originally Posted by Speed Daemon View Post

but in an age where file-based workflow is the norm for both supply and consumer side video, the idea that consumer electronics with professional connectors is somehow better is laughable. I've heard the same old song and dance for decades and I'm not impressed.

I'm not saying that HD-SDI in itself outputs better quality than HDMI does. However, soldering an SDI mod directly to the output pins of a decoder chip in a "playback device" can give you better quality, *if* the playback device performs bad processing after the decoding stage. That's a technical fact, and quite easy to understand.

Quote:
Originally Posted by Speed Daemon View Post

Yes, computers are complicated. But they're not magical. Neither are the algorithms that can switch between different color spaces.

Color space conversion algorithms are not magical. However, if you know the math, you should also know that a conversion from e.g. YCbCr to RGB results in floating point data, which can even have negative RGB components. Windows usually works in 8bit RGB. So if you go YCbCr -> 8bit RGB -> YCbCr you have two 8bit rounding steps and two 8bit clipping steps in your processing chain, which can also mean you might be losing BTB and WTW (if PC levels RGB is used during the conversion). Being an HTPC video renderer developer myself, let me tell you that this can add banding artifacts to the image. You seem to believe that YCbCr <-> RGB conversions are lossless. They are not, unless the whole processing chain is floating point - which it usually isn't.
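A small sketch of the BTB/WTW part of that (gray-only, illustrative): once video-levels luma is squeezed through PC-levels 8-bit values, everything below 16 and above 235 is clipped and cannot be brought back.
Code:
def video_to_pc_8bit(y):                  # video-levels luma 16..235 -> PC-levels 8-bit gray
    return min(255, max(0, round((y - 16) * 255 / 219)))

def pc_to_video(g):                       # and back again
    return round(g * 219 / 255 + 16)

for y in (4, 16, 100, 235, 240, 250):     # 4 is blacker-than-black, 240/250 whiter-than-white
    print(y, "->", video_to_pc_8bit(y), "->", pc_to_video(video_to_pc_8bit(y)))
# Y=4 comes back as 16, and Y=240/250 come back as 235: once clipped in PC-levels
# 8-bit RGB, the below-black and above-white codes are gone for good.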

Quote:
Originally Posted by Speed Daemon View Post

And since both computer and video displays also use RGB, there's absolutely nothing wrong with sending RGB to them.

There's nothing wrong in itself with an HTPC outputting RGB, *if* the HTPC uses the correct conversion algorithm in floating point and dithers the final result down to the output bitdepth. And btw, that is exactly what my HTPC video renderer does.

*However*, some people are using external video processors like a Lumagen Radiance, a DVDO or a Teranex. And some people have displays/projectors with Realta or Gennum processing chips in them. If you let your HTPC output RGB, you have to deinterlace already inside the HTPC. The deinterlacing algorithm in the HTPC might be of lesser quality than those offered by the external video processor or by the display/projector. So outputting the original untouched (except for 4:2:2 upsampling) YCbCr data to the external video processor *may* make sense, depending on which hardware you have exactly. Furthermore there are some consumer TVs which work better with YCbCr input compared to RGB input (and vice versa). So it might need trial and error to find the best output colorspace for your specific hardware. If you do want/need to output YCbCr from your HTPC, then *maybe* (depending on driver and OS) a native SDI GPU might output untouched YCbCr, which is of course much preferable to twice 8bit rounded and 8bit clipped data, which an HDMI GPU might produce.
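And for the "dither the final result down to the output bitdepth" point, a toy sketch (made-up gradient values): plain rounding of a shallow high-precision ramp produces flat bands, while adding sub-LSB noise before rounding trades the bands for fine grain.
Code:
import numpy as np

rng = np.random.default_rng(0)
ramp = np.linspace(100.0, 102.0, 64)      # a shallow high-precision gradient spanning ~2 codes

rounded  = np.round(ramp).astype(np.uint8)                                       # hard banding
dithered = np.round(ramp + rng.uniform(-0.5, 0.5, ramp.size)).astype(np.uint8)   # fine grain

print(np.unique(rounded, return_counts=True))    # three long flat runs of 100, 101, 102
print(dithered[:24])                             # codes alternate; the average tracks the ramp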
post #86 of 96
Well if it's all about the blind leading the ignorant, then someone should have made that plain. rolleyes.gif
post #87 of 96
Quote:
Originally Posted by madshi View Post

This whole thread is about "playback devices" and external "video processors". Nobody here (except you) talked about cameras. If you post in a forum thread you're supposed to know what the topic is.

Personally, (as already mentioned) I'm no longer using SDI, and I suppose the number of SDI users is very low, even lower today than it was some years ago.
You're obviously lacking experience here. There have been many many cases where "playback devices" had either broken processing, or forced certain processing algorithms (like contrast enhancements, sharpening etc) on you with no option to turn them off. As I said, read the 3rd post of this thread to get some real life examples. The advantage of installing an SDI mod into a "playback device" is that you solder it directly to the output pins of the video decoding chip. That gives you the guarantee that no other chip afterwards can damage the video data. I'm not sure if this all still makes sense today because today there are many combined decoding + processing chips, so it might not be possible to get the video data directly after the decoding stage. But some years ago, especially for SD-DVD-players and settop boxes it was a neat trick to get around bad processing performed after decoding.
You can't embarrass anyone if you lack the knowledge and experience to do so.
I'm not saying that HD-SDI in itself outputs better quality than HDMI does. However, soldering an SDI mod directly to the output pins of a decoder chip in a "playback device" can give you better quality, *if* the playback device performs bad processing after the decoding stage. That's a technical fact, and quite easy to understand.
Color space conversion algorithms are not magical. However, if you know the math, you should also know that a conversion from e.g. YCbCr to RGB results in floating point data, which can even have negative RGB components. Windows usually works in 8bit RGB. So if you go YCbCr -> 8bit RGB -> YCbCr you have two 8bit rounding steps and two 8bit clipping steps in your processing chain, which can also mean you might be losing BTB and WTW (if PC levels RGB is used during the conversion). Being an HTPC video renderer developer myself, let me tell you that this can add banding artifacts to the image. You seem to believe that YCbCr <-> RGB conversions are lossless. They are not, unless the whole processing chain is floating point - which it usually isn't.
There's nothing wrong in itself with an HTPC outputting RGB, *if* the HTPC uses the correct conversion algorithm in floating point and dithers the final result down to the output bitdepth. And btw, that is exactly what my HTPC video renderer does.

*However*, some people are using external video processors like a Lumagen Radiance, a DVDO or a Teranex. And some people have displays/projectors with Realta or Gennum processing chips in them. If you let your HTPC output RGB, you have to deinterlace already inside the HTPC. The deinterlacing algorithm in the HTPC might be of lesser quality than those offered by the external video processor or by the display/projector. So outputting the original untouched (except for 4:2:2 upsampling) YCbCr data to the external video processor *may* make sense, depending on which hardware you have exactly. Furthermore there are some consumer TVs which work better with YCbCr input compared to RGB input (and vice versa). So it might need trial and error to find the best output colorspace for your specific hardware. If you do want/need to output YCbCr from your HTPC, then *maybe* (depending on driver and OS) a native SDI GPU might output untouched YCbCr, which is of course much preferable to twice 8bit rounded and 8bit clipped data, which an HDMI GPU might produce.

Great synopsis. Internal chipset image processing, rounding errors - these are all real issues. I think SDI basically went away on the consumer side with Blu-ray and HDMI. Most videophiles now just have a good Blu-ray player that does their standard DVDs as well. SDI-modified DVD players in the early 2000s were a kludge and problematic for many users.

I do agree that HDMI has specific advantages over SDI for home theater purposes. But it also locks you into a particular processing engine inside the product. Note too that an aftermarket HDSDI tap is no guarantee of a virgin signal path. For example, I have looked inside a JVB HDSDI mod. They literally take the RGB and clock directly off the HDMI connector and run them through an FPGA to reformat to HDSDI. To get around the HDCP, there is another sub board that appears to be tacked into the system control bus. So whatever processing is going on inside the HDMI chipsets is going out the HDSDI as well. And this is also a blatant DMCA violation, as they are deliberately shutting off HDCP, not just tapping the stream before the HDMI engine. But this isn't a law forum.

So as Madshi said, in an HTPC application, HDSDI may not be a good choice at all if all you are doing is feeding a display device. If you can keep the data in native format until the end, you will have better results. Me? I have a specific setup for a huge library of 1080i recordings that have been copied to disk. My solution works well for me but it's highly version specific as I have found. I may try some new HDMI cards with different player software and who knows I may abandon HDSDI as well.
post #88 of 96
Quote:
Originally Posted by Glimmie View Post

For example, I have looked inside a JVB HDSDI mod. They literally take the RGB and clock directly off the HDMI connector and run them through an FPGA to reformat to HDSDI. To get around the HDCP, there is another sub board that appears to be tacked into the system control bus. So whatever processing is going on inside the HDMI chipsets is going out the HDSDI as well.

Ah, that's quite interesting! Of course a mod like this makes no real sense...

Quote:
Originally Posted by Glimmie View Post

I may try some new HDMI cards with different player software and who knows I may abandon HDSDI as well.

If you do try, please let us know about your test results - thanks!
post #89 of 96
Quote:
Originally Posted by Glimmie View Post

I do agree that HDMI has specific advantages over SDI for home theater purposes.
Could you enumerate specifically what those advantages are?

Just to be clear, I'm asking specifically about HD-SDI, and not the alleged faults of circuitry inside some unnamed disk player. Please bear in mind that most commercially available content is encoded in 4:2:0 and that any "bad" chips can easily be bypassed by doing a digital rip of the media.

BTW I have nothing against people wanting their home stuff to look more professional. I use Anvil cases as furniture, for example. But I'm also aware that it's only a fashion statement if you follow me.
post #90 of 96
Quote:
Originally Posted by Speed Daemon View Post

Could you enumerate specifically what those advantages are?

Just to be clear, I'm asking specifically about HD-SDI, and not the alleged faults of circuitry inside some unnamed disk player. Please bear in mind that most commercially available content is encoded in 4:2:0 and that any "bad" chips can easily be bypassed by doing a digital rip of the media.

BTW I have nothing against people wanting their home stuff to look more professional. I use Anvil cases as furniture, for example. But I'm also aware that it's only a fashion statement if you follow me.

Advantages of which - HDMI or HDSDI? And for whom, an HT enthusiast or myself? Because those are two different answers as well.

Didn't you state that HDSDI has little if any benefit in today's home theaters? And didn't I agree with that?