
Registered · 47 Posts · Discussion Starter · #1
My trusty old ATI Radeon card has died and I am contemplating a replacement.


If I am only driving 100% digital displays (e.g. LCDs or similar) without any need for an analog conversion, do the current crop of ATI and nVidia cards buy me anything?


Is all the horsepower (and the dollars) on the latest cards focused only on the digital-to-analog conversion?


What card can anyone recommend for a digital-only solution?


Do I lose the ability to scale if I stay in the digital domain?


I was about to pull the trigger on the latest ATI card (~US$500.00) and realized that I might be wasting my money.


Many thanks.
 

Registered · 3,351 Posts
The following statements assume you're not doing any gaming.


If you are using the overlay as your renderer, then any of the 9500+ cards from ATI will probably give you the same results. The 9600 non-pro is probably the best value right now.


If you are using the VMR9 renderer, then you will be using the 3D parts of the graphics card. In that case the newer cards will probably have better performance, although it's not clear that any of the DVD players are actually taking advantage of that yet.
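
If you want to experiment yourself, here's a rough sketch (not from any particular player, with error handling stripped and a placeholder file name) of how an application forces the VMR9 renderer into a DirectShow graph instead of letting the graph builder pick a renderer on its own:

Code:
// Minimal sketch: build a DirectShow graph that uses the VMR9 renderer.
// "movie.mpg" is just a placeholder. Link with strmiids.lib and ole32.lib.
#include <dshow.h>

int main()
{
    CoInitialize(NULL);

    IGraphBuilder *pGraph = NULL;
    IBaseFilter   *pVmr9  = NULL;

    // Create the filter graph manager.
    CoCreateInstance(CLSID_FilterGraph, NULL, CLSCTX_INPROC_SERVER,
                     IID_IGraphBuilder, (void**)&pGraph);

    // Create the VMR9 filter and add it to the graph *before* calling
    // RenderFile, so the decoder gets connected to it instead of
    // whatever renderer the graph builder would pick by default.
    CoCreateInstance(CLSID_VideoMixingRenderer9, NULL, CLSCTX_INPROC_SERVER,
                     IID_IBaseFilter, (void**)&pVmr9);
    pGraph->AddFilter(pVmr9, L"VMR9");

    // Let DirectShow build the rest of the graph around the VMR9 filter.
    pGraph->RenderFile(L"movie.mpg", NULL);

    // Running the graph (IMediaControl::Run, etc.) is omitted here.

    pVmr9->Release();
    pGraph->Release();
    CoUninitialize();
    return 0;
}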


I can't speak for Nvidia.
 

Registered · 47 Posts · Discussion Starter · #3
jvincent,


Thanks for the reply.


If I understand you correctly, you are indicating that even if I keep my video signal 100% digital (and I don't need a digital-to-analog conversion to drive my LCDs), I still benefit from the 3D/VMR9 renderer for game playing and DVD playing (maybe).


Am I stating this correctly?


I have never been entirely clear whether the 3D renderer on video cards was driven by the digital-to-analog conversion or not.


I am then speculating that the 3D renderer offloads some of the CPU rendering load PRIOR TO the d-to-a conversion. Is this correct?


Do 100% digital video cards (with no d-to-a conversion) which include a 3D renderer exist in the marketplace?


Thanks again.
 

Registered · 3,351 Posts
Quote:
Originally posted by ParkerV
If I understand you correctly, you are indicating that even if I keep my video signal 100% digital (and I don't need a digital-to-analog conversion to drive my LCDs), I still benefit from the 3D/VMR9 renderer for game playing and DVD playing (maybe).


Am I stating this correctly?
Definitely correct for games. It's unclear to me at this point if they are making full use of the 3D graphics engine for DVDs.

Quote:
I have never been entirely clear whether the 3D renderer on video cards was driven by the digital-to-analog conversion or not.


I am then speculating that the 3D renderer offloads some of the CPU rendering load PRIOR TO the d-to-a conversion. Is this correct?
The D/A conversion only applies if your display is connected via analog and doesn't really come into play in the graphics processing other than for adjusting picture properties like colour, brightness, etc.


Think of it this way: the 3D engine generates a digital RGB signal that is sent either to the D/A converter or to the DVI interface. In the case of dual-head cards it is sent to both.


In the 3D graphics game world the whole goal is to push as many of the rendering tasks onto the GPU as possible, thus freeing up the CPU. All of the recent cards have a 3D engine of some kind.
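
Just to illustrate where that hand-off happens (this is only a sketch, and the helper name is made up), the behavior flag you pass when creating a Direct3D 9 device is one concrete example of the choice between keeping work on the CPU and pushing it to the GPU:

Code:
// Sketch only: creating a Direct3D 9 device. Hardware vertex processing
// runs the vertex work on the card; software keeps it on the CPU.
// A real application creates its own window and checks every HRESULT.
#include <d3d9.h>

IDirect3DDevice9* CreateDeviceSketch(HWND hwnd, bool useGpuVertexProcessing)
{
    IDirect3D9 *pD3D = Direct3DCreate9(D3D_SDK_VERSION);
    if (!pD3D) return NULL;

    D3DPRESENT_PARAMETERS pp = {0};
    pp.Windowed         = TRUE;
    pp.SwapEffect       = D3DSWAPEFFECT_DISCARD;
    pp.BackBufferFormat = D3DFMT_UNKNOWN;   // use the current display format

    DWORD behavior = useGpuVertexProcessing
                   ? D3DCREATE_HARDWARE_VERTEXPROCESSING
                   : D3DCREATE_SOFTWARE_VERTEXPROCESSING;

    IDirect3DDevice9 *pDevice = NULL;
    pD3D->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd,
                       behavior, &pp, &pDevice);

    pD3D->Release();
    return pDevice;   // NULL if device creation failed
}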
 

Registered · 1,358 Posts
As jvincent indicated, nobody is sure if any of the DVD software decoders actually use the 3D capabilities of the GPU. In the absence of any indication to the contrary, I think a fair assumption is that they do not. I believe they merely provide support for VMR9 as a plug-in component of DirectShow, which is what Windows XP and 2000 use for DVD support.


In other words, they make sure that the output pins of the decoder connect properly to the input pins of the VMR9 renderer. Some decoders, like Nvidia's, have a built-in interface that allows you to adjust image properties like contrast, saturation, etc.; in VMR9 that is done through hardware/driver support of the GPU. I would call both "low level" support.
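
For what it's worth, those picture adjustments are exposed through the VMR9 mixer interface. Here's a rough sketch (the helper name and values are made up; it assumes the VMR9 filter is already in the graph with mixing mode enabled):

Code:
// Rough sketch: adjusting picture properties through the VMR9 mixer, which
// maps onto the hardware/driver "proc amp" support mentioned above.
// Assumes pVmr9 is a VMR9 filter already in a DirectShow graph, with mixing
// mode enabled (IVMRFilterConfig9::SetNumberOfStreams) before connection.
// Values are made up; a real app calls GetProcAmpControlRange() first.
#include <dshow.h>
#include <d3d9.h>
#include <vmr9.h>

HRESULT NudgePicture(IBaseFilter *pVmr9)
{
    IVMRMixerControl9 *pMixer = NULL;
    HRESULT hr = pVmr9->QueryInterface(__uuidof(IVMRMixerControl9),
                                       (void**)&pMixer);
    if (FAILED(hr))
        return hr;   // not a VMR9 filter, or the mixer is not loaded

    VMR9ProcAmpControl amp = {0};
    amp.dwSize     = sizeof(amp);
    amp.dwFlags    = ProcAmpControl9_Brightness | ProcAmpControl9_Saturation;
    amp.Brightness = 5.0f;    // arbitrary example values
    amp.Saturation = 1.1f;

    hr = pMixer->SetProcAmpControl(0, &amp);   // stream 0
    pMixer->Release();
    return hr;
}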


DirectX and the GPU drivers are the ones that tell the 3D engine how to render 2D polygons and assemble the image from those polygons. DirectX provides the framework for this process, while the drivers adapt it to the particular hardware. For example, driver- and hardware-level support determines the subpixel precision for texture rendering. Older cards had 4 bits per color, then 8 bits, and now most cards provide 10-bit subpixel precision. In the 9500+ series, ATI provides support for 128-bit floating-point color precision (which should translate into 32 bits of precision per color). Also, a different rendering process is employed in cards with 4 rendering pipelines (e.g. the 9600 series) vs. cards with 8 rendering pipelines (e.g. the 9800 series).


I did test the 9800 vs. the 9600 this summer, and the 9800 appeared to be a bit better than the 9600 in terms of image quality. Although the 2D rendering process is not as demanding as the 3D rendering process, it looks like there was some benefit from the additional bandwidth. Also, the algorithms the 9800 uses in 3D rendering are 2.1 pixel shaders vs. the 2.0 pixel shaders used in the 9600 series. Again, the difference was present but not that substantial. It will be interesting to see how ATI's R420 GPU with 3.0 pixel shader support will do (due out this spring).
 

Registered · 3,351 Posts
Since this is as good a thread as any...


What would be really cool is if someone started to move the ffdshow (or equivalent) algorithms into the GPU.


There are some interesting papers/presentations on the ATI website that show examples of image processing using the GPU. This would have the double benefit of offloading the CPU and potentially improving picture quality by allowing the use of the higher precision rendering in the 3D engines.
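
Just to show the kind of per-pixel arithmetic involved (this is only a CPU toy example with a made-up function name, not actual ffdshow code), something like a video-to-PC levels stretch is exactly the sort of loop a pixel shader could run instead, in floating point and at higher precision:

Code:
// Toy CPU example only -- not ffdshow code -- of the per-pixel arithmetic
// a levels filter does. A pixel shader could run the same math on the GPU
// in floating point, keeping more precision than 8-bit integer steps.
#include <algorithm>
#include <cstdint>
#include <cstddef>

// Stretch luma from video levels [16,235] to PC levels [0,255].
void StretchLevels(uint8_t *luma, size_t count)
{
    for (size_t i = 0; i < count; ++i)
    {
        float y = (luma[i] - 16.0f) * (255.0f / 219.0f);
        y = std::min(255.0f, std::max(0.0f, y));
        luma[i] = static_cast<uint8_t>(y + 0.5f);   // round to nearest
    }
}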
 

Registered · 1,358 Posts
I second that request. It would be great if moderators could start a new thread on the topic of software support for 3D rendering.
 