
·
Registered
Joined
·
3,737 Posts
Discussion Starter · #1 ·
This is becoming a bit of a story, although no U.S.-based sites have really picked up on it except XtremeSystems.


Wasn't sure if this was more appropriate in the gaming or HTPC thread, but since AVS HTPC users swear by nVidia, and the cards are used for both gaming and HTPC, I put it here.


A translated article from 3DCenter of Germany.

http://translate.google.com/translat...language_tools


Here is another French article.
http://216.239.37.104/translate_c?hl...language_tools


Excerpt, translated:


Among the innovations NVIDIA is touting, the manufacturer claims the GeForce 7800 GTX has more powerful anisotropic filtering. First, one should not lose sight of the fact that, by its very architecture, the GeForce 7800 GTX loses less performance when complex filtering is enabled, since it has more pixel pipelines than ROPs. But that is not all: NVIDIA has once again modified its anisotropic filtering. It is hard to say exactly what has changed, but it is clear that something is different. Unfortunately, this change sometimes brings a visible drop in quality, since during movement one can now observe shimmering on certain parts of textures. The shimmering is more or less pronounced depending on the texture's level of detail, its orientation, and its texture stage (the first stage is less affected in multi-texturing). Quite faint on the GeForce 6800, the shimmering is much more noticeable here.
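
To make the technical bit clearer, here is a rough sketch of how anisotropic filtering normally decides its sample count and mip level. This just follows the standard OpenGL-style footprint math, not anything NVIDIA has disclosed about the 7800's hardware, and the function and variable names are mine. The point is that if a driver quietly clamps the ratio or biases the LOD to save bandwidth, it takes fewer samples than the pixel footprint needs, and that under-sampling is exactly what shows up as shimmering once the camera moves.

Code:
/* Rough sketch of the standard OpenGL-style anisotropic footprint math.
 * NOT NVIDIA driver code -- the struct and function names are mine.      */
#include <math.h>

typedef struct {
    int    samples;   /* taps taken along the major axis of the footprint */
    double lod;       /* mipmap level actually sampled                    */
} AnisoSetup;

AnisoSetup aniso_setup(double dudx, double dvdx,  /* texcoord change per screen x */
                       double dudy, double dvdy,  /* texcoord change per screen y */
                       double max_aniso)          /* user/driver clamp, e.g. 16.0 */
{
    double px = sqrt(dudx * dudx + dvdx * dvdx);  /* footprint extent along x */
    double py = sqrt(dudy * dudy + dvdy * dvdy);  /* footprint extent along y */
    double pmax = px > py ? px : py;
    double pmin = px < py ? px : py;
    if (pmin <= 0.0) pmin = 1e-6;

    /* Number of probes along the line of anisotropy.  Quietly lowering this
     * clamp for certain angles or texture stages is the classic
     * "optimisation": cheaper, but the texture is under-sampled.          */
    double n = ceil(pmax / pmin);
    if (n > max_aniso) n = max_aniso;
    if (n < 1.0)       n = 1.0;

    AnisoSetup out;
    out.samples = (int)n;
    out.lod     = log2(pmax / n);  /* biasing this downward sharpens stills
                                      but makes textures crawl in motion   */
    return out;
}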
 

·
Registered
Joined
·
1,392 Posts
It would be nice to have that in native English; it's a little hard to tell from the translation what exactly is worse and why.


I've always been a believer that it's not so much the chipset that makes a difference in quality (as far as home cinema is concerned, with the exception of some recent developments like PureVideo) but the VGA output filtering that goes on. Does anyone else remember those modifications to old GeForce 2s where you would remove/bridge resistors to bypass the EM filtering on the VGA output, and the picture would become a lot more crisp and vibrant? The quality of the power filtering on the video card also makes a big difference IMO, and there were some mods for older cards for that too (changing capacitors for different values/low-ESR ones).


Does anyone know of any mods like this for the more recent cards?
 

·
Registered
Joined
·
3,737 Posts
Discussion Starter · #3 ·
This has nothing to do with EM interference, the articles concern AF filtering.


They are basically saying even on "high quality" mode, users are experiencing lots of flickering and degraded IQ. The comparison to the 6800 line is that this flickering AF filtering problem went away when "high quality" was selected on 6800s, but the flickering is present on the 7800s at the highest IQ mode. It's due to driver optimizations sacrificing IQ for performance.


I'm sure comparisons will be done by U.S. sites in due time.


Also, if anyone thinks this was me flaming nVidia, read the articles, they are pretty hard on ATI's IQ as well.


I just found this very interesting and thought people here would take notice, because IQ is sacred here at AVS, especially in the HTPC section.
 

·
Registered
Joined
·
115 Posts
Anisotropic filtering does absolutely nothing for video display. AF is for the display of a texture that isn't on the same plane as the screen.
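
To illustrate the point: AF is something a 3D application asks for per texture, and video surfaces never go through this sampling path. A minimal example of how a game would turn it on in OpenGL (assuming the GL_EXT_texture_filter_anisotropic extension is exposed and glext.h is available; the function name is mine, and Direct3D has an equivalent sampler state):

Code:
/* Minimal OpenGL example of turning on anisotropic filtering for one
 * texture (assumes GL_EXT_texture_filter_anisotropic is exposed).      */
#include <GL/gl.h>
#include <GL/glext.h>

void enable_max_aniso(GLuint texture)
{
    GLfloat max_aniso = 1.0f;
    glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &max_aniso); /* e.g. 16.0 */

    glBindTexture(GL_TEXTURE_2D, texture);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,
                    GL_LINEAR_MIPMAP_LINEAR);                /* trilinear base */
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, max_aniso);
}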


So to answer your question: no, the 7800 isn't any better than the 6800. Unless you have a 6800 without PureVideo.


Coldie
 

·
Registered
Joined
·
3,325 Posts
I thought the 7800 did motion adaptive de-interlacing for native HD interlaced material (i.e. not 24-fps material), whereas the 6600/6800 does this only for SD material. That would be one advantage for the 7800. Both the 6600/6800 and 7800 do film-mode de-interlacing for HD material.
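
For anyone unsure what "motion adaptive" means here, a very simplified per-pixel sketch is below. This is just the generic weave-vs-bob idea, not NVIDIA's actual algorithm (which isn't public), and the names and threshold are made up for illustration:

Code:
/* Simplified per-pixel motion-adaptive de-interlace: weave static pixels,
 * interpolate (bob) moving ones.  Generic textbook idea only, not NVIDIA's
 * algorithm; names and the threshold are made up.  Assumes cur_field holds
 * the even frame lines and prev_field the odd ones; edge handling omitted. */
#include <stdlib.h>

unsigned char deinterlace_pixel(const unsigned char *cur_field,
                                const unsigned char *prev_field,
                                int x, int y, int width,
                                int motion_threshold)      /* e.g. 10 (8-bit luma) */
{
    if ((y & 1) == 0)                                 /* line we actually have     */
        return cur_field[(y / 2) * width + x];

    unsigned char above = cur_field[((y - 1) / 2) * width + x];
    unsigned char below = cur_field[((y + 1) / 2) * width + x];
    unsigned char other = prev_field[(y / 2) * width + x];  /* same line, other field */

    int motion = abs((above + below) / 2 - other);
    if (motion < motion_threshold)
        return other;                                 /* static: weave, full detail */
    return (unsigned char)((above + below) / 2);      /* moving: interpolate        */
}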
 

·
Registered
Joined
·
107 Posts
Just to make sure everybody got this:

This optimization does not affect video, it's solely aimed at 3D.


My question is: did they alter their trilinear filtering, or was it just their anisotropic filtering, or, even worse, both?
 

·
Registered
Joined
·
115 Posts
Actually there is such a thing. Many of the first cards shipped with the video processor disabled.


If you have a card with the video processor enabled there will be no difference between the 7800 and 6800.


The filtering differences are fairly subtle and well within the range of "different" rather than right/wrong. The difference was, I believe, in the slope calculations, so both trilinear and aniso will be affected.
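
To show why a change in the slope (derivative) math would touch both: the same per-pixel LOD that the earlier footprint sketch computes drives the trilinear mip blend as well as the anisotropic ratio. A tiny hypothetical example of the trilinear half (names are mine, not vendor code):

Code:
/* Hypothetical illustration (names are mine): trilinear filtering blends the
 * two nearest mip levels by the fractional LOD.  The LOD comes from the same
 * derivative ("slope") math used for the aniso ratio, so approximating that
 * math moves the trilinear transitions and the aniso sample count together. */
#include <math.h>

double trilinear_blend(double lod, const double *mip_samples, int mip_count)
{
    if (lod <= 0.0)             return mip_samples[0];
    if (lod >= mip_count - 1.0) return mip_samples[mip_count - 1];

    int    lo   = (int)floor(lod);
    double frac = lod - lo;     /* this fraction is what a coarser LOD formula skews */
    return mip_samples[lo] * (1.0 - frac) + mip_samples[lo + 1] * frac;
}

So if the LOD fed into that blend comes from a cheaper slope approximation, both where the trilinear transitions fall and how many aniso taps are taken will shift.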


Coldie
 

·
Registered
Joined
·
23,130 Posts
Quote:
Originally Posted by coldie
Actually there is such a thing. Many of the first cards shipped with the video processor disabled.
I think you're confused; the "VPP" stuff has always worked on all 6800s. The issue is that the 6800s (all except the new revision of the plain 6800 PCIe version) have WMV acceleration broken.
 

·
Registered
Joined
·
3,737 Posts
Discussion Starter · #11 ·
I guess nVidia is getting pressured to answer some questions finally. If this should be moved to the HT Gaming section, feel free.

http://www.theinquirer.net/?article=25807


Best part:


"Another German web site Computerbase , went a step further. It made a custom driver by changing the inf, where the driver could not recognise 7800GTX and use its optimisations. The card was listed as unknown but was working just fine. But when the guys went testing they noticed a massive performance drop when using those drivers, close to 30 percent and related it to anisotropic filtering. Nvidia has a lot to explain."
 

·
Premium Member
Joined
·
5,818 Posts
I have yet to hear back from someone who can confirm that they see "Pixel Adaptive" deinterlacing set for deinterlacing 1080i in the Nvidia decoder properties page for the 7800 series. All I know is that for the 6600 and 6800 series, that option disappears for 1080i files and only "Best Available" is set instead.

Quote:
Originally Posted by balazer
Hey,


What about the 7800's HD motion-adaptive de-interlacing? The 6800 doesn't have that.
 

·
Registered
Joined
·
1,732 Posts
Quote:
Originally Posted by mkanet
I have yet to hear back from someone who can confirm that they see "Pixel Adaptive" deinterlacing set for deinterlacing 1080i in the Nvidia decoder properties page for the 7800 series. All I know is that for the 6600 and 6800 series, that option disappears for 1080i files and only "Best Available" is set instead.
Look Here
 

·
Registered
Joined
·
1,732 Posts
Quote:
Originally Posted by balazer
??? That thread doesn't mention the 7800 at all.
You're right. But the 7800/6xxx series have the same issues....
 

·
Premium Member
Joined
·
5,818 Posts
Okay, I'm confused. So, where in the decoder properties does it say that the 7800 is using DXVA "motion adaptive" for 1080i whereas for the 6xxx it doesn't?
 

·
Registered
Joined
·
3,325 Posts
I don't know. All I know is that the Nvidia PureVideo web page says the 7800 has HD pixel-adaptive de-interlacing, and the 6800 doesn't.
 