
Post #1 (Discussion Starter)
Read the article here

http://www.anandtech.com/showdoc.html?i=1561&p=1


Resolution
http://www.anandtech.com/showdoc.html?i=1561&p=10



All currently shipping Xbox games, as far as we know, are rendered internally at 640 x 480 and then sent to the Conexant chip, which either interlaces the frames for output to a regular 480i or HD 1080i display, or leaves the full-resolution lines intact for every frame when outputting 480p or 720p. This means that even for HDTV owners, 480p is the best you're going to get for now. Because of the sheer memory bandwidth requirements, 1080i doesn't make much sense for game developers. At 1920 x 1080 there are 153,600 more pixels (8% more) to be rendered than at 1600 x 1200, and we already know how memory-bandwidth intensive 1600 x 1200 can be. Considering that the Xbox has only 6.4 GB/s of memory bandwidth to work with, only in games with relatively small textures and low detail can we expect 1080i to be a reality. The much more desirable option is 720p.
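The article's pixel arithmetic checks out, and it can be verified in a few lines (a sketch using only the numbers quoted above; nothing here comes from the Xbox SDK):

```python
# Sanity-check the article's 1080 vs 1600x1200 pixel comparison.
def pixels(width, height):
    return width * height

hd_1080 = pixels(1920, 1080)   # full 1080-line frame
pc_1600 = pixels(1600, 1200)   # the PC resolution used for comparison

extra = hd_1080 - pc_1600
print(extra)                          # 153600 extra pixels
print(round(extra / pc_1600 * 100))   # 8 (percent), matching the article
```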


In 720p mode there are 135,168 more pixels to be rendered than at 1024 x 768, which every card since the GeForce2 GTS handles easily at above 60 fps. The problem you run into next is that most HDTVs don't support 720p but instead support 1080i. This isn't as big a problem, since the Conexant chip can scale the output to 1080i, and most TVs even scale unsupported inputs to resolutions they do support. It then becomes a question of which is the better scaler: the Conexant chip or your HDTV. In the future we hope to see more use of 720p in games, because even 480p without AA enabled currently produces quite a few jagged edges.
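To put the 720p figure next to the 6.4 GB/s budget mentioned earlier, here is a rough framebuffer-traffic floor (a sketch: it assumes one 32-bit color write per pixel per frame and ignores texture reads, Z traffic and overdraw, all of which share the same bus):

```python
# Floor estimate of color-write traffic for 1280x720 at 60 fps.
BYTES_PER_PIXEL = 4   # assumed 32-bit color
FPS = 60

def color_write_gbs(width, height):
    return width * height * BYTES_PER_PIXEL * FPS / 1e9

print(1280 * 720 - 1024 * 768)                # 135168 extra pixels vs 1024 x 768
print(round(color_write_gbs(1280, 720), 2))   # ~0.22 GB/s of color writes alone
```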


This brings us to the next issue: the lack of AA in current games. None of the titles we've played (DOA3, Halo, NFL Fever or Project Gotham) enable any of the multisample AA modes supported by the NV2A GPU. The games inherently look very good because of their higher-resolution textures and use of pixel and vertex shader programs; however, aliasing is still present to varying degrees among these titles. Because of the efficiency of multisampled AA, enabling NVIDIA's Quincunx mode would not hurt performance all that much, especially at 640 x 480. The only drawback would be increased blurriness, particularly of text, which would require some workarounds to reduce, but it's definitely possible. A higher render resolution such as 960 x 720 (720p) would tremendously reduce the amount of aliasing; for non-HD users, who make up the majority of the Xbox-buying population, it would make much more sense to render at 640 x 480 and enable some form of AA, whether Quincunx, 2X or 4X mode, depending on the memory-bandwidth usage patterns of the game itself. While all hope is lost for AA in this first generation of titles, hopefully we'll see more use of the technology as developers learn from the mistakes of those before them.
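The trade-off between 2X and 4X modes comes down to buffer storage and traffic: with multisampling, color and Z storage grow with the sample count even though shading still runs once per pixel. A hedged sketch of the scaling; the 8 bytes per sample (4 B color plus 4 B Z) is an assumption, not an NV2A datasheet figure:

```python
# Multisample buffer size at 640x480 for 1, 2 and 4 samples per pixel.
def sample_buffer_mib(width, height, samples, bytes_per_sample=8):
    # assumed 4 B color + 4 B depth stored per sample
    return width * height * samples * bytes_per_sample / 2**20

for samples in (1, 2, 4):   # no AA, 2X, 4X multisampling
    print(samples, round(sample_buffer_mib(640, 480, samples), 1))
# 1 -> ~2.3 MiB, 2 -> ~4.7 MiB, 4 -> ~9.4 MiB
```

The buffer (and the bandwidth to resolve it) roughly quadruples at 4X, which is why the article ties the choice of AA mode to each game's bandwidth usage pattern.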
 

Post #2
From the second article listed above:
Quote:


The Xbox features a Conexant video encoder chip that supports the following TV output resolutions: 480i, 480p, 720p and 1080i. However, the input of that chip is limited (according to Conexant's tech docs) to a maximum of 1024 x 768. Note that the input resolution and the resolution output to your TV don't have to be the same, but if they differ you're just scaling or shrinking the image and won't get any additional quality out of it.
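The constraint the quote describes boils down to a simple bounds check (a sketch; the 1024 x 768 input cap is the figure quoted from Conexant's docs above):

```python
# Does a rendered frame fit through the encoder's input without downscaling?
ENCODER_MAX_W, ENCODER_MAX_H = 1024, 768

def fits_encoder_input(width, height):
    return width <= ENCODER_MAX_W and height <= ENCODER_MAX_H

print(fits_encoder_input(640, 480))    # True: 480p passes through unscaled
print(fits_encoder_input(1920, 540))   # False: even one 1080i field is too wide
```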



Does this mean that internally the XBox can only do 1024x768? I thought the GeForce chip could do 1080i itself (I've done it on my PC with a GeForce3 and PowerStrip).

I'm confused, can someone shed some light?


Cheers,

Moaz
 

Post #3
Quote:
Originally posted by hmoazed
From the second article listed above:



Does this mean that internally the XBox can only do 1024x768? I thought the GeForce chip could do 1080i itself (I've done it on my PC with a GeForce3 and PowerStrip).

I'm confused, can someone shed some light?
Sounds like the answer to your first question is "yes". This is definitely not due to a limitation of the GeForce3-related XGPU--it's the video encoder chip's problem.


Maybe the game developers can strive for 1024x576p (the largest 16:9 window that will fit into 1024x768).
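The arithmetic behind that suggestion: at the encoder's full input width of 1024, a 16:9 image needs 1024 * 9/16 lines, which fits within the 768-line cap.

```python
# Widest 16:9 rectangle that fits inside the encoder's 1024 x 768 input.
width = 1024
height = width * 9 // 16
print(width, height)   # 1024 576
```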


-- Mike Scott
 

Post #4
I read this as well, and I think they are wrong. I thought the Conexant video encoder chip was only used to transcode to NTSC, i.e. SDTV, not HDTV. That would mean 1024x768 is the max only for output to a standard-definition television. Anything hooked up via the Hi-Def pack should bypass the encoder chip. People already hook NVIDIA-based (and other) video cards straight to their HDTVs.
 

Post #5
Then why choose an encoder that specifically supports scaling to HDTV-compatible resolutions? Surely an encoder that only scales to NTSC standards (640 x 480i) would be cheaper and less complex to incorporate into the XBox design.


As a digital design engineer, my gut feeling is that they are using the Conexant encoder for all output, and the HD Pack is used solely to break out the proper connections to the HDTV. Therefore, the highest "true" resolution supported by the XBox is only 1024 x 768.


Of course, this is just my opinion.


Dino
 

Post #6
Some more thoughts on the Conexant Encoder.


Quote from Conexant's web page:


"Conexant's CX25871 video encoder solution is now shipping in Microsoft's Xbox™ video game system. The CX25871 allows the gaming system to connect to both high definition (HDTV) and traditional Composite, S-Video, and SCART analog television sets worldwide. Introduced in August 2000, Conexant's CX25871 is the first video encoding solution to provide ATSC HDTV output capability, true international video format support and adaptive flicker-filtering in a single chip."



Also, the product brief shows that the encoder supports either RGB or YPbPr output directly, eliminating the need to add more circuitry to do the color space conversion. If the encoder were bypassed in HDTV mode, another circuit would need to be added to transcode the NVIDIA chip's RGBHV output to YPbPr. Seems way too redundant (and expensive) to me.
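For reference, here is a sketch of the color-space conversion being discussed: RGB to YPbPr using the ITU-R BT.601 matrix, the standard for SD analog component. The thread does not give the CX25871's actual coefficients, so this is purely illustrative:

```python
# RGB -> YPbPr with BT.601 coefficients (illustrative; not from the CX25871 docs).
def rgb_to_ypbpr(r, g, b):
    # r, g, b in [0.0, 1.0]; Y is luma, Pb/Pr are color-difference signals
    y  =  0.299000 * r + 0.587000 * g + 0.114000 * b
    pb = -0.168736 * r - 0.331264 * g + 0.500000 * b
    pr =  0.500000 * r - 0.418688 * g - 0.081312 * b
    return y, pb, pr

y, pb, pr = rgb_to_ypbpr(1.0, 1.0, 1.0)   # pure white: all luma, no chroma
print(round(y, 6), round(pb, 6), round(pr, 6))
```

Doing this matrix inside the encoder means the console needs no extra analog circuitry for component output, which is Dino's point.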


Dino
 

Post #7
CX25870 and CX25871


Conexant's CX25870/871 is specifically designed to meet TV out system requirements for the next-generation desktop PCs, notebook PCs, game consoles and set-top boxes. With pin- and software-forward compatibility to the Bt868/869, manufacturers can quickly bring to market new solutions that support adaptive flicker filtering, ATSC High Definition Television (HDTV) output, true international television display (NTSC, PAL and SECAM output), and resolutions from 320 x 200 to 1024 x 768.



Features


Adaptive flicker filter technology

Supports 1024 x 768 resolution

Displays 16:9 aspect ratio and supports WSS

Worldwide video support (NTSC, PAL, SECAM)

Macrovision support




It says the chip has Macrovision support; does this mean MS doesn't enable this feature for HD resolutions, thus making progressive-scan DVD unavailable?
 