:lol: Someone already posted it for me, so I’ll just add some images here to save everyone the hassle.
Testbed and Methods
To test the performance of the XFX GeForce 8800 Ultra Extreme we assembled the following standard test platform:
Intel Core 2 Extreme X6800 processor (3.0GHz, FSB 1333MHz x 9);
Asus P5N32-E SLI Plus mainboard (Nvidia nForce 680i SLI chipset);
Corsair TWIN2X2048-8500C5 (2x1GB, 1066MHz, 5-5-5-15, 2T);
Maxtor MaXLine III 7B250S0 HDD (250GB, Serial ATA-150, 16MB buffer);
Enermax Galaxy DXX EGX1000EWL 1000W power supply;
Dell 3007WFP monitor (30", 2560x1600@60Hz max display resolution);
Microsoft Windows Vista Ultimate 32-bit;
ATI Catalyst 7.5;
Nvidia ForceWare 158.24.
Since we believe that trilinear and anisotropic filtering optimizations are not justified in this case, the AMD and Nvidia graphics card drivers were set up to provide the highest possible quality of trilinear and anisotropic texture filtering. We also enabled transparent-texture antialiasing to achieve the best image quality, selecting Adaptive antialiasing in AMD Catalyst and Transparency antialiasing in multisampling mode in Nvidia ForceWare. As a result, our settings looked as follows:
ATI Catalyst:
Catalyst A.I.: Standard
Mipmap Detail Level: High Quality
Wait for vertical refresh: Always off
Adaptive antialiasing: On
Temporal antialiasing: Off
High Quality AF: On
Other settings: by default
Nvidia ForceWare:
Texture Filtering: High quality
Vertical sync: Off
Trilinear optimization: Off
Anisotropic optimization: Off
Anisotropic sample optimization: Off
Gamma correct antialiasing: On
Transparency antialiasing: On (multi-sampling)
Other settings: by default
We selected the highest possible graphics quality level in each game using the standard tools provided by the game itself. The games’ configuration files weren’t modified in any way. Performance was measured with the games’ own tools or, where these were not available, manually with the Fraps utility version 2.8.2. We also measured the minimum speed of the cards where possible.
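As an illustration of how average and minimum frame rates can be derived from a frame-time log, here is a minimal sketch. It assumes a hypothetical CSV of cumulative per-frame timestamps in milliseconds; the exact export format, and whether this particular Fraps version produces such a file, is our assumption rather than something stated in the review.

```python
# Minimal sketch, not the reviewers' actual tooling: average and minimum FPS
# from a hypothetical CSV of cumulative frame timestamps in milliseconds.
import csv

def fps_stats(path):
    with open(path, newline="") as f:
        rows = list(csv.reader(f))
    times = []
    for row in rows:
        try:
            times.append(float(row[-1]))  # last column = timestamp in ms (assumed layout)
        except (ValueError, IndexError):
            continue  # skip header or malformed lines
    if len(times) < 2:
        raise ValueError("need at least two frame timestamps")
    # Average FPS: frames rendered per second of wall-clock time over the whole run.
    avg_fps = (len(times) - 1) * 1000.0 / (times[-1] - times[0])
    # Minimum FPS: fewest frames that fall into any complete one-second interval.
    full_seconds = int((times[-1] - times[0]) // 1000)
    if full_seconds == 0:
        return avg_fps, avg_fps
    counts = [0] * full_seconds
    for t in times:
        idx = int((t - times[0]) // 1000)
        if idx < full_seconds:
            counts[idx] += 1
    return avg_fps, float(min(counts))

if __name__ == "__main__":
    avg, minimum = fps_stats("frametimes.csv")  # hypothetical file name
    print(f"average: {avg:.1f} fps, minimum: {minimum:.1f} fps")
```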
We performed the tests in 1280x1024/960, 1600x1200 and 1920x1200 resolutions. Games that didn’t support a 16:10 aspect ratio were run at 1920x1440 instead. We used “eye candy” mode everywhere it was possible without disabling HDR or Shader Model 3.0 support: namely, we ran the tests with anisotropic filtering and 4x MSAA enabled. We enabled them from the game’s menu where available; if that was not possible, we forced them through the appropriate Catalyst and ForceWare driver settings.
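To put these settings in perspective, here is a quick back-of-the-envelope calculation (ours, purely for illustration) of how many pixels, and how many colour samples with 4x multisampling, each tested resolution implies per frame:

```python
# Illustrative arithmetic only: per-frame pixel and 4x-MSAA sample counts
# for the resolutions used in the test.
for w, h in [(1280, 1024), (1600, 1200), (1920, 1200), (1920, 1440)]:
    pixels = w * h
    samples = pixels * 4  # MSAA 4x stores four samples per pixel
    print(f"{w}x{h}: {pixels / 1e6:.2f} Mpix, {samples / 1e6:.2f} M samples with MSAA 4x")
```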
The XFX GeForce 8800 Ultra Extreme will be competing against the following graphics accelerators participating in our test session (a short calculation of the memory bandwidth implied by these specs follows the list):
Nvidia GeForce 8800 Ultra (G80, 612/1512/2160MHz, 128sp, 32tmu, 24rop, 384-bit, 768MB)
Nvidia GeForce 8800 GTX (G80, 576/1350/1900MHz, 128sp, 32tmu, 24rop, 384-bit, 768MB)
Nvidia GeForce 8800 GTS (G80, 513/1188/1600MHz, 96sp, 24tmu, 20rop, 320-bit, 640MB)
AMD Radeon HD 2900 XT (R600, 742/1650MHz, 320sp, 16tmu, 16rop, 512-bit, 512MB)
AMD Radeon X1950 XTX (R580+, 650/2000MHz, 48pp, 8vp, 16tmu, 16rop, 256-bit, 512MB)
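The peak memory bandwidth of each comparison card follows directly from the effective memory clock and bus width quoted above; the sketch below (our own illustration, not part of the original article) shows the arithmetic:

```python
# Peak memory bandwidth derived from the specs listed above:
# bandwidth [GB/s] = effective transfer rate [MT/s] x bus width [bytes] / 1000
cards = {
    "GeForce 8800 Ultra": (2160, 384),
    "GeForce 8800 GTX":   (1900, 384),
    "GeForce 8800 GTS":   (1600, 320),
    "Radeon HD 2900 XT":  (1650, 512),
    "Radeon X1950 XTX":   (2000, 256),
}
for name, (mem_mt_s, bus_bits) in cards.items():
    gb_per_s = mem_mt_s * (bus_bits // 8) / 1000
    print(f"{name}: {gb_per_s:.1f} GB/s")
```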
For our tests we used the following games and benchmarks:
First-Person 3D Shooters
Battlefield 2142
Call of Juarez
Far Cry
F.E.A.R. Extraction Point
Tom Clancy’s Ghost Recon Advanced Warfighter
Half-Life 2: Episode One
Prey
S.T.A.L.K.E.R.: Shadow of Chernobyl
Third-Person 3D Shooters
Hitman: Blood Money
Tomb Raider: Legend
Splinter Cell: Double Agent
RPG
Gothic 3
Neverwinter Nights 2
The Elder Scrolls IV: Oblivion
Simulators
X3: Reunion
Strategies
Command & Conquer 3
Company of Heroes
Supreme Commander
Synthetic Benchmarks
Futuremark 3DMark05
Futuremark 3DMark06