|
Finally, at 1920x1200, where current owners of Radeon HD 4870 and GeForce GTX 280 graphics cards would generally find themselves, Crysis Warhead is completely unplayable. Even the menus were slow to access at this resolution using the enthusiast settings. The GeForce GTX 280 averaged 22fps, while the Radeon HD 4870 X2 took the lead, rendering 25fps. The GeForce GTX 260 dropped below 20fps, and the remaining 11 graphics cards tested scored even lower.
Benchmarks: Gamer
Reducing the visual quality to the “gamer” preset improved performance considerably. At 1440x900 the GeForce GTX 280 is now 44% faster, rendering an average of 46fps.
Interestingly, the Radeon HD 4870 X2 was just 10% faster and fell back to fourth place. The GeForce GTX 260 and GeForce 9800 GTX+ were both quicker than the dual-GPU monster. Still, the GeForce 9800 GTX+, which was the third fastest graphics card tested, rendered just 36fps on average.
Increasing the resolution to 1680x1050 allowed the Radeon HD 4870 X2 to claw its way back towards the top, overtaking the GeForce 9800 GTX+ by just 1fps.
The GeForce GTX 280 remained the fastest graphics card tested, averaging 40fps. The GeForce GTX 260 came second with 36fps, followed by the Radeon HD 4870 X2 which managed to average 31fps.
The lower-end graphics cards, such as the Radeon HD 3850 and GeForce 9600 GT, averaged about 18fps. Even the new GeForce 9800 GT, essentially a rebadged 8800 GT, managed just 26fps using these modest quality settings.
Now at 1920x1200 using the “gamer” or high quality settings, the GeForce GTX 280 renders an average of 36fps, followed by the Radeon HD 4870 X2 at 32fps.
Again, these average frame rates are simply too low, making Crysis Warhead borderline unplayable in many areas. At the very least you are going to require a GeForce GTX 280 or Radeon HD 4870 X2 to run at this resolution using these quality settings.
Benchmarks: Mainstream
Now for the “mainstream” or medium quality settings, which as you can see allow for significantly better frame rates.
The GeForce GTX 280 is now able to render an average of 58fps. This is a 26% increase in performance at 1440x900 compared to the “gamer” quality settings.
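For those who like to check the math, here is a quick sketch of how that figure falls out of the averages quoted above; the helper function is ours and purely illustrative, not part of any benchmarking tool.

```python
# Sanity check on the quoted gain: the GTX 280 went from 46fps ("gamer")
# to 58fps ("mainstream") at 1440x900.
def percent_gain(baseline_fps: float, new_fps: float) -> float:
    """Relative improvement of new_fps over baseline_fps, in percent."""
    return (new_fps / baseline_fps - 1.0) * 100.0

print(f"GTX 280, gamer -> mainstream at 1440x900: {percent_gain(46, 58):.0f}%")  # ~26%
```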
The Radeon HD 4870 X2 saw a massive 38% performance boost, yet it was still slower than seven other graphics cards. CrossFire clearly wasn't working here, as the Radeon HD 4870 X2 was a mere 1fps faster than the single-GPU Radeon HD 4870. This left the Radeon HD 4870 X2 just 2fps faster than the Radeon HD 3870, and 3fps slower than the GeForce 9800 GT.
It is no secret that the CryEngine 2 favors Nvidia-based graphics cards. But these playable medium quality settings put the Radeon-based graphics cards at a significant disadvantage, at least at 1440x900.
When increasing the resolution to 1680x1050, the GeForce GTX 280 drops just 4fps and remains the fastest graphics card tested. The Radeon HD 4870 X2, on the other hand, drops just 1fps, allowing it to overtake the GeForce 9800 GT and position itself right behind the GeForce 8800 GTS (512MB). Still, the Radeon HD 4870 X2 is only 2fps faster than the single-GPU Radeon HD 4870.
Using these mid-range settings at 1680x1050 allowed the GeForce GTX 280 and the GTX 260 to deliver perfectly playable performance. The Radeon HD 4870 X2, GeForce 8800 GTS, 9800 GTX, and 9800 GTX+ also delivered excellent performance. The Radeon HD 4870 and GeForce 9800 GT/8800 GT graphics cards all produced 44fps, which was reasonably playable. Then the Radeon HD 4850 managed 42fps, which was borderline playable, as was the Radeon HD 3870 with an average of 40fps.
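To put the CrossFire problem in concrete terms, the scaling factor implied by these figures can be worked out as follows; the 46fps value for the 4870 X2 is simply the 4870's 44fps plus the 2fps gap noted above, and the snippet is only an illustration, not part of our test procedure.

```python
# Dual-GPU scaling implied by the 1680x1050 "mainstream" numbers above:
# HD 4870 averaged 44fps, the 4870 X2 was 2fps faster, i.e. roughly 46fps.
def scaling_factor(dual_gpu_fps: float, single_gpu_fps: float) -> float:
    """Ideal CrossFire scaling would approach 2.0; 1.0 means no benefit at all."""
    return dual_gpu_fps / single_gpu_fps

print(f"HD 4870 X2 scaling: {scaling_factor(46, 44):.2f}x")  # ~1.05x, CrossFire doing almost nothing
```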
At 1920x1200 the GeForce GTX 280 drops another frame per second, hitting an average of 53fps, while the GTX 260 drops to 50fps.
The Radeon HD 4870 X2 follows in third place with an average of 45fps, dropping just a single frame from 1680x1050, but still showing little advantage compared to its single GPU counterpart. The Radeon HD 4850 managed to average 40fps, while the GeForce 9800 GT/8800 GT cards averaged 39fps.
Final Thoughts
Accurately bench-testing Crysis Warhead is no easy task. We quickly found out that each level is significantly different from the next in terms of performance. Our results were recorded in the first level, called “Call me Ishmael”. This is primarily a jungle level and one of the more demanding levels in the game. The next level is called “Shore Leave”, and there we experienced similar performance.
The third level, “Adapt or Perish”, saw the average frame rates drop by 3-4fps. While this may not sound like much, in a game where frame rates are already very low it made for a noticeable difference. A similar dip in frame rates was also seen in the other ice level, called “Frozen Paradise”.
The fifth level, “Below the Thunder”, is an underground level in an enclosed environment, and here the frame rates were considerably higher. The average went from 22fps at 1920x1200 in the first level to around 30fps. There are a few more levels in the game, but we won’t spoil them for you.
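As an aside for readers wondering what an “average frame rate” in figures like these actually represents, here is a minimal sketch of how such a number is typically derived from per-frame render times; the sample values are hypothetical and this is not our actual benchmarking tool.

```python
# Minimal sketch: deriving an average frame rate from per-frame render times,
# the way frame-capture tools generally report it.
def average_fps(frame_times_ms: list[float]) -> float:
    """Total frames divided by total elapsed time."""
    total_seconds = sum(frame_times_ms) / 1000.0
    return len(frame_times_ms) / total_seconds

sample = [41.0, 47.5, 52.3, 44.1, 49.8]  # five hypothetical frame times in milliseconds
print(f"Average: {average_fps(sample):.1f} fps")  # roughly 21 fps for this excerpt
```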
Realistically, we do not believe the performance of Crysis Warhead is any better than that of the original Crysis, and we failed to see any substantial optimizations. To play this game in all its glory, gamers are going to require a current-generation high-end graphics card. More to the point, something like a GeForce GTX 280 is required, and with that card alone costing over $400, good luck building a $600 system that can play Crysis Warhead.
Despite this, we loved Crysis Warhead. It's a truly amazing-looking game, and with the right hardware it can be appreciated to its fullest. In terms of gameplay Crysis Warhead is very enjoyable in my opinion, though as usual the single-player campaign was a bit too short.
For those running Vista it is worth mentioning that Crysis Warhead will automatically run in the DirectX 10 rendering mode if possible. This is interesting, as DX10 offers very little in the way of visual enhancements, at least from what we could see. Running Crysis Warhead in DX9 mode will allow for a few more frames per second, in some cases up to 5fps more on average with the GeForce GTX 280. This may not sound like much, but when you are averaging 22fps an extra 5fps is a lifesaver.
Based on our findings, the only graphics cards that are going to be able to utilize the impressive “enthusiast” quality settings are the GeForce GTX 280/260 and Radeon HD 4870/4870 X2. The “gamer” quality settings also worked well with the Radeon HD 4850, GeForce 9800 GTX/GTX+, and GeForce 8800 GTS.
Everyone has their own idea of what kind of frame rates provide perfectly playable performance, and I know that my standards are quite high. The developers at Crytek say that gamers need only 30-35fps on average, and for the most part this is fine for single-player gaming.
However, with an average of 30fps gamers cannot afford to be dropping 3-4fps during intense scenes. Personally, I much prefer to have an average frame rate of around 50fps, and for multiplayer no less than 60fps. This kind of performance is going to be very difficult to achieve in Crysis Warhead without sacrificing a great deal of visual quality.
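To illustrate why, converting those averages into frame times makes the difference obvious; the quick calculation below is ours, not Crytek's guidance, and simply restates the fps figures as milliseconds per frame.

```python
# Converting average frame rates to frame times shows why small fps drops hurt
# far more at the low end than at the high end.
def frame_time_ms(fps: float) -> float:
    """Milliseconds spent on each frame at a given average frame rate."""
    return 1000.0 / fps

for fps in (22, 30, 50, 60):
    print(f"{fps:>2} fps -> {frame_time_ms(fps):.1f} ms per frame")

# A 4fps dip from 30fps adds ~5ms to every frame (33.3ms -> 38.5ms), which is very
# noticeable; the same dip from 60fps adds barely 1ms (16.7ms -> 17.9ms).
```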
Overall, Crysis Warhead looks to be another fun game based on the impressive CryEngine 2. At this stage it still seems gamers are going to require next-generation hardware to truly enjoy this game's visuals, while those currently using cutting-edge technology are getting a fair taste of what is to come.
It will be interesting to see how the upcoming Far Cry 2, also based on the CryEngine 2, performs, considering similar claims about playable performance have been made over the past few months.
Update: We had wrongly stated that Far Cry 2 uses the CryEngine 2, when in reality it uses its own custom engine called "Dunia", developed from the ground up for this title. A terrible mistake for a Far Cry fan like myself, but that's the kind of thing that happens when you edit until 8am. Thanks everyone for your emails.
|