Nvidia GeForce 8800 GTX - Review

The GeForce 8800 GTX is everything we'd hoped it would be. For a suggested retail price of S$895 (US$588.82), the GeForce 8800 GTX brings tremendous processing power to current-generation games. It's also the first card to market that will support all of the 3D gaming-related features of Windows Vista and DirectX 10. The first wave of next-gen games is still a ways off; the poster child, the 3D shooter Crysis, has yet to debut, and even that game might not put all of the next-gen bells and whistles into play. Still, the GeForce 8800 GTX is so powerful, even compared with ATI's fastest dual-card combination, that there's no reason to spend more on a pair of Radeon cards when you can outperform them with a single GeForce 8800 GTX. That and the fact that Nvidia has finally caught up to ATI's image-quality advantages earn Nvidia's newest card our Editors' Choice award for high-end 3D graphics cards.

Because of design changes in the GeForce 8800 GTX chip's new architecture, we need to consider some of this card's specs differently than we have in the past. The basics are the same. The GeForce 8800 GTX has a core clock speed of 575MHz, and it comes with 768MB of GDDR3 RAM clocked to 900MHz, for an effective 1,800MHz data rate. That memory rate is a significant uptick compared with the 800MHz RAM in Nvidia's last flagship card, the GeForce 7950 GX2. But one of the main differences in the GeForce 8800 GTX's architecture lies in how we think about its pipelines.

In the past, we've said that a 3D chip has X pixel pipelines and Y pipes for shader calculations. But because of the new specifications of DirectX 10, the GeForce 8800 GTX employs what's called a unified architecture. In other words, no pipe is geared toward a particular task. Instead, the GeForce 8800 GTX comes with 128 stream processors, which can dynamically process whatever info is thrown their way. This means that if your card is working on a shader-intensive scene, it can tap more of the processor pool for that image, rather than being capped at 24 or 48 pipes because the rest are set aside for geometry only. This capability should give game designers much more flexibility: if they balance the workload properly, they can throw a lot of processing power at any given calculation.
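To make the unified idea a little more concrete, here's a rough, hypothetical sketch written in the style of Nvidia's new CUDA programming interface (more on CUDA at the end of this review). One pool of GPU threads works through a single list of mixed tasks, so how much muscle goes to pixel work versus vertex work is decided by the data, not by fixed pipes. The task names, the mix, and the math are ours, purely for illustration; this is not how Nvidia's internal scheduler is actually exposed or programmed.

// Illustrative only: a toy CUDA-style program showing the unified-shader idea.
// One pool of GPU threads works through a single list of mixed tasks, so the
// split between "vertex" work and "pixel" work comes from the data, not from
// pipes hard-wired to one job. Task types and math are hypothetical stand-ins.
#include <cstdio>
#include <cuda_runtime.h>

enum TaskType { VERTEX_TASK = 0, PIXEL_TASK = 1 };

__global__ void unifiedShade(const int *taskType, float *out, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;

    // Any thread in the pool can take either kind of task.
    if (taskType[i] == VERTEX_TASK)
        out[i] = i * 0.5f;              // stand-in for a vertex transform
    else
        out[i] = i * 0.25f + 1.0f;      // stand-in for a pixel-shader calculation
}

int main()
{
    const int n = 1024;
    int h_type[n];
    float h_out[n];
    // A shader-heavy frame: three pixel tasks for every vertex task.
    for (int i = 0; i < n; ++i)
        h_type[i] = (i % 4 == 0) ? VERTEX_TASK : PIXEL_TASK;

    int *d_type;
    float *d_out;
    cudaMalloc((void **)&d_type, n * sizeof(int));
    cudaMalloc((void **)&d_out, n * sizeof(float));
    cudaMemcpy(d_type, h_type, n * sizeof(int), cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover the whole task list.
    unifiedShade<<<(n + 255) / 256, 256>>>(d_type, d_out, n);
    cudaMemcpy(h_out, d_out, n * sizeof(float), cudaMemcpyDeviceToHost);

    printf("task 0 -> %.2f, task 1 -> %.2f\n", h_out[0], h_out[1]);
    cudaFree(d_type);
    cudaFree(d_out);
    return 0;
}

The arithmetic is meaningless; the point is simply that nothing in the pool is reserved for one kind of work, which is exactly the flexibility described above.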

What's perhaps even more impressive about the GeForce 8800 GTX is its sheer horsepower. Its transistor count sits at 681 million on a chip built on a 90-nanometer manufacturing process. That's more than the two 278-million-transistor chips on the GeForce 7950 GX2 combined. To power a single GeForce 8800 GTX card, Nvidia recommends a 450-watt power supply in a PC with a high-end dual-core chip and a typical combination of internal hardware. The catch is that the power supply must have two PCI Express power connectors to plug into the two sockets on the back of the card. Most modern power supplies should have the necessary connectors. If you want to add two 8800 GTX cards in an SLI configuration, however, you've got a challenge on your hands.

To run two GeForce 8800 GTX cards in SLI mode, Nvidia recommends at least a 750-watt power supply, but some of the recommended models on its SLI compatibility list go as high as 850 and even 1,000 watts. We suspect those higher-wattage recommendations allow some headroom for adding multiple hard drives and optical drives, as well as very high-end quad-core processors. Still, it's clear that building a next-gen SLI rig will be no small undertaking, at least for now. Heck, many midtower PC cases are too small to accept a 1,000-watt power supply.

With no DirectX 10 games available to test at the moment, we can't speak to the GeForce 8800 GTX's next-generation performance, aside from the fact that it's the only card on the market that claims DirectX 10 compatibility. ATI's next-gen parts, the upcoming Radeon HD series, will also offer DirectX 10 features, but they have yet to arrive. And while we can't yet say who will win the battle for next-generation performance, the GeForce 8800 GTX dominates every other card on the market at the time of writing.

One of the most important things to note about the GeForce 8800 GTX and its performance is that you would be smart to pair this card with a capable monitor that can go to resolutions of 1,600 x 1,200 or above. Nvidia calls this XHD (extreme high definition) gaming. Whatever you want to call it, if you're not playing at high resolutions with antialiasing, anisotropic filtering, and other image-quality tweaks cranked, you'll likely hit a CPU bottleneck, which means that you're not giving the card enough to do. But when you get up to those high-quality settings, the results are amazing.

GameSpot was kind enough to provide us with benchmarks, as per usual. We suggest you check out their story, too; there are a number of screenshots taken during testing that show off the image quality. We'll focus on frame rates. Our highlight here is The Elder Scrolls IV: Oblivion. That game has been considered the pinnacle of DirectX 9-based game programming and has humbled even ATI's mighty Radeon X1950 XT CrossFire setup, which can barely pass 60 frames per second (fps) with no antialiasing. But the GeForce 8800 GTX blew past ATI's highest-end configuration, scoring 64fps on that test.

Oblivion also lets us highlight how the GeForce 8800 GTX has pulled Nvidia even with ATI on current-gen image quality. ATI has had an advantage in certain games, most conspicuously Oblivion, because, through an unofficial patch, Radeon cards let you turn on antialiasing and high dynamic range (HDR) lighting simultaneously. The resulting image looks noticeably better than when you can do only one or the other, which is all the older GeForce 7-series cards can manage. Not only can the GeForce 8800 GTX do both AA and HDR lighting, but it also does them faster than a Radeon X1950 XT CrossFire rig. On that Oblivion test, the 8800 GTX scored an impressive 45 frames per second, which is much smoother than ATI's 28fps.

You might notice that the GeForce 8800 GTX doesn't win on every single test. On Half-Life 2: Episode One at 8X antialiasing, an ATI CrossFire setup edged it out. It's worth noting that the GeForce 8800 GTX hit 80 frames per second, though, so it's not exactly slow. Better still, at 16X antialiasing, which is more demanding, the GeForce 8800 GTX's score held basically steady at 84fps, while the CrossFire cards' scores dropped off. This lends credence to Nvidia's argument that the GeForce 8800 GTX delivers better performance at extremely high image-quality settings.

The other test it lost was Quake 4, in which the Radeon X1950 XT CrossFire setup beat it at both resolutions by about 15fps. Again, even at 2,048 x 1,536, the GeForce 8800 GTX scored 68fps, so it's by no means slow. It's also worth noting that a Radeon X1950 XT CrossFire setup costs more once you add up the master and slave cards, and it can't do DirectX 10. The cheaper GeForce 8800 GTX, with its forward-looking capabilities, is clearly the better deal.

We should note a couple of final thoughts here. The first is that with the GeForce 8800 GTX, Nvidia is also unveiling something called CUDA, which stands for Compute Unified Device Architecture. Because the 8800 is such a complex, capable chip, Nvidia is offering a framework that lets programmers write software that uses the GPU for intensive number crunching. For gamers, Nvidia showed us how developers might use CUDA to really ramp up game physics calculations, but Nvidia is also offering this capability to the medical community and anyone else who might benefit from a combination of intense image-processing and number-crunching power. Nvidia is still getting the word out on CUDA, so there's no way to check it out right now.

Nvidia also unveiled its new PureVideo HD software as a component of its new universal ForceWare driver, which debuts today and includes support for the GeForce 8800 cards. PureVideo HD will run on both the new GeForce 8-series cards and the older GeForce 7-series cards, and it's designed to enhance HD video content coming from your PC. We have a sample system in-house to play with, and we're in the process of putting it through its paces.
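For readers wondering what programming the GPU through CUDA might look like once the tools arrive, here's a rough, hypothetical sketch of the kind of data-parallel kernel the framework is aimed at: a toy physics step that nudges thousands of particle positions forward at once. Because we can't try CUDA ourselves yet, the code below is our own illustration of the general GPU-computing idea, not Nvidia sample code, and every name and number in it is made up.

// Hypothetical sketch of a CUDA-style physics kernel: each GPU thread advances
// one particle by a single time step. The setup and values are illustrative only.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void stepParticles(float *pos, const float *vel, float dt, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        pos[i] += vel[i] * dt;   // one stream processor handles one particle
}

int main()
{
    const int n = 4096;
    const float dt = 0.016f;     // roughly one frame at 60fps
    static float h_pos[n], h_vel[n];
    for (int i = 0; i < n; ++i) {
        h_pos[i] = 0.0f;
        h_vel[i] = 1.0f + i * 0.001f;
    }

    float *d_pos, *d_vel;
    cudaMalloc((void **)&d_pos, n * sizeof(float));
    cudaMalloc((void **)&d_vel, n * sizeof(float));
    cudaMemcpy(d_pos, h_pos, n * sizeof(float), cudaMemcpyHostToDevice);
    cudaMemcpy(d_vel, h_vel, n * sizeof(float), cudaMemcpyHostToDevice);

    // Enough 256-thread blocks to cover every particle.
    stepParticles<<<(n + 255) / 256, 256>>>(d_pos, d_vel, dt, n);
    cudaMemcpy(h_pos, d_pos, n * sizeof(float), cudaMemcpyDeviceToHost);

    printf("particle 0 moved to %.4f after one step\n", h_pos[0]);
    cudaFree(d_pos);
    cudaFree(d_vel);
    return 0;
}

The appeal for physics and scientific work is exactly what this toy shows: thousands of identical, independent calculations that the GPU's stream processors can chew through in parallel.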