Asus EN 8800 GTX - nVidia's G80 Performance Revealed Page: 1

g80 naked

The last generation of cards saw nVidia slip well behind ATI in terms of image quality, although with their dual-PCB, dual-GPU 7950 GX2 they clawed the performance crown back. Second place was not somewhere nVidia were used to sitting, especially after the success of their 6 series.

DirectX 10 is a technology that we are all waiting for and both nVidia and ATI are producing cards (hopefully) in time for the release. nVidia are the first onto the market with their 8800GTX and GTS. With an architecture almost totally re-worked from the old generation, nVidia have gone all out on a unified architecture.

Asus kindly sent us their 8800GTX for a study on how the card does in real life, but first let us explore G80's features.

Outlining the technology

There's a lot of information out there on nVidia's latest gen of card so I thought I'd try to keep the explanation part simple and concise.

In the 8800GTX nVidia have implemented a parallel, unified shader design consisting of 128 individual stream processors running at 1.35 GHz. As described in my article on ATI's unified shader architecture, nVidia have made a pipeline that processes vertex, pixel, geometry, or physics operations: giving it flexibility and efficiency.
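To put a rough number on that, here's a back-of-envelope sketch (my own arithmetic, assuming each stream processor retires one scalar multiply-add, i.e. two FLOPs, per clock; nVidia's quoted extra MUL issue would raise the figure further):

```python
# Rough peak shader throughput for G80, assuming one scalar
# multiply-add (2 FLOPs) per stream processor per clock.

def shader_gflops(stream_processors, clock_ghz, flops_per_clock=2):
    """Peak programmable-shader throughput in GFLOPS."""
    return stream_processors * clock_ghz * flops_per_clock

print(shader_gflops(128, 1.35))  # 345.6 GFLOPS for the 8800GTX's shader core
```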

g80 architecture

So what can we see here then?

One noticeable difference is that nVidia have implemented ZCull before the data enters the stream processors. ZCull strips out of the rendering engine the data that you will never see, so the GPU does not waste time rendering pixels that never appear on screen. Previously this culling happened after shading, meaning that vital processing power was spent rendering the unnecessary pixels, which were then discarded.
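A minimal sketch of the idea (my own toy model, not nVidia's actual hardware): fragments sitting behind the current depth-buffer contents are thrown away before the expensive shading step rather than after it.

```python
# Toy early depth-rejection: discard occluded fragments *before* shading.
def rasterise(fragments, depth_buffer, shade):
    """fragments: (x, depth, data) tuples; smaller depth = closer to camera."""
    shaded = 0
    for x, depth, data in fragments:
        if depth >= depth_buffer.get(x, float("inf")):
            continue              # culled early: shading cost never paid
        depth_buffer[x] = depth
        shade(data)               # only potentially visible fragments shaded
        shaded += 1
    return shaded

zbuf = {}
frags = [(0, 0.5, "wall"), (0, 0.2, "crate"), (0, 0.9, "hidden pipe")]
count = rasterise(frags, zbuf, shade=lambda data: None)
print(count)  # 2 -- "hidden pipe" was rejected before it was ever shaded
```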

Let's see why both nVidia and ATI think that a unified architecture was needed to increase the performance of DX10 cards:

DirectX 9 and traditional Shaders:
dx 9 pipelines

DirectX 10 Unified Shaders:
unified pipelines

So what do we have in the two pictures? In the first we see the classic non-unified design, with separate vertex and pixel shader pipelines. The argument is that when a frame is dominated by one type of shader work, only one of the separate pipelines is working to maximum effect, leaving "idle hardware", as nVidia put it.

Let's move onto the Unified example. Here in both geometry and pixel workloads the unified architecture excels (in theory) as the unified shader pipelines use their flexibility to render any of the information sent their way. Couple this with dynamic load balancing and you have a mightily efficient architecture that can handle anything thrown at it.

This means that you have a GPU with 128 shader processors each capable of processing pixel, vertex, geometry and physics data.
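The load-balancing argument can be shown with a toy utilisation model (illustrative unit counts of my own choosing, not G80's real configuration):

```python
# Fixed vertex/pixel split vs one unified pool, for a pixel-heavy frame.

def fixed_time(vertex_work, pixel_work, vertex_units, pixel_units):
    # Each pool can only process its own kind of work, so the busier
    # pool sets the frame time while the other sits partly idle.
    return max(vertex_work / vertex_units, pixel_work / pixel_units)

def unified_time(vertex_work, pixel_work, total_units):
    # Any unit takes any work: perfect load balancing, in theory.
    return (vertex_work + pixel_work) / total_units

fixed = fixed_time(10, 300, vertex_units=8, pixel_units=24)   # 12.5
unified = unified_time(10, 300, total_units=32)               # ~9.69
print(fixed, round(unified, 2))
```

The same total hardware finishes the skewed frame noticeably sooner when any unit can take any work.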

nVidia are also hoping that the flexible and efficient (and of course hugely parallel) processor in their G80 will mean that other data can also be processed.

DirectX 10

I don't want to go into too much detail with DirectX 10, as this has been covered in one of our previous articles - see here, but I'll just go over why DX10 will also add to the performance increase.

cpu overhead

dx10 pipeline

DirectX 10 reduces CPU overhead by reducing how much the CPU gets involved in the rendering process. By cutting the CPU out of the most basic API processes, DirectX 10 hugely reduces the time it takes to render each object. Let's look at it like this (ATI slide):

ati dx 10 slide

DirectX 10 solves this by working towards general processing units. The new Geometry Shader can manipulate Vertexes and many other types of objects. This means that it has more ways to process, access and move data. This extra level of manipulation adds a lot of headroom for developers to introduce new features into games and to utilise the GPU for more than just rendering of a scene.

So Geometry Shaders can manipulate all of this data, but how might developers use it?

Well basically I'm hoping that developers will use this to do things like stop "pop-up" (of trees/objects etc in the distance). I can see that there would be a huge advantage in using these units to change things like water over distance and adding far superior rendering to characters that are in the periphery of games: such as excellent crowd animation in racing/sports games. This is all my own speculation, but it would certainly be nice to see.

Memory interface mid-process

Also added into nVidia's "Stream" processors is the ability to move data to memory and back again in a single pass. This means that data should no longer require two or more passes before it can be output. Once again this adds to the picture of efficiency that nVidia are building up.


Shader Model 3 brought far superior instancing than we had seen before. Instancing means that you can render one object and replicate it a whole load of times, creating a fuller effect. This is very useful in trees and grass where you need to replicate basically the same thing many times over.
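A sketch of the bookkeeping win (hypothetical batch size, just to show the shape of it): instead of one draw call per tree, one call submits the mesh once plus a list of per-instance transforms.

```python
# Draw-call counts with and without instancing, for a forest of trees.

def calls_without_instancing(num_objects):
    return num_objects                      # one CPU-side call per object

def calls_with_instancing(num_objects, instances_per_call=1000):
    return -(-num_objects // instances_per_call)  # ceiling division

trees = 10_000
print(calls_without_instancing(trees))   # 10000
print(calls_with_instancing(trees))      # 10
```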


128-Bit High Dynamic Range

nVidia have implemented 32-bit floating-point precision per component, for a total of 128-bit dynamic-range rendering. They claim this level of accuracy outperforms film renderers.
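To illustrate the difference in representable range (my own sketch, not nVidia's numbers): a traditional 8-bit channel stores 256 fixed brightness steps, while each of the four 32-bit float channels can represent intensities up to around 3.4 × 10^38.

```python
import struct

# Steps available in one traditional 8-bit integer colour channel.
ldr_steps = 2 ** 8                                        # 256

# Largest finite value of one 32-bit float channel (4 of these = 128-bit HDR).
fp32_max = struct.unpack(">f", b"\x7f\x7f\xff\xff")[0]

print(ldr_steps)          # 256
print(f"{fp32_max:.3e}")  # 3.403e+38
```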

This is enough detail for this article at the moment, but I may do a fuller article on this in the future.

Packaging and Bundle

Asus have put the 8800GTX in perhaps the biggest box I have ever seen a graphics card in. It's huge, and you definitely notice that they are giving you GRAW free with this card.

asus 8800GTX box

Also on the front we notice that we've got 3DMark06, GTI Racing and Asus's "Splendid" Video enhancement Technology.

asus 8800GTX box 2  box handle

The box also has a handle which gives a nice touch of quality about the feel of your purchase and allows you to comfortably carry it home from the Post Office!

The inside of the box is again nicely done, with compartments for everything and Asus's usual attention to detail:

asus 8800gtx inside the box asus 8800GTX indide box


Inside the box you will find an array of software and hardware for your perusal:

* 1 x DVI > VGA Adapter
* 1 x S-Video > component Cable
* 2 x Dual Molex to PCI-e power dongles
* 1 x Faux leather CD wallet
* 1 x Asus Speed Setup manual
* 1 x Asus 8800GTX CD Manual
* 1 x Asus VGA driver Manual
* 1 x 3DMark06 Full Edition
* 1 x GTI Racing Full Game
* 1 x Ghost Recon Advanced Warfighter Full Game

asus 8800gtx package

All in all a decent bundle from Asus, although a couple of "could do betters" in there for such a high-end card. I would have liked to see another DVI > VGA converter and an added S-Video > Composite Cable. Also some sort of DVD software would have been a nice addition.

However despite these slight omissions, Asus have put in a nice bundle.

Posted 18/11/06
Author: Matthew Kemp (kempez)
Product Acquired: Asus

The Card

Your first impression of the 8800GTX tends to be "wow, that's a big card", and it's not wrong. Measuring around 27cm in length this thing is an absolute beast. Even our 7950 GX2 looked like a toy next to the 8800GTX.

asus 8800GTX asus 8800gtx close up

Another thing you'll notice is the fan: the fins run around the edge of the cooler rather than branching out from the middle. I'll take a closer look further down.

The Asus card is based on the nVidia reference board (as all release cards are), although Asus have decided to brand it with a character from GRAW. I'm not a fan of branding in this way, but it's not unattractive.

asus 8800gtx rear

The rear of the card is by far the busiest of any card we've seen so far, with a huge amount of copper and transistors on the PCB showing just how much nVidia have squeezed onto this board.

We can see that the card needs a lot of power management on it to service this hungry GPU.

8800gtx cpas asus 8800GTX end view

The black PCB makes the card look awesome and, with the black cooler, very meaty. It's obviously a dual-slot cooler, and it will be interesting to see whether nVidia start producing models with a single-slot cooler, or whether the heat output of their new cards is simply too high for that right now.

Dual DVI brings HDCP so that you can run your card into a HDCP enabled monitor, when this is required by DRM.

8800gtx dual dvi

Under the clothes - G80 naked

Let's get this card stripped off and see what's underneath...

g80 with shim

nVidia have gone for an integrated heat-spreader with G80, following AMD and Intel. This looks good and gives the GPU some protection if you're using a third-party cooler, although IHSs do tend to add 2-3°C to the temperature.

The chip has a metal shim around the outside of it making it look pretty heavyweight. I'm not quite sure why nVidia felt the need to do this, with the IHS already on the chip, but I'm sure it serves some decent purpose.

G80 memory controller

This chip is the IO chip codenamed NVIO (not to be confused with NV10!). It controls all of the input/output on the card basically functioning as an advanced RAMDAC. The chip supports dual-link DVI ports with HDCP support and HDTV-out.


Twelve of Samsung's highest-rated 900MHz GDDR3 chips add up to 768MB in total on the 384-bit bus sitting around G80.
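Those figures check out arithmetically (my own sums, assuming each chip drives a 32-bit channel):

```python
# Memory capacity and bandwidth from the board's GDDR3 configuration.
chips = 12
chip_capacity_mb = 64        # 512Mbit parts
bus_width_bits = chips * 32  # one 32-bit channel per chip -> 384-bit
effective_mhz = 1800         # 900MHz GDDR3, double data rate

total_mb = chips * chip_capacity_mb
bandwidth_gb_s = (bus_width_bits / 8) * effective_mhz / 1000

print(total_mb)          # 768
print(bus_width_bits)    # 384
print(bandwidth_gb_s)    # 86.4 GB/s
```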


We've seen a run-through of what is underneath the skin of the G80, let's see how it specs up on paper:

Default clocks: Core 575MHz, Memory 1800MHz (effective)

NVIDIA® Unified Architecture

* Unified shader architecture
* GigaThread™ technology
* Full support for Microsoft® DirectX® 10
o Geometry shaders
o Geometry instancing
o Streamed output
o Shader Model 4.0
* Full 128-bit floating point precision through the entire rendering pipeline

NVIDIA Lumenex™ Engine

* 16x full screen anti-aliasing
* Transparent multisampling and transparent supersampling
* 16x angle independent anisotropic filtering
* 128-bit floating point high dynamic-range (HDR) lighting with anti-aliasing
o 32-bit per component floating point texture filtering and blending
* Advanced lossless compression algorithms for color, texture, and z-data
* Support for normal map compression
* Z-cull
* Early-Z

NVIDIA Quantum Effects™ Technology

* Advanced shader processors architected for physics computation
* Simulate and render physics effects on the graphics processor

NVIDIA SLI™ Technology

* Patented hardware and software technology allows two GeForce-based graphics cards to run in parallel, scaling performance and enhancing image quality on today's top titles.

NVIDIA PureVideo™ HD Technology

* Dedicated on-chip video processor
* High-definition H.264, VC-1, MPEG2 and WMV9 decode acceleration
* Advanced spatial-temporal de-interlacing
* HDCP capable
* Noise Reduction
* Edge Enhancement
* Bad Edit Correction
* Inverse telecine (2:2 and 3:2 pull-down correction)
* High-quality scaling
* Video color correction
* Microsoft® Video Mixing Renderer (VMR) support

Advanced Display Functionality

* Two dual-link DVI outputs for digital flat panel display resolutions up to 2560x1600
* Dual integrated 400MHz RAMDACs for analog display resolutions up to and including 2048x1536 at 85Hz
* Integrated HDTV encoder provides analog TV-output (Component/Composite/S-Video) up to 1080i resolution
* NVIDIA nView® multi-display technology capability
* 10-bit display processing
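A quick sanity check on the 400MHz RAMDAC figure against the top analog resolution (my own estimate, assuming a typical ~25% pixel-clock overhead for blanking rather than any exact timing standard):

```python
# Pixel clock needed to drive 2048x1536 at 85Hz over an analog output.
active_pixels = 2048 * 1536
refresh_hz = 85
blanking_factor = 1.25      # assumed overhead for blanking intervals

pixel_clock_mhz = active_pixels * refresh_hz * blanking_factor / 1e6
print(round(pixel_clock_mhz, 1))   # ~334.2 MHz, within a 400MHz RAMDAC
```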

Built for Microsoft® Windows Vista™

* Full DirectX 10 support
* Dedicated graphics processor powers the new Windows Vista Aero 3D user interface
* VMR-based video architecture

High Speed Interfaces

* Designed for PCI Express® x16
* Designed for high-speed GDDR3 memory

Operating Systems

* Built for Microsoft Windows Vista
* Windows XP/Windows XP 64
* Linux

API Support

* Complete DirectX support, including Microsoft DirectX 10 Shader Model 4.0
* Full OpenGL® support, including OpenGL 2.0

The Cooler

The cooler on the 8800GTX is nVidia's most impressive effort so far in my opinion. With a GPU that kicks out as much heat as G80, the heatpipes and low-speed fan cope pretty well: running at a steamy 62°C idle and 80°C load, the fan wasn't even audible above the case fans in my system (running at low speed settings).

8800gtx heatpipe 8800gtx cooler

The fan is certainly an odd design with the fins "scooping" the air around the outside, rather than coming from the middle of the fan. I believe this is referred to as a "squirrel cage" fan:

asus 8800gtx cooler

This scoops the air straight out of the case, which is a good thing given the heat output of this uber-high-end card.

g80 goop g80 goop!

There is an abundance of "goop" on the Asus reference cooler. I know that the contact area is flat, so I'm a little confused why there's so much TIM that I had to actually scrape it off. Thermal pads are used instead of thermal paste on the components not needing such extreme cooling. That said, the cooler is pretty good at its job even under the extreme heat that G80 emits.

Overall the physical aspects of the nVidia reference card are excellent. Apart from BFG, who are selling a very highly priced 8800GTX with a DangerDen liquid cooler fitted, all of nVidia's partners are using the stock cooler, so it's good to see nVidia and their partners did a decent job.


How we tested


With the pure speed of cards on the market at the moment I tend to focus on gameplay in my reviews. I play each of the test-suite games at the highest settings I can while maintaining a frame rate that is enjoyable to play at. If the quality looks the same at different settings then I choose the setting with the best FPS. This differs from game to game and person to person, but I play a lot of games and have played with a lot of graphics cards, so I hope I can impart a decent opinion.


To offset this angle I also run a test suite of 3dMark03, 05 and 06. These will give you some numbers on what the cards perform like at stock speeds, for comparison.


Test Setup


For this high-end card I am using an overclocked Core2Duo E6600 running at 3.33GHz. Hopefully this will rule out as much CPU bottlenecking as possible.


Intel Core2Duo E6600 @ 3.33GHz
Asus P5W Dh Deluxe
Mushkin HP2 PC6400 running at SPD (CAS 5) due to motherboard restrictions
HDD: OS - 160gb Hitachi Deskstar SATA II
HDD: Gaming - 2 x 40gb Hitachi Deskstars in RAID 0
Sound: Sound Blaster X-Fi Fatal1ty FPS
Power: PCP&C 510 SLi
Case: Lian Li PC75b Project <<|Black3D|>>
Custom Watercooling


I installed the card as usual with the normal PCI-E power dongles, checked that it was seated correctly and powered on. I am using a clean install of Windows XP Professional SP2 with all the latest patches.


The X1950XTX is using the Catalyst 6.10 drivers.


For the 7950GX2 I am using nVidia's official Forceware drivers - 91.43. For the 8800GTX I am using 97.02's.


For reference I am using some figures from previously reviewed cards that used the same setup. These are provided as a reference point only and not as a direct comparison.


Powercolor X1950XT-X. Clock Speed: 650. Memory speed: 1000 (2000)


Gainward 7950 GX2. Clock Speed: 2 x 500. Memory Speed 600 (1200)


Asus 8800GTX. Clock Speed: 575. Memory Speed 900 (1800)


Image Quality

To show some of the image quality on offer, I took JPEG screenshots of my test games running at 1920 x 1200 with 8 x QAA and 16 x AF on high quality.

Remember the quality will be very slightly degraded due to the format conversion, but this just shows how nice the games look at these Ultra-High settings and resolution.

CoD 2

cod 2 iq 8800GTX

cod 2 iq 8800GTX

Call of Duty 2 is an awesome looking game and the 8800GTX handles this fine at these high settings.

The image quality with Anisotropic Filtering is noticeably better than on the 7900GTX/7950GX2, with no shimmering at all and none of the glitches I had seen previously.

Quake 4

quake 4 9900gtx iq

quake 4 iq 8800gtx

Although I'm not a big fan of the Quake 4 engine, the game managed to look awesome at this res and it was a great experience.


F.E.A.R.

fear iq 800gtx

fear iq 8800gtx

fear iq 8800gtx

F.E.A.R. had a very challenging engine when it came out and was the bane of many PCs. It looks amazing strafing in slow motion while shooting up some clones!


Oblivion

oblivion iq

oblivion iq 2

oblivion iq

As you can see, Oblivion looks simply awesome at this high res. I was thoroughly impressed at how the card handled the AA and HDR.

nVidia's AA and AF features improved

nVidia have improved image quality on their 8800 series cards by implementing angle-independent Anisotropic Filtering. nVidia's GPUs before the 8800GTX only applied full AF at certain angles, such as on horizontal and vertical surfaces, whereas the Lumenex engine applies it on all surfaces that need it.

8800gtx af iq

The Lumenex engine provides the highest AA image quality with the lowest performance drop. nVidia designed an antialiasing system that uses Coverage Sampling Antialiasing (CSAA).

"Coverage Sampling Antialiasing uses intelligent coverage information to perform ultrahigh quality antialiasing without bogging down the memory system. CSAA is introduced in the GeForce 8800 GPUs."

As you can see in the screenshots above, AA image quality makes the 8800GTX the current "king of the hill". With both 8 x QAA and 16 x QAA the games are immersive, stunning and free of jaggies.
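The storage saving behind CSAA can be sketched roughly like this (illustrative per-sample sizes of my own choosing, not nVidia's published format): full colour/depth data is kept for only a handful of samples, with the remaining samples storing cheap coverage information.

```python
# Rough per-pixel framebuffer cost: 16x MSAA vs 16x CSAA, assuming
# 8 bytes (colour + depth) per fully stored sample and 4 bits per
# coverage-only sample.

def msaa_bytes(samples, bytes_per_sample=8):
    return samples * bytes_per_sample

def csaa_bytes(coverage_samples, stored_samples=4,
               bytes_per_sample=8, coverage_bits=4):
    extra = (coverage_samples - stored_samples) * coverage_bits / 8
    return stored_samples * bytes_per_sample + extra

print(msaa_bytes(16))   # 128 bytes per pixel
print(csaa_bytes(16))   # 38.0 bytes per pixel
```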

Let's move onto some benchies...

F.E.A.R.

F.E.A.R. is a game based on an engine that uses many features of DirectX 9.0c.


It has volumetric lighting, soft shadows, parallax mapping and particle effects, with a slow-motion mode that really taxes today's top-of-the-line GPUs. I played a fully patched version of the game: three two-minute runs on a taxing section with plenty of action, using slow-motion the whole time whilst firing at enemy soldiers and throwing grenades, which produce a cool "blast" distortion effect when they go off.


Here are the settings that were found to be optimal for F.E.A.R. on a Dell 2405FPW display:

fear settings

fear in game fps

As you can see, the 8800GTX only sank to 19FPS, though when I looked at the FPS results this happened at just one point, and the second-lowest reading was 35FPS. I considered not including the single-reading dip, but it came out in our test so I left it in. The game looked awesome at this resolution and quality, and it was an immersive pleasure to play.

Call of Duty 2

Call of Duty 2 is a fairly recent game that uses a lot of DirectX 9.0c features, including real time shadows, amazing smoke effects and some nice looking HDR effects. This makes the game very taxing at these high resolutions. I played a fully patched up version of the game. Once again I played through the game with a two minute gaming session including explosions, smoke and also lots of snow. Will the 8800GTX walk this one too?



Here are the settings that were found to be optimal for Call of Duty on a Dell 2405FPW display:

call of duty settings

call of duty 2 fps

Once again Call of Duty 2 played superbly at extreme details, creating by far the best WWII gameplay experience I've had so far. With frames never dipping under 28FPS and detail levels bordering on insane, it was awesome fun, and the 8800GTX took it all in its stride.

Quake 4

Quake 4 is built on the Doom 3 engine. This uses many DX 9.0c features and, being an OpenGL game, has not always run so well on ATI hardware. Once again I did three two-minute runs on each card and took the average of all my readings. I played a fast and furious part of the game featuring both internal and external scenes.

Here are the settings that were found to be optimal for Quake 4 on a Dell 2405FPW:

quake 4 settings

quake 4 fps

Quake 4 also ran ultra-smooth at these ultra-high detail settings, and the game looks very nice with not a jaggy in sight. Once again the 8800GTX nailed Quake 4 and was held back only by the rest of the system.


Oblivion

Oblivion is an awesome RPG with a simply huge immersive environment, great graphics and incredibly realistic scenery. It is currently one of the most demanding games you can buy and is certainly a test of the high-end cards here. I chose to do a run-through of the Arena part of the game: I spoke to a character, cast some magic whilst in a fight and fought in the arena, which is pretty huge. As well as this test I took a wander around to make sure the benchmark resembled general gameplay with each card. This benchmark really tests today's cards, and ATI cards seem to do very well in this game, with nVidia cards lagging slightly behind.


Also note that I used the "Chuck" patch to enable AA and HDR for the ATI cards; this has since been integrated into ATI's latest drivers. The nVidia card also allowed me to run the game with AA and HDR.


Here are the settings that were found to be optimal for Oblivion on a Dell 2405FPW display:

oblivion settings

oblivion fps

As I mentioned earlier, Oblivion is probably the most challenging game on the market. Previously I could just about run it smoothly on the X1950XTX at 2 x AA. The 8800GTX takes it all in its stride, and I felt the restriction was once again the test system's CPU. Oblivion is an awesome game, and with 8 x QAA added to the scene as well as HDR it is an immersive, almost spell-binding universe.

It's great to see nVidia manage HDR and AA together in most situations with the 8800GTX, and the sheer speed at which the card does it is beyond what we imagined possible before the 8800 came out.

Posted 20/11/06

Benchmarking - 3DMark

I used the popular Futuremark gaming benchmarks to bench all of the cards: 3DMark03, 05 and 06. All benchmarks were run at stock speeds and at default settings, just like the free versions, to give you a good comparison of scores.


The Results were as follows:

3dMark 03

First we start with 3dMark03. This is a benchmark that relies heavily on DirectX 8 features. This will give an indication of how the card will run on games that rely on DX 8.


3dmark03 8800gtx

The 8800GTX didn't score much higher than the 7950 GX2 in 3DMark03, but since this is a very old and very CPU-bound test, that isn't a huge surprise.


Next I ran 3DMark05. This benchmark requires more DirectX 9 features and is slightly more taxing on the cards.

8800gtx 3dmark05

3DMark05 shows a more healthy increase in marks with the 8800GTX. 17k as a stock 3DMark05 for a single GPU is very impressive, but let's get onto what the 8800GTX excels in even more...


3dMark06 is the latest in the benchmarking tests from Futuremark. It has a lot of DirectX 9.0c features such as HDR and use of Shader model 3.0. This benchmark is very taxing for the cards and also includes quite a harsh CPU benchmark. Seeing as this was run with the exact same CPU this was not an issue.


3dmark06 8800gtx

For a DX10 card you wouldn't have thought the DX 9.0c performance would be quite this good... but it very much is. An increase of 2359 marks over the 7950 GX2 (a dual-GPU card, remember) is very impressive.

It seems that the 8800GTX is the card with which to break Futuremark ORB records. With these cards already topping the charts, anyone who is a bencher will want one.

Conclusion


There's no doubting that in the 8800 series, nVidia have made themselves a very nice card indeed. The new architecture and implementation of unified shaders have left a lot of people in the technology community surprised and excited. nVidia have "done" unified shaders the right way with a clean and clever implementation.

The high-quality feature implementation when using AA and AF are a huge improvement on the 7 series cards and at last nVidia have sorted out the quality problems they had rendering AF. Not only have nVidia solved these quality issues, but they have done so in such a way as to speed up the rendering of the higher quality features far above that on the previous generation of cards.

For high-quality, ultra-high-resolution gaming there's no doubt that an 8800GTX should be going into your gaming PC right now. It looks like ATI will be late to the party, so the 8 series may be sitting in quite a few people's PCs before too long.

DirectX 10 is of course not out yet, and this is where I foresee a stumbling block when it comes to people opening their wallets. Titles like Crytek's Crysis need to be released to fully utilise DX10 and DX10 cards, but none are out yet because the API itself isn't. With Microsoft saying that you need Vista to play DX10 games, the upgrade to DX10 becomes that little bit more expensive, which puts a bit of a downer on this incredibly fast card.

Coming onto the Asus bundle: it's a very nice package, with two full games and 3DMark06 included, so full marks to Asus there. If you want to pick one up it's yours for the princely fee of £461.25 @ SpecialTech. Is it worth this money right now? Well, it all depends on whether you're cutting-edge or not. Personally I think this card is worth every penny, but then I'm in that "must have it now" user bracket.

I'm awarding the nVidia 8800GTX the "Innovation Award" and the Asus 8800GTX the "Gamers Choice Award" as well as an "Editors Choice Award". Quite some feat for one review item!

Asus 8800GTX

editors choice

gamers choice

nVidia 8800 Series GPU's

asus 8800gtx innovation


+ Very fast
+ First card with DX10 support
+ Quiet Cooler
+ Great bundle
+ Excellent Image Quality
+ Innovative implementation of DX10 API


- Awaiting DX10
- CPU bound

Thanks to Asus for providing the review sample
