A Look At Triple-GPU Performance And Multi-GPU Scaling, Part 1
by Ryan Smith on April 3, 2011 7:00 AM EST
It’s been quite a while since we’ve looked at triple-GPU CrossFire and SLI performance – or, for that matter, at GPU scaling in depth. While NVIDIA in particular likes to promote multi-GPU configurations as a price-practical upgrade path, such configurations are still almost always the domain of the high-end gamer. At $700 we have the recently launched GeForce GTX 590 and Radeon HD 6990, dual-GPU cards whose very existence is predicated on how well games scale across multiple GPUs. Beyond that we move into the truly exotic: triple-GPU configurations using three single-GPU cards, and quad-GPU configurations using a pair of the aforementioned dual-GPU cards. If you have the money, NVIDIA and AMD will gladly sell you upwards of $1500 in video cards to maximize your gaming performance.
These days multi-GPU scaling is a given – at least to some extent. Below the price of a single high-end card our recommendation is always going to be to get a bigger card before you get more cards, as multi-GPU scaling is rarely perfect, and with cutting-edge games there’s often a lag between a game’s release and the release of a driver profile that enables multi-GPU scaling. Once we’re looking at the Radeon HD 6900 series or GF110-based GeForce GTX 500 series though, going faster is no longer an option, and thus we have to look at going wider.
Today we’re going to be looking at the state of GPU scaling for dual-GPU and triple-GPU configurations. While we accept that multi-GPU scaling will rarely (if ever) hit 100%, just how much performance are you getting out of that 2nd or 3rd GPU versus how much money you’ve put into it? That’s the question we’re going to try to answer today.
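That performance-versus-money question can be framed as a simple calculation: how close does each extra GPU come to linear scaling, and what does each frame per second cost? A minimal sketch of that math follows; all of the fps figures and the card price below are hypothetical placeholders, not benchmark results from this article.

```python
# Hypothetical illustration of multi-GPU scaling efficiency and cost efficiency.
# The fps numbers and card price are made up; real benchmarks supply the inputs.

def scaling_efficiency(fps_multi: float, fps_single: float, num_gpus: int) -> float:
    """Fraction of ideal (linear) scaling achieved: 1.0 means perfect scaling."""
    return fps_multi / (fps_single * num_gpus)

def fps_per_dollar(fps: float, card_price: float, num_gpus: int) -> float:
    """Frames per second bought per dollar spent on cards."""
    return fps / (card_price * num_gpus)

single, dual, triple = 60.0, 110.0, 150.0  # average fps (hypothetical)
price = 500.0                              # per-card price (hypothetical)

print(f"dual-GPU efficiency:   {scaling_efficiency(dual, single, 2):.0%}")
print(f"triple-GPU efficiency: {scaling_efficiency(triple, single, 3):.0%}")
print(f"fps per dollar, single: {fps_per_dollar(single, price, 1):.3f}")
print(f"fps per dollar, triple: {fps_per_dollar(triple, price, 3):.3f}")
```

With these placeholder numbers the second card delivers most of its theoretical performance while the third delivers noticeably less per dollar – exactly the kind of diminishing return the benchmarks ahead are meant to measure.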
From the perspective of a GPU review, we find ourselves in an interesting situation in the high-end market right now. AMD and NVIDIA just finished their major pushes for this high-end generation, but the CPU market is not in sync. In January Intel launched their next-generation Sandy Bridge architecture, but unlike with the past launches of Nehalem and Conroe, the high-end market was initially passed over. For $330 we can get a Core i7 2600K and crank it up to 4GHz or more, but what we get to pair it with is lacking.
Sandy Bridge only supports a single PCIe x16 link coming from the CPU – an awesome CPU is being held back by a limited amount of off-chip connectivity: DMI and a single PCIe x16 link. For two GPUs we can split that into x8 and x8, which shouldn’t be too bad. But what about three GPUs? With PCIe bridges we can mitigate the issue some by allowing the GPUs to talk to each other at x16 speeds and dynamically allocate CPU-to-GPU bandwidth based on need, but at the end of the day we’re splitting a single x16 link across three GPUs.
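The arithmetic behind that concern is straightforward. A PCIe 2.0 lane carries roughly 500 MB/s in each direction, so a back-of-the-envelope sketch of how Sandy Bridge’s 16 lanes divide up across cards looks like this (assuming, for simplicity, that a bridge chip shares the lanes evenly):

```python
# Back-of-the-envelope PCIe 2.0 bandwidth math.
# PCIe 2.0 delivers roughly 500 MB/s per lane, per direction.
PCIE2_MBPS_PER_LANE = 500

def per_gpu_bandwidth(total_lanes: int, num_gpus: int) -> float:
    """Average CPU-to-GPU bandwidth per card (MB/s), lanes shared evenly."""
    return total_lanes * PCIE2_MBPS_PER_LANE / num_gpus

for gpus in (1, 2, 3):
    mb_s = per_gpu_bandwidth(16, gpus)
    print(f"{gpus} GPU(s) behind a single x16 link: ~{mb_s:.0f} MB/s each")
```

Three GPUs sharing one x16 link average under 2.7 GB/s of CPU-to-GPU bandwidth apiece – a third of what a lone card enjoys – which is the crux of the platform dilemma discussed next.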
The alternative is to take a step back and work with Nehalem and the X58 chipset. Here we have 32 PCIe lanes to work with, doubling the amount of CPU-to-GPU bandwidth, but the tradeoff is the CPU. Gulftown and Nehalem are capable chips on their own, but per-clock the Nehalem architecture is normally slower than Sandy Bridge, and neither chip can clock quite as high on average. Gulftown does offer more cores – 6 versus 4 – but very few games are held back by the number of cores. Instead the ideal configuration is to maximize the performance of a few cores.
Later this year Sandy Bridge E will correct this by offering a Sandy Bridge platform with more memory channels, more PCIe lanes, and more cores; the best of both worlds. Until then it comes down to choosing one of two platforms: a faster CPU or more PCIe bandwidth. For dual-GPU configurations this should be an easy choice, but for triple-GPU configurations it’s not quite as clear cut. For now we’re going to be looking at the latter by testing on our trusty Nehalem + X58 testbed, which largely eliminates any bandwidth bottleneck at the cost of a potential CPU bottleneck.
Moving on, today we’ll be looking at multi-GPU performance under dual-GPU and triple-GPU configurations; quad-GPU will have to wait. Normally we only have two reference-style cards of any product on hand, so we’d like to thank Zotac and PowerColor for providing a reference-style GTX 580 and Radeon HD 6970 respectively.
Comments
Castiel - Monday, April 4, 2011
Why didn't you just use a P67 board equipped with an NF200 chip for testing? Using X58 is a step in the wrong direction.
UrQuan3 - Monday, April 4, 2011
Mr Smith,
When you do the multi-monitor SLI/CrossFire review, could you briefly go over the different connection modes? The last time I messed with SLI, it forced all monitors to be connected to the first card. Since the cards in question only had two outputs, I had to turn off SLI to connect three monitors. This caused some strange problems for 3D software.
Would you go over the options currently available in your next review?
Ryan Smith - Monday, April 4, 2011
When was this? That doesn't sound right; you need SLI to drive 3 monitors at the present time.
UrQuan3 - Thursday, April 7, 2011
Right this second I'm typing on a PC with 2 GTX 260s (not sure which revision) with two monitors plugged into the first and a third monitor plugged into the second. At the time, SLI would only allow monitors plugged into the first card. Of course, since IT doesn't trust us to do our own upgrades, I'm still running driver version 260.89.
Of course, Windows supports multiple dissimilar cards with a monitor or two on each, even different brand cards. However, 3D support in this mode is, er, creative. In this mode most programs (games) can only drive one card's monitors. You can, however, have different programs running 3D on different cards' monitors.
Since you'll have the hardware sitting on your desk, I'd love to see a quick test of the options.
BLHealthy4life - Monday, April 4, 2011
How the heck did you get the 11.4 preview to work with CrossFire??
I have 6970 crossfire and I cannot for the life of me get 11.4p to work. I have used 11.2 and 11.3 with no problems. I removed previous drivers with ATI uninstaller followed by driver sweeper. Then I've installed 11.4 p 3/7 and 3/29 and neither one of them work.
I even went as far as to do TWO fresh installs of W7 x64 Ultimate and then install 11.4p and the f*cking driver breaks crossfire....
Ryan Smith - Monday, April 4, 2011
I'm afraid there's not much I can tell you. We did not have any issues with 11.4 and the 6970s whatsoever.
quattro_ - Monday, April 4, 2011
Did you use DOF when benching Metro? I find the HD 6990's score high! I only get 37fps average: 980X @ 4.4 and a single HD 6990 at stock clocks and the 11.4 preview driver.
Ryan Smith - Monday, April 4, 2011
No, we do not. Metro is bad enough; DOF crushes performance.
ClagMaster - Monday, April 4, 2011
I will never understand why people will buy 2 or 3 graphics cards, requiring a 1200W power supply, so they can get 10-20 fps more of subtle eye candy.
There are some things that are beyond the point of reason and fall into the madness of Captain Ahab. This is just about as crazy as insisting on a 0.50 cal Browning Target rifle rather than a more sensible 0.308 Win Target rifle for 550m target shooting and white tail deer hunting. The 0.308 Win is less punishing on the body and pocketbook to shoot than the 0.50 Browning.
I always believed in working with one (1) graphics card that takes up 1 slot and requires 65 to 85W of power. A 9600GT plays all my games on a 1600x1200 CRT just fine.
looper - Tuesday, April 5, 2011
Excellent post... well-said.