Fitting Three Video Cards in an ATX Case

I thought we’d flip our normal GPU review style on its head by starting with Power, Temperature, and Noise first. NVIDIA and AMD have both long recommended against placing high-end video cards directly next to each other, in favor of additional spacing between cards. Indeed this is a requirement for their latest dual-GPU cards, as both the GTX 590 and 6990 draw relatively massive amounts of air through a fan mounted at the center of the card and exhaust roughly half of that air inside the case. Their reference-style single-GPU cards, on the other hand, are fully exhausting, with fans mounted towards the rear of the card. Thus multi-GPU configurations with the cards next to each other are supposed to be possible, though not ideal.

There’s a reason I want to bring this up first, and a picture is worth a thousand words.

While AMD and NVIDIA’s designs share a lot in common – a rear-mounted blower fan pushes air over a vapor chamber cooler – the shrouds and other external equipment are quite different. It’s not until we see a picture that we can appreciate just how different they are.

With the Radeon HD 6000 series, AMD’s reference designs took a decidedly boxy turn. The cards fully live up to the idea of a “black box”: they’re enclosed on all sides by the cooler shroud and a black metal backplate. As a GPU reviewer I happen to like this design, as the GPUs are easy to stack and store, and the backplate covers what would normally be the only exposed electronics on the card. The issue with this boxy design is that AMD is taking full advantage of the PCIe specification, so the 6900 series cards are the full width the spec allows.

NVIDIA on the other hand has always had some kind of curve in their design, normally resulting in a slightly recessed shroud around the blower intake. For the GTX 580 and GTX 570 they took a further step in recessing the shroud around this area, leading to the distinct wedge shape. At the same time NVIDIA does not use a backplate, saving precious millimeters of space. The end result of this is that even when packed like sardines, the GTX 580 and GTX 570 blowers have some space reserved for air intake.

The Radeon HD 6970 does not, and this is our problem. The shot of the 6970 in triple-CF tells the story, as the middle card is pressed directly against the top card. Because these cards are so large and heavy, the rear ends tend to shift and dip some when installed against a vertical motherboard – in fact this is why we can normally get away with a dense dual-CF setup, since the bottom card dips a bit more – but in a triple-CF configuration the end result is that one of the cards will end up getting up close and personal with another.

Without outside intervention this isn’t usable. We hit 99C on the middle card in Crysis when we initially installed the three cards, and Crysis isn’t the hardest thing we run. For the purposes of our test we ultimately resorted to wedging some space between the cards with wads of paper, but this isn’t a viable long-term solution.

Unfortunately, long-term alternatives are few if you want to give a triple-GPU setup more space. Our testbed uses an Asus Rampage II Extreme, which features three PCIe slots mixed among a total of six slots; the way it’s laid out makes it impossible to set up our triple-GPU configuration in any other manner. Even something like the ASRock P67 Extreme4 can’t escape the fact that the ATX spec only has room for seven slots, and that when manufacturers actually use the seventh and topmost slot, it’s a short PCIe x1 slot. In short, you won’t find an ATX motherboard that can fit three video cards while giving each one a slot’s worth of breathing room. For that you have to use a larger-than-ATX form factor.
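To put the slot arithmetic in plain terms, here’s a minimal sketch (assuming standard dual-slot coolers, which is an assumption rather than something measured here) of why a spaced triple-GPU layout can’t fit in seven ATX slots:

```python
# Quick slot-budget check for triple-GPU on an ATX board.
# Assumption: each card uses a standard dual-slot cooler.
ATX_SLOTS = 7      # expansion slot positions allowed by the ATX spec
CARD_WIDTH = 2     # slots occupied by one dual-slot card
CARDS = 3

packed = CARDS * CARD_WIDTH          # cards stacked with no gaps: 6 slots
spaced = packed + (CARDS - 1)        # one empty slot between each pair: 8 slots

print(packed <= ATX_SLOTS)  # True  -- three cards physically fit
print(spaced <= ATX_SLOTS)  # False -- the spaced layout exceeds the ATX budget
```

The math is why the article points at water cooling or a larger-than-ATX board: the packed layout fits but starves the blowers, and the spaced layout needs eight slots.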

So what’s the point of all of this rambling? With AMD’s current shroud design it’s just not practical to do triple-CF on air on an ATX motherboard. If you want to play with three AMD cards you need to think outside of the box: either use water cooling or use a larger motherboard.


  • A5 - Sunday, April 3, 2011 - link

    I'm guessing Ryan doesn't want to spend a month redoing all of their benchmarks over all the recent cards. Also the only one of your more recent games that would be at all relevant is Shogun 2 - SC2 runs well on everything, no one plays Arma 2, and the rest are console ports...
  • slickr - Sunday, April 3, 2011 - link

Apart from Shift, no game is a console port.

Mafia 2, SC2, Arma 2, and Shogun 2 are PC only; Dead Space is also not a console port, it's a PC port to consoles.
  • Ryan Smith - Sunday, April 3, 2011 - link

We'll be updating the benchmark suite in the next couple of months, in keeping with our twice-a-year schedule. Don't expect us to drop Civ V or Crysis, however.
  • Dustin Sklavos - Monday, April 4, 2011 - link

    Jarred and I have gone back and forth on this stuff to get our own suite where it needs to be, and the games Ryan's running have sound logic behind them. For what it's worth...

    Aliens vs. Predator isn't worth including because it doesn't really leverage that much of DX11 and nobody plays it because it's a terrible game. Crysis Warhead STILL stresses modern gaming systems. As long as it does that it'll be useful, and at least provides a watermark for the underwhelming Crysis 2.

    Battleforge and Shogun 2 I'm admittedly not sure about, same with HAWX and Shift 2.

    Civ 5 should stay, but StarCraft II should definitely be added. There's a major problem with SC2, though: it's horribly, HORRIBLY CPU bound. SC2 is criminally badly coded given how long it's been in the oven and doesn't scale AT ALL with more than two cores. I've found situations even with Sandy Bridge hardware where SC2 is more liable to demonstrate how much the graphics drivers and subsystem hit the CPU rather than how the graphics hardware itself performs. Honestly my only justification for including it in our notebook/desktop suites is because it's so popular.

Swapping Mass Effect 2 for Dead Space 2 doesn't make any sense; Dead Space 2 is a godawful console port while Mass Effect 2 is currently one of the best, if not THE best, optimized Unreal Engine 3 games on the PC. ME2 should get to stay almost entirely by virtue of being an Unreal Engine 3 representative, ignoring its immense popularity.

    Wolfenstein is currently the most demanding OpenGL game on the market. It may seem an oddball choice, but it really serves the purpose of demonstrating OpenGL performance. Arma 2 doesn't fill this niche.

    Mafia II's easy enough to test that it couldn't hurt to add it.
  • JarredWalton - Monday, April 4, 2011 - link

    Just to add my two cents....

    AvP is a lousy game, regardless of benchmarks. I also toss HAWX and HAWX 2 into this category, but Ryan has found a use for HAWX in that it puts a nice, heavy load on the GPUs.

    Metro 2033 and Mafia II aren't all that great either, TBH, and so far Crysis 2 is less demanding *and* less fun than either of the two prequels. (Note: I finished both Metro and Mafia, and I'd say both rate something around 70%. Crysis 2 is looking about 65% right now, but maybe it'll pick up as the game progresses.)
  • c_horner - Sunday, April 3, 2011 - link

    I'm waiting for the day when someone actually reports on the perceived usability of Mutli-GPU setups in comparison to a single high-end GPU.

What I mean is this: oftentimes, even though you might be getting an arbitrarily higher frame count, the lag and overall smoothness make the game nowhere near as playable and enjoyable as one that can be run properly on a single GPU.

Having tried SLI in the past I was left with a rather large distaste for plopping down the cost of another high-end card. Not all games worked properly, not all games scaled well, some games would scale well in the areas they could render easily but minimum frame rates sucked, etc., and the list goes on.

    When are some of these review sites going to post subjective and real world usage information instead of a bunch of FPS comparisons?

    There's more to the story here.
  • semo - Sunday, April 3, 2011 - link

I think this review covers some of your concerns. It seems that AMD, with their latest drivers, achieves a better min FPS score compared to NVIDIA.

I've never used SLI myself, but I would think that you wouldn't be able to notice the latency due to more than one GPU in game. Wouldn't such latencies be in the microseconds?
  • SlyNine - Monday, April 4, 2011 - link

And yet, those microseconds seemed like macroseconds. Microstutter was one of the most annoying things ever! I hated my 8800GT SLI experience.

    Haven't been back to multi videocard setups since.
  • DanNeely - Monday, April 4, 2011 - link

Look at 's reviews. Instead of FPS numbers from canned benches, they play the games and list the highest settings that were acceptable. Minimum FPS levels and, for SLI/xFire, microstuttering problems can push their recommendations down, because even when the average numbers look great the situation might not actually be playable.
  • robertsu - Sunday, April 3, 2011 - link

How is microstuttering with 3 GPUs? Is there any in these new versions?
