Jul 16, 2012
 

I have now been waiting for nVidia long enough. Support still claims that SLI support for the GTX 680 is coming to XP / XP x64 in the future, but nobody knows when. Additionally, one support person said on the quiet that he doesn't believe this feature will ever see the light of day anymore. One WHQL driver and several betas later, nothing. Only that weird 302.59 driver with a broken SLI implementation (it worked only in D3D windowed mode and OpenGL; otherwise the dreaded, well-known red screen appeared).

So, what to do? I sold my two GTX 680 cards to an old friend of mine, who used them to replace his noisy GTX 480 SLI. With the money from the sale, I bought the biggest, baddest GTX 580 cards ever built, the EVGA GeForce GTX 580 Classified Ultra. Actually, only one of them is an Ultra, the other is a regular Classified, but hey, both run at Ultra clock rates, so that's fine. So here is Fermi again, in its maximum configuration with all 512 shader cores enabled.

Usually, a GTX 580 features a 772MHz rasterizer and 1544MHz shader clock with 1.5GB of VRAM running at a 4GHz GDDR5 data rate. The Ultra with its heavily modified custom PCB design however runs the rasterizer at 900MHz and the shaders at 1800MHz, and has 3GB of VRAM running at a 4.2GHz data rate. So it can almost reach GTX 680 speed, at the cost of very high power consumption under load, fed by its absolutely monstrous 14+3 phase voltage regulation module. While it ships air-cooled, this card was clearly made for running under LN2 pots to chase world records. Whatever, SLI is once again working fine for me now:
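To put the memory clocks in perspective: peak GDDR5 bandwidth follows directly from the effective data rate and the bus width. A quick sketch, assuming the GTX 580's standard 384-bit memory bus (the function name is just for illustration):

```python
def bandwidth_gbs(data_rate_ghz: float, bus_width_bits: int = 384) -> float:
    """Peak memory bandwidth in GB/s: effective data rate x bus width in bytes."""
    return data_rate_ghz * bus_width_bits / 8

# Stock GTX 580 at 4.0 GHz data rate vs. the Ultra at 4.2 GHz:
print(f"stock: {bandwidth_gbs(4.0):.1f} GB/s")  # 192.0 GB/s
print(f"Ultra: {bandwidth_gbs(4.2):.1f} GB/s")  # 201.6 GB/s
```

So the factory memory overclock buys roughly 10 GB/s of extra bandwidth on top of an already wide bus.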

Energy consumption is an issue here though, as mentioned. The cards are even hungrier than a pair of dreaded GTX 480 Fermis. Where the GTX 480 SLI would hit the ceiling at 900W in Furmark, the two extreme 580s here can eat up to 1140W under the same load. That's intimidating! Regular load while playing The Witcher 2 was around 800-880W, which is still a lot, especially considering the necessary power adapters. Since each card has 2 x 8-pin and 1 x 6-pin PCI Express power plugs, you need a total of 4 x 8P and 2 x 6P, which even my Tagan Piperock 1.3kW couldn't provide. So I had to adapt my two 6P to 8P, and 4 Molex 4P HDD plugs to 2 x 6P. It seems to work fine though, even with the 6P cables being overloaded by almost 100%. Obviously Tagan's current limiters aren't kicking in here, which is good, as those limiters (usually sold to the end user as "rails") are mostly pointless anyway. At least idle power consumption was only marginally higher compared to the slim GTX 680: here I see 280-300W instead of 250W, which is still somewhat acceptable.
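For reference, the PCI Express specification rates the slot at 75W, a 6-pin plug at 75W and an 8-pin plug at 150W. A quick sketch of what that means for this setup, and where the "overloaded by almost 100%" on the adapted 6P cables comes from:

```python
# PCI Express power limits per the specification (watts):
PCIE_SLOT = 75
SIX_PIN = 75
EIGHT_PIN = 150

# Per-card budget by spec: 2 x 8-pin + 1 x 6-pin + the slot itself.
per_card = 2 * EIGHT_PIN + SIX_PIN + PCIE_SLOT
print(per_card)      # 450 W per card
print(per_card * 2)  # 900 W for the SLI pair, by spec

# A 6-pin cable (rated 75 W) adapted to feed an 8-pin connector
# (which may draw up to 150 W) runs at roughly 100% overload:
overload = (EIGHT_PIN - SIX_PIN) / SIX_PIN
print(f"{overload:.0%}")  # 100%
```

Note these are specification ratings, not hard electrical limits; as the Furmark numbers show, the actual draw can exceed the spec budget, which is exactly why the overloaded adapter cables are worth watching.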

This might very well be the single most powerful GPU solution possible under XP x64, unless GK110 (the full-size Kepler, which will most likely be sold as a GTX 780) turns out faster than both Ultras here. Of course there is still a slim chance that the 680 (GK104), and maybe even GK110, will get XP / XP x64 SLI, but I don't count on that anymore. For now, the Ultra SLI is the best I can get. We'll see whether that changes in the future.

CC BY-NC-SA 4.0 The GTX 680 SLI problem: Solved. The stupid way. by The GAT at XIN.at is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
