I wish there was a separate button to point out this sort of thing so they could silently correct it. Don't get me wrong, I think it's good to have accurate information; it just clutters things up a bit.
While the basic design of the GTX 690 resembles the GTX 590, NVIDIA has replaced virtually every bit "with plastic with metal" for aesthetic/perceptual purposes.
surely "with plastic with metal" to "of plastic with metal"
1st line on page 2: "Much like the GTX 680 launch and the GTX 590 before it, the first generation the first generation of GTX 690 cards are reference boards being built by NVIDIA" - "the first generation" has been written twice.
If you look at accumulated benchmarks across the web, the 680 Nvidia cards beat the 7970 AMD cards by a much higher percentage at 1920x1080 (17.61% ahead) than they do at 1920x1200 (10.14% ahead). This means Anand's reviews always test at 1920x1200 to give the AMD cards a prettier-looking review, instead of testing at 1920x1080 (the most commonly available resolution at 1920x, which they could easily set their 1920x1200 monitors to). Hence their tests here at Anand are likely also biased in AMD's favor at higher resolutions. http://translate.google.pl/translate?hl=pl&sl=...
Sadly, it is a very uncommon resolution for new monitors. Almost every 22-24" monitor you buy today is 1080p instead of 1200p. :(
Not mine. I'm running a 1920x1200 IPS. 1920x1200 is more common in the higher-end monitor market. A quick glance at Newegg shows 16 1920x1200 models at 24" alone (starting at $230). Besides, I can't imagine many buy a $1000 video card and pair it with a single $200 display.
It makes more sense to me to check 1920x1200 performance than 1920x1080 for several reasons:
1) 1920x1200 splits the difference between 16x10 and 25x14 or 25x16 better than 1920x1080 (pixel counts verified in the quick sketch after this comment):
1680x1050 = ~1.7MP
1920x1080 = ~2MP
1920x1200 = ~2.3MP
2560x1440 = ~3.7MP
2560x1600 = ~4MP
2) People willing to spend $1000 for a video card are generally in a better position to get a nicer monitor. 1920x1200 monitors are more common at higher prices.
3) They already have three of them around to run 5760x1200. Why go get another monitor?
Opinionated Side Points:
Movies transitioned to resolutions much wider than 1080P long ago. A little extra black space really makes no difference.
1920x1200 is a perfectly valid resolution. If Nvidia is having trouble with it, I want to know. When particular resolutions don't scale properly, it is probable that there is either a bug or shenanigans at work in the more common resolutions.
I prefer using 1920x1200 as a starting point for moving to triple screen setups. I already think 1920x1080 looks squashed, so 5760x1080 looks downright flattened. Also, 3240x1920 just doesn't look very surround to me (3600x1920 seems borderline surround).
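(A quick way to double-check the megapixel figures quoted above - a throwaway sketch in Python; the resolutions and rounding are taken straight from the comment itself:)

    # Sanity check of the pixel counts quoted above.
    for w, h in [(1680, 1050), (1920, 1080), (1920, 1200), (2560, 1440), (2560, 1600)]:
        print(f"{w}x{h} = {w * h / 1e6:.2f} MP")
    # 1680x1050 = 1.76 MP, 1920x1080 = 2.07 MP, 1920x1200 = 2.30 MP,
    # 2560x1440 = 3.69 MP, 2560x1600 = 4.10 MP
    # Note: 1920x1200 has about 11% more pixels than 1920x1080 (2.30 / 2.07 ~ 1.11).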
There are only 18 models available in all of Newegg with 1920x1200 resolution - only 6 of those are under $400, and all of them are over $300. There are 242 models available in 1920x1080, with nearly 150 models under $300. You people are literally a bad joke when it comes to even a tiny shred of honesty.
I don't know about the 'sadly' there, in all honesty. I personally like 1920x1080 better than x1200, because nearly everything is done in the former resolution.
Who buys a GTX690 to play on a 1080P display? Even a 680 is overkill for 1080. You can save a lot of money with a 7870 and still run everything out there.
Stuka, I agree with you.....but when you buy such a card....you think of the future....5 maybe 6 years....and I can't guarantee that we will be gaming in 1080p then:)....
"Stuka i agree with you.....but when you buy such a card....you think in the future....5 maybe 6 years....and i can't gurantee that we will do gaming in 1080p then:)...."
I have to totally disagree with that. Anyone that pays $500+ for a video card is a certain "type" of buyer. That type of buyer will NEVER wait 5-6 years for an upgrade. That guy is getting the latest and greatest of every other generation, if not every generation of cards.
You shouldn't "totally disagree".......meet me...."the exception"....I am the type of buyer who is looking for the "long run"....but I must confess....if I could....I would be the type of buyer you describe....cya
retrospooty, I mean you no disrespect, but if you're spending $500 and buying for the "long run," you're doing it wrong.
If you had spent $250, you could have 80% of the performance for 2.5 years, then spend another $250 and have 200% of the performance for the remaining 2.5 years.
I bought two (2) HD 7970s on the premise that I'm not going to upgrade them for a good long while. At least four years, probably closer to six. I ran from 2005 to 2012 with a GeForce 7800GT just fine and my single core AMD CPU was actually the larger reason why I needed to move on.
Now granted, I also purchased a snazzy U2711 just so the power of these cards wouldn't go to waste (though I'm quite CPU-bound by this i7-3820), but I don't consider dropping AA in future titles to maintain performance to be that big of a loss; I already only run with 8x AF because, frankly, I'm too busy killing things to notice otherwise. I intend to drive this rig for the same mileage. It costs less for me to buy the best of the best at the time of purchase for $1000 and play it into the ground than it does to keep buying $350 cards to barely keep up every two years, all over a seven-year duration. Since I now have this fancy 2560x1440 resolution and want to use it, the $250-$300 offerings don't cut it. And then, don't forget to adjust for inflation year over year.
So yes, I'm going to be waiting between 4 and 6 years to upgrade. Under certain conditions, buying the really expensive stuff is as much of an economical move as it is a power grab. Not all of us who build $3000 computers do it on a regular basis.
P.S. Thank you consoles for extending PC hardware life cycles. Makes it easier to make purchases.
Keep laughing, this card cannot hold a solid v-sync 60 on that "tiny panel" with only 4xAA in the AMD fans' revived favorite game, Crysis. Can't do it at 1920X, guy. I guess you guys all like turning down your tiny cheap cards' settings all the time, even with your cheapo panels? I mean this one can't even keep up at 1920X - gotta turn down the in-game settings, keep the CP tweaked and eased off, etc. What's wrong with you guys? What don't you get?
Currently the only native 120Hz displays (true 120Hz input, not 60Hz frame doubling) are 1920x1080. If you want VSYNC @ 120Hz, then you need to be able to hit at least 120fps @ 1080p. Even the GTX690 fails to do that at maximum quality settings on some games...
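(For reference, the arithmetic behind that requirement is just the per-frame time budget; a minimal sketch, assuming an idealized pipeline with no frame-to-frame variance:)

    # Frame-time budget imposed by vsync at a given refresh rate.
    for hz in (60, 120):
        print(f"{hz} Hz -> {1000.0 / hz:.2f} ms per frame")
    # 60 Hz -> 16.67 ms per frame
    # 120 Hz -> 8.33 ms per frame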
It can't do 60 v-sync at 1920 in Crysis, and that's only at 4xAA. These people don't own a single high-end card, that's for sure, or something is wrong with their brains.
You must be talking about minimum fps, because on Page 5 the GTX690 is clearly averaging 85fps @1080p.
Tom's Hardware (love 'em or hate 'em) has benchmarks with AA enabled and disabled. Maximum quality with AA disabled seems to be the best way to get 120fps in nearly every game @ 1080p with this card.
You must be ignoring v-sync and stutter with frames that drop below 60 - and forget 120 frames a second. Just turn down the eye candy... on the 3-year-old console ports that are "holding us back"... at 1920X resolutions. Those are the facts, combined with the moaning about ported console games. Ignore those facts and you can rant and wide-eyed spew like the others - now not only is there enough money for $500 card(s)/$1000 dual, there's extra money for high-end monitors, when the current 1920X pukes out even the 690 and CF 7970 - on the old console-port games. Whatever, everyone can continue to bloviate that these cards destroy 1920X, until they look at the held-back settings in the benches and actually engage their brains for once.
Yes, you went out of your way - why did you have to, if they are so common? I'm sure you did. In any case, since they are so rare, the bias is still present here, as shown.
You are correct, I don't own one... I own three in triple screen. Dell U2412m's.
I really am at a loss as to what you are on about. It is well known that 16:10 is preferred amongst enthusiasts and professionals for a few reasons. If you want 16:9, fine, go for it, but don't act like it's weird that AT are benching with 16:10 just because you went with cheap ass 16:9 screens.
Are you being sarcastic or an idiot? From my experience 1920x1200 24" monitors are the MAJORITY. My work has roughly 50 24" monitors, all in that resolution. My HP ZR24W is 1920x1200 as well. The only 1080p 24" monitor that I have even seen is the TN panel that came with an HP computer.
If you are talking about computer monitors, 1920x1200 is the dominant resolution. If you are talking about TVs, then obviously 1080p is the norm.
There are 242 - count them, well over 200, nearly 250 - 1920x1080 monitors at the egg. In your great experience, there are 16 that fit your 1920x1200 dreampipe FUD scenario at the egg, most of them - well over half - over $400 each, while the 242 common monitors you all buy as you pinch every penny and whine about a $10 difference in video card prices are well under $200 each a lot of the time. So now suddenly, you all spend way over $300 to $400-plus for 11% more pixels... ROFL HAHAHHAHHA instead of $150 or $200... I guess that's why this place is so biased, the little bloggers are just as whacked when it comes to being honest.
Good grief... resorting to personal attacks isn't exactly a good way to get people to listen to you.
I'm not going to argue that 1080p isn't more common (from what I've read, no one is), because it is more common, you are quite correct there. However, I must point out that your logic to arrive at that conclusion is faulty: you're contending that 1080p is more common (it is) because there are more models available on Newegg, but just knowing how many models are available doesn't tell us how many units those move. If, for example, each of those 22 models of 1920x1200 monitors moves 10 times as much stock as each of the 1920x1080 ones, nearly as many of the 1920x1200 will have been sold as the 1920x1080 ones.
Now, I don't think that's likely, and I do agree with you that 1080p is more common nowadays (see next point), but your argument is invalid, even though you have come to the correct conclusion. Consider this: there are currently two models of iPhone available, compared to dozens of Android phones. By the same logic as you're using, I could say that the iPhone is incredibly rare - I'd get laughed out of town if I tried to make that argument.
The second point is that 1920x1200 hasn't been nearly as rare in the past as it is today. When I bought my previous monitor and my laptop (both 1920x1200), 1080p monitors were almost unheard of. Since monitors tend to last a while, it's not at all unreasonable for a disproportionate amount of people to be using them compared to their current sales.
Thirdly, there is a point of diminishing returns. Notice the complete lack of any benchmarks at or below 1680x1050? These cards are so fast that comparisons at those resolutions are pointless - they're all fast enough for anything you could do to them at that screen res - even Crysis. 1920x1080 almost falls into that category, heck, even 1920x1200 almost falls into that category. Benchmarks are only about who wins if there is some advantage to winning. Below 2560x1600, which card you're using is almost completely irrelevant, so why does it even matter whether they used 1920x1080 or 1920x1200?
Hm, odd. Not only do I have a 1920x1200 monitor on my desktop, I have TWO laptops with 1920x1200 screens. Using one right now. Yes, they're rarer than 1080p screens, but this is a site for enthusiasts, so readers here are more likely to have them.
The truth is a bit more simple than that. 5760x1200 is because our choice in monitors for multi-monitor testing was based on my personal monitor, which is another PA246Q. NVIDIA's limited display flexibility (same res + same sync) meant that it was easiest to just pair the PA246Q with some more PA246Qs. Consequently, it's easier to just test these monitors at their native resolution when we're using NVIDIA cards.
1920x1200 was very common for several years. Until a few years ago, they were much more common than 1920x1080. I even have an old laptop that's 1920x1200. Looking at what's available to buy new, today, doesn't tell the whole story. Because people don't replace their monitors every day.
Anandtech has always recommended spending up and getting a quality monitor. You see it in nearly every review. So, I think the readers here are more likely than the average guy on the street to own less common screens. I've had the same 2560x1600 monitor through 3 computers now, and I spent more on it than I've ever spent on any computer.
Yes, you're all super premium monitor buyers, and moments ago you were hollering the videocards are way too expensive and you cannot possibly afford them unless you are an idiot with too much money. I love this place, the people are so wonderfully honest.
1920x1200 is only rare now. I've gone through enough monitors to know what I like, and cheap 16:9 TN panels are not it. If that's good enough for you, then enjoy.
As for your other comment about v-sync and 4xAA: guess what, some of us don't care to have 8xAA and 16xAF running all the time.
I would rather play at 1200p at high settings with AA and AF off if it means playable fps and an enjoyable experience. This isn't [H]; I'm not gonna spend $1000 on a GPU so I can meet your approved settings for playing games, dude. Get a clue!
No...I couldn't afford one, but I very much wanted to buy one. It is much prettier than 16:9 for workstation purposes. New ones are being released all the time. You just have to pay more, but it's worth it.
Oh, so someone who almost wants to be honest. So isn't it absolutely true that a $500 video card is much easier to buy when your monitor doesn't cost half that much, let alone twice that much or $2,000 plus? You don't need to answer. We all know the truth. Everyone in this thread would take a single 680 or 7970 video card and a 1080P panel for under $200 before they'd buy a $450 1200P monitor and forfeit the 680 or 7970 for a $200 video card instead. It's absolutely clear, no matter the protestations. In fact if they did otherwise, they would be so dumb, they would fit right in. Oh look at that, why maybe they are that foolish.
Oh? A little over a year ago, I had some money for an upgrade and I wanted to upgrade either my monitor or my video card. Now, I have (and play) Crysis, which can only now, just barely, be handled by a single card, so obviously I could have used the GPU upgrade (still can, for that matter). I also had a decent (though not great) 22" 1920x1200 monitor.
However, despite that, I chose to buy a new monitor, and bought a used 3008WFP (30" 2560x1600). I have not regretted that decision one bit, and that was a lot more money than your $200-300 upsell for 1920x1200. Now, admittedly, there were other factors that were a consideration, but even without those, I would have made the same decision. Putting money into a good monitor which I'll use ALL the time I'm on the computer vs. putting money into a good video card that I'll use some of the time is a no-brainer for me. If all of my electronics were taken and I were starting from scratch, I'd get another 2560x1600 monitor before I even bought a video card. I'd suffer through the integrated IGP as long as I needed.
Now, that's my choice, and everyone's needs are different, so I wouldn't demand that you make the same decision I did, but, by the same token, you shouldn't be expecting everyone to be following the same needs that you have. ;)
You've jumped from 1920 to 2560, so who cares, not even close. In your case you got no video card. ROFL - further proving my point, and disproving everyone else's who screamed that if you get this card you have another two grand for monitors as well - which everyone here knows isn't true.
I never demanded anyone follow any needs, let alone mine, which are unknown to you despite your imaginary lifestyle readings, and contrary to the sudden flood of monitor fanboys and the accompanying lies.
It's not that rare; I got a fairly inexpensive 24" 1920x1200 HP monitor from Newegg a year ago. There weren't many options but it was there and it's great.
You are right that the average Joe doesn't have a 1920x1200 monitor, but this is an enthusiast web-site! Not a single enthusiast I know owns a 1080 display. 1920x1200 monitors aren't hard to find, but you will need to spend a tad more.
Nope, 242 vs 16 is availability; you lose miserably. You all didn't suddenly have one, along with the "friends" you suddenly acquired whose monitor sizes you have instantly memorized as well. ROFL - the lies are innumerable at this point.
It's either 1920x1200 @ 60hz, or 1920x1080 @ 120hz. I prefer smoother gameplay over 120 pixels. Also I know quite a few gamers that like using their TV for their PC gaming, so this would also be limited to 1080p.
I'd be more worried about AMD's performance going down in certain games due to Crossfire than something as trivial as this. As a 4870X2 owner I can tell you this is not at all uncommon for AMD. I still have to disable 1 GPU in most games, including BF3, because AMD's drivers for any card more than 12 months old are just terrible. As you can see, even the 6990 is being beaten by a 6970 in games as modern as Skyrim - their drivers are just full of fail.
A much higher percentage?!? That's 7% more... nothing extraordinary... Let's just say a higher percentage; when you say much, it makes us believe Nvidia's paying you.
10% you might be able to ignore, 17% you cannot. It's much higher; it changes who wins several of the games in the article's accumulated benches. It's a big difference.
Except that nVidia wins in the article and all of the accumulated benches here, even at 1920x1200 (which this card would be a complete waste on...), so what exactly are you complaining about? It's bias if they say that the AMD cards are better when they're not, but in the benchmarks and in the conclusions (here and elsewhere), nVidia is consistently ahead, so any claims of bias are completely groundless...
Read my first post instead of asking - or, having already read it, attack like you just did and continue to be a jerk, who cares, right? You obviously are all seriously upset about the little facts I gave in my very first post. You're all going bonkers over it, and you all know I'm correct; that's what really bothers all of you. Continue to be bothered, you cannot help it, that's for sure.
I guess all of you got very upset that my point was made: you're looking at a set of benchmarks biased toward AMD. I'm sure most of you are very happy about that, but angered that it has been effectively pointed out.
On the one hand, if they are trolling just for the reaction, it's fascinating. What kind of weird creature lies behind the internet persona? In most cases, we all know it must be a sad figure of a person with all sorts of interesting personality problems.
But on the flip side, if this person actually means and believes what they say is some sort of honest analysis, it's just as fascinating. What kind of thick bastard must then lurk behind the keyboard in question?
I think these creatures are Nvidia fanboys; they always react the same way. CeriseCogburn reminds me of one from a little while ago, can't remember his name. He was convinced that the 7970 pricing was the worst thing to ever happen to humanity since the birth of Justin Bieber, or at least it looked a lot like that. Sure, the price wasn't attractive, but there's some limit you must not cross to stay in the real world.
So, as weird a creature as they can be, I believe they are the result of Nvidia funding them to spread insanity in forums about video cards. They can't be natural things after all; they just don't make sense. Their closed-mindedness is second to none. Or else, they can only type insanities and have a filter on the replies to stop some information from entering their brain.
Do you really think ATI and nVidia would pay these weird, sad, little trolls to piss off readers every time one of their products is reviewed? It's an embarrassment and a distraction. No, I think they would pay someone like that NOT to talk about their products if they could. I'm sure that employees do write comments on product reviews, but guys like this are bad for business. Nobody wants someone like that on their side. If I were nVidia, I'd pay that guy to become an AMD fan!!!
I'm certain they would pay none of you, since not a single one of you can be honest or has a single argument to counter my points. You're all down to name-calling trolls - and you all have to face the facts now, facts that your clueless ignorance left out of your minds for some time. Have fun buying your cheap 1080P panels and slow and cheapo AMD cards - LOL. Oh sorry, you all now buy premium flat panels...
No actually I expected a lot more from the people here. I expected a big thank you, or a thanks for the information we'll keep that in mind and it helps our purchasing decisions. Instead we got a flood of raging new monitor owners and haters and name callers. Next time just thanking me for providing very pertinent information would be the right thing to do, but at this point I don't expect any of you to ever do the right thing.
I'm curious why the 680 and 690 trail AMD cards in Crysis and Metro, seeing as those seem to be the most GPU intensive games, while they win in most other tests. Would it be shading performance or something else?
My mind is pretty blown that we have cards that can run Crysis and Metro at 5760x1200 at very comfortable framerates now; that's insane. But barring that resolution, or 2560 for some games, I'm sure most of us don't see the appeal here - it will be sold in a very, very small niche. For normal monitor resolutions, I doubt games in large quantities will get much more demanding until we have new consoles out.
Oh, wow, they also are so biased toward AMD that they removed the actual most demanding game, Shogun 2: Total War, because I kept pointing out how the Nvidia 680s swept that game across the board - so now it's gone! ROFL (before you attack me, I note the Anand reviewer stated S2TW is the most demanding; it's right in the reviews here - but not this one).
Oh, I see it was added because the patch broke the Nvidia cards - but in AMD's favor again, the tester kept the breaking patch in instead of providing results. Wow, more AMD bias. Glad my epic fails are so productive. :-) U still mad? Or madder and raging out of control?
So, if they failed to add it, it'd have been AMD bias, but considering they DID add it... it's AMD bias.
And you're the one talking about rage, trollboi?
Had you just merely mentioned that the patch doesn't provide favourable results for NVIDIA cards, Ryan might have been tempted to reinstall the game and retest. Well, he might have - can't speak for the guy. Doubt he will now, though.
It's a very pertinent topic, because despite the protestations the vast majority of readers are likely to have a 1080p monitor, and apparently they all have amnesia as well, since this came up in a very, very recent article - one of the last few on the video cards - where it was even pointed out by Anand himself that one should extrapolate the fps data with the 11% pixel difference, something every last one of you either never read, completely forgot, or conveniently omitted.
Unfortunately Anand isn't correct, period. He should know it, but of course has to have an answer for the complainers - so he gave one. What should be clear, even by just looking at the 1920X benches here, is that the results below that resolution and above it don't always jibe with it - and personally I've noticed nVidia has issues with 1600x1200.
So, all the hundreds of readers with 1080p monitors can thank me deep down in their hearts, as they now know Nvidia is supreme at that resolution, by 17%+ on average, more so than at 1200p, and thus "saving money" on an AMD card using this site's 1200p stats is, for many, incorrect.
And yet they specifically called out the fact that the patch broke performance on nVidia cards, went out of their way to state what performance was like before the patch (which is clearly better than any of the other cards), and finally stated that they're pretty sure that the game is at fault, not nVidia or their drivers...
Yeah, they really must have it out for nVidia... *sigh*
Except in the 680 tests all you fools ignored S2TW, which I had to repeatedly point out was the hardest game in the bunch, not Crysis or M2033 - over and over again I had to tell all the fools blabbering, and now suddenly the game is "broken". ROFL - it's beyond amazing.
Are you using the Steam version? Your results differ from those of HardOCP, HardwareHeaven, and HardwareCanucks. They get scaling you don't. Your version should be dated March 2012; that's when the patch was released.
For a $1000 card that is not a very good showing. I'm thinking that the 2GB limit per GPU is really starting to hurt them, not to mention the 256-bit memory bus.
This card has 2GB per GPU, not 4. Also, the lack of memory (!) will limit performance before the memory bus does. Compared to previous NVIDIA products, the 680 has far faster memory which mitigates having a narrower bus.
There is no limit with the 2GB of memory, but none of you have looked at the dozens of reviews and hundreds of blog proofs, so you will keep babbling stupid things, forever, it appears.
The few reviews I've seen have the 4GB GTX 680 cards between 5% and 10% faster at high resolutions (from 2560x1440 up to 7680x1600). Adding some more memory bandwidth on top of that would have made it the gaming card most people expected from nVidia. As it stands, the GTX 680 is good, but also very expensive (I can have the 7970 for 65€ less). The GTX 690 is a good product for people who want SLI but don't have the space, PSU, or SLI-enabled mainboard, or who want 4 GPUs.
They're being held back like the "real 680" top nVidia core, because nVidia is making more money selling the prior launches and the new 2nd tier now top dog cards. It's a conspiracy of profit and win.
Yes, because making a small number of full size Kepler cores is obviously going to make them more money than a large number of less complex Kepler cores. *rolleyes*
NVIDIA, assuming they had the ability to get them manufactured in large enough quantities, would make far more profit off a 660 or 670 than they ever would off a 680.
For instance, the entire lot of 7870s and 7850s on the egg is outsold by a single GTX 680 by EVGA - confirmed purchasers' reviews. So it appears nVidia knows what it's doing in a greatly superior manner to your tiny mind spewing whatever comes to it in a few moments of raging rebuttal whilst you try to "point out what's obvious" - yet is incorrect.
Every time my Anandtech feed updates, the first thing I'm hoping to see is reviews for the more-reasonably priced, and less power-hoggy GTX 640 (w/GDDR5) and GTX 660 Ti. If we see a review, then at least we know it'll show up at retail very soon after.
All I want for Xmas is a mid-range NVidia card with a better idle-wattage-to-maximum-performance ratio than AMD (because NVidia > AMD wrt drivers, esp. under Linux).
The GTX 680 has sold more cards by the verified reviewers at Newegg than the entire lot of the 7870s and 7850s at Newegg combined, and that's with just ONE GTX 680 sold by EVGA - check it out my friend... ROFL. A GTX 680 in one listing outsells the entire lineup of 7870 and 7850 COMBINED at Newegg - by verified owners count. HAHAHA. Yes, the supply is always "key". ROFL
I must confess that every logic I can think of says I don't need this GPU.....but.....I want it....I don't need it.....but damn it....I want it.....it's nvidia....it's aluminium....it's 4 GB VRAM....it's probably 5 times faster than what I have.......and I want to congratulate the team for the review, which I read from start to finish...but to be honest with you.....you don't need 19 pages to describe it...for me...."futureproof" says it all....
Simply put, NVIDIA has a superior software department compared with AMD. AMD is mainstream. Whenever they try to reach the high end, they fail miserably, in both the GPU and CPU camps. Driver issues with Crossfire, trifire and quadfire, with or without Eyefinity, in numerous games (with Eyefinity even more problems), etc. If they don't get their problems solved by Catalyst 12.5, buying AMD cards for high-end builds (anything multicard related) is a waste of money. And that is sad.
Yes, and the reviewer is constantly trying to catch nVidia in a big lie - and it shows - he even states how he never believed a word nVidia said about this card but had to admit it was all true. I have never, in many years, seen the same bad attitude given to amd's gpu's. The bias in the write up is so blatant every time it's amazing they still get nVidia cards for review. The reviewer is clearly so pro amd he cannot hide it.
He did say that Crossfire was so broken that he couldn't recommend it. He's been pointing out flaws in both companies along the way. I think you should dial back the bias accusations a little bit.
Well, if you want me to point out like 10 blatant direct wordings in this article, I will. I'm not the only one who sees it, by the way. You want to tell me how he avoids totally broken AMD drivers when he's posting the 7970 CF? Not like he had a choice there; your point is absolutely worthless.
Because you idiots aren't worth the time, and last review the same silverblue stalker demanded the links to prove my points and he got them, and then never replied. It's clear what providing proof does for you people; look at the sudden 100% ownership of 1920x1200 monitors... ROFL. If you want me to waste my time, show a single bit of truth-telling on my point on the first page. Let's see if you pass the test. I'll wait for your reply - you've got a week or so.
It is indeed sad. AMD comes up with really good hardware features like Eyefinity but then never polishes up the drivers properly. Looking at some of the Crossfire results is sad too: in Crysis and BF3, CF scaling is better than SLI (unsure, but I think the trifire and quadfire results for those games are even more in AMD's favour), but in Skyrim it seems that CF is totally broken.
Of course compared to Intel, AMD's drivers are near perfect but with a bit more work they could be better than Nvidia's too rather than being mostly at 95% or so.
Tellingly, JHH did once say that Nvidia were a software company, which was a strange thing for a hardware manufacturer to say. But this also seems to mean that they have forgotten the most basic, primary thing which all chip designers should know: how to design hardware which works. Yes, I'm talking about bumpgate.
See, despite all I said about AMD's drivers, I will never buy Nvidia hardware again after my personal experience of their poor QA. My 8800GT, my brother's 8800GT, this 8400M MXM I had, plus a number of laptops plus one nForce motherboard: they all had one thing in common - poorly made chips from BigGreen - and they all died way before they were obsolete.
Oh, and as pointed out in the Anand VC&G forums earlier today:
Yep, that's true. They killed cards with a driver. They should implement hardware auto shutdown, like CPUs. As for the nForce, I had one motherboard, the best nForce they made: the nForce 2 for the AMD Athlon. The rest of their mobo chipsets were bullshit, including the nForce 680.
I don't think the QA is NVIDIA's fault, but the video card manufacturers'.
No, 100% Nvidia's fault. Although maybe QA isn't the right word. I was referring to Nvidia using the wrong solder underfill for a few million chips (the exact number is unknown): they were mainly mobile parts, and Nvidia had to put $250 million aside to settle a class action.
Although that wiki article is rather lenient towards Nvidia, since that bit about fan speeds is a red herring: more accurately, it was Nvidia which spec'ed their chips to a certain temperature, and designs which run way below that will have put less stress on the solder, but to say it was poor OEM and AIB design which led to the problem is not correct. Anyway, the proper exposé was by Charlie D. in the Inquirer and later SemiAccurate.
But in fact it was a bad heatsink design - thank HP - and view the thousands of heatsink repairs, including the "add a copper penny" method to reduce the giant gap between the HS and the NV chip. Charlie was wrong, a liar, again, as usual.
Don't be silly. While HP's DV6000s were the most notorious failures, and that was due to HP's poorly designed heatsink/cooling, bumpgate also saw Dells, Apples and others:
The problem was real, continues to be real and also affects G92 desktop parts and certain nForce chipsets like the 7150.
Yes, the penny shim trick will fix it for a while but if you actually were to read up on technicians forums who fix laptops, that plus reflows are only a temporary fix because the actual chips are flawed. Re-balling with new, better solder is a better solution but not many offer those fixes since it involves 100s of tiny solder balls per chip.
Before blindly leaping to Nvidia's defence like a fanboy, please do some research!
Before blindly taking the big lie from years ago repeated above to attack Nvidia, for no reason at all other than that all you have is years-old misinformation, and then wailing on about it while telling someone else some more lies about it, check your own immense bias and lack of knowledge - since I had to point out the truth for you to find. And you forgot the DV9000, DV2000 and Dell systems with poor HS design, let alone Apple and AMD console video chip failings, and the fact that payment was made and restitution was delivered, which you also did not mention, because of your fanboy problems, obviously in AMD's favor.
I have some issues with this article, the first of course being availability. Checking the past week, I have yet to see any availability of the 680 besides $200+ over-retail premium cards on eBay. How can you justify covering, without blaring bold-print caveats, yet another paper-launch card that, for all intents and purposes, nVidia can't make for a very long time? There is a difference between ultra rare and non-existent.
Is a card or chip really the fastest if it doesn't exist to be sold?
Second, the issue of RAM: that's a problem in that most games are 32-bit, and as such they can only address 3.5GB of RAM total between system and GPU RAM. This means you can have 12GB of RAM on your video card and the best you will ever get is 3GB worth of usage.
Until games start getting written with 64 bit binaries (which won't happen until Xbox 720 since almost all PC games are console ports), anything more than 2-3GB GPU RAM is wasteful. We're still looking at 2014 until games even START using 64 bit binaries.
Want it to change? Lobby your favorite gaming company. They're all dragging their feet, they're all complicit.
While I'm afraid we're not at liberty to discuss how many 680 and 690 cards NVIDIA has shipped, we do have our ears to the ground and as a result we have a decent idea as to how many have shipped. Suffice it to say, NVIDIA is shipping a fair number of cards; this is not a paper launch otherwise we would be calling NVIDIA out on it. NVIDIA absolutely needs to improve the stock situation, but at this point this is something that's out of their hands until either demand dies down or TSMC production picks up.
The 690 is a stunning product... but I'm left wanting to see the more mainstream offerings. That's really where NVIDIA will make its money, but we're just left wondering about supply issues and the fact that AMD isn't suffering to the same degree.
A single EVGA GTX 680 SKU at Newegg has outsold the entire lineup of 7870 and 7850 cards combined, going by verified owners' reviews. So if availability is such a big deal, you had better ask yourselves why the 7870 and 7850 combined cannot keep pace with a single EVGA 680 card selling at Newegg. Go count them up - have at it - you shall see. 108 sales for the single EVGA 680, more than the combined total sales of all the 7870 and 7850 SKUs, in stock and out. So when you people complain, I check out the facts - and I find you incorrect and failing almost 100% of the time. That's what happens when one repeats talking points like a sad PR politician, instead of checking available data.
Have you considered using WinZip 16.5 with its OpenCL-accelerated file compression/decompression as a compute benchmark? File compression/decompression is a common use case for all computer users, so it could be the broadest application of GPGPU relevant to consumers, if there is an actual benefit. The OpenCL acceleration in WinZip 16.5 is developed/promoted in association with AMD, so it'll be interesting to see if it is hobbled on nVidia GPUs, as well as how well it scales with GPU power, whether it scales with SLI/dual-GPU cards, and whether there are advantages with close IGP-CPU integration as with Llano and Ivy Bridge.
I actually don't know if it's AMD only. I know AMD worked on it together with WinZip. I just assumed that since it's OpenCL, it would be vendor/platform agnostic. Given AMD's complaints about use of vendor-specific CUDA in programs, if they developed an AMD-only OpenCL application, I would find that very disappointing.
"WinZip has been working closely with Advanced Micro Devices (AMD) to bring you a major leap in file compression technology. WinZip 16.5 uses OpenCL acceleration to leverage the significant power of AMD Fusion processors and AMD Radeon graphics hardware graphics processors (GPUs). The result? Dramatically faster compression abilities for users who have these AMD products installed! "
Excuse me, but you're wrong, again. "by Ryan Smith on Thursday, May 10, 2012: According to WinZip it only supports AMD GPUs, which is why we're not using it in NVIDIA reviews at this time." Ryan's comment from the 670 release review.
I'm sure the gamer's manifesto AMD company "ownz it" now, and I'm also certain it has immediately become all of your favorite new benchmark that you cannot wait to demand be shown here 100% of the time, it's so gaming evolved.
Here's some research, Mr. know-it-all: "by Ryan Smith on Thursday, May 10, 2012: According to WinZip it only supports AMD GPUs, which is why we're not using it in NVIDIA reviews at this time." -- Congratulations on utter FAIL.
First off, thank you for this review. If you didn't do this, we'd have no idea how these GPUs perform in the wild. It is very nice to come here and read a graph and make educated decisions on which card we should purchase. It is appreciated.
The one thing that I wanted to question is why you feel that you can't recommend the 7970. At the very least perhaps the recommendation of which card to get should be based on the game you're playing.
Reviewing the data you published, the average frame rates for the 5 top performers over all benchmarks are:
Also, the number of times the 7970 alone dipped below 60 fps in the benchmarks (excluding the minimum frame rate benchmarks), without the 680 doing the same, was 4. This is over 29 benchmarks, and some of the dips were minimal.
This, aligned with the price considerations, makes me wonder why one wouldn't consider the 7970.
"The one thing that I wanted to question is why you feel that you can't recommend the 7970. At the very least perhaps the recommendation of which card to get should be based on the game you're playing."
Under normal circumstances we would do this. For example, GTX 570 vs. Radeon HD 6970 last year; the two traded blows often enough that it came down to the game being played. However, the key was that the two were always close.
In 20% of our games, 7970CF performance is nowhere close to GTX 690 because CF is broken in those games. It would be one thing if AMD's CF scaling in those games was simply weaker, but instead we have no scaling and negative scaling in games that are 5+ months old.
For single card setups AMD is still fine, but I cannot in good faith recommend CF when it's failing on major games like this. Because you never know what games in the future may end up having the same problem.
I must say I found it quite odd and hilarious to see people accusing Anandtech of favouring AMD by using a monitor with a 1200 vertical resolution. 16:10 monitors are not that uncommon and we really should be showing the industry what we think by not purchasing 16:9 monitors.
Anyway, if anything this review seems to be Nvidia-biased, in my opinion. The 7970 CF does not do too badly; in fact it beats the 690 / 680 SLI in many games and only loses out in the games where it's "broken". I am not sure why you cannot recommend it based on the numbers in your benchmarks, since it hardly embarrasses itself.
It's not "people", it's "person"... and he's only here to troll graphics card articles.
When AMD gets it right, CrossFire is absolutely blistering. Unfortunately, the sad state of affairs is that AMD isn't getting it right with a good proportion of the games in this review.
NVIDIA may not get quite as high scaling as AMD when AMD does get it right, but they're just far more consistent at providing good performance. This is the main gripe about AMD; with a few more resources devoted to the project, surely they can overcome this?
Yes, of course, call names forever, but never dispute the facts. I will agree with you though: AMD drivers suck, especially in CF, and they have sucked for a lot of games for a long, long time.
No, I said AMD's drivers have issues with Crossfire, not that they suck in general.
I've also checked three random British websites and there's no issues whatsoever in finding a 1920x1200 monitor. I also looked at NewEgg and found eight immediately. It's really not difficult to find one.
1920x1200 all of you protesteth far too much. The cat is out of the bag and you won't be putting it back in. Enjoy the bias, you obviously do, and leave me alone, stop the stalking.
I'm with ya bro. Forget these high resolution monitor nancy's who don't know what they're missing. I'm rockin' games just fine with 60+ fps on my 720p plasma tv, and that's at 600hz! Just you try to get 24xAAAA in 3D (that's 1200hz total) on that 1920x1200 monitor of yours!
On page 2 of the review - where you have all the pictures of the card - we have no real basis for figuring out the card's true size. Could you include a reference in one of those photos? Say, a ruler or a pencil or something, so we have an idea what the size of the card truly is?
Why did they go back to 256 bits when the GTX 590 had 384 bits?!?! Because they don't want to have too much of an advantage? Maybe the next GTX 790 will have 384 bits again and be better than the GTX 690... come on!!!
Wonder what the 7990 will look like next month. AMD clearly waited on purpose to see how the 690 was going to perform. They easily could have released a dual 7970 card already or at the very least sent specs to card manufacturers but they haven't.
We know they left a lot of headroom on the 7970 - some people have even suggested we'll get a 7980 at some point - wonder if now we'll get 2 x fully clocked 7970s on the same card ... will be interesting to see how they deal with that power consumption at load though.
No, unlike OWS protesters, there are some successful people in this world who get off their butts and work hard enough to be able to afford a $1,000 GPU (or in my case 2 GTX 680 $530 GPUs).
"Thus even four years since the release of the original Crysis, “but can it run Crysis?” is still an important question, and the answer when it comes to setups using a pair of high-end 28nm GPUs is “you better damn well believe it.”"
No, they actually cannot. At 1920X, even the CF 7970 or the 690 needs help from lowered settings, as in many of the games. They can't even keep up with the 1920X monitor's refresh rate, set at a low 60. Sorry, more fantasies - another one for you perhaps. :)
Hey guys! Thanks for the article, I enjoyed the read (although I am not in the market for dual GPU configurations after trying the HD3870X2 and 2*8800GTS, happy with one 7970 OC'ed to the max.). But you seem to be missing the numbers for noise from the HD7970 in a CF configuration. I hope you can post them! :D -DA
This was mentioned on the Test page, but we don't have a matching pair of 7970 cards; what we have is a reference card and an XFX 7970 BEDD. Power and temperature are the same regardless, but it would be improper to list noise because of the completely different acoustic properties of the BEDD.
How does that stack up, especially price/performance? Why didn't your conclusion address that question at all? Totally limits the usefulness of the review in my opinion.
As per the data in your article, the GTX690 is clocked 10% below the GTX680, and has a 5% lower boost clock. This may be a small compromise, but it is a compromise nonetheless.
Seems to me they should be saving money in the construction when compared to two 680s in SLI. Half the fans, half the connectors, half the circuit boards. They should have at least cut $50 off the suggested retail price.
Also, when will we see 3 of these running in SLI form?
You won't - there's only one SLI connector.
Not that impressed; I'll be holding on to my overclocked 1.5 GB TRI-SLI GTX 480 Hydro Coppers for the foreseeable future. This card should at least double the RAM it has now...
Half the magnesium, half the aluminum, half the PLX chips, half the R&D, half the vapor chambers, half the chip binning, half the power circuits, half the copper PCBs.... oh no wait, all those are added expenses, not reductions... I guess they should be charging $200 over the 2x$499 usual price. See how actually using the facts, instead of sourpuss emotion, delivers a different picture?
These cards are sold out on Newegg for $1200 per. Talk about taking advantage with a 20% markup over the MSRP; hopefully AMD knocks the prices way down when they bring out their 7990 - $800 sounds about right.
Now I need a new keyboard, because I was drooling into mine as I read this review. I have a GTX 680, but I don't like to run SLI setups - I had a bad experience with my dual 560 Tis. This looks like a truly awesome card that would hold its value for resale later. Nevertheless, there is no way I'm spending a grand on a video card.
Not precisely. That $350 performance point? It used to be a $200 performance point. Similarly, that $350 point will turn into a $400 performance point. So, assuming I maintain the price tier, graphics returns for my dollar are gradually tapering off. I look at the performance I was getting out of my 7800 GT at 1280x1024, and it wasn't worth upgrading to a newer card, period, because of Windows XP, my single-core CPU, and the fact that I was already maxing out every game I had and still getting decent frame rates. I think the key factor is that I do not care if I dip below 60 frames, as long as I'm above 30 and getting reasonable frame times.
I also know that consoles extend the life of PC hardware. The 7800GT is a 20-pipe version of the GTX, which is in turn the GPU found in the PS3. Devs have gotten much better at optimization in titles that matter to me.
You spend well over $1,600 on a decent system. It makes no sense to spend all that money, then buy monitors the cards in question cannot successfully drive on the 3-year-old Crysis game, let alone well over half the benchmarks in this article, without turning DOWN the settings. You cannot turn up DX11 tessellation; keep it on medium. You cannot turn up MSAA past 4X, and you'd better keep it at 2X. You had better turn down your visual distance in game. That, in fact, with all the moaning about "console ports" "holding us back". I get it; the obvious problem is none of you seem to, because you want to moan and pretend spending $1,000.00 or more on a monitor alone is "how it's done", while you whine that you cannot even afford $500 for a single video card. These cards successfully drive 1920x1080 monitors in the benchmarks, but just barely - and if you turn the eye candy up, they cannot do it.
Thanks for telling everyone how correct I am by doing a pure 100% troll attack after you and yours could not avoid the facts. Your mommy, if you knew who she was, must be very disappointed.
You can use cards two generations back for that, but like these cards, you will be turning down most if not all of the eye candy, and be stuck tweaking and clocking, and jittering and wishing you had more power. These cards cannot handle 1920X in current "console port" games unless you turn them down, and that goes ESPECIALLY for the AMD cards, which suck at extreme tessellation and have more issues with anything above 4xAA, and often at 4xAA. The 5770 is an Eyefinity card and runs 5760x1200 too. I guess none of you will ever know until you try it, and it appears none of you have spent the money and become disappointed turning down the eye candy settings - so blabbering about resolutions is all you have left.
After checking Newegg it would seem that, unfortunately for Nvidia, this will be another piece of vaporware. Perhaps they should scale the Keplers to 22nm and contract Intel to fab them, since TSMC has major issues with 28nm. Just a thought.
I guess I should retract my comments about TSMC, as other customers are not experiencing supply issues with 28nm parts. Apparently the issues are with Nvidia's design, which may require another redo. I'm guessing AMD will be out with their 8000 series before Nvidia gets their act together. Sad, because I have used several generations of Nvidia cards and was always happy with them.
The GTX680 by EVGA in a single sku outsells the combined total sales of the 7870 and 7850 at newegg. nVidia "vaporware" sells more units than the proclaimed "best deal" 7000 series amd cards. ROFL Thanks for not noticing.
Compute performance in this case may have to do with 2 things:
- the amount of memory available for the threaded computational algorithm being run, and
- the memory I/O throughput capability.
From the rumor mill, the next NVidia chip may contain 4 GB per chip and a 512-bit bus (which is 2x wider than the GK104's).
If you can't feed the beast as fast as it can eat it, then adding more cores won't increase your overall performance.
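(A rough sketch of that "feeding the beast" arithmetic: peak GDDR5 bandwidth scales with bus width times effective data rate. The 256-bit / 6.008 GT/s figures below are GK104's published specs; the 512-bit case is the rumored part mentioned above and is purely hypothetical:)

    # Peak memory bandwidth in GB/s: bus width in bits / 8 gives bytes per transfer,
    # multiplied by transfers per second (GT/s) gives GB/s.
    def peak_bandwidth_gbs(bus_width_bits, data_rate_gtps):
        return bus_width_bits / 8 * data_rate_gtps

    print(peak_bandwidth_gbs(256, 6.008))  # GK104 (GTX 680/690, per GPU): ~192 GB/s
    print(peak_bandwidth_gbs(512, 6.008))  # hypothetical 512-bit part at the same data rate: ~384 GB/s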
I am a new reader and equally new to the subject matter, so sorry if this is a dumb question. The second page mentioned that NVIDIA will be limiting its partners' branding of the cards, and that the first generation of GTX 690 cards are reference boards. Does NVIDIA just make a reference design that other companies use to make their own graphics cards? If not, then why would anyone but NVIDIA have any branding on the cards?
Anyone who sides with AMD or NVIDIA is a retard - side with yourself as a consumer - buy the best card that is available at the time AND right for your NEEDS.
Fact is, the 690 is trash regardless of whether you are comparing it to an NVIDIA card or an AMD card - if I'm buying a card like a 690, why the FUCK would I want anything below 1200P? Even if it is uncommon, it's a mfing trash of a $1000 card considering:
that SLI and CF both beat (or equal) the 690 at higher resolutions and cost less (by $1 for NVIDIA - but still, like, srsly wtf NVIDIA!? - and $40 for AMD) ... WHAT!?
Furthermore, you guys fighting over bias when the WHOLE mfing GFX community (companies, software developers) is built on bias is utterly ridiculous. GFX vendors (AMD and NVIDIA) have skewed results for games for the last decade+, and software vendors too - there need to be laws against specifically building software for a particular graphics card, in addition to making the software work worse on the other (this applies to both companies).
Hell, workstation graphics cards are a very good example of how the industry likes to screw over consumers (if you ever BIOS-modded - not just soft-modded - a normal consumer card to a workstation card, you would know all that extra charge (up to 70% extra for the same processor) for a workstation card is BS, and if the government cleaned up their shitty policies, we the consumers would be better for it).
I know this is a really old review, and everyone has long since stopped the discussion - but I just couldn't resist posting something after reading through all the comments. Understand, I mean no disrespect to anyone at all by saying this, but it really does seem like a lot of people haven't actually used these cards first hand.
I see all this discussion of nVidia surround type setups with massive resolutions and it makes me laugh a little. The 690 is obviously an amazing graphics card. I don't have one, but I do use 2x680 in SLI and have for some time now.
As a general rule, these cards have nowhere near the processing power necessary to run those gigantic screen resolutions with all the settings cranked up to maximum detail, 8xAA, 16xAF, tessellation, etc....
In fact, my 680 SLI setup can easily be running as low as 35 fps in a game like Metro 2033 with every setting turned up to max - and that is at 1920x1080.
So, for all those people that think buying a $1000 graphics card means you'll be playing every game out there with every setting turned up to max across three 1920x1200 displays - I promise you, you will not - at least not at a playable frame rate.
To do that, you'll be realistically looking at 2x$1000 graphics cards, a ridiculous power supply, and by the way you better make sure you have the processing power to push those cards. Your run of the mill i5 gaming rig isn't gonna cut it.
More than a year since it was announced. I hope new products will be better. My suggestions: 1. Add HDMI; it is standard. 2. Consider allowing us to add memory/SSD for better/faster performance, especially for rendering 3D animation and other work.
tviceman - Thursday, May 3, 2012 - link
Last page: Based on our benchmarks we’re looking at 95% of the performance of the GTX "580 SLI" - 580 SLI should read 680 SLI.UltraTech79 - Thursday, May 3, 2012 - link
I wish there was a separate button to point out this sort of thing so they could silently correct it. Dont get me wrong, I think its good to have accurate information, just clutters things up a bit.Ryan Smith - Thursday, May 3, 2012 - link
My inbox is always open.=)mediaconvert - Friday, May 4, 2012 - link
p 2While the basic design of the GTX 690 resembles the GTX 590, NVIDIA has replaced virtually every bit "with plastic with metal" for aesthetic/perceptual purposes.
surely "with plastic with metal" to "of plastic with metal"
still a good review
rockqc - Thursday, May 3, 2012 - link
1st line on page 2 "Much like the GTX 680 launch and the GTX 590 before it, the first generation the first generation of GTX 690 cards are reference boards being built by NVIDIA"First generation has been written twice.
Torrijos - Thursday, May 3, 2012 - link
The first benchmark plotted (Crysis) has a resolution of 5760 x 1200, this has to be wrong!tipoo - Thursday, May 3, 2012 - link
It's crazy but right. He tested that resolution on multiple games.CeriseCogburn - Thursday, May 3, 2012 - link
If you look at accumulated benchmarks across the web, the 680 Nvidia cards beat the 7970 amd cards by a much higher percentage in 1920x1080 (17.61% ahead) than they do in 1920x1200 (10.14% ahead).This means anand reviews always tests in 1920x1200 to give the amd cards a prettier looking review, instead of testing in 1920x1080 (the most commonly available resolution at 1920x that they could easily set their 1920x1200 monitors to).
Hence their tests here at anand are likely also amd favorably biased in higher resolutions.
http://translate.google.pl/translate?hl=pl&sl=...
A5 - Thursday, May 3, 2012 - link
It's not like 19x12 is an uncommon or unavailable resolution. Maybe Nvidia should improve their 19x12 performance?crimson117 - Thursday, May 3, 2012 - link
Sadly, it is a very uncommon resolution for new monitors. Almost every 22-24" monitor your buy today is 1080p instead of 1200p. :(JPForums - Thursday, May 3, 2012 - link
Not mine. I'm running a 1920x1200 IPS.
1920x1200 is more common in the higher end monitor market.
A quick glance at newegg shows 16 1920x1200 models with at 24" alone. (starting at $230)
Besides, I can't imagine many buy a $1000 dollar video card and pair it with a single $200 display.
It makes more sense to me to check 1920x1200 performance than 1920x1080 for several reasons:
1) 1920x1200 splits the difference between 16x10 and 25x14 or 25x16 better than 1920x1080.
1680x1050 = ~1.7MP
1920x1080=~2MP
1920x1200=~2.3MP
2560*1440=~3.7MP
2560x1600=~4MP
2) People willing to spend $1000 for a video card are generally in a better position to get a nicer monitor. 1920x1200 monitors are more common at higher prices.
3) They already have three of them around to run 5760x1200. Why go get another monitor?
Opinionated Side Points:
Movies transitioned to resolutions much wider than 1080P long ago. A little extra black space really makes no difference.
1920x1200 is a perfectly valid resolution. If Nvidia is having trouble with it, I want to know. When particular resolutions don't scale properly, it is probable that there is either a bug or shenanigans are at work in the more common resolutions.
I prefer using 1920x1200 as a starting point for moving to triple screen setups. I already thing 1920x1080 looks squashed, so 5760x1080 looks downright flattened. Also 3240x1920 just doesn't look very surround to me (3600x1920 seems borderline surround).
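For anyone who wants to sanity-check the megapixel figures above, a quick Python sketch of the arithmetic (the resolution list is just the one from this comment, and the rounding matches it):

```python
# Pixel counts for the resolutions discussed in this thread.
resolutions = {
    "1680x1050": (1680, 1050),
    "1920x1080": (1920, 1080),
    "1920x1200": (1920, 1200),
    "2560x1440": (2560, 1440),
    "2560x1600": (2560, 1600),
}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h / 1e6:.2f} MP")

# The "11% more pixels" figure that comes up repeatedly below:
extra = (1920 * 1200) / (1920 * 1080) - 1
print(f"1920x1200 has {extra:.1%} more pixels than 1920x1080")
```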
CeriseCogburn - Saturday, May 5, 2012 - link
There are only 18 models available in all of newegg with 1920x1200 resolution - only 6 of those are under $400, they are all over $300.+
There are 242 models available in 1920x1080, with nearly 150 models under $300.
You people are literally a bad joke when it comes to even a tiny shred of honesty.
Lerianis - Sunday, May 6, 2012 - link
I don't know about the 'sadly' there in all honesty. I personally like 1920*1080 better than *1200, because nearly everything is done in the former resolution.Stuka87 - Thursday, May 3, 2012 - link
Who buys a GTX690 to play on a 1080P display? Even a 680 is overkill for 1080. You can save a lot of money with a 7870 and still run everything out there.vladanandtechy - Thursday, May 3, 2012 - link
Stuka i agree with you.....but when you buy such a card....you think in the future....5 maybe 6 years....and i can't gurantee that we will do gaming in 1080p then:)....retrospooty - Thursday, May 3, 2012 - link
"Stuka i agree with you.....but when you buy such a card....you think in the future....5 maybe 6 years....and i can't gurantee that we will do gaming in 1080p then:)...."I have to totally disagree with that. Anyone that pays $500+ for a video card is a certain "type" of buyer. That type of buyer will NEVER wait 5-6 years for an upgrade. That guy is getting the latest and greatest of every other generation, if not every generation of cards.
vladanandtechy - Thursday, May 3, 2012 - link
You shouldn't "totally disagree".......meet me...."the exception"....i am the type of buyer who is looking for the "long run"....but i must confess....if i could....i would be the type of buyer you describe....cyaorionismud - Thursday, May 3, 2012 - link
retrospooty and I mean you no disrespect, but if you're spending $500 and buying for the "long run," you're doing it wrong.If you had spent $250, you could have 80% of the performance for 2.5 years, then spend another $250 and have 200% of the performance for the remaining 2.5 years.
von Krupp - Thursday, May 3, 2012 - link
Don't say that.I bought two (2) HD 7970s on the premise that I'm not going to upgrade them for a good long while. At least four years, probably closer to six. I ran from 2005 to 2012 with a GeForce 7800GT just fine and my single core AMD CPU was actually the larger reason why I needed to move on.
Now granted, I also purchased a snazzy U2711 just so the power of these cards wouldn't go to waste (though I'm quite CPU-bound by this i7-3820), but I don't consider dropping AA in future titles to maintain performance to be that big of a loss; I already only run with 8x AF because , frankly, I'm too busy killing things to notice otherwise. I intend to drive this rig for the same mileage. It costs less for me to buy the best of the best at the time of purchase for $1000 and play it into the ground than it is to keep buying $350 cards to barely keep up every two years, all over a seven year duration. Since I now have this fancy 2560x1440 resolution and want to use it, the $250-$300 offerings don't cut it. And the, don't forget to adjust for inflation year over year.
So yes, I'm going to be waiting between 4 and 6 years to upgrade. Under certain conditions, buying the really expensive stuff is as much of an economical move as it is a power grab. Not all of us who build $3000 computers do it on a regular basis.
P.S. Thank you consoles for extending PC hardware life cycles. Makes it easier to make purchases.
Makaveli - Thursday, May 3, 2012 - link
lol agree, let's put a $500 videocard with a $200 TN panel at 1920x1080... umm, ya, no!
CeriseCogburn - Thursday, May 3, 2012 - link
Keep laughing, this card cannot hold a solid v-sync 60 at that "tiny panel" with only 4xAA in the amd fans' revived favorite game, Crysis. Can't do it at 1920X, guy.
I guess you guys all like turning down your tiny cheap cards settings all the time, even with your cheapo panels?
I mean this one can't even keep up at 1920X, gotta turn down the in game settings, keep the CP tweaked and eased off, etc.
What's wrong with you guys ?
What don't you get ?
nathanddrews - Thursday, May 3, 2012 - link
Currently the only native 120Hz displays (true 120Hz input, not 60Hz frame doubling) are 1920x1080. If you want VSYNC @ 120Hz, then you need to be able to hit at least 120fps @ 1080p. Even the GTX690 fails to do that at maximum quality settings on some games...
CeriseCogburn - Thursday, May 3, 2012 - link
It can't do 60 v-sync at 1920 in crysis, and that's only on 4xaa. These people don't own a single high end card, that's for sure, or something is wrong with their brains.
nathanddrews - Thursday, May 3, 2012 - link
You must be talking about minimum fps, because on Page 5 the GTX690 is clearly averaging 85fps @1080p.Tom's Hardware (love 'em or hate 'em) has benchmarks with AA enabled and disabled. Maximum quality with AA disabled seems to be the best way to get 120fps in nearly every game @ 1080p with this card.
CeriseCogburn - Friday, May 4, 2012 - link
You must be ignoring v-sync and stutter with frames that drop below 60, and forget 120 frames a sec.Just turn down the eye candy... on the 3 year old console ports, that are "holding us back"... at 1920X resolutions.
Those are the facts, combined with the moaning about ported console games.
Ignore those facts and you can rant and wide eye spew like others - now not only is there enough money for $500 card(s)/$1000dual, there's extra money for high end monitors when the current 1920X pukes out even the 690 and CF 7970 - on the old console port games.
Whatever, everyone can continue to bloviate that these cards destroy 1920X, until they look at the held back settings benches and actually engage their brains for once.
hechacker1 - Thursday, May 3, 2012 - link
Well not if you want to do consistent 120FPS gaming. Then you need all the horsepower you can get. Hell, my 6970 struggles to maintain 120FPS, and thus makes the game choppy, even though it's only dipping to 80fps or so.
So now that I have a 120Hz monitor, it's incredibly easy to see stutters in game performance.
Time for an upgrade (1080p btw).
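Since the 60Hz/120Hz v-sync argument keeps coming up, here is a small sketch of the frame-time budgets involved; the 80fps figure is the one mentioned in the comment above, and the rest is just arithmetic:

```python
# To hold v-sync at a given refresh rate, every frame must finish within one refresh interval.
def frame_budget_ms(refresh_hz: float) -> float:
    return 1000.0 / refresh_hz

for hz in (60, 120):
    print(f"{hz} Hz v-sync budget: {frame_budget_ms(hz):.1f} ms per frame")

# An "80 fps" frame takes 12.5 ms, which misses the 8.3 ms budget at 120 Hz,
# so with v-sync it is displayed for two refresh intervals - that is why dips
# below the refresh rate read as stutter even when the average fps looks high.
slow_frame_ms = 1000.0 / 80
intervals = -(-slow_frame_ms // frame_budget_ms(120))  # ceiling division
print(f"An 80 fps frame spans {int(intervals)} refresh intervals at 120 Hz")
```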
Sabresiberian - Thursday, May 3, 2012 - link
Actually, they use the 5760x1200 because most of us Anandtech readers prefer the 1920x1200 monitors, not because they are trying to play favorites.
CeriseCogburn - Thursday, May 3, 2012 - link
Those monitors are very rare. Of course none of you have even one.
Traciatim - Thursday, May 3, 2012 - link
My monitor runs 1920x1200, and I specifically went out of my way to get 16:10 instead of 16:9. You fail.
CeriseCogburn - Friday, May 4, 2012 - link
Yes, you went out of your way - why did you have to, if they are so common? I'm sure you did. In any case, since they are so rare, the bias is still present here as shown.
james.jwb - Thursday, May 3, 2012 - link
You are correct, I don't own one... I own three in triple screen. Dell U2412m's.I really am at a loss as to what you are on about. It is well known that 16:10 is preferred amongst enthusiasts and professionals for a few reasons. If you want 16:9, fine, go for it, but don't act like it's weird that AT are benching with 16:10 just because you went with cheap ass 16:9 screens.
CeriseCogburn - Friday, May 4, 2012 - link
Yes of course you are at a loss, you don't understand a word, so why reply? You're all at a loss.
ROFL
yelnatsch517 - Friday, May 4, 2012 - link
Are you being sarcastic or an idiot? From my experience 1920x1200 24" monitors are the MAJORITY. My work has roughly 50 24" monitors, all in that resolution. My HP ZR24W is 1920x1200 as well. The only 1080p 24" monitor that I have even seen is the TN panel that came with an HP computer.
If you are talking about computer monitors, 1920x1200 is the dominant resolution. If you are talking about TVs, then obviously 1080p is the norm.
CeriseCogburn - Saturday, May 5, 2012 - link
There are 242 - count them, well over 200, nearly 250 1920X1080 monitors at the egg.
In your great experience, there are 16 that fit your 1920X1200 dreampipe FUD scenario at the egg, with most of them, well over half, over $400 each, while the 242 common monitors you all buy as you pinch every penny and whine about $10 difference in videocard prices are well under $200 each a lot of the time.
So now suddenly, you all spend way over $300 to plus $400 for 11% more pixels... ROFL HAHAHHAHHA instead of $150 or $200...
I guess that's why this place is so biased, the little bloggers are just as whacked when it comes to being honest.
InsaneScientist - Saturday, May 5, 2012 - link
Good grief... resorting to personal attacks isn't exactly a good way to get people to listen to you.I'm not going to argue that 1080p isn't more common (from what I've read, no one is), because it is more common, you are quite correct there, however I must point out that your logic to arrive at that conclusion is faulty:
You're contending that 1080p is more common (it is) because there are more models available on Newegg, but just knowing how many models are available doesn't tell us how many units those move.
If, for example, each of those 22 models of 1920x1200 monitors moves 10 times as much stock as each of the 1920x1080, nearly as many of the 1920x1200 will have been sold as the 1920x1080 ones.
Now, I don't think that's likely, and I do agree with you that 1080p is more common nowadays (see next point), but your argument is invalid, even though you have come to the correct conclusion.
Consider this: there are currently two models of iPhone available, compared to dozens of Android phones. By the same logic as you're using, I could say that the iPhone is incredibly rare - I'd get laughed out of town if I tried to make that argument.
The second point is that 1920x1200 hasn't been nearly as rare in the past as it is today. When I bought my previous monitor and my laptop (both 1920x1200), 1080p monitors were almost unheard of. Since monitors tend to last a while, it's not at all unreasonable for a disproportionate amount of people to be using them compared to their current sales.
Thirdly, there is a point of diminishing returns. Notice the complete lack of any benchmarks at or below 1680x1050? These cards are so fast that comparisons at those resolutions are pointless - they're all fast enough for anything you could do to them at that screen res - even Crysis. 1920x1080 almost falls into that category, heck, even 1920x1200 almost falls into that category. Benchmarks are only about who wins if there is some advantage to winning. Below 2560x1600, which card you're using is almost completely irrelevant, so why does it even matter whether they used 1920x1080 or 1920x1200?
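To put the earlier point about listing counts versus units sold in concrete terms, here is a toy example; every per-model sales figure in it is made up purely for illustration:

```python
# Number of distinct models on a storefront says nothing by itself about units sold.
models_1200p, units_per_model_1200p = 16, 1000   # few models, each assumed to sell well
models_1080p, units_per_model_1080p = 242, 80    # many models, each assumed to sell modestly

total_1200p = models_1200p * units_per_model_1200p
total_1080p = models_1080p * units_per_model_1080p
print(total_1200p, total_1080p)  # 16000 vs 19360 - comparable totals despite ~15x fewer models
```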
CeriseCogburn - Tuesday, May 8, 2012 - link
Blah blah blah blah and I'm still 100% correct and you are not at all.
Decembermouse - Tuesday, May 8, 2012 - link
You're quite a character.anirudhs - Thursday, May 3, 2012 - link
I use 2 at work - HP ZR24W.piroroadkill - Sunday, May 6, 2012 - link
Hm, odd. Not only do I have a 1920x1200 monitor on my desktop, I have TWO laptops with 1920x1200 screens. Using one right now.
Yes, they're rarer than 1080p screens, but this is a site for enthusiasts, therefore, it is more likely.
Ryan Smith - Thursday, May 3, 2012 - link
The truth is a bit more simple than that. 5760x1200 is because our choice in monitors for multi-monitor testing was based on my personal monitor, which is another PA246Q. NVIDIA's limited display flexibility (same res + same sync) meant that it was easiest to just pair the PA246Q with some more PA246Qs. Consequently it's easier to just test these monitors at their native resolution when we're using NVIDIA cards.
Makaveli - Thursday, May 3, 2012 - link
Some of us don't buy 16:9 monitors or TN panels! I want results at 1920x1200 and other 16:10 resolutions. You can shut up with your amd bias, which you have no proof of other than your flawed logic.
CeriseCogburn - Thursday, May 3, 2012 - link
Then you don't buy much. 1920x1200 is a very rare monitor.
Parhel - Thursday, May 3, 2012 - link
1920x1200 was very common for several years. Until a few years ago, they were much more common than 1920x1080. I even have an old laptop that's 1920x1200. Looking at what's available to buy new, today, doesn't tell the whole story. Because people don't replace their monitors every day.Anandtech has always recommended spending up and getting a quality monitor. You see it in nearly every review. So, I think the readers here are more likely than the average guy on the street to own less common screens. I've had the same 2560x1600 monitor through 3 computers now, and I spent more on it than I've ever spent on any computer.
CeriseCogburn - Saturday, May 5, 2012 - link
Yes, you're all super premium monitor buyers, and moments ago you were hollering the videocards are way too expensive and you cannot possibly afford them unless you are an idiot with too much money.I love this place, the people are so wonderfully honest.
Makaveli - Thursday, May 3, 2012 - link
1920x1200 is only rare now. I've gone thru enough monitors to know what I like, and cheap 16:9 TN panels are not it; if that's good enough for you, then enjoy. As for your other comment about v-sync and 4xAA: guess what, some of us don't care to have 8xAA and 16xAF running all the time.
I would rather play at 1200p at high settings with AA and AF off if it means playable fps and an enjoyable experience. This isn't [H]; I'm not gonna spend $1000 on a GPU so I can meet your approved settings for playing games, dude. Get a clue!
CeriseCogburn - Saturday, May 5, 2012 - link
But you'll spend well over $400 for 11% more monitor pixels because "you'd rather".. "all of a sudden".LOL
Way to go, thanks for helping me.
anirudhs - Thursday, May 3, 2012 - link
No... I couldn't afford one, but I very much wanted to buy one. It is much prettier than 16:9 for workstation purposes. New ones are being released all the time. You just have to pay more, but it's worth it.
CeriseCogburn - Saturday, May 5, 2012 - link
Oh, so someone who almost wants to be honest. So isn't it absolutely true a $500 videocard is much easier to buy when your monitor doesn't cost half that much, let alone twice that much or $2,000 plus?
You don't need to answer. We all know the truth.
Everyone in this thread would take a single videocard 680 or 7970 and a 1080P panel for under $200 before they'd buy a $450 1200P monitor and forfeit the 680 or 7970 for a $200 videocard instead.
It's absolutely clear, no matter the protestations.
In fact if they did otherwise, they would be so dumb, they would fit right in. Oh look at that, why maybe they are that foolish.
InsaneScientist - Saturday, May 5, 2012 - link
Oh? A little over a year ago, I had some money for an upgrade and I wanted to upgrade either my monitor or my video card.Now, I have (and play) Crysis, which can only now, just barely, be handled by a single card, so obviously I could have used the GPU upgrade (still can, for that matter). I also had a decent (though not great) 22" 1920x1200 monitor.
However, despite that, I chose to buy a new monitor, and bought a used 3008WFP (30" 2560x1600). I have not regretted that decision one bit, and that was a lot more money than your $200-300 upsell for 1920x1200
Now, admittedly, there were other factors that were a consideration, but even without those, I would have made the same decision. Putting money into a good monitor which I'll use ALL the time I'm on the computer vs. putting money into a good video card that I'll use some of the time is a no-brainer for me.
If all of my electronics were taken and I were starting from scratch, I'd get another 2560x1600 monitor before I even bought a video card. I'd suffer through the integrated IGP as long as I needed.
Now, that's my choice, and everyone's needs are different, so I wouldn't demand that you make the same decision I did, but, by the same token, you shouldn't be expecting everyone to be following the same needs that you have. ;)
CeriseCogburn - Sunday, May 6, 2012 - link
You've jumped from 1920 to 2560 so who cares, not even close. In your case you got no video card. ROFL - further proving my point, and disproving everyone else's who screamed if you get this card you have another two grand for monitors as well - which everyone here knows isn't true.
I never demanded anyone follow any needs, let alone mine which are unknown to you despite your imaginary lifestyle readings, and obverse to the sudden flooding of monitor fanboys and the accompanying lies.
bobsmith1492 - Thursday, May 3, 2012 - link
It's not that rare; I got a fairly inexpensive 24" 1920x1200 HP monitor from Newegg a year ago. There weren't many options but it was there and it's great.
a5cent - Thursday, May 3, 2012 - link
You are right that the average Joe doesn't have a 1920x1200 monitor, but this is an enthusiast web-site! Not a single enthusiast I know owns a 1080 display. 1920x1200 monitors aren't hard to find, but you will need to spend a tad more.
CeriseCogburn - Saturday, May 5, 2012 - link
Nope, 242 vs 16 is availability, you lose miserably. You all didn't suddenly have one along with your "friends" you suddenly acquired and have memorized their monitor sizes instantly as well.ROFL - the lies are innumerable at this point.
UltraTech79 - Thursday, May 3, 2012 - link
They make up about 10% of stock. I wouldn't call that very rare. Newegg and other places have a couple dozen+ to choose from. Maybe YOU don't buy very much.
CeriseCogburn - Tuesday, May 8, 2012 - link
Closer to 5% than it is to 10%, and they cost a lot more for all the moaning penny pinchers who've suddenly become flush.
Digimonkey - Thursday, May 3, 2012 - link
It's either 1920x1200 @ 60Hz, or 1920x1080 @ 120Hz. I prefer smoother gameplay over 120 pixels. Also I know quite a few gamers that like using their TV for their PC gaming, so this would also be limited to 1080p.
CeriseCogburn - Friday, May 4, 2012 - link
No one here is limited, they all said, so no one uses their big screens, they all want it @ 1200P now because amd loses not so badly there...ROFL
Dracusis - Thursday, May 3, 2012 - link
I'd be more worried about AMD's performance going down in certain games due to Crossfire than something as trivial as this. As a 4870X2 owner I can tell you this is not at all uncommon for AMD. I still have to disable 1 GPU in most games, including BF3, because AMD's drivers for any card more than 12 months old are just terrible. As you can see, even the 6990 is being beaten by a 6970 in games as modern as Skyrim - their drivers are just full of fail.
Galidou - Thursday, May 3, 2012 - link
A much higher percentage?!? That's 7% more... nothing extraordinary... Let's just say a higher percentage; when you say much, it makes us believe Nvidia's paying you.
CeriseCogburn - Saturday, May 5, 2012 - link
10% you might be able to ignore, 17% you cannot. It's much higher, it changes several of the games here as to who wins in the article in the accumulated benches.It's a big difference.
InsaneScientist - Sunday, May 6, 2012 - link
Except that nVidia wins in the article and all of the accumulated benches here, even at 1920x1200 (which this card would be a complete waste on...), so what exactly are you complaining about?It's bias if they say that the AMD cards are better when they're not, but in the benchmarks and in the conclusions (here and elsewhere), nVidia is consistently ahead, so any claims of bias are completely groundless...
CeriseCogburn - Tuesday, May 8, 2012 - link
Read my first post instead of asking, or, having already read it, attack like you just did and continue to be a jerk - who cares, right? You obviously are all seriously upset about the little facts I gave in my very first post. You're all going bonkers over it, and you all know I'm correct; that's what really bothers all of you.
Continue to be bothered, you cannot help it, that's for sure.
Sabresiberian - Thursday, May 3, 2012 - link
It's certainly not crazy, I'd certainly run 3 1920x1200 monitors over 3 1920x1080s.;)
CeriseCogburn - Thursday, May 3, 2012 - link
I guess all of you got very upset that my point was made: you're looking at an amd-biased set of benchmarks. I'm sure most of you are very happy about that, but angered it has been effectively pointed out.
Makaveli - Thursday, May 3, 2012 - link
The only thing we are upset about is your being a tool! And what point? You haven't shown a shred of evidence to back up this bias claim, only what's floating around in your head!
CeriseCogburn - Sunday, May 6, 2012 - link
Go look at the link you missed, since you cannot read and only attack and call names.
james.jwb - Thursday, May 3, 2012 - link
I always love these guys who behave like this.On the one hand, if they are trolling just for the reaction, it's fascinating. What kind of weird creature lies behind the internet persona? In most cases, we all know it must be a sad figure of a person with all sorts of interesting personality problems.
But on the flip side, if this person actually means and believes what they say is some sort of honest analysis, it's just as fascinating. What kind of thick bastard must then lurk behind the keyboard in question?
It boggles the mind :)
silverblue - Thursday, May 3, 2012 - link
Reminds me of SiliconDoc. That particular numpty got banned as far as I remember.
Galidou - Thursday, May 3, 2012 - link
I think that these creatures are Nvidia's fanboy, they react always the same way. CeriseCogburn remember me one of them a little while ago, can't remember his name. He was convinced that the 7970 pricing was the worst thing to ever happen to humanity since the birth of Justin Bieber, or at least, it looked alot like that. Sure the price wasn't attractive, but there's some limit you must not cross to stay in the real world.So as weird a creature they can be, I believe they are a result of Nvidia's funding them to spread insanity in forums speaking of video cards. They can't be natural things after all, they just don't make sense. Their closed mind is second to none. Or else, they could only have the possibility to type insanities and a filter to read the replies to stop some information entering their brain.
Parhel - Friday, May 4, 2012 - link
Do you really think ATI and nVidia would pay these weird, sad, little trolls to piss off readers every time one of their products is reviewed? It's an embarassment and a distraction. No, I think they would pay someone like that NOT to talk about their products if they could. I'm sure that employees do write comments on product reviews, but guys like this are bad for business. Nobody wants someone like that on their side. If I were nVidia, I'd pay that guy to become an AMD fan!!!CeriseCogburn - Saturday, May 5, 2012 - link
I'm certain they would pay none of you since not a single one can be honest nor has a single argument to counter my points.You're all down to name calling trolls - and you all have to face the facts now, that your clueless ignorance left out of your minds for some time.
Have fun buying your cheap 1080P panels and slow and cheapo amd cards - LOL
Oh sorry, you all now buy premium flat panels...
CeriseCogburn - Sunday, May 6, 2012 - link
No, actually I expected a lot more from the people here. I expected a big thank you, or a "thanks for the information, we'll keep that in mind; it helps our purchasing decisions."
Instead we got a flood of raging new monitor owners and haters and name callers.
Next time just thanking me for providing very pertinent information would be the right thing to do, but at this point I don't expect any of you to ever do the right thing.
UltraTech79 - Thursday, May 3, 2012 - link
Never seen a triple screen setup before?tipoo - Thursday, May 3, 2012 - link
I'm curious why the 680 and 690 trail AMD cards in Crysis and Metro, seeing as those seem to be the most GPU intensive games, while they win in most other tests. Would it be shading performance or something else?My mind is pretty blow that we have cards that can run Crysis and Metro at 5760x1200 at very comfortable framerates now, that's insane. But barring that resolution or 2560 for some games, I'm sure most of us don't see appeal here, it will be sold in a very very small niche. For normal monitor resolutions, I doubt games in large quantities will get much more demanding until we have new consoles out.
CeriseCogburn - Thursday, May 3, 2012 - link
Oh, wow, they also are so biased toward amd they removed the actual most demanding game, Shogun 2: Total War, because I kept pointing out how the Nvidia 680's swept that game across the board - so now it's gone! ROFL
(Before you attack me, I note the anand reviewer stated S2TW is the most demanding; it's right in the reviews here - but not this one.)
Ryan Smith - Thursday, May 3, 2012 - link
Um, it's there. Page 8.
Sabresiberian - Thursday, May 3, 2012 - link
LOL. Cerise, epic fail!
;)
CeriseCogburn - Thursday, May 3, 2012 - link
Oh I see, it was added because the patch broke the Nvidia cards - but in amd's favor again, the tester kept the breaking patch in, instead of providing results. Wow, more amd bias.
Glad my epic fails are so productive. :-)
U still mad ? Or madder and raging out of control?
silverblue - Thursday, May 3, 2012 - link
So, if they failed to add it, it'd have been AMD bias, but considering they DID add it... it's AMD bias. And you're the one talking about rage, trollboi?
Had you just merely mentioned that the patch doesn't provide favourable results for NVIDIA cards, Ryan might have been tempted to reinstall the game and retest. Well, he might have - can't speak for the guy. Doubt he will now, though.
tipoo - Thursday, May 3, 2012 - link
So back on non-trolling topic...?
CeriseCogburn - Sunday, May 6, 2012 - link
It's a very pertinent topic because despite the protestations the vast majority of readers are likely to have a 1080p monitor, and apparently they all have amnesia as well since this came up in a very, very recent article - one of the last few on the video cards - where it was even pointed out by Anand himself that one should extrapolate the fps data with the 11% pixel difference, something every last one of you either never read or completely forgot or conveniently omitted.Unfortunately Anand isn't correct, period. He should know it, but of course has to have an answer for the complainers - so he gave one.
What should be clear, even by just looking at the 1920X benches here, is that results below that resolution and above it don't always jibe with it - and personally I've noticed nVidia has issues with 1600X1200.
So, all the hundreds of readers with 1080p monitors can thank me deep down in their hearts, as they now know Nvidia is supreme at that resolution, by 17%+ on average, more so than at 1200p, and thus "saving money" on an amd card using this sites 1920 1200p stats is for many, incorrect.
Makaveli - Thursday, May 3, 2012 - link
No one is mad or raging; I think most are just amused at your stupidity and sad attempt at trolling. Trololol
CeriseCogburn - Sunday, May 6, 2012 - link
Yet my point is absolutely prescient and helpful to anyone who isn't a raging fanboy fool. Doesn't appear you've risen above that category.
InsaneScientist - Sunday, May 6, 2012 - link
And yet they specifically called out the fact that the patch broke performance on nVidia cards, went out of their way to state what performance was like before the patch (which is clearly better than any of the other cards), and finally stated that they're pretty sure that the game is at fault, not nVidia or their drivers...Yeah, they really must have it out for nVidia... *sigh*
CeriseCogburn - Sunday, May 6, 2012 - link
Except in the 680 tests all you fools ignored S2TW, which I had to repeatedly point out was the hardest game in the bunch, not Crysis or M2033 - over and over again I had to tell all the fools blabbering, and now suddenly the game is "broken". ROFL - it's beyond amazing.CeriseCogburn - Sunday, May 6, 2012 - link
Oh look, it's not broken, TPU can handle it with nearly 100% single card SLI scaling with the 690:
http://www.techpowerup.com/reviews/NVIDIA/GeForce_...
Gee I guess it was "harder" here.
blackened23 - Thursday, May 3, 2012 - link
You appear to be using an old version of Batman without the latest patch.
blackened23 - Thursday, May 3, 2012 - link
http://www.computerandvideogames.com/340746/batman...
You are using an outdated version; the newest version released in March enhances both SLI and CF performance.
Ryan Smith - Thursday, May 3, 2012 - link
We're on the latest version. I triple checked.blackened23 - Thursday, May 3, 2012 - link
Are you using the Steam version? Your results differ from those of HardOCP, hardwareheaven, and hardwarecanucks. They get scaling you don't. Your version should be dated March 2012, that's when the patch was released.
Ryan Smith - Thursday, May 3, 2012 - link
Correct, we're using the Steam version. I reloaded it as of last week.bpwnes - Thursday, May 3, 2012 - link
...but will it blend?
Luscious - Thursday, May 3, 2012 - link
I'll take three, thanks!!!
tipoo - Thursday, May 3, 2012 - link
Send me 10 grand and I'll send you three of the nitrogen enriched versions.
Makaveli - Thursday, May 3, 2012 - link
Is this a review for the 7970 CF or the 690? lol
For a $1000 card that is not a very good showing. I'm thinking that 2GB limit per GPU is really starting to hurt them, not to mention the 256-bit memory bus.
shin0bi272 - Thursday, May 3, 2012 - link
That's what I was thinking. Imagine what they could have done if they'd have expanded the bus to 384-bit per GPU... such a sad showing.
CeriseCogburn - Thursday, May 3, 2012 - link
There are plenty of 4GB 680 reviews out there that certainly disprove your thinking, and lack of knowledge.
silverblue - Thursday, May 3, 2012 - link
This card has 2GB per GPU, not 4. Also, the lack of memory (!) will limit performance before the memory bus does. Compared to previous NVIDIA products, the 680 has far faster memory which mitigates having a narrower bus.
shin0bi272 - Saturday, May 5, 2012 - link
I guess we'll see when the gk110 arrives this summerCeriseCogburn - Saturday, May 5, 2012 - link
There is no limit with the 2GB of memory, but none of you have looked at the dozens of reviews and hundreds of blog proofs, so you will keep babbling stupid things, forever, it appears.
Death666Angel - Thursday, May 3, 2012 - link
The few reviews I've seen have the 4GB GTX 680 cards between 5% and 10% faster at high resolutions (starting at 2560x1440 up to 7680x1600). Adding some more memory bandwidth on top of that would have made it the gaming card most people expected from nVidia.
As it stands, the GTX 680 is good, but also very expensive (I can have the 7970 for 65€ less). The GTX 690 is a good product for people who want SLI but don't have the space, PSU, or SLI-enabled mainboard, or want 4 GPUs.
CeriseCogburn - Saturday, May 5, 2012 - link
Sure, link us to a single review that shows us that. Won't be HardOcp nor any as popular, as every review has shown the exact opposite.
kallogan - Thursday, May 3, 2012 - link
Where are the mid-range GPUs? Wait. Nvidia doesn't release them because they can't provide enough quantities.
CeriseCogburn - Thursday, May 3, 2012 - link
They're being held back like the "real 680" top nVidia core, because nVidia is making more money selling the prior launches and the new 2nd tier now top dog cards.It's a conspiracy of profit and win.
silverblue - Thursday, May 3, 2012 - link
Yes, because making a small number of full size Kepler cores is obviously going to make them more money than a large number of less complex Kepler cores. *rolleyes*NVIDIA, assuming they had the ability to get them manufactured in large enough quantities, would make far more profit off a 660 or 670 than they ever would off a 680.
silverblue - Thursday, May 3, 2012 - link
(I mean making far more profit off the 660/670 series than the 680 series, not specific cards nor the profit per card)CeriseCogburn - Tuesday, May 8, 2012 - link
A lot of prior gen stock moving; take a look, you're on the internet, not that hard to do. Wonder why you people are always so clueless.
CeriseCogburn - Tuesday, May 8, 2012 - link
For instance the entire lot of 7870's and 7850's on the egg are outsold by a single GTX680 by EVGA - confirmed purchasers reviews.So it appears nVidia knows what it's doing in a greatly superior manner than your tiny mind spewing whatever comes to it in a few moments of raging rebuttal whilst you try to "point out what's obvious" - yet is incorrect.
zcat - Thursday, May 3, 2012 - link
Ditto.Every time my Anandtech feed updates, the first thing I'm hoping to see is reviews for the more-reasonably priced, and less power-hoggy GTX 640 (w/GDDR5) and GTX 660 Ti. If we see a review, then at least we know it'll show up at retail very soon after.
All I want for xmas is a mid-range NVidia card with a better idle wattage to maximum performance ratio than AMD (because NVidia > AMD wrt drivers, esp under linux).
zcat - Thursday, May 3, 2012 - link
correction: idle Watts <= 10 && max performance >= AMD.
InsaneScientist - Sunday, May 6, 2012 - link
Well, let's be fair... it's not nVidia's fault. It's TSMC that can't get their act together to produce 28nm chips in volume.
CeriseCogburn - Sunday, May 6, 2012 - link
The GTX680 has sold more card by the verified reviewers at NewEgg than the entire lot of the 7870's and 7850's at NewEgg combined, and that's just with ONE GTX680 sold by EVGA - check it out my friend...ROFL
GTX680 in one listing outsells the entire lineup of 7870 and 7850 COMBINED at newegg- with verified owners count.
HAHAHA
Yes, the supply is always "key". ROFL
vladanandtechy - Thursday, May 3, 2012 - link
I must confess that every logic i can think of says i don't need this GPU.....but.....i want it....i don't need it.....but damn it....i want it.....it's nvidia....it's aluminium....it's 4 GB VRAM....it's probably 5 times faster than what i have.......and i want to congratulate the team for the review which i read from start to finish...but to be honest with you.....you don't need 19 pages to describe it...for me...."futureproof" says it all....
mamisano - Thursday, May 3, 2012 - link
Is there any way you can post the average FPS achieved during OCCT tests? Curious how 680 SLI, 690 and 7970 CF compare in this regard.
Ryan Smith - Thursday, May 3, 2012 - link
Sorry, but we don't currently record that data (though if it's a big enough deal we can certainly start).
Filiprino - Thursday, May 3, 2012 - link
Simply put, NVIDIA has a superior software department in comparison with AMD. AMD is mainstream. Whenever they try to reach the high end, they fail miserably, in both the GPU and CPU camps. Driver issues with crossfire, trifire and quadfire with or without eyefinity in numerous games (with eyefinity even more problems), etc.
If they don't get their problems solved by Catalyst 12.5 buying AMD cards for high end builds (anything multicard related) is a waste of money. And that is sad.
CeriseCogburn - Thursday, May 3, 2012 - link
Yes, and the reviewer is constantly trying to catch nVidia in a big lie - and it shows - he even states how he never believed a word nVidia said about this card but had to admit it was all true.I have never, in many years, seen the same bad attitude given to amd's gpu's.
The bias in the write up is so blatant every time it's amazing they still get nVidia cards for review. The reviewer is clearly so pro amd he cannot hide it.
N4g4rok - Thursday, May 3, 2012 - link
He did say that Crossfire was so broken that he couldn't recommend it. He's been pointing out flaws in both companies along the way. I think you should dial back the bias accusations a little bit.
CeriseCogburn - Thursday, May 3, 2012 - link
Well if you want me to point out like 10 blatant direct wordings in this article I will. I'm not the only one who sees it, by the way. You want to tell me how he avoids totally broken amd drivers when he's posting the 7970CF? Not like he had a choice there; your point is absolutely worthless.
silverblue - Thursday, May 3, 2012 - link
Okay then, for our benefit (because we're stupid and that), please point out the reviewer's transgressions.
InsaneScientist - Sunday, May 6, 2012 - link
Or don't...It's 2 days later, and you've been active in the comments up through today. Why'd you ignore this one, Cerise?
CeriseCogburn - Sunday, May 6, 2012 - link
Because you idiots aren't worth the time and last review the same silverblue stalker demanded the links to prove my points and he got them, and then never replied.It's clear what providing proof does for you people, look at the sudden 100% ownership of 1920x1200 monitors..
ROFL
If you want me to waste my time, show a single bit of truth telling on my point on the first page.
Let's see if you pass the test.
I'll wait for your reply - you've got a week or so.
KompuKare - Thursday, May 3, 2012 - link
It is indeed sad. AMD comes up with really good hardware features like eyefinity but then never polishes up the drivers properly. Looking some of crossfire results is sad too: in Crysis and BF3 CF scalling is better than SLI (unsure but I think the trifire and quadfire results for those games are even more in AMD's favour), but in Skyrim it seems that CF is totally broken.Of course compared to Intel, AMD's drivers are near perfect but with a bit more work they could be better than Nvidia's too rather than being mostly at 95% or so.
Tellingly, JHH did once say that Nvidia were a software company which was a strange thing for a hardware manufacturer to say. But this also seems to mean that they forgotten the most basic primary thing which all chip designers should know: how to design hardware which works. Yes I'm talking about bumpgate.
See despite all I said about AMD's drivers, I will never buy Nvidia hardware again after my personal experience of their poor QA. My 8800GT, my brother's 8800GT, this 8400M MXM I had, plus number of laptops plus one nForce motherboard: they all had one thing in common, poorly made chips made by BigGreen and they all died way before they were obsolete.
Oh, and as pointed out in the Anand VC&G forums earlier today:
"Well, Nvidia has the title of the worst driver bug in history at this point-
http://www.zdnet.com/blog/hardware/w...hics-card/7... "
killing cards with a driver is a record.
Filiprino - Thursday, May 3, 2012 - link
Yep, that's true. They killed cards with a driver. They should implement hardware auto shutdown, like CPUs. As for the nForce, I had one motherboard, the best nForce they made: nForce 2 for AMD Athlon. The rest of mobo chipsets were bullshit, including nForce 680.The QA I don't think is NVIDIA's fault but videocard manufacturers.
KompuKare - Thursday, May 3, 2012 - link
No, 100% Nvidia's fault. Although maybe QA isn't the right word. I was referring to Nvidia using the wrong solder underfill for a few million chips (the exact number is unknown): they were mainly mobile parts and Nvidia had to put $250 million aside to settle a class action.
http://en.wikipedia.org/wiki/GeForce_8_Series#Prob...
Although that wiki article is rather lenient towards Nvidia, since that bit about fan speeds is a red herring: more accurately, it was Nvidia which spec'ed their chips to a certain temperature, and designs which run way below that will have put less stress on the solder, but to say it was poor OEM and AIB design which led to the problem is not correct. Anyway, the proper exposé was by Charlie D. in the Inquirer and later SemiAccurate.
CeriseCogburn - Friday, May 4, 2012 - link
But in fact it was a bad heatsink design - thank HP, and view the thousands of heatsink repairs, including the "add a copper penny" method to reduce the giant gap between the HS and the NV chip. Charlie was wrong, a liar, again, as usual.
KompuKare - Friday, May 4, 2012 - link
Don't be silly. While HP's DV6000s were the most notorious failures, and that was due to HP's poorly designed heatsink / cooling, bumpgate also saw Dells, Apples and others:
http://www.electronista.com/articles/10/09/29/suit...
http://www.nvidiadefect.com/nvidia-settlement-t874...
The problem was real, continues to be real and also affects G92 desktop parts and certain nForce chipsets like the 7150.
Yes, the penny shim trick will fix it for a while but if you actually were to read up on technicians forums who fix laptops, that plus reflows are only a temporary fix because the actual chips are flawed. Re-balling with new, better solder is a better solution but not many offer those fixes since it involves 100s of tiny solder balls per chip.
Before blindly leaping to Nvidia's defence like a fanboy, please do some research!
CeriseCogburn - Saturday, May 5, 2012 - link
Before blindly taking the big lie from years ago repeated above to attack nvidia for no reason at all other than all you have is years old misinformation, then wail on about it, while telling someone else some more lies about it, check your own immense bias and lack of knowledge, since I had to point out the truth for you to find, and you forgot DV9000, dv2000 and dell systems with poor HS design, let alone apple amd console video chip failings, and the fact that payment was made and restitution was delivered, which you also did not mention, because of your fanboy problems, obviously in amd's favor.Ashkal - Thursday, May 3, 2012 - link
In the price comparison in the Final Words you are not referring to AMD products. I think AMD is better in price/performance ratio.
prophet001 - Thursday, May 3, 2012 - link
I agree.
CeriseCogburn - Friday, May 4, 2012 - link
I disagree.
chadwilson - Thursday, May 3, 2012 - link
I have some issues with this article, the first of course being availability. Checking the past week, I have yet to see any availability of the 680 besides $200+ over retail premium cards on ebay. How can you justify covering yet another paper launch card without blaring bold print caveats, that for all intents and purposes, nVidia can't make for a very long time? There is a difference between ultra rare and non-existant.Is a card or chip really the fastest if it doesn't exist to be sold?
Second, the issue of RAM, that's a problem in that most games are 32 bit, and as such, they can only address 3.5GB of RAM total between system and GPU RAM. This means you can have 12GB of RAM on your video card and the best you will ever get is 3GB worth of usage.
Until games start getting written with 64 bit binaries (which won't happen until Xbox 720 since almost all PC games are console ports), anything more than 2-3GB GPU RAM is wasteful. We're still looking at 2014 until games even START using 64 bit binaries.
Want it to change? Lobby your favorite gaming company. They're all dragging their feet, they're all complicit.
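For reference, a back-of-the-envelope sketch of the 32-bit ceiling being described; the user-mode splits below are just the commonly quoted Windows defaults, and how much video memory a driver actually maps into the process varies, so treat the numbers as illustrative only:

```python
GiB = 1024 ** 3

address_space = 2 ** 32        # total virtual address space of a 32-bit process: 4 GiB
default_user_space = 2 * GiB   # default user-mode portion on 32-bit Windows
laa_user_space = 4 * GiB       # large-address-aware executable running on a 64-bit OS

for label, size in [("total 32-bit address space", address_space),
                    ("default user-mode space", default_user_space),
                    ("large-address-aware on 64-bit OS", laa_user_space)]:
    print(f"{label}: {size // GiB} GiB")
```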
Ryan Smith - Thursday, May 3, 2012 - link
Hi Chad;While I'm afraid we're not at liberty to discuss how many 680 and 690 cards NVIDIA has shipped, we do have our ears to the ground and as a result we have a decent idea as to how many have shipped. Suffice it to say, NVIDIA is shipping a fair number of cards; this is not a paper launch otherwise we would be calling NVIDIA out on it. NVIDIA absolutely needs to improve the stock situation, but at this point this is something that's out of their hands until either demand dies down or TSMC production picks up.
-Thanks
Ryan Smith
silverblue - Thursday, May 3, 2012 - link
The 690 is a stunning product... but I'm left wanting to see the more mainstream offerings. That's really where NVIDIA will make its money, but we're just left wondering about supply issues and the fact that AMD isn't suffering to the same degree.
CeriseCogburn - Sunday, May 6, 2012 - link
A single EVGA GTX680 sku at newegg has outsold the entire line up of 7870 and 7850 cards combined with verified owners reviews.So if availability is such a big deal, you had better ask yourselves why the 7870 and 7850 combined cannot keep pace with a single EVGA 680 card selling at Newegg.
Go count them up - have at it - you shall see.
108 sales for the single EVGA 680, more than the entire combined lot of all sku's in stock and out of the 7870 and 7850 combined total sales.
So when you people complain, I check out facts - and I find you incorrect and failing almost 100% of the time.
That's what happens when one repeats talking points like a sad PR politician, instead of checking available data.
ltcommanderdata - Thursday, May 3, 2012 - link
Have you considered using WinZip 16.5 with it's OpenCL accelerated file compression/decompression as a compute benchmark? File compression/decompression is a common use case for all computer users, so could be the broadest application of GPGPU relevant to consumers if there is an actual benefit. The OpenCL acceleration in WinZip 16.5 is developed/promoted in association with AMD so it'll be interesting to see if it is hobbled on nVidia GPUs, as well as how well if scales with GPU power, whether it scales with SLI/dual GPU cards, and whether there are advantages with close IGP-CPU integration as with Llano and Ivy Bridge.ViRGE - Thursday, May 3, 2012 - link
Doesn't WinZip's OpenCL mode only work with AMD cards? If so, what use would that be in an NVIDIA review?ltcommanderdata - Thursday, May 3, 2012 - link
I actually don't know if it's AMD only. I know AMD worked on it together with WinZip. I just assumed that since it's OpenCL, it would be vendor/platform agnostic. Given AMD's complaints about use of vendor-specific CUDA in programs, if they developed an AMD-only OpenCL application, I would find that very disappointing.ViRGE - Thursday, May 3, 2012 - link
Going by their website it's only for AMD cards."WinZip has been working closely with Advanced Micro Devices (AMD) to bring you a major leap in file compression technology. WinZip 16.5 uses OpenCL acceleration to leverage the significant power of AMD Fusion processors and AMD Radeon graphics hardware graphics processors (GPUs). The result? Dramatically faster compression abilities for users who have these AMD products installed! "
CeriseCogburn - Friday, May 4, 2012 - link
Oh, amd, the evil company, up to its no good breaking of openCL misdeeds again. Wow, that's evil - the way it's meant to be unzipped.
chadwilson - Thursday, May 3, 2012 - link
OpenCL by its very nature is open; it is not an AMD API.
CeriseCogburn - Friday, May 4, 2012 - link
Not after amd gets through with it.
silverblue - Friday, May 4, 2012 - link
We'll see once somebody posts benchmarks of it.
CeriseCogburn - Friday, May 11, 2012 - link
Excuse me but you're wrong, again." by Ryan Smith on Thursday, May 10, 2012
According to WinZip it only supports AMD GPUs, which is why we're not using it in NVIDIA reviews at this time. "
Ryan's comment from the 670 release review.
chadwilson - Friday, May 4, 2012 - link
You haven't bothered to do even the most basic research as to who owns OpenCL, have you? Perhaps you should visit google before posting hyperbole.
CeriseCogburn - Saturday, May 5, 2012 - link
I'm sure the gamer's manifesto amd company "ownz it" now, and also certain it has immediately become all of yours favorite new benchmark you cannot wait to demand be shown here 100% of the time, it's so gaming evolved.CeriseCogburn - Friday, May 11, 2012 - link
Here's some research, Mr. know-it-all: "by Ryan Smith on Thursday, May 10, 2012: According to WinZip it only supports AMD GPUs, which is why we're not using it in NVIDIA reviews at this time."
--
Congratulations on utter FAIL.
eman17j - Sunday, August 19, 2012 - link
Look at this website: http://developer.nvidia(dot)com/cuda/opencl
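Whether a machine exposes OpenCL at all, and from which vendor, is easy to check by enumerating platforms. A minimal sketch using pyopencl (my choice of binding, not anything WinZip itself uses); note that an application can still choose to enable its OpenCL path only on AMD devices regardless of what this reports:

```python
import pyopencl as cl

# List every OpenCL platform and device the installed drivers expose.
for platform in cl.get_platforms():
    print(f"Platform: {platform.name} (vendor: {platform.vendor})")
    for device in platform.get_devices():
        print(f"  Device: {device.name}")
```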
prophet001 - Thursday, May 3, 2012 - link
First off, thank you for this review. If you didn't do this, we'd have no idea how these GPUs perform in the wild. It is very nice to come here and read a graph and make educated decisions on which card we should purchase. It is appreciated.The one thing that I wanted to question is why you feel that you can't recommend the 7970. At the very least perhaps the recommendation of which card to get should be based on the game you're playing.
Reviewing the data you published, the average frame rates for the 5 top performers over all benchmarks are:
680 SLI 119 fps
690 GTX 116 fps
7970 CF 103 fps
680 GTX 72.9 fps
7970 65.5 fps
Also, the number of times which the 7970 dipped below 60 fps in the benchmarks (excluding the minimum frame rate benchmarks) alone, without the 680 doing the same was 4. This is over 29 benchmarks and some of the dips were minimal.
This aligned with the price considerations makes me wonder why one wouldn't consider the 7970?
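For what it's worth, summary figures like those above are straightforward to reproduce once the per-game numbers are tabulated. A minimal sketch with placeholder data (not the article's actual results):

```python
# Per-card fps results across a set of benchmarks; the numbers are placeholders.
results = {
    "GTX 680 SLI": [118, 95, 144, 121],
    "GTX 690":     [115, 93, 140, 117],
    "HD 7970 CF":  [108, 55, 130, 119],
}

for card, fps in results.items():
    avg = sum(fps) / len(fps)
    dips = sum(1 for f in fps if f < 60)
    print(f"{card}: average {avg:.1f} fps, {dips} result(s) below 60 fps")
```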
Ryan Smith - Thursday, May 3, 2012 - link
"The one thing that I wanted to question is why you feel that you can't recommend the 7970. At the very least perhaps the recommendation of which card to get should be based on the game you're playing."Under normal circumstances we would do this. For example GTX 570 vs Raadeon HD 6970 last year; the two traded blows often enough that it came down to the game being played. However the key was that the two were always close.
In 20% of our games, 7970CF performance is nowhere close to GTX 690 because CF is broken in those games. It would be one thing if AMD's CF scaling in those games was simply weaker, but instead we have no scaling and negative scaling in games that are 5+ months old.
For single card setups AMD is still fine, but I cannot in good faith recommend CF when it's failing on major games like this. Because you never know what games in the future may end up having the same problem.
theSeb - Thursday, May 3, 2012 - link
I must say I found it quite odd and hilarious to see people accusing Anandtech of favouring AMD by using a monitor with a 1200 vertical resolution. 16:10 monitors are not that uncommon and we really should be showing the industry what we think by not purchasing 16:9 monitors.Anyway, if anything this review seems to be Nvidia biased, in my opinion. The 7970 CF does not do too badly, In fact it beats the 690 / 680 CF in many games and only loses out in the games where it's "broken". I am not sure why you cannot recommend it based on the numbers in your benchmarks since it hardly embarrasses itself.
silverblue - Thursday, May 3, 2012 - link
It's not "people", it's "person"... and he's only here to troll graphics card articles.When AMD gets it right, CrossFire is absolutely blistering. Unfortunately, the sad state of affairs is that AMD isn't getting it right with a good proportion of the games in this review.
NVIDIA may not get quite as high scaling as AMD when AMD does get it right, but they're just far more consistent at providing good performance. This is the main gripe about AMD; with a few more resources devoted to the project, surely they can overcome this?
CeriseCogburn - Friday, May 4, 2012 - link
Yes, of course, call names forever, but never dispute the facts. I will agree with you though, amd drivers suck especially in CF, and they suck for a lot of games for a long long time.
silverblue - Friday, May 4, 2012 - link
No, I said AMD's drivers have issues with Crossfire, not that they suck in general. I've also checked three random British websites and there are no issues whatsoever in finding a 1920x1200 monitor. I also looked at NewEgg and found eight immediately. It's really not difficult to find one.
CeriseCogburn - Saturday, May 5, 2012 - link
1920x1200 - all of you protesteth far too much. The cat is out of the bag and you won't be putting it back in.
Enjoy the bias, you obviously do, and leave me alone, stop the stalking.
seapeople - Saturday, May 5, 2012 - link
I'm with ya bro. Forget these high resolution monitor nancy's who don't know what they're missing. I'm rockin' games just fine with 60+ fps on my 720p plasma tv, and that's at 600hz! Just you try to get 24xAAAA in 3D (that's 1200hz total) on that 1920x1200 monitor of yours!Framerate fanboys unite!
CeriseCogburn - Sunday, May 6, 2012 - link
Ahh, upped the ante to plasma monitors? ROFL - the desperation of you people knows no bounds.
saf227 - Thursday, May 3, 2012 - link
On page 2 of the review - where you have all the pictures of the card - we have no real basis for figuring out the card's true size. Could you include a reference in one of those photos? Say, a ruler or a pencil or something, so we have an idea what the size of the card truly is?
Ryan Smith - Thursday, May 3, 2012 - link
The card is 10" long, the same length as the GTX 590 (that should be listed on page 2). But I'll take that under consideration for future articles.ueharaf - Thursday, May 3, 2012 - link
Why did they go back to 256 bits when the GTX 590 had 384 bits?!?! Because they don't want to have too much of an advantage?
Maybe the next GTX 790 will again have 384 bits and would be better than the GTX 690... come on!!!
paul878 - Thursday, May 3, 2012 - link
Nvidia is getting very good at building Vaporware.
paul878 - Thursday, May 3, 2012 - link
Nvidia is getting very good at making Vaporware.
krumme - Thursday, May 3, 2012 - link
Is 6,000 pcs. within the first month, for example, a paper launch in your view?
As selling numbers of that size do nothing for the economy directly, what do you think are the strategic choices behind putting it on "sale" now?
How do you think marketing at NV thinks about how they can tailor perception from the reviewers on what is perceived as a paper launch?
Do NV marketing present themselves as one of your kind, having the same background, understanding your dilemmas and problems?
mac2j - Thursday, May 3, 2012 - link
Wonder what the 7990 will look like next month. AMD clearly waited on purpose to see how the 690 was going to perform. They easily could have released a dual 7970 card already or at the very least sent specs to card manufacturers but they haven't.We know they left a lot of headroom on the 7970 - some people have even suggested we'll get a 7980 at some point - wonder if now we'll get 2 x fully clocked 7970s on the same card ... will be interesting to see how they deal with that power consumption at load though.
CeriseCogburn - Friday, May 4, 2012 - link
With 2x7970 @ STOCK they are already 175 watts over the 690's power draw. Good luck with that "headroom".
CeriseCogburn - Friday, May 4, 2012 - link
amd is late to the race, they never showed up this time, and when they do, they will lose - think housefires.
Beenthere - Thursday, May 3, 2012 - link
Really? There are some sick people in this world. ;)
Nfarce - Thursday, May 3, 2012 - link
No, unlike OWS protesters, there are some successful people in this world who get off their butts and work hard enough to be able to afford a $1,000 GPU (or in my case 2 GTX 680 $530 GPUs).anactoraaron - Thursday, May 3, 2012 - link
"Thus even four years since the release of the original Crysis, “but can it run Crysis?” is still an important question, and the answer when it comes to setups using a pair of high-end 28nm GPUs is “you better damn well believe it.”":D
CeriseCogburn - Friday, May 4, 2012 - link
No they actually cannot. 1920X, even the cf 7970 or 690 need help with lowered settings, as in many of the games. Can't even keep up with the 1920X monitors resolution refresh rate, set at a low 60.Sorry, more fantasies another for you perhaps. :)
Death666Angel - Thursday, May 3, 2012 - link
Hey guys!Thanks for the article, I enjoyed the read (although I am not in the market for dual GPU configurations after trying the HD3870X2 and 2*8800GTS, happy with one 7970 OC'ed to the max.). But you seem to be missing the numbers for noise from the HD7970 in a CF configuration. I hope you can post them! :D
-DA
Ryan Smith - Thursday, May 3, 2012 - link
This was mentioned on the Test page, but we don't have a matching pair of 7970 cards; what we have is a reference card and an XFX 7970 BEDD. Power and temperature are the same regardless, but it would be improper to list noise because of the completely different acoustic properties of the BEDD.Mygaffer - Thursday, May 3, 2012 - link
How does that stack up, especially price/performance? Why didn't your conclusion address that question at all? Totally limits the usefulness of the review in my opinion.
rs2 - Thursday, May 3, 2012 - link
As per the data in your article, the GTX 690 is clocked 10% below the GTX 680, and has a 5% lower boost clock. This may be a small compromise, but it is a compromise nonetheless. More accuracy and less hyperbole, please.
pixelstuff - Thursday, May 3, 2012 - link
Seems to me they should be saving money on construction compared to two 680s in SLI. Half the fans, half the connectors, half the circuit boards. They should have at least cut $50 off the suggested retail price. Also, when will we see 3 of these running in SLI form?
Holler - Thursday, May 3, 2012 - link
You won't - it has only one SLI connector.
Not that impressed. I'll be holding on to my overclocked 1.5 GB TRI-SLI GTX 480 Hydro Coppers for the foreseeable future; this card should at least double the RAM it has now...
CeriseCogburn - Saturday, May 5, 2012 - link
Half the magnesium, half the aluminum, half the PLX chips, half the R&D, half the vapor chambers, half the chip binning, half the power circuits, half the copper PCBs... oh no wait, all those are added expenses, not reductions.
I guess they should be charging $200 over the usual 2x $499 price.
See how actually using the facts, instead of sourpuss emotion, delivers a different picture?
will54 - Friday, May 4, 2012 - link
These cards are sold out on Newegg at $1,200 apiece. Talk about taking advantage with a 20% markup over the MSRP. Hopefully AMD knocks the prices way down when they bring out their 7990; $800 sounds about right.
faster - Friday, May 4, 2012 - link
Now I need a new keyboard because I was drooling into mine as I read this review. I have a GTX 680, but I don't like to run SLI setups - I had a bad experience with my dual 560 Tis. This looks like a truly awesome card that would hold its value for resale later. Nevertheless, there is no way I'm spending a grand on a video card.
Origin32 - Saturday, May 5, 2012 - link
I predict that the 790 will, finally, be able to run Crysis. Next year an era will end. Enjoy it while it lasts, folks.
von Krupp - Saturday, May 5, 2012 - link
Not precisely. That $350 performance point? It used to be a $200 performance point. Similarly, that $350 point will turn into a $400 performance point. So, assuming I maintain the price tier, graphics returns for my dollar are gradually tapering off. I look at the performance I was getting out of my 7800 GT at 1280x1024, and it wasn't worth upgrading to a newer card, period, because of Windows XP, my single-core CPU, and the fact that I was already maxing out every game I had and still getting decent frame rates. I think the key factor is that I do not care if I dip below 60 frames, as long as I'm above 30 and getting reasonable frame times. I also know that consoles extend the life of PC hardware. The 7800 GT is a 20-pipe version of the GTX, which is in turn the GPU found in the PS3. Devs have gotten much better at optimization in titles that matter to me.
CeriseCogburn - Saturday, May 5, 2012 - link
You spend well over $1,600 on a decent system. It makes no sense to spend all that money and then buy monitors the cards in question cannot successfully drive in the three-year-old Crysis, let alone in well over half the benchmarks in this article, without turning DOWN the settings.
You cannot turn up DX11 tessellation; keep it on medium.
You cannot turn up MSAA past 4X, and you had better keep it at 2X.
You had better turn down your view distance in game.
And that is, in fact, with all the moaning about "console ports" "holding us back".
I get it; the obvious problem is that none of you seem to, because you want to moan and pretend that spending $1,000.00 or more on a monitor alone is "how it's done", while whining that you cannot even afford $500 for a single video card.
These cards successfully drive 1920x1080 monitors in the benchmarks, but just barely - and if you turn the eye candy up, they cannot do it.
CeriseCogburn - Saturday, May 5, 2012 - link
Thanks for telling everyone how correct I am by doing a pure 100% troll attack after you and yours could not avoid the facts. Your mommy, if you knew who she was, must be very disappointed.
geok1ng - Sunday, May 6, 2012 - link
This card was not built for 2560x1600 gaming; a single 680 is more than enough for that. The 690 was built for 5760x1200 gaming.
I would like to see triple 30" tests. Nothing like gaming at 7680x1600 to feel that you are spending your VGA money well.
CeriseCogburn - Sunday, May 6, 2012 - link
You can use cards two generations back for that, but like these cards, you will be turning down most or nearly all of the eye candy, and be stuck tweaking and clocking, and jittering and wishing you had more power. These cards cannot handle 1920x in current "console port" games unless you turn them down, and that goes ESPECIALLY for the AMD cards, which suck at extreme tessellation and have more issues with anything above 4xAA, and often at 4xAA.
The 5770 is an Eyefinity card and runs 5760x1200 too.
I guess none of you will ever know until you try it, and it appears none of you have spent the money and become disappointed turning down the eye candy settings - so blabbering about resolutions is all you have left.
_vor_ - Tuesday, May 8, 2012 - link
"... blabbering..."Pot, meet kettle.
CeriseCogburn - Sunday, May 6, 2012 - link
They cost $400 to $2,000-plus, not $150 like the 242 1080p models. Thanks for playing.
hechacker1 - Monday, May 7, 2012 - link
Nope, you can already get 27", 2560x1440 IPS panels (the same ones Apple uses) for $400. They're rare, but currently they are being built in batches of 1000 to see how strong demand is.
Sure, the 120Hz will sort of go to waste due to the slow IPS switching speed, but it will accept that signal with 0 input lag.
The only problem is that only the 680 seems to have a RAMDAC fast enough to do 120Hz; Radeons tend to cap out at 85Hz.
marine73 - Monday, May 7, 2012 - link
After checking Newegg it would seem that, unfortunately for Nvidia, this will be another piece of vaporware. Perhaps they should scale the Keplers to 22nm and contract Intel to fab them, since TSMC has major issues with 28nm. Just a thought.
marine73 - Monday, May 7, 2012 - link
I guess I should retract my comments about TSMC, as other customers are not experiencing supply issues with 28nm parts. Apparently the issues are with Nvidia's design, which may require another redo. I'm guessing AMD will be out with their 8000 series before Nvidia gets their act together. Sad, because I have used several generations of Nvidia cards and was always happy with them.
CeriseCogburn - Thursday, May 10, 2012 - link
The GTX 680 by EVGA in a single SKU outsells the combined total sales of the 7870 and 7850 at Newegg. Nvidia "vaporware" sells more units than the proclaimed "best deal" 7000 series AMD cards.
ROFL
Thanks for not noticing.
Invincible10001 - Sunday, May 13, 2012 - link
Maybe a noob question, but can we expect a mobile version of the 690 in laptops anytime soon?
trumpetlicks - Thursday, May 24, 2012 - link
Compute performance in this case may have to do with 2 things:
- the amount of memory available for the threaded computational algorithm being run, and
- the memory I/O throughput capability.
From the rumor mill, the next NVIDIA chip may contain 4 GB per chip and a 512-bit bus (which is 2x wider than the GK104's).
If you can't feed the beast as fast as it can eat it, then adding more cores won't increase your overall performance (a rough sketch of this trade-off follows below).
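To illustrate the point, here is a minimal roofline-style sketch in Python. The peak compute and bandwidth figures are illustrative assumptions, not measured GTX 690 / GK104 specifications.

# Minimal roofline-style sketch: attainable throughput is capped by whichever
# limit is hit first -- raw compute or memory bandwidth. The peak figures are
# illustrative assumptions, not measured GTX 690 / GK104 numbers.

PEAK_GFLOPS = 3000.0   # assumed peak compute throughput, GFLOP/s
PEAK_BW_GBS = 192.0    # assumed peak memory bandwidth, GB/s

def attainable_gflops(flops_per_byte):
    """Attainable GFLOP/s for a kernel with the given arithmetic intensity."""
    return min(PEAK_GFLOPS, PEAK_BW_GBS * flops_per_byte)

# A streaming kernel doing ~1 FLOP per 4-byte float is bandwidth-bound:
print(attainable_gflops(0.25))   # 48.0 GFLOP/s -- far below the compute peak
# A compute-dense kernel with heavy data reuse can reach the compute ceiling:
print(attainable_gflops(32.0))   # 3000.0 GFLOP/s

In the bandwidth-bound case, raising PEAK_GFLOPS (more cores) changes nothing; only more bandwidth or more on-chip data reuse helps, which is the point being made above.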
Joseph Gubbels - Tuesday, May 29, 2012 - link
I am a new reader and equally new to the subject matter, so sorry if this is a dumb question. The second page mentioned that NVIDIA will be limiting its partners' branding of the cards, and that the first generation of GTX 690 cards are reference boards. Does NVIDIA just make a reference design that other companies use to make their own graphics cards? If not, then why would anyone but NVIDIA have any branding on the cards?
Dark0tricks - Saturday, June 2, 2012 - link
Anyone who sides with AMD or NVIDIA is a fool - side with yourself as a consumer and buy the best card available at the time that is right for your NEEDS. Fact is, the 690 is trash regardless of whether you are comparing it to an NVIDIA card or an AMD card - if I'm buying a card like a 690, why the FUCK would I want anything below 1200p?
Even if 1200p is uncommon, it's a mfing trash $1000 card considering:
$999 GeForce GTX 690
$499 GeForce GTX 680
$479 Radeon HD 7970
and that SLI and CF both beat (or equal) the 690 at higher resolutions and cost less (by $1 for NVIDIA - but still, like, srsly, wtf NVIDIA!? - and $40 for AMD)... WHAT!?
Furthermore, you guys fighting over bias when the WHOLE mfing GFX community (companies and software developers) is built on bias is utterly ridiculous. GFX vendors (AMD and NVIDIA) have skewed results in games for the last decade-plus, and software vendors too - there need to be laws against specifically building software for a particular graphics card, in addition to making the software work worse on the other (this applies to both companies).
Hell, workstation graphics cards are a very good example of how the industry likes to screw over consumers (if you ever BIOS-modded - not just soft-modded - a normal consumer card into a workstation card, you would know all that extra charge (up to 70% extra for the same processor) for a workstation card is BS, and if the government cleaned up its shoddy policies we consumers would be better off for it).
nyran125 - Monday, June 4, 2012 - link
Yep... ultra expensive and ultra pointless.
kitty4427 - Monday, August 20, 2012 - link
I can't seem to find anything suggesting that the beta has started...
trameaa - Friday, March 1, 2013 - link
I know this is a really old review, and everyone has long since stopped the discussion - but I just couldn't resist posting something after reading through all the comments. Understand, I mean no disrespect to anyone at all by saying this, but it really does seem like a lot of people haven't actually used these cards first hand. I see all this discussion of NVIDIA Surround-type setups with massive resolutions and it makes me laugh a little. The 690 is obviously an amazing graphics card. I don't have one, but I do use 2x 680 in SLI and have for some time now.
As a general rule, these cards have nowhere near the processing power necessary to run those gigantic screen resolutions with all the settings cranked up to maximum detail, 8xAA, 16xAF, tessellation, etc....
In fact, my 680 SLI setup can easily be running as low as 35 fps in a game like Metro 2033 with every setting turned up to max - and that is at 1920x1080.
So, for all those people that think buying a $1000 graphics card means you'll be playing every game out there with every setting turned up to max across three 1920x1200 displays - I promise you, you will not - at least not at a playable frame rate.
To do that, you'll realistically be looking at 2x $1000 graphics cards, a ridiculous power supply, and, by the way, you had better make sure you have the processing power to push those cards. Your run-of-the-mill i5 gaming rig isn't gonna cut it.
Utomo - Friday, October 25, 2013 - link
More than 1 year since it was announced. I hope new products will be better. My suggestions: 1. Add HDMI, it is standard. 2. Consider allowing us to add memory / an SSD for better/faster performance, especially for rendering 3D animation and other work.
TPLVG - Sunday, March 5, 2017 - link
The GTX 690 is known as "the nuclear bomb" in Chinese IT communities because of its power consumption and temperature.