The NVIDIA GeForce GTX 650 Ti Review, Feat. Gigabyte, Zotac, & EVGA
by Ryan Smith on October 9, 2012 9:00 AM EST
"Once more into the fray.
Into the last good fight I'll ever know."
At a pace just shy of a card a month, NVIDIA has been launching the GeForce 600 series part by part for over half a year now. What started with the GeForce GTX 680 in March and most recently saw the launch of the GeForce GTX 660 will finally be coming to an end today with the 8th and likely final retail GeForce 600 series card, the GeForce GTX 650 Ti.
Last month we saw the introduction of NVIDIA’s 3rd Kepler GPU, GK106, which takes its place between the high-end GK104 and NVIDIA’s low-end/mobile gem, GK107. At the time NVIDIA launched just a single GK106 card, the GTX 660, but of course NVIDIA never launches just one product based on a GPU – if nothing else the economics of semiconductor manufacturing dictate a need for binning, and by extension products to attach to those bins. So it should come as no great surprise that NVIDIA has one more desktop GK106 card, and that card is the GeForce GTX 650 Ti.
The GTX 650 Ti is the aptly named successor to 2011’s GeForce GTX 550 Ti, and will occupy the same $150 price point that the GTX 550 Ti launched into. It will sit between the GTX 660 and the recently launched GTX 650, and despite its much closer resemblance to the GTX 660, NVIDIA is placing the card into their GTX 650 family and pitching it as a higher performance alternative to the GTX 650. With that in mind, what exactly does NVIDIA’s final desktop consumer launch of 2012 bring to the table? Let’s find out.
| | GTX 660 | GTX 650 Ti | GTX 650 | GTX 550 Ti |
|---|---|---|---|---|
| Memory Clock | 6.008GHz GDDR5 | 5.4GHz GDDR5 | 5GHz GDDR5 | 4.1GHz GDDR5 |
| Memory Bus Width | 192-bit | 128-bit | 128-bit | 192-bit |
| FP64 | 1/24 FP32 | 1/24 FP32 | 1/24 FP32 | 1/12 FP32 |
| Manufacturing Process | TSMC 28nm | TSMC 28nm | TSMC 28nm | TSMC 40nm |
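The memory clock and bus width figures above translate directly into peak memory bandwidth. A quick sketch of that arithmetic, using the published specs (effective data rate in GT/s times bus width in bytes):

```python
# Peak memory bandwidth = (bus width in bits / 8) * effective data rate (GT/s)
def mem_bandwidth_gbps(bus_bits: int, data_rate_gtps: float) -> float:
    """Theoretical peak memory bandwidth in GB/s."""
    return bus_bits / 8 * data_rate_gtps

cards = {
    "GTX 660":    (192, 6.008),
    "GTX 650 Ti": (128, 5.4),
    "GTX 650":    (128, 5.0),
    "GTX 550 Ti": (192, 4.1),
}
for name, (bus, rate) in cards.items():
    print(f"{name}: {mem_bandwidth_gbps(bus, rate):.1f} GB/s")
```

This puts the GTX 650 Ti at 86.4 GB/s against the GTX 660's 144.2 GB/s, which is why the loss of a memory controller hurts so much more than the modest memory clock reduction.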
Coming from the GTX 660 and its fully enabled GK106 GPU, NVIDIA has cut several features and functional units in order to bring the GTX 650 Ti down to their desired TDP and price. As is customary for lower tier parts, the GTX 650 Ti ships with a binned GK106 GPU with some functional units disabled, and it’s here that the card unfortunately takes a big hit. For the GTX 650 Ti NVIDIA has opted to disable both SMXes and ROP/L2/memory clusters, with a greater emphasis on the latter.
On the shader side of the equation NVIDIA is disabling just a single SMX, giving the GTX 650 Ti 768 CUDA cores and 64 texture units. On the ROP/L2/memory side of things however NVIDIA is disabling one of GK106’s three clusters (the minimum granularity for such a change), so coming from the GTX 660 the GTX 650 Ti will have much less memory bandwidth and ROP throughput than its older sibling.
Taking a look at clockspeeds, along with the reduction in functional units there has also been a reduction in clockspeeds across the board. The GTX 650 Ti will ship at 925MHz, 55MHz lower than the GTX 660’s base clock. Furthermore NVIDIA has decided to limit GPU Boost functionality to the GTX 660 and higher families, so the GTX 650 Ti will actually run at 925MHz and no higher; with the GTX 660 regularly boosting above its base clock, the effective difference is closer to 100MHz. On the other hand the lack of min-maxing here by NVIDIA will have some good ramifications for overclocking, as we’ll see. Meanwhile the memory clock will be 5.4GHz, and at only 600MHz below NVIDIA’s standard-bearer Kepler memory clock of 6GHz, this is a far smaller loss of memory bandwidth than what comes from the memory bus width reduction.
Overall this gives the GTX 650 Ti approximately 72% of the shading/texturing performance, 60% of the ROP throughput, and 60% of the memory bandwidth of the GTX 660. Meanwhile compared to the GTX 650, the GTX 650 Ti has 175% of the shading/texturing performance, 108% of the memory bandwidth, and 87% of the ROP throughput of its smaller sibling. Traditionally NVIDIA’s x50 parts have been geared towards 1680x1050/1600x900 resolutions, and while NVIDIA is trying to stretch that definition due to the popularity of 1920x1080 monitors, the loss of the ROP/memory cluster all but closes the door on the GTX 650 Ti’s 1080p ambitions. The GTX 650 Ti will be, for all intents and purposes, NVIDIA’s fastest sub-1080p Kepler card.
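The percentages above fall out of simple unit-count-times-clock arithmetic. A rough sketch of that math, assuming the GTX 660 runs near its rated boost clock of 1033MHz (the base clock is 980MHz) while the GTX 650 Ti is pinned at 925MHz:

```python
# Throughput ratio between two parts = (units_a * clock_a) / (units_b * clock_b)
def ratio(units_a: int, clk_a: float, units_b: int, clk_b: float) -> float:
    return (units_a * clk_a) / (units_b * clk_b)

# GTX 650 Ti (768 cores, 16 ROPs, 925MHz fixed) vs
# GTX 660  (960 cores, 24 ROPs, ~1033MHz boost)
shading = ratio(768, 925, 960, 1033)        # CUDA core throughput
rops    = ratio(16, 925, 24, 1033)          # ROP throughput
bw      = (128 * 5.4) / (192 * 6.008)       # bus width x data rate
print(f"shading {shading:.0%}, ROP {rops:.0%}, bandwidth {bw:.0%}")
```

Run against the GTX 650 instead (384 cores, 16 ROPs at 1058MHz), the same formula reproduces the 175%/87% figures.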
Moving on, it was interesting to find out that NVIDIA is not going to be disabling SMXes for the GTX 650 Ti in a straightforward manner. Because of GK106’s asymmetrical design and the pigeonhole principle – 5 SMXes spread over 3 GPCs – NVIDIA is going to be shipping GTX 650 Ti certified GPUs with both 2 GPCs and 3 GPCs, depending on which GPC houses the defective SMX that NVIDIA will be disabling. To the best of our knowledge this is the first time NVIDIA has done something like this, particularly since Fermi cards had far more SMs per GPC. Despite the fact that 3 GPC versions of the GTX 650 Ti should technically have a performance advantage due to the extra Raster Engine, NVIDIA tells us that the performance is virtually identical to the 2 GPC version. Ultimately since GTX 650 Ti is going to be ROP bottlenecked anyhow – and hence lacking the ROP throughput to take advantage of that 3rd Raster Engine – the difference should be just as insignificant as NVIDIA claims.
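The pigeonhole situation can be illustrated with a toy enumeration. Assuming a hypothetical 2/2/1 SMX split across GK106’s three GPCs (the exact split is our assumption for illustration), disabling one SMX leaves either 2 or 3 GPCs with active SMXes depending on where the defect lands:

```python
# Toy model: 5 SMXes spread across 3 GPCs, one SMX fused off per GTX 650 Ti.
def active_gpcs_after_disable(gpcs: list[int], idx: int) -> int:
    """Disable one SMX in GPC `idx`; return how many GPCs still have SMXes."""
    after = [c - 1 if j == idx else c for j, c in enumerate(gpcs)]
    return sum(1 for c in after if c > 0)

gpcs = [2, 2, 1]  # hypothetical SMX count per GPC
for i in range(len(gpcs)):
    print(f"defective SMX in GPC {i}: {active_gpcs_after_disable(gpcs, i)} active GPCs")
```

If the lone SMX in the single-SMX GPC is the defective one, the whole GPC (and its Raster Engine) goes dark, yielding the 2 GPC configuration; a defect in either 2-SMX GPC leaves all 3 GPCs active.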
Meanwhile when it comes to power consumption the GTX 650 Ti is being given a TDP of 110W, some 30W lower than the GTX 660. Even compared to the GTX 550 Ti this is still a hair lower (116W vs. 110W), while the gap between the GTX 650 Ti and GTX 650 will be 34W. Idle power consumption on the other hand will be virtually unchanged, with the GTX 650 Ti maintaining the GTX 660’s 5W standard.
As NVIDIA’s final consumer desktop GeForce 600 card for the year, NVIDIA is setting the MSRP of the 1GB card at $150, between the $109 GTX 650 and the $229 GTX 660. This is another virtual launch, with partners going ahead with their own designs from the start. NVIDIA’s reference design will not be directly sold, but most of the retail boards will be very similar to NVIDIA’s reference card anyhow, implementing a single-fan open air cooler like NVIDIA’s. PCBs should also be similar; 2 of the 3 retail cards we’re looking at use the reference PCB, which on a side note is identical to the GTX 650 reference PCB as GTX 650 Ti and GTX 650 are pin compatible. Meanwhile similar to the GTX 660 Ti launch, partners will be going ahead with a mix of memory capacities, with many partners offering both 1GB and 2GB cards.
At launch the GTX 650 Ti will be facing competition from both last-generation GeForce cards and current-generation Radeon cards. The GeForce GTX 560 is currently going for almost exactly $150, making it direct competition for the GTX 650 Ti. The 560 cannot match the GTX 650 Ti’s power consumption, but thanks to its ROP performance and memory bandwidth it’s a potent competitor for rendering performance.
Meanwhile the Radeon competition will be the tag-team of the 7770 and the 7850. The 7770 is not nearly as powerful as the GTX 650 Ti, but with prices at-or-below $119 it significantly undercuts the GTX 650 Ti. Meanwhile the Pitcairn based 7850 1GB can more than give the GTX 650 Ti a run for its money, but is priced on average $20 higher at $169, and as the 1GB version is a bit of a niche product for AMD, the card selection won’t be as great.
To sweeten the deal NVIDIA has a new game bundle promotion starting up for the GTX 650 Ti. Retailers will be bundling vouchers for Assassin’s Creed III with GTX 650 Ti cards in North America and in Europe. Unreleased games tend to be good deals value-wise, but in the case of Assassin’s Creed III this also means waiting nearly 2 months for the PC version of the game to ship.
Fall 2012 GPU Pricing Comparison

| AMD | Price | NVIDIA |
|---|---|---|
| Radeon HD 7950 | $329 | |
| | $299 | GeForce GTX 660 Ti |
| Radeon HD 7870 | $239 | |
| | $229 | GeForce GTX 660 |
| Radeon HD 7850 2GB | $189 | |
| Radeon HD 7850 1GB | $169 | GeForce GTX 650 Ti 2GB |
| | $149 | GeForce GTX 650 Ti 1GB |
| Radeon HD 7770 | $109 | GeForce GTX 650 |
| Radeon HD 7750 | $99 | GeForce GT 640 |
Comments
flipmode - Tuesday, October 9, 2012
Please, that mantra is goofy. Of course there is such a thing as a bad product. You're telling me you've never run into a product that you wouldn't buy at any price? I have. Not saying the GTX 650 Ti fits that description - it doesn't - but I just wish you'd dispense with that silly expression.
Paulman - Wednesday, October 10, 2012
I think it's a good saying, especially when applied to the two horse race between AMD/ATI and NVIDIA. Both companies have been executing fairly well over the past half decade or more, and ultimately the biggest factor that determines the success or value of a card is the performance vs. price. The only thing that would mess with that is a significant spat of failing parts, or ridiculously high power/noise consumption that can't be mitigated, or unfixably buggy drivers. But barring such catastrophe scenarios, if your part isn't that great by the time it hits the market, just lower the price :P
CeriseCogburn - Friday, October 12, 2012
It's amazing the amd fanboy brain farts spewing here.
AMD lowered their frikkin 7850 price, not the card that "isn't that great that just hit the market".
I'll also point out that this nVidia card does 4 monitors out of the box, and the Asus version at the egg has a great port setup for that, and is inexpensive.
It's just amazing to me really. AMD drops in price, and the idiot response is late and slow for the card reviewed demanding a lower price.
LOL - it's so so freakin sad.
rarson - Friday, October 12, 2012
You don't understand economics, do you?
Homeles - Saturday, October 13, 2012
"AMD lowered their frikkin 7850 price, not the card that 'isn't that great that just hit the market.'"
You need to brush up on your reading comprehension skills, kid. You have completely missed the point of the post you are replying to. Quite laughably, really, especially given your condescension.
Siana - Monday, October 15, 2012
OMG a sane person on the Internet!
Uritziel - Thursday, October 11, 2012
Nothing keeps a price from being negative, so the saying isn't really wrong. Bet you'd buy that bad product you have in mind for -$5000...
CeriseCogburn - Friday, October 12, 2012
Here, where are the amd fanboys usual bloviating load of crap spews ?
I'll pretend I'm them.
This card OverClocks to 7850 speeds and passes it for $5o LESS ! you'd have to be an idiot to buy the amd card when every single nVidia 650Ti hit the same awesome overclock flying past the 7850 !
Not to mention eyefinity sucks and is dead now that 4 monitors are rockin on these 650Ti's !
I'd sure like to see amd innovate but all they care about is MONEY $$$ so they charge more!
There we go amd fanboys, FTFY, and the worse part of it all for you is it's all true instead of big fat lies like when you do it !
rarson - Friday, October 12, 2012
This has nothing to do with fanboys, just like the last post didn't. We're talking about economics here, not AMD vs. Nvidia. Stop looking at everything through your green-tinted glasses and try reading what is actually on the screen. The comment you replied to has nothing to do with the cards you mentioned.
Galidou - Saturday, October 13, 2012
He says everyone is lying when speaking about AMD while he can hardly stay in the right path himself.... He's taking the side of the most powerful companies in the world(anything that's against AMD is worth taking their side) while spewing shit like: ''all they care about is MONEY $$$.''
Let's go, take the side of the giants of this world, kill the small companies spewing shit about them so the world can turn more monopolistic than it is now... LoL funniest vomit the world had to know about... Make the rich even more rich and KILL everyone below... I have to admit AMD is in a bad situation, their CPU division fares ALOT worse than their GPU division but it's not a reason to be so stupid... so freaking imbecile..... Just so stubbornly refusing to have any respect toward anyone that doesn't TOTALLY embrace his stupid closed vision of the computer industry.
I just wish AMD gets out of there, if not then too bad, we can't change things for them. They are fighting against the giants of the computer industry that have a hundred times more budget than they do... Just for that, I'm wishing they succeed in the future.