NVIDIA's GeForce GTX Titan, Part 1: Titan For Gaming, Titan For Compute
by Ryan Smith on February 19, 2013 9:01 AM EST
The launch of the Kepler family of GPUs in March of 2012 was something of a departure from the norm for NVIDIA. Over the years NVIDIA has come to be known, among other things, for their big and powerful GPUs. NVIDIA had always produced a large 500mm2+ GPU to serve both as a flagship for their consumer lines and as the fundamental GPU for their Quadro and Tesla lines, and had always launched with that big GPU first.
So when the Kepler family launched first with the GK104 and GK107 GPUs – powering the GeForce GTX 680 and GeForce GT 640M respectively – it was unusual to say the least. In place of “Big Kepler”, we got a lean GPU that was built around graphics first and foremost, focusing on efficiency and in the process forgoing a lot of the compute performance NVIDIA had come to be known for in the past generation. The end result of this efficiency paid off nicely for NVIDIA, with GTX 680 handily surpassing AMD’s Radeon HD 7970 at the time of its launch in both raw performance and in power efficiency.
Big Kepler was not forgotten however. First introduced at GTC 2012, GK110 as it would come to be known would be NVIDIA’s traditional big, powerful GPU for the Kepler family. Building upon NVIDIA’s work with GK104 while at the same time following in the footsteps of NVIDIA’s compute-heavy GF100 GPU, GK110 would be NVIDIA’s magnum opus for the Kepler family.
Taped out later than the rest of the Kepler family, GK110 has taken a slightly different route to get to market. Rather than launching in a consumer product first, GK110 was first launched as the heart of NVIDIA’s Tesla K20 family of GPUs, the new cornerstone of NVIDIA’s rapidly growing GPU compute business.
Oak Ridge National Laboratory's Titan Supercomputer
Or perhaps as it’s better known, the GPU at the heart of the world’s fastest supercomputer, Oak Ridge National Laboratory’s Titan supercomputer.
The Titan supercomputer was a major win for NVIDIA, and likely the breakthrough they’ve been looking for. A fledgling business merely two generations prior, NVIDIA’s Tesla family has quickly shot up in prestige and size, much to the delight of NVIDIA. Their GPU computing business is still relatively small – consumer GPUs dwarf it and will continue to do so for the foreseeable future – but it’s now a proven business for NVIDIA. More to the point however, winning contracts like Titan is a major source of press and goodwill for the company, goodwill NVIDIA intends to capitalize on.
With the launch of the Titan supercomputer and the Tesla K20 family now behind them, NVIDIA is ready to focus their attention back on the consumer market. Ready to bring their big and powerful GK110 GPU to consumers, in typical NVIDIA fashion they intend to make a spectacle of it. In NVIDIA’s mind there’s only one name suitable for the first consumer card born of the same GPU as their greatest computing project: GeForce GTX Titan.
GeForce GTX Titan: By The Numbers
At the time of the GK110 launch at GTC, we didn’t know if and when GK110 would ever make it down to consumer hands. From a practical perspective GTX 680 was still clearly in the lead over AMD’s Radeon HD 7970. Meanwhile the Titan supercomputer was a major contract for NVIDIA, and something they needed to prioritize. 18,688 551mm2 GPUs for a single customer is a very large order, and at the same time orders for Tesla K20 cards were continuing to pour in each and every day after GTC. In the end, yes, GK110 would come to the consumer market. But not until months later, after NVIDIA had the chance to start filling Tesla orders. And today is that day.
Much like the launch of the GTX 690 before it, NVIDIA intends to stretch this launch out a bit to maximize the amount of press they get. Today we can tell you all about Titan – its specs, its construction, and its features – but not about its measured performance. For that you will have to come back on Thursday, when we can give you our benchmarks and performance analysis.
| | GTX Titan | GTX 690 | GTX 680 | GTX 580 |
|---|---|---|---|---|
| Stream Processors | 2688 | 2 x 1536 | 1536 | 512 |
| Texture Units | 224 | 2 x 128 | 128 | 64 |
| ROPs | 48 | 2 x 32 | 32 | 48 |
| Memory Clock | 6.008GHz GDDR5 | 6.008GHz GDDR5 | 6.008GHz GDDR5 | 4.008GHz GDDR5 |
| Memory Bus Width | 384-bit | 2 x 256-bit | 256-bit | 384-bit |
| VRAM | 6GB | 2 x 2GB | 2GB | 1.5GB |
| FP64 | 1/3 FP32 | 1/24 FP32 | 1/24 FP32 | 1/8 FP32 |
| Transistor Count | 7.1B | 2 x 3.5B | 3.5B | 3B |
| Manufacturing Process | TSMC 28nm | TSMC 28nm | TSMC 28nm | TSMC 40nm |
Diving right into things then, at the heart of the GeForce GTX Titan we have the GK110 GPU. By virtue of this being the 2nd product to be launched based off the GK110 GPU, there are no great mysteries here about GK110’s capabilities. We’ve covered GK110 in depth from a compute perspective, so many of these numbers should be familiar to our long-time readers.
GK110 is composed of 15 of NVIDIA’s SMXes, each of which is in turn composed of a number of functional units. Every SMX packs 192 FP32 CUDA cores, 64 FP64 CUDA cores, 64KB of L1 cache, 65,536 32-bit registers, and 16 texture units. These SMXes are in turn paired with GK110’s 6 ROP partitions, each one composed of 8 ROPs and 256KB of L2 cache, and connected to a 64-bit memory controller. Altogether GK110 is a massive chip, coming in at 7.1 billion transistors and occupying 551mm2 on TSMC’s 28nm process.
For Titan NVIDIA will be using a partially disabled GK110 GPU. Titan will have all 6 ROP partitions and the full 384-bit memory bus enabled, but only 14 of the 15 SMXes. In terms of functional units this gives Titan a final count of 2688 FP32 CUDA cores, 896 FP64 CUDA cores, 224 texture units, and 48 ROPs. This makes Titan virtually identical to NVIDIA’s most powerful Tesla, the K20X, which ships with the same configuration. NVIDIA does not currently ship any product with all 15 SMXes enabled, and though they have never really explained why – yield, power, or otherwise – if nothing else it leaves them an obvious outlet for further improving Titan’s performance: enabling that 15th SMX.
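As a quick sanity check, the per-SMX and per-partition figures above multiply out to Titan's headline totals. The unit counts come from the article; the script itself is just illustrative arithmetic:

```python
# Per-SMX resources on GK110 (per the article's breakdown)
FP32_PER_SMX = 192
FP64_PER_SMX = 64
TEX_PER_SMX = 16

# Per-ROP-partition resources
ROPS_PER_PARTITION = 8
L2_KB_PER_PARTITION = 256
BUS_BITS_PER_PARTITION = 64

# Titan ships with 14 of 15 SMXes and all 6 ROP partitions enabled
smxes, partitions = 14, 6

fp32_cores = smxes * FP32_PER_SMX                 # 2688 FP32 CUDA cores
fp64_cores = smxes * FP64_PER_SMX                 # 896 FP64 CUDA cores
tex_units = smxes * TEX_PER_SMX                   # 224 texture units
rops = partitions * ROPS_PER_PARTITION            # 48 ROPs
l2_kb = partitions * L2_KB_PER_PARTITION          # 1536KB of L2
bus_width = partitions * BUS_BITS_PER_PARTITION   # 384-bit memory bus
```

The same multiplication with 15 SMXes shows exactly what a fully enabled GK110 would add: one more SMX's worth of cores and texture units, with the ROP/memory side unchanged.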
Of course functional units are only half the story, so let’s talk about clockspeeds. As a rule of thumb bigger GPUs don’t clock as high as smaller GPUs, and Titan will be adhering to this rule. Whereas GTX 680 shipped with a base clock of 1006MHz, Titan ships at a more modest 837MHz, making up for any clockspeed disadvantage with the brute force behind having so many functional units. Like GTX 680 (and unlike Tesla), boost clocks are once more present, with Titan’s official boost clock coming in at 876MHz, while the maximum boost clock can potentially be much higher.
On the memory side of things, Titan ships with a full 6GB of GDDR5. As a luxury card NVIDIA went for broke here and simply equipped the card with as much RAM as is technically possible, rather than stopping at 3GB. You wouldn’t know that from looking at their memory clocks though; even with 24 GDDR5 memory chips, NVIDIA is shipping Titan at the same 6GHz effective memory clock as the rest of the high-end GeForce 600 series cards, giving the card 288GB/sec of memory bandwidth.
To put all of this in perspective, on paper (and at base clocks), GTX 680 can offer just shy of 3.1 TFLOPS of FP32 performance, 128GTexels/second texturing throughput, and 32GPixels/second rendering throughput, driven by 192GB/sec of memory bandwidth. Titan on the other hand can offer 4.5 TFLOPS of FP32 performance, 187GTexels/second texturing throughput, 40GPixels/second rendering throughput, and is driven by a 288GB/sec memory bus. This gives Titan 46% more shading/compute and texturing performance, 25% more pixel throughput, and a full 50% more memory bandwidth than GTX 680. Simply put, thanks to GK110 Titan is a far more powerful GPU than what GK104 could accomplish.
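Those theoretical figures fall straight out of the unit counts and base clocks. A back-of-the-envelope sketch, using the standard peak-throughput formulas (FMA counted as 2 FLOPs per core per clock; all inputs are from the article):

```python
def peak_rates(cores, tex_units, rops, core_ghz, bus_bits, mem_ghz_effective):
    """Peak theoretical throughput at base clock."""
    return {
        "fp32_tflops": 2 * cores * core_ghz / 1000,      # FMA = 2 FLOPs/clock
        "gtexels_s": tex_units * core_ghz,               # 1 texel/unit/clock
        "gpixels_s": rops * core_ghz,                    # 1 pixel/ROP/clock
        "bandwidth_gbs": bus_bits / 8 * mem_ghz_effective,
    }

gtx680 = peak_rates(1536, 128, 32, 1.006, 256, 6.008)
titan = peak_rates(2688, 224, 48, 0.837, 384, 6.008)

# Titan's advantage over GTX 680 for each metric
for metric in titan:
    gain = titan[metric] / gtx680[metric] - 1
    print(f"{metric}: +{gain:.0%}")
```

Running the numbers reproduces the article's deltas: ~46% more shading and texturing throughput, ~25% more pixel throughput, and 50% more memory bandwidth.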
Of course with great power comes great power bills, and Titan is no exception. In GTX 680’s drive for efficiency NVIDIA got the card down to a TDP of 195W with a power target of 170W, a remarkable position given both the competition and NVIDIA’s prior generation products. Titan on the other hand will have a flat 250W power target – in line with prior generation big NVIDIA GPUs – staking out its own spot on the power hierarchy, some 28%-47% higher in power consumption than GTX 680. These values are almost identical to the lower and upper theoretical performance gaps between Titan and GTX 680, so performance is growing in line with power consumption, but only just. From a practical perspective Titan achieves a similar level of efficiency as GTX 680, but as a full compute chip it’s unquestionably not as lean. There’s a lot of compute baggage present that GK104 didn’t have to deal with.
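The 28%-47% range comes from comparing Titan's 250W power target against GTX 680's 195W TDP and 170W power target respectively. Again, purely illustrative arithmetic with the wattages from the article:

```python
titan_power = 250     # W, Titan's power target
gtx680_tdp = 195      # W, GTX 680's TDP
gtx680_target = 170   # W, GTX 680's power target

# Titan's relative power increase, at each end of the range
low = titan_power / gtx680_tdp - 1      # vs GTX 680's TDP
high = titan_power / gtx680_target - 1  # vs GTX 680's power target

print(f"Titan draws {low:.0%} to {high:.0%} more power than GTX 680")
```

That ~28%-47% power range bracketing the ~25%-50% theoretical performance range is why perf-per-watt comes out roughly flat versus GTX 680.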
ehpexs - Tuesday, February 19, 2013
Great card, too rich for my blood though. For those who can afford one or two (or four) enjoy. I'll stick to my $550 pair of 7950s.
Wreckage - Tuesday, February 19, 2013
Remember the 7970's were $1100 at launch.
TheCrackLing - Tuesday, February 19, 2013
Then how did I manage to pay only $1150 for 2 at launch?
The 7970s were around $550-600 at launch, nowhere near $1100.
Wreckage - Tuesday, February 19, 2013
I was (obviously) responding to his statement "pair of 7950s". If I could edit my post I suppose I could change it to CF and up the price $50. Either way Titan is in line with AMD pricing.
Stuka87 - Tuesday, February 19, 2013
So according to your logic, we can expect to pay $2k for cards that are twice as fast as the Titan in the future?
just4U - Tuesday, February 19, 2013
Well... I recall paying almost $800 for a GeForce3 at launch. So hmmm... I don't think the high end has gone up much (if at all) over the past decade. Sometimes it comes down if NVIDIA/AMD are duking it out on pricing, but overall it's remained fairly consistent.
JonnyDough - Wednesday, February 20, 2013
And it shouldn't. These cards get cheaper and cheaper for them to produce. Their profit margins just continue to climb. In other words, we're getting poorer and poorer in comparison to the upper class but nobody is taking notice...game on, until you can't afford to live.
shompa - Wednesday, February 20, 2013
Cheaper and cheaper to produce? How do you know that? Do you have wafer prices from TSMC and GlobalFoundries? Or are you like many uneducated people who believe everything automatically gets cheaper with smaller process technology? *hint* Wafer prices go up with each shrink. That's why the majority of all microprocessors are manufactured at 65-90nm!
Look also at AMD's profit margins. Oh... they are losing money. Guess they have zero profit.
Too many uneducated people on the internet!
rupert3k - Saturday, June 1, 2013
Is that tone really necessary?
TheJian - Wednesday, February 20, 2013
ROFL. AMD lost 1.18B this last year and NV only made 725mil (with a 300mil Intel payment).
NV made ~800mil in FY07, lost money in FY08 & FY09 (lost 100mil), made 235m in '10, 530m in '11, finally hitting 725m last year... It's taken them 5 years to come close to what they USED TO MAKE.
You apparently don't read balance sheets or earnings reports.
For AMD I'll just give you the sum of the last 5-6. They lost a total of 5.1B roughly...THEY HAVEN'T MADE MONEY over the last 5 years they lost their fabs, wrote down ATI, laid off 30% of employees etc...They just lost 1.18B last year for christ's sake. You'd better PRAY amd stops giving away games and raises the price of their gpus/cpus before they go bankrupt. At the rate they are burning cash now they will be out of funding by the end of the year. Do you understand that? 5 years=5B+ in losses. Read a balance sheet once in a while before you say junk like this.
Also note, just looking at NV, they are getting ~300mil per year right now from INTC. So they aren't even making what I said! AMD's margins are at 15%! NV 53%. Regardless of what you think of the price, neither is sticking it to you compared to the performance gains they are giving you every year and what it costs to get them. AMD, looking at the entire life of the company (I'm not going to actually do it), I don't think has actually made a $1 profit...LOL. They're gouging you? Feel free to pull up every year of earnings they have had since existence. I think you'll find they have actually LOST money.
AMD's 10yr, has lost at least a few billion with a quick look. Is this computing in your head yet? In the last 10yrs AMD total has lost ~3-4Billion dollars. They aren't making SQUAT! If it weren't for large backers they'd be bankrupt ages ago. They were just downgraded by Fitch TWO grades to JUNK BOND status (just like USA two downgrades since obama took office)...In financial terms, it means NOT INVESTMENT GRADE.
They are getting killed by INTC/NVDA. Now a cadre of a good 5 players are entering their cpu/server business. They will continue to lose money and be lucky to make it to 2014 without yet another borrowing fiasco and this time and even higher rates of interest due to junk status.
NVDA finally hit record cash/revenue this year (and margins, but by a decimal, helped by Intel 300mil), after 5 years! What margins are you talking about? While a FAR better company than AMD, gaining share, entering new markets etc, NV isn't getting RICH either. Their future looks bright (AMD looks bankrupt or bought by 2014 without help), but unless you can prove otherwise they are in no way ripping you off. BOTH companies should be charging $50+ more on every card under $500. Granted the high end is what it is (middle income people don't drive Lamborghini's either), but the low end is costing them both with their current war that's been on for ~5yrs.
AMD has 1B in cash. If they lose another 1B this year that's gone, how do you think they run the company with no cash? No money coming from consoles will go on the books until the end of the year (and those sales won't be phenomenal IMHO, look at vita/3ds/wiiu failures and cuts), and mobile won't bring them a dime until mid 2014 at best on the books with no ARM until then. Are you doing the math here?
Get a better job, or quit buying things your budget can't afford! While you're at it vote in a president who is PRO BUSINESS and ANTI TAX/SPEND. Start voting for people who CUT GOVT SPENDING & TAXES, then maybe you'll pay a little less in taxes, and more of us will be working to cover it because guess what happens when you cut taxes? (see Coolidge presidency, or Reagan, Coolidge was Reagan's hero...LOL...well duh - heard of the ROARING 20's?). Companies hire workers, and people start small businesses...Which causes...Wait for it...REVENUE to come in to cover the tax cuts! It should go without saying you need to CUT spending also when doing this for it to work (they keep kicking the can down the road today, again next month).
Guess what came after the roaring 20's? A TAX AND SPENDER! What do they create? THE DEPRESSION! LOL. What is Obama creating? The depression v2.0 (well, amped up to 16.5 trillion levels I guess depression v9.9?).
He didn't bail out flood victims, didn't bail out farmers (piss off), not even his hometown state when they had a natural disaster! He definitely would have kicked out every one of the 20mil illegals YOU & I are about to pay for medically etc (actually you're already paying, just not getting the service, and the high risk fund is already broke...LOL). When you wonder why your medical is so high in 2015 and why service sucks, you can thank your president and the illegal aliens. Coolidge vetoed every spending bill that was pork too! NO PORK passed his plate. That's how you get the DEBT DOWN.
1929, 8 months later depression...LOL. How does he respond, govt works projects (spend, govt growth) etc...FAILURE. Sound familiar? "SHOVEL READY" anybody?...ROFL. He raised the top tax bracket from 25-63%...ROFL. Sound familiar? Raised taxes on Business! Sound familiar?
"Hoover's defeat in the 1932 election was caused primarily by his failure to end the downward economic spiral, although his support for strong enforcement of prohibition was also a significant factor". This country just voted in Hoover's big brother TWICE!
So take away your money, take away your rights (no drinking for you!) and spend spend spend, tax tax tax...Sound familiar? Obama attacking guns, our constitution, spying on everyone, raising taxes, killing business, taxing rich who create jobs (upping the brackets) unemployment everywhere...See the problem?
Get a better job, or vote your govt out. Those are your options...LOL. Your graphics card price is NOT the problem and even if they give it to you FREE you won't live a better life...ROFLMAO
$20 says you voted obama.
"He was concerned about all Americans, especially the working class man and woman. He wanted them to be free, independent, and self-reliant. Like Jefferson, he wanted them to be able to rise as far as their abilities permitted."
The exact opposite of WELFARE obama. "you didn't build that"
A few more for ya: "He never sought to make history for himself."... "his conservative, Constitution-based principles of government". His nickname was SILENT CALVIN.
Obama's on TV every chance he gets. Obama's nickname? THE WELFARE PRESIDENT. He golfed with tiger woods Sunday...ROFL spent 8hrs with Tiger's coach first, golfed 18 with Tiger who left, obama went another 9...LOL. No time to fix the debt though. Obama believes in destroying the constitution & your ability to be SELF RELIANT. If you rise too high, He'll have to take your money and re-distribute it...LOL. It's YOUR job to make a better life for yourself and the govt's job to get out of your way. I clearly see you think the rich OWE you and desire govt intervention to get what you think they owe you. I want to game on, but I keep having to pay for PEOPLE like you who won't look to themselves to improve their lives. :(
Reality bites eh? ;)