14 Comments
Wreckage - Friday, August 10, 2012 - link
According to AMD: http://www.marketwatch.com/story/amd-reports-secon...
"Overall weakness in the global economy, softer consumer spending"
How could NVIDIA increase revenue if what AMD says is true...
Roland00Address - Friday, August 10, 2012 - link
Nvidia won the design wins for the summer notebooks as well as the refresh of the Macs. Most computers with dedicated graphics are coming with Nvidia 620M to 670M parts, while the Radeon 7700 and 7800 series cards are practically nonexistent in summer laptops. This is ironic, since the previous year the situation was reversed.

Tegra is also selling well in the tablet business (not so well in the phone business). It helps that Tegra 2 was Google's design platform for tablets.
ExarKun333 - Saturday, August 11, 2012 - link
LOL, true. NV and Intel can make money, but AMD still struggles. Interesting how that never changes...

CeriseCogburn - Tuesday, November 6, 2012 - link
AMD = CYA for fanboy consumption. It worked wonderfully for AMD: right here, the AMD fanboys were spewing the company line like sheeplike parrots and piling on the moaning attack that nVidia lost market share to AMD, blah blah blah.
So the thing is, whatever lie AMD tells its many deprived and deluded minions, they will swill down with drunken abandon, regurgitate in sensationalized form, and top off with their own mental farts from their empty, factless heads in attacks on Intel and nVidia.
Thus the work of correcting the AMD fanboy becomes onerous.
Guspaz - Friday, August 10, 2012 - link
The ironic thing, considering that nVidia is a GPU manufacturer, is that Tegra's weak point has always been their GPU. It's hard to differentiate on the CPU if you're just another ARM licensee. Qualcomm does, since they design the CPU themselves, and nVidia has tried with the fifth low-power core, but ultimately there isn't much of a difference CPU-wise between a Tegra 3 and any other quad-core Cortex A9 chip.

The place where we do tend to see differentiation is with the GPU, since there are currently four popular mobile GPU brands (PowerVR SGX, ARM Mali, Qualcomm Adreno, and nVidia GeForce ULP). To date, however, the GeForce ULP has always been significantly slower than the competition.
Tegra 4 ("Wayne") will certainly benefit from the jump to the Cortex A15, which by all accounts should be a beast of a chip. But everybody else will be making the jump to the A15 as well (excepting Qualcomm, who is already shipping an equivalent product). The real question will be, is Wayne's GPU going to be competitive? Because that hasn't been the case for any Tegra parts to date.
The mobile GPU space is a very interesting one, a space that is not yet mature enough that it can be free of surprises. Many people (myself included) were very surprised when ARM's Mali came out of nowhere to take the performance per watt crown. Surprised because it was basically ARM's first GPU, and I never expected such a competitive product from a first-time GPU designer.
eddman - Friday, August 10, 2012 - link
That's probably because they did it on purpose: focus on CPU cores, and make the GPU simply "good enough", not great.

Tegra 3 with 4+1 cores is about 80 mm^2. For comparison, the Exynos 4210 is 118 mm^2 and the Apple A5 is 120 mm^2, and both are dual-core chips.
Of course, Nvidia uses a 40 nm process while Samsung and Apple use 45 nm, but that alone wouldn't account for such a difference.
See what I mean.
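A rough back-of-the-envelope check of that claim (just a sketch, assuming ideal area scaling between nodes, which real chips never fully achieve, and using the die sizes quoted above):

```python
# Idealized die-area scaling check: area scales with the square of the
# feature-size ratio. Real layouts shrink less than this, so treat the
# result as an optimistic bound.
tegra3_area_40nm = 80.0   # mm^2, Tegra 3 (4+1 cores) on 40 nm
exynos_4210_area = 118.0  # mm^2, dual-core, 45 nm
apple_a5_area    = 120.0  # mm^2, dual-core, 45 nm

scale_45_to_40 = (40 / 45) ** 2                      # ~0.79
tegra3_at_45nm = tegra3_area_40nm / scale_45_to_40   # ~101 mm^2

print(f"Tegra 3 rescaled to 45 nm: ~{tegra3_at_45nm:.0f} mm^2")
print(f"Exynos 4210: {exynos_4210_area:.0f} mm^2, Apple A5: {apple_a5_area:.0f} mm^2")
```

Even with the node difference factored out, the quad-core Tegra 3 would still come in noticeably smaller than either dual-core part, which is consistent with the GPU being kept deliberately modest.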
fm123 - Sunday, August 12, 2012 - link
Yeah, size and cost-wise it was sufficient until Kepler and the move to 28 nm.

fm123 - Sunday, August 12, 2012 - link
They've been working on Kepler, and mobile versions of Kepler will come out. In the meantime they used the existing GPU (several years old), since they knew Kepler was nearly finished.

In AnandTech's battery life test, the Tegra 3 got about 16% longer talk time than Krait, even at 40 nm compared to 28 nm for Krait. So on low-level functions it lasted longer despite the older, larger process. For people who mainly do calls and emails/texts that's important, especially once it moves to 28 nm.
Death666Angel - Monday, August 13, 2012 - link
And it is much worse in the WiFi and 3G web browsing tests (comparing the international One X vs. the AT&T One X). If you buy a smartphone just to use it for talking, great! But then you bought the wrong product. ;)

fm123 - Wednesday, August 15, 2012 - link
The point is, once they move to 28 nm and you compare builds on the same process, the design appears to have some noticeable benefits. On paper the 28 nm part should have been better in all the tests, but it was not.

jjj - Friday, August 10, 2012 - link
Tegra has better margins than GPUs, and 28 nm margins are lower than 40 nm margins (for now).

As for the declining GPU business: there is no drop in attach rates in notebooks, since OEMs don't really have anything to differentiate high-end systems from lesser ones except the GPU. In desktop, most of the market is DIY, and they might be getting some help there from higher-resolution screens.

Also note that last year consumer was 591 million and Tegra was only 360 million. Nvidia said it expects Tegra to grow at least 50% this year (got a feeling they have one more high-profile tablet coming, and not Win RT).
If Grey comes anytime soon, or ever, there is quite a bit of upside there too.
maximumGPU - Saturday, August 11, 2012 - link
Thanks for an interesting read; it was better than the write-ups from financial websites whose sole purpose is to analyse companies!

vasanthakumar - Saturday, August 11, 2012 - link
nVIDIA may see eroded margins in smartphones in the coming days. I really want to see how Broadcom fares, despite the fact that they are quite late; on the power consumption front they are positioned better than nVIDIA.

Where power consumption for mobile devices is concerned, the Tegra processors have fared poorly against TI, Qualcomm, and Broadcom. nVIDIA has a very good design team, but integration and the use of unmodified ARM cores are their weak points.

The graphics processor is also not so great in terms of power consumption.

nVIDIA's great advantage is their marketing team. They are aggressive in chasing the numbers.
Dribble - Monday, August 13, 2012 - link
Nvidia has done well because they released an extra chip in between everyone else's, i.e. most companies went from dual-core A9 at 40 nm straight to something much faster at 28 nm.
Tegra 3 is a bridging chip, which has enabled them to capture the market before 28 nm is cheap and plentiful.
Also, if anything, I wouldn't say it's marketing, it's software. I suspect a big reason both Google and MS went with Nvidia first is that Nvidia had not only the chip but also a software team to write the drivers and help get everything working so the final product isn't released late.