NVIDIA’s GeForce RTX 3060 Gets a Release Date: February 25th
by Ryan Smith on February 12, 2021 12:30 PM EST
NVIDIA this morning has sent over a quick note revealing the release date for their next GeForce desktop video card, the RTX 3060. The mainstream(ish) video card, previously revealed at CES 2021 with a late February release date, has now been locked in for a launch on February 25th, with prices starting at $329.
As a quick recap, the RTX 3060 is the next card down in NVIDIA’s Ampere architecture consumer video card stack. Using the new GA106 GPU – which is already shipping in RTX 3060 laptops – the RTX 3060 follows the traditional price/performance cadence for video card launches, with NVIDIA releasing a cheaper and lower performing video card for the mainstream-enthusiast video card market. NVIDIA’s 60-tier cards have long been the company’s workhorse parts for 1080p gaming – as well as some of their highest-volume parts in North America – and the RTX 3060 is expected to fill the same role within the Ampere/30-series family.
**NVIDIA GeForce Specification Comparison**

| | RTX 3060 | RTX 3060 Ti | RTX 2060 | GTX 1060 |
|---|---|---|---|---|
| Memory Clock | 14Gbps? GDDR6 | 14Gbps GDDR6 | 14Gbps GDDR6 | 8Gbps GDDR5 |
| Memory Bus Width | 192-bit | 256-bit | 192-bit | 192-bit |
| Single Precision Perf. | 12.8 TFLOPS | 16.2 TFLOPS | 6.5 TFLOPS | 4.4 TFLOPS |
| Tensor Perf. (FP16) | 51.2 TFLOPS | 64.8 TFLOPS | 51.6 TFLOPS | N/A |
| Tensor Perf. (FP16-Sparse) | 102.4 TFLOPS | 129.6 TFLOPS | 51.6 TFLOPS | N/A |
| Manufacturing Process | Samsung 8nm? | Samsung 8nm | TSMC 12nm "FFN" | TSMC 16nm |
| Launch Price | MSRP: $329 | MSRP: $399 | MSRP: $349 | MSRP: $249 |
NVIDIA already published most of the card's specifications back in January, including the fact that it offers 28 SMs (3584 CUDA cores) and 12GB of GDDR6 running on a 192-bit memory bus. As with previous 60-tier cards, the non-power-of-two memory bus means that NVIDIA is shipping a somewhat odd amount of memory, in this case 12GB, which is actually more than what comes on even the RTX 3080. However, with the only other option being an anemic-for-2021 6GB, NVIDIA is opting to make sure that the card isn't left wanting for VRAM capacity.
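As a quick sanity check, the headline numbers in the table can be reproduced from the core count and clocks. This is a minimal sketch assuming NVIDIA's published ~1.78 GHz boost clock; note that the 14 Gbps memory data rate is still marked as unconfirmed in the table above.

```python
# Back-of-the-envelope check of the published RTX 3060 numbers.
# Assumes a ~1.78 GHz boost clock (NVIDIA's published figure);
# the 14 Gbps GDDR6 data rate is not yet confirmed.

CUDA_CORES = 3584        # 28 SMs x 128 FP32 lanes per SM
BOOST_CLOCK_GHZ = 1.78   # assumed boost clock
MEM_SPEED_GBPS = 14      # per-pin GDDR6 data rate (unconfirmed)
BUS_WIDTH_BITS = 192

# An FMA counts as two floating-point operations per core per cycle.
fp32_tflops = CUDA_CORES * 2 * BOOST_CLOCK_GHZ / 1000
bandwidth_gbs = MEM_SPEED_GBPS * BUS_WIDTH_BITS / 8

print(f"FP32 throughput: {fp32_tflops:.1f} TFLOPS")   # ~12.8
print(f"Memory bandwidth: {bandwidth_gbs:.0f} GB/s")  # 336
```

Both figures line up with the table: roughly 12.8 TFLOPS of FP32 and 336 GB/s of memory bandwidth, the latter identical to the RTX 2060 despite the much larger memory pool.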
Meanwhile, for better or worse the RTX 3060 is all but guaranteed to fly off shelves quickly. With every video card more powerful than a GTX 1050 Ti seemingly getting shanghaied into mining Ethereum, desperate gamers will be fighting with hungry miners for supplies. Even with the 192-bit memory bus, I would be shocked if the RTX 3060 weren't profitable for mining, especially with Ethereum reaching record highs. So for anyone thinking of grabbing the card, best be prepared to camp out at your favorite retailer or e-tailer on that Thursday morning.
On a final note, unlike the other RTX 30 series cards launched to date, NVIDIA will not be producing any Founders Edition cards for the RTX 3060 series. So all of the cards released will be AIB cards with their own respective designs. And, if tradition holds, don't be surprised if we see the AIBs outfit their cards with premium features and raise their prices accordingly.
Comments
Sychonut - Friday, February 12, 2021
Looking forward to the unavailability.
Tomatotech - Friday, February 12, 2021
Too little, too late, too much money. I was going to upgrade from a 1060, but have abandoned all hope of upgrading. I bought a cloud gaming subscription and am very happy with it. Currently on GeForce Now. £50 per year (£25 every 6 months) for a 1080 / 2080 (varies) for many (but not all) of my favourite games is mighty fine.
Your loss, NVIDIA.
mobutu - Saturday, February 13, 2021
The GeForce Now subscription you pay monthly/yearly goes to NVIDIA, so they haven't lost anything.
Tomatotech - Saturday, February 13, 2021
£25 every six months and I’ll leave for a different service the moment a better option comes up, versus however many hundreds of pounds a new 3000 series card costs?
GeForce Now is probably profitable (if not running at a small loss as it’s early days for them) as it runs (I think) on unused cloud capacity, but the profit margin can’t be as large as selling a whole card to me. So yes it’s their loss. Not that they care, as they don’t have any spare cards to sell anyway.
Yojimbo - Saturday, February 13, 2021
I wouldn't venture to guess the economics of GeForce Now versus cards. I think not even NVIDIA really knows that. There's a certain unknown longer-term value to the growth of the service regardless of the near-term financial implications. And NVIDIA's purpose with building GeForce Now is to build a service that won't have a better option, exactly like their purpose with selling GPU modules is to offer graphics cards that don't have a better option. My guess is that NVIDIA's margins on GeForce Now should eventually be greater than for selling modules to AIBs, because they are providing more of the service. And whatever other service you sign up for may well be buying their cards from NVIDIA.
An argument of greater liquidity in the gaming as a service market is not an argument against gaming as a service, as NVIDIA may gain from that liquidity as much as they lose from it (You may not be locked into a 300 pound purchase, but neither is the buyer of AMD cards). And as mentioned, even if you switch to another service you may be using NVIDIA's cards.
But I don't see why you are blaming NVIDIA here as if they don't want to sell a card to you, or as if they wouldn't be making more cards to sell to you if they could. Automobile plants are completely shutting down because of a lack of microchips. There are all sorts of problems with supplies at the moment.
Lucky Stripes 99 - Friday, February 12, 2021
Can't say that I am happy to see the upward trend in power consumption continue. Looks like an RTX 3050 will be in my future.
Yojimbo - Saturday, February 13, 2021
That's going to be a long-term trend in computing.
RSAUser - Sunday, February 14, 2021
It won't. NVIDIA is about at the max for their higher-end cards; the lower end might go up a bit more than this, but I doubt it tbh.
This is mostly a case of moving to a process node that's slightly worse for power while trying to clock it at its highest it can hold before issues arise, notice how they all have about 1850 as top clocks.
You should treat these the same as AMD hot cards were, clock them 10% lower than their max and enjoy the large reduction in power and fan noise.
Yojimbo - Monday, February 15, 2021
The power will continue to go up because Dennard scaling has broken down. The limit on the power is a limit on how much power can be delivered and how much heat can be dissipated economically in a client setting. People already get triple-fan solutions, so eventually they can push it up to 400 or 500 watts. That's why NVIDIA introduced the 12-pin connector.
NVIDIA introduced more power-efficient architectures with Kepler and Maxwell. And in the Pascal generation, finFETs were introduced. Those things kept power requirements from creeping up for a good 6 years. But there are fewer and fewer efficiency gains to wring out as the architectures mature, and the power scaling going from planar FETs to finFETs was a one-time thing. Samsung's 8 nm process is probably a bit behind the curve in terms of power efficiency, so I wouldn't expect an increase in power usage next generation, but in subsequent generations it's going to go up.
486 machines came with about 200 watt power supplies. It's been a long term trend to increase power usage, but more recently that seems to have been increasing faster, and that despite power efficiency now being a primary design parameter even in desktop processors (it's a primary parameter now because of the breakdown in Dennard scaling, and the desktop was the last segment to be concerned with power efficiency, after mobile and server. One can probably look at Intel's Alder Lake, with its inclusion of the small Gracemont cores, as a continuation in that direction).
BlueScreenJunky - Saturday, February 13, 2021
Do they have a release date for the 3080 in the EU?