The AMD Radeon VII Review: An Unexpected Shot At The High-End
by Nate Oh on February 7, 2019 9:00 AM EST

Benchmarking Testbed Setup
To preface, because of the SMU changes mentioned earlier, no third-party utilities can read Radeon VII data yet, though patches are expected shortly. AIB partner tools such as MSI Afterburner should presumably launch with support. Otherwise, Radeon WattMan was the only monitoring tool available, except we observed that its performance metric logging and overlay sometimes caused issues with games.
On that note, a large factor in this review was the instability of the press drivers. Known issues include the inability to downclock HBM2 on the Radeon VII (which AMD clarified was a bug introduced in Adrenalin 2019 19.2.1) and system crashes when the WattMan voltage curve is set to a single min/max point. There are also DX11 game crashes, which we ran into early on, and which AMD is looking into.
For these reasons, we won't have Radeon VII clockspeed or overclocking data for this review. To put it simply, these types of issues are mildly concerning; while Vega 20 is new to gamers, it is not new to AMD's drivers, and if the Radeon VII was indeed always in the plan, then game stability should have been a priority. Despite being something of a prosumer card, the Radeon VII is still AMD's new flagship gaming card. There's no indication that these are more than teething issues, but it does lend a little credence to the idea that the Radeon VII was launched as soon as feasibly possible.
| Test Setup | |
|---|---|
| CPU | Intel Core i7-7820X @ 4.3GHz |
| Motherboard | Gigabyte X299 AORUS Gaming 7 (F9g) |
| PSU | Corsair AX860i |
| Storage | OCZ Toshiba RD400 (1TB) |
| Memory | G.Skill TridentZ DDR4-3200 4 x 8GB (16-18-18-38) |
| Case | NZXT Phantom 630 Windowed Edition |
| Monitor | LG 27UD68P-B |
| Video Cards | AMD Radeon VII, AMD Radeon RX Vega 64 (Air), AMD Radeon R9 Fury X, NVIDIA GeForce RTX 2080, NVIDIA GeForce RTX 2070, NVIDIA GeForce GTX 1080 Ti |
| Video Drivers | NVIDIA Release 417.71, AMD Radeon Software 18.50 Press |
| OS | Windows 10 x64 Pro (1803), Spectre and Meltdown Patched |
Thanks to Corsair, we were able to get a replacement for our AX860i. While the plan was to use Corsair Link as an additional datapoint for power consumption, for the reasons mentioned above that was not feasible this time. On that note, power consumption figures will differ from earlier GPU 2018 Bench data.
In the same vein, for Ashes, GTA V, F1 2018, and Shadow of War, we've updated some of the benchmark automation and data processing steps, so results may vary at the 1080p mark compared to previous GPU 2018 data.
289 Comments
Alistair - Thursday, February 7, 2019 - link
Because everyone is already playing Anthem at 4K 60fps with a $400 card? Ray tracing is totally useless, and we need way more rasterization performance per dollar than we have right now. Give me a 7nm 2080 Ti without the RT cores for $699 and then we'll talk.

eva02langley - Friday, February 8, 2019 - link
Fair, the main objective of a gaming GPU is shaders per $. GameWorks gimmicks are not something I'd call a selling factor... and Nvidia is forced to cook their books because of it.

RSAUser - Thursday, February 7, 2019 - link
Why are you adding the Final Fantasy benchmark when it has known bias issues?

Zizy - Thursday, February 7, 2019 - link
Eh, the 2080 is slightly better for games and costs the same, while unfortunately MATLAB supports just CUDA, so I can't even play with compute.

Hul8 - Thursday, February 7, 2019 - link
On page 19, the "Load GPU Temperature - FurMark" graph is duplicated.

Ryan Smith - Thursday, February 7, 2019 - link
Thanks. The FurMark power graph has been put back where it belongs.

schizoide - Thursday, February 7, 2019 - link
Man, I've never seen such a hostile response to an AnandTech article. People need to relax, it's just a video card.

I don't see this as a win for AMD. Using HBM2, the card is expensive to produce, so they don't have a lot of freedom to discount it. Without a hefty discount, it's louder, hotter, and slower than a 2080 at the same price. And of course there's no ray tracing, which may or may not matter, but I'd rather have it just in case.
For OpenCL work it's a very attractive option, but again, that's a loser for AMD because they ALREADY sold this card as a workstation product for a lot more money. Now it's discounted to compete with the 2080, meaning less revenue for AMD.
Even once the drivers are fixed, I don't see this going anywhere. It's another Vega64.
sing_electric - Thursday, February 7, 2019 - link
There are still a lot of people for whom a Radeon Instinct was just never going to happen, INCLUDING people who might have a workstation where they write code that will mostly run on servers; it means you can run/test your code on your workstation with a fairly predictable mapping to final server performance.

As Nate said in the review, it's also very attractive to academics, which benefits AMD in the long run if, say, a bunch of professors and grad students learn to write ML/CL on Radeon before starting or joining companies.
schizoide - Thursday, February 7, 2019 - link
Yes, it's attractive to anyone who values OpenCL performance. They're getting workstation-class hardware on the cheap. But that does devalue AMD's workstation product line.

Manch - Thursday, February 7, 2019 - link
Not really. The instinct cards are still more performant. They tend to be bought by businesses where time/perf is more important than price/perf.