Kaveri: Aiming for 1080p30 and Compute

The numerical differences between Kaveri and Richland are easy enough to rattle off – later in the review we will be discussing these in depth – but at a high level AMD is aiming for a middle ground between the desktop model (CPU + discrete graphics) and Apple’s Mac Pro dream (offloading compute onto different discrete graphics cards) by doing the dream on a single processor. At AMD’s Kaveri tech day the following graph was thrown in front of journalists worldwide:

With Intel now on board, processor graphics is a big deal. You can argue whether or not AMD should continue to use the acronym APU instead of SoC, but the fact remains that it's tough to buy a CPU without an integrated GPU.

In the absence of vertical integration, software optimization always trails hardware availability. If you look at 2011 as the crossover year when APUs/SoCs took over the market, it's not much of a surprise that we haven't seen aggressive moves by software developers to truly leverage GPU compute. Part of the problem has been the programming model, which AMD hopes to address with Kaveri and HSA. Kaveri enables a full heterogeneous unified memory architecture (hUMA), such that the integrated GPU can address the same breadth of memory as the CPU, putting a compute device with access to up to 32GB of memory into the hands of developers.

One of the complexities of compute is also time: without HSA and hUMA, getting the CPU and GPU to communicate with each other involves non-trivial overhead. HSA removes much of that overhead by allowing the CPU and GPU to work on the same data set at the same time, effectively opening up all of the compute resources to a single task without asynchronous memory copies or expensive coherency checks.
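The difference can be sketched in a few lines of Python (a conceptual illustration only, not AMD's API; the function names `dispatch_copy` and `dispatch_shared` are hypothetical stand-ins for the two dispatch models):

```python
# Conceptual sketch: pre-HSA dispatch stages data through a separate
# device buffer, while the HSA/hUMA model lets the GPU kernel touch
# host memory directly. All names here are illustrative.

def gpu_kernel(data):
    """Stand-in for GPU work: square each element."""
    return [x * x for x in data]

def dispatch_copy(host_data):
    """Traditional model: copy to the device, compute, copy back."""
    device_buffer = list(host_data)        # host -> device copy
    device_result = gpu_kernel(device_buffer)
    return list(device_result)             # device -> host copy

def dispatch_shared(host_data):
    """hUMA-style model: the kernel reads host memory directly,
    so no staging copies or coherency flushes are required."""
    return gpu_kernel(host_data)           # zero-copy access

data = [1, 2, 3, 4]
# Both paths produce the same result; the shared path just skips
# the two copies that bracket the traditional dispatch.
assert dispatch_copy(data) == dispatch_shared(data) == [1, 4, 9, 16]
```

On real hardware the two copies in `dispatch_copy` are where the non-trivial overhead lives, which is why eliminating them matters more for small, frequent kernel launches than for one large batch job.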

The issue AMD has with their HSA ecosystem is the need for developers to jump on board. The analogy oft cited is that on Day 1, iOS had very few apps, yet today has millions. Perhaps a small equivocation fallacy comes in here – Apple is able to manage their OS and system in its entirety, whereas AMD has to compete in the same space as non-HSA enabled products and lacks that control. Nevertheless, AMD is attempting to integrate programming tools for HSA (and OpenCL 2.0) as seamlessly as possible into all modern platforms via the HSA Intermediate Language (HSAIL). The goal is for programming languages like Java, C++, and C++ AMP, as well as common acceleration API libraries and toolkits, to provide these features at little or no coding cost. This is something our resident compute guru Rahul will be looking at in further detail later on in the review.

On the gaming side, 30 FPS has been a goal for AMD’s integrated graphics solutions for a couple of generations now.

Arguably we could say that any game should be able to do 30 FPS if we turn down the settings far enough, but AMD has put at least one restriction on that: resolution. 1080p is a lofty goal to hold at 30 FPS with some of the more challenging titles of today. In our testing for this review, it was clear that users had a choice – start with a high resolution and turn the settings down, or keep the settings on medium-high and adjust the resolution. Games like BF4 and Crysis 3 are going to tax any graphics card, especially when additional DirectX 11 features come into play (ambient occlusion, depth of field, global illumination, and bilateral filtering are some that AMD mentions).

Comments

  • bobbozzo - Tuesday, January 14, 2014 - link

    Hi, both the 45w and 65w tables on page 1 list the A8-7600.
    AFAICT, the A8-7600 is 65w, so a different part number should probably be on the 45w table.

    I'm looking forward to an HTPC comparison between Kaveri and Haswell.
    CPU performance differences seem mostly irrelevant nowadays for HTPC, but I'm wondering which would have better 4k playback, etc.

    Also wish some AMD ITX boards would be made with some decent (non-Realtek) NICs.

    thanks!
  • T1beriu - Tuesday, January 14, 2014 - link

    "You may notice that the Kaveri model listed is the same model listed in the 45W table. This is one of the features of AMD’s new lineup – various models will have a configurable TDP range, and the A8-7600 will be one of them."
  • hyperspaced - Tuesday, January 14, 2014 - link

    I am looking to build a nice HTPC to run XBMC, maybe with some casual gaming.
    The A8-7600 @45W is ideal for my needs: decent CPU performance, killer GPU at $120...

    I'm pretty sure those A8's will sell like hot cakes.
  • bobalazs - Tuesday, January 14, 2014 - link

    A8-5500 is 65W TDP and not 45W as stated on page "Testing Platform"
  • Ryan Smith - Tuesday, January 14, 2014 - link

    Thanks!
  • lorribot - Tuesday, January 14, 2014 - link

    I still don't get the point of integrated graphics for anything other than general 2D use. In 3D gaming they are of no real use on any sensibly sized screen. They waste die space, cost huge amounts extra to develop, and if you put a dGPU in they have absolutely no use at all.
    Granted, if you can offload all the FP to the GPU and just do integer work on the CPU there may be some benefit, but that is a long way off, probably another two or three years before it becomes a reality.
    For the majority of user scenarios Intel are in the right area, AMD are way off the mark with their sickly CPU designs, and by the time iGPU performance becomes a real issue Intel will have the hardware in place with their faster development cycles.
  • nos024 - Tuesday, January 14, 2014 - link

    Totally agreed. Two years ago I was excited about AMD's direction with APUs, but after buying the first generation I realized I was not satisfied with the performance (even as a casual gamer) and ended up getting a discrete GPU. Two years or so later, this is still the case. The sad reality is that once these APUs reach 1080p at 60fps, the world will have moved on to 4K at 60fps.
  • andrewaggb - Wednesday, January 15, 2014 - link

    Yeah. The PS4 and XB1 designs both addressed the shortcomings, the XB1 with eDRAM and the PS4 with GDDR5. As AMD helped design both, I don't see why they couldn't have done a solution similar to one or the other, with eDRAM seeming the most obvious choice due to cheap DDR3. Even if it doesn't sell extremely well, it's tech they already have, and it would be the first time in a long time that they could come out and show an overwhelming lead over Intel with enough performance to actually meet consumer needs.

    Their current offering, though having better iGPU performance than Intel's, can't run modern titles at good enough quality to get taken seriously.

    Releasing something today with XB1 equivalent gpu speed and roughly equivalent multi-core performance would have drawn a crowd and a whole lot of positive talk and excitement. Combined with Mantle, Steam OS, and already having XB1 and PS4, they could have really had some momentum, even with their comparatively lousy cpu performance.

    Feels like a real opportunity wasted to me. Sure, they could announce that next year... but it's another year for Intel, NVIDIA, and who knows who else to catch up. Being first to market with time to capitalize can pay off for years and years. Look at the iPhone and App Store: that comes from being out first with no serious competition for a long period of time. If AMD had an XB1-equivalent APU today, plus Mantle, I guarantee they'd get lots of Mantle support and have a good couple of years at least.
  • TEAMSWITCHER - Tuesday, January 14, 2014 - link

    I very much agree with your sentiment. I have absolutely no use for the graphics that are included with these devices...AMD or Intel...it's all garbage to me. Even without a GPU, I would still buy an Intel 4th Generation Core processor because it's the best at what I need a CPU to do. Intel is wasting hundreds of millions of transistors on every desktop processor they manufacture. Think of the incredibly small dies and high yields they could have enjoyed.

    And all those who would have been interested in this crap now have even lower-cost ARM tablets that can easily replace the low-cost PCs these devices were meant to power. That's a fitting end given the "our crap is slightly better than your crap" game they wanted to play.

    The industry should abandon Integrated Graphics on all desktop processors. Intel's upcoming Haswell-E parts should be the ONLY desktop parts. Any part with a GPU should be reserved for laptops, convertibles, tablets, or other small form factor segments.
  • nader_21007 - Saturday, January 18, 2014 - link

    "In 3D gaming they are of no real use on any sensibly sized screen." You are either blind or an Intel fanboy. I agree that Intel's IGP is pointless because it's very weak, but AMD's APUs can play every game at 720p, or even 1080p, comfortably. If you have higher expectations, you have to pay for a pricey graphics card. Hardcore gaming is not free, LOL. Tell me if you could get a better IGP/GPU from Intel.
