Comments Locked

154 Comments

  • Drumsticks - Wednesday, November 18, 2015 - link

    If AMD really could get a 40% single threaded performance boost on their CPUs for Zen, and they can do it no later than Kaby Lake, then they really might get a moment to breathe. That puts single threaded performance right around Intel's i3 parts, and would put multi-threaded performance (and likely graphics, although that's a different story) well ahead. It's not going to take back the desktop market overnight, but it would be enough to get PC builders and maybe some OEMs interested and get enough volume moving for them to survive.

    Even if we budget a 10% IPC boost for Intel in Kaby Lake, that puts their i3's barely ahead, and still probably significantly behind in multi threaded performance compared to a 4 core Zen part. Here's hoping for an AMD recovery! I'd love to recommend AMD parts in more than just the $300 region now. Even if Zen only gets a single OEM to genuinely notice AMD, it will be an improvement.
  • V900 - Wednesday, November 18, 2015 - link

    You seriously think AMD is going to sell a 4 core Zen processor for the same amount that a dual core Intel i3 sells for?

    In that case I got a bridge to sell you!

    Make no mistake, AMD doesn't sell cheap APUs out of the goodness of their hearts.

    The reason they're the budget option is because they don't have anything remotely competitive with Intel's Core CPUs, and therefore only can compete on the very low end of the market.

    If their Zen core turns out to be on par with an Intel processor, they'll sell it at the prices Intel charges, or slightly lower.

    You won't see a quad-core Zen selling for roughly the same price Intel charges for an i3. You'll have AMD selling their quad-core Zen for the same $300 Intel charges for an i5.
  • yankeeDDL - Wednesday, November 18, 2015 - link

    I don't fully agree.
    Yes, AMD's IPC is much lower than Intel's, and there's a gap in energy efficiency (although much reduced with Carrizo).
    But, as you correctly indicate, AMD prices their chips accordingly. So at ~$120, the A8/A10 are extremely attractive, in my opinion. For home users, who have the PC on for a relatively small fraction of the time, having more cores and an excellent GPU (compared to Intel's at those price points) is quite beneficial.
    Skylake changes things a bit, but up to and including Haswell, the performance of Intel's Core i3 in the low $100s was easily beaten.
  • Dirk_Funk - Wednesday, November 18, 2015 - link

    I don't think he/she said a single word about how zen would be priced. I don't know why you responded this way. Also, i5 sells for like $200-$250.
  • Aspiring Techie - Wednesday, November 18, 2015 - link

    If Zen is as good as advertised, then AMD can afford to increase the price of their CPUs by 20%. This would put their quad-cores in the $130-150 range, way cheaper than Intel's i5s. Granted, even Zen won't be as good as Kaby Lake. If AMD's performance per clock is 60% of Intel's, then Zen's will be about 84% of Intel's (1.4 × 0.60 = 0.84). Add in much better power efficiency (because the microarchitecture will have fewer pipeline stages) and possibly more cache with the smaller process node, and you get roughly 85% of i5 performance for $30 less. This doesn't even consider their APUs, which could still be priced at near-i3 levels. They would beat the crap out of i3s and sometimes i5s (if HSA is utilized).

    Bottom line: Zen is AMD's last chance. AMD probably won't make the stupid mistake of pricing their CPUs too high. If they do, then bye-bye AMD for good.
  • JoeMonco - Wednesday, November 18, 2015 - link

    "Bottom line: Zen is AMD's last chance. AMD probably won't make the stupid mistake of pricing their CPU's too high. If they do, then bye-bye AMD for good."

    Because if AMD is known for anything it's for its great business decisions. rofl
  • medi03 - Thursday, November 19, 2015 - link

    Yeah, that's why they are in both major consoles at the moment, because of the "bad" business decisions.
  • Klimax - Thursday, November 19, 2015 - link

    There's a reason why Intel was uninterested in consoles. AMD barely makes any money on them...
  • Kutark - Thursday, November 19, 2015 - link

    ^ This. They're in the consoles because it was a massive volume order of parts, and MS and Sony were looking to save as much as possible; fractions of a dollar per part matter when you're paying for literally millions of parts.
  • anubis44 - Sunday, November 29, 2015 - link

    The consoles are still providing AMD with a solid, baseline income every year, and their presence in consoles also makes games easier to port to AMD's architecture, something that will become more apparent with DX12, since consoles are already using a DX12/Mantle-like API. AMD's decision to sweep the consoles and push Intel and nVidia out of them will have longer term repercussions than many realize. AMD is also almost certain to win the next generation of consoles, too, with Zen-based APUs and Greenland-type graphics with HBM. In fact, AMD will probably release something like that for the mainstream PC market by 2017, and nVidia will be relegated to only the high end of add-in graphics: AMD will be putting solidly mainstream graphics into their APUs, and an add-in mainstream AMD card will simply crossfire with the built-in graphics.
  • jfelano - Monday, December 28, 2015 - link

    Oh, they were interested; they just didn't get the contracts because AMD was on the ball and the console makers had previous experience using AMD. You're talking billions of dollars, and Intel was interested in billions of dollars, believe me.
  • JoeMonco - Thursday, November 19, 2015 - link

    So the $158 million operating loss last quarter was due to making great business decisions?
  • KranK_ - Friday, November 20, 2015 - link

    AMD is in both consoles... and yet they continue to NEVER make a profit. All AMD does is lose money. Have you seen their quarterly financial reports? lmao
  • Alexvrb - Saturday, November 21, 2015 - link

    Yeah if you look at them, you'll notice they're making money off the consoles. It's not nearly enough to offset the gushing losses elsewhere, but it is a positive flow of revenue. If they didn't have those sales, they'd be that much further in the hole.

    In fact, their console wins are probably the best thing they've done recently. They need to execute well with initial Zen-based designs and Arctic Islands alike. After that, they need to push HBM2 down into their APUs and to continue improving upon the foundation Zen has laid.

    I'll be very interested to see what clocks they can hit while balancing power on the new node.
  • sld - Thursday, November 19, 2015 - link

    Those who laugh at AMD are those who enjoyed the pricey new CPUs from Intel's near-monopoly.
  • Kutark - Thursday, November 19, 2015 - link

    Yes, and the excellent framerates that come with it. Oh, did I mention how I paid ~$300 for a CPU 4.5 years ago and it still whips any 4-core AMD can offer me? 4.5 years after the fact.

    If I had bought an AMD in 2011, how many times would I have had to replace it by now? At least once, and I'd probably be due for another one. My 2600K, even at stock speeds, is faster than anything AMD can offer in a 4-core right now, and mine's been sitting at 4.3GHz rock stable on air cooling since I bought it.
  • silverblue - Friday, November 20, 2015 - link

    Four cores or four modules?
  • fokka - Sunday, November 22, 2015 - link

    good thing then that amd has been offering more than 4 cores in the consumer sector for quite a while.
  • JoeMonco - Thursday, November 19, 2015 - link

    Or we simply laugh at AMD because trainwrecks can be amusing.
  • darkfalz - Saturday, November 21, 2015 - link

    Intel has competition though - from its previous generation(s). They need to convince people to upgrade from a price/performance perspective.
  • Archetype - Wednesday, December 9, 2015 - link

    As far as processors are concerned. AMD needs a few crowd-pleasers again. Although I personally will always appreciate that they push some important boundaries in PC technology - Very often at no gain to themselves since they like to promote open standards - They will need to build up good will through good value and performance - Somewhere between mainstream and enthusiast. Would not hurt to shine a bit in the enthusiast market either.
  • syryquil1 - Thursday, January 17, 2019 - link

    Rip you I guess.
  • silverblue - Wednesday, November 18, 2015 - link

    I don't see how it'd be too difficult for AMD to get that 40%; K10 and Bulldozer both had bottlenecks that have since been identified. As for single threaded performance, wouldn't i3 be similar to i7?
  • KAlmquist - Wednesday, November 18, 2015 - link

    AMD's target is a 40% IPC improvement over Excavator, not over K10 or Bulldozer.
  • silverblue - Wednesday, November 18, 2015 - link

    I know; I was referring to Bulldozer in terms of the architectural family. Excavator may be much improved, but it's still Bulldozer in the end.
  • Flunk - Wednesday, November 18, 2015 - link

    No, i3s aren't similar to i7s single-threaded, because i7s have more cache and in most cases higher clock speeds (except the U-series, where the i3, i5 and i7 distinctions don't really exist).
  • silverblue - Wednesday, November 18, 2015 - link

    There ARE highly clocked i3s out there, so a comparison can certainly be made at the same clock speed. Cache will make a difference but not a huge one.

    http://www.anandtech.com/bench/product/1197?vs=836

    The i3 has a higher base clock, sure, but lacks turbo, which helps propel the i7 to 3.9GHz. I know, it's not a like-for-like comparison, but that i3 can certainly hold its own at gaming and single-threading in general.
  • gamervivek - Thursday, November 19, 2015 - link

    For some reason, AMD GPU performance suffers on an i3 while it's alright with an i5. See the single threaded draw call results here, where the i5 can do 60% more draw calls.

    http://www.eurogamer.net/articles/digitalfoundry-2...
  • medi03 - Thursday, November 19, 2015 - link

    4 cores vs 2 dude
  • BurntMyBacon - Thursday, November 19, 2015 - link

    @medi03: "4 cores vs 2 dude"

    In response to the OP which included this statement:

    @gamervivek: "See the single threaded draw call results here, where the i5 can do 60% more draw calls."

    Am I missing something here?
  • tipoo - Thursday, November 19, 2015 - link

    In DX12 that will matter; in DX11, going from single threaded to multithreaded draw calls doesn't help. Check the link.
  • tipoo - Thursday, November 19, 2015 - link

    On the other hand " Also note that the FX 6300's DX12 results on R7 260X, GTX 970 and R9 290X comprehensively beat the more expensive Core i3 4130."
  • silverblue - Thursday, November 19, 2015 - link

    A lot may simply be down to the number of floating point units: the Pentium has two, the i3 has two with turbo and Hyper-Threading, and the FX-6300 has three with turbo and a sort of hyperthreading mode, since the FlexFPU can handle two 128-bit instructions at once or work together on a single 256-bit instruction, though Piledriver is horrendously crippled with the latter.
  • Kutark - Thursday, November 19, 2015 - link

    I feel like a "bruh" would have been better here instead of "dude".
  • Morawka - Thursday, November 19, 2015 - link

    No, because of cache size and more actual hardware on the i7's die.
  • hojnikb - Saturday, November 21, 2015 - link

    What kind of actual hardware does the i7 have that the i3 doesn't?

    I mean, an i3 is really just an i7 cut in half.
  • eanazag - Wednesday, November 18, 2015 - link

    I fully expect AMD to disappoint with Zen. I see no facts that indicate they will not do what they did in the past - fail to meet expectations at the time they said they would.
  • Oxford Guy - Wednesday, November 18, 2015 - link

    Just like when they were beating Intel's Netburst.
  • JoeMonco - Wednesday, November 18, 2015 - link

    And that was what? Like 15 years ago? It's definitely brought AMD great financial success that they beat a CPU microarchitecture from nearly 2 decades ago.
  • silverblue - Thursday, November 19, 2015 - link

    I don't think anybody was saying that.
  • JoeMonco - Thursday, November 19, 2015 - link

    So then why bring it up? What relevance does Netburst have to modern-day Intel or AMD?
  • silverblue - Friday, November 20, 2015 - link

    About as much relevance as you comparing current AMD CPU performance with that of Netburst.

    Other than that, it only serves to highlight that AMD have beaten Intel before, but I don't think Intel will get that complacent again anytime soon.
  • Klimax - Thursday, November 19, 2015 - link

    Only because Intel made a mistake. (And hit multiple non-obvious problems later, like the frequency wall.)
  • JoeMonco - Wednesday, November 18, 2015 - link

    Don't bring in your silly *facts*. This is an AMD wankfest where we ignore all evidence of claim and perpetually claim that [Year+1] is gonna be the year AMD releases that great CPU! Oh and we'll high five each other over AMD beating Netburst over a decade ago because that's hugely relevant still.
  • anubis44 - Wednesday, November 18, 2015 - link

    "This is an AMD wankfest where we ignore all evidence of claim and perpetually claim that [Year+1] is gonna be the year AMD releases that great CPU! Oh and we'll high five each other over AMD beating Netburst over a decade ago because that's hugely relevant still."

    Spoken like a true jerk. AMD's Zen is not a mere refresh of Bulldozer, and it wasn't designed by the same people who designed Bulldozer or Phenom/Phenom II. It was designed by the guy who is widely considered the very best CPU engineer on the planet. It's going to be entertaining to see what you have to say when AMD's comeback is truly in full swing.
  • JoeMonco - Thursday, November 19, 2015 - link

    "It's going to be entertaining to see what you have to say when AMD's comeback is truly in full swing."

    No it'll actually be more entertaining when Zen fails like every other AMD processor for the last decade. Then you and your ilk will be going on and on about how Zen 2 is going to be the one to finally beat Intel.
  • sld - Thursday, November 19, 2015 - link

    Well said by someone who enjoys paying the near-monopoly prices for Intel's CPUs.
  • JoeMonco - Thursday, November 19, 2015 - link

    No, I'm just a realist.
  • BurntMyBacon - Thursday, November 19, 2015 - link

    eanazag: "I fully expect AMD to disappoint with Zen. I see no facts that indicate they will not do what they did in the past - fail to meet expectations at the time they said they would."

    If your expectation is that AMD will disappoint with Zen, then I don't suppose they'd be too sad if they failed to meet that expectation ;' )
  • JoeMonco - Thursday, November 19, 2015 - link

    Unless the benchmarks are going to translate into sales, it's not really going to matter.
  • zeeBomb - Wednesday, November 18, 2015 - link

    Go go Rocket APU!
  • jaydee - Wednesday, November 18, 2015 - link

    Would be nice to see this compared to the latest Intel offering. Any plans on reviewing Skylake i3/Pentium? I doubt Intel sends out samples of those, but they've been publicly available for a month or so.
  • hojnikb - Wednesday, November 18, 2015 - link

    Yep, HD 530 is pretty beefy, and coupled with DDR4 it would probably be quite competitive with the APU's GPU.
  • Valantar - Wednesday, November 18, 2015 - link

    You did see the table comparing the A8-7670K's iGPU with the i5-6600's HD 530, right? It's literally one paragraph up from the comments, in the conclusion, if you missed it. To sum things up: the 7670K wins every test outright, with only a few specific 99th percentile numbers not being clear wins. Even with the Intel CPU running DDR4 (although at the same frequency as the DDR3 used).
  • nikaldro - Wednesday, November 18, 2015 - link

    You said it yourself : Intel's iGPU was intentionally held back.
  • Valantar - Wednesday, November 18, 2015 - link

    Testing both chips at JEDEC spec is holding one of them back? Otherwise, shouldn't both be tested with OC'd memory?
  • hojnikb - Thursday, November 19, 2015 - link

    That's just one game though... I want to see a full benchmark comparing more games.
  • Ian Cutress - Wednesday, November 18, 2015 - link

    I'm trying to get a source for the whole stack for testing review. Keep fingers crossed. :)
  • nathanddrews - Wednesday, November 18, 2015 - link

    I expected the Intel HD 530 graphics to perform much worse than the 7670K's, despite the Intel chip's CPU processing advantage and extra $100 in price. The 7670K is only 9-18% faster in average frame rates, with very small gains on the 99th percentile. The 530 can be found on CPUs all the way down the stack to the $60 Pentiums. I was expecting something closer to a 50% gap, but I'm probably envisioning 7870K vs Haswell IGP.

    Too bad there's no Carrizo desktop SKU.
  • imaheadcase - Wednesday, November 18, 2015 - link

    This really looks like it should not have even existed. Look at that lineup; everything is so close together.
  • Shadowmaster625 - Wednesday, November 18, 2015 - link

    Where is the i3-6100 in those gaming tests?
  • Ian Cutress - Wednesday, November 18, 2015 - link

    We haven't had one in to test.
  • Samus - Wednesday, November 18, 2015 - link

    Yeah, my guess is the i3-6100 is going to render AMD's entire A-series irrelevant (unless you're looking for a sub-$100 chip) simply because of its DX12 iGPU + DDR4. Early reviews show it to be around 15% faster than the i3-4130 in single threaded performance alone.
  • tipoo - Wednesday, November 18, 2015 - link

    No option for dual graphics with a cheap Radeon though; you'd have to wait for DX12 multiadapter to take off, if it does.
  • Samus - Wednesday, November 18, 2015 - link

    That's true. AMD still has their niche features. But I suspect Intel's 500-series IGP is going to be pretty strong against 6 of AMD's compute cores based on DDR4 bandwidth alone. Will have to wait for the review, obviously.
  • hojnikb - Thursday, November 19, 2015 - link

    Why bother with dual graphics at all?
    Just buy a beefier card and an Athlon X4.

    Unless you hit the right combo, hybrid Crossfire will be a mess.
  • Flunk - Wednesday, November 18, 2015 - link

    Intel needs to work on their drivers, their current ones are a real mess. I feel like if they improved their drivers they might actually get to the point you're talking about where you could buy a system with Intel graphics for light gaming, but right now it's a really poor experience.
  • tipoo - Wednesday, November 18, 2015 - link

    AMDs APUs seem more bandwidth constrained than limited by themselves. I hope they take HBM2 as an opportunity to provide at least a PS4 equivalent GPU in an APU in the coming years.
  • Refuge - Wednesday, November 18, 2015 - link

    They don't plan to bring HBM to their APUs anytime soon, judging from the product slides they released with their HBM announcement.
  • tipoo - Wednesday, November 18, 2015 - link

    Yeah, that's why I gave it a longer time frame. It's going to be high end GPU exclusive for a while, but I hope the trickle down comes fast enough. Or if not, go eDRAM.
  • tipoo - Wednesday, November 18, 2015 - link

    The FPS by percentile graph is nice; is that new to AT?
  • Ian Cutress - Wednesday, November 18, 2015 - link

    Ryan does some percentile data in GPU reviews, and we did some stuff around the CFX/SLI sync issues. But we did the FPS by percentile graphs a bit in the Fable Legends testing. Some benchmarks provide the per-frame data by default, others do not (depends on how you're polling), and then there's some post-processing which takes longer than you think. It's the sort of graph that only 3-4 lines can be on without going overboard.
  • tipoo - Wednesday, November 18, 2015 - link

    Cool. I like it.
  • tipoo - Wednesday, November 18, 2015 - link

    For GTA V, it looks like dual graphics on the same settings gives lower framerates than the 240 alone? Is that right? Other than that you see a pretty nice boost with it.
  • jaydee - Wednesday, November 18, 2015 - link

    I noticed that too and was wondering
  • yannigr2 - Wednesday, November 18, 2015 - link

    APUs are excellent solutions for systems without graphics cards. If you are going to add a graphics card, then, excluding dual graphics, APUs seem much less attractive. The A8 7600 is in my opinion the best cheap APU you can get for a system without a discrete graphics card; the 78x0K parts are an option ONLY when you are definitely never going to add a graphics card and you want the best possible integrated GPU without having to rob a bank for an Iris GPU.

    But even when AMD's APUs are the best option, people will still rush to steer others away from them. That's AMD's doing, and we can only blame them for it. Recently someone asked for a cheap system to play an old FPS game. Videos on YouTube were showing that an A8 7600 could play the game. That didn't stop people rushing into the thread to insist that the game wasn't going to be fun under 140 fps. Yes, 140 fps. They didn't consider gaming peripherals or a gaming monitor important, only 140 fps. Anything to "save" someone from an AMD APU and send him to Intel, even if that means convincing someone to pay an extra $80-$100 that he may not have. At the same time, the same people blame ONLY AMD for the lack of competition.
  • Dribble - Wednesday, November 18, 2015 - link

    AMD's solutions appeal to pretty well no one. If you don't have an add-in GPU, chances are you don't want to game, in which case Intel wins hands down. If you want to game, you'll buy a cheap graphics card, in which case Intel wins hands down.

    The mythical buying market that wants to game on a desktop but can't even afford a basic add-in card just doesn't exist. We know this because AMD has been trying to sell APUs like this one for years now and no one is buying.
  • Yorgos - Wednesday, November 18, 2015 - link

    That's false.
    I am gaming in 1080p, and I play many games above 25fps at almost-high settings.
    A good example is Skyrim with the high definition mod that you can get from Steam, where I am constantly at 25 fps.
    All this was tested on my 7850K with 2400 MHz RAM. What's your point of reference?

    OTOH, there is a shitload of people who play only MOBA games or FPS like CS, where you don't need to spend $300 or more on a system. A $150 system can get you a constant 30 fps without a sweat. Plus you get a discrete-grade GPU embedded with your CPU, and you get all the goodies and the high quality drivers/support that discrete GPUs get. Intel has none of that.
  • Dribble - Wednesday, November 18, 2015 - link

    I would point you to AMD's CPU sales. No one is buying; AMD has been trying to sell CPUs like this one for years and failing. The market isn't there.
  • medi03 - Thursday, November 19, 2015 - link

    I would point you to the fact that Netburst outsold superior Athlon 64s 4 to 1.
  • BurntMyBacon - Thursday, November 19, 2015 - link

    @medi03: "I would point you to the fact that Netburst outsold superior Athlon 64s 4 to 1."

    True, a superior architecture doesn't guarantee better sales, even at better prices. However, Dribble didn't accuse this solution of being inferior for the market it is targeting. He stated:
    @Dribble: "No one is buying, ..." and "... the market isn't there."

    I don't entirely agree, though (by CPU sales) the market doesn't seem to be very large and is clearly low margin. These are not the processors that will save AMD's business. Zen will likely be the most important CPU architecture in the company's history (whether the design is good/bad/novel/obvious).
  • medi03 - Thursday, November 19, 2015 - link

    I would point you to the fact that Netburst outsold superior Athlon 64s 4 to 1.
  • JoeMonco - Thursday, November 19, 2015 - link

    But your silly *facts* don't matter. [Year+1] with the release of AMD [Microarchitecture+1] is gonna finally beat Intel! And I know this because my irrational brand loyalty says so!!
  • barleyguy - Thursday, November 19, 2015 - link

    You mocking irrational brand loyalty is irony at its finest. ;-)
  • JoeMonco - Thursday, November 19, 2015 - link

    I don't see what the irony is supposed to be. Criticizing AMD doesn't mean I like Intel. ARM is the only real competition that Intel faces.
  • hojnikb - Thursday, November 19, 2015 - link

    How much did you spend on that RAM?
    I bet you could get a Pentium + 4GB of RAM and a nice 250X (or even a 260X on sale) for around the same money.

    That's the problem with APUs: they need fast RAM in dual channel to be taken advantage of.
  • yannigr2 - Thursday, November 19, 2015 - link

    An A8 7600 costs only $20-$30 more than a Pentium? 8GB of 1866-2000MHz RAM costs only $10-$20 more than 8GB of 1333MHz? And the GPU you get in the APU is a little faster than an R7 240, as the review shows.

    So, in the end you save $30-$40 for the same, if not slightly better, gaming experience. This is HUGE if you don't have the money, or if you are a retail shop trying to create an ultra cheap system that you can market as "gaming", with one less part inside, which makes assembly easier and also lowers the possibility of an RMA due to the extra part (the discrete card).

    AMD can't sell much because Intel controls the market. People who don't know about hardware buy the Intel brand, and individuals who are asked to help others build such low cost machines usually exclude AMD from the beginning without even considering it as an option, or even try as hard as they can to make others avoid AMD's solutions. That's true even when AMD's solutions are the perfect fit for specific cases, and those are the same people who constantly cry about competition. The example in my previous post proves that, and it was based on a case from last week.
  • silverblue - Thursday, November 19, 2015 - link

    Fast RAM isn't exactly at a premium anymore... at least, not right now. I just took a quick glance at HyperX Savage prices on Amazon, and a 16GB (2x8) kit was £64 to £65 for 1600 and 1866MHz, and £68 for 2400MHz, with 2133MHz being a little more still. I know it's a small sample from a single product range, and a lot of these look to have had massive reductions recently, but right now going with faster RAM isn't really an expense.

    If you play games that benefit from Hybrid Crossfire, it's an option, certainly more than it used to be. However, it's still not at the level that I would consider worthwhile outside of that particular scenario, and scaling is still minimal even when it does work (in general).

    I would like to know what AMD's current CPU market share is. People are forever saying that nobody is buying AMD, however based on popularity on www.dabs.com it appears that the 860K is the top CPU, with the 8320E in second place. The i3-6100 is in third place.
  • JoeMonco - Thursday, November 19, 2015 - link

    25fps? Is that supposed to be impressive?
  • yannigr2 - Thursday, November 19, 2015 - link

    You are wrong. I guess two years ago you would have been absolutely sure that this generation of consoles wouldn't sell because the hardware wasn't strong enough to run the latest AAA titles at the highest possible settings.
  • Linuxenthusiast - Thursday, November 26, 2015 - link

    I think one of the main problems for AMD is that there are very few brand name computers using the faster AMD APUs in the stores. That, combined with all the people that think they “have” to have an Intel processor because it's “better” works against AMD.

    I have been building desktops for friends and family and friends of friends with AMD CPUs exclusively for the past 20 years, and with AMD APUs for the past four years or so. All the PCs used either the integrated chipset AMD video or the APU's integrated video. I can build a sturdier, longer lasting PC with higher quality parts at a lower price than brand name PCs by using AMD APUs and motherboards. I basically build them the PCs that they can't buy in the stores.

    Most of the users have been web browser/email/YouTube/photos/music users and have been very happy with the price and performance of the PCs and have no idea what they are supposedly “missing” by not having the additional costs of an Intel cpu and motherboard. If you judge the performance of the PC by how well it handles typical multithreaded everyday apps rather than benchmarks, I think you will find that AMD APUs are more than adequate for most users.

    In the past year or so, I've built 5 PCs for teenage or twenty-something Windows gamers and one mid-forties gamer who has $500+ invested in a fancy “race-car” seat and pedals and gear shift that feel like ones in a real race car. The race car gamer didn't believe that the AMD APU alone could provide sufficient video performance without an additional add-in video card, but I convinced him to try it with the idea that we could always add a video card later, if he needed it.

    Well, much to his surprise, the Kaveri AMD APU was more than adequate for his needs and he's quite happy with it.

    The teenagers and twenty-somethings are also using only the IGP of the Kaveri APUs and they are all quite pleased with the performance of the PCs. None of them have asked for an add-in video board. I don't think any of them are playing the most demanding games at the highest possible settings, but the main point is that they are all quite happy with the performance of their PCs and all of them are using the IGP of the Kaveri APUs.

    Every time someone thinks they need an Intel cpu, I have been able to convince them to try a PC with an AMD APU and no one has ever been disappointed with the performance of the APU PCs.

    I also try to recommend notebooks with AMD APUs and I wish there were more APU-based Thinkpads as well as more APU based Lenovos and HPs and the people who've bought them are also quite happy with their performance.

    I would encourage people who've never used AMD APUs to give the current Kaveri and later APUs a try. It won't cost much and I think you'll be pleasantly surprised.
  • Vesperan - Wednesday, November 18, 2015 - link

    I've been pleasantly surprised by what a 7850K (at 2133mhz memory) can achieve. 5-10 years ago I would have said gaming on integrated is impossible, but now you can do a lot on them.

    That said - I've definitely avoided some titles until I get a graphics card.
  • CaedenV - Wednesday, November 18, 2015 - link

    And this is the kicker... even you will still need a dGPU in the end. An APU is either for a non-gaming system that could easily get by with Intel iGPU-level graphics, or it is a stop-gap until you can afford a dGPU, after which point the APU is going to be your bottleneck for the rest of its life. Far better (at least in my opinion) to start with the CPU you want in the long term and limp along with the iGPU until you can find a good GPU on sale.

    Over the 5-10 year life of a modern computer, what is 3-6 months to scrape together the money for a GPU?
  • Vesperan - Thursday, November 19, 2015 - link

    I'm an outlier (as is anyone reading/posting here) but for me the 7850k is a combination of a few things - supporting AMD as the smaller company, but also recognising my needs. The integrated graphics are perfectly fine for my current gaming needs and I would rather spend the money on a discrete GPU in 2016 when the new process node/GPUs are out than now. I also have two young children so I'm lucky to get 30 min a day on the desktop!

    Money isn't the issue, but like every geek here, we need to have a build that is optimal in some way. But yes - when I set out to build an AMD system that made sense, I really, really struggled to justify it. It's not only the weakness of the CPU, but the high TDPs relative to performance. Fighting back at Intel while being two process nodes behind is just terrible for AMD.
  • BlueBlazer - Wednesday, November 18, 2015 - link

    Potential buyers would most likely choose either Intel's LGA115x or AMD's AM3+ platform for gaming builds, and overlook/ignore AMD's dead-end FM2/FM2+ platform. That is very much due to future upgradeability considerations, especially the CPU (where AMD's FM2/FM2+ platform is limited to 4 "cores"). Install a powerful discrete GPU and AMD's APU will bottleneck it (which is why a faster CPU is required)...
  • abufrejoval - Wednesday, November 18, 2015 - link

    I have come to respect APUs in my main home server. It started with an A10-5800K Trinity APU some years ago, 32GB of RAM and 16TB of RAID6 storage, but when I saw an A10-6700 Richland APU cheap on eBay I swapped it for reduced idle power (22Watts now).

    It's been running pretty much non-stop for years, and because it's so super quiet I find that most of the time I use it for pretty much everything from surfing to taxes and video, rather than wake my heavy dGPU Intel game rig from standby.

    It typically runs Windows 2008 R2 and quite a few VMs using VMware Workstation, with dual monitors.

    The original motive was flexibility: 8 SATA ports, 32GB of RAM, full IOMMU/VT-d capabilities, three 4-16 lane PCIe slots and sufficient graphics power on a very low power and financial budget simply couldn't be had from Intel at the time.

    With Skylake Xeons, equivalent flexibility and capabilities are starting to appear; they vastly exceed the APU's performance in some (for me non-critical) aspects, but also come at a significantly different price.

    My main disappointments so far have been:
    Lack of ECC support with the FM2(+) sockets (I froth at the mouth whenever I think about that stupid cost cutting omission)

    Kaveri idle power performance much worse than Richland

    Missing Carrizo socket SKU with ECC DDR4 RAM

    Missing "game cartidge" variant e.g. with 8GB of GDDR5 soldered on a dGPU form factor PC designed for a passive PCIe backplane, expandable from 1-4 stackable APUs
  • CaedenV - Wednesday, November 18, 2015 - link

    I too am running an A10 5800K CPU for my home server at the moment... but it is because I got it for free (PCIe slot burned out, and my friend didn't want it anymore). But had I been buying an A10 vs a Pentium or i3 for home server use it would hands down be the Intel. Up front costs being equal, even a crappy Celeron is 'good enough' to run a home server as performance is decided by the HDDs and RAM. In the long run the Intel is going to run cooler, quieter, and on less power for a system that is on 24x7. This means lower power bills, less noise produced, less airflow needed, and less dust accumulation.

    It is not that the AMD is bad as a home server platform (mine has been running great for quite a while now), it is simply not as good. Heck, even with it being free I am considering replacing it next year with an intel system simply because it will pay for itself in power costs alone in just a few years. So even free AMD vs paying for Intel is a toss-up.
  • DiHydro - Friday, December 4, 2015 - link

    It's going to be hard to pay off a new system with a difference of $16 a year, and that is at 50% CPU 24/7, which is extremely high for a home server.
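    For what it's worth, the payback arithmetic is easy to sanity-check yourself. This is just a sketch: the 20W average draw difference and $0.12/kWh price are my own illustrative assumptions, not measured numbers from either system.

    ```python
    # Rough annual electricity cost delta between two always-on home-server CPUs.
    # Assumed inputs (replace with your own measurements): a 20 W average draw
    # difference and an electricity price of $0.12 per kWh.

    def annual_cost_delta(delta_watts, price_per_kwh, hours=24 * 365):
        """Extra dollars per year for `delta_watts` of continuous extra draw."""
        kwh = delta_watts * hours / 1000.0  # watt-hours -> kilowatt-hours
        return kwh * price_per_kwh

    print(round(annual_cost_delta(20, 0.12), 2))  # -> 21.02 dollars/year
    ```

    At roughly $20 a year, a new platform purchase takes many years to pay for itself, which supports the point above.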
  • Computer Bottleneck - Wednesday, November 18, 2015 - link

    Eight 6 Gbps SATA ports on A88X is pretty sweet, but yes it would be great to see ECC on this platform as well. Also more Mini-ITX FM2+ boards.
  • spinportal - Wednesday, November 18, 2015 - link

    That's a power-hungry bugger that trades blows with the i3-4130T, but not an acceptable TDP for a laptop or NUC. Nice try, but it misses the mark as a desktop chip or a mini-platform.
  • Ian Cutress - Wednesday, November 18, 2015 - link

    It's a 95W desktop part. It's not geared for laptops or NUCs. There are 65W desktop parts with TDP Down modes to 45W, and lower than that is the AM1 platform for socketed. Carrizo at 15W/35W for soldered such as laptops and NUC-like devices.
  • Vesperan - Wednesday, November 18, 2015 - link

    Apologies if I missed it - but what speed was the memory running at for the APUs?

    The table near the start just said 'JEDEC' and linked to the G.Skill/Corsair websites. This is important given these things are bandwidth constrained - the difference between 1600 MHz and 2133 MHz can be significant (over 20 percent).
  • tipoo - Wednesday, November 18, 2015 - link

    2133mhz, page 2
  • Ian Cutress - Wednesday, November 18, 2015 - link

    We typically run the CPUs at their maximum supported memory frequency (which is usually quoted as JEDEC specs with respect to sub-timings). So the table on the front page for AMD processors is relevant, and our previous reviews on Intel parts (usually DDR3-1600 C11 or DDR4-2133 C15) will state those.

    A number of people disagree with this approach ('but it runs at 2666!' or 'no-one runs JEDEC!'). For most enthusiasts, that may be true. But next time you're at a BYOC LAN, go see how many people are buying high speed memory but not implementing XMP. You may be surprised - people just putting parts together and assuming they just work.

    Also, consider that the CPU manufacturers would put the maximum supported frequency up if they felt that it should be validated at that speed. It's a question of silicon, yields, and DRAM markets. Companies like Kingston and Micron still sell masses of DDR3-1600. Other customers just care about the density of the memory, not the speed. It's an odd system, and by using max-at-JEDEC it keeps it fair between Intel, AMD or others: if a manufacturer wants a better result, they should release a part with a higher supported frequency.

    I don't think we've done a DRAM scaling review on Kaveri or Kaveri Refresh, which is perhaps an oversight on my part. Our initial samples had issues with high speed memory - maybe I should put this one from 1600 up to 2666 if it will do it.
  • Oxford Guy - Wednesday, November 18, 2015 - link

    Since you always overclock processors, it makes little sense to hold back an APU with slow RAM.
  • Oxford Guy - Wednesday, November 18, 2015 - link

    It's not just the bandwidth, either (like 2666), but the combination of that and latency. My FX runs faster in AIDA benches, for the most part, at CAS 9-11-10-1T 2133 (DDR3) than at 2400, probably due to limitations of the board (which is rated for 20000). Don't just focus on high clocks.
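    The frequency/latency tradeoff described above is easy to put numbers on: absolute CAS latency in nanoseconds depends on both the CAS cycle count and the clock. The timings below are only illustrative examples, not measurements from any specific kit.

    ```python
    # Absolute CAS latency in nanoseconds.
    # DDR transfers twice per clock, so the command clock in MHz is
    # data_rate / 2, and one cycle lasts 2000 / data_rate nanoseconds.
    def cas_latency_ns(cas_cycles, data_rate_mts):
        """True first-word latency in ns for a given CAS and data rate."""
        return cas_cycles * 2000.0 / data_rate_mts

    print(cas_latency_ns(9, 2133))   # ~8.44 ns at DDR3-2133 CAS 9
    print(cas_latency_ns(11, 2400))  # ~9.17 ns at DDR3-2400 CAS 11
    ```

    So a tight-timing 2133 kit can indeed have lower absolute latency than a looser 2400 kit, even though the 2400 kit wins on raw bandwidth.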
  • Oxford Guy - Wednesday, November 18, 2015 - link

    rated for 2000
  • Ian Cutress - Thursday, November 19, 2015 - link

    Off the bat, that's a false equivalence - we only overclocked in this review to see how far it would go, not for the general benchmark set.

    But to reiterate a variation on what I've already said to you before:

    For DDR3, if I was to run AMD at 2666 and Intel at 1600, people would complain. If I was to run both at DDR3-2133, AMD users would complain because I'm comparing overclocked DRAM perf to stock perf.

    Most users/SIs don't overclock - that's the reality.

    If AMD or Intel wanted better performance, they'd rate the DRAM controller for higher and offer multiple SKUs.
    They do it with CPUs all the time through binning and what you can actually buy.
    e.g. the 6700K and 6600K - they don't sell one 6600K rated at 2133 and another 6600K rated at 2400.

    This is why we test out of the box for our main benchmark results.
    If they did do separate SKUs with different memory controller specifications, we would update the dataset accordingly with both sets, or the most popular/important set at any rate.

    Besides, anyone following CPU reviews at AT will know your opinion on the matter, you've made that abundantly clear in other reviews. We clearly disagree. But if you want to run the AIDA synthetics on your overclocked system, great - it totally translates into noticeable real-world performance gains for sure.
  • Vesperan - Thursday, November 19, 2015 - link

    Thanks Ian - I missed that when quickly going through the story this morning prior to work. Yet somehow I picked out the JEDEC bit!

    I like the approach you've outlined; it makes sense to me. So, for what it's worth, you have the support of at least one irrelevant person on the internet!

    From what I saw on a few websites (Phoronix springs to mind), the gains from memory scaling decline rapidly after 2133 MHz.
  • CaedenV - Wednesday, November 18, 2015 - link

    I just don't understand the argument for buying AMD these days. Computers are not things you replace every 3-5 years anymore. In the post Core2 world systems last at least a good 7-10 years of usefulness, where simple updates of SSDs and GPUs can keep systems up to date and 'good enough' for all but the most pressing workloads. People need to stop sweating about how much the up-front cost of a system is, and start looking at what tier it performs at, and finding a way to get their budget to stretch to that level.

    I don't mean starting with a $500 build and stretching your wallet (or worse, your credit card) to purchase a $1200 system. I'm not some elitist rich guy; I understand the need to stick to a budget. But the difference between AMD and Intel in price is not very much, while the Intel chip is going to run cooler, quieter, and faster. Spending the extra $50 for the Intel chip and compatible motherboard is not going to break the bank.

    Because let's face it: pretty much everyone is going to fall into one of two camps.
    1) you are not going to game much at all, and the integrated Intel graphics, while not stellar, are going to be 'good enough' to run solitaire, phone game ports, 4K video, and a few other things. In this case the system price is going to be essentially the same, the video performance is going to be more than adequate, and the i3 is going to knock the socks off of the A8 5+ years down the road.
    2) You actually do play 'real' games on a regular basis, and the integrated A8 graphics are going to be a bonus to you for the first 2-6 months while you save up for a dGPU anyways... in which case the video performance is going to be nearly identical between the i3 and A8, while the i3 is going to be much more responsive in your day-to-day browsing, work, and media consumption. Or, you are going to find that you outgrow what an i3 or A8 can do, and you end up building a much faster i5 or i7 based system... in which case the i3 will either retain its resale value better, or will make a much better foundation for a home server, non-gaming HTPC, or some other use.

    I really want to love AMD, but after cost of ownership and longevity of the system is taken into consideration, they just do not make sense to purchase even in the budget category. The only place where AMD makes sense is if you absolutely have to have the GPU horsepower, but cannot have a dGPU in the system for some reason. And even in that case, the bump up to an A10 is going to be well worth the extra few $$. There is almost no use in getting anything slower than an A10 on the AMD side.

    But then again, AMD is working hard these days to reinvent themselves. Maybe 2 years from now this will all turn around and AMD will have more worthwhile products on the market that are useful for something.
  • AS118 - Wednesday, November 18, 2015 - link

    Well, to be fair, the FX chips are strong enough and cheap (especially the $100 FX-6300), and you can pair them with a discrete GPU with ease. If you're trying to get a 1080p gaming rig for the absolute cheapest price, an AMD FX + GPU build works pretty well.

    That said, next year with Zen on 14nm will probably give people a good reason to get AMD. I just support AMD whenever I can because I dislike monopolies, and don't want Nvidia and Intel to become ones.

    Sure they give good price for performance NOW, but that's because they have AMD to compete with them. If they didn't, I doubt the price would be "only a little higher" than AMD.
  • Oxford Guy - Wednesday, November 18, 2015 - link

    8320E at $110 at Microcenter with UD3P 2.0 board is the best deal.
  • JoeMonco - Wednesday, November 18, 2015 - link

    "That said, next year with Zen on 14nm will probably give people a good reason to get AMD."

    Yeah, yeah. We hear this ever time a new AMD microarchitecture comes out and yet not a single one has lived up to the hype in more than a decade.
  • medi03 - Thursday, November 19, 2015 - link

    Do you get paid to spread FUD about AMD or is it for free?
  • JoeMonco - Thursday, November 19, 2015 - link

    How can facts be "FUD"? If I'm wrong, please do tell which AMD CPU has been able to beat Intel in anything performance or TDP-wise since the Core2 from 2006. I won't hold my breath, though, since you'll just sling ad homs with no real facts.
  • Deshi - Friday, November 20, 2015 - link

    Yes, and that would be because only one new architecture (Bulldozer and its many variants) has been released in the past decade, and we all know how that played out. Before that, Athlon actually beat Intel for a good amount of time, and the same guy who worked on that is working on Zen. I'm somewhat hopeful that it won't be a bad design this time; I'm just hoping it's not too late. Back in the Athlon days, Intel was too prideful and chose to ignore AMD initially, which is why AMD was able to take so much market share from them. I don't expect Intel to make the same mistake again this time.
  • alistair.brogan - Wednesday, November 18, 2015 - link

    This review doesn't compare with the new Pentium G4500: Skylake Pentium CPU cores with i5 Skylake integrated graphics. Faster and 1/3 the electricity/heat compared with the AMD APU. The only advantage AMD has left is that some newer games don't run right without a quad core, as they are bad console ports....
  • alistair.brogan - Wednesday, November 18, 2015 - link

    50 dollars cheaper too, in Canada
  • alistair.brogan - Wednesday, November 18, 2015 - link

    40-50 fps borderlands 2 on minimum settings (30-40 ultra settings) with HD 530 and the G4500
  • Ian Cutress - Thursday, November 19, 2015 - link

    As posted above, we don't have any of the other Skylake processors in yet. Benchmarking is always an iterative task - with limited space and resources you can't benchmark them all on the same day.

    But sure, if I could get all the processors in on day one, I would try to test them all for comparison points.

    Come back when/if we test the G4500 and see the numbers then.

    And no, it's not a simple case of going out and buying this one SKU just for the niche comparison that you're interested in - there have been requests in the comment sections of reviews for other SKUs as well, and I've had a couple of emails for more SKUs on top of that.

    Some SKUs are region limited (or slow roll out), or others are OEM only, so it can be hard to source outside the usual channels. So let me try and talk to Intel so we can get them all in, and then go from there. It's never an issue of lack of interest or subversion, just procurement (and ensuring we can communicate with the manufacturer at the point of testing).

    Of course, the more readers that register their interest, the more I can pass it on up the chain to get them in.
  • BurntMyBacon - Thursday, November 19, 2015 - link

    @Ian Cutress: "It's never an issue of lack of interest or subversion, just procurement (and ensuring we can communicate with the manufacturer at the point of testing)."

    [sarcasm]I thought Intel and all other computer hardware manufacturers were required by LAW to send their products to Anandtech for prelaunch approval where Supreme Tech Justice Cutress and his colleagues pronounce life or death on potential products. ;' ) [/sarcasm]

    Seriously though, I'd say you are doing fine despite the real world procurement and scheduling issues getting in the way.
  • flabber - Wednesday, November 18, 2015 - link

    I couldn't agree more with the point that power consumption is the least of my concerns. While there is a significant difference between a 250W TDP CPU and a 50W TDP CPU, where one would have to factor in the cost of cooling and a PSU, 100W is quite manageable. A 500W PSU is more than adequate for just about any current system. However, I am aware that decreased power consumption is an objective in all consumer products and we will keep seeing it in upcoming computer components. (Ironically for mobile components, my 2009 BlackBerry with a 1150mAh battery can still run for a couple of days before I need to charge it.)
    My rig is equipped with an A10-5800K (2012) and a year-old R9 290X (2013). Everyday tasks, such as using a spreadsheet, word processor, citation management or occasional image editing, can't be improved in any noticeable way. With regard to gaming, I can't be bothered to upgrade the motherboard and CPU to a superior Intel alternative. A few more frames per second won't make a game with a poor storyline any better, nor will they make an enjoyable game any more enjoyable.
  • Pissedoffyouth - Thursday, November 19, 2015 - link

    If you have a 5800K you should look at getting a Kaveri for better performance and lower power consumption
  • Gadgety - Thursday, November 19, 2015 - link

    "AMD's first talking point is, of course, price. AMD considers their processors very price-competitive"

    No kidding. I got the A8-7600 for my kid and its integrated graphics are comparable to the i7's Iris Pro, where the i7 is 5x more expensive. So for 20% of the price we get to enjoy graphics galore. Put it on ASRock's A88X M-ITX motherboard and it outputs 4K cinema. No graphics card means it's compact, so we use a tiny chassis - a perfect HTPC, and usable for the type of light gaming the kids do.
  • Gadgety - Thursday, November 19, 2015 - link

    @Ian Cutress: The performance parity, and sometimes superiority, of the A8-7670K compared to the A10-7850K, and also to the A10-7870K, could I guess be attributable to driver improvements. Did you use the same drivers, or updated versions? If it's improved drivers, this would likely also help across the whole APU range.
  • Ian Cutress - Thursday, November 19, 2015 - link

    Same drivers for each. I lock a set of drivers down every test-bed refresh, so in this case it would be 15.4 beta, which is getting a bit old now. Kaveri Refresh does have some minor internal improvements as well I imagine, internal bus frequencies perhaps. There's always a small amount of volatility in the benchmark, depending on what heat density or board issue you might have. Looking back, we haven't always used the same motherboard on the APUs just due to timing (but all A88X), and even though we do some overriding on power profiles it can be difficult to compensate for motherboard manufacturer non-user exposed firmware optimisations on the memory buses.

    Come the next year test-bed refresh (with DX12 relevant titles hopefully), I'll be going back and redoing them all. That should clear out the cobwebs on the latest drivers and updates, providing a new base.
  • milli - Thursday, November 19, 2015 - link

    Ian Cutress, how is the review of your grandparents' laptop coming along? :)
    I'm waiting for that Carrizo review.
  • zlandar - Thursday, November 19, 2015 - link

    I don't see the point of being so cheap that you are unwilling to spend more for a superior i3/i5 and a discrete video card. Why would you chain yourself to a dead-end integrated CPU/GPU combo and a motherboard that isn't very good to start with?

    Plenty of people have pointed out how well Intel's CPUs have held up since Sandy Bridge. I'm still using a 2600K and have upgraded my video card 3 times. If you are on a tight gaming budget it makes a lot more sense to buy 2nd-3rd tier last-gen video cards coupled with a good CPU you don't need to upgrade.
  • BurntMyBacon - Thursday, November 19, 2015 - link

    @zlandar "Why would you chain yourself to a dead-end cpu/gpu integrated combo and motherboard that isn't very good to start with?"

    Aren't just about all laptops dead-end with respect to CPU/GPU? (Particularly in the Carrizo price range.)
    Tons of laptops are sold without discrete GPUs and with no option to upgrade. Why should this matter to a Carrizo review (clearly a laptop in this request)?
  • Ian Cutress - Thursday, November 19, 2015 - link

    Something special in the works. After SC15 finishes, I'll be digesting the mountain of data I have. :)
  • milli - Thursday, November 19, 2015 - link

    Sounds good :)
  • BrokenCrayons - Thursday, November 19, 2015 - link

    95 watts is too much for a modern CPU to require.
  • plonk420 - Thursday, November 19, 2015 - link

    that's watts TDP, not watts at the wall.
  • BrokenCrayons - Friday, November 20, 2015 - link

    Yes, that's fine. It's still too high for a modern CPU. Half that would be more reasonable, and a tenth would be better still, though even that wouldn't be perfect.
  • TheinsanegamerN - Friday, November 20, 2015 - link

    The i3 is still a 47 watt chip. the skylake i7s are 95 watt. It seems most "modern" cpus are in the 65-95 watt range, with 47 watt for dual cores.

    A tenth of 95 watt is only 9.5, which is core m territory. it works, but it ISNT for power users. So i'm not sure what 'modern' cpu you are referring to, as it doesn't seem to exist. And AMD's APUs can be set to 65 or 45 watt mode as well, but you get a performance hit. So half the TDP already exists, and a tenth is too little for a big CPU.
  • BrokenCrayons - Friday, November 20, 2015 - link

    Yes, the i3 is an Intel processor. Skylake's 95 watt chip is an outlier that's highly uncommon. The vast majority of computers purchased are inexpensive laptops, which regularly only rise to 35 watts. Desktop processors are not only a rarity, but those that range above 65 watts are very uncommon and largely the domain of expensive workstations and the very small number of gaming systems out there. It's pretty odd of AMD to sell a budget processor that puts those kinds of demands on a system and, in my opinion, unreasonable given the competition at the same approximate performance.
  • silverblue - Friday, November 20, 2015 - link

    I know it's hardly comparable, but I wonder what power an i3 with Godavari-level GPU would use if at 28nm. Besides, TDP != actual power usage.
  • TheinsanegamerN - Friday, November 20, 2015 - link

    Desktops are rare now? OK. You must not work in business. Desktops are still common, not as much as laptops, but the forecast for all of 2015 was north of 120 million desktops, the vast majority of which are business machines with i5 processors, which are 65-84 watts. i7s are rated at 84 watts for Haswell, 77 for Ivy Bridge, 95 for Sandy Bridge, and 95 for Skylake, so it's much more common than you think.

    As for laptops, I'm not sure why you brought them up, seeing as the A8 is a DESKTOP chip. On DESKTOPs, CPUs with 65 watt TDPs are very common: any i5, any i7, and pretty much any APU from AMD are all 65 watts or more. Laptops typically use dual-core chips with much lower clock rates and power-restricted GPUs. The vast majority of laptops with those 35 watt chips (and most of the Skylake and Broadwell chips in laptops are 15 watts, BTW; 35 watts isn't really used much anymore) can't compare to a 95 watt desktop chip in performance, so it still won't work for power users.

    So when you say that the TDP is too high for a modern processor, you mean it's too much for a mobile processor, ignoring that this is a processor for desktops. Unless you want a desktop processor with a mobile TDP but desktop performance, which will never happen, because a higher TDP chip will be clocked higher. Always.
  • Deshi - Friday, November 20, 2015 - link

    Does that 95 watts include the onboard GPU? If that's the case, that's not so bad, since you would need to add in the dGPU's TDP if you build an Intel + NVIDIA/Radeon system to match the performance.
  • nos024 - Thursday, November 19, 2015 - link

    blah...after 4 years and Intel's IGP is still on the heels of AMD's APU. Bleh.

    Give me graphics performance comparable to a 750 Ti and an ITX motherboard with an M.2 storage option and I will go and buy it this very instant. I want something like that.

    Low-profile ITX case + 8GB RAM + 240/256GB M.2 SSD (SATA or PCIe) + 300W PSU + ITX MB with AC wireless + APU with 750 Ti GPU performance. Is that so hard to ask for?! Apparently so.
  • BrokenCrayons - Thursday, November 19, 2015 - link

    I'd love to see better IGPs from Intel as well, but it really only serves to move the bottom rung of the graphics ladder up a notch. They don't seem like they'll ever really catch up with system requirements on contemporary games. It's more of a matter of waiting for the current Intel graphics processor to be good enough to run what's already been on the market for a while in terms of entertainment. Beyond that, if Intel dedicates more space to graphics, there'll invariably be someone else who complains that it's a complete waste for there to even be integrated graphics in the first place since they have a discrete GPU.
  • raikoh05 - Thursday, November 19, 2015 - link

    you can probably make it run better https://www.youtube.com/watch?v=uiCCKurW9TU
  • plonk420 - Thursday, November 19, 2015 - link

    how the hell is this doing better than an A10 with 128 more streaming processors?
  • JoeMonco - Friday, November 20, 2015 - link

    It's summed up as "LOL AMD".
  • Rexolaboy - Thursday, December 10, 2015 - link

    The A10 benchmarks are from the launch tests, not current ones; there is no reason to include them. AnandTech fail.
  • Tunrip - Friday, November 20, 2015 - link

    "I outfitted my 15-year-old cousin-in-law with an APU" THE FUTURE IS HERE! :D
  • hmmmmmmmmmm - Friday, November 20, 2015 - link

    People are spending so much time on what-ifs. Why don't they just wait for the benchmarks for Zen and Kaby Lake instead of giving each other lessons in history and mathematics?
  • BMNify - Saturday, November 21, 2015 - link

    There is an incredible bias, though to no fault of your own, in the web benchmarks part. Considering that Intel has been tied for second (with Opera; Samsung, of all companies, is the largest) as the most active contributor to Chromium since about 2012, major effort is being invested by Intel in optimizing Chrome for their chips. It just doesn't make logical sense that every Intel chip performs that much better than AMD on web benchmarks, other than that they have invested a lot of time in helping advance Chromium development. I mean, come on: even a low-end Pentium at stock speeds destroys even the highest AMD chip on those JavaScript/web benchmarks. That has to be obvious bias.

    http://mo.github.io/assets/git-source-metrics-chro...

    I am not knocking Intel, as their efforts in Chromium development are commendable, and any Chrome users who also use heavy JS browser apps would be remiss to choose AMD. I just wanted to point out that in benchmarking (which should be as level a field as possible), you guys should switch to another browser like Firefox or even IE 11/Edge.
  • hojnikb - Saturday, November 21, 2015 - link

    Edge uses the same engine as Chrome... IE11 is old.

    It wouldn't hurt if AMD actually invested something in software optimization.
  • Gigaplex - Monday, November 23, 2015 - link

    Edge does not use the same engine as Chrome. Edge uses EdgeHTML and Chakra, Chrome uses Blink/WebKit and V8.
  • b1gtuna - Tuesday, November 24, 2015 - link

    Any possibility this can run StarCraft II on high settings at FHD?
  • silverblue - Wednesday, November 25, 2015 - link

    AMD's own internal benchmarks suggest 35fps at maximum in FHD for the A10-7870K, but I can't find anything to suggest performance of the A8-7670K.
  • albert89 - Friday, November 27, 2015 - link

    Oh dear AnandTech, wherefore art thou, Carrizo, in your tests?
    Why is the whole tech reporting industry not using this ground breaking chip in their reviews ?
    Why is everyone running scared of AMD's A10-8700P ???
  • LarsBars - Tuesday, December 8, 2015 - link

    I really appreciate the commentary on AMD. Thanks, Ian!
