Intel Dual Core Performance Preview Part I: First Encounter
by Anand Lal Shimpi on April 4, 2005 2:44 PM EST - Posted in CPUs
The Intangible Dual Core
The move to dual core is a bit of a Catch-22. To deal with the fact that a dual core die is twice the size of a single core die, AMD and Intel have to use higher yielding transistors: the larger your die, the more defects it will contain, so you use higher yielding transistors to balance things out. The problem is that the highest yielding transistors run at the lowest clock speeds, so dual core chips end up running at slower speeds than single core chips. While the Pentium 4 could have hit 4GHz last year, we won't break the 4GHz barrier until late 2006 at the earliest.
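To see why die size hurts yield so quickly, here's an illustrative back-of-the-envelope calculation (a rough textbook approximation, not a figure from Intel or AMD) using the classic first-order Poisson yield model, in which yield falls exponentially with die area times defect density:

$$Y = e^{-A \cdot D_0}$$

Under that assumption, if a single core die of area $A$ yields 90%, a dual core die of area $2A$ on the same process yields only about $0.9^2 \approx 81\%$, which is why the dual core parts lean on the slower, higher yielding transistor variants.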
In Intel's case, we're talking about 2.8GHz - 3.0GHz vs. 3.6GHz - 3.8GHz for the high end single core chips. In order to offset the difference, Intel is pricing their dual core chips within about $80 of their single core counterparts. Short of giving dual and single core chips price parity, this is by far the best approach to ensuring dual core adoption.
Why does Intel want to encourage dual core adoption? To guarantee a large installed user base, of course. The problem today is that the vast majority of desktop systems are single processor systems, meaning that most developers code applications for single processor systems. To encourage a mass migration to multithreaded development, the installed user base has to be there to justify spending the added time and resources on such applications. As we just finished mentioning, Intel's approach is the quickest way to ensure that the exodus takes place.
So, with dual core CPUs priced very close to their single core counterparts, the choice is simple, right?
On the Intel side of things, you're basically giving up 200MHz to have a dual core processor at virtually the same price. But things get a lot more complicated when you bring AMD into the situation. AMD hasn't officially released their dual core availability and pricing strategy, but let's just say that given AMD's manufacturing capacity, their dual core offerings won't be as price competitive as Intel's. Now, the decision is no longer that simple; you can either get a lower clocked dual core CPU, or a higher clocked single core AMD CPU for the same price - which one would you choose?
The vast majority of desktop application benchmarks will show the single core AMD CPU as a better buy than the dual core Intel CPU. Why? Because the vast majority of desktop applications are single threaded and thus will gain no benefit from running on a dual core processor.
Generally speaking, the following types of applications are multi-threaded:
- Video Encoding
- 3D Rendering
- Photo/Video Editing
- Most types of "professional" workstation applications
However, the vast majority of other applications are single threaded (or offer no performance gain from dual core processors):
- office suites
- web browsers
- email clients
- media players
- games, etc.
If you spend any of your time working with the first group of applications, then generally speaking, you'll want to go with the dual core CPU. For the rest of you, a faster single core CPU will be the better pick for individual application performance.
But once again, things get more complicated. Individually, single threaded applications will make no use of a CPU that can execute multiple threads. But run more than one of these applications at the same time and all of a sudden, you're potentially dispatching multiple threads to your processor and thus, potentially, have a need for a multi-core CPU.
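To make the single threaded vs. multithreaded distinction concrete, here is a minimal C++ sketch (a toy illustration, not code from any real encoder) of the kind of work-splitting a video encoder or 3D renderer does: independent chunks of work are handed to separate threads, which a dual core CPU can run at the same time, while a purely single threaded application simply has nothing to hand to the second core.

```cpp
#include <algorithm>
#include <functional>
#include <iostream>
#include <numeric>
#include <thread>
#include <vector>

// Toy stand-in for per-frame work in an encoder or renderer: each "frame"
// is processed independently, so the frames can be split across threads.
void encode_range(const std::vector<int>& frames, size_t begin, size_t end,
                  long long& result) {
    long long sum = 0;
    for (size_t i = begin; i < end; ++i)
        sum += frames[i] * frames[i];   // placeholder for real encoding work
    result = sum;
}

int main() {
    std::vector<int> frames(1000000, 3);

    // On a single core CPU these workers just take turns on one core; on a
    // dual core chip the OS can schedule them on separate cores in parallel.
    unsigned workers = std::max(1u, std::thread::hardware_concurrency());
    std::vector<std::thread> pool;
    std::vector<long long> partial(workers, 0);

    size_t chunk = frames.size() / workers;
    for (unsigned w = 0; w < workers; ++w) {
        size_t begin = w * chunk;
        size_t end = (w + 1 == workers) ? frames.size() : begin + chunk;
        pool.emplace_back(encode_range, std::cref(frames), begin, end,
                          std::ref(partial[w]));
    }
    for (auto& t : pool) t.join();

    std::cout << std::accumulate(partial.begin(), partial.end(), 0LL) << "\n";
}
```

The same scheduling is what lets heavy multitasking benefit from dual core even when each individual program is single threaded: the OS can simply place different programs' threads on different cores.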
141 Comments
nserra - Tuesday, April 5, 2005 - link
AMD's dual core platform is right here today; the processor is not. And I don't see that as a bad thing, upgradability has always been a good thing.
#65 "Yes, same will also apply to the AMD's solution. Both CPU cores in dual core Opteron will share same bus and memory controller."
I am not really sure about that. AMD has always said the processor was designed as dual core from day one, and that must mean something. Don't forget that Socket 939 is dual channel, so it could be possible to give one memory channel to one processor and the other channel to the other.
matthewfoley - Tuesday, April 5, 2005 - link
You people screaming for the gaming benchmarks, RTFA. Gaming or any other single threaded application will have identical results to a similarly clocked single core proc.
ceefka - Tuesday, April 5, 2005 - link
#62 I read the article and think it's a rant, just a rant, no facts, just implications. I can sympathise with the feeling that Intel is let off the hook, for now.
I do hope that games will be a substantial part of the benchies once the traditional AMD vs Intel dual core tournament takes place. Remember the pre-release benchies of the Opteron (that Italian thing)?
I also think that shrinking DVDs while typing in MS Word and listening to MP3s is about the maximum of things to do simultaneously. I have to get my head around it as well, you know ;-). It does, however, open a way to have something like a home server or HTPC for everything but the most extreme stuff. It could record a TV show while you watch a DVD and the wife chats away on another screen.
Some say that dual core will have more benefit in servers because of the typical threaded applications. That's a good point. Can we look forward to a comparison of 2- and 4-way dual core Opteron vs Xeon setups on typical server applications, workstation apps, and maybe a few games just for fun?
smn198 - Tuesday, April 5, 2005 - link
lol @ #11 "now Intel is going to start eating AMD's lunch"
Do you mean eating AMD for lunch? I think I prefer it your way.
RLandon - Tuesday, April 5, 2005 - link
The multitasking benchmarks clearly show that Windows doesn't deserve to be referred to as an operating system.
ceefka - Tuesday, April 5, 2005 - link
The price difference between a dual and single core might not be too big on an Intel CPU, but you MUST get a new board. So the actual price difference when upgrading is $80 + a brand new 955X motherboard. Nice one, Intel. A new board will cost you around $100 at least: actual difference $180. If AMD can stay under that difference, they're at least competitive in pricing.
Benchies are promising/impressive though. Wonder what the 64-bit benchies would be. Too bad that the introduction of dual cores is in different segments (desktop vs server). Can't wait for some traditional Intel vs AMD benching ;)
#2 Read this article: http://www.anandtech.com/cpuchipsets/showdoc.aspx?... page 3, last paragraph.
AMD's Fred Weber finds Hyperthreading a "misuse of resources". AMD have always said two cores are better than a single core acting like one.
defter - Tuesday, April 5, 2005 - link
"INTEL's dual core isn't really dual-core, it's just two CPUs stick together"dual consisting of or involving two parts or components usually in pairs; "an egg with a double yolk"; "a double (binary) star"; "double doors"; "dual controls for pilot and copilot"; "duple (or double) time consists of two (or a multiple of two) beats to a measure": http://dict.die.net/dual/
Yes, two CPUs stuck together can be called "dual core".
"the two cpus share the same bus, without any logic in between."
Yes, same will also apply to the AMD's solution. Both CPU cores in dual core Opteron will share same bus and memory controller.
IntelUser2000 - Tuesday, April 5, 2005 - link
130W isn't actually bad. The Xeon MP Potomac had a TDP of 125W and a max power of 136W, and probably thanks to EIST, the difference is much less now. Plus, you aren't running both cores all the time, so if you are only playing games, then you would have around 65W of power consumption.
Hmm... I wonder if the reason the 1066MHz bus is not supported by any of the dual core processors is to dedicate more of the Dual-DDRII-667 bandwidth to integrated graphics. Or maybe we will see Yonah with a 1066MHz bus on the desktop?
falcc - Tuesday, April 5, 2005 - link
No games tested at all? Since when does this happen? Intel doesn't want dual core to look bad, so Anandtech doesn't bench ANY games at all.
Come on guys, judging by the article below on the Inquirer, I'm not the only one who is suspicious.
http://theinquirer.net/?article=22332