


26 Desktop Processors Compared
theraindog writes "The number of different CPU models available from AMD and Intel is daunting to say the least. The Tech Report's latest CPU review makes some sense of the landscape, exploring the performance and power consumption characteristics of more than two dozen desktop processors between the $999 Core i7-975 and more affordable sub-$100 chips. The article also highlights the value proposition offered by each CPU on its own and as a part of the total cost of a system. The resulting scatter plots nicely illustrate which CPUs deliver the best performance per dollar."
Seems pretty clear: (Score:4, Informative)
Re:Seems pretty clear: (Score:4, Insightful)
Not embarrassing, just not as fast. A Ferrari may be vastly more powerful, but a Corvette is still the much better deal for the money -- do I really need to explain this here?
Re:Seems pretty clear: (Score:4, Informative)
The Corvette ZR1 has more horsepower and is less expensive than any current Ferrari.
I love Ferrari... but the Corvette needs no apologies at all!
Re:Seems pretty clear: (Score:5, Funny)
It does if you want to turn.
Or if you want to not look like a redneck.
Re: (Score:3, Informative)
A quick glance through some magazine tables shows the ZR1 with more grip (1.05 g or more) on a skidpad than any Ferrari, as well. Only the Viper ACR has done better.
Rednecks don't drive cars that cost more than 100K USD.
Re: (Score:2)
So they finally fixed the Viper? I remember that used to be one of its biggest drawbacks - tons of power, but awful handling if you needed to do anything other than drive in a straight line.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Comment removed (Score:5, Insightful)
Re: (Score:3, Funny)
Re: (Score:2)
Hell, I gave my nearly 5 year old 3.6GHz P4 to my oldest, and he is blasting zombies in Left4Dead even as we speak
I read this sentence, and at first I thought you were saying you let your "nearly 5 year old" kid play Left 4 Dead!
Upgrading a component at a time (Score:2)
I just ordered a new motherboard that I think is compatible with my old P4 570 CPU. (It's 800MHz FSB, so it should work, I hope!)
The one thing I've learned over the years is to buy a good motherboard that lists your current CPU as one of the oldest ones it supports -- that way you've got plenty of room for future upgrades if you need them. The simple truth is that if I end up upgrading my CPU in the near future, it'll be because the new mobo isn't compatible, not because I need more speed.
Re: (Score:3, Interesting)
Re: (Score:2)
Some people want fast computers to do work on... Like CAD/FEA/Video Editing/3D Modelling/Scientific Research/Code Development/etc.
Besides, while someone might get 120fps in Crysis today, they will barely be breaking 30fps when the next game comes out.
Re: (Score:2)
Fast computers are still needed if you're actually using them for work...
Just as an example, compiling+linking a nontrivial program with profile-guided optimization (say Firefox) using MSVC++ takes several hours on decent modern hardware. I'd love for that to happen in, say, seconds instead. ;)
Re: (Score:2)
Re: (Score:2)
Not embarrassing, just not as fast. A Ferrari may be vastly more powerful, but a Corvette is still the much better deal for the money -- do I really need to explain this here?
Yes. In my experience, computer nerds and muscle car freaks are groups that seldom overlap. As far as I'm concerned, Ferraris and Corvettes are the same thing: impractical, expensive cars for those going through a mid-life crisis. I have no idea what you are trying to convey with this analogy.
Re: (Score:2)
The only thing I'd get the i7 over the Phenom IIs for is triple-channel RAM. That's the only upside I see with Intel chips.
Re:Seems pretty clear: (Score:4, Insightful)
Well now, they are faster. They're just not $800 faster. Nowhere near it.
Not to mention that the corresponding hardware you have to buy with a Core i7 (the expensive motherboard, the expensive triple-channel RAM) makes it even less worth it.
Re:Seems pretty clear: (Score:4, Informative)
Even better value with integrated gfx (Score:2)
AMD would end up with even better value on the low end if the cheap AMD system were built on a mobo with integrated GFX... which is good enough for everything except recent GFX-intensive games.
Yeah, there are always Nvidia chipsets... but for some reason motherboards with them are definitely more expensive (at least where I shop) than comparable boards (with either AMD or Nvidia chipsets) for the AMD platform.
Re: (Score:3, Informative)
Too bad the ATI Driver Sucks Balls (Score:2)
I recently built a similar system - Radeon HD3200 integrated graphics, Phenom II Quad 940, Ubuntu Jaunty. The box absolutely screams (it replaced an Athlon 3200+), and I'm very happy with the Ubuntu user experience (recent convert from Fedora).
But Lord of Guns and Butter, the ATI 3D drivers SUCK. Crash, crash, crash, whereas the NVidia drivers Just Work.
In retrospect, I wish I had skipped the integrated graphics and just bought an NVidia card.
DG
Re: (Score:2)
On the plus side, ATI is devoting some serious resources to open-source drivers. Take a peek over at the Phoronix forums [phoronix.com] for some ATI devs posting the latest and greatest, answering quest
Re: (Score:3, Insightful)
Plot isn't that clear to me at all.
Price/performance is slope on that graph, and AMD looks pretty good at first look, occupying a band mostly on the top-left.
They ought to have plotted performance on the x axis vs. price/performance on the y axis, and put that sucker on a log-log plot. Then we'd see who's got the best price/performance for various performance levels.
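Something like this would do it -- a quick matplotlib sketch with made-up placeholder numbers, since I don't have the article's data handy:

import matplotlib.pyplot as plt

# name: (relative performance, price in USD) -- illustrative values only
cpus = {
    "Budget dual-core": (1.0, 80),
    "Mainstream quad": (1.6, 180),
    "Enthusiast quad": (2.0, 300),
    "Flagship quad": (2.4, 999),
}

perf = [p for p, _ in cpus.values()]
value = [p / cost for p, cost in cpus.values()]

fig, ax = plt.subplots()
ax.scatter(perf, value)
for name, (p, cost) in cpus.items():
    ax.annotate(name, (p, p / cost))

ax.set_xscale("log")
ax.set_yscale("log")
ax.set_xlabel("Relative performance (log scale)")
ax.set_ylabel("Performance per dollar (log scale)")
ax.set_title("Value vs. performance (placeholder data)")
plt.show()

On a chart like that, the "knee" where value starts falling off a cliff is much easier to spot than on the article's linear scatter.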
Re: (Score:2)
I don't know about that; the AMD X4 955 is on par, performance- and price-wise, with the Intel Q9550. Of course, the i7 line is probably what you mean by high end, but you're paying at least $200-$900 more for a 25-75% performance gain.
Re: (Score:2)
The price/performance graph makes it look like there's a big difference in price between those chips, yeah. In reality, the Q9550 is only $15 more than the X4 955. With a gap that small, I'd put that down more to the choice of other components than to "wow, the AMD chip is so much cheaper for the same performance, I'd better run out and get one."
Re:Seems pretty clear: (Score:4, Insightful)
Personally I don't find it that trivial; never before has the best choice been so extremely dependent on what you need the performance for.
Say you've got some eminently parallel task, like ray tracing. With the huge price/performance difference between low end and high end, you might beat the high end in performance by buying two cheap systems rather than one expensive one. Look at the i7-940; it's just barely twice as fast as the cheapest CPU, yet it costs six times as much. That price would easily accommodate a cheap motherboard and memory, and you'd still be shelling out less.
On the other hand, say you have very few parallel tasks; then you may still be as well off with a cheap CPU. Without several tasks you're not going to see much difference between a dual-core CPU with good per-core performance and a quad core. Not an entirely rare situation when talking about desktop systems.
Or you might be in the sweet spot of latency sensitive parallel tasks, possibly applicable to some games, in which case a more expensive quad core CPU might definitely be what you're looking for.
And add to that the need for actual application specific benchmarks to determine actual performance as opposed to generic benchmarks... well, I wouldn't call the choice trivial.
Re: (Score:2)
Notice that we are making some assumptions here that may not be entirely valid. For instance, we've priced the Socket AM3 Phenom II processors on a Socket AM2+ motherboard with DDR2 memory, though we tested most of them with DDR3 memory. As you may have noticed, memory type didn't make much difference at all to the performance of the Phenom II X4 810
and we expect the story will be similar for the rest. In the same vein, we priced the Core 2 processors with DDR2 memory, though we tested them with DDR3. Our goal in selecting these components was to settle on a standard platform for each CPU type with a decent price-performance ratio, not to exactly replicate our sometimes-exotic test systems.
"and we expect the story will be similar for the rest."
You expect that, huh? Based on what? My experience with Intel processors has always been that they're memory starved. Core i7 may be a huge improvement, but please provide me proof. After all, DDR3 is more expensive, Core i7 CPUs are more expensive, and Core i7 boards are more expensive. It adds up to a 50% jump in cost. (~$500 -> ~$750+)
I'd like to know whether that was factored in - but the quote right there makes me think maybe it wasn't?
By their
Re: (Score:2)
The only significant difference between Intel and AMD processors on the memory BW front is that AMD moved to the integrated memory controller two generations before Intel did. If your experience is that Intel processors have always been memory starved, then your experience is limited to the time between when AMD integrated the controller and when Intel integrated the memory controller.
On the question of memory BW it is that simple. The company that integrated the memory controller first was ahead on deliv
Re: (Score:2)
Oh, I'm sorry - I didn't realize the E6300 and E8400 had integrated memory controllers. ;)
Here I thought they'd get a performance boost from the benchmarks being DDR3!
Re: (Score:2)
True. Those Intel chips do not have integrated memory controllers.
All the chips from both companies potentially get a boost from DDR3 memory over DDR2 memory. But, as the article demonstrates on the other pages, for the benchmarks they are looking at, the sensitivity going from DDR2->DDR3 is small.
I only talked about it in terms of integrated memory controllers because you brought up the idea that Intel parts have always been "memory starved". That statement is what I was contesting. Intel parts have
Re: (Score:2)
I only talked about it in terms of integrated memory controllers because you brought up the idea that Intel parts have always been "memory starved". That statement is what I was contesting. Intel parts have only been memory starved on some workloads, relative to AMD, when AMD had an integrated memory controller, and Intel didn't. The blanket statement that this has always been the case is not true.
You should've been fighting my conclusion - they couldn't be bothered benching with the hardware they priced out. I mentioned the i7's improvement in my first post, but I'd still like some numbers on just how big a difference it makes.
As to your point that you want to see data justifying the claims that DDR2 vs DDR3 in their study isn't all that relevant, the article provides that data. One may question the workloads they chose, but the data justifying the claim within the scope of the workloads that they did choose is there.
QX9770(DDR3) vs QX9775(DDR2)?
Mem Latency +60%
Crysis FPS Avg -21%
Crysis FPS Min -32.5% (below 30FPS; expect stutters/jolts with VSYNC enabled)
FarCry2 FPS Avg -35%
FarCry2 FPS Min -34%
UT3 FPS Avg +13%
UT3 FPS Min +4.5%
HL:EP2 FPS Avg -7%
Particle Sim Bench FPS Avg (?) +13.8%
Worldben
Re:Seems pretty clear: (Score:5, Informative)
No, but the $300 i7 920 surely is...
Re:Seems pretty clear: (Score:4, Insightful)
Not if you factor in the motherboard price difference - about $100 for X58 over AM2+. The i7 is brilliant for heavy, multi-threaded workloads, but I question how much of that performance is really worth it for an average consumer or even Slashdot reader.
Re:Seems pretty clear: (Score:5, Interesting)
My AMD X2 3800+ is starting to show its age, but there is no way I am going to buy a high-end part with Larrabee so close.
Re:Seems pretty clear: (Score:4, Interesting)
When I can, I buy the highest end I can and use it for 5 years.
I might upgrade a video card. After 5 years, it gets deprecated into the household pool.
I have 5-year-old computers running games like Team Fortress 2 and WoW just fine. Not the highest-end graphics performance, but fine enough.
When I build my next box, it will have 16 gigs of RAM, minimum. Hopefully 32. This will probably stretch its use out to 8 years.
Re:Seems pretty clear: (Score:4, Insightful)
That's one way to go, but the most economically sound way to do it has always been (slightly) more frequent upgrade cycles and lower-range hardware.
Re:Seems pretty clear: (Score:4, Interesting)
The most economically sound upgrade path for me has been to sell after 6 months, getting 95% of the purchase price back -- my people like buying my 6-month-old systems, and I like building new ones!
Re: (Score:2)
I don't ship; I have a large enough friends/family/coworker group to easily find a sale locally.
My labour, meh, 1 hour to build and image the system.
In fact most of the time I get back more than I paid at wholesale level.
Love those XP patches that allow boot in any new machine!
Re: (Score:2)
Hear! Hear! Mod that man up as insightful... in small yet functional increments.
Re: (Score:2)
The CPU is a bit dated but not much of what I do is CPU bound anyway. I run a lot of concurrent VMs that mostly just sit around and play a game here or t
Re:Seems pretty clear: (Score:4, Interesting)
That's just the motherboard, let alone the processor or RAM cost differences. Not to mention they are about to change the socket again rather soon, if I recall correctly (i.e., within a year).
AMD changes sockets a lot more gradually, letting people actually, you know, upgrade.
Re: (Score:2)
That being said, I plan on upgrading to an i7 920. My current system is a desktop, mail-, web-, file-, print-, dns-, and dhcp-server, router, firewall, and a few more things I am forgetting. [Of course I run Linux]. My current P4 2.8GHz (a space heater that also does some computation) can't handle much when I run a simple multi-threaded desktop app. The CPU has 1 core and two "hyperthreads". I look
Re: (Score:2)
High end version? Do you understand Intel's processor path? It's a mandatory upgrade with every new socket. Maybe you might want to look into AMD if you don't want to waste cash on an i7 920, which is being discontinued [bit-tech.net], by the way. Mind you, I use an i7 920, but only because I got it free. Had I not, I'd be running AMD.
Calculate cost on a per year basis including Watts (Score:2)
I've got an AMD Athlon X2 4850e. The time before that, I lasted 6 years. So far I'm having a hard time seeing the value proposition in replacing the 4850e in the foreseeable future. There is still nothing I can see that offers a better combination of performance, low power consumption, and price.
In fact, it would make more sense to calculate a cost per year to own rather than the outright purchase cost, since this is what it really costs a person. Probably most
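Back-of-the-envelope, something like this (the power draw, electricity price, and lifespan below are assumptions I picked for illustration, not numbers from the article):

def cost_per_year(purchase_price, avg_system_watts, hours_per_day,
                  price_per_kwh=0.12, years_owned=5):
    # purchase price amortized over the ownership period, plus electricity
    kwh_per_year = avg_system_watts / 1000.0 * hours_per_day * 365
    return purchase_price / years_owned + kwh_per_year * price_per_kwh

# a frugal 4850e-style box vs. a pricier, hungrier one (all numbers assumed)
print(cost_per_year(purchase_price=70, avg_system_watts=80, hours_per_day=8))    # ~$42/yr
print(cost_per_year(purchase_price=300, avg_system_watts=150, hours_per_day=8))  # ~$113/yr

Run it with your own electricity rate and usage pattern and the "cheap, efficient chip kept for years" strategy looks even better than the sticker prices suggest.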
Re:Seems pretty clear: (Score:5, Insightful)
The Phenom II X4 955 beats the i7 920 in 3 out of the 4 games they tried. The only one it lost was Crysis Warhead, and it was a narrow loss (48 vs. 46 FPS).
The price difference between these two chips is about $35 on Newegg. I think for gamers, getting the X4 955 and putting that extra $35 towards the video card will net better results. This isn't counting the additional cost of DDR3 vs. DDR2 memory, which has minimal effect on performance right now but still has a big price difference.
Re: (Score:2)
I know that some people hate to hear the "yea, but if you overclock part X" argument, but here goes...
You can pick up a Core 2 Duo E8400 Wolfdale ($168 @ Newegg) and an Arctic Freezer 7 Pro HSF ($37), and perform a very, very modest overclock from 3.0 to 3.33GHz, for a total of $205. Hell, you could probably perform this overclock with the stock cooler with no issues and save a further $37.
Now you have the equivalent of the $270 E8600 C2D, which also rates high in their gaming benchmarks (beating the Phenom II
Re: (Score:2)
Maybe I'm missing something but the links you posted do not show the E8600 beating the 920, 940 or 955, except where it beats a 920 by less than 1 frame per second. All that tells me is that games are not CPU bound.
"71% increase in cost vs. a 2%
Re: (Score:2)
"Maybe I'm missing something but the links you posted do not show the E8600 beating the 920, 940 or 955, except where it beats a 920 by less than 1 frame per second. All that tells me is that games are not CPU bound."
The E8600 beat all of those CPUs in TFA, not in the links I posted.
Re: (Score:2)
The Phenom II X4 955 beats the i7 920 in 3 out of the 4 games they tried.
AMD always leads in games when you compare similarly priced chips ... until you overclock them :(
Re: (Score:2)
The point of getting an i7 920 is to overclock it to 3.6-4GHz+... At that point it blows the Phenom II out of the water.
Re: (Score:2)
I am not one such; but they are hardly nonexistent.
Re: (Score:3, Insightful)
I agree so much I'm going to take it to the next level:
Almost all hardware today is ridiculously powerful for ridiculously cheap. Review sites implicitly agree by reviewing at full detail at 1600x1200 or higher.
Re: (Score:2)
Re:Seems pretty clear: (Score:5, Insightful)
In science we call that honesty. You can make a tiny little difference look REALLY BIG by cropping your graph so that it only shows a very tiny range. By showing the origin tiny differences look, well, tiny.
Re:Seems pretty clear: (Score:5, Insightful)
My issue with the graph is that someone needs to take a class on "how to make a graph". 0% performance and a $0 CPU... why? Was there a $0 CPU? Did any of the CPUs get a 0%?
Probably because the person who made that graph for The Tech Report wanted all the proportions to be honest.
Did you ever read the book How to Lie with Statistics [wikipedia.org]? Or the book How to Lie with Charts [amazon.com]? Or a nice, short blog post called Graphs That Lie [gyford.com]?
When you chop out some of the "wasted space" in a graph, you distort the graph. Unless people are careful and check where your axes begin, and then mentally visualize where the axes go, they'll get a misleading idea of the data from the graph.
Suppose the bottom part of the graph was sliced off, at the 90% line, to make you happier. Imagine what it would look like. The AMD X2 6400+, sitting at the 100% line, would have very little white space under it; and the Intel i7-920, sitting a bit below the 200% line, would now appear to be ten times faster than the AMD X2 6400+. The numbers would be the same, but the visual impact would be that the Intel chip totally blows away the AMD chips.
The graph is good the way it is.
I'll meet you halfway, though: it wouldn't have hurt for them to have put in a second chart, zooming in on just the most crowded areas.
steveha
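If you want to see the effect for yourself, here's a quick matplotlib sketch; the 100%/190% figures are just stand-ins for the X2 6400+ and i7-920 bars described above, not the article's exact numbers:

import matplotlib.pyplot as plt

chips = ["AMD X2 6400+", "Core i7-920"]
perf = [100, 190]  # percent, stand-in values

fig, (honest, cropped) = plt.subplots(1, 2, figsize=(8, 4))

honest.bar(chips, perf)
honest.set_ylim(0, 200)    # origin shown: ~2x difference looks like ~2x
honest.set_title("Axis from zero")

cropped.bar(chips, perf)
cropped.set_ylim(90, 200)  # bottom chopped at 90%: the same gap looks ~10x
cropped.set_title("Axis starting at 90%")

for ax in (honest, cropped):
    ax.set_ylabel("Relative performance (%)")

plt.tight_layout()
plt.show()

Same data, same numbers; the cropped panel just makes the faster chip look like it's in a different universe.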
Correlation is positive (Score:4, Insightful)
Re: (Score:2, Insightful)
You perhaps aren't cynical enough. I wonder how much this chart would change if the chips were overclocked. That would tell you if some of the slower processors were simply underclocked versions of the faster ones.
Re: (Score:2)
Judging overclocked chips?
Are we talking about a 32 second test, or a test over several months?
Does the overclocked $250 system catch your house on fire after 6 hours of gaming?
Re:Correlation is positive (Score:4, Funny)
I've always tended to buy right behind the cutting edge, where processors usually take a dive in price relative to speed. When I bought my processor, I paid $300 for a quad-core Intel last summer; the next step up was $395, then $600, then $1000, but the $1000 chip wasn't 3.3x faster, the $600 one wasn't 2x faster, etc. However, the $150 Intel chip was 40% slower, so price/performance sort of follows a log function, where the trick is finding the right place before the price starts to skyrocket in relation to performance. I'm sure that if you could find a bunch of old Pentium 1 processors on the super cheap, in bulk, you might get the same price/performance as a new mid-level Intel chip, but why would you want to run a Beowulf cluster to get the same throughput? (Ignoring the price of all of the other components, since the article is about the price/cost of the CPU alone, that was the metric I went by as well.)
I wonder what the price/performance ratio of a Pentium 1 that cost $0.01 is.
Well, I suppose that running a Beowulf cluster is its own reward.
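For fun, here are the ratios implied by the prices quoted above; the relative-speed values are my own rough guesses from the "wasn't 3.3x faster / wasn't 2x faster / 40% slower" remarks, not benchmark numbers:

# (price in USD, assumed relative speed)
tiers = [
    (150, 0.6),   # "40% slower" than the $300 part
    (300, 1.0),   # the quad-core actually bought
    (395, 1.1),   # guess
    (600, 1.3),   # "the 600 wasn't 2x faster"
    (1000, 1.5),  # "the 1000 wasn't 3.3x faster"
]

for price, speed in tiers:
    print(f"${price:>4}: {speed / price * 1000:.1f} units of speed per $1000")

Even with these rough guesses, the value per dollar falls steadily as the price climbs, which is exactly the "find the spot before the skyrocket" point.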
Re: (Score:3, Insightful)
Aside from the positive correlation, the other main point I would take away is the apparent 'knee' in the chart at about the $300 mark.
Seems that is the point of diminishing returns.
Re: (Score:2)
So CPUs are sort of like wine?
I get your underlying point, but wine has more subjective values (wine enthusiasts would debate this, though) that come into play, whereas CPUs have more quantifiable comparisons. That said, for both CPUs and wine, I would think that most of the people who are buying "high end" products are a little more discerning about their purchases.
Re: (Score:2)
The other thing to note is that there are some CPUs that by this metric are clearly just not worth it, where there are cheaper ones that perform better.
Depends on how much you value the time it takes to complete $PROCESSOR_INTENSIVE_TASK. If I hired software engineers for $100,000 a year, that's roughly $1 a minute, and so it might very well be "worth it" to spend $1000 on the latest N-core processor (compilation being nearly infinitely parallel) if that saves 5 minutes per compile over 200 compiles in the year.
The only thing the plot proves is that there are diminishing returns, not that any particular price/performance point is not "worth it".
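Putting the parent's own numbers into a quick back-of-the-envelope script (the 48-week, 40-hour working year is my assumption):

# assumes a 48-week, 40-hour working year
engineer_cost_per_minute = 100_000 / (48 * 40 * 60)   # ~$0.87/min, close to the $1 above
minutes_saved_per_year = 5 * 200                      # 5 min/compile * 200 compiles
value_of_time_saved = minutes_saved_per_year * engineer_cost_per_minute

print(f"time saved is worth ~${value_of_time_saved:.0f}/year against a $1000 CPU premium")
# ~$870/year, so the chip roughly pays for itself in a bit over a year --
# before counting the harder-to-price cost of breaking developers out of flow.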
Only Intel compatible .... (Score:3, Interesting)
I am fed up with all these people who think that all the world is Intel compatible -- when there are better CPUs out there.
Re: (Score:3, Insightful)
They'll cover ARM when someone sells a motherboard with a socket I can stick a 2+ GHz quad-core ARM in and get performance equal to an equivalent AMD or Intel chip.
As it stands, ARM is irrelevant outside the embedded/pda/non-x86 netbook scope.
Re: (Score:2)
GBA, DS, and DSi have ARM CPUs (Score:2)
tell me whenever I can get an ARM CPU that can render, play games, etc.
Games? Nintendo has put ARM CPUs in its handhelds since mid-2001.
Too bad (Score:2)
Because the desktop PCs of the world DO run Intel-compatible chips, and that is just how it is. Doesn't matter if you don't think it should be that way; that's how it is, and that's what tech sites deal with. Windows is of course by far the dominant desktop OS, with over 90% market share. Well, Windows only runs on Intel and AMD chips. Most versions are x86 and x64 only; there are a few IA64 (Itanium) versions. No ARM, no Power, etc.
After Windows is MacOS. All new Macs sold for a number of years have been Intel ch
Windows Mobile for subnotebooks (Score:2)
Windows only runs on Intel and AMD chips.
Unless it's Windows CE. I seem to remember that some early examples of what everyone but Psion now calls "netbooks" [wikipedia.org] ran Windows CE.
Where do you go to buy an ARM desktop? I am not aware of any vendors selling them.
There used to be desktop computers that ran RISC OS [wikipedia.org], but I get your point.
Re: (Score:2)
Windows CE is bollocks; it's barely acceptable for a smartphone or PDA. I have a tablet with it (WebDT 360), and it's a nightmare trying to use it as a real OS. (I'm working on building Angstrom Linux; I have a build, and now need a customized kernel and X server. Fun times.)
Re: (Score:3, Informative)
Well, it seems somebody makes ARM PCs... ;>
http://en.wikipedia.org/wiki/A9home [wikipedia.org]
http://www.advantage6.com/products/A9home.html [advantage6.com]
http://www.cjemicros.co.uk/ [cjemicros.co.uk] - just 600 Pounds!
But yeah, I agree with you.
Though I wonder what upcoming ARM netbooks will bring; with the existing official Debian ARM port, we might even see the one true desktop Linux distro that you mention, Ubuntu...
Re: (Score:2)
It makes little sense to compare ARM boards. They are usually power-optimized (with peripherals consuming a substantial fraction of the power), not performance-optimized. Also, end-user devices might be underclocked and undervolted. Besides, if you want performance with non-x86 CPUs, then look at MIPS CPUs (which are usually used for things like AV processing in set-top boxes).
Now, a comparison of end-user devices (netbooks/phones) with ARM/MIPS CPUs might actually make sense.
A better solution. (Score:5, Interesting)
What is the cheapest CPU that can playback 780P flash well?
That is probably a good CPU for 99% of the population. Flash is a resource hog and is likely to be the most intensive thing that most people use.
The next step up would be to list several games and see what is the cheapest CPU that can play them at say 60FPS at good settings with a $99 video card.
If you're a video editor or hardcore PC gamer, transcode a lot of video, or run CAD, get the fastest CPU you can afford.
So hard-core types should buy i7s, and pretty much everybody else should buy AMD once you take into account RAM and motherboard prices.
Also, if you are planning on running virtual machines, AMD is often the better choice. Intel doesn't support virtualization on a lot of their CPUs, while I think AMD does on their AM2 and up CPUs.
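If you're on Linux, a quick way to check whether a given chip actually advertises the hardware virtualization extensions (vmx = Intel VT-x, svm = AMD-V) is to look at /proc/cpuinfo; a small sketch:

def hw_virtualization(cpuinfo_path="/proc/cpuinfo"):
    # return the virtualization extension advertised by the CPU, if any
    with open(cpuinfo_path) as f:
        for line in f:
            if line.startswith("flags"):
                flags = set(line.split(":", 1)[1].split())
                if "vmx" in flags:
                    return "Intel VT-x"
                if "svm" in flags:
                    return "AMD-V"
                return None

print(hw_virtualization() or "no hardware virtualization flag found")
# note: the flag can be present while the BIOS still has the feature locked out

Worth running before you assume a bargain "value" chip will do KVM or Xen HVM guests.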
Re:A better solution. (Score:4, Insightful)
Re: (Score:2)
That is kind of the point. If you want to step it up then go for one that supports 1080p and that will be good enough for most people.
And actually I think the Atom doesn't handle 720p all that well yet.
That is the interesting thing about CPUs today. The best price-to-performance ratio isn't always important.
If you have no need for the extra performance, don't get it. As it is, very few applications will use an i7 well. Most people just don't need 8 threads.
Re: (Score:2)
Incorrect. NVIDIA Ion-based nettops are already playing 1080p video smooth as silk. It's more about the graphics adapter than it is about the Atom.
Re: (Score:2)
That is Blu-ray and not Flash. That is why I specified Flash.
Re: (Score:2)
"Flash Player 9 Update 3, released on December 3, 2007,[2] also includes support for the H.264 video standard (also known as MPEG-4 part 10, or AVC) which is even more comp
Re: (Score:2)
I'm doing it on an IBM T42 with a 1.7GHz PIV and a Radeon 9700. That's like 4 years old. We really don't need faster processors than we already have for general-use computing. Games, on the other hand...
I use Ubuntu (with Compiz), but I suppose Windows should be also fine, probably even better as drivers are faster.
Flash is a resource hog? Little red riding hood! (Score:2)
Flash is a resource hog? Little Red Riding Hood for you!
http://vimeo.com/3514904 [vimeo.com]
How can you call that a hog? It is a blast, on any CPU.
Re: (Score:3, Interesting)
But, to answer your question, probably the new class of netbook like the Pegatron [engadget.com], which, interestingly enough, is running a Freescale processor with an ARM-based core. This little netbook also (supposedly) has GPU acceleration for Flash, is incredibly thin, and sports, I think, 6 hours of battery life.
Re: (Score:2)
Sorry that was a typo. Yes I meant 720P
And yes I think one of those ARM notebooks would be very handy.
Re: (Score:2)
Ancient CPUs (upgrade comparison) (Score:2, Interesting)
Do any of the CPU reviews use old CPUs? What I want to know is how much faster today's CPUs are compared to my 3-6 year old CPU, but these hardware reviews typically have a low end much newer/faster than my current system. Practically, a 50% CPU edge is too marginal for me to upgrade, but if a new system was 3X faster than my current aging machine I would be tempted!
Re: (Score:2)
Do any of the CPU reviews use old CPUs? What I want to know is how much faster today's CPUs are compared to my 3-6 year old CPU, but these hardware reviews typically have a low end much newer/faster than my current system. Practically, a 50% CPU edge is too marginal for me to upgrade, but if a new system was 3X faster than my current aging machine I would be tempted!
tomshardware.com
Re:Ancient CPUs (upgrade comparison) (Score:5, Informative)
Tom's CPU chart is a great tool, but they don't generally compare older chips to newer ones. They also change the testing methodology from time to time, so there's no real way to directly compare old vs. new.
Anandtech has a new CPU benchmark site that compares everything from a single-core Atom up to the top-of-the-line Core i7. They've also recently added two Pentium 4 chips to the mix, so you really can directly compare them to the newer stuff. Check it out:
http://www.anandtech.com/bench/ [anandtech.com]
What I'm seeing... (Score:5, Interesting)
The performance scales sub-linearly with the price, and ends up almost flat at the extreme end. This means you need to examine the cost of SMP vs. a more powerful CPU. Two X2 6400+ chips in an SMP should give you about the same performance at the same cost as one i7-920, after you add in the extra for the upgraded chipsets and mobo.
More powerful low-end chips become more and more effective when SMPed versus their higher-end rivals. The other benefit of going SMP is that you have fewer cores sharing the same cache, therefore increasing the number of distinct tasks you can perform in parallel effectively without cache-flooding.
Of course, you can't SMP forever - the largest SMP array you can make before the system slows down by more than the CPU increases performance is 16-way. Even before then, you lose linear scaling fairly early on. So you end up balancing the different CPUs against the different methods of arranging them to get the best performance for your money.
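To make the "two cheap boxes vs. one fast chip" trade-off concrete, here's a toy model; the 10%-per-socket efficiency loss is an arbitrary assumption standing in for interconnect and coherency overhead, not a measured figure:

def smp_throughput(single_cpu_perf, sockets, efficiency_loss=0.10):
    # each additional socket contributes a geometrically shrinking share
    return sum(single_cpu_perf * (1 - efficiency_loss) ** i for i in range(sockets))

cheap_chip, fast_chip = 1.0, 2.0   # relative performance, illustrative
print(smp_throughput(cheap_chip, sockets=2))  # ~1.9 -- close to one fast chip
print(smp_throughput(cheap_chip, sockets=4))  # ~3.4 -- ahead, if the workload scales
print(smp_throughput(fast_chip, sockets=1))   # 2.0

Tune the efficiency loss to taste; the point is just that the break-even depends entirely on how well your workload parallelizes and how badly the scaling degrades.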
Re: (Score:3, Insightful)
/.'ed (Score:5, Funny)
Re: (Score:2)
> Apparently, the tech report should have benchmarked their web server before putting this article up.
It's probably more bandwidth-related than computer-related. When the County first put our sex offender registry online, we had a measly T1, which got saturated by the community within minutes of the local news stations carrying the announcement. Our (low-power at the time) Linux web server registered 0.00 0.00 0.00 loads most of the time. If it wasn't for the server logs expanding rapidly, I wouldn't
Article doesn't really talk about features (Score:5, Informative)
It's too bad the article doesn't talk about things like Execute Disable, virtualization support, etc. For a power-user audience like /. these are important considerations.
For me, things like not being able to install Xen, or use Windows 7's XP mode, are complete deal killers. I want CPUs with those features, especially when shopping for "value CPUs".
Getting something like an E8190 is a mistake that will bite a /. power user in the ass eventually even if it is a few bucks cheaper than an E8200 and delivers the same performance, at the same wattage, etc...
Intel Atom 330 (Score:4, Informative)
I've been something of an AMD fanboy ever since the Athlon came out, but I just bought an Intel Atom 330 for a lightweight file server, and I have to say I'm thoroughly impressed. 64-bit, dual-core, virtualization extensions, and low power to boot, for around $80 including the motherboard. Simply unbeatable.
Also wanted to mention that these guys [cpubenchmark.net] have easy-to-read benchmark charts of a wide variety of CPUs. Certainly more than the 26 in TFA. Benchmarks don't tell the whole story of course, but it's a good start for quick-and-dirty comparison.
Re: (Score:2, Funny)
Re: (Score:3, Funny)
With Vista? There must not be much money left for the hardware!
i7 920 - the most cost effective? (Score:2)
I hadn't finished reading the report before the site went under, but the i7 920 seems to be the most cost-effective according to their statistics.
Re: (Score:2)
Honestly, it doesn't matter much any more. Except for certain things like audio and video ripping/encoding, 3D design rendering, etc., most any modern dual-core CPU will do whatever a home user needs just fine. Yes, even for gaming. I'm running a socket AM2 Athlon 5600+ and it's not been a bottleneck yet. My video card could stand to be upgraded for some games though.
So your method of choosing a CPU makes more sense than any other right now.
Re: (Score:3, Funny)
Easy: people click, then get scared of all the words and run away. But they still access the page.
Re: (Score:2)
The computers are only going to use around 350W at max load. Games don't utilise more than two cores (with some rare exceptions), so you could've got better real-world performance for less money with a dual-core CPU. The E8400, for example.