AMD Launches Budget Processor Refresh
MojoKid writes "AMD has again launched a bevy of new processors targeted squarely at budget-conscious consumers. Though Intel may be leading the market handily in the high-performance arena, AMD still provides a competitive offering from a price/performance perspective for the mainstream. HotHardware has a performance quick-take of the new 3.2GHz Phenom II X2 555 and 2.9GHz Athlon II X4 635. For $100 or less, bang for the buck with AMD is still relatively high."
I agree... (Score:5, Informative)
I agree, I have a Phenom X2 and my whole system cost me a mere €300, including sound, HDD and good enough video to have a 3D GNOME desktop.
And this is where the money in processors is (Score:3, Insightful)
The standard pleb doesn't really give a damn whether it can crunch a billion petaflops in under a nanosecond, or heat a cup of water standing on the desk by its sheer awesomeness.
All they care about is whether they can chat to their friends, write a letter, browse the intert00bs and lose the last bit of their privacy by posting everything on facebook.
Re:And this is where the money in processors is (Score:5, Insightful)
Wow, tell us more about the stupid sheeple, oh insightful one!
Re: (Score:2, Interesting)
Actually, the new AMD chips are also very valid options if you are interested in bang for the watt. 65 W TDP for a pretty fast quad core is awesome if you're doing number crunching.
Re: (Score:3, Insightful)
I can't say anything about your comments on the use patterns of the "standard pleb", but I do quite a lot of structural engineering work - which involves extensively using commercial structural analysis software on a daily basis, and even developing my own programs - and I do all that on a "cheap and disposable" Athlon X2 4000+ system which cost me around 250 euros three years ago.
The thing is, you may have far more powerful CPUs in the market but the truth is that, although they can cost huge amounts of money,
Re: (Score:2)
Don't be so cautious with describing video (Score:2)
It's fast enough for any usual non-gaming usage...and also for most games, if you're fine with mostly ignoring latest-gen ones (and really, with so many great older ones that's easy). Plus it is consistently passively cooled.
Re: (Score:2)
Re: (Score:3, Insightful)
Also, if the majority of the public has these slower CPUs, what sane game maker is going to make games that do not at least run on these machines? That sounds like a good way to lose 90% of your profit.
Re: (Score:2, Interesting)
Re: (Score:2)
Re:Don't be so cautious with describing video (Score:5, Interesting)
Games almost never require high-end systems. There are a few that come along that won't run on anything less than the latest and greatest, but it is extremely rare. Most games will run on mid-rangeish hardware, and won't have a problem with parts a couple of generations out of date. They won't let you max out all the detail in that case, but they'll run just fine.
Most people do not have high end systems. Many systems are older, after all not everyone upgrades all the time, and even when they do they often don't buy the high end parts. As such game makers support that. They usually also have higher detail settings for people with higher end systems, since those people often also spend more money, but they don't usually cut out the more mid range market.
Right now most games run quite well on a dual core in the 2GHz+ range with a $100ish current graphics card or a $200ish older graphics card. By well I mean with details turned up a reasonable amount and smooth gameplay.
Re: (Score:2)
Details on medium-low, and at lower resolution (1280x1024 or 1440x900 seems to be the limit, 800x600 is better, and usually 1680x1050 is not very good. Higher resolutions need not apply).
Usually "smooth gameplay" means decent minimum frame rates
Re: (Score:2)
I get a good frame rate at 16x10 (>40fps) and a decent frame rate at 19x12 (~30fps) with a 5-year-old Athlon 64 X2 4200+ (2x2.2GHz) and a Radeon 4850. Recent games I have played were Dragon Age and Torchlight - and all settings were at least on medium, most on high.
IMO, level of details in most games already exceeds by a huge margin whatever normal human being can perceive. Pushing for more details is pointless.
Re: (Score:2)
"High details" are usually shown in super-zoomed images in games reviews.
30 fps is absolutely enough as a minimum frame rate (depends on the game, though - less might be enough in some titles).
Re: (Score:2)
Lol. Depends on what you call “run”.
If you call 18 fps with medium graphics settings and medium resolution “running”, then yes.
But if you want your game to not look like the previous generation, or even worse, your PC has to be new too.
Re: (Score:2)
I play Call of Duty Modern Warfare 2 just fine with a Geforce 8800 GTS 320MB
That is high quality settings and 1920x1200 resolution
Framerate? Don't know, it's smooth enough, which should be around 30-40 frames per second on average.
Spending more than $100 on a graphics card today is just a waste of money. Except if you would like to play a game like Crysis on Linux, where you need the extra power to get decent performance. Then again,
Re: (Score:2)
People buy sportscars to show off, so why not buy "sports-computers" to show off?
Re:Don't be so cautious with describing video (Score:4, Insightful)
I would also be glad to see the term "console port" go away. It's nonsensical; it implies there was some amount of "porting" being done...while that's not really true nowadays, not after the efforts of MS. Same dev tools, same team, same engine, similar art assets; there's no porting taking place, only two parallel and largely common efforts. Not exploiting the strengths of both platforms (do you think the console side of such a game is really optimised for the hardware?)
But the term must be convenient for publishers, with players pointing fingers at those "evil consoles" instead of pointing them at...publishers.
Re:Don't be so cautious with describing video (Score:5, Insightful)
Console ports require more thought than "recompile with a different target".
Re: (Score:2)
And no main menu option for "quit". And an uncontrollable third-person camera. And a game mechanic only workable on an analogue movement controller. And so on, and so on.
Re: (Score:2)
Um....no.
First, as other posters have pointed out, there are other consoles besides the XBox.
Second, we do have different processor architectures between the Xbox 360 and the PC. Now, if your coders are competent that won't be a problem, but at a game company you'll likely find some relatively young guy who's absolutely sure that his assembly is faster than what the C compiler can produce, so he codes something in ASM. Or he assumes pointers are 4 bytes. Or that numbers are big-endian*. Or any one of the
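The two assumptions called out above - 4-byte pointers and big-endian integers - are easy to probe at runtime rather than hard-code. A minimal illustrative sketch (my own, not from the post), using Python's standard library:

```python
import struct
import sys

# Probe the native pointer size instead of assuming 4 bytes.
# "P" is the struct format code for a native void pointer.
pointer_bytes = struct.calcsize("P")   # 4 on a 32-bit build, 8 on 64-bit

# Probe the native byte order instead of assuming big-endian.
# x86 PCs are little-endian; the 360's PowerPC is big-endian.
native_order = sys.byteorder           # 'little' or 'big'

# Serializing with an explicit byte order sidesteps the endianness
# trap entirely: ">I" packs big-endian on every platform.
wire = struct.pack(">I", 0x01020304)
assert wire == b"\x01\x02\x03\x04"

print(pointer_bytes, native_order)
```

The same idea applies in C via `sizeof(void *)` and explicit byte-swapping helpers; the point is that neither property may be baked into the code.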
Re: (Score:2)
But while saying this you should have in mind, say, PC version of Quake and port of it on PS1...
The Wii is largely out of scope here; it's very different, with a large portion of exclusive games (which also require vastly different coding practices), and those games aren't the ones people think about when saying "console port". The PS3 seems to invalidate my point...but, when you look closer, multiplatform titles present also on PS3 are sometimes built on a middleware engine; and often simply end up not very good on eit
Re: (Score:2)
For the vast majority of console games the point stands. Most games are playable on all three major console platforms. Of the games that aren't available for all three platforms, most of those are either Wii/Xbox only, or Xbox/PS3 only. The games that exclude the Xbox are few and far between.
Since the Xbox is simply a specialized PC with an OS created to handle Microsoft's standard graphics API, and since Microsoft's graphics API is far and away the dominant API in the PC market, the line between PC and
Re: (Score:2)
What makes console ports console ports is the fact that the games are completely dumbed down to fit the expectations of your average console gamer.
They are not smart, nor do they have taste.
They want what plants crave.
Re: (Score:2)
Yup, convenient for publishers indeed...it's "their" fault, those dumb console gamers.
Hasn't it ever crossed your mind that for people who have also enjoyed consoles for a long time, those "hybrid games" (that's the more proper term) are also horrible? Also dumbed down?
Hey, they might as well have come from Peggle, Solitaire or flash games, right?...
Re: (Score:2)
At that price, I wonder if AMD can compete on the performance-per-price scale.
E.g. I bet you get a ton of AMD CPUs for the price of one high-powered Core i7.
What we need is 4-8 socket mainboards!
anyone recommend a good AMD mobo for a hackintosh? (Score:2, Offtopic)
Seth
Re:anyone recommend a good AMD mobo for a hackinto (Score:2)
Yes, something with an Intel processor. Save yourself the headaches, it's not worth the savings.
Re: (Score:2)
Totally agree. Hackintoshes just don't play nicely with AMD CPUs.
Re: (Score:2)
Re: (Score:2)
The Hackintosh has to look as much like a real Mac as you can manage.
Of course that means Intel only for the CPU.
An alternate ATI or Nvidia GPU (beyond what's in actual shipping Macs) might even be a problem.
Re: (Score:2, Informative)
Don't get me wrong, i
Re: (Score:2)
So, would there really be any difference if an Nvidia board similar to the Intel 9300 ones were the variant...for AMD?
Re:anyone recommend a good AMD mobo for a hackinto (Score:3, Interesting)
Please visit the OSx86project wiki page to have those questions answered. I'll tell you off the bat that you will have to patch the kernel, which already puts you at risk for a world of hurt if your other components don't play nicely either.
Plus, AMD actually outperforms Intel in some areas (Score:3, Interesting)
Re: (Score:2)
Some of us like the higher fidelity from the unsmoothed pixels that software rendering masters!
Damn kids with your fancy bicubicly stretched pixels and anti-aliased edges. Get off my screen!
Re: (Score:2)
Flash, sure, it barely makes a dent in my Q8300, but my old P4 could never run any of the popular Flash sites fullscreened.
Would the quad cores work in a small case? (Score:2)
Just make sure the hot air gets out - fans (Score:2)
Re:Would the quad cores work in a small case? (Score:5, Informative)
As for the case... I don't have a suggestion for that at the moment.
(The best place to pick up the AMD CPU & MB is over at MWave, since they'll bundle it, assemble the CPU and RAM onto the MB, and test it for you. So you're never left holding a bag full of incompatible parts.)
Re:Would the quad cores work in a small case? (Score:5, Informative)
Like you I need a Linux machine for work-related computing-intensive work, so I assembled one last fall. I use a decent quality MicroATX case with the Gigabyte MA785GPMT motherboard and the Phenom II X4 955. Add 8GB of memory and a drive and you're set. I was going to add a separate graphics card at one point, but so far I actually use the on-board graphics, with the 2D-only free drivers. I don't need speedy graphics for showing terminal output and static graphs, after all.
The system came in cheap, it's really quiet and it's surprisingly speedy. True, it's barely half the speed of the 8-core Xeon machine I have at work - but at only an eighth of the cost.
My only advice is, don't go too cheap on the case. That's the single most important part for determining the noise level, and there's nothing so irritating as having a constant high-pitched whine from under your desk all day long.
Re: (Score:3, Insightful)
You're modded up, so I'll just add a +1 comment to your observation on the case. For the last decade or so, the most expensive part of my systems has usually been the case. Of course, there were only about 3 of them in that decade. A good case is a must.
So far, I've been a loyal Antec fan. Roomy, rolled edges, rails for everything, good ventilation...I have no complaints about their cases. They are damn well built.
Re: (Score:2)
Re: (Score:2)
A half-height discrete card will fit in that case just fine, and you'd see a nice graphics boost if you like your HTPC to do double duty.
Re: (Score:2)
The lower-power Phenom II chips have the lowest TDP around; functional units and whole cores can be powered down when not in use. You're not likely to see the latter while using your machine, except that the Phenom II X3 processors make use of it so that the disabled core doesn't even cost you any power.
Intel v AMD (Score:5, Informative)
I build new boxes every 6-8 months or so and rotate them into production boxes to make room for the next set. Until recently the Intel chipsets were ahead of the game vs the AMD chipsets with regards to things like E-SATA, AHCI, and PCI-e. AMD has caught up in the last 8 months, though. High-end Intel cpus tend to be a bit faster than high-end AMD cpus and you can also stuff more memory into small form-factor Intel boxes vs small form-factor AMD boxes.
On the flip side, AMD boxes tend to be cheaper all-around and aren't quite so gimmicky when it comes to managing CPU speed vs heat dissipation. Whole systems based on AMD seem to eat less power, which matters from a cost standpoint when running systems 24x7. Power is getting to be quite important.
If you are trying to create the fastest, highest-performance box in the world Intel is probably your game (and for graphics you are going to be buying a 16x PCI-e card anyway with that sort of setup).
If you ratchet down your expectations just a bit, however, you can slap together a very good box with AMD at its core for half the price and 85% of the performance, and that is going to be plenty good enough for just about anything considering how overpowered machines have gotten in the last few years vs what people actually run on them.
Personally speaking I see no point purchasing the absolute bleeding edge when it is just going to become non-bleeding edge in 8-months when I can instead purchase two of something just slightly behind the bleeding edge at a much lower price.
These are just my observations.
-Matt
The problem I've had (Score:2)
Is that AMD chipsets have been buggy in my experience. Well, for the most part it seems like there haven't been actual chipsets made by AMD; they've always been third party like nVidia, VIA or ATI. At any rate they seem to have bugs, sometimes minor, sometimes severe. The worst was back with the original Athlons: I got one and could not make it work with my GeForce 256. I found out this was because the AGP bus was out of spec and didn't work with the GeForces at all.
That is one of the main reasons I've stuck with Int
Re: (Score:2)
Well, if you want to go that far, to Athlon and GF256 days, Intel had their share of problems too... (unstable P3 Coppermines just above 1GHz mark? Flaky motherboards with Rambus chipset & bridge?)
That said, yes, many chipsets for AMD had problems - but you could always find something solid. SiS chipsets, which you don't seem to even remember, were particularly impressive - perhaps slightly slower than the Via, Nvidia or ATI alternatives, but absolutely rock-solid and trouble-free, on par with Intel (for example,
Re: (Score:2)
3 is a bit optimistic if you're calling it bleeding-edge.
High-end, sure, heck even ultra high-end, but as soon as a slightly better version comes out you're not bleeding-edge any more, and that often happens in less than 3 months.
Less Garbage From AMD (Score:4, Insightful)
Good Review of the 555 (Score:2, Informative)
It's a pretty decent/entertaining review. He also talks about overclocking.
http://www.youtube.com/watch?v=CNcE3GND3sQ&feature=sub [youtube.com]
It's the hard drive stupid (Score:5, Interesting)
You know, 1 core, 2 cores, 3 cores, 1,000,000 cores - I have realized it means exactly jack if the data they need to crunch is still sitting on a friggin' hard drive.
My processors and I would do flips and flops if we could just get some damn data off our drives. Come on - we have basically not had a real leap in hard drive speeds or technology in how many years?
I mean, solid state is great and all, but it still has a long way to go. What happens when we need to start pushing terabytes like megabytes?
We've got a RAM and cache arms race going on because the hard drives suck and no one seems to be doing anything about it.
The best we can do is RAID tricks to get any more performance (or reliability, for that matter), and that has well-known limits and problems.
Re: (Score:2)
...when video processing is no longer a highly CPU-bound activity, you will have a point there.
Until then... not so much.
You may be hard-pressed to stress the network with a number of clustered multi-core machines. Never mind overwhelming disks or HBAs.
Re: (Score:2)
Re: (Score:2)
You know that there is this technology called RAM, right?
Also, there is cache. And nowadays, optimization consists of fitting your algorithm in the cache. Then when it's done, you take the next block. So what back in the day was swapping (to disk) is nowadays swapping (to RAM).
I recommend you buy yourself as much RAM as you can fit into your mainboard. That should help.
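The cache-blocking idea described above - do all the work on one cache-sized chunk before moving to the next - can be sketched like this (an illustrative example of the technique, not code from the thread; the 8192-element block size is an arbitrary made-up figure):

```python
def blocked_sum_of_squares(data, block_elems=8192):
    """Sum of squares computed block by block.

    Processing one small slice at a time keeps the working set
    resident in the CPU cache instead of streaming the whole
    array through it repeatedly. The result is identical to a
    naive single pass; only the memory access pattern changes.
    """
    total = 0
    for start in range(0, len(data), block_elems):
        block = data[start:start + block_elems]  # chunk sized to fit in cache
        total += sum(x * x for x in block)       # finish all work on this chunk
    return total

print(blocked_sum_of_squares(list(range(100_000))))
```

In Python itself the payoff is negligible, but the same loop structure (loop tiling) is exactly what optimized C/Fortran numeric kernels use to keep hot data in L1/L2.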
AMD (Score:5, Insightful)
If nothing else, AMD serves as a counterweight that keeps Intel from being a monopoly. Further, they actually make some pretty good chips.
I support AMD because they keep Intel in check. And as a bonus their chips aren't that bad.
Why is everyone pooh-poohing AMD? (Score:2, Interesting)
And the Athlon II X4 630 2.8GHz quad-core processor is getting great reviews at newegg [newegg.com], with good potential for overclocking, even with the stock cooler.
There are a few great motherboard/CPU combo deals going on right now at newegg. Quad-core for $170 [newegg.com] and dual-core for $90. [newegg.com]
it _always_ depends (Score:5, Insightful)
When are people going to learn that performance _depends_ on what you're going to process?
I remember, a few years ago, we had a server with an Athlon XP 2600 (its real clock was 2.1GHz AFAIR). A perfectly speedy machine for desktop usage, but as a server (pure CPU load in that case, no I/O bottleneck) it was having a real hard time. We eventually replaced that machine with an old 4x Xeon (P3-based, 500MHz), and things went back to normal.
I already suspected what the problem could be, so I decided to run a test, replacing - temporarily - the Xeon-based server with a Sun Ultra 30 (1x UltraSPARC II @ 300MHz).
Well, the Sparc not only survived the test, but also kicked the Athlon's ass hard. Still, as a desktop machine, the Sparc was mediocre.
The difference was that the Sparc had 2MB of L2 cache, while the Athlon had only 256kB (even with 2x the bandwidth and lower-latency RAM). In _that_ case the L2 cache made all the difference. Per MHz, the Sparc also beat the Xeon machine (1MB L2 per processor) by a large margin.
Athlon's (pre-64) performance compared to P4 (sorry, I don't have an i7 to compare against a X4) varies. For desktop usage the Athlons felt snappier in general, but with some performance "hiccups" when you started to tax the machine more. The P4s felt slower overall, but the performance seemed to be more homogeneous.
Which one was better then? Well, that's a good question. I personally preferred the "slower but smoother" P4, but Athlons were fine and I could recommend both processors for home usage,
You know what really, really sucks?
Those benchmarks they publish around.
I mean, "XYZ fps in Crysis"? LAME MP3 encoding times? Synthetic benchmarks?
Those say nothing to me. Run some database benchmark, or measure the time it takes to compile the Linux kernel using all cores at once... Or move GBs of data in the memory N times etc. Then it might be interesting.
Re: (Score:2)
Wish I had some mod points today...
Someone please mod the parent up as insightful!
I only use Intel on mobile (Score:2)
For everything else, AMD's price/performance ratio can't be beat, Intel's superior marketing notwithstanding. It would cost me twice as much money to get an Intel processor and a decent Intel chipset mobo for the desktop I'm running right now. Quite frankly, I think this price differential is much better spent on a 128GB solid state drive.
Re:watts of boom (Score:5, Informative)
I know we're not supposed to RTFA, but you're off by ~100W for both Phenom II processors in the review, which are 80W and 95W.
Re:watts of boom (Score:5, Informative)
Having worked on these processors at the circuit level(*) I can tell you that your '100W over TDP' number is rubbish.
If you'd like to know more about what happens when chip vendors fudge on this "invalid metric" search for "nvidia bumpgate". If our chips were running at 100W over spec'd TDP we'd have a lot of very unhappy customers.
* yes, I'm an engineer at AMD and I designed major components on the parts discussed ITFA. I did my time at Intel as well.
Re: (Score:2)
Re: (Score:2)
Re:It's been a while since I considered AMD (Score:5, Informative)
Would you please elaborate on the "poor performance"? What are/were you doing - gaming, video encoding, or what? I have a 64-bit X2 dual-core in a system I built for $300. The only reason I considered a 64-bit processor was so I could stick 4GB of RAM into it, so please further elaborate on how "they burned (you) with their 64 bit processors". What additional benefit were you expecting from the 64-bit architecture? I've used this machine for some CPU-heavy statistical/programming work (Natural Language Processing), and it performed adequately. It even handles high-detail Civ4 games well, despite using only onboard video.
The Atom is FAR inferior in terms of performance, so to answer your question: no. The Atom is designed for mobile computing, so it sacrifices performance for power-saving gains. These AMD chips are meant to compete with Intel's low-priced desktop-oriented CPUs.
Re:Why do you feel I owe you an explanation? I don (Score:5, Insightful)
Um, of course you are entitled to your opinion. However, if you want to air your opinion in the public square and are not willing to share any details to back it up, you're no better than the crazy dude on the corner talking about the faeries that visit him at night.
Re:Why do you feel I owe you an explanation? I don (Score:5, Insightful)
Just to join in the fun; if you post your opinions on a public forum you are expected to back up your claims with examples and logic. If you cannot do so, either because of personal beliefs, or other restrictions such as NDAs, then do not post them.
Of course, while you are certainly entitled to your opinion, that alone worth little on a discussion board. The merit of this system comes from the fact that others may examine your arguments, and either adjust their own beliefs, or reply to your data with their own data. Saying you believe something and not backing it up adds little to the discussion; none of us know you, so we cannot judge if your opinion really has merit. And do not be too surprised when people start trying to interpret your post and "putting words in your mouth." That just means you didn't explain things well enough, so they had to draw their own conclusions.
While I do not believe you are trolling, I do think you completely missed the point of the comment system, at least for this topic.
Re: (Score:3, Insightful)
You don't *have* to justify your opinion, but no one *has* to listen to it or give it any relevance.
By posting here it can be assumed that you want your opinion to be heard and considered, and thus you probably do care about people listening to it. Thus it would be expected that you would justify your opinion and not respond like a flaming mule.
Re: (Score:2, Insightful)
And the person who asked for more information said please! Imagine that. He was pleasant, and Blappo or whatever was rude in response.
Re:Why do you feel I owe you an explanation? I don (Score:5, Insightful)
Comment removed (Score:5, Insightful)
Re: (Score:2)
If you can't back up your trash talk, then the rest of us can just assume it's mindless noise.
If you make a claim, then expect to be called out on it.
Your complete inability to articulate why exactly the AMD gear isn't fast enough is a pretty good indication that any objective speed difference is moot. ...and yes, there is an objective difference. Some of us can even give firsthand accounts of those differences and actual numbers. However, I don't think it matters so much for the vast, vast majority of consumers
Re:Why do you feel I owe you an explanation? I don (Score:4, Informative)
Why? I don'thave to explain/justify anything, so why would I?
Well, you demanded the same thing from other people [slashdot.org], to cite your own words:
"I want youto support your assertions. YOU MADE THEM after all" [sic]
So it's entirely reasonable to ask the same thing of you.
Re: (Score:3, Insightful)
Re:It's been a while since I considered AMD (Score:5, Insightful)
Intel processors in lower-end price brackets might often score a win, but only if you consider the price of CPU alone. Intel GFX is crappy. There is Nvidia integrated GFX available...but for some reason the motherboards with it are usually quite a bit more expensive than the AMD ones. A cheap AMD CPU with cheap integrated GFX offers the best all-around performance - as good as any other setup for "daily" tasks, and definitely more 3D oomph than comparatively priced alternatives.
Re: (Score:2)
Intel processors in lower-end price brackets might often score a win, but only if you consider the price of CPU alone.
Odd... Looking at a major webshop around here, there are 6 AMD CPUs to choose from before you get to the 2nd-cheapest option from Intel. And that includes a quad-core from AMD.
Re: (Score:2)
Re: (Score:2)
Pure lack of funds pushed me in the past to buy an Athlon 64 X2. Before that I was exclusively with Intel.
Seeing how my 5-year-old 4200+ (2x2.2GHz) CPU works now, I really see no point in buying e.g. an i7 9xx - which I could easily afford now.
I have looked at past benchmarks to try to find out how outdated my CPU really is. The difference is at ~50-60%, I'd say. But it works fine for most of the workloads I use it for: development/compilation, video and games.
Keeping that in mind, I find it outrageous now to even think of
Re:It's been a while since I considered AMD (Score:4, Informative)
Actually I think with Intel you can get burned more easily. "Phenom II X2 3.2GHz" really tells you all you have to know. If you are buying higher numbers, you get a better CPU.
On the other hand with a Core2 the case is not that clear. Is a Core2 Duo 3000 MHz better than a Core2 Duo 2833 MHz? Nope, the former one is an E6850, the latter an E8300. And even those numbers won't tell you much. Higher model numbers are often better, but not always. For example the Q6xxx models have Intel VT, the Q8xxx don't.
That is not a big problem for us enthusiasts who dig up and understand every bit of information about a CPU. But to less tech-savvy people I will always suggest AMD. Even if Intel's good chips are better than AMD's, chances are they'd pick a bad one and would be better served with AMD.
Re: (Score:2)
Q8XXX got VT in mid 2009.
I have a Q8300 and it has VT, see http://ark.intel.com/Product.aspx?id=39107 [intel.com] for further details.
Got a heck of deal on it, so I can always upgrade the CPU to a Q9XXX and still come out on top.
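On Linux, the VT question above can be settled by looking for the "vmx" (Intel VT-x) or "svm" (AMD-V) flag in /proc/cpuinfo. A small illustrative sketch (my own, parsing a sample flags line so it runs anywhere; the abbreviated flag list is made up for the example):

```python
def has_hw_virt(cpuinfo_text):
    """Return True if any CPU in a /proc/cpuinfo dump advertises
    hardware virtualization (VT-x as "vmx", AMD-V as "svm")."""
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            # "flags : fpu tsc ... vmx ..." -> split off the flag list
            flags = line.split(":", 1)[1].split()
            if "vmx" in flags or "svm" in flags:
                return True
    return False

# Sample line in the style of a VT-enabled Q8300 (abbreviated):
sample = "flags\t\t: fpu tsc msr pae sse sse2 ssse3 sse4_1 vmx est"
print(has_hw_virt(sample))
```

In practice you'd pass in `open("/proc/cpuinfo").read()`, or just run `grep -E 'vmx|svm' /proc/cpuinfo` from a shell.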
Re: (Score:2)
On the other hand with a Core2 the case is not that clear. Is a Core2 Duo 3000 MHz better than a Core2 Duo 2833 MHz? Nope, the former one is an E6850, the latter an E8300.
And the wildly different model numbers - the things they're sold by - tell me right off they're fundamentally different and need to be looked at closer. Without even looking, one's 65nm and one's 45nm. I pull up the specs and the cache sizes are also different.
Maybe it's just me, but the fact that the difference between AM2 and AM3 parts was just a "II" after the name (Athlon II and Phenom II) didn't strike me as the most obvious way to advertise that change.
Conclusion: Buying parts because of a single number
Re: (Score:2)
That is not a big problem for us enthusiasts who dig up and understand every bit of information about a CPU. But to less tech-savvy people I will always suggest AMD.
It's irrelevant to "less tech savvy people" because these sort of details simply don't matter to them.
Re:It's been a while since I considered AMD (Score:5, Informative)
What are you talking about? AMD64, also known as x86_64, was invented by AMD.
The performance is absolutely stellar.
AMD did this so well that Intel decided to copy them, and came up with EM64T.
As a whole, there is barely a noticeable performance difference between the two platforms.
Of course there are some low-performing 64-bit procs for budget users, just like there are slow Intel procs for budget users.
But overall, Intel 64-bit procs are no better than AMD 64-bit procs.
Also, when it comes to hardware virtualization and IOMMU, AMD has a very significant edge.
Don't blame AMD because you bought the wrong proc model for your system, or misconfigured it. The processor is definitely not the only thing that impacts performance. There are many other ways you can screw up your system's performance in picking hardware components -- not all procs are ideal for all configurations.
Hell, I'm very often getting better performance with Linux and Windows (dual boot) out of my AMD Athlon 64 X2 5200+ Windsor 2.6GHz than with my Intel Core 2 Quad Q9400 2.66GHz, and much better benchmarks for certain types of workloads.
Even though the quad-core machine has 8GB of RAM, and my dual-core machine only has 4GB...
I blame it on the Linux and Windows kernels' poor support for multi-processing and seedy memory management.
Re:It's been a while since I considered AMD (Score:5, Interesting)
A couple of years ago I ran two very similar five day long geophysical jobs (pre-stack time migration) on an 8 CPU AMD system and an 8 CPU Xeon system of equivalent speeds. All CPUs were at 100% over that time with the exception of some disk access at the start and disk writes every twelve hours for checkpoints. There was a five minute difference over that week and the margin of error was probably more than twice that.
I haven't been able to tell the difference since then either.
Re: (Score:2)
I blame it on the Linux and Windows kernels' poor support for multi-processing and seedy memory management.
Compared to what ?
Re:It's been a while since I considered AMD (Score:4, Interesting)
So say you, but can you prove it was an issue with the processor, and that it was a design issue? Do you have information backing this up?
I think slashdot readers might be interested in the remarks of someone more experienced with both AMD and Intel processors, rather than someone who tried an AMD CPU once, didn't do their due diligence, and just assumed all AMD procs were broken because their system was.
It's happened too many times to count that I got a defective Intel processor that had the thermal monitor "broken" in some way that caused the proc to always throttle its clock down.
Chips were replaced under warranty, and then all was well. Every manufacturer has had bad batches; that's why you do burn-in testing on CPUs, memory, and motherboards before deployment.
I've dealt with different systems totalling a few hundred different AMD CPUs, and not run into any defective ones yet, or caveats to 32-bit or 64-bit AMD procs.
I'm not saying Intels are unreliable or anything, and I hope I'm not jinxing myself: but so far, all (perhaps) 10 DOA or otherwise defective CPUs I've seen in my life were Intel processors.
Re:Speaking of"readers" and "due diligence" (Score:4, Insightful)
Are you honestly arguing that a poster's choice of "an" vs "some" disqualifies his entire argument? Getting into semantics much? The point still stands that he tried a very small number of CPUs, and by virtue of that small number, his opinion is not likely to be worth much.
Perhaps if the original poster said he ran a cluster of a thousand AMD CPUs, or had even just tried several different generations of AMD CPUs, your point would have merit. However, a person is not a fanboi for pointing out obvious inconsistencies, regardless of whether he misremembered a not particularly significant number.
Re: (Score:2)
Re: (Score:3, Insightful)
but AMD64 is a very minor extension to x86 and leverages SSE.
Load of [wikipedia.org].
Intel had a 64-bit extension in the 90s ...
Do you mean PAE [wikipedia.org]? Then it's a totally different story, and btw it is still supported and used when a 32-bit OS has to access more than 4GB of RAM (in a limited way).
Interesting. To what, then, would you attribute the rise of Linux? It was precisely because enterprises got tired of *nix vendor lock-in to expensive hardware - which already in the 90s was underperforming compared to x86. Linux allowed many legacy *nix applications to move to cheap OTS hardware, and that's actually how it (Linux) made its first inroad
Re:I'd just like to say FUCK YOU to the "troll" mo (Score:5, Informative)
1. Screaming and yelling at a poster for being an idiot because they do not agree with you,
2. Failing to recognize that the commenting system on
3. Not understanding that capitalization of words is to be done in accordance with the proper rules of grammar, and not as a means of yelling louder over your perceived persecution.
Thank you, the AC who burned 3 mod points on one poster in one article. Posting as AC, obviously, to preserve my moderation.
Re:I'd just like to say FUCK YOU to the "troll" mo (Score:5, Funny)
You were modded down because your comment provided no useful information for Slashdot readers.
As far as I'm concerned, you said AMD CPUs were "garbage" while refusing to back up that claim with supporting evidence. I and a lot of other visitors know from personal experience that AMD products aren't garbage. If you're going to make this claim, you'd better back it up with meaningful performance metrics.
And by that I mean
The moderators read your comment which, IN THEIR OPINION, underperformed.
Re: (Score:2)
Re: (Score:3, Informative)
Your posting history clearly identifies you as a troll, so you don't get the benefit of doubt anymore. If it looks trollish, and it comes from you, then it will be considered one.
Using ALL CAPS and calling people idiots doesn't help, either.
Re: (Score:2)
Yes, we do, but on the other hand, kWh/FLOP is much lower now than before.
Re:AMD=Awful Macro Devices For A reason (Score:5, Funny)
Re: (Score:2, Insightful)
Re: (Score:2)
Not agreeing with the troll, of course (from what I see it's not worth it for me to even check what he specifically wrote), but Intel is still more open, in practice, in one important respect.
Fully usable free software GFX drivers on the Linux side. You can have a trouble-free experience without relying on binary blobs (well, excluding BIOS/etc.).
Re: (Score:2)
AMD/ATI also have fully open drivers available; although they are not complete by any means, an older ATI card running open drivers still seems to outperform an up-to-date Intel card.
AMD is still releasing documentation for their latest cards, so hopefully the driver situation will keep improving.
Re: (Score:2)
The point is more that Intel drivers can be called "production ready", more or less, right now. The AMD ones...not so much. I've been hearing "next year" a bit too long for my taste.
And that's coming from somebody with an old R200-based card lying around, which has rather decent OSS drivers (actually, only those are left supporting it).