It would be a shame if, after four or five years of being a strong competitor, AMD loses focus on what is relevant in the market and just keeps making faster, more power-hungry CPUs.
The last thing we need is for Intel to have no real competitors. Innovation would slow and prices would hike up.
It's not as if AMD was suddenly not making fast CPUs, just that Intel's best was faster than AMD's best.
There was just a little issue of whether they make the fastest. I would expect that few people would notice the difference in speed between a top AMD and a top Intel chip in actual use.
Afaict Intel and AMD both make significant profits from selling the cream of their production at huge markups. It is much easier for AMD to do that if their cream is currently better than the competitor's cream (the same applies to Intel, but to a lesser extent, because Intel is the gorilla with lots of contracts and marketing).
I just bought an Intel Core 2 Duo, and I love it. But I was beginning to worry that AMD's rocky quarter, lack-luster product line up, and soon to be cut backs, might lead to a less competitive playing field. But I'm excited to hear that AMD is still in this fight and will be upping the ante for my next PC purchase.
"Up to" is sugar-coated for "You can't expect any better than this" with a implicit translation of "It can get a whole lot worse".
Ex: if CPU X gets "up to" 100% more performance than CPU Y, but in all tests but one actually delivers 1% of CPU Y's performance, I'd rather have CPU Y.
"Up to" means nothing to me, except as an advertisement for the competator; whichever has the least unpleasant average and worst case performance is the one I'm interested in.
"Up to" means nothing to me, except as an advertisement for the competator; whichever has the least unpleasant average and worst case performance is the one I'm interested in.
And those numbers wouldn't be indicative of anything either. The problem with CPU benchmarks is that they have no single real-world application; everyone has different needs. However, the marketing types on both the supplier and consumer sides need numbers to push in front of each other, so they make up these things, which those of us in the field understand have no real-world meaning.
Actually, benchmarks DO have some real-world meaning - but only for comparison. If your specific needs happen to be similar to the things benchmarks stress, then you can expect the results to be relevant. If your needs differ wildly from benchmark methods, then you can expect the results to be irrelevant - but most likely they will be equally irrelevant.
Benchmark performance is, at the very least, a better indication of relative performance than clock speed or cache size.
Not true. I have a top-of-the-line computer from 3.5 years ago, and it cannot play high-def trailers. Combine this with lots of Flash video, video conferencing, etc., and you have various CPU needs that must be met. It is hard for me to believe too, but Flash and QuickTime are pushing the limits of people's hardware.
This is an indication that the code sucks. You should be able to play almost anything with a processor rated 3000+ or better. I am sure that you can find some unrealistic 1080p/MPEG4 content that chokes it, but that's the exception.
Athlon 2700, GF6800; 720p QuickTimes run at about 10 fps, don't know why. Pretty fresh install of WinXP and I don't use IE, but spyware never seems to be outside the realm of suspects.
The programs that make up SPEC *are* real-world programs that have been selected by the SPEC committee as representative of typical workloads. The mix of programs that makes up SPEC changes over time to reflect the fact that CPUs become more capable. In particular, many older benchmarks in the SPEC suite have been dropped because they now fit entirely in L2 cache.
All that said, this is the hardware equivalent of Frederick Brooks's maxim that "the paper tiger is always better than the actual one until reality intervenes".
When the fastest Barcelona is ~2.5GHz and Clovertown is 3.0GHz, comparisons at the same frequency are pointless. What matters in reality is performance at the same price or performance at the same power or highest available performance at any price.
That article happens to state that 1) only the 3800 is an EE chip, 2) they're running on one of the most power-hungry motherboard chipsets made for AMD (the nVidia 590 SLI), and 3) only the X2 5000+ is a 65 nm CPU. So basically: let's stack the deck as much as possible against AMD in this test without showing a best-case scenario, while postulating that they're showing a "worst-case scenario" with a "bad E6300 sample".
I usually like Anandtech, but this article could almost have been sponsored by Intel and is far from objective.
I agree with all that. IIRC, the advantage is to AMD for idle power consumption also; OTOH, with virtualization on the upswing, CPUs are less and less idle these days. Speaking of virtualization, AMD seems to be doing the right thing there... nested page tables and so forth, along with the L2/L3 cache combo which they tested as being better for virtual environments.
AMD has an advantage over Intel at any given process size, on the grounds that the IBM-AMD SOI process is basically better at any given node.
With Intel, the chipset-to-CPU link is also the CPU-to-CPU link: one CPU has to go through the chipset to talk to another. Also, if AMD were to copy Intel and put 2 dies on the same CPU, they would have a better link between them that would not eat up chipset-to-CPU bandwidth.
This post is very confused. First of all, you can't put two dies on the same CPU, or at least it would be a horribly bad idea. You can put two CPUs on a die. Now, I thought AMD already did this, but they could just package several chips together, and I'm feeling too lazy to look it up.
Anyway, yes, Intel chips must communicate over the FSB. However, as I've recently been finding out, they don't do that much communicating. For instance, most cache state info is generated just by listening on the FSB.
Over the past week we have heard about Intel's dominance and flashy new products, AMD's disastrous quarter, and now AMD's supposedly dominant new offering.
I read tech news daily and am getting sick of the media wars... It is no wonder casual users get fatigued trying to keep up. Casual opinions depend on which day (or week or month) a person chooses to research product offerings. It is no wonder I am always hitting a brick wall when trying to get my users to educate themselves so they can get more out of their tech. They don't know what to make of all the posturing.
This is not a function of the tech world developing *that* quickly. It is a result of the major players trying to out-strategize each other. I don't want to see any more benchmarks (or hear about any more promised software) until I am standing in front of a demo machine that is running the tech.
Well, blame Intel. ;) Seriously though, Intel's got the performance lead for now, but AMD's got the better tech, and their release schedule "lags" Intel's a little. So Intel got the "jump" on AMD release-cycle-wise, and now you've got a situation where Intel has a brand spanking new product out that beats AMD's old offering by about 10-20% at best at stock speeds.
I personally am waiting for AMD's release and benchmarks before making a final decision, but the fact that I'm doing so already says which way I'm leaning.
FWIW, it seems to be AMD doing all the posturing. Intel seems to have taken a "no response" approach to media claims, instead producing product and letting guys like Tom's Hardware do their thing. This isn't to say they don't advertise, but they don't take out full-page NYT (or was it Washington Post?) ads chest-pounding like AMD does.
That's why we're constantly hearing about the performance advantages of Penryn. If anything, AMD has been a bit quieter than Intel. Compare the references on the Wikipedia pages for Barcelona and Penryn if you want evidence.
Well, as much as I think Intel usually gets a bad rap on Slashdot and similar places, in fairness I ought to point out that this is really easy to do when you have the performance crown. Taking out ads bragging about their superior performance would mostly just give people a reason to doubt Intel (if they're taking out ads, does that mean it's in doubt?), while AMD had better take out those ads, as at the moment no one else is going to do it for them. If the situation ever reverses, expect the strategies to swap too.
I think this has more to do with the fact that "Intel Inside" and such have been ingrained in people from Intel's past advertising. The general public is much more likely to have heard of Intel than AMD, which means AMD has a much greater need to get their name out there than Intel.
Well, fact is, when it comes to CPUs (Athlon, Sempron, X2, Core 2, Celeron, all these things over the last decade), if you spent X on one product (not including server chips), the difference in performance was generally measured in a few percent. On the other hand, buying video cards has gotten BAD. The numbers, ratings, even the price points have become seemingly almost random. The markup is enormous at just about any retail chain. Higher-priced cards can sometimes have lower performance.
Indeed. I work in a law office as a graphic designer/web designer/video editor. That's what I do all day (when I'm not reading slashdot).
2 of our attorneys just got quad-core Mac Pros with Studio displays. For writing documents on. Maybe the occasional slide show. I'm stuck on this 3-year-old Dell with dual CRT monitors. Old ones.
Sorry, just had to bitch a little. Your comment is more real-world than you may have realized.
Dude, convince them to get you a cheap Mac mini and set up Xgrid clustering on those quad-core Mac Pros. You'll be leeching off their processors in no time. :)
[...] Learn how to sell and start earning what your [sic] worth.
I hate selling. I'm damn good at it, but I hate it. To quote Zoidberg:
It's all so complicated with the flowers and the romance and the lies upon lies!
Selling's about kissing ass and pushing off whatever it is you're selling on whoever has a wallet, no matter what their needs. Ok, at the retail level maybe not so much. But any sales job that pays close to 6 figures, yep.
Yeah, I'll pass. And I'll be doing what I want when I leave work at 4:30.
I think the bitch was more that the budget was being used in such a way that it was not doing any good for anyone. Partners X & Y don't need five-grand machines to write up their depositions on. A better use of the budget would have been to buy them some nice three-grand machines and spend the other four grand on an upgrade for someone whose work relies more on the amount of computing power they have than those attorneys' does. But of course you just wanted to troll. I wonder how much you have invested.
Last time I worked at a law firm, partners didn't write depositions. They signed depositions they had associates write. On the rare occasions they had to "write" something themselves, they did no such thing. They dictated and handed a tape to their secretary.
I think one of the major reasons why AMD did so poorly last quarter was its silly marketing campaign. Towering billboard signs and large airport ads tout AMD as the "smarter choice", since it uses less power.
Marketing a chip as using less power is like having Toyota run an advertising campaign aimed exclusively at wheelchair-bound people: the group you're targeting has few people in it, and they're going to research any product they buy anyway. The server market is important, but when I buy my shiny new server, power consumption isn't my first consideration, nor is that the only thing AMD offers.
With this announcement, I'm hoping AMD starts a new slogan touting, say, speed. That's primarily what I buy a processor for. AMD's always been fast for the cost, and it's high time they marketed themselves as being faster and better rather than "as good as" Intel. My new pick for a marketing slogan? "Upgrade to AMD". AMD should position its chips to be slightly more expensive at every pricing tier, but in doing so, blow them away in performance. (In the present economy, businesses have money and will gladly spend more on products they feel are superior. Ford spends more money on marketing than BMW, but which would you rather own? AMD should be trying to make Intel look like Ford, rather than being the "Ford alternative".)
AMD is marketing to a minor concern of a niche audience, when they ought to be using their superior performance (at a given price point) to sell hardware. Would you rather be a "power saver" or "upgrade to AMD"?
The server market is important, but when I buy my shiny new server, power consumption isn't my first consideration, nor is that the only thing AMD offers.
That's nice, but when we look at purchasing $250k-$500k of servers, power consumption is an important factor.
Back in the days when dual-cores were just beginning, this indeed was HUGE. Do you want 30% more Irwindales which would require 100 tons of cooling, or the AMD dual-cores which require 30 tons of cooling? The same is going to happen at the dual-core/quad-core boundary.
As CPUs get cheaper and cheaper while A/C systems remain a constant cost, the people who spend large amounts of money are going to look more and more at power costs. They're probably aiming at business customers who don't buy *a* server, but buy a *hundred* servers.
Exactly. When you have a datacenter (or even a room) full of servers, the amount of heat you have to dissipate is very important. Electricity and cooling costs are huge. And unlike a server, which you buy once and may use for 3-5 years, you pay for electricity and cooling all the time. The electricity/cooling costs over the lifetime of a product can often exceed the cost of the server itself, so anyone not treating the power consumption of their systems as a high-priority item (desktop, server, anything) is doing it wrong.
Sorry, but performance/watt has become one of the major metrics by which CPUs are chosen, especially in business and double especially in business server rooms. Performance/watt is the new performance/$, because wattage tells you how much the hardware is going to cost on a continual basis for both the electricity to run it and for the necessary air conditioning. These costs dominate over the initial cost of the hardware, and are thus more important. Plus, if you have limited space you have a limited heat capacity, and higher perf/watt means you can get more perf in your server room.
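To put rough numbers on that, here's a back-of-the-envelope lifetime-cost sketch. Every figure in it (server price, wattage, electricity rate, cooling overhead) is an assumption for illustration, not vendor data:

```python
def lifetime_cost(server_price, watts, years=4,
                  price_per_kwh=0.10, cooling_overhead=1.5):
    """Hardware cost plus electricity over the service life.

    cooling_overhead approximates the extra power spent on air
    conditioning per watt of server draw (a PUE-like factor;
    1.5 is an assumed value).
    """
    hours = years * 365 * 24
    kwh = watts * cooling_overhead * hours / 1000.0
    return server_price + kwh * price_per_kwh

# Hypothetical boxes: a cheaper, hungrier server vs. a pricier, frugal one.
hungry = lifetime_cost(server_price=3000, watts=450)
frugal = lifetime_cost(server_price=3500, watts=250)
print(hungry > frugal)  # True: the power bill flips the comparison
```

With these assumptions the 450W box costs about $5,365 over four years against about $4,814 for the 250W one, so the $500 sticker premium pays for itself.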
My point was that the metrics are important, but the people buying the servers are going to do research regardless of the marketing material. If AMD marketed itself as "better" you'd consider it just as if they marketed themselves as "power-saving". While AMD may gain some traction in the server-farm market from the ads, they waste the opportunity to gain traction in any other market by targeting a niche, and in doing so make people who want to buy AMD chips for other reasons think twice.
My point was that the metrics are important, but the people buying the servers are going to do research regardless of the marketing material.
Tee hee hee! Of course, that's why there's all those ads in computer trade rags and the wall street journal, because the people buying servers always do their own research and never believe marketing materials.
AMD should market to their entire market.
Based on what? They can't exactly claim top performance right now. They can't claim the longest battery life for laptops, either.
Would you rather be a "power saver" or "upgrade to AMD"?
Be a power saver.
When Intel came out with Centrino, I bought one almost at launch. When AMD came out with Winchester, the Athlon 64 that made the gigantic leap in price/performance/watt, I bought two. When nVidia started making lower-clocked GPUs that didn't need a fan and wiped the floor with ATI in price/performance/watt, I bought three over 3 years (6200, 7600, and now, 8500) (that was the main reason; the other one being ATI's shitty drivers for Linux). Now, I'm looking at a new ultraportable, and AMD's performance per watt will again decide where my money goes.
Wait a second... So you want AMD to miraculously make their CPUs faster at any cost (they recently tried to, with less than impressive results) and additionally raise prices by an amount not proportionate to the speed gains? AMD took and held its chunk of the home-user market by being cheaper and/or better than Intel. Once they start having worse pricing (and also horrible efficiency), it's time to consider jumping ship.
Also, forgoing power (and thus heat) efficiency isn't going to make them any friends.
Maybe you are merely speaking of home use, but even there, I want power for less cost too.
And I assume there's a strong correlation between power and noise. Most people's homes are quiet enough for a loud machine to really call attention to itself.
SSE4? Please, don't get distracted over little things like whether or not I can cook!
SSE4? I'm not buyin' either AMD or Intel until they're at least at SSE256. What's that? It'll take a while? That's OK, I don't have the monies to get them now anyway. <sarcasm/>
For my type of workloads, straight SSE2 is still just fine. I'll take an improvement on that now instead of, say, waiting for the x86 world to match AltiVec instruction for instruction. But I would go for a wider ISA: give me 4x64-bit registers.
What's really relevant to me is the performance per dollar... not just dollar of CPU cost, but also dollar of whole-system cost (including software, if that goes above zero), and dollar of energy cost (including the cost of shoving waste energy out the back door in seasons when it does me no good to keep it indoors).
Why? That observation seems spot-on to me. I recently bought an AMD system instead of an Intel one due to price. I've been primarily a Mac user for the past 18 years, and I needed to replace my aging G4, so I had no opinion either way as far as brand went. I chose my performance level based on benchmarks at Tom's Hardware and other places, and then I set out to build a machine that met those minimum specs. While it looked like I could get a faster Intel machine if money were no object, the AMD's price/performance won out.
It would be more relevant to know how it performs on real-life tasks, e.g. a kernel-compilation-time comparison...
Interesting definition of "real life tasks" you have there.
For the majority of the computing population, I would suggest that "real life tasks" would be more accurately defined as downloading and playing porn, rendering MySpace pages, and running Norton Antivirus together with the 28 different systray applets installed by Dell during the manufacture of their shitwreck of a PC.
I'm all for heated competition, and it's great that AMD can claim integer performance supremacy on the high end again for a while. But at what cost do they make that claim? The article mentions that the 8222 SE is priced at $2149. So if I want a system with more than 4 cores, I'm bound to pay ~2.5x as much per processor.
I can get a workstation with 8 3GHz Clovertown Xeon cores from Apple for just under $4000; 8 Opteron cores at the same clock will cost me more than twice that for the processors alone, never mind the rest of the system.
Sorry, AMD, but I don't get my panties in a bunch over CPU speed any more. The CPU isn't the bottleneck that it once was. Truthfully, I have not seen a significant benefit to higher CPU speeds since circa 300MHz days. Except for gaming, things seem to always work at about the same speed. The rate at which I can type this message is limited not by CPU, but by my fingers; the speed with which I browse the web is limited not by CPU, but by my ability to skim for content; the speed with which I get real paying work done is limited not by the CPU either.
Wasn't Intel recently showing some Penryn benchmarks with up to 50% improvements, depending on application?
All pointless till we have a 3rd party compare Penryn to Barcelona. I imagine neither will have much impact till 2008 as both will be production limited this year.
Really, I find my current PC fast enough. What I want is lower power and heat for the entire system. Now, if AMD can produce a cheap and silent system with good graphics performance, I am all for it. Say something as fast as an X2 4400+ and an nVidia 7600 GT, all for about $300, and you have a winner. You will sell millions. A quad-core system? I just don't need it yet.
If there was a way to lock in 50% of that performance for encryption and malware scanning and all the other security gorp that's killing us, that would be great.
Please don't dump it into another golly geewhizbang video or multimedia processor subfunction on the chip.
Since AMD keeps pushing the ship date of Barcelona out (now Q3) and Intel keeps pulling the ship date of the Penryn-generation quads (Harpertown) in (now Q4, maybe Q3?), the relevant comparison is not going to be between Barcelona and Clovertown, but between Barcelona and Harpertown. I suppose, since AMD only wants to compare same-clock chips (probably because they won't get higher-clocked Barcelonas for a while), they may start arguing that we should only be allowed to compare Intel's old 65nm products (not the 45nm) to 65nm Barcelona, too.
When Apple chose Intel chips, a huge mindshare of tech enthusiasts became enamored of Intel. I look into myself and see the irrational plus sign in front of Intel and minus sign in front of AMD, though I'd been a huge fan of AMD before. What the f? Both companies are just awesome innovators. I hope they keep sharpening each other, proverbially, as steel against steel. Gotta love the geniuses in our midst.
Yeah, and notice that they say "at the same frequency" when Intel currently has a frequency advantage (not as big as the P4's was, but then again Core 2 isn't an IPC dog like the P4 was). Even though I don't expect any minor improvements Intel makes in the next 60 months to produce their own 50% leap in performance, this comparison still seems very suspect. As in pure marketing BS.
However, AMD doesn't need to attempt to become relevant again. They are currently very relevant. Did Intel become irrelevant when they were behind AMD on performance? No. In the past, AMD did lose more by not having the performance crown, and one could certainly imagine the momentum they were gaining in the K7 days fading quickly if Intel had come out with a superior chip. But today, AMD has both the market share and the OEM support to be merely competitive performance-wise and still be relevant. So they lose out at the top speed grades. If they can continue to match up their products to Intel's at lower speed grades, and they will, then they will continue to be a good choice for many people, and will definitely still be relevant.
Note that AMD has 3GHz Opterons out now. The frequency advantage at the moment is slim to non-existent in shipped CPUs.
What matters is what clocks Barcelona will ship at, which have not yet been specified. In any case, if Barcelona lives up to AMD's stated performance expectations, it will be a killer CPU. Your statement about Intel's potential improvement leaps is spot on, and falls in line with InfoWorld's Tom Yager's statements about Intel, which are essentially phrased as "Core 2 is Intel's last hurrah". Why? Because Intel is essentially running on 10-year-old technology and is rushing to catch up, despite some of the nifty architectural things they did recently to speed up the C2D (integrated L2 cache, for example).
I also believe that Intel is now following AMD's lead by leaving extra headroom for those who prefer to OC their CPUs and concentrating more on TDP and stability. I've noticed that Intel's chips since the P4 are certainly more stable, while my rather severely OC'd AMD CPU occasionally (twice this year) shuts down, most likely due to heat or a RAM instability (since the shutdowns happen during low-usage periods at night, I'll bet the 20% OC'd RAM is probably to blame).
Basically, right now Intel owns the crown, but they own it while comparing to AMD's last gen CPUs which are 3+ years old.
Intel is also leaving a big gap for AMD to fill with the price of their quad core cpus. A grand? AMD should be able to compete with that. I just want a shuttle with four cores, and if AMD delivers with procs that are 'only' $500 that will be pretty significant.
Intel has already announced major price cuts for the Fall. I can't see how this does anything other than hurt Intel and current sales. For instance, I'm sure I'm not the only one that sees that and goes... hmm, I might have considered a new Intel CPU, but the prices are dropping significantly in a couple of months. And then there's the AMD roll out at the same time, maybe I'll wait and see what that brings to the table.
For the last 30 years, all of semiconductors has been based on "and it'll be even cheaper in six to nine months". It has always been "buy it now, or wait six to nine months for the next cool stuff".
Actually if Intel is running on 10 year old chip design technology (I presume this is primarily a remark about the FSB) then this suggests they have the potential to *radically* increase performance over AMD. If Intel's process advantages and chip design teams can actually gain a performance advantage while using 10 year old technology they can sit back and pick the low hanging fruit (changing to modern methods) and gain huge performance boosts while AMD has to do truly innovative things to gain any performance increases.
Frankly, this sounds more like fanboi talk than serious analysis. If your goal is to diss Intel and give AMD props, then saying they are using 10-year-old technology makes sense. If you are actually trying to argue that AMD's future is much brighter than Intel's, it's totally nonsensical. If Intel can gain huge performance benefits just by copying stuff AMD is doing now, while AMD has to make huge advances just to stay competitive, I know who I would put my money on.
Err, you're assuming Intel in fact has a brand spanking new architecture up its sleeve, ready to be rolled out at any given moment. State your sources please. Otherwise, let's not speculate on the future x 2.
Exactly. Netburst was their new technology that was supposed to be "the future" and we all know how well that worked out. It really is quite impressive that they've done as well as they have at evolving the P3 architecture. But as an AMD fan, this scares me because I don't know what they're going to do
Intel has been doing a lot of work to get around the relative limitation of not having an on-board memory controller. When they integrate the memory controller, quite a bit of that work will be irrelevant. Sure, being smart about how you cache things is good. But Intel has done what it can to reduce its dependence on a bottleneck. When that bottleneck is removed, they won't necessarily be able to scale to full use of the new bandwidth. I think it is AMD that has been increasing performance by actually removing bottlenecks.
It's not purely about the FSB, although that is a major shortcoming for multi-core designs. Intel won't radically improve their performance by going with a HyperTransport-style design, but they'll improve their scalability (the quads are already beyond the maximum FSB; see the Xeon benchmarks for Mac Pros that are available). The 10-year-old design refers to their falling back to the P-III core and then adding in a bunch of cache logic, fab processes, and other more minor tweaks to improve that core's performance to where it is today.
Wouldn't it be great for every app to essentially have its own virtual world? That would indicate a much lower potential for harm inherent to the OS architecture.
Not quite there with you on this one. Take a UNIX environment, for example. There are a lot of 'localhost' services which are required for a VM to be useful. It's obvious for a UI front-end, but even in the back-end you've got cron tasks et al. Whatever the OS or technology, these boilerplate services consume resources, often in a wasteful way.
Intel can't copy AMD's advances. They'd have to license some tech to do so, and that's not very likely (I believe HyperTransport, for instance, belongs to AMD).
Actually Hypertransport is an open standard and anyone can implement it. AMD doesn't have the clout to force proprietary standards on the market, so their only hope to have a standard adopted is to make it open and royalty free.
Which is why Intel will (probably) never implement it. They aren't interested in standards which they don't control. They already don't like the fact that AMD is cross-licensed for all x86 tech, which was part of the motivation for creating the entirely separate IA-64 ISA. When IA-64 failed and Intel was forced to implement x86-64, the only reason they used AMD's spec was because Microsoft said that they would only support one x86-64 ISA, and AMD got there first. Basically it took MS to out-monopoly Intel. So unless they are forced to use HT, they won't, and I can't see any way they could be forced. They may implement something similar -- they will have to in order to address multi-socket scalability -- but it will not be compatible.
AMD would love for Intel to copy their tech. Every time they do, it makes AMD look like the leader and Intel the follower. You could practically hear the screams of orgasmic joy from AMD when Intel announced EMT64.
The P6 micro-architecture of the Core 2 line is older than the PIII as it debuted in 1995 with the Pentium Pro. However, the P6 core was a solid one and was vastly reworked into the Core 2. The K10 is a derivative of the 1999 K7 Athlon, and like the Core 2, it's a very tweaked and reworked version of that.
The newest x86 architecture was killed off last year with the late 2000-era NetBurst being scrapped after it hit the thermal wall and the K8s smoked them. Newer isn't always better.
Another thing: yeah, they don't need to become relevant again; they still are. But the "at the same frequency" claim has me a bit confused. AMD has often had higher performance per clock, so that's not news. However, "same frequency" did not equal "same price point", and a similar-frequency product from AMD cost more than one from Intel. For example, a 2 GHz Athlon XP (3000+) cost more than a 2.0 GHz P4, while the similarly priced 2000+ ran at 1.66 GHz...
I'd like to see a price-point comparison...
Note that AMD's claim is to be faster "at the same clock". When the P4 was pushing clock speeds into oblivion, AMD stressed the point that clock speed is irrelevant -- what really counts is how fast the system runs your software. How you get there is quite beside the point. How odd that AMD is now using clock speed as a key indicator. Intel is already shipping 3GHz Clovertowns, and the article states that AMD has not released the Barcelona clock targets. If they ship substantially below 3GHz (2.4?), then a same-clock comparison means little in practice.
Note that AMD's claim is to be faster "at the same clock". When the P4 was pushing clock speeds into oblivion, AMD stressed the point that clock speed is irrelevant -- what really counts is how fast the system runs your software. How you get there is quite beside the point. How odd that AMD is now using clock speed as a key indicator.
Ummm... they're not. If they were using clock speed as a metric, they would be saying "Look! We're running at 3.5GHz and Intel is only running at 3GHz!" while completely ignoring the actual performance -- exactly what Intel did all those years. They are instead talking about performance-per-clock-cycle, which (according to this) means that a 2.66GHz AMD chip would be considerably faster than a 3GHz Intel chip. We can expect them to continue touting the overall performance rather than raw clock speed, since they look better from a performance standpoint and worse from a raw clock speed standpoint.
How is that different than what they've been saying all along?
That's exactly it. For many years people bought Intel chips because the box said 3 GHz while the AMD box said 2.2 GHz. What took people a long time to figure out is that Intel was just bumping the clock up so it sounded faster, while the AMDs were getting more work done per tick. I always equated it to engines: you can have a 4-cylinder engine that makes 200 horsepower at 7,500 rpm and an 8-cylinder engine that makes 200 horsepower at 4,800 rpm. Even though one may have nearly double the rp
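The engine analogy actually works out numerically: power is work per revolution times revolutions per minute, just as CPU throughput is work per cycle (IPC) times cycles per second (Hz). A quick sketch, where the torque figures are made up purely to hit 200 hp at each rpm:

```python
# Power (hp) = torque (lb-ft) * rpm / 5252.
# Torque values are illustrative, chosen so both engines make ~200 hp.
def horsepower(torque_lbft, rpm):
    return torque_lbft * rpm / 5252

print(round(horsepower(140, 7500)))  # 200 -- a high-revving four
print(round(horsepower(219, 4800)))  # 200 -- a low-revving eight
```

Same output, very different "clock speed" -- which is the commenter's point about IPC.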
How is that different than what they've been saying all along?
Because they've been saying all along that it is overall performance that matters, while here they are keeping the discussion only in the realm of IPC. Performance in Instructions/second = IPC * Hz. If you talk only about IPC or only about frequency, then you're leaving out half the equation that results in the real number you want.
So while they are at least talking about benchmark scores and actual performance (can't really talk about IPC without them), the problem is that they are leaving out the effect of frequency. In the absence of any other information, "40% more IPC" isn't really any more informative than "40% more frequency". We can guess what frequencies Barcelona might come out at, and thus make some use of this information, but it is definitely a difference from AMD's old statements.
Not really. Based on projected release dates, AMD is basically just releasing the results of simulations. They almost certainly don't have real silicon yet, and until they do, it'll be pretty hard for them to guess how fast they'll run.
Since the projected release date is in the next six months, I'll say it's quite certain that they have real silicon. It may not be the silicon they end up going to production with -- historically it's unlikely that rev A0 ends up being the production rev -- but they must hav
"40% more IPC" isn't really any more informative than "40% more frequency".
This depends. If they say "A has 40% more performance @ 2 GHz than B @ 2 GHz", comparing their relative performance is just a matter of math: A at 2 GHz will have the same performance as B at 2.8 GHz. Saying "A will have 40% higher Hz than B" and not mentioning the performance per Hz (à la the 90's) makes it impossible to make any performance estimates.
It also says a little about the performance gains one can expect over time. A given chip die will not increase its performance/Hz ratio during its lifespan
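The IPC-times-frequency arithmetic these comments describe can be sketched as follows; the 40% figure is AMD's claim, everything else here is illustrative:

```python
# performance (instructions/sec) = IPC * clock frequency
def performance(ipc, freq_ghz):
    # returns billions of instructions per second
    return ipc * freq_ghz

ipc_b = 1.0          # baseline chip B (illustrative)
ipc_a = 1.4 * ipc_b  # chip A claims 40% more IPC

# At the same 2 GHz clock, A is 40% faster than B...
print(performance(ipc_a, 2.0) / performance(ipc_b, 2.0))  # 1.4

# ...so B would need a 2.8 GHz clock to match A at 2 GHz.
print(performance(ipc_b, 2.8))  # 2.8
print(performance(ipc_a, 2.0))  # 2.8
```

Which is why quoting only IPC, or only frequency, leaves out half of the equation.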
I believe that AMD is saying how Barcelona will perform compared to the chips available today, not the chips Intel will have down the road. This is their way of saying that if Intel suddenly is able to release much faster chips between now and Barcelona's release, this claim by AMD won't reflect THAT.
So, current Core processor machines are what AMD is making claims about, not the 45nm chips that Intel is currently working on but has not released yet.
They probably mean that they'll be running 50% faster than the chips that Intel will have out at the same time (ie, the Penryn cores I believe). At least, that's my take on it.
More like an attempt by AMD fanboys to make up for the past two negative articles on AMD by declaring that they will "Outpace Intel by 50%" in a Slashdot headline. Just to make themselves feel better about their choice of processor team to root for. Me? I go for whoever's best, and right now that's Intel.
AMD's release will be first, so they'll be in the lead for a little while - in exactly the same way that Intel's Core 2 Duo release came before Barcelona.
In the mid-90's, Intel stuff was pretty obviously better. Then AMD was pretty obviously better for a while. Now it's really interesting - they're basically neck and neck, with each pulling ahead with new releases and then falling behind again.
Well, I highly doubt that AMD's integrated memory controller's ability to hide the latency can scale any faster than Intel's cache tech. After all they basically control latency in the same manner, smartly guessing what memory the chip is going to need and making sure they keep the cache filled with that data. Of course the on board memory controller means that the latency starts at a lower place in the first place. Though it could be that the extra latency of going across the northbridge gets worse as c
Unfortunately for AMD they could end up not being able to take full advantage of their superior (large scale) architecture if OS vendors don't provide better NUMA support.
If there is any problem with the NUMA code in Linux, AMD is welcome to fix those problems themselves, instead of waiting for someone else to do their job for them.
I'm hoping for something really interesting for next year's machine, a minimum of 8 cores in a maximum of two sockets is what I'm thinking
Yesterday I booted up something like that with Knoppix - it was a joy to see 8 penguins on boot. You can get two Intel-based 8-core machines in 1U now (two sockets, four cores each, power supply in the middle, and then the other board with its own 8 cores) - it will be good to see AMD's quad-core options, since their dual-core Opterons on 8 sockets are already impressive if on a much bigger
Bring on the competition (Score:3, Funny)
i certainly hope so (Score:4, Insightful)
The last thing we need is for Intel to have no real competitors. Innovation would slow and prices would hike up.
Re: (Score:2)
The last thing we need is for Intel to have no real competitors. Innovation would slow and prices would hike up.
Re: (Score:2)
There was just a little issue of whether they make the fastest. I would expect that few people would notice the difference in speed between a top AMD and a top Intel chip in actual use.
Re: (Score:2)
AFAICT Intel and AMD both make significant profits from selling the cream of their production at huge markups. It is much easier for AMD to do that if their cream is currently better than the competitor's cream (the same applies to Intel, but to a lesser extent, because Intel is the gorilla with lots of contracts and marketing).
SPECint_rate_2006 vs SPECint_rate2006? (Score:5, Funny)
Yes!!! (Score:2)
-Rick
Re: (Score:3, Insightful)
I would be prudent.
Using "up to" in benchmarks and comparisons... (Score:5, Insightful)
"Up to" is sugar-coated for "You can't expect any better than this" with a implicit translation of "It can get a whole lot worse".
Ex: If CPU X gets "up to" 100% more performance than CPU Y, but in all tests but one actually has 1% of Y's performance, I'd rather have CPU Y.
"Up to" means nothing to me, except as an advertisement for the competator; whichever has the least unpleasant average and worst case performance is the one I'm interested in.
Re:Using "up to" in benchmarks and comparisons... (Score:4, Interesting)
And those numbers wouldn't be indicative of anything either. The problem with CPU benchmarks is that they have no real-world application; everyone has different needs. However, the marketing types for both the suppliers and the consumers need numbers to push in front of each other, so they make up these things which those of us in the field understand have no real-world meaning.
It's a vicious circle, non?
Re: (Score:3, Insightful)
If your specific needs happen to be similar to the things benchmarks stress, then you can expect the results to be relevant. If your needs differ wildly from benchmark methods, then you can expect the results to be irrelevant - but most likely they will be equally irrelevant.
Benchmark performance is, at the very least, a better indication of relative performance than clock speed or cache size.
Fact is, few end users actually NEED
Re: (Score:2)
Re: (Score:2)
This is an indication that the code sucks. You should be able to play almost anything with a processor rated as 3000+ or better. I am sure that you can find some unrealistic 1080p/MPEG4 cont
Re: (Score:2)
Re: (Score:2)
All that said, this is the hardware equivalent of Fred Brooks's maxim that 'the paper tiger is always better than the actual one unless realit
"at the same frequency" is pointless (Score:5, Insightful)
Re: (Score:2)
C//
Re: (Score:2, Informative)
1) only the 3800 is an EE chip.
2) they're running on one of the most power hungry motherboard chipsets made for AMD: the nVidia 590 SLI
3) only the X2 5000+ is a 65 nm CPU
So basically, let's stack the deck as much as possible against AMD in this test without showing a best case scenario, while postulating that they're showing a "worst-case scenario" with a "bad E6300 sample".
I like Anandtech usually, but this article could almost have been sponsored by Intel and is far from
Re: (Score:2)
Speaking of virtualization, AMD seems to be doing the right thing there... nested page tables and so forth, along with the L2/L3 cache combo which they tested out as being better for virtual environments.
AMD has an advantage over Intel at any given process size, on the grounds that the IBM-AMD SOI process is basically better at any given process
Correction (Score:4, Funny)
Re: (Score:2)
Of course Intel claims that they are 1.499999999326112 and 0.999999994351582.
amd quad-cores are true quad-cores with a better.. (Score:2, Insightful)
Also, if AMD were to copy Intel and put two dies in the same CPU package, they would have a better link between them that would not eat up chipset-to-CPU bandwidth.
Re:amd quad-cores are true quad-cores with a bette (Score:3, Interesting)
First of all, you can't put two dies on the same CPU, or at least it would be a horribly bad idea. You can put two CPUs on a die. Now, I thought AMD already did this, but they could just package several chips together, and I'm feeling too lazy to look it up.
Anyway, yes for Intel chips they must communicate over the FSB. However, as I've recently been finding out they don't do that much communicating. For instance most cache state info is generated just by listening on the FSB. Th
Why casual users can't be bothered (Score:5, Interesting)
I read tech news daily and am getting sick of the media wars... It is no wonder casual users get fatigued trying to keep up. Casual opinions depend on which day (or week or month) a person chooses to research product offerings. It is no wonder I am always hitting a brick wall when trying to get my users to educate themselves so they can get more out of their tech. They don't know what to make of all the posturing.
This is not a function of the tech world developing *that* quickly. It is a result of the major players trying to out-strategize each other. I don't want to see any more benchmarks (or hear about any more promised software) until I am standing in front of a demo machine that is running the tech.
Guess I woke up on the wrong side of the bed.
Regards.
Re: (Score:2, Informative)
Seriously though, Intel's got the performance lead for now, but AMD's got the better tech and their release schedule "lags" Intel's a little. So Intel got the "jump" on AMD release cycle wise, and now you've got the situation where Intel has a brand spanking new product out that beats AMD's old offering by about 10-20%, at best at stock speeds.
I personally am waiting for AMD's release and benchmarks before making a final decision, but the fact that I'm doing so already says which way I'
Re: (Score:2)
-nB
Re: (Score:3, Insightful)
Intel seems to have taken a "no response" approach to media claims, instead producing product and letting guys like Tom's Hardware do their thing. This isn't to say they don't advertise, but they don't take out full-page NYT (or was it Washington Post?) ads, chest-pounding like AMD does.
-nB
Re: (Score:2)
Wait... what?
That's why we're constantly hearing about the performance advantages of Penryn. If anything, AMD has been a bit quieter than Intel. Compare the references on the Wikipedia pages for Barcelona and Penryn if you want evidence.
Re: (Score:2)
If the situation ever reverses and AMD's strategy to keep up s
Re:Why casual users can't be bothered (Score:4, Insightful)
Re: (Score:2)
On the other hand, buying video cards has gotten BAD. The numbers, ratings, even the price points have become seemingly almost random. The mark up is enormous at just about any retail chain. Higher price cards can have lower performance sometimes
Awesome! (Score:5, Funny)
Heh. (Score:5, Interesting)
2 of our attorneys just got quad-core Mac Pros with Studio displays. For writing documents on. Maybe the occasional slide show. I'm stuck on this 3-year-old Dell with dual CRT monitors. Old ones.
Sorry, just had to bitch a little. Your comment is more real-world than you may have realized.
Re:Heh. (Score:5, Funny)
Re: (Score:2)
Re: (Score:3, Insightful)
I hate selling. I'm damn good at it, but I hate it. To quote Zoidberg:
Selling's about kissing ass and pushing off whatever it is you're selling on whoever has a wallet, no matter what their needs. Ok, at the retail level maybe not so much. But any sales job that pays close to 6 figures, yep.
Yeah, I'll pass. And I'll be doing what I want when I leave work at 4:3
Re: (Score:2)
But of course you just wanted to troll. I wonder how much you have invested
Re: (Score:2)
Re: (Score:2)
OOOP (Score:2)
OOOP = Obligatory Open Office Plug
AMD needs to rebrand itself too (Score:5, Insightful)
Marketing a chip as using less power is like having Toyota run an advertising campaign aimed exclusively at wheelchair-bound people: the group you're targeting has few people in it, and they're going to research any product they buy. The server market is important, but when I buy my shiny new server, power consumption isn't my first consideration, nor is it the only thing AMD offers.
With this announcement, I'm hoping AMD starts a new slogan touting, say, speed. That's what I buy a processor for primarily. AMD has always been fast for the cost, and it's high time they market themselves as being faster and better rather than "as good as" Intel. My pick for a marketing slogan? "Upgrade to AMD." AMD should position its chips to be slightly more expensive at every pricing tier, but in doing so, blow Intel away in performance. (In the present economy, businesses have money and will gladly spend more on products they feel are superior. Ford spends more on marketing than BMW, but which would you rather own? AMD should be trying to make Intel look like Ford, rather than being the "Ford alternative".)
AMD is marketing to a minor concern of a niche audience, when they ought to be using their superior performance (at a given price point) to sell hardware. Would you rather be a "power saver" or "upgrade to AMD"?
Re:AMD needs to rebrand itself too (Score:4, Insightful)
That's nice - but when we look at purchasing $250k-$500k of servers, power consumption is an important factor.
Back in the days when dual-cores were just beginning, this indeed was HUGE. Do you want 30% more Irwindales which would require 100 tons of cooling, or the AMD dual-cores which require 30 tons of cooling? The same is going to happen at the dual-core/quad-core boundary.
As CPUs get cheaper and cheaper while A/C systems remain a constant cost, the people who spend large amounts of money are going to look more and more at power costs. They're probably aiming at business customers who don't buy *a* server, but buy a *hundred* servers.
Re: (Score:2)
Electricity and cooling costs are huge. And unlike a server, which you buy once and may use for 3-5 years, you pay for electricity and cooling all the time. The electricity/cooling costs over the lifetime of a product can often exceed the cost of the server itself, so anyone not treating the power consumption of their systems as a high-priority item (desktop, server, anything) is doing
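The back-of-the-envelope arithmetic behind that claim is easy to run; every number below (wattage delta, electricity price, cooling overhead, lifespan) is an illustrative assumption, not a measured figure:

```python
# Lifetime cost of an extra 150 W of draw, run 24/7 for 4 years.
# All inputs are made-up round numbers for illustration.
extra_watts = 150.0
years = 4
kwh = extra_watts * 24 * 365 * years / 1000.0   # extra energy consumed

price_per_kwh = 0.10    # assumed electricity price, $/kWh
cooling_factor = 2.0    # assume ~1 W of cooling per 1 W of IT load
cost = kwh * price_per_kwh * cooling_factor

print(round(kwh))   # 5256 kWh
print(round(cost))  # 1051 dollars -- in the ballpark of a cheap server
```

Scale that by a hundred servers and the power-efficiency pitch stops looking like a niche concern.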
Re:AMD needs to rebrand itself too (Score:4, Interesting)
Re: (Score:2)
Re: (Score:2)
Tee hee hee! Of course, that's why there's all those ads in computer trade rags and the wall street journal, because the people buying servers always do their own research and never believe marketing materials.
AMD should market to their entire market.
Based on what? They can't exactly claim top performance right now. They can't claim longest battery life for laptops
Re: (Score:2)
Would you rather be a "power saver" or "upgrade to AMD".
Be a power saver.
When Intel came out with Centrino, I bought one almost at launch. When AMD came out with Winchester, the Athlon 64 that made the gigantic leap in price/performance/watt, I bought two. When nVidia started making lower-clocked GPUs that didn't need a fan and wiped the floor with ATI in price/performance/watt, I bought three over 3 years (6200, 7600, and now, 8500) (that was the main reason - the other one being ATI's shitty drivers for Linux). Now, I'm looking at a new ultraportable, and AM
Re: (Score:2)
Also, forgoing power (and thus heat) efficiency isn't going to make them any frien
Re: (Score:2)
And I assume there's a strong correlation between power and noise. Most people's homes are quiet enough for a loud machine to really call attention to itself.
Fly me... (Score:5, Funny)
But marry me soon baby, I need the money
SSE4? Please, don't get distracted over little things like whether or not I can cook!
Re: (Score:3, Informative)
SSE4? I'm not buyin' either AMD or Intel until they're at least at SSE256. What's that? It'll take a while? That's OK, I don't have the monies to get them now anyway.
<sarcasm/>
For my type of workloads, straight SSE2 is still just fine. I'll take an improvement on that now instead of, say, waiting for the x86 world to match AltiVec instruction for instruction. But I would go for a wider ISA - give me 4x64-bit registers
What's really relevant ... (Score:4, Insightful)
What's really relevant to me is the performance per dollar ... not just the dollar cost of the CPU, but also the dollar cost of the whole system (including software, if that goes above zero), and the dollar cost of energy (including the cost of shoving waste energy out the back door in seasons it does me no good to keep it indoors).
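A whole-system performance-per-dollar comparison of the kind this comment asks for might look like the sketch below; the benchmark score, prices, and wattages are invented for illustration:

```python
# Performance per lifetime dollar: purchase price plus energy cost.
# All numbers are hypothetical.
def perf_per_dollar(score, system_cost, watts, years, price_per_kwh=0.10):
    energy_cost = watts * 24 * 365 * years / 1000.0 * price_per_kwh
    return score / (system_cost + energy_cost)

# A cheaper but hungrier box vs. a pricier but frugal one, over 4 years:
cheap_hungry = perf_per_dollar(100, 1500, 250, 4)
pricey_frugal = perf_per_dollar(100, 1800, 120, 4)
print(pricey_frugal > cheap_hungry)  # True: efficiency wins over the lifetime
```

With these particular made-up inputs, the sticker-price winner loses once energy is counted; with a shorter lifespan or cheaper power, it can go the other way.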
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Real life tasks (Score:2, Insightful)
Re: (Score:2, Funny)
Interesting definition of "real life tasks" you have there.
For the majority of the computing population, I would suggest that "real life tasks" would be more accurately defined as downloading and playing porn, rendering MySpace pages and running Norton Antivirus together with the 28 different systray applets installed by Dell during the manufacture of their shitwreck of a PC. Furthermore I would al
Wow, they really put the squeeze on the high end.. (Score:2)
The article mentions that the 8222 SE is priced at $2,149. So if I want a system with more than 4 cores, I'm bound to pay ~2.5x as much per processor.
I can get a workstation with eight 3 GHz Clovertown Xeon cores from Apple for just under $4,000; 8 Opteron cores at the same clock will cost me more than twice that for the processors alone, nev
CPU Speed? Who cares?! (Score:2, Informative)
Truthfully, I have not seen a significant benefit to higher CPU speeds since circa 300MHz days. Except for gaming, things seem to always work about the same speed. The rate at which I can type this message is limited not by CPU, but by my fingers; the speed with which I browse the web is limited not by CPU, but by my ability to skim for content; the speed with which I get real paying w
How does it compare with Penryn? (Score:2)
All pointless till we have a 3rd party compare Penryn to Barcelona. I imagine neither will have much impact till 2008 as both will be production limited this year.
Also, AMD needs to stop talking and start showing.
I currently don't need more speed. (Score:5, Insightful)
Now if AMD can produce a cheap and silent system with good graphics performance, I am all for it. Say something as fast as an X2 4400+ and an Nvidia 7600 GT, all for about $300, and you have a winner. You will sell millions.
A quad core system? I just don't need it yet.
Dedicate 50% of that performance to security (Score:2)
Please don't dump it into another golly geewhizbang video or multimedia processor subfunction on the chip.
At this rate, We'll see Penryns before Barcelona (Score:2)
I suppose, since AMD only wants to compare same-clock chips (probably because they won't get higher-clocked Barcelonas for a while), they may start arguing that we should only be allowed to compare Intel's old 65nm products (not the 45nm) to 65nm Barcelona, to
Apple Backwash (Score:2)
What socket? (Score:2, Insightful)
Re:Nice attempt, AMD. (Score:4, Insightful)
But you know what they say about lies, damn lies, and benchmarketing.
Or, if they don't say it, they ruddy well should.
Re:Nice attempt, AMD. (Score:5, Insightful)
However, AMD doesn't need to attempt to become relevant again. They are currently very relevant. Did Intel become irrelevant when they were behind AMD on performance? No. In the past, AMD did lose more by not having the performance crown, and one could certainly imagine the momentum they were gaining in the K7 days fading quickly if Intel had come out with a superior chip. But today, AMD has both the market share and the OEM support to be merely competitive performance-wise and still be relevant. So they lose out at the top speed grades. If they can continue to match up their products to Intel's at lower speed grades, and they will, then they will continue to be a good choice for many people, and will definitely still be relevant.
Re: (Score:2)
Re:Nice attempt, AMD. (Score:5, Informative)
In any case, what matters is what Barcelona will ship at, which has not yet been specified. If Barcelona lives up to AMD's stated expectations on performance, it will be a killer CPU. Your statement about Intel's potential improvement leaps is spot on, and falls in line with InfoWorld's Tom Yager's statements about Intel, which are essentially phrased as "Core 2 is Intel's last hurrah". Why? Because Intel is essentially running on 10-year-old technology and is rushing to catch up, despite some of the nifty architectural things they did recently to speed up C2D (integrated L2 cache, for example).
I also believe that Intel is now following AMD's lead by leaving extra headroom for those who prefer to OC their CPUs, and concentrating more on TDP and stability. I've noticed that Intel's chips since the P4 are certainly more stable, while my rather severely OC'd AMD CPU occasionally (twice this year) shuts down, most likely due to heat or a RAM instability (since the shutdowns happen during low-usage periods at night, I'll bet the 20% OC'd RAM is probably to blame).
Basically, right now Intel owns the crown, but they own it while comparing to AMD's last-gen CPUs, which are 3+ years old.
Re: (Score:2)
Re: (Score:2, Interesting)
It's going to be very very interesting.
Re: (Score:2)
Re:Nice attempt, AMD. (Score:5, Insightful)
Frankly, this sounds more like fanboi talk than serious analysis. If your goal is to diss Intel and give AMD props, then saying they are using 10-year-old technology makes sense. If you are actually trying to argue that AMD's future is much brighter than Intel's, it's totally nonsensical. If Intel can gain huge performance benefits just by copying stuff AMD is doing now, while AMD has to make huge advances just to stay competitive, I know who I would put my money on.
Re: (Score:2)
Frankly, this sounds more like fanboi talk than serious analysis. If your goal is to diss Intel and give AMD props, then saying they are using 10-year-old technology makes sense. If you are actually trying to argue that AMD's future is much brighter than Intel's, it's totally nonsensical. If Intel can gain huge performance benefits just by copying stuff AMD is doing now, while AMD has to make huge advances just to stay competitive, I know who I would put my money on.
Err, you're assuming Intel in fact has a
Re: (Score:2)
Exactly. Netburst was their new technology that was supposed to be "the future" and we all know how well that worked out. It really is quite impressive that they've done as well as they have at evolving the P3 architecture. But as an AMD fan, this scares me because I don't know what they're going to do
Re: (Score:2)
I think it is AMD that has been increasing the performance by making
Re: (Score:2, Insightful)
Re: (Score:2)
Not quite there with you on this one. Take a UNIX environment, for example. There are a lot of 'localhost' services which are required for a VM to be useful. It's obvious for a UI front-end, but even in the back-end you've got cron tasks et al. Whatever the OS or technology, these boilerplate services consume resources, often in a wasteful way (d
Re:Nice attempt, AMD. (Score:5, Interesting)
Actually Hypertransport is an open standard and anyone can implement it. AMD doesn't have the clout to force proprietary standards on the market, so their only hope to have a standard adopted is to make it open and royalty free.
Which is why Intel will (probably) never implement it. They aren't interested in standards they don't control. They already don't like the fact that AMD is cross-licensed for all x86 tech, which was part of the motivation for creating the entirely separate IA-64 ISA. When IA-64 failed and Intel was forced to implement x86-64, the only reason they used AMD's spec was that Microsoft said they would only support one x86-64 ISA, and AMD got there first. Basically it took MS to out-monopoly Intel. So unless they are forced to use HT, they won't, and I can't see any way they could be forced. They may implement something similar -- they will have to in order to address multi-socket scalability -- but it will not be compatible.
AMD would love for Intel to copy their tech. Every time they do, it makes AMD look like the leader and Intel the follower. You could practically hear the screams of orgasmic joy from AMD when Intel announced EMT64.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2, Insightful)
Re:Nice attempt, AMD. (Score:5, Insightful)
Exactly (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
"40% more IPC" isn't really any more informative than "40% more frequency".
This depends.
If they say "A has 40% more performance @ 2GHz than B @ 2Ghz", comparing their relative performance is just a matter of math.
A at 2GHz will have the same performance as B at 2.8GHz.
Saying "A will have 40% higher Hz than B" and not mentioning the performance per Hz (á la the 90's) makes it impossible to make any performance estimates.
It also says a little about the performance-gains one can expect over time.
A certain chip-die will not increase it's performance/Hz ratio during it's lifespan
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Who is "we" that already "know"s that? What 4-core AMD processor beats its Intel competition?
Re: (Score:3, Interesting)
Re: (Score:2)
Combine the two. (Score:2)
Two enter, One Leaves!
Re: (Score:2)