AMD Launches First 45nm Shanghai CPUs 264
arcticstoat writes "The wait for AMD's next-gen CPUs is finally over, as the company has now officially launched its first 45nm 'Shanghai' Opteron chips for servers and workstations. 'AMD's move to a 45nm process relies on immersion lithography, where a refractive fluid fills the gap between the lens and the wafer, which AMD says will result in 'dramatic performance and performance-per-watt gains.' It's also enabled AMD to increase the maximum clock speed of the Opterons from 2.3GHz with the Barcelona core to 2.7GHz with the Shanghai core. Shanghai chips also feature more cache than their predecessors, with 6MB of Level 3 cache bumping the total up to 8MB, and the chips share the same cache architecture as Barcelona CPUs, with a shared pool of Level 3 cache and an individual allocation of Level 2 cache for each core.'"
Which to buy now? (Score:3, Interesting)
Does this mean that AMD chips are now competitive on price-performance with Intel's? I mean for a fairly high-end desktop or server; obviously different considerations apply in the embedded or netbook market.
Re:Which to buy now? (Score:4, Informative)
This news is about a server/workstation chip, and I don't do any purchasing of those. As far as desktop chips are concerned, AMD was ALWAYS competitive on a price-performance basis. The key word there being price.
Re:Which to buy now? (Score:5, Informative)
No, they weren't. For the past year Intel has boxed AMD in with chips at the same performance and lower price, or the same price and higher performance, or both.
And Intel has had performance segments (QX*) stretching well above AMD's, and pricing segments (Atom) well below AMD's.
AMD's short-lived price/performance superiority in the desktop sweet-spot in 2004 and 2005 has left many people thinking they're still in that position. That hasn't been true since Core 2 came out. HyperTransport gave them a slight edge in very-high-end servers for certain applications, but Intel stayed near them with reliably higher clock speeds, and is coming out with QuickPath in four days, wiping out those few use cases where AMD can make easy sales today.
What I'm saying is, right now you are likely to choose Intel in almost all situations, if you are objective.
Re:Which to buy now? (Score:5, Informative)
Cycles aren't everything in all cases... AMD still has more system bandwidth, which speeds up everything when you're talking about IO-bound applications, and bus bandwidth affects every aspect of the computer.
The applications where AMD is superior are IO bound applications like database servers, and music production.
Intel is better for video because you are dealing with a limited number of streams and it's computationally expensive, so is CPU bound.
With audio you can have hundreds of streams (often 4-6 per fader on the mixer), and at 24/96 they will quickly overwhelm any Intel-based system. Since a lot of us use DSP cards (think of them as GPUs for sound), the data path capacity, especially to the DSP processors (PCI/PCI-e), is very important, and Intel simply can't touch AMD in this respect.
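For a rough sense of the numbers, the raw data rate of a big 24/96 session can be sketched like this (the stream counts and the bus comparison are illustrative, not figures from the comment):

```python
# Back-of-envelope data rate for a large multitrack session at the
# 24-bit/96kHz figures mentioned above. Stream counts are hypothetical.

def session_rate_mb_s(streams, bits=24, rate_hz=96_000):
    """Raw audio bandwidth in MB/s for `streams` mono streams."""
    return streams * (bits / 8) * rate_hz / 1e6

print(session_rate_mb_s(1))    # 0.288 MB/s for a single stream
print(session_rate_mb_s(300))  # 86.4 MB/s for 300 streams
```

At a few hundred streams the raw audio alone is pushing toward the ~133 MB/s theoretical ceiling of classic 32-bit/33MHz PCI, which is where the bus to the DSP cards starts to matter.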
AMD architecture simply has untouchable plumbing. If you will notice, Apple is looking for a new chip vendor. This probably has a lot to do with it since most audio professionals use Apple gear.
If Jobs and Co. were smart, they'd offer both intel and amd architectures, depending on the job being done. Intel is fantastic for video and a lot of pro video peeps use Apple gear too. Those are two market segments that couldn't be more different in their requirements. To be the best of the best for multimedia, Apple needs to either build a new architecture or offer both AMD and Intel.
-Viz
Re: (Score:2)
Re:Which to buy now? (Score:4, Insightful)
That's been true in some price ranges, but Intel hasn't trumped AMD across the board any time recently. There's always been a couple price ranges - and usually the relevant ones like $120 to $150 - where AMD has a better product.
Geode?
I'm not trying to say that Intel hasn't been "the winner" for the past year or so, but it certainly hasn't been as one-sided as you're claiming. AMD has been selling chips, based on being the best choice for individual consumers, the whole time.
Re: (Score:3, Insightful)
First, Intel didn't have better price/performance until recently. Certainly well into the Core 2 Duo period the AMD Athlon 64, X2, etc. were much better price/performance.
Second, on the low end:
AMD Sempron 1150 2ghz: $22
Intel Celeron 430 1.8ghz: $39
That Sempron is much faster than that Celeron. Atom is cheap for the processor, but the other parts cost more and use a lot of power (50% of total power, 15 watts for the chipset on Intel's Atom mITX board). Why do you think netbooks only get ~3hrs with 4 watt processo
Re:Which to buy now? (Score:4, Insightful)
Er, huh?
AMD dominated the price/performance war with Intel from the time they released their K6 chips - that'd be 1997 (hello, remember the "sub-$1000 PC"? that's thanks to AMD). This was the case until just recently when things started to go multi-core - and even then, AMD had a bit of resurgence while playing leapfrog with Intel.
From about 1999 to 2003 AMD was way, way ahead of Intel; Intel didn't pull ahead of AMD in terms of simple performance (without spending close to a grand for a processor) until the release of their Core based processors. Their performance started to improve quite a bit with the M based processors, but your common desktop price/performance was still dominated by AMD.
Arguably, AMD's memory management is still better. We'll see how this generation hashes out.
Re:Which to buy now? (Score:5, Informative)
According to Anandtech's review, it's highly competitive for database servers. http://it.anandtech.com/IT/showdoc.aspx?i=3456 [anandtech.com]
This bodes well for the company (Score:5, Interesting)
Just an off-the-cuff calculation on my part shows power consumption dropped over 50% over Barcelona, clock-for-clock.
This is good news, because when AMD moved from 90nm to 65nm, their leakage was so bad that the power consumption only dropped around 10% clock-for-clock. Combine this with a better cache architecture (larger, and faster), and AMD may have a winner in the server space.
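The off-the-cuff math being described is just watts divided by clock. A sketch, with illustrative TDP and clock numbers (placeholders, not published figures):

```python
# "Clock-for-clock" power comparison: watts per GHz for each core.
# The TDP figures below are illustrative placeholders only.

def watts_per_ghz(tdp_w, clock_ghz):
    return tdp_w / clock_ghz

barcelona = watts_per_ghz(120, 2.3)  # hypothetical 2.3GHz Barcelona TDP
shanghai = watts_per_ghz(75, 2.7)    # hypothetical 2.7GHz Shanghai TDP
drop = 1 - shanghai / barcelona
print(f"clock-for-clock power drop: {drop:.0%}")  # ~47% with these numbers
```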
I'm not sure if they're going to take back the desktop anytime soon. Intel doesn't have the FBDIMM downside on desktop systems, and I'm fairly sure that Shanghai didn't add major microarchitecture changes, so a quad-core Core2, let alone an i7, should continue to dominate the desktop.
However, it is nice to know that the market once again will have a choice in processors. AMD's 65nm offerings were spanked in terms of performance and power consumption by Intel's lineup, but Shanghai will at least compete on the power front, if not the performance front. We shall see what happens when AMD releases their desktop version.
Re:This bodes well for the company (Score:4, Insightful)
Power consumption is actually one of the areas where Intel has been soundly beaten, year after year.
Even 65nm processors from AMD use less power than Intel's 45nm procs, and Intel doesn't have an on-chip memory controller.
Add in the extra power consumption of an Intel northbridge, and Intel's offerings are usually about double the power consumption of a similarly clocked AMD system.
AMD's real problems are achieving high clock speeds and sorting out their fabrication process. If AMD's 45nm process is as improved as they say it is, then with their fabrication/design company split they should be able to get that side of the business under control.
Oh please. (Score:5, Insightful)
The two companies take turns one-upping each other for the bleeding edge, but every time (10 years running) I've specced out a mid-range (home gamer, single CPU motherboard) to low-end (grandma's email/photo machine) machine, AMD's been the way to go. It's a lot like trying to decide which company's video boards to pick if you're trying to make a game machine without breaking the bank.
Some people are Intel partisans, some people AMD partisans. Benching them and looking at spec, I've consistently found that AMD's got faster chips (for the same $) up to the "sweet spot" in the curve where price starts shooting upwards during the times I've been buying, but I also know there were times I was not in the market when Intel had done a price cut and AMD hadn't caught up.
I'm not going to call someone an idiot for their CPU choice, as it's a long-term purchase decision that has to be balanced with other factors (motherboard choice, RAM, video board, power concerns, cooling solution, etc) anyways. In fact, I recommend consumers try to stay OFF the "bleeding edge" because they're basically throwing money away on it; even if you buy the latest, hottest chip right from the factory it's obsolete by the time you get it home. Your best bet is looking at the curve, because there's always a spot (usually between $150 and $250) where the price starts to jump up exponentially for only an incrementally "faster" product. Buy at the spot beyond which the relationship between price and performance fails to be linear and you'll turn out pretty happy.
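The "buy below the knee" heuristic above can be sketched as a tiny script; the price list and the cutoff rule are made up for illustration:

```python
# Find the "sweet spot" in a price/performance curve: the last chip
# before the marginal dollar stops buying much marginal performance.
# Prices and benchmark scores below are hypothetical.

lineup = [  # (price_usd, benchmark_score)
    (80, 100), (120, 140), (160, 175), (220, 195), (400, 215), (900, 235),
]

def knee(chips):
    """Walk the lineup; stop where marginal score-per-dollar collapses."""
    best = chips[0]
    for (p0, s0), (p1, s1) in zip(chips, chips[1:]):
        marginal = (s1 - s0) / (p1 - p0)  # extra score per extra dollar
        if marginal < 0.5 * (s0 / p0):    # arbitrary cutoff: half the base ratio
            return best
        best = (p1, s1)
    return best

print(knee(lineup))  # (160, 175) with these made-up numbers
```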
Re:Oh please. (Score:5, Insightful)
Since the C2D arrived, I've been going with Intel. I usually don't overclock, but the C2D handles it so well with such little effort that I based my purchase of a $200 ~2.2 GHz chip on that alone. With the addition of a $30 heatsink I had it at 3.4 GHz with temperatures under 60 C at load (below the temperature seen at stock speed with the stock cooler, implying good longevity), back when there were no 3.4 GHz Duos and the closest thing cost about $1000. I have several friends who had never OCed before who did the same thing, all ending up with 2.8-3.6 GHz chips that all are still working perfectly and speedily ~1.5 years later.
Re: (Score:3, Interesting)
Starting with some of GP's requirements (game-capable PC but at a reasonable price) and wanting to use ECC RAM for reliability I ended up buying an AMD last year. It is an AMD Athlon64 X2 EE 4600, a dual core with 2x2.4 GHz, not overclocked. In practice, this machine is fast enough, especially considering that I don't run the very latest games.
The deciding factor in terms of Intel vs. AMD was that ECC capable mainboards for Intel are expensive. The cheapest C2D would have been not much more expensive than t
Re: (Score:2)
Err.... does ECC ram actually help with reliability, or does it merely ensure that errors get detected ?
Perhaps I'm a bit of a purist, but if ECC Ram is actually self-correcting, I would worry about how/why it got corrupted in the first place. I find it much cheaper and easier to buy good quality non-ECC Ram instead.
Re:Oh please. (Score:4, Informative)
What happens is that during normal operation of any RAM there is a small chance that a particular bit will get flipped. Cosmic rays are often blamed as the culprit; of course if you overclock and overvolt your memory you increase the chance of errors, but even good quality RAM running within spec will get an incorrect bit every so often. If you use non-ECC memory there is no way to spot this error; it just returns the wrong data. The old parity memory added one extra check bit for every eight bits, so it could detect (but not correct) any one-bit error. ECC stands for error correcting code (look it up on Wikipedia), meaning that if one individual bit is corrupted it can recover the correct data. If you are really unlucky and two bits in the same code word get corrupted at the same time then you still have problems, but that is unlikely.
If you are using non-ECC RAM then data may be getting corrupted from time to time, but you don't notice.
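The single-error-correcting idea can be demonstrated with a toy Hamming(7,4) code, the classic building block behind ECC schemes (real DIMMs use a wider SECDED variant over 64-bit words; this is just a minimal sketch):

```python
# Toy Hamming(7,4): 4 data bits + 3 parity bits, so any single flipped
# bit in the 7-bit codeword can be located and corrected.

def encode(d):
    """d: 4 data bits -> 7-bit codeword [p1, p2, d1, p3, d2, d3, d4]."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4  # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4  # covers positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4  # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def correct(c):
    """c: 7-bit codeword, possibly with one bad bit -> the 4 data bits."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3  # 1-based position of the bad bit; 0 if clean
    if syndrome:
        c[syndrome - 1] ^= 1  # flip the bad bit back
    return [c[2], c[4], c[5], c[6]]

data = [1, 0, 1, 1]
word = encode(data)
word[4] ^= 1  # simulate a cosmic-ray bit flip
assert correct(word) == data  # the single-bit error is silently repaired
```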
Re: (Score:2)
Re: ECC RAM (Score:3, Informative)
According to Wikipedia (http://en.wikipedia.org/wiki/Hamming_code#Hamming_codes_with_additional_parity_.28SECDED.29 [wikipedia.org]), a scheme that can correct 1 bit error and detect 2 is typically used. So it can correct single errors. The most common cause is some form of radiation (see http://en.wikipedia.org/wiki/Soft_error#Causes_of_soft_errors [wikipedia.org]).
Against at least one of those (cosmic rays), even quality RAM is not immune. This said, only the vendors of quality RAM seem to be in the business of making the ECC version a
Re: (Score:2)
Perhaps I'm a bit of a purist, but if ECC Ram is actually self-correcting, I would worry about how/why it got corrupted in the first place. I find it much cheaper and easier to buy good quality non-ECC Ram instead.
ECC isn't really meant just to counter hardware memory faults. It's also meant to catch single bit flip events.
Cosmic rays and background radiation can cause single bit flip events. With increasing data densities, the likelihood of a high-energy particle colliding with a memory cell incr
Re: (Score:3, Informative)
I have to agree with Lonewolf666. I have been agonizing over going with AMD over Intel but the ECC issue is a deal breaker. It is only supported on Intel's more expensive and older motherboards. DDR3 in combination with ECC is not supported at all ruling out anything recent and an FB-DIMM solution would be more expensive yet.
All of AMD's recent processors have really nice support (note 1) for non-registered ECC DDR2, with the caveat that not all systems have support in BIOS for it. Gigabyte motherboards
Re: (Score:2)
I bought an E4500, expecting I could overclock it a bit. But I put the machine together, and honestly I don't know why I'd even bother. It's plenty fast enough for anything I do. At some point you reach diminishing returns, and honestly I don't care if a task finishes in 3 seconds instead of 2. Do you really see a difference between 2ghz and 3ghz for anything other than encoding video?
Re:Oh please. (Score:4, Funny)
Do you really see a difference between 2ghz and 3ghz for anything other than encoding video?
Yes, it gives you more headroom for when your PC is bogged down with spyware and viruses!
Re: (Score:2)
I have had a very similar experience - always being an AMD fan prior to this build (AthlonXP Barton was the last one) and now have a C2D E6750 which is getting a bit long in the tooth.
In my experience, I got fast DDR2 1066 RAM, lowered the FSB:CPU ratio, pushed the core voltage up to 1.4V and put the clock speed up to give about a 3.4 GHz clock speed, from the stock 2.6 GHz.
Everything still runs happily like this, even on the stock cooler.
I even can feel the difference in Far Cry 2 with all the physics set
Re:Oh please. (Score:5, Insightful)
I tend to agree. The honest truth is that an AMD 780G motherboard and one of the low power X2s makes a great system for most users. If you want to play games throw on a 3870, or if you really need it a 4850.
I just built a system for my wife with an ASUS 780G motherboard, X2 and 4 Gigs of ram. Total cost was under $200 and it runs very well.
If you're not into high end gaming then AMD seems like a great choice.
I can hardly wait for 45nm AMD desktop CPUs to start showing up. I really want one.
Re: (Score:2)
My Athlon 4200 X2 plays World in Conflict and FSX just fine since I added a 3870. I would say that an X2 5000 will do just fine for a mid level gaming rig with a good video card.
Re: (Score:3, Interesting)
You might be surprised at the performance leap from a new CPU. I went from an X2 4800 with an 8800GTS, to an overclocked C2Q. I noticed a tremendous improvement in almost all games, despite running on the same GPU. For one, it eliminated any and all stuttering, even in older games. I'd say it pumped a good 30% more fps into my main games like WoW, LoTR, and the shooters of course.
Today's graphics cards are so ridiculously fast, they're very commonly limited by the CPU. It's like the 3Dfx days all over
Re: (Score:3, Insightful)
They're only CPU bound at low-res.
At any decent res, you'll be GPU bound, even with the latest and greatest graphics card.
Also, *which* 8800GTS are you on? It's a horribly overloaded model number where there are even different cores used. You can tell which you have based on the ram, a 320 or 640meg flavor is the older card, a 512meg flavor is newer.
I'm thinking that you did something other than swap out your processor. WoW shouldn't be having any issues even on a single-core machine with a graphics card
Re: (Score:3, Insightful)
I don't think that's entirely accurate. You can't just switch out the processor. You have to switch out the entire motherboard too, so you have a different memory controller, a different northbridge, a different sound chip, a different sata controller, etc... What makes you think that the processor was the magical factor out of all those that eliminated the problem?
Additionally, you probably also did a complete reinstall of your operating system when you upgraded the parts, which de-bloats Windows and yo
Re: (Score:2, Interesting)
I have to agree. When the Quad cores shipped, I tested them and I compared the speed per dollar. AMD was half price for perf
Future proofing? (Score:2)
Perhaps the rationale behind buying nearer the bleeding edge than your sweet spot is not having to replace as often?
Re:Future proofing? (Score:5, Funny)
Re:Future proofing? (Score:5, Funny)
If you're buying "bleeding edge" you're not going to be satisfied with your purchase 2-3 years from now (about my replacement cycle for my personal box, and even then a lot of my components like hard drive / sound card / DVD drive tend to last through 3-4 iterations), unless your tastes suddenly radically change and you're no longer interested in the "bleeding edge" games you were trying to run.
Plus, consider the following two options:
#1 - "Bleeding edge" rig. Blow $900 on processor, $1200 on dual video boards, $400 on RAM, $800 or so on miscellaneous other components. Total system cost around $3000.
#2 - "Decent Gaming" rig, single $300 video board, $200 processor, etc. Total cost: $900 if you really push your luck.
I'll take my $2000, buy more games, take girlfriend to dinner, stick some in a rainy-day fund, etc. One of these years you need to run the numbers and then you'll figure out that the "savings" you claim are there from buying at bleeding edge aren't really there at all. Even if I spend $900 every 2 years upgrading my PC, it takes me 5-6 years to equal the cost of your rig, and I guarantee you're going to turn around and want to rebuild to get back to the bleeding edge because you'll be "disappointed" that your 2-3 year old "bleeding edge" machine is only getting 15 fps in the timedemo mode of CallOfUnrealCrysisDoomQuakeTournament 3: Yet Another Non-Scaling Tech Demo Masquerading As A Game.
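Running the numbers from the comment is simple enough to sketch (the build prices and the 2-year cycle are the comment's own figures):

```python
# Cumulative spend: one $3000 bleeding-edge rig vs a $900 sweet-spot
# rig replaced every 2 years, over the same horizon.

def total_cost(per_build, cycle_years, horizon_years):
    """Total spent buying a fresh build every `cycle_years`."""
    builds = -(-horizon_years // cycle_years)  # ceiling division
    return per_build * builds

print(total_cost(3000, 6, 6))  # 3000: the single bleeding-edge rig
print(total_cost(900, 2, 6))   # 2700: three sweet-spot rigs, still cheaper
```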
Re: (Score:2)
Speak for yourself. I tend to stay with stuff until it breaks.
Besides, I never said I am buying bleeding edge. I merely suggested buying nearer it than the original poster's sweet spot.
I run no games.
Re: (Score:2)
Agreed - even if you buy "sweet spot" rigs twice as often as the bleeding edge guy, on average over the time involved you will have a better gaming experience.
"Bleeding edge" guy is essentially paying for the R&D which makes our 2nd generation cards, motherboards and processors cheaper and less power hungry.
Re: (Score:2)
It depends on where the sweet spot is and what else is going on.
I have found that if the sweet spot is capable of providing solutions at the time of the build, then it is still capable 5 years down the road. Vista being the exception of course. Even if you're gaming, you will find that it is still fast enough although you want something faster. I just retired some Athlon XP 2100 machines last month which had nothing wrong with them, they were just 5 or 6 years old and when parts started failing, we decided to
Re: (Score:2)
And what's so bad about replacing often? If, at the end of ten years, you end up having had more total cycles available AND having spent less money, AND have the additional reliability factor of not holding on to equipment for very long after it's no longer under warranty, what's so bad about that?
Convenience (Score:2)
Convenience. I tend to use the highest quality equipment I can afford [apple.com], and to run free software [debian.org], so I don't need to reinstall or worry much about machines and OS. Having to reinstall because the machine is obsolete is something I want to postpone as much as possible.
Re: (Score:3, Insightful)
You subscribe to the bathtub curve of reliability. For many components, this isn't an accurate model.
For many components, after you get past infant mortality, the devices remain consistently reliable. I've seen 386s and 486s that are still running, day in day out, today. PDP11s simply don't die, and there are some that are just sitting in a corner quietly doing mission critical tasks in industry.
All you have to do is identify common failure modes and do maintenance to mitigate them. For example, the dominan
Re: (Score:3, Informative)
I think you pretty much nailed it. A few years ago I bought an AMD 3000+ for 595 bucks because I wanted bleeding edge. A few days later a friend bought an AMD 2800+ for 250 bucks. Less than half what I paid.
My extra almost 400 bucks got me nothing that mattered. Sure I was faster than he was, but he could play all the same games I could just as well. Surf the web, read email, view porn. Just as good as I could.
The only place you could tell the difference was when encoding video or audio. I was just
Re: (Score:2, Informative)
Frequently?
Do you have any actual backup for that? The last three desktops I've owned have all had AMD processors, and the only thing that's gone bad on any of them was the AGP slot went bad on one of the mobos after about 5 years.
Re: (Score:3, Informative)
No, he doesn't have any data to back that up. Well, not unless he is going back to the K6-2 and K6-3 days. I have used and supported many AMD systems and only had 3 of them with chipset problems. Actually, those problems weren't even the chipsets; they were with cheap board manufacturers who used counterfeit parts and got a load of bad resistors. The ECS K7S5A had a short run on that.
Of course he might be referring to the older VIA chips that allowed the user to select optional components during the driv
Re: (Score:3, Interesting)
I have had a similar experience with machines I have had and built/administered:
1. K6-2/500 on a VIA MVP4 chipset: no problems at all
2. Celeron 900 on an i810/ICH1 chipset: no problems at all
3. P4-M 2.2 on an i845MP/ICH3M chipset: integrated Intel PRO/100 NIC died
4. Duron 1600 on an NForce 2 Ultra 400 chipset: no problems at all
5. X2 4200+ on an NForce 4 SLi chip: no problems at all
6. Dual 2.8 Xeon Irwindale on an E7320/6300 chipset: integrated Intel IDE controller was recognized intermittently
7. Pentium D
Re: (Score:3, Insightful)
I doubt he does. Of the half a dozen or so systems that I have had in the last few years, all of them but one have been AMD. Every one of the AMD systems worked fine until I retired it, except the one I killed: I was flashing the BIOS and accidentally put my big foot on the power strip. Yeah, it was stone dead.
All the AMD motherboards but that one still work fine. I have one I pulled out of storage after 2 years and powered it up. It's running perfectly. Hell, even the clock was still within acceptable time for b
Re:Oh please. (Score:5, Funny)
Hah! that's priceless about the power strip.
Everyone knows you're supposed to sit perfectly still, holding your breath, squeezing your sphincter while any BIOS update goes through. Anything less than that shows disrespect resulting in consequences like yours.
Re: (Score:2)
Re: (Score:2)
Along with daedae, I'd like to see any sort of proof on that also. Nearly every machine I have had over the last 6+ years has been an AMD of one form or another. In fact, the only two non-AMD machines I have are an MSI Wind (I wanted to tinker with an Atom based machine in Linux, OSX, and Windows, to see how it performed) and a C2D 2.4ghz machine I slapped together (well before I acquired my Wind PC) in order to play with OSX.
I can't recall one occasion in which I've had bug-ridden and/or chipset incompa
Re: (Score:2, Interesting)
Er... A fairly high-end desktop isn't anything close to comparable to a high-end server. AMDs have been superior for building high-end servers for some time now, thanks to bandwidth considerations.
Re: (Score:2)
Does this mean that AMD chips are now competitive on price-performance with Intel's? I mean for a fairly high-end desktop or server; obviously different considerations apply in the embedded or netbook market.
What apps are you running? Do your apps take advantage of multiple cores (regardless of the speed), or come anywhere close to pushing 17.6GB/sec of memory bandwidth? Then go for the new AMD or Intel CPU, as they are both stupid fast. If not, then you might be better served with a previous gen CPU at a much lower price.
I mean, really, we are talking about very small differences in speed at this point, right? Would the average person actually be able to detect the difference? I am all for ever faster processors, as it allo
Re:Which to buy now? (Score:4, Funny)
It is always best to do your research when buying a new chip so you don't get shanghaied.
Re:Which to buy now? (Score:5, Insightful)
AMD is still doing OK on price to performance, but what I think is hurting them is that the margins are not the same, because CPU's as a whole are just so cheap now. I remember back when an ENTRY LEVEL off-brand chip like a Cyrix (or, AMD) cost $150. "Intel Inside" cost $350 or more starting out. We'll call that a 50% ratio. The AMD (and certainly not the Cyrix) chips were not quite as fast as their Intel competition, but to a high school student who was making $50 per week part time, I certainly didn't mind that small gap in performance.
Now today, the ratio has changed. AMD still beats Intel in price to performance, but not by the same ratios, and the margins are much different. If a $40 AMD chip is slightly slower than a $65 Intel chip, then that's great, but the difference is only $25. A lot of people are going to be pretty quick to drop the couple of extra $$ for the Intel chip. Particularly now that I've noticed that, quite often, when you go over to the motherboards, Intel compatible motherboards are often coming in just a bit cheaper than AMD motherboards.
Now personally, when I can, I still buy AMD at least 50% of the time, but the only reason I do that is because I remember the days when Intel's competition wasn't as tough, and I remember those days of $350 chips from them. I only support their competitors to ensure that that situation doesn't repeat itself. For people without such a goal though, Intel is certainly tempting.
Re: (Score:2)
Re: (Score:2)
A lot of those are designed for heavily parallel operations, and as such other concerns come into play than pure chip speed. One VERY major issue is heat dissipation. Another, even for super computers, is cost. If it takes 400 AMD processors to do what 350 Intel's could do, it might still be worth it if the price difference is such that the AMD's would still come out cheaper.
Particularly when comparing on a desktop level, I don't think you'll find ANYONE who will claim that AMD is making faster chips tha
Re: (Score:2)
"AMD is still doing OK on price to performance, but what I think is hurting them is that the margins are not the same, because CPU's as a whole are just so cheap now"
$1500 for an Intel Quad-Core 'Extreme' CPU is 'just so cheap'? You must earn a lot more than I do.
The only reason AMD CPUs look cheap is because AMD don't have anything at all to compete with Intel's desktop CPUs at the high end; they'd love to be able to charge $1500 for their fastest desktop CPUs, but no-one would pay it.
Re: (Score:2)
$1500 for an Intel Quad-Core 'Extreme' CPU is 'just so cheap'? You must earn a lot more than I do.
High end has always, and will always, be expensive. That's not the area of the market where most of the chips are sold anyways.
The low end market sees far higher volume, and in that arena, AMD's introductory processors start at around $25 (with a heatsink & fan - I remember paying more than that for JUST a heatsink - no CPU or fan included), and Intel starts at around $40. That's for a 2.0Ghz and a 1.8Ghz chip, respectively, and that's quite capable of doing most of what your average user would want t
Re: (Score:2)
You seem to be thinking of Intel as the "good brand" and AMD as the "off brand". That made sense in 1997, but now it's like Toyota vs Honda - both companies make top quality products.
Unfortunate name (Score:5, Funny)
Re: (Score:2)
Intel Nehalem [wikipedia.org], the "perfect CPU for your native american tribe."?
Don't read too much sense into it...
Re:Unfortunate name (Score:5, Funny)
I still find it funny that Apple computers now use intel Core 2 Duo processors.
Re: (Score:2, Insightful)
Shh. Remember apple runs "fast" and is "glorious for multimedia!" somehow we skipped Linux and AMD, but hey, want to pay 2x as much for half the performance?
Re: (Score:2)
Re: (Score:2)
Right, because compiz/xgl could never compare to an apple fanboy. Is that what you're saying? Whoops.
Making Me Feel Old (Score:5, Interesting)
The first computer that I was a primary operator on, a S/360-135 plug-compatible 2Pi, had 768KB when it was delivered and was eventually bumped to 1.25MB shortly before I moved on to programming.
The computer upon which I wrote my first professional (COBOL) program was an IBM 3033 with a (for then) eye-popping 4MB of physical memory.
The first computer I ever owned was an RCA COSMAC with 4KB of memory.
The first DIY computer I ever assembled completely from parts (about 15 years ago) had 4MB of interleaved DRAM and a 256KB SRAM cache and was considered somewhat amazing by everyone who saw how fast it ran OS/2. I eventually boosted it up to 16MB
Now you get 8MB of on die cache with your four cores... And I still can't get a decent flying car.
Re: (Score:3, Funny)
Re: (Score:2)
About Time (Score:3, Interesting)
It's about time... I mean, seriously. The CPUs coming out of AMD have stagnated in the last few years. The Phenoms are decent enough, I guess, if you have apps that can take advantage of the three or four cores, but they clock slower than comparable X2s, and two cores is still the optimal point on the diminishing returns curve (on adding more cores).
I remember the 90s and early 00s when you were basically required to upgrade your processor every year or two or be hopelessly behind when the latest game came out. Now, I'm running the same machine I was back in '04, except with a new video card and an upgrade from a 3800+ (2.4Ghz) to a 4800+X2 (2.6Ghz) a year and a half ago.
I got curious how far I was behind these days, and found that as far as everything goes, a 4800X2 is still about as good a chip as anything AMD produces, only about 30% below the top chips AMD makes right now.
By contrast, Intel has the E8500 which is not only significantly faster, but is heavily, heavily OCable as well. I think Moore's Law has finally broken down for AMD.
Re: (Score:2)
Think of it this way: the current AMDs are not much behind at similar clock speeds, but are a bit behind at top clock speeds (well, ok, more than a bit if you OC). Even if I buy C2D when I want fast workstations, I would definitely say AMD is within the same performance "generation" and they are very competitive at the mid-low segment.
So, your perception that Intel is improving very fast is based on the fact that they were running on a laughable (PR dept driven) architecture for years. From P3 -> P4 th
Re: (Score:2)
Yeah, I know what you mean. I was seized by a sudden crisis when I picked up Fallout 3 at the Gamestop. PC version or PS3? On one hand, the PS3 version runs better, and plays on a 46" LCD television. The PC version crashes occasionally (as do all Bethesda games -- they're allergic to making quality software), but... the PC has a mouse. While the VATS system kind of cheats for you on the whole aiming issue, playing something that requires aim (aka COD4) is still much more enjoyable on the PC than a console.
Re: (Score:2)
VATS was actually what sold me. I'm more interested in the exploration and story, am more of a turn-based game player than a "twitch" gamer, and have a hard time aiming for headshots all the time.
I've found VATS to be a nice compromise. (Of course, I also like that its use is optional. :) )
On the other hand, besides the PC game being buggy, it is also more "open" in that it can be modded.
This was a factor, but since my desktop (can't call it a gaming PC anymore; it's too behind the curve) can't support Fallo
Alice must be pleased... (Score:2)
Will there be a Hourai as well? ^_^
Re: (Score:2, Insightful)
Re: (Score:3, Informative)
Despite any advances AMD makes in CPUs, they still have a sub-par selection of chipset vendors.
What's wrong with their PCI-E bridge chip? It converts PCI-E to HT and back pretty well afaik. Or maybe you meant the southbridge? Yeah, that USB and SATA logic is really cramping my gaming rig.
The performance-interesting parts of the northbridge are on the CPU in AMD architectures (and now Intel ones too), and they're great. I'm not sure what you're complaining about.
Re: (Score:2)
What are you saying? nVidia chipsets rock!
We've had someone stick up for AMD, and now for nVidia. Anyone want to stick up for VIA chipsets? Anyone? Oh...
Faster than the Athlon64 6400+ (Score:2)
So, this seems as good a place as any to ask: what's the fastest AMD CPU available? I have a Socket AM2 6400+ and I'm looking for an upgrade without changing the motherboard. I'm talking single-core operation; that is, I don't care if a well-threaded app runs faster on a quad-core Phenom than on my dual-core 6400+, I just need it to run one application that doesn't thread, on one core, really fast.
a day late and a dollar short (Score:2)
This is hardly exciting. AMD should have released these a year ago. Now they are irrelevant.
Please tell me AMD is not betting it all on SIMD. (Score:2)
"If AMD is betting the company on an improved production process..."
They're not. There's a reason they bought ATI and it's not just because they want to get into the graphics business.
Re: (Score:2)
and that reason would be?
Oh, I know, to bleed money ferociously and take your company's bottom line so low that it nearly killed it!!!!!
Oh, and now you can spin off parts of it (and hope to sell them to someone) so you can at least try to remain viable?
AMD has been so poorly run over the last 4 years it's just plain sad.
HOPEFULLY, they can get their shit together and remain viable, but this new chip isn't going to do it for the most part.
There just isn't going to be enough call for it with the way spending on IT
Re: (Score:2)
Re: (Score:3, Insightful)
Of course it worked for Intel. Higher-resolution lithography processes mean you can fit multiple cores in the same space as a single core from a decade ago. They mean that the latency on critical paths is reduced, so you can run the chips at a higher clock speed. They mean the current consumed by transistor switching is reduced, so chips can run at lower power whilst maintaining or increasing throughput (though, interestingly, leakage current increases as feature size shrinks).
Manufacturing proces
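The scaling argument above can be sketched numerically. This is a rough, back-of-the-envelope illustration of classic (Dennard-style) scaling under the dynamic-power formula P ≈ C·V²·f; the capacitance, voltage, and frequency values are invented for illustration, not real process data:

```python
# Idealized Dennard scaling sketch: shrinking features by a linear factor s
# reduces capacitance and voltage roughly by s, while allowing clock speed
# to rise by s. Net effect: dynamic power per transistor drops by ~s^2.
# All concrete numbers below are hypothetical, not measured figures.

def dynamic_power(capacitance, voltage, frequency):
    """Dynamic switching power: P = C * V^2 * f."""
    return capacitance * voltage ** 2 * frequency

s = 65 / 45  # linear shrink factor going from a 65nm to a 45nm process (~1.44x)

# Hypothetical 65nm transistor: 1 fF capacitance, 1.2 V, 2.3 GHz clock
p_old = dynamic_power(1e-15, 1.2, 2.3e9)

# Ideal shrink: C and V scale down by s, f scales up by s
p_new = dynamic_power(1e-15 / s, 1.2 / s, 2.3e9 * s)

print(p_old, p_new, p_new / p_old)  # ratio works out to exactly 1/s^2
```

In practice the shrink is far less ideal than this (voltage no longer scales freely, and leakage grows, as the comment above notes), but it shows why smaller features have historically bought both speed and power at once.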
Re:Congratulations! (Score:5, Insightful)
If you're hinging your comments on the wafer size, you're blinded by Intel propaganda. Take a look at AMD's SPECjbb numbers, their cost per socket/core, and their threading for virtualization. Then perhaps you'll stop being an Intel shill.
Re: (Score:2)
Re: (Score:3, Interesting)
You haven't examined SPECjbb, then, have you? It's a Java-based business transaction kit that offers quite a bit of both fairness and repeatability; it's not easily manipulated, unlike others I've seen. After more than several thousand runs with it, I find it pretty reasonable for systems performance comparison, as opposed to motherboard/subsystem/peripheral benchmarks -- which shed light only on one specific characteristic of a machine, or are operating-system-specific. Admittedly, it do
Re: (Score:2)
I'm fairly certain he was referring to feature size (nm), where AMD is 1.5 years behind. He's undeniably correct there; your (correct) implication that AMD is still in the race on performance, price, and energy use is true but unrelated to his point.
Now, you may ask, "So what? They've been ahead in manufacturing." And my answer would be: look at the financials. Smaller processes are cheaper, ergo AMD leaks money like a sieve -- _that's_ "so what".
Re: (Score:2)
Thriving on *what* is the question that comes to mind for me... Thriving on manufacturing nifty cell phones, cool. Thriving on hosting pr0n and v1@35a sites, not so much. Thriving on midnight hotel-room kidney transplants, eewww! Make them stop!
Shanghai could be the melamine capital of China for all we know.
Re: (Score:3, Informative)
Thriving on life. I've just lived there for four months and wish I could go back.
Re: (Score:3, Insightful)
How about AMD/ATI?
They have been producing some very good chipsets for the desktop.
Re: (Score:2)
Me, I just have a 6-year-old P4 laptop which, compared to today's new models with Core Duo, isn't much different.
I once had a P4 laptop, since replaced with a Centrino. I definitely do not miss the P4's howling fan and poor battery life; the Centrino (half of a Core Duo) runs much quieter and cooler.
Re: (Score:2)
A 6-year-old P4 laptop is very different from a new one in just about every way, except maybe hard drives, which have only gotten marginally faster. The CPUs use less power, video cards (and even chipsets) have hardware video decoding for many hours of DVD watching on one battery, the RAM is tremendously faster than 6 years ago, screens are br
Re: (Score:2)
Never underestimate a person's desire for the shiniest, newest thing.
Like you, I am happy -- for now -- with my Athlon XP 1500+ CPU paired with an nVidia 128MB GeForce Ti video card. Old school, yet it handled all but the last level of Portal nearly flawlessly. Most people don't need the power of the last two years' worth of hardware improvements. But remove price as a factor (i.e., you have more than enough money) or add status, and why
Re:...and so? (Score:5, Interesting)
I also downgraded my speakers. My old Klipsch 5.1 surround sound speakers sounded great, but they drew something like 45 watts. I replaced them with a generic set of 2.1 speakers that don't sound as good, but they're more than adequate for the purpose, and I am now only drawing 2 watts.
Re: (Score:2)
There would simply be no price drop. The price drop comes from the company having made their money on that generation and released the new generation. If both Intel and AMD were to drastically slow down selling high end processors, they'd stop building new fabs - and run their current-generation fabs until they had made their money on them.
This would b
Re: (Score:2)
Re:...and so? (Score:5, Interesting)
Re: (Score:2)
Get a real video card that does the decoding on the GPU, and the clock speed of the CPU matters very little.
Re:...and so? (Score:4, Interesting)
So you're advocating Windows? I'd love to do H.264 decoding on the GPU in Linux; which driver and which video card do that for me?
I use XBMC; we're stuck with using the CPU for now, but at least we can use both cores. So far the AMD CPUs haven't fared well with that software for full 1080p H.264 decoding either.
Re: (Score:2)
Ha
But there's your error... I need six cores just to keep up with the crap I'm running:
One Core for Crapware
One Core for Anti Crapware
One Core for Virii
One Core for Antivirus
One Core for Applications
and
One Core to rule them all and bind them!
Re: (Score:2)
Decreasing the feature size has always led to big improvements in top performance, performance per watt, and price in the past. I'm not saying that this trend will continue in the future, but it still holds for 45nm.
I see
You are missing out.. (Score:2)
I might not have said much if you'd said you had a P4 desktop, but...
The power envelope of a non-NetBurst processor makes laptops much, much better. Heat and battery life on my P4 laptop were quite unbearable.
Re: (Score:2)
Re: (Score:2)
Re:Clock (Score:5, Insightful)
Comparing two CPUs based on clock speed alone is like comparing the speed of two cars by measuring only the RPMs of the tires. It won't get you anywhere ... you need to know the size of the tires as well!
Thus concludes my first /. car analogy. Thank you.
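The analogy above maps onto the usual first-order performance model: work per second ≈ instructions per clock (IPC) × clock frequency. A toy comparison, with IPC numbers that are invented purely to illustrate the point (not measured figures for any real chip):

```python
# Toy first-order CPU performance model: throughput = IPC * clock.
# In the car analogy, IPC is the tire size and clock speed is the RPM.
# The IPC values here are hypothetical, chosen only for illustration.

def relative_performance(ipc, clock_ghz):
    """Relative throughput in (billions of) instructions per second."""
    return ipc * clock_ghz

# A lower-clocked chip with higher IPC can beat a higher-clocked one:
chip_a = relative_performance(ipc=1.5, clock_ghz=3.0)  # big tires, lower RPM
chip_b = relative_performance(ipc=1.0, clock_ghz=3.8)  # small tires, higher RPM

print(chip_a, chip_b)  # 4.5 vs 3.8: chip A wins despite the lower clock
```

Real IPC varies per workload (cache misses, branch prediction, memory latency), which is why clock speed alone tells you even less than this sketch suggests.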