AMD Launches New Processor Socket Despite Poor Economy
arcticstoat writes to tell us that despite a poor economic climate, AMD is moving forward with a new processor socket launch, although they are trying to make it as upgrade-friendly as possible. "As you probably already know from the AM3 motherboards that have already been announced, AM3 is AMD's first foray into DDR3 memory support. As Phenom CPUs have integrated memory controllers, it's more accurate to say that it's the new range of Phenom II CPUs (see below) that are DDR3-compatible. However, the new DDR3-compatible Phenom II range is also compatible with DDR2 memory. As the new CPUs and the new AM3 socket are pin-compatible with the current AM2+ socket, you can put a new AM3-compatible CPU into an existing AM2+ motherboard. This means that you can upgrade your CPU now without needing to change your motherboard or buy pricey new DDR3 memory."
What's the point in waiting for markets to turn (Score:5, Insightful)
Re:What's the point in waiting for markets to turn (Score:5, Insightful)
Re: (Score:2, Interesting)
What exactly is the gap between Intel's and AMD's CPUs?
(I'm not trolling or trying to start a flamewar, just curious)
Re: (Score:3, Informative)
There's a lot to consider when you decide which CPU to go for, and then there's their market performance on top of that.
Re:What's the point in waiting for markets to turn (Score:5, Informative)
AMD is competitive at the low and middle end, as long as you don't overclock the mid-range CPUs.
(If you include the price of the motherboard and don't care about overclocking, a low- or mid-range AMD system will be cheaper.)
AMD doesn't have CPUs as high-end as Intel's, and the ones that come closest don't overclock as well or use as little power.
Then again, I'd say you shouldn't overclock anyway, and AMD chipsets have used less power, making the two rather comparable when used in a complete system.
Also, AMD used to have an advantage in memory bandwidth and when using multiple CPUs.
This information may be slightly outdated, but all of it is probably still true. Intel may have caught up in memory bandwidth with their latest CPUs, since they have now put the memory controller on the CPU themselves too.
Re: (Score:2, Informative)
That's not actually true. AMD gives realistic power-draw estimates, and real-world testing has shown that the AMD parts now use less power. One must also take into account that AMD has been integrating a significant part of the northbridge into the CPU die for some years now.
Re: (Score:2, Informative)
In addition to having competitive low-end and mid-range CPUs, AMD is the clear performance leader in virtualization applications.
Lots of cores (Score:3, Insightful)
This information may be slightly outdated, but all of it is probably still true. Intel may have caught up in memory bandwidth with their latest CPUs, since they have now put the memory controller on the CPU themselves too.
AMD's HyperTransport has interesting extensions to help with cache coherency, and it currently scales very well with lots of cores and lots of physical CPUs.
(Opterons can be used in motherboards with 4 or 8 sockets.)
Intel's QuickPath is currently more of a first-generation interconnect. That's probably why platforms with lots of cores and CPU packages have only been announced for later on.
(The first CPUs announced, as far as I've read, are only to be used in two-socket configurations.)
Thus if you want to run a server which
Re: (Score:2)
Re:What's the point in waiting for markets to turn (Score:4, Informative)
You have to account for how each company lists its TDP.
Intel lists its TDP as average load usage, while AMD lists the max draw.
So if each chip were listed at 90W, the AMD part would actually use less power.
Re: (Score:3, Insightful)
For notebooks I have no idea how total system power usage looks; AMD's chipsets provide better integrated graphics than Intel's do, however. And I guess I would go for the better (though still crappy) graphics over a somewhat faster / more power-efficient CPU (if Intel's really is).
AFAIK AMD doesn't have an alternative to the Atom; I may be wrong though.
Also, an Intel notebook with an Nvidia chipset may compare better against AMD.
Re: (Score:2)
In my experience Intel dominates the notebook business. I prefer AMD, but the notebooks out there using them are either:
1) based on Sempron (slowish but low powered)
2) based on older X2 core (good performance but runs hot and suc
Re:What's the point in waiting for markets to turn (Score:5, Informative)
AMD has the Geode LX and NX lines.
Geode LX [amd.com] is very low-powered, and the highest clock speed (I've seen) is 566MHz.
Geode NX [amd.com] is targeted directly at the Atom, although I have yet to see any of these out in the wild.
I've only ever found a Geode in the wild clocked as high as 500MHz (see the ALIX boards [mini-box.com]).
Actually, the Geode is a dead-end processor; AMD has already stated they are discontinuing it.
AMD recently announced a new processor, "Conesus," intended for netbooks and UMPCs.
http://gizmodo.com/5086703/amds-upcoming-conesus-netbook-chip-wont-stoop-to-mid-levels
Re:What's the point in waiting for markets to turn (Score:5, Informative)
(PS to trolls: unbuffered ECC memory is only marginally more expensive than unbuffered non-ECC, though it usually has a small latency penalty. Registered/FB-DIMM ECC, on the other hand, is Quite Expensive.)
Re: (Score:3, Informative)
Re:What's the point in waiting for markets to turn (Score:5, Interesting)
Core i7 940 -> $564.99 + about $250 for mobo = $800+
Phenom II 940 -> $224 + about $150 for mobo = about $375
Core i7 needs DDR3; the Phenom II 940 runs DDR2 (note that the 940 is an AM2+ part, not AM3, so it doesn't support DDR3). DDR3 is somewhere around 50% more expensive than DDR2 (though falling).
Given that the i7 is only about 10-20% faster than the Phenom for more than twice the cost, it's simply not worth considering for me. Then again, I do most of my gaming on consoles.
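To put rough numbers on that trade-off, here's a minimal performance-per-dollar sketch in Python. It uses the platform prices quoted above and treats the 10-20% gap as a flat 1.15x multiplier, which is an assumption for illustration; the real gap varies by workload.

```python
# Rough price/performance comparison using the figures quoted above.
# The 1.15 multiplier is the assumed midpoint of the 10-20% i7 advantage;
# actual results depend heavily on the workload.

platforms = {
    "Phenom II 940 (CPU + mobo)": {"cost": 224 + 150, "perf": 1.00},
    "Core i7 940 (CPU + mobo)":   {"cost": 565 + 250, "perf": 1.15},
}

for name, p in platforms.items():
    perf_per_dollar = p["perf"] / p["cost"]
    print(f"{name}: ${p['cost']}, relative perf {p['perf']:.2f}, "
          f"perf per dollar {perf_per_dollar:.4f}")
```

On those assumptions the Phenom II platform comes out at roughly twice the performance per dollar, before the DDR2 vs. DDR3 memory price difference is even counted.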
Re: (Score:2)
Given that the i7 is only about 10-20% faster than the Phenom for more than twice the cost, it's simply not worth considering for me. Then again, I do most of my gaming on consoles.
Not to mention the fact that most games bottleneck at the graphics card, not the CPU. So that 10-20% faster CPU isn't guaranteed to give you a 10-20% increase in FPS. The Phenom II is definitely the way to go if you want the best bang for your buck, imo.
Re: (Score:2, Funny)
But then you lose the "I spent $X,000 on my superawesomeness gaming rig" pissing contests when you play those games online...
Re: (Score:3, Interesting)
"Starting now, there's going to be a lot less talking and a lot more killing."
"Less smack, more thwack"
I could come up with more ways to say it, but you get the point.
Re: (Score:2)
Re: (Score:2)
Re: (Score:3, Insightful)
Do you think a current i7 would compare to what you will buy for $300 in two years' time? In three years a laptop would be faster.
Re: (Score:2, Informative)
Yeah, was looking at the Core i7 earlier today and noticed they are pricey.
Think this is a better bang for the buck...
Gigabyte GA-MA74GM-S2H $64.90
AMD Athlon64 X2 5200+ Retail (Socket AM2) $59.99
Rosewill R363-M-BK Micro ATX Black Ultra High Gloss Finished Computer Case with 400W ATX $59.99
CORSAIR XMS2 2GB (2 x 1GB) 240-Pin DDR2 SDRAM DDR2 800 (PC2 6400) $44.99
BFG Tech BFGE98512GTE GeForce 9800 GT 512MB 256-bit GDDR3 PCI Express 2.0 x16 $134.99
Total: $364.86
Missing: Hard Drive, DVD,
Re: (Score:3, Interesting)
An i7 920 will crush an X2 5200+ in CPU-intensive tasks. http://www.guru3d.com/article/intel-core-i7-920-and-965-review/15 [guru3d.com] shows a software-based rendering benchmark. The i7s are up in the 11s and 12s, while my X2 6000+ was in the 4s.
The i7 system will definitely cost more, as there aren't really any budget X58 motherboards and the CPUs and DDR3 are still brand-new, top-of-the-line parts (and thus get a price premium). However, I just got an EVGA tri-SLI board, i7 920, and 6GB of DDR3-1866 for $570 + S
Re: (Score:3, Insightful)
Re: (Score:2, Interesting)
Re: (Score:2)
Just about ANYTHING will beat the pants off of a Celeron - Celerons are useless.
Re:What's the point in waiting for markets to turn (Score:4, Insightful)
In addition to the fact that it's cheaper for them to make this than the previous version, they have every reason to stay competitive.
Who writes this "poor economy" crap?
Many companies are doing just fine through this downturn; it's just the mental state of consumers that has changed, and probably not for the long run either, as consumers tend to have about the memory of a goldfish when it comes to taking corrective action financially.
We're just slowly deflating back to where we were before this hyperinflation the last few years has brought.
Re: (Score:3, Funny)
I reserve the right to remind you that you said that.
Re: (Score:3, Informative)
Re: (Score:2, Informative)
> Who writes this "poor economy" crap?
When you're ready to pull your head out of the sand and stop ignoring facts... the Dept of Labor [speaker.gov], for one.
--
Stop Racism. Support the HUMAN Race.
Re: (Score:2)
Wow, a politician carefully constructs a chart to show that politicians should have more power; never saw that coming! Sure, the data's from the Dept of Labor, but interpretation is everything. Here in Silly Valley, for example, turnover is quite high (many job losses), but there are plenty of new jobs to move to. So far, it's nothing like the "laid off means at least 6 months looking" dot-bust days.
Fearmongering only makes things worse. It's pathetic enough that we've spent over 1.5 trillion on unneeded
strange (Score:3, Insightful)
Other than starving CIS majors, who barely earn enough money from their university's computer lab to pay for Ramen Noodles, who does that? IT professionals would just buy all the hardware together because their time is worth more than their money, and everybody else just buys entire new computers. This could only appeal to a handful of small-budget kids.
Re:strange (Score:5, Insightful)
Or a lot of small-budget husbands :P
Re: (Score:3, Funny)
OK, I'm not married, but recently when I was at the computer store, I overheard this scrawny guy on the phone with his wife begging her for permission to get the 2G instead of the 1G RAM upgrade. His whiny, pathetic, groveling demeanor over a $20 difference in price, and his futile attempts to explain to her why 2G is better than 1G, made me absolutely want to vomit. I'm not married, but I vowed that day to either divorce or kill myself if I ever find myself to be such a pathetic, spineless loser.
So my advi
Re:strange (Score:5, Insightful)
So my advice to the married chumps out there is to keep a separate bank account for discretionary purchases which your wives have neither control of nor access to. Life without self-respect (and gadgets) is not worth living.
Seconded. One of the best things you can do is establish the idea of a slush fund for both sides of the relationship; fighting over money is one of the more common reasons for divorce.
Re:strange (Score:4, Insightful)
So my advice to the married chumps out there is to keep a separate bank account for discretionary purchases which your wives have neither control of nor access to. Life without self-respect (and gadgets) is not worth living.
Or... marry someone who isn't a total shite and respects your interests.
-Taylor
Re:strange (Score:4, Insightful)
Or... marry someone who isn't a total shite and respects your interests.
-Taylor
This.
There's no way you are going to be able to successfully "hide" income from your partner.
You need to be able to sit down and talk about priorities and expenses without it devolving into a shouting match or having one of the parties become an unhappy invertebrate...
Re:strange (Score:4, Funny)
Life without self-respect (and gadgets) is not worth living.
Dude! Way to totally reverse priorities. Did it occur to you that maybe she's the one with the high-paying job, and all that groveling got him better hardware than he could have bought if his was the only income?
Okay that probably wasn't the case, I'm just sayin', if I had to choose between self-respect and gadgets... "Honey, please?! I took out the trash last night and everything!"
Re: (Score:2)
I'm just sayin', if I had to choose between self-respect and gadgets... "Honey, please?! I took out the trash last night and everything!"
Well that's your decision. Just please have the decency to not do your groveling in public - it embarrasses the rest of us.
Re: (Score:2)
Life without self-respect (and gadgets) is not worth living.
Dude! Way to totally reverse priorities. Did it occur to you that maybe she's the one with the high-paying job, and all that groveling got him better hardware than he could have bought if his was the only income?
Should it matter? In my opinion, if you're married, then it doesn't matter who makes the money, you're in it together. By that reasoning, should a parent who stays home with the children not be able to buy anything at all since they don't earn anything? Granted, I think that any major purchase or decision should be made together, and a $1k purchase is usually considered major, but one person shouldn't be begging the other for anything. That kind of relationship is not sustainable and not healthy for anyone.
Re: (Score:3)
Re: (Score:2)
Dude! Way to totally reverse priorities. Did it occur to you that maybe she's the one with the high-paying job,
Irrelevant. Would she be groveling for permission to spend $20 extra on a nicer pair of shoes, if he had the high-paying job? No, if she's already buying the shoes, she'll write the check, or swipe the card, and tell you later -- easier to get forgiveness than permission.
What's more, if it's a high-paying job, $20 is nothing.
if I had to choose between self-respect and gadgets...
Your choice.
I would certainly choose self-respect over marriage, though.
Re: (Score:2)
Irrelevant. Would she be groveling for permission to spend $20 extra on a nicer pair of shoes, if he had the high-paying job?
Ha! And you think those circumstances are the same?
I would certainly choose self-respect over marriage, though.
Yes that's pretty much the choice. ;)
Re: (Score:2)
you think those circumstances are the same?
Why not?
Here's how that conversation could have gone:
"I'm getting the 2 gig version. In fact, you know what? I'm getting the 4 gig version."
"Honey, we can't..."
"No, but I can."
"That's it! You're sleeping on the couch!"
"On the couch with my new laptop! Sounds good!"
If she's going to divorce you over twenty fucking dollars, she is not worth it. That's when you say, "Your latte for the next week, or me. Choose."
I know this will probably come off as advice on how to be an asshole. It's not -- you do want to be
Re: (Score:2)
Why not?
Uh cus she's a woman, something you either don't or don't want to understand.
By the way, I was joking in my first post, the last one, and this one, but it's a joke based on reality. You think marriage is that simple, you just say "it should be this way" and it is? That's why you aren't. ;)
Re: (Score:2)
Would she be groveling for permission to spend $20 extra on a nicer pair of shoes
Nope, she'd say, "They were on sale." And that would be the end of it. If you buy a computer, make sure it is on sale and then brag about how much you "saved", not how much you spent.
Re:strange (Score:5, Insightful)
Or, you can be adults, and maybe agree on an amount of discretionary spending that doesn't require the other's approval.
Re: (Score:2)
Re: (Score:2)
Hear hear!
Listen spineless losers: only grovel for 4GB. 2GB isn't enough. If you grovel, do it for something worthwhile.
Re: (Score:2)
20 years ago, you'd be groveling over 64k vs. 32k.
Re: (Score:2)
Listen spineless losers: only grovel for 4GB.
Fuck that, if I'm groveling, it'll be for this beast [apple.com].
Re: (Score:2, Insightful)
Re: (Score:2)
Wow, really? By your response, I never would have guessed.
It's your response that repulses me. A life lived only for oneself is not worth living. Not that I'm recommending suicide or marriage for you. But seriously, don't make the purchase of 1 gig of freaking RAM the definition of a life worth living. Gadgets suck compared to people.
Re: (Score:2)
But seriously, don't make the purchase of 1 gig of freaking ram the definition of a life worth living.
No, it's the groveling. Let me put it in perspective:
For you, it's a gig of RAM. For her, it's $20. Neither is worth fighting over -- but then, a modern computer with 2 gigs will most likely be quite a lot faster and more useful than one with 1 gig. If that matters to you, she should respect that.
Let's suppose she's the one with a high-paying job. Probably fair to assume she drinks Starbucks. If you really don't have the $20, she can skip her latte for a few weeks. She doesn't even have to go off caffeine,
Re: (Score:2)
So my advice to the married chumps out there is to keep a separate bank account for discretionary purchases which your wives have neither control of nor access to. Life without self-respect (and gadgets) is not worth living.
Wow, I'm really lucky... last time my computer broke (it was 5 years old), she went with me to the Apple Store so we could use her student discount to get a new dual G5. That's now 4 years old... hmmmm...
Re: (Score:2)
Reminds me of my cousin. He works in a computer store, and his girlfriend wouldn't allow him to get a new computer because she thought he should spend the money on a trip for them (he was like 19 and she was 18, so pretty pussy whipped). ;) Or well, that wasn't the actual problem, but he found a workaround.
Anyway, he kept the case; problem solved.
Re:strange (Score:4, Insightful)
Not strange at all (Score:5, Insightful)
Other than starving CIS majors, who barely earn enough money from their university's computer lab to pay for Ramen Noodles, who does that? IT professionals would just buy all the hardware together because their time is worth more than their money, and everybody else just buys entire new computers. This could only appeal to a handful of small-budget kids.
If you don't think in terms of upgrading the processor of the computer sitting on your desk, but instead think of HP updating the processor in their line of AM2-based computers, then you should be able to see that the appeal is basically universal. This way the OEMs can offer refreshed versions of their lines without having to incur the extra expense of DDR3. Obviously they will also make a DDR3 AM3-based line, but the DDR2-based line will be cheaper.
Backward compatibility and in-place upgrades appeal to far more than a handful of poor hobbyists.
Re:strange (Score:5, Interesting)
Re:strange (Score:4, Insightful)
Virtualization doesn't help your performance if you're already using all of a particular resource. It has overheads that mean you're getting less out of your hardware in terms of raw performance. The fact that you can put 5 boxes that would otherwise be sitting idle on the same hardware is what makes virtualization attractive.
Re: (Score:3, Informative)
Exactly how does virtualisation magically add performance out of thin air? And exactly how does buying additional iron provide the same kind of performance increase per $ spent that the parent mentioned?
Gosh, can we please have an automated -2 buzzword sucker whenever someone comes up with fancy terms where they just don't fit?
Re: (Score:2)
Perhaps I should have been more clear.
Virtualization allows you to swap out the underlying hardware without reconfiguring the server's OS.
Granted, you still do need to set up a "host" operating system, although this tends to be a fairly trivial task.
Re: (Score:3, Insightful)
Re: (Score:2)
How in the world does virtualization give you CPU cycles you didn't have before?
Switching to clustering will cost well more than $5K.
Re: (Score:2)
Not necessarily. If you have a nice machine that could go for another year in its current role if it were just a bit faster, why not drop in the new CPU? Getting an all-new machine also takes time to move the software and/or configs over.
Next year when the budget is hopefully better, get a new system sans CPU, and move it over. Put the old one back in the old box and deploy somewhere less demanding.
Doesn't matter. (Score:3, Interesting)
I mean the economy, in terms of releasing a product update. If the work is done and ready to go, it's too late to worry about the economy; just ship it. Not only that, product development cycles for these products are long enough that they need to invest in R&D continually regardless of the economy; by the time a just-started project is done, the economy will have rebounded and be ready for new product.
If the world is switching to DDR3, that probably means having a new socket. As such, AMD needs to introduce the new socket when they are ready to.
Excellent (Score:2)
This is great. I'm hurting for a new desktop and was planning on getting an AM2 CPU.
I still am, but knowing that the AM3 was just around the corner, waiting to knock all the AM2/AM2+ prices down has delayed my plans for a few weeks, and now there's finally an end in sight!
AM3 CPU in AM2+ motherboard: OK. Other way... no (Score:5, Informative)
You may be able to put an AM3 processor in an AM2+ motherboard, but The Register says that an AM2+ processor in an AM3 motherboard will not work (http://www.reghardware.co.uk/2009/02/09/review_cpu_amd_phenom_ii_am3/page2.html).
To quote:
"makes life horribly confusing as the Phenom X4 920 and 925 and the X4 940 and 945 will be identical apart from the processor socket. This means that there is the possibility that some poor so-and-so will buy an AM2+ CPU and an AM3 motherboard when ne'er the twain shall meet."
careful what you buy out there
Re: (Score:2, Informative)
IIRC the AM3 has fewer pins and is able to plug into an AM2+ socket, but AM2+ chips can't plug into an AM3 socket. So, if you buy the wrong one, you'll know as soon as you try to plug your AM2+ CPU into your AM3 motherboard...
Re: (Score:2)
That's kind of logical - sure, some poor so-and-so could in fact get into some trouble this way, but there's absolutely nothing AMD could do to fix that, save for putting an appropriate notice on the box. I mean, AM3 motherboards will use DDR3 memory (which is different from DDR2 even in terms of physical dimensions and pinout, so you can't put a DDR2 module in a DDR3 slot), but AM2 processors can't talk to DDR3 memory because they were not designed to do that, and AMD can't magically fix all those AM2 proc
Despite a poor economy? (Score:5, Insightful)
Re: (Score:2)
Re:Despite a poor economy? (Score:5, Informative)
And also due to the poor economy; otherwise they wouldn't support cheaper DDR2.
I guarantee you they would.
Even when the economy was good, there was a lot of downward pressure on the prices of computers. Mandating a switch to a more expensive memory tech before the market is ready is a sure way to have it backfire in your face *cough* RAMBUS *cough* Ugh that was some nasty phlegm.
Re: (Score:2)
Yeah, speaking of which I had to tell a poor sot today that it's not worth upgrading his memory because it uses Rambus. Heck, for the cost of adding 512 MB he could buy a whole new computer. If they'd used DDR from the start, he could probably be rockin' 1GB RAM now, and be good to go for at least a few more months.
Re: (Score:2)
Ha ha?
expensive memory? (Score:2)
Re: (Score:2)
It starts getting expensive, though, if you want to have more than 8GB in your computer. You either have to find 4GB sticks at that point ($$$) or buy a server-class board ($$$). It's actually kind of annoying: DDR2 is crazy cheap, but it starts to get difficult to use more than about 8GB of it, and really difficult to use more than 16GB.
Inflationary summary? (Score:5, Insightful)
"Despite a poor economic climate, farmers still harvest crops they planted last year...." - come on....
(The Colloquial) Moore's law is a cruel mistress (Score:2)
If you take as gospel the non-technical* formulation of Moore's law (processing power doubles every 24 months), then delaying your product even a few weeks puts you behind the performance curve. A 2-week delay comes in at a manageable 1.3%, but delay your product 8 weeks and you are already 5.5% behind your competitor**. In a market with margins in the low single digits, that's the difference between profit and loss.
Economy or not, you've got to release the product when the engineers say it's ready, or else it deca
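The percentages in the post fall straight out of the doubling assumption: deficit = 2^(delay / 104 weeks) - 1. A minimal sketch of that arithmetic, taking the 24-month doubling period as given (it's the post's own stated assumption, not a measured figure):

```python
# How far behind the performance curve a delay puts you, assuming
# processing power doubles every 24 months (about 104 weeks).

DOUBLING_PERIOD_WEEKS = 24 * 52 / 12  # = 104 weeks

def deficit(delay_weeks: float) -> float:
    """Fractional performance deficit accumulated during a delay."""
    return 2 ** (delay_weeks / DOUBLING_PERIOD_WEEKS) - 1

for weeks in (2, 8, 26, 52):
    print(f"{weeks:3d}-week delay -> {deficit(weeks) * 100:.1f}% behind")
# 2 weeks -> ~1.3%, 8 weeks -> ~5.5%, matching the post;
# a full year of slippage would leave you roughly 41% behind.
```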
Actual processor speed (Score:2)
Except that actual processor speed went off Moore's curve a while back ... While transistor densities have gone up (mostly) according to schedule, actual processor speed has not.
Your argument is good, and AFAIK processor makers use it to a certain extent, it's just that the percentages are a bit smaller.
DDR3 not worth it (Score:2)
"despite" submitter's ignorance? (Score:3, Insightful)
TITLE: AMD Launches New Processor Socket **Despite Poor Economy**
So we're not supposed to do anything because the economy is bad?! So let's never do a thing again, because we're ignorant of larger pictures and contexts and the variableness of life. What kind of f***ed-up sh** IS THAT! Start tagging sentences with pessimistic endings and implying stuff because we're ignorant a**holes. Let's see ....
Today I drove to work in the winter despite the road salt runoff that will affect the lake. I bought a new dog despite the existence of puppy mills. I washed my hands after peeing despite the fact that antibacterial soap kills good germs. I sat on a wooden chair despite my ginger ass getting chapped.
You know what ... I think the title actually had an effect on me despite the fact I found it totally ignorant. What do you think?
A real review of the product with benchmarks, here (Score:2)
Re:Good (Score:5, Informative)
DDR3's latency, measured in wall-clock time, is generally lower than DDR2's. The advertised latency appears worse only because of the faster clock.
Re:Good (Score:5, Informative)
That's bullshit. CL in clock periods * period length = latency, and since they are clocked higher, the latency will probably come out around the same; I won't calculate it for you.
And that latency is how long it takes before you actually start to read any bits, but once you have started, each bit comes faster from the higher-clocked memory.
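To make that concrete: first-word latency in nanoseconds is CL divided by the I/O clock (half the effective data rate). A minimal sketch with assumed example speed grades; the specific DDR2/DDR3 parts and CL values are illustrative, not figures from this thread:

```python
# First-word latency = CAS latency (cycles) / I/O clock (MHz) * 1000 ns.
# The I/O clock is half the effective data rate of DDR memory.

def latency_ns(data_rate_mtps: float, cas_cycles: int) -> float:
    io_clock_mhz = data_rate_mtps / 2
    return cas_cycles / io_clock_mhz * 1000

examples = [
    ("DDR2-800 CL5",  800, 5),    # ~12.5 ns
    ("DDR3-1333 CL9", 1333, 9),   # ~13.5 ns
    ("DDR3-1600 CL9", 1600, 9),   # ~11.3 ns
]

for name, rate, cl in examples:
    print(f"{name}: {latency_ns(rate, cl):.2f} ns to first word")
```

On those assumed numbers the wall-clock latency barely moves between generations, while the burst transfer rate goes up with the data rate.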
If you don't get a speed increase, it's because one of:
1) The processor isn't fast enough to take advantage of the additional bandwidth.
2) The cache system is smart enough that it doesn't need the additional bandwidth.
3) The application doesn't use memory in a way that takes advantage of the additional bandwidth.
Most likely the last one...
All higher-end graphics cards come with faster memory; it may not always be a huge deal, but it probably adds some benefit. It would be rather stupid if it didn't.
AMD said earlier they would skip DDR2 and go directly to DDR3 because there was no benefit in actual use, but I guess they "had to" once Intel was using DDR2, just because people see the numbers and wonder why one is bigger than the other.
Though the first AM2 DDR2 chips showed no speed increase in benchmarks vs. the Socket 939 DDR chips.
Anyway, DDR3 is faster than DDR2. Will you notice it? I have no idea.
Re:Good (Score:5, Insightful)
The 3-fold clocking scheme will only really help on interleaved burst reads. The memory cells don't charge the output buffers any faster just because you clock them at a higher rate. This is why the nCLK latencies scale with the number of folds in the DDR scheme. The only things that will make the cells charge faster are a) higher voltage, b) a smaller process, or c) a more conductive semiconductor chemistry that lowers resistances and increases currents on the wafer.
If you can have 3 banks of DDR3 interleaved by 1 clock then you can probably see some significant gains on sequential (aka burst) reads. In real life, this doesn't happen very much, especially in a multithreaded environment where almost all s/w is written using high-level foundation classes with very little machine optimization.
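A toy model makes the burst-vs-random distinction visible: per-access time is a fixed cell-access (CAS) latency plus the bytes transferred divided by bandwidth. The CAS and bandwidth figures below are assumed round numbers for illustration, not benchmarks:

```python
# Toy model: access time = fixed CAS latency + bytes / bandwidth.
# Shows why a higher transfer clock helps long burst reads far more than
# scattered reads, where the fixed cell-access latency dominates.

def access_time_ns(bytes_read: int, cas_ns: float, bandwidth_gbps: float) -> float:
    return cas_ns + bytes_read / bandwidth_gbps  # 1 GB/s ~ 1 byte/ns

configs = {
    "DDR2-800  (12.5 ns CAS, 6.4 GB/s)":  (12.5, 6.4),
    "DDR3-1600 (11.3 ns CAS, 12.8 GB/s)": (11.3, 12.8),
}

for pattern, size in (("random 64 B cache line", 64), ("sequential 64 KB burst", 64 * 1024)):
    print(pattern)
    for name, (cas, bw) in configs.items():
        print(f"  {name}: {access_time_ns(size, cas, bw):,.1f} ns")
```

With these toy numbers the faster memory is nearly twice as quick on the long sequential burst but only modestly ahead on the scattered 64-byte reads, which is the parent's point.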
Re: (Score:2)
I remember reading about what a huge impact the concept of interchangeable parts had on the industrial revolution. And at the beginning there for a while, it also applied to computers.
It still does. Your PCI-E video card works in just about any computer that supports that socket.
I am positive you can engineer a socket that leaves plenty of room for the future. But why would you want to do that, if you can drain your customers' bank accounts?
Why would you do that? It'd be more expensive now, offer no advantages, and in three years, the new socket would be faster and come with updated IO choices.
Re: (Score:2)
and in three years, the new socket would be faster and come with updated IO choices.
That's arguable. We're quickly approaching performance limits, diminishing returns are starting to show up, and processors are only getting marginally faster. Now the trick is to play with the cache, and add cores to show a performance increase. I'm wondering in 3 years if we'll be able to justify upgrading at all. If that's the case, I expect new sockets galore and radical design changes just to artifi
Re: (Score:2)
That's arguable. We're quickly approaching performance limits, diminishing returns are starting to show up, and processors are only getting marginally faster. Now the trick is to play with the cache, and add cores to show a performance increase.
Guess what? Adding more cores usually needs more pins for I/O. AMD did something smart and reserved some space for another HT link (apparently), which is what you've been complaining about them not doing, so I'm confused why you're unhappy. Why would they have added the HT stuff 2 years ago if no one was going to use it? What happens when it's time to go past 40 address lines? We get a new socket, that's what.
Re: (Score:2)
Re: (Score:2)
Re: (Score:3, Insightful)
No, they were busy getting drunk and ignoring the adults who told them they'd end up with shitty jobs later on in life if they didn't take school more seriously.
Re: (Score:2)
It also does not include anyone who is not eligible for unemployment, for any reason. (Such as being unemployed for longer than the unemployment period + extension.)
Re: (Score:2)
That's a joke, but seriously, it's folly to equate today's unemployment to the Great Depression's. It's like comparing a windy day to a hurricane.
Re: (Score:2)
is the current problem the hurricane or the windy day?
Re: (Score:2)
Yeah, it is not at 25%+ yet
Re: (Score:2)
(sorry flipping burgers does not feed a large family)
Do people actually think it does?
Re: (Score:3, Funny)
doh! Hit the wrong button. Forgot to add the punchline about... unless they're bringing those burgers home. =P
Re: (Score:2)
A new RAM format means the price of the old format declines a little, stagnates, then climbs up incessantly.
http://www.dramexchange.com/ [dramexchange.com]