AMD Introduces New Opterons 128
New submitter Lonewolf666 writes "According to SemiAccurate, AMD is introducing new Opterons that are essentially 'Opteron-ized versions of Vishera.'
TDPs are lower than in their desktop products, and some of these chips may be interesting for people who want a cool and quiet PC."
And on the desktop side, ZDNet reports that AMD will remain committed to the hobbyist market with socketed CPUs.
Not watching the trends? (Score:4, Interesting)
I hope people are starting to sit up and take notice. The desire to fulfill the prophecies of Moore's law and to have ever faster and more powerful computing has already exhausted itself. Games are just about as good as they are going to get without new display technologies. The desktop PC has been maxed out and has resorted to multi-processor and multi-core designs as the means to keep growing, but meanwhile, the primary OS for most people running these systems still doesn't take full advantage of even those advances.
So now things are going for lower power, lower operating temperatures and all that. What sorts of things benefit from that? How about "embedded systems"? Things that people don't want or need to reboot? The current versions of Windows are too bloated, power- and memory-hungry to fit within that framework, so it'll have to be another OS. We know this because of the horrible failure "Netbook" computing has been. People wanted it, but expected it to run Windows. Windows couldn't really do it effectively. (I know... people are still doing it... I've still got two netbooks running XP and going strong... but is anyone selling XP?) Microsoft shows no remorse over its architectural choices and shows no signs of slimming down and getting lighter. So nothing points in Microsoft's direction... not even Microsoft. They are raising prices to make up for the lack of interest in what they are doing now.
Think about what we are seeing.
Re:Not watching the trends? (Score:5, Insightful)
Actually, all modern OSs do a fantastic job of taking advantage of multiple cores. It's the apps that fail to do so.
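For what it's worth, the split is trivial when an app bothers to do it — a minimal Python sketch (the worker count and chunk boundaries are arbitrary, purely for illustration):

```python
# Sketch: the OS happily schedules across cores, but only if the
# application splits its work. multiprocessing does the split here.
from multiprocessing import Pool

def partial_sum(bounds):
    lo, hi = bounds
    return sum(range(lo, hi))

if __name__ == "__main__":
    chunks = [(0, 250_000), (250_000, 500_000),
              (500_000, 750_000), (750_000, 1_000_000)]
    with Pool(4) as pool:              # four workers, one per core
        total = sum(pool.map(partial_sum, chunks))
    assert total == sum(range(1_000_000))
    print(total)
```

The OS will spread those four workers over four cores without any help; a single-threaded app summing the same range would pin one core and idle the rest.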
As for OSs that take advantage of low power CPUs, you only mention MS - who (I suppose) has done a good job of this with Windows RT on the Surface. And maybe even a good job with whatever the hell Windows Phones run. It's just that consumers have not liked the apps. Of course Apple and Google both have solid contenders in the embedded space.
So, as it always has been: "It's the applications, dummy."
What are you trying to get at?
Re:Not watching the trends? (Score:4, Insightful)
I think he's saying that CPUs bought several years ago are good enough for most people, and the need to upgrade hardware every few years is not as pressing as it once was. One way to force upgrades is to bloat software like the OS so that you need new processors.
This leaves MS in a difficult place, as most consumers tend to buy new machines to get new Windows versions instead of upgrading. There are rumors that MS is switching to a yearly release cycle to entice consumers to upgrade. It is nearly the same model that Apple uses.
A key difference is that while Apple might make some profit on OS upgrades, they make a lot more on hardware. Thus MS is trying to get into the hardware business as well.
Re: (Score:2)
Most people won't be buying Xeons or Opterons, but for those that do, 64 cores and 128GB of memory for under $9k means a lot more hardware on much lower budgets than expected.
Re: (Score:3)
I think he's saying that CPUs bought several years ago are good enough for most people, and the need to upgrade hardware every few years is not as pressing as it once was. One way to force upgrades is to bloat software like the OS so that you need new processors.
This leaves MS in a difficult place, as most consumers tend to buy new machines to get new Windows versions instead of upgrading. There are rumors that MS is switching to a yearly release cycle to entice consumers to upgrade. It is nearly the same model that Apple uses.
A key difference is that while Apple might make some profit on OS upgrades, they make a lot more on hardware. Thus MS is trying to get into the hardware business as well.
If only that were true. The merging of CPU/GPGPU and then the HSA approach to ubiquitous computing is going to need OS-level tools and frameworks to make that huge leap, and to make it uniformly on a platform, so that application-layer development isn't spinning its wheels reinventing custom threading models and distributed design architectures because there doesn't exist a set of core APIs to aid it. Apple is way ahead in this regard. AMD is also way ahead in this regard. Ironic that both MSFT and Intel are behind when
Re: (Score:3)
Re: (Score:2)
So you are saying that a company should be forced to sell software they no longer want to support?
What about that company's support costs? If the company is still selling the software, their customers are going to demand support, and the main reason the company stopped selling said product is that its support costs were too damn high. So how does that get the company more money? All it gets the company is pissed-off customers and frazzled tech support.
Re: (Score:1)
XP still has support, it's called DIY.
Re: (Score:2)
Out comes the Linux Mantra, but for an MS product! Wow.
Re: (Score:2)
Re: (Score:2)
In your post you stated:
This is a reasonable compromise for consumers and vendors: the former get to use the software they want, the latter continue to get their money
How are the vendors getting money in your solution?
Re: (Score:3)
Re: (Score:3)
That is nice and reasonable as long as the consumer is aware that support will only be for the version they purchased.
Comment removed (Score:5, Insightful)
Re: (Score:2)
Re: (Score:2)
Re:Not watching the trends? (Score:4, Interesting)
Re:Not watching the trends? (Score:4, Insightful)
Oh I completely dig that idea. If it is of no use to you (ie. you aren't selling it) then you have apparently exhausted its value to you as a business. It is now your responsibility under the contract of copyright, to release it to the public domain. But no. "The value" is maintained by keeping it away from the public in order to ensure that they keep buying the same things over and over and over again. This is a public abuse which could only be enabled by copyright law.
So copyright went from the right to copy and distribute to the right to take it away from the public and to withhold information, arts and technology.
Re: (Score:2)
1. Some people are still running Win98SE
2. If WinXP were free, people would likely kick in with their own fixes and updates
Also, if the copyright for Windows XP were turned over to the public domain, I think the source code would ALSO be made available somehow. They could try to keep it to themselves... you know they wouldn't be obligated to publish it anywhere as far as I know... not sure how the Library of Congress works with regard to all of that, but I get the feeling the code would get leaked somehow
Re: (Score:2)
No, they don't... and it's just a thought.
But in general, I think when people are ready to give up on the new version of Windows, they will pirate an old one and/or move to another OS. The options are limited, but flavors of Linux like Ubuntu are extremely effective with new users. (I am not an Ubuntu fan. Not at all!)
I'm not predicting doom for Windows and certainly not setting a date. I am extrapolating from the more numerous and recent failures Microsoft has had, and they are huge. The primary cause is cu
Re: (Score:2)
Re: (Score:2)
Perhaps a good solution would be to implement yours, but give companies the option of just releasing old versions to the public domain if they wish to avoid any support issues that they were afrai
Re: (Score:2)
Re: (Score:3)
Re: (Score:2)
Re:Not watching the trends? (Score:4, Interesting)
I hope people are starting to sit up and take notice.
???
The desire to fulfill the prophecies of Moore's law and to have ever faster and more powerful computing has already exhausted itself.
Not sure I follow. Transistor density has kept on increasing. It's been a little slower recently, I think, but several manufacturers are now sub 30nm for a variety of different process types.
Games are just about as good as they are going to get without new display technologies.
Really? Seems unlikely.
The desktop PC has been maxed out and has been resorting to multi-processor and multi-core as the means to keep growing but meanwhile, the primary OS for most people running these systems is still not taking full advantage of even those advances.
Are you sure? Have you looked at the recent CPU benchmarks? More and more programs are taking advantage of multiple cores. All sorts of things that people actually do, like file compression, web browsing, media transcoding. Certainly the things I do benefit from multiple cores.
We know this because of the horrible failure "Netbook" computing has been.
Netbook computing was fine until Microsoft moved quickly to kill it. Then the manufacturers seemed bent on suicide after that, for inexplicable reasons. Oh, and Intel came up with bizarro licensing for the Atom, restricting manufacturers, yet they haven't (with few exceptions) switched to the faster and cheaper Bobcat CPUs, which lack such bizarre licensing restrictions.
Why can't I buy a machine at the low price point and low weight of the EEE 900? That machine sold many millions. Netbooks used to be sub-1kg in the beginning. Now the lightweight ones are 1.5kg. What happened?
Venduhs are strange. Why did they drop all the high-res screens from laptops 10 years ago, only to scramble to play catch-up after Apple decided to bring in high-res displays? Makes no sense.
That said, there's still a quite decent range of cheap netbook machines around, but they're just not as good as they were.
Re: (Score:2)
I agree with the games thing. There are many ways they could use more CPU/GPU and still be useful. For example, when have you seen wind in trees in a game that actually looks like wind in real trees? Trees in games are always some sort of leaf pattern on a plane with holes in it. Any games with a good deformable environment? How about reflections in water ripples? Bullets that are actually computed using wind and the movement of the player? Actual gravity and friction?
Oh bull (Score:4, Interesting)
While software has been hampered by web "technology" over the last decade, we are hardly at the pinnacle of software and computing... it's more like the Dark Ages, actually. Some stuff is being done elsewhere (GPUs, mobile), but we're still mired in fundamentally stagnant and backwards principles on the desktop (and server, really).
Laughable. Let's assume anything video-related is "new display technology," and grant that we certainly have a long way to go to realtime radiosity and raytracing at extremely high resolution on a mobile device, then toss in 3D for good measure. But in terms of gameplay, all the computing and RAM you can get can be eaten up for a very long while. Simulation in games, today, isn't anything like what it could be. If I can't build a city at the SimCity level, zoom in and rampage through it at the GTA level, walk up to each and every person on the street and learn their personal history and daily routines at an RPG level, then go into every structure and demolish it bit-by-bit with full soft-body dynamics, we've got quite a long way to go.
This is true to some extent, but "resorting to multi-processor and multi-core" means the desktop isn't maxed out. The primary OS (and software) may not be taking advantage of these things, but they are there and we're far from done yet.
Microsoft is irrelevant. They have been for a long time. They may not be going away anytime soon, but they've been irrelevant since Google used the web to effectively route technology around them (due to earlier attempted lock-in). Of course, this has resulted in aforementioned Dark Age of Software, but at least we're not stuck on one platform. We're at the point where Valve is looking to seriously move gaming away from Windows, and there are alternatives for everything else, so what happened before doesn't really apply to what can happen in the future.
What we are seeing is ripe potential for a Computing Renaissance.
Re: (Score:3)
So now things are going for lower power, lower operating temperature and all that. What sort of things benefit from that?
Racks and racks of servers.
Every watt shaved off the TDP of the processor can be doubled (or more) by the savings from needing less cooling in the building.
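Back-of-envelope, assuming a PUE (power usage effectiveness) around 2.0 — the exact figure varies by facility, so treat these numbers as illustrative:

```python
# Sketch: facility-level savings from shaving CPU watts.
# A PUE of 2.0 means every watt of IT load costs roughly another
# watt in cooling and power-distribution overhead.
def facility_savings_w(watts_saved_per_cpu, cpus, pue=2.0):
    return watts_saved_per_cpu * cpus * pue

# 10 W saved per socket across 1,000 sockets at PUE 2.0:
print(facility_savings_w(10, 1000))  # 20000.0 W off the utility bill
```

Which is why a 10 W TDP difference that looks trivial on a single desktop becomes real money across racks and racks of servers.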
Comment removed (Score:5, Interesting)
Re: (Score:2)
Hell, one of the Apple fanbois tried giving up ALL x86 for a month, just one month, and using nothing but his iPad and his iPhone... what happened? He gave up after a week and a half because it was hobbling him too damned much.
Do you have a link to that story? Sounds like an interesting read but my efforts at searching for it have failed.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Games are just about as good as they are going to get without new display technologies.
Hah, you're totally making that up. 3D simulation has barely scratched the surface of what is possible, and the AI of today is just a lame joke. How about audio synthesis that doesn't need vactors? How about fully interactive worlds? How about interacting with proper physics? How about hair that looks and acts like hair?
Trust me, the games you are playing today will look just as dated in ten years as the games you played ten years ago do now.
Re: (Score:2)
One problem is that Xorg is using 75%... but only of one core. Xorg is a notorious offender: a program you'd think would be very well multi-threaded but is actually single-threaded (Firefox being another, although that is gradually changing).
Re: (Score:1)
A couple of years ago there may have been a reason to run XP on a netbook, but not anymore. Almost no sites require IE and lots of apps are moving to the cloud . Do yourself a favor and install a lightweight linux distro (such as Debian with LXDE) and google Chrome. You'll immediately notice the improvement in boot times, responsiveness, and browser speed. And you'll be a lot less susceptible to malware.
I would just install Windows 7. It runs as fast as XP, and there is more software available. Also, internet video (Flash, HTML5) is garbage in Linux and cannot be played smoothly on a netbook.
Re: (Score:2)
I'll take that bet. Multiple instances of EVE Online running in XP and you were lucky to get 30-40 FPS on the primary window, with the background windows chugging along at 10-20.
Same hardware, Win7 64bit, now you're getting 60fps solid on the primary window and 30-45 on the secondary windows.
Also running Win7 on my 2007-era Thinkpad T61p. Runs nicer than WinXP did, and I do a lot more with it.
(Win7 is pretty darned g
Keep 'em Coming (Score:5, Interesting)
AMD has huge advantages in the server market; I'm really surprised people are so stuck on Xeons.
You can't cram 64 Xeon cores into a 1U. Not to mention Intel is spotty on their hardware virtualization extensions.
Intel has the lead in power consumption, sure. But if you're looking into running anything Xen, KVM or VMware in production, the cost savings AMD brings to the table makes them a competitive contender.
I'm in the market for a new Workstation. I've been looking at an Opteron instead of the desktop models. Primary reason being 16 cores on one chip, at a lower power consumption than the 8-core Desktop model.
Re: (Score:2)
Until recently, I'd been buying 100% AMD for 15 years... but AMD is so far behind that, for the first time, I bought several Intel-based servers.
Not sure what advantages you think AMD has over Intel... I would love to see a list, because frankly, it's sad to see AMD where it is.
Re: (Score:3)
AMD's advantage is lots of CPUs for cheap.
For some workloads, that is worth it. I am using some for a VDI deployment. RAM is the limiting factor on how many desktops I can host, not CPU and not disk, because I went all-SSD.
Re: (Score:3)
RAM is the limiting factor on how many desktops I can host not CPU and not disk
Surely in that case it's also worth going for AMD, since you also get excellent value in terms of DIMM sockets. If CPU is really not the limiting factor, you could get a 4-way 6212 setup (are those the cheapest?). The processors are about £200 a pop, and you get 32 DIMM sockets, giving you up to 512G of RAM using 16G DIMMs. 16G DIMMs are now at the point where they are sometimes less per GB than 8G ones.
Between the cheaper moth
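Roughly, with the numbers above (the £90 DIMM price is a guess for illustration; only the CPU price and socket counts come from the comment):

```python
# Back-of-envelope: cost per GB of a RAM-bound VDI host.
def cost_per_gb(cpu_cost, cpu_count, dimm_cost, dimm_gb, dimm_slots):
    total = cpu_cost * cpu_count + dimm_cost * dimm_slots
    return total / (dimm_gb * dimm_slots)

# 4x Opteron 6212 at ~£200 each, 32 slots of 16 GB DIMMs at a
# hypothetical ~£90 each -> 512 GB total:
print(cost_per_gb(200, 4, 90, 16, 32))  # 7.1875 (£ per GB)
```

When RAM is the bottleneck, the metric that matters is pounds per GB of populated capacity, and cheap many-socket CPUs barely move it.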
Re:Keep 'em Coming (Score:4, Insightful)
- Core density
- Virtualization extension on all Opteron chips (and now most desktop chips, even the A6-4455M in my laptop)
Not all Xeons have hardware virtualization. Only some of the most expensive chips have it, and even then, it can be spotty.
Bottom line, AMD wins in virtualization/"cloud" market (and supercomputing).
Re: (Score:2)
Augh. Damn me for using my phone... That message should have been for parent.
Sorry about that.
Re: (Score:2)
- A core on an AMD system has about the same performance as a thread on a Xeon. So Opteron and Xeon are equivalent there.
- The E3 -- the entry-level, cheapest class of Xeon chips -- has hardware virtualization. I don't think it's on every chip, but it's really not hard to pull up Intel's site and see if a specific chip supports it.
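On Linux you don't even need Intel's site: the flags line of /proc/cpuinfo tells you. A minimal sketch (it parses a flags string rather than reading the file, so it runs anywhere; the flag strings in the example are illustrative):

```python
# Sketch: detect hardware virtualization support from CPU flags.
# "vmx" = Intel VT-x, "svm" = AMD-V. On Linux these appear in the
# "flags" line of /proc/cpuinfo.
def has_hw_virt(flags_line):
    flags = set(flags_line.split())
    return bool(flags & {"vmx", "svm"})

print(has_hw_virt("fpu vme de pse tsc msr vmx sse sse2"))  # True
print(has_hw_virt("fpu vme de pse tsc msr sse sse2"))      # False
```

In practice you would feed it the real line, e.g. `grep ^flags /proc/cpuinfo`.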
Re: (Score:2)
Not all Xeons have hardware virtualization. Only some of the most expensive chips have it, and even then, it can be spotty.
Not true. Every Xeon since 2006 has shipped with VT-x support. Look at the Xeon 5030 [intel.com] for example. Absolute bottom of the line ($150 at launch) from 2006, and it supports VT-x.
You're probably thinking of Intel's desktop line, where they do artificially hobble large swaths of their CPUs with respect to VT-x.
Re: (Score:3)
Until recently, I'd been buying 100% AMD for 15 years... but AMD is so far behind that, for the first time, I bought several Intel-based servers.
Not sure what advantages you think AMD has over Intel... I would love to see a list, because frankly, it's sad to see AMD where it is.
That's easy: core density per dollar in the same n rack units.
For the same number of dollars, I can buy more real estate on which to run my virtualization stacks with AMD processors than with Intel processors. And the savings extend beyond the hardware too. With more cores per socket, my VMware licensing costs (per core or per VM, however you want to break it out) can be much lower with AMD processors in those sockets. So cheaper CAPEX (hardware and license costs) and cheaper OPEX (support subscription costs) ma
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
And 45W parts at the desktop are *very* nice in terms of noise. Even with the stock CPU fan, it makes for a very quiet desktop.
Re: (Score:2)
The number one AMD advantage to me was 64 cores and 128GB on a SuperMicro board in a decent chassis for $9k. The equivalent Intel machine does have more cores, but costs as much as around five AMD machines.
Re: (Score:2)
Also, when the price difference is more than 2x, the price comparison still wins even if you compare threads to cores, so I see your nitpick as misleading. Is it deliberately so, or do you just know very little about the subject and made a mistake?
Re: (Score:2)
Re: (Score:3)
I'm in the market for a new Workstation. I've been looking at an Opteron instead of the desktop models.
Be careful. Strange workstation vendors have decided that if you're blowing $5k on a huge workstation with 2 sockets, then naturally you don't mind if it sounds like it's being powered by a gas turbine located inside the case. Oh, and the exhaust of the gas turbine is then fed into a flugelhorn or vuvuzela, just in case you are hard of hearing.
I really don't know why. I have purchased a 3 GPU machine which dissi
Re: (Score:3)
I'd probably go for a single-socket, 16-core Opteron on a Supermicro or Tyan standard ATX board I can plop into my existing chassis. Supermicro will need a breakout cable for the front panel buttons, but no big deal. In this situation I can fit most sizes of heatsink; I just need to be sure it will fit on the socket.
Re: (Score:2)
Re: (Score:2)
Do your 'workstations' have intel xeon cpus or amd cpus?
The loud workstation I'm referring to was an AMD one. The quiet GPU monster was an Intel Core i7, so it didn't have ECC, but neither did the 3 graphics cards which were being used for computation, so it was a bit of a wash, really. And it actually needed proper graphics cards, since it relied heavily on the texture sampling unit for the computations.
That's the other nice thing about AMD is that you can get cheap machines with ECC: the standard single s
Re: (Score:1)
Re: (Score:2)
I think they do now. Or at least the computation cards do. I don't think they have them on the consumer level graphics cards. It turned out that the consumer graphics cards were better for our needs since they had more and faster texture sampling units. Since the workload was image processing, that was necessary to get good performance. The computation cards tend not to be so good if you're doing graphics or graphics-like workloads as they dedicate more resources to floating point and fewer to pixel specifi
Re: (Score:2)
I don't think they have them on the consumer level graphics cards. It turned out that the consumer graphics cards were better for our needs since they had more and faster texture sampling units.
I recently bought a Radeon HD 7850 2GB card that I'm pretty sure has ECC. Cost was $200. It's pretty cool - if you overclock the memory too high, the card doesn't start crashing your games or anything, it just doesn't run any faster.
To my knowledge, all graphics cards with GDDR5 memory have ECC.
Re: (Score:2)
I've been looking at custom building a server at home and when looking in the 64GB-256GB of ram with single-dual socket 8-32 threads, Intel wins in price, performance, and power consumption.
Choosing between an Intel Xeon i5-class 3.3GHz quad with HT and a 65W TDP, and an Opteron Bulldozer 4-module/8-core at 2.8GHz with a 125W TDP and lower IPC, at almost identical prices, isn't even a choice.
Re: (Score:2)
That's why you put the noisy little buggers in a server room and keep the door closed.
As a relatively early adopter, my 48-core AMD is in 5U, but you can get the newer 4-socket 16-core ones in 1U. I've got a pile of twin 8-core systems in 1U from a few years ago that put out about the same amount of heat as a recent 64-core system would, and they are like hairdryers.
Re: (Score:2)
Re: (Score:2)
The hype makes it look like a crippling, stupid hack like the NetBurst stuff, but the reality is a fairly simple tradeoff that's not going to make any difference to what
Re: (Score:2)
Re:Keep 'em Coming (Score:5, Informative)
Pizza boxes (1U) don't offer hot-swappable HD bays
Supermicro would disagree with you
Re: (Score:1)
Supermicro is crazy. They'll cram 4 nodes in 2U [supermicro.com] with an Infiniband interconnect.
Re: (Score:3)
Supermicro would disagree with you
Not to diss Supermicro, they make nice boxes, but I've had hot-swap 1U gear from other manufacturers for over a decade.
Methinks the GP just doesn't know the market.
Re: (Score:2)
Pizza boxes (1U) don't offer hot-swappable HD bays, making them less than ideal for your VMware server.
Although having hot-swap bays in an ESX(i) server is nice, one of the whole points of the VMware infrastructure is that you can take a physical machine down for maintenance without affecting any services.
You'd still have some sort of redundant disk (RAID-1 at least) on the ESX server, and if a drive fails, you just migrate all the VMs to a different server, replace the failed drive, let the RAID rebuild while nothing is running on that machine (which means the rebuild can be given high priority and complete more
Re: (Score:2)
Hotswap absolutely exists in 1U. 4x2TB HDDs in RAID-10 will provide the best availability:cost:performance ratio. I have not fiddled with 3 or 4TB drives yet.
2U would probably be ideal in a 4-way setup, for airflow, indeed. But if you need number crunching, 1U offers better density.
Heat should not be an issue with proper active cooling.
Re: (Score:2)
Re: (Score:2)
Either a 2nd drive will die before you get a chance to swap out the bad one, or you'll be an idiot and pull the wrong drive.
If all you have is 4 bays, you should mirror across the first 3 and use the 4th as your hot spare.
Re: (Score:2)
RAID-6 can tolerate any 2 drive failures; RAID-10 can tolerate 2 as long as they're not in the same mirror pair.
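Strictly, RAID-6 tolerates any two failures, while RAID-10 tolerates two only if they land in different mirror pairs. A quick sketch of that distinction (the pair layout is illustrative):

```python
# Sketch: RAID-10 survives multiple failures only as long as no
# mirror pair loses both of its members.
def raid10_survives(failed, pairs):
    return not any(set(p) <= set(failed) for p in pairs)

pairs = [(0, 1), (2, 3)]               # 4 drives, mirrored in pairs
print(raid10_survives([0, 2], pairs))  # True: one drive per pair
print(raid10_survives([0, 1], pairs))  # False: a whole mirror died
```

So with 4 drives, RAID-10's "2 drive" tolerance is probabilistic, not guaranteed.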
Re: (Score:1)
ESXi could also already be embedded on the motherboard or installed on an internal USB drive or SD card.
Having hot-swappable hard disks, today, is not really necessary.
Re: (Score:2)
That isn't what that link is saying. It's saying they are comparable.
Re: (Score:2)
Re: (Score:3)
Did you actually read the article? It says the 8-core AMD FX-8350 has performance similar to the 4-core Intel Core i5-3570K. In most of the tests, AMD actually performed worse.
So where's the disconnect? If words are too hard for you, just go look at the graphs.
Re: (Score:2)
Did you actually read the article? It says the 8-core AMD FX-8350 has performance similar to the 4-core Intel Core i5-3570K. In most of the tests, AMD actually performed worse.
So where's the disconnect? If words are too hard for you, just go look at the graphs.
Wake up. Two years ago Intel went on a media blitz dismissing parallelism and GPGPUs. Behind the scenes, they were spending billions catching up in an attempt to cut AMD off at the knees, because they know AMD has a huge lead with its new architecture. Software from all walks of life is moving to parallelism and OpenCL. The latest updates in the OpenCL 1.2 API are a huge leap, and AMD, with Apple, is leading the charge. Intel is busting itself, and come next year Xeon Phi will be crammed down everyone's throat via ma
Re: (Score:2)
Re: (Score:2)
AMD SUcks (Score:5, Funny)
Everyone knows that Intel is better, and competition in the CPU market is not a good thing. I hope AMD goes out of business soon, so that Intel can lower the price of their chips.
Re: (Score:1, Insightful)
Everyone knows that Intel is better, and competition in the CPU market is not a good thing. I hope AMD goes out of business soon, so that Intel can lower the price of their chips.
Yea, because usually when a company has no competition they lower prices. Happens all the time.
Re: (Score:3)
Everyone knows that Intel is better, and competition in the CPU market is not a good thing. I hope AMD goes out of business soon, so that Intel can lower the price of their chips.
Yea, because usually when a company has no competition they lower prices. Happens all the time.
Woooosh!
Re: (Score:2)
If I read what you wrote as a sarcastic statement intended to mean the opposite of what you wrote, you come off sounding a lot more intelligent. Intel does have a leg up at the moment, but everything else you said is incorrect and misguided. Here's hoping competition stays alive, socketed CPUs stay around, and AMD has a long life ahead of it (one it earned, of course).
Re: (Score:3, Insightful)
...competition in the CPU market is not a good thing. I hope AMD goes out of business soon, so that Intel can lower the price of their chips.
What? Competition drives innovation and lowers prices. It happened with AMD's Athlon killing the old Netburst P4s. It happened with x64 killing IA-64. Why would AMD leaving the market "let" Intel lower CPU prices?
Oh, I'm sorry, you're just a troll, without the possibility of reasonable discourse or fair and reasoned debate. Forgive my oversight.
Re: (Score:2)
I wish I had mod points. This is the funniest thing I've read on here in a while. It goes above and beyond trolling.
Re: (Score:2)
You got some nice bites there, but I think you could have trolled harder.
7/10
1.25v DDR3, but CPU efficiency... (Score:3)
Okay, so they're the only x86 CPU vendor offering 1.25V DDR3 support, but the difference between a pair of 1.25V and 1.5V DIMMs is around 4 W [tomshardware.com], and you can save 3 of those 4 W by moving to the commonly available 1.35V DDR3. Meanwhile AMD keeps putting out 125W processors like the FX-8350 that don't really compete with a 77W processor like the i7-3770K, so this "major datacenter advantage" I think I'll file under "major wishful thinking". Not to mention you're investing in a platform with little future, since AMD wants to push ARM servers now. But I guess Intel has let AMD put a positive spin on continuing to deliver on old sockets.
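Taking the comment's own per-pair numbers at face value (the DIMM counts and rack sizing below are purely illustrative):

```python
# Rough numbers from the comment: ~4 W saved per DIMM pair going
# 1.5 V -> 1.25 V, and ~3 W of that going 1.5 V -> 1.35 V instead.
def rack_dimm_savings_w(w_per_pair, pairs_per_server, servers):
    return w_per_pair * pairs_per_server * servers

# Hypothetical rack: 8 DIMM pairs per box, 40 boxes:
print(rack_dimm_savings_w(4, 8, 40))  # 1280 W with 1.25 V parts
print(rack_dimm_savings_w(3, 8, 40))  # 960 W with 1.35 V parts
```

Which is the poster's point: commodity 1.35V DDR3 already captures most of the saving, so the 1.25V exclusive buys surprisingly little.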
Re:1.25v DDR3, but CPU efficiency... (Score:5, Interesting)
Huh?
The i7-3770K has a TDP of 95W. And the FX-8350 is a very good chip and much cheaper than the i7. The benchmarks relative to the i7 are all over the place. In most cases it sits somewhere between the i5 and the i7. In some cases it is destroyed by the i7; in other cases, the reverse is true. The single-threaded performance is quite weak and usually substantially less than the i5's, but then the i5-to-i7 difference isn't enormous. The difference from the FX-8350 to the i7 seems to be around 20-50% in most cases.
Curiously the AMD processors tend to stack up better on the Linux benchmark suites.
Anyway.
This thread is about the Opteron processors, which are still (a) competing against SB and (b) benefiting from substantially cheaper full-system costs; and (c) you aren't terribly sensitive to single-thread performance if you're buying a 4-socket server.
Not to mention you're investing into a platform with little future
What does that even mean? It's all x86, so even if AMD vanishes tomorrow you can keep using the servers and then transition to Intel when you need new ones. The whole point of having more than one vendor is that no matter what, you're not investing in a platform with no future.
Re: (Score:2)
The i7 3770K has a TDP of 95W
Intel's website says 77 W [intel.com] while various [techpowerup.com] websites [flyingsuicide.net] say the retail packaging says 95 W. Weird.
Re:1.25v DDR3, but CPU efficiency... (Score:5, Interesting)
The i7 3770K has a TDP of 95W.
I know that, at least in the past, Intel used to issue TDP numbers that represented "typical" heat, while AMD used to issue TDP numbers that represented worst-case heat (which is what TDP ought to be IMHO). I have read here on Slashdot that more recently, AMD has started playing those games as well.
But according to NordicHardware [nordichardware.com], in this case Intel is under-promising and over-delivering, and the chips really do dissipate only 77W despite being rated for 95W. (But how did they measure that? Is this a "typical" 77W? I guess it's not that hard to run a benchmark test that should hammer the chip and get a worst-case number that way.)
Curiously the AMD processors tend to stack up better on the Linux benchmark suites.
This is probably because Linux benchmarks were compiled with GCC or Clang rather than the Intel compiler. The Intel compiler deliberately generates code that makes the compiled code run poorly on non-Intel processors. The code checks the CPU ID, and the code has two major branches: the good path, which Intel chips get to run, and the poor path, which other chips run.
http://www.agner.org/optimize/blog/read.php?i=49 [agner.org]
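The pattern Agner Fog describes boils down to keying the dispatch on the CPU vendor string instead of the actual feature flags — a hypothetical sketch of the idea, not Intel's actual code:

```python
# Sketch of the dispatch pattern described at agner.org: the runtime
# check keys on the CPU *vendor string*, not on feature flags, so a
# non-Intel chip that supports SSE2 still gets routed to the slow path.
def pick_code_path(vendor, has_sse2):
    if vendor == "GenuineIntel" and has_sse2:
        return "fast_sse2_path"
    return "generic_path"          # AMD lands here even with SSE2

print(pick_code_path("GenuineIntel", True))   # fast_sse2_path
print(pick_code_path("AuthenticAMD", True))   # generic_path
```

A fair dispatcher would test `has_sse2` alone; that one-line difference is the whole controversy.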
The irony is that Intel, by investing heavily in fab technology, is about two generations ahead of everyone else, so they can make faster and/or lower-power parts than everyone else. This means they could be competing fairly and win.
But because Intel does evil things like making their compiler sabotage their competition, I refuse to buy Intel. They have lost my business. They don't care of course, because there aren't many like me who are paying attention and care enough to change their buying habits.
If you want the fastest possible desktop computer, pay the big bucks for a top-of-the-line i7 system. But if you merely want a very fast desktop computer that can play all the games, an AMD will do quite well, and will cost a bit less. So giving up Intel isn't a hard thing to do, really.
Re: (Score:1)
AMEN!
Re: (Score:2)
I know that, at least in the past, Intel used to issue TDP numbers that represented "typical" heat, while AMD used to issue TDP numbers that represented worst-case heat (which is what TDP ought to be IMHO). I have read here on Slashdot that more recently, AMD has started playing those games as well.
At least in the past, Intel used to issue TDP numbers that ignored the chipset, while AMD would compare CPU+chipset, because Intel's chipsets would suck down power like nobody's business and AMD's wouldn't. In the early amd64 days, companies actually built (17") laptops around the desktop Athlon64 because CPU+chipset had a lower TDP than Intel's mobile offering. When you're competing against the Mobile P4, that's not a surprise. But the same situation persisted all the way through the Core 2 Duo days! Intel's
Re: (Score:1)
The 8-core models in the new Opteron series have a TDP of 65W or 95W. That comes at the expense of lower clocks than the FX-8350, but performance per watt is still better than the FX-8350's.
Looking at the 4 core models, the 3350HE may be a worthy replacement for the Athlon II X4 910e I have in my current desktop:
Four cores like the Athlon, a 2.8 GHz clock where the Athlon had 2.6 GHz, and only 45W TDP versus the Athlon's 65W. In terms of pricing, the 3350HE seems to be similar to where the
Re: (Score:2)
The i7 3770K has a TDP of 95W.
No it doesn't, but I guess if you don't have facts, use FUD. Intel has kept a "segment TDP" on the retail packaging because they want all Sandy/Ivy Bridge motherboards, coolers, etc. to support 95W processors -- the maximum in the Sandy Bridge line -- but the actual processor will never use more than 77W. This was explained here [nordichardware.com], but Intel's site and 99.9% of all reviews and online sites will list it as a 77W processor. In fact the 95W figure is so rare that the only reason to bring it up -- particularly ignoring all
Re: (Score:2)
AMD to become overpriced at some point? (Score:2)
And on the desktop side, ZDNet reports that AMD will remain committed to the hobbyist market with socketed CPUs.
On a related note, I find it quite weird that Intel willfully forfeits their own future to motherboard makers. It makes little sense to me to depend on third parties you have absolutely no control over to set the price of the final product that your current customers -- computer makers -- end up buying, irrespective of the fact that Intel itself makes motherboards. I must be missing something besides the obvious (aka it's thinner, which incidentally ensures that AMD has to do this too for laptops). Slashdo
Re: (Score:2)
Intel also makes motherboards in addition to chipsets and CPUs...
Re: (Score:2)
On a related note, I find it quite weird that Intel willfully forfeits their own future to motherboard makers. It makes little sense to me to depend on third parties you have absolutely no control over to set the price of the final product that your current customers -- computer makers -- end up buying
Intel brings out a new CPU socket every week, so nobody upgrades their Intel CPU; they can't, because the new CPU needs a new socket. AMD brings out new CPU sockets only rarely, so people do sometimes upgrade their CPUs. Intel processors appeal to the PHB market, and AMD processors appeal to the nerd market. Intel is known for being expensive. Cost-cutting makes sense, and the socket is expensive.
no faster clock Opteron 4300 (Score:2)
I was hoping AMD could release a faster workstation-level Opteron 4300 to match the FX-8350, but the top-end 4386 is still 3.1 GHz (turbo to 3.8 GHz),
and the fastest 8-core Opteron 6328 is 3.2 GHz (turbo to 3.8 GHz), though the Opteron 4386's TDP is 95W versus the Opteron 6328's 115W, while the FX-8350 runs at 4 GHz (turbo to 4.2 GHz) and consumes 125W.
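Using the base clocks, core counts, and TDPs quoted above, a crude clocks-times-cores-per-watt comparison can be run as a sanity check. This is a naive proxy I'm assuming for illustration -- it ignores IPC, turbo behavior, and workload, so treat it strictly as back-of-the-envelope:

```python
# Base clock (GHz), cores, TDP (W) from the figures quoted above.
chips = {
    "FX-8350":      (4.0, 8, 125),
    "Opteron 4386": (3.1, 8, 95),
    "Opteron 6328": (3.2, 8, 115),
}

def ghz_cores_per_watt(ghz, cores, tdp_w):
    """Naive throughput-per-watt proxy: base GHz x cores / TDP."""
    return ghz * cores / tdp_w

for name, (ghz, cores, tdp) in chips.items():
    print(f"{name}: {ghz_cores_per_watt(ghz, cores, tdp):.3f} GHz-cores/W")
```

Even on this crude metric the Opteron 4386 edges out the FX-8350, consistent with the earlier point that the new Opterons trade peak clocks for better performance per watt.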
Re: (Score:3, Funny)
Foiled again by Intel