Intel Shows Off 80-core Processor

Posted by Zonk
from the next-up-a-skillion-core-system dept.
thejakebrain writes "Intel has built its 80-core processor as part of a research project, but don't expect it on your desktop any time soon. The company's CTO, Justin Rattner, held a demonstration of the chip for a group of reporters last week. Intel will be presenting a paper on the project at the International Solid State Circuits Conference in San Francisco this week. 'The chip is capable of producing 1 trillion floating-point operations per second, known as a teraflop. That's a level of performance that required 2,500 square feet of large computers a decade ago. Intel first disclosed it had built a prototype 80-core processor during last fall's Intel Developer Forum, when CEO Paul Otellini promised to deliver the chip within five years.'" Update: 06/01 14:37 GMT by Z : This article is about four months old. We discussed this briefly last year, but a search didn't show that we'd discussed it in February.
This discussion has been archived. No new comments can be posted.

  • But... (Score:2, Funny)

    by Anonymous Coward on Friday June 01, 2007 @09:05AM (#19351077)
    Does it run Linux?
  • cue (Score:5, Funny)

    by russ1337 (938915) on Friday June 01, 2007 @09:07AM (#19351097)
    Cue the 'needed to run Vista' jokes....
  • Older Story (Score:3, Informative)

    by Maddog Batty (112434) on Friday June 01, 2007 @09:07AM (#19351105) Homepage
    Older story on this here: http://hardware.slashdot.org/article.pl?sid=06/09/26/1937237 [slashdot.org]

    Sure would be nice to have a play with it once they have worked out how to program it...
    • Re:Older Story (Score:3, Insightful)

      by timeOday (582209) on Friday June 01, 2007 @11:31AM (#19353235)

      Sure would be nice to have a play with it once they have worked out how to program it...
      It's very likely you'll be able to get one at Best Buy before they've worked out how to program it. The fact is, current programming paradigms simply aren't suited to fine-grained parallelism - and in saying that, I don't mean to imply that a suitable paradigm necessarily exists. Sure, there are many parallel research languages, but whether they could be adopted by mainstream programmers and used to achieve anywhere near linear speedup on mainstream applications is an open question. Even the PS3, which is oriented toward media applications that are relatively easy to parallelize, is getting little benefit from its measly half dozen computational units.

      But with single-thread performance growth at a virtual standstill, Moore's law is going to result in exponential growth in the number of cores, whether or not we're ready to write software for them.

      I wonder if we won't move towards a more "biological" paradigm - massive parallelism with massive redundancy, inefficient from a computational standpoint but robust to hardware and software bugs.
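
For concreteness, here is a minimal sketch of the easy case the comment above alludes to: an embarrassingly parallel workload with no shared state, split across a worker pool with Python's standard multiprocessing module. The job sizes and the 4-worker pool are arbitrary assumptions; the fine-grained case - shared state, tight synchronization - is precisely where this approach stops scaling.

    import time
    from multiprocessing import Pool

    def work(n):
        # CPU-bound kernel with no shared state: the best case for parallel speedup.
        return sum(i * i for i in range(n))

    if __name__ == "__main__":
        jobs = [2000000] * 8                  # arbitrary workload
        t0 = time.time()
        serial = [work(n) for n in jobs]
        t1 = time.time()
        with Pool(4) as p:                    # assumed worker count; match your cores
            parallel = p.map(work, jobs)
        t2 = time.time()
        assert serial == parallel
        print("serial %.2fs, parallel %.2fs" % (t1 - t0, t2 - t1))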

  • IA64 (Score:5, Insightful)

    by AvitarX (172628) <me@@@brandywinehundred...org> on Friday June 01, 2007 @09:08AM (#19351119) Journal
    I remember when IA64 was the next huge supercomputer-on-a-chip, five years off.

    It didn't work out too well for Intel.
  • by 91degrees (207121) on Friday June 01, 2007 @09:09AM (#19351123) Journal
    It's known incorrectly.

    The measurement is "FLOPS". Floating Point Operations Per Second. It's an acronym. The 'S' is part of the acronym. Hence even if you only have one of them, it's still a FLOPS. And it's capitalised.

    Strictly speaking it should be "trillion FLOPS" as well, since it's not an SI unit, but my pedantry is limited.
  • by Anonymous Coward on Friday June 01, 2007 @09:09AM (#19351127)
    can you imagine.
    • by jollyreaper (513215) on Friday June 01, 2007 @09:39AM (#19351477)

      beowulf cluster?

      can you imagine.
      Yeah, man. Or what if Intel codenamed their next processor Beowulf? *inhales, holds breath, exhales slowly, smoke twisting lazily* Can you imagine a Beowulf cluster of Beowulfs or did I just blow your mind?
      • by avronius (689343) * on Friday June 01, 2007 @11:06AM (#19352841) Homepage Journal
        If intel called the 80 cpu beast "Grendel", could it still be part of a Beowulf cluster? Or would it end up in a perpetual battle - cpu versus os - until the very fabric of the universe itself crumbled around us?
        • Re:Dude.... (Score:3, Interesting)

          by jollyreaper (513215) on Friday June 01, 2007 @11:34AM (#19353285)

          If intel called the 80 cpu beast "Grendel", could it still be part of a Beowulf cluster? Or would it end up in a perpetual battle - cpu versus os - until the very fabric of the universe itself crumbled around us?
          If you could work Grendel's mom into a Beowulf cluster of Grendels, you might just have aced the German pr0n market. For Japan, add tentacles. And if you can reverse the expected subject/verb/object order, you might have a market in Soviet Russia to boot.
  • by $RANDOMLUSER (804576) on Friday June 01, 2007 @09:10AM (#19351135)
    "Intel CEO promises to deliver magical new uber processor within five years".

    Stop me if you've heard this one before...
  • by doombringerltx (1109389) on Friday June 01, 2007 @09:10AM (#19351137)

    Intel used 100 million transistors on the chip, which measures 275 millimeters squared. By comparison, its Core 2 Duo chip uses 291 million transistors and measures 143 millimeters squared.
    Maybe it's just because I haven't had my morning coffee yet, but is that a typo?
  • by jollyreaper (513215) on Friday June 01, 2007 @09:11AM (#19351143)
    I'm sorry but when I see these competitions I always come back to this Onion piece. A classic.

    http://www.theonion.com/content/node/33930 [theonion.com]

    Fuck Everything, We're Doing Five Blades

    By James M. Kilts
    CEO and President,
    The Gillette Company

    Would someone tell me how this happened? We were the fucking vanguard of shaving in this country. The Gillette Mach3 was the razor to own. Then the other guy came out with a three-blade razor. Were we scared? Hell, no. Because we hit back with a little thing called the Mach3Turbo. That's three blades and an aloe strip. For moisture. But you know what happened next? Shut up, I'm telling you what happened--the bastards went to four blades. Now we're standing around with our cocks in our hands, selling three blades and a strip. Moisture or no, suddenly we're the chumps. Well, fuck it. We're going to five blades.

    Sure, we could go to four blades next, like the competition. That seems like the logical thing to do. After all, three worked out pretty well, and four is the next number after three. So let's play it safe. Let's make a thicker aloe strip and call it the Mach3SuperTurbo. Why innovate when we can follow? Oh, I know why: Because we're a business, that's why!

    You think it's crazy? It is crazy. But I don't give a shit. From now on, we're the ones who have the edge in the multi-blade game. Are they the best a man can get? Fuck, no. Gillette is the best a man can get.

    What part of this don't you understand? If two blades is good, and three blades is better, obviously five blades would make us the best fucking razor that ever existed. Comprende? We didn't claw our way to the top of the razor game by clinging to the two-blade industry standard. We got here by taking chances. Well, five blades is the biggest chance of all.

    Here's the report from Engineering. Someone put it in the bathroom: I want to wipe my ass with it. They don't tell me what to invent--I tell them. And I'm telling them to stick two more blades in there. I don't care how. Make the blades so thin they're invisible. Put some on the handle. I don't care if they have to cram the fifth blade in perpendicular to the other four, just do it!

    You're taking the "safety" part of "safety razor" too literally, grandma. Cut the strings and soar. Let's hit it. Let's roll. This is our chance to make razor history. Let's dream big. All you have to do is say that five blades can happen, and it will happen. If you aren't on board, then fuck you. And if you're on the board, then fuck you and your father. Hey, if I'm the only one who'll take risks, I'm sure as hell happy to hog all the glory when the five-blade razor becomes the shaving tool for the U.S. of "this is how we shave now" A.

    People said we couldn't go to three. It'll cost a fortune to manufacture, they said. Well, we did it. Now some egghead in a lab is screaming "Five's crazy?" Well, perhaps he'd be more comfortable in the labs at Norelco, working on fucking electrics. Rotary blades, my white ass!

    Maybe I'm wrong. Maybe we should just ride in Bic's wake and make pens. Ha! Not on your fucking life! The day I shadow a penny-ante outfit like Bic is the day I leave the razor game for good, and that won't happen until the day I die!

    The market? Listen, we make the market. All we have to do is put her out there with a little jingle. It's as easy as, "Hey, shaving with anything less than five blades is like scraping your beard off with a dull hatchet." Or "You'll be so smooth, I could snort lines off of your chin." Try "Your neck is going to be so friggin' soft, someone's gonna walk up and tie a goddamn Cub Scout kerchief under it."

    I know what you're thinking now: What'll people say? Mew mew mew. Oh, no, what will people say?! Grow the fuck up. When you're on top, people talk. That's the price you pay for being on top. Which Gillette is, always has been, and forever shall be, Amen, five blades, sweet Jesus in heaven.

    • by liquidpele (663430) on Friday June 01, 2007 @09:47AM (#19351613) Journal
      Haha....
      You'll love this then.
      octaginator 8 blades [youtube.com]

      -Reece
    • by Traa (158207) on Friday June 01, 2007 @10:52AM (#19352633) Homepage Journal
      I see your TheOnion piece, and raise you a Dave Barry!

      http://www.washingtonpost.com/ac2/wp-dyn/A61952-2003Jul15 [washingtonpost.com]

      Blade Inflation
      By Dave Barry

      What's next from the razor-sharp minds of the shaving industry?

      Attention, consumers with bodily hair: The razor industry has news for you! You will never in a million years guess what this news is, unless your IQ is higher than zero, in which case you're already thinking: "Not another blade! Don't tell me they're adding ANOTHER BLADE!!"

      Shut up! Don't spoil the surprise for everybody else!

      Before I tell you the news, let's put it in historical context by reviewing:

      THE HISTORY OF SHAVING

      Human beings are one of only two species of animals that shave themselves (the other one is salamanders). The Internet tells us that humans have been shaving since the Stone Age. Of course, the Internet also tells us that hot naked women want to befriend us, so we can't be 100 percent sure about everything we read there.

      But assuming that www.quikshave.com/timeline.htm is telling the truth, Neanderthal Man used to pluck his facial hairs "using two seashells as tweezers." No doubt Neanderthal Woman found this very attractive. "You smell like a clam," were probably her exact words. It was during this era that the headache was invented.

      By 30,000 B.C., primitive man was shaving with blades made from flint, which is a rock, so you had a lot of guys whose faces were basically big oozing scabs. The next shaving breakthrough came when the ancient Egyptians figured out how to make razors from sharp metal, which meant that, for the first time, the man who wanted to be well-groomed could, without any assistance or special training, cut an ear completely off.

      This was pretty much the situation until the late 19th century, at about 2:30 p.m., when the safety razor was invented. This introduced a wonderful era known to historians as "The Golden Age of Not Having Razor Companies Introduce Some Ludicrously Unnecessary New Shaving Technology Every 10 Damn Minutes."

      I, personally, grew up during this era. I got my first razor when I was 15, and I used it to shave my "beard," which consisted of a lone chin hair approximately one electron in diameter. (I was a "late bloomer" who did not fully experience puberty until many of my classmates, including females, were bald.) My beard would poke its wispy head out of its follicle every week or so, and I, feeling manly, would smother it under 14 cubic feet of shaving cream and lop it off with my razor. Then I would stand in front of the bathroom mirror, waiting for it to grow again. Mine was a lonely adolescence.

      The razors of that era had one blade, and they worked fine; ask any older person who is not actively drooling. But then, in 1971, a very bad thing happened: Gillette, looking for a way to enhance the shaving experience (by which I mean "charge more") came out with a razor that had TWO blades. This touched off a nuclear arms race among razor companies, vying to outdo one another by adding "high-tech" features that made the product more expensive, but not necessarily better. This tactic is called "sneakerization," in honor of the sneaker industry, which now has people paying upwards of $200 a pair for increasingly weird-looking footwear boasting the durability of thinly sliced Velveeta.

      Soon everybody was selling two-blade razors. So the marketing people put on their thinking caps, and, in an astounding burst of creativity, came up with the breakthrough concept of: THREE BLADES. Gillette, which is on the cutting edge (har!) of razor sneakerization, currently has a top-of-the-line three-blade razor -- excuse me, I mean "shaving system" -- called the "Mach3Turbo," which, according to the Gillette Web site (www.gillette.com) has more technology than a nuclear submarine, including "open cartridge architecture" and an "ergonomic handle" featuring "knurled ela

  • by CajunArson (465943) on Friday June 01, 2007 @09:13AM (#19351167) Journal
    It must be a really slow news day. From the dateline:

    Published: February 11, 2007

    Not to mention that Slashdot (even Zonk) covered this LAST YEAR [slashdot.org].
    But that's OK, I'm sure Slashdot gave insightful and cogent coverage of real events that actually matter to geeks on this site, you know, like the release of a new major version of GCC [gnu.org]
    Oh wait.... that (like a bunch of other actually interesting stories) would be in the aptly-named "sir not appearing on this website" category, due to it not making enough banner revenue.
  • by simong (32944) on Friday June 01, 2007 @09:14AM (#19351181) Homepage
    oh.
  • AMD's response (Score:4, Insightful)

    by drgonzo59 (747139) on Friday June 01, 2007 @09:17AM (#19351201)
    This is a nice move by Intel. I wonder what AMD's plans are...81 cores?

    Besides, with most software being single-threaded I don't know if a consumer will need more than 4 cores for a while. I can still see software companies trying to come up with ways to keep all 80 cores busy..."Well, they need at least 20 anti-virus processes, 10 genuine advantage monitors, and we'll install 100 shareware applications with cute little icons in the task bar by default. There, that should keep all the cores nice and warm and busy -- our job is done!".

    But in all seriousness, I would expect some extremely realistic environmental physics simulations (realtime large n-body interactions and perhaps realtime computational fluid dynamics)...now that's something to look forward to! (A toy sketch of the n-body case follows at the end of this thread.)

    • I think one of the possible uses for an 80-core CPU that nobody's really talked about is multiple redundancy. If one core should somehow get fried, you have 79 left.

      • by drgonzo59 (747139) on Friday June 01, 2007 @09:38AM (#19351457)
        If that's the case, might as well connect all the unused heatsinks to a griddle and I'll fry my bacon and eggs on them in the morning. If some of them burn up or get too hot it's "ok" - I've got 60 others waiting...

        Or... you could just have two and order a CPU to be delivered when one burns up. In fact that's what happens on some mainframes. If a part fries, the machine calls "home" and the company sends a replacement immediately. Sometimes the administrator will only find out something went bad when the replacement part has already arrived at the door.

      • by Tony Hoyle (11698) <tmh@nodomain.org> on Friday June 01, 2007 @10:46AM (#19352545) Homepage
        Virtualisation - dual and quad core doesn't really cut it for massive levels of virtualisation... you want to reserve at least a core for your host OS, then you've got to divvy up the rest. With dual core that means you're down to having no SMP in your VMs (sucks if they're compile boxes); with quad core that's only 3... get 20-30 machines in there and it's starting to look shaky. 80 cores would scale to hundreds of virtual machines without any particular slowdown.
    • by ciroknight (601098) on Friday June 01, 2007 @09:38AM (#19351459)
      AMD's response was buying ATi in order to work on their future chip, "Fusion", which will somehow incorporate a GPU-type accelerator on-die, or at least in-package, with a traditional x86 CPU.

      GPUs already have "many cores", if you can really call them that: 16 ROPs, 80 texture units, 64 shader cores, etc. Intel's approach is a much simpler architecture (in fact, "too simple" right now - the cores are practically featureless), but DAAMIT's approach makes more business sense (re-use what we've already got vs. invent something new).
      • by drgonzo59 (747139) on Friday June 01, 2007 @09:45AM (#19351593)
        That makes sense. GPUs sometimes have a higher transistor count than CPUs...

        The problem with Fusion is if they kill off add-on graphics and you just buy one Fusion processor that costs, say, $400 to plug into the CPU slot. In the meantime, NVIDIA releases their new generation board and Intel releases a new generation CPU. That consumer can choose to upgrade one or the other or both, but an AMD customer is stuck with just one expensive part and would have to upgrade it as one piece.

        Perhaps in the future the CPU, the graphics card and the memory will all be on one giant module. You get high performance but not the ability to customize individual components.

    • by Hoi Polloi (522990) on Friday June 01, 2007 @10:36AM (#19352411) Journal
      Kilts becomes AMD CEO:

      81 cores? Fuck that! We're going to 100 cores and putting a goddamn window on the CPU so all of the fan boys can watch the electrons flow. Then we're going to put the ethernet connection DIRECTLY on the chip. Ya, you heard me right, a connector right on the damn chip. You disagree? Great, I need your soon to be empty cube to store my prototypes you pussy.

      Wait! Brace yourself, I've got another one. A speaker slapped onto the CPU. You hear that? That is the sound of genius and of every command to the CPU being broadcast. The bling-bling crowd will eat that shit up like a cupcake at a fat camp. Let Intel suck on THAT for a while.
    • by neurojab (15737) on Friday June 01, 2007 @11:44AM (#19353451)
      Besides, with most software being single-threaded I don't know if a consumer will immediately need more than 4 cores for a while. I can still see software companies trying to come up with ways to keep all 80 cores busy

      You're ignoring the server market, where the applications tend to be highly multi-threaded, and it's not difficult at all to keep your CPUs busy.
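
Picking up the n-body point from drgonzo59's comment above, a toy sketch of why that workload maps so naturally onto many cores: each body's force sum reads the shared positions but writes only its own slot, so the outer loop can be split across cores with no locking. This is an illustrative 2D model with a made-up softening constant, not any particular production code.

    # Toy O(n^2) gravitational force pass in 2D.
    EPS = 1e-9  # softening term (assumed) to avoid division by zero

    def forces(pos, mass):
        out = []
        for i in range(len(pos)):           # each iteration is independent:
            fx = fy = 0.0                   # a natural unit of parallel work
            xi, yi = pos[i]
            for j in range(len(pos)):
                if i == j:
                    continue
                dx, dy = pos[j][0] - xi, pos[j][1] - yi
                r2 = dx * dx + dy * dy + EPS
                f = mass[i] * mass[j] / r2  # magnitude (G omitted)
                r = r2 ** 0.5
                fx += f * dx / r
                fy += f * dy / r
            out.append((fx, fy))
        return out

    print(forces([(0.0, 0.0), (1.0, 0.0)], [1.0, 1.0]))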

  • Ob (Score:2, Funny)

    by rlp (11898) on Friday June 01, 2007 @09:26AM (#19351301)
    In Soviet Russia, Intel's 80-core processor imagines a Beowulf cluster of you!
  • by tomstdenis (446163) <tomstdenis AT gmail DOT com> on Friday June 01, 2007 @09:34AM (#19351409) Homepage
    and all that is holy on this sacred Earth ...

    This isn't a general purpose processor. Think "cell processor" on a larger scale. You wouldn't be running your firefox or text editor on this thing. You'd load it up and have it do things like graphics processing, ray tracing, DSP work, chemical analysis, etc...

    So stop saying "we already don't have multi-core software now!!!" because this isn't meant for most software anyways.

    Tom
    • by ciroknight (601098) on Friday June 01, 2007 @09:44AM (#19351561)
      "This isn't a general purpose processor."

      ...Yet. You're right that these cores are incredibly simplistic - so much so that they make DSPs look functional - but what's really going on here is a science project to develop the on-chip network, not so much the CPU cores. Intel envisions lifting the networking component out of this design and applying it to various different cores, so that a general computing core can be mixed in with DSP cores and other "Application Specific Accelerator" cores.

      So no, you're not going to be running Firefox or your text editor on this model (in fact, I doubt you even _could_ - these cores are currently very, very stripped down in their capacity to do work, to the point where they're basically two MACs tied to a small SRAM and a "network adapter"), but never say never: this style of chip is right around the corner.
  • Not useful yet... (Score:4, Interesting)

    by CockroachMan (1104387) on Friday June 01, 2007 @09:39AM (#19351471)
    It's useless to keep putting more cores into a processor when we still don't have a decent parallel programming paradigm.

    80 cores is an absurd number; with the parallelism level that we have in today's programs, most of the cores would be idle most of the time.
    • by ciroknight (601098) on Friday June 01, 2007 @09:57AM (#19351765)
      I really don't understand this idea that we "can't use multiple cores yet" because we don't have some magical, mythical necessary programming model that will make this come alive instantly. The fact is, we've had the necessary models for decades now. Adding multiple cores doesn't necessarily mean we need to change our programs at all, but rather it means we need to change our Operating Systems.

      To the point: right now, we typically schedule applications to run on time-slices, to virtually expand one processor to every single process we have running. With multi-core, we need to change the way we schedule to be more granular, and to assign better core affinity (to the point where we can address specific cores directly from the operating system, always running the same application on the same core). Every task gets its own core, and can then use threading (or spawning another process) to request/force more core-time if necessary.

      Ordinary desktops have been parallel for a long, long time; we've just hidden it from the users and from the programmers because it's hideously complex when it comes to timing and scheduling. The whole idea of the operating system was to hide this complexity from the users to begin with, and to put it instead on the shoulders of smarter, better software that has been combed over and refined. To the point: our OSes have emulated parallel machines because we didn't have parallel machines - real-time (or at least near-real-time) multitasking would be impossible without it. Now we have parallel machines, and we can stop emulating, or at least minimize our need to. (A minimal core-affinity sketch follows this thread.)
    • by Bastard of Subhumani (827601) on Friday June 01, 2007 @10:42AM (#19352497) Journal

      80 cores is an absurd number
      Eight of these processors ought to be enough for anyone.

      "He never really said that" posts in 5,4,3...
  • by Colin Smith (2679) on Friday June 01, 2007 @09:40AM (#19351487)
    A transputer [wikipedia.org] which was around in the 80s?

    Hmm, has it really taken 20 years of research, or... *wonders*... 20 years later... patents...?
  • perl -e 'fork while 1;'


    There ya go. Think of it as a benchmark. How long can the 80-core processor run that without dying?

  • by flyingfsck (986395) on Friday June 01, 2007 @10:32AM (#19352351)
    I guess that eventually they'll just fire up a whole 300mm wafer. It'll be cookin...
  • by Doc Ruby (173196) on Friday June 01, 2007 @11:08AM (#19352859) Homepage Journal

    [a teraflop is] a level of performance that required 2,500 square feet of large computers a decade ago.

    Over a decade and a half ago, in 1990, I programmed parallel AT&T DSP32C boards (multiple DSPs per ISA board in an 80386 host). Up to 5 25GFLOPS chips on a 125GFLOPS board, up to 8 boards in a 1TFLOPS host PC. That PC, nearly double the "decade ago", had over 1TFLOPS (including its FPGA glue) in about 3 square feet.

    And it actually ran applications (commercial image processing) in the field. This Intel chip might be smaller than 3 sq ft, but it still needs to run in the same size PC, and it doesn't run any commercial apps. Even 17 years later, with twice the cores we had in 1990.

    Sure, the past 17 years hasn't seen our parallelism innovations become common (though finally digital cameras are catching up to our 16Mpxl, but not at 40bit color). Because the same problem we didn't solve in general, parallel programming semantics and debugging that reuse existing codebase and techniques, is still hard. But if we'd discarded the uniprocessor codebase then, or just ignored it as we built a new, parallel codebase, we'd have 17 years of code now that would be "legacy" which didn't largely lock us, and our thinking, out of simple multiprocessing development.

    So I hope that Intel's multicore development is really just a platform for developing parallel coding systems. I'd love to see all that Intel money and brains put behind "executable UML", or some other flow "language", by inventing "lossless" lexical/graphical interconverters and expression standards. But they'll probably spend it all on marketing and emulating an 80386 to run Windows. Because that's the part of our 1990 platform that can be reinvented and resold as "brand new", just like it was back then, without taking much of a risk.
  • by Joe The Dragon (967727) on Friday June 01, 2007 @11:11AM (#19352919)
    Also, does it have a RAM controller built in?
  • by Nom du Keyboard (633989) on Friday June 01, 2007 @11:19AM (#19353047)
    My concern isn't that they can cram 80 processors of some capability onto a single chip (how many i486 processors could you put on the current Core 2 Duo die and transistor budget at 65/45nm?), but how can you feed enough data on and back off the chip to keep these processors running at near full speed? 8000 processors on a chip are worthless if they can't get to their data.

    And I'm not impressed by the Flop rate. Not with the Cell Processor already out for a year.

  • by dj42 (765300) on Friday June 01, 2007 @11:49AM (#19353535) Journal
    Well, those shitty, basic computers that took up big rooms, remember those? No? Ok, well, if those were still here, this thing would be like 90239820 times smaller, cool huh? How many of those are we going to have to hear before we come up with some new kind of comparison? You know how fast a woman can plot a route around a detour using a map in a big city? Yeah? Well, this shit is like 939203902093902093092093 times faster.
  • by nbritton (823086) on Friday June 01, 2007 @11:59AM (#19353709)
    I'd love to know how they managed to scale up to 80-way x86. IIRC, anything past 8-way was diminishing returns?
  • by peter303 (12292) on Friday June 01, 2007 @12:06PM (#19353841)
    With high end processor power consumption approaching the better part of a kilowatt, perhaps toasting speed could be a measure.
  • by Assassin bug (835070) on Friday June 01, 2007 @12:46PM (#19354521) Journal
    Of course we won't see an 80-core processor any day soon! Intel will produce "new chips" every couple of years and just add cores. I suspect they have done this all along (e.g., the Pentium series). That is, it makes sense from a capitalist standpoint to throttle your technology to generate or maintain a certain level of demand. I have no evidence for this, but as cynical as it sounds, it sure makes sense.
  • by rrohbeck (944847) on Friday June 01, 2007 @01:35PM (#19355307)
    ought to be enough for anybody.

    Seriously, why don't we put the cores into the memory DIMMs? It's been tried before but now it seems that a CPU core is just a little commodity thing, and memory bandwidth is where the bottleneck is.

  • by Skapare (16644) on Friday June 01, 2007 @01:41PM (#19355403) Homepage

    At 3.16GHz and with 0.95 volts applied to the processor, it can hit 1 teraflop of performance while consuming 62 watts of power.

    Yikes, that's over 65 amps! Well, OK, it's less than 1 amp per core.
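
Checking the arithmetic with the article's figures (62 W, 0.95 V, 80 cores, 3.16 GHz, 1 teraflop): current is power divided by voltage, and the same numbers imply roughly 4 floating-point operations per cycle per core.

    # Back-of-envelope check using the figures quoted above.
    power_w, volts, cores, clock_hz = 62.0, 0.95, 80, 3.16e9
    amps = power_w / volts                       # ~65.3 A total
    print("%.1f A total, %.2f A per core" % (amps, amps / cores))
    print("%.1f FLOPs/cycle/core" % (1e12 / (cores * clock_hz)))   # ~4.0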

  • by Khyber (864651) <techkitsune@gmail.com> on Friday June 01, 2007 @04:15PM (#19357839) Homepage Journal
    Nobody believed me when I mentioned in a previous story comment that I was playing with an 8-core HP test system. Everyone went "No Wai!" and this was about three months or so ago. Lookie here - 80 fucking cores.

    Guess who's laughing now?
