Intel Hardware

Intel Shows Off 80-core Processor

thejakebrain writes "Intel has built its 80-core processor as part of a research project, but don't expect it on your desktop any time soon. The company's CTO, Justin Rattner, held a demonstration of the chip for a group of reporters last week. Intel will be presenting a paper on the project at the International Solid State Circuits Conference in San Francisco this week. 'The chip is capable of producing 1 trillion floating-point operations per second, known as a teraflop. That's a level of performance that required 2,500 square feet of large computers a decade ago. Intel first disclosed it had built a prototype 80-core processor during last fall's Intel Developer Forum, when CEO Paul Otellini promised to deliver the chip within five years.'" Update: 06/01 14:37 GMT by Z : This article is about four months old. We discussed this briefly last year, but a search didn't show that we had discussed it in February.
This discussion has been archived. No new comments can be posted.

  • But... (Score:2, Funny)

    by Anonymous Coward
    Does it run Linux?
    • Re:But... (Score:5, Funny)

      by Nom du Keyboard ( 633989 ) on Friday June 01, 2007 @11:16AM (#19353009)
      Does it run Linux?

      Yeah. 80 different distributions at once.

      • 80 different distributions at once.

        Intel has found a useful argument to illustrate its newest
        " s/number of GHz/number of cores/ "
        marketing propaganda.
      • by Wolfrider ( 856 )
        Indeed, but only with Vmware and/or OpenVZ. ;-)

        Seriously, this thing doesn't even support the x86 instruction set (yet?), and is not form-factor compatible with today's processors.

        When it finally comes out, hopefully they will have integrated the "3D" chip-stacking tech into it as well, and HOPEFULLY it will run cooler and draw less power than today's offerings.
  • cue (Score:5, Funny)

    by russ1337 ( 938915 ) on Friday June 01, 2007 @09:07AM (#19351097)
    Cue the 'needed to run Vista' jokes....
    • Re: (Score:3, Funny)

      by NickCatal ( 865805 )
      Finally I can realize my dream of playing 500 instances of Quake 3 on one machine!
    • Considering how slow Vista is compared to Win2K and XP, perhaps in this case you meant "queue?"
  • Older Story (Score:3, Informative)

    by Maddog Batty ( 112434 ) on Friday June 01, 2007 @09:07AM (#19351105) Homepage
    Older story on this here: http://hardware.slashdot.org/article.pl?sid=06/09/26/1937237 [slashdot.org]

    Sure would be nice to have a play with it once they have worked out how to program it...
    • Re: (Score:3, Insightful)

      by timeOday ( 582209 )

      Sure would be nice to have a play with it once they have worked out how to program it...

      It's very likely you can get one at Best Buy before they have worked out how to program with it. The fact is, current programming paradigms simply aren't suited to fine-grained parallelism - and in saying that, I don't mean to imply that such a paradigm can definitely exist. Sure there are many parallel research languages, but whether those could be adopted by mainstream programmers and used to achieve anywhere near
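      A small sketch of the distinction being made (in Python, purely illustrative): iterations with a loop-carried dependency resist being split across cores, while independent iterations split trivially - and mainstream tools only make the second case easy.

        # Two loops over the same data. The first carries a dependency from one
        # iteration to the next (each step needs the previous result), so it cannot
        # simply be handed to 80 cores. The second loop's iterations are independent
        # and could be farmed out core-by-core without changing its meaning.
        values = list(range(1_000))

        running = 0
        prefix_sums = []
        for v in values:                    # sequential: needs 'running' from the last pass
            running += v
            prefix_sums.append(running)

        squares = [v * v for v in values]   # independent: a plain data-parallel map

        print(prefix_sums[-1], squares[-1])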

  • IA64 (Score:5, Insightful)

    by AvitarX ( 172628 ) <me@@@brandywinehundred...org> on Friday June 01, 2007 @09:08AM (#19351119) Journal
    I remember when IA64 was the next huge supercomputer on a chip 5 years off.

    It didn't work out too well for Intel.
    • Re:IA64 (Score:5, Funny)

      by ciroknight ( 601098 ) on Friday June 01, 2007 @09:21AM (#19351253)
      I remember when Pentium was the next huge chip from Intel that was a few years off.

      I guess we all know how that one turned out.
      • Also Pentium IV - piece of shit, still pretty damned fast, sold like mad. Sometimes Intel's failures are great successes.
      • Do you remember the 80286 (which had an extra year's wait)? Some of us here do. Even the Z80 was a bit late in being a better 8080. I wasn't waiting for it (though I coded on it), but I'd guess one or two people here were.
    • Itanium isn't doing that badly, but it's been relegated to the "heavy iron" mainframe and supercomputer type systems, and that's a tough market. They made a gamble that didn't work as well as they hoped.
      • Re: (Score:3, Informative)

        by timeOday ( 582209 )
        Itanium is doing ridiculously badly. Intel and HP will never recoup the billions they invested through sales of big iron alone.
    • But CPU cores are going to be the MHz of the next 20 years. Remember in the late 80s when 8MHz was a lot and 33MHz and 66MHz were in the lab? 20 years from now we'll have 2-gigacore CPUs running at 2-3GHz.
  • by 91degrees ( 207121 ) on Friday June 01, 2007 @09:09AM (#19351123) Journal
    It's known incorrectly.

    The measurement is "FLOPS": Floating-Point Operations Per Second. It's an acronym, and the 'S' is part of it. Hence even if you only have one of them, it's still a FLOPS. And it's capitalised.

    Strictly speaking it should be "trillion FLOPS" as well, since it's not an SI unit, but my pedantry is limited.
  • by Anonymous Coward on Friday June 01, 2007 @09:09AM (#19351127)
    can you imagine.
    • by jollyreaper ( 513215 ) on Friday June 01, 2007 @09:39AM (#19351477)

      beowulf cluster?

      can you imagine.
      Yeah, man. Or what if Intel codenamed their next processor Beowulf? *inhales, holds breath, exhales slowly, smoke twisting lazily* Can you imagine a Beowulf cluster of Beowulfs or did I just blow your mind?
      • If intel called the 80 cpu beast "Grendel", could it still be part of a Beowulf cluster? Or would it end up in a perpetual battle - cpu versus os - until the very fabric of the universe itself crumbled around us?
        • Re: (Score:3, Interesting)

          by jollyreaper ( 513215 )

          If intel called the 80 cpu beast "Grendel", could it still be part of a Beowulf cluster? Or would it end up in a perpetual battle - cpu versus os - until the very fabric of the universe itself crumbled around us?
          If you could work Grendel's mom into a Beowulf cluster of Grendels, you might just have aced the German pr0n market. For Japan, add tentacles. And if you can reverse the expected subject/verb/object order, you might have a market in Soviet Russia to boot.
  • by $RANDOMLUSER ( 804576 ) on Friday June 01, 2007 @09:10AM (#19351135)
    "Intel CEO promises to deliver magical new uber processor within five years".

    Stop me if you've heard this one before...
  • Intel used 100 million transistors on the chip, which measures 275 millimeters squared. By comparison, its Core 2 Duo chip uses 291 million transistors and measures 143 millimeters squared.
    Maybe it's just because I haven't had my morning coffee yet, but is that a typo?
    • Re: (Score:3, Insightful)

      by Andy Dodd ( 701 )
      80 cores means there are probably quite a lot of on-chip interconnects between the cores.
      • transistor density? (Score:3, Informative)

        by twitter ( 104583 )

        80 cores means there are probably quite a lot of on-chip interconnects between the cores.

        There has to be a typo hiding in there, but the whole thing is an empty set. It's hard to believe they can make 80 cores with 100E6 transistors when it takes 291E6 transistors to make two. Each core would have less than a million transistors in the 80 core model. You have to go all the way back to the 486 [answers.com] to see that kind of count from Intel. It's possible because the cores are not x86, there's no "ability to use

        • There has to be a typo hiding in there, but the whole thing is an empty set. It's hard to believe they can make 80 cores with 100E6 transistors when it takes 291E6 transistors to make two.

          Yeah, when I read it I thought it must be a typo (especially given that the die area is bigger). Although, the article says it's VLIW (it isn't specific, but I guess it'll be IA64 based), which means you can throw away a shed-load of transistors from the scheduler that would have to be present on an OOE superscalar device.
        • You have to go all the way back to the 486 to see that kind of count from Intel.
          That might not be a terrible idea. Since the utility of this chip assumes fine-grained parallelism anyways, the new metric would have to be flops per transistor. Implement the 486 design with modern process technology, thus allowing you to put 250 of them on a single chip running at 100x the original clock speed of 33MHz, and you might get something very nice. Or is that even possible?
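          A rough sanity check of those numbers (a sketch using only the figures quoted above plus the commonly cited ~1.2 million transistors for an original 486; routers, I/O and cache are all lumped into the per-tile figure):

            # Back-of-the-envelope transistor budgets.
            CHIP_TRANSISTORS = 100e6   # article's figure for the 80-core prototype
            CORES = 80
            I486_TRANSISTORS = 1.2e6   # approximate count for the original 80486
            I486_CLOCK_MHZ = 33

            per_tile = CHIP_TRANSISTORS / CORES      # ~1.25M per tile, router included
            print(f"transistors per tile: {per_tile / 1e6:.2f} M")

            # The 250-core thought experiment above, at 100x the original clock:
            budget = 250 * I486_TRANSISTORS          # ~300M transistors
            clock_ghz = I486_CLOCK_MHZ * 100 / 1000  # 3.3 GHz
            print(f"250 x 486: ~{budget / 1e6:.0f} M transistors at {clock_ghz:.1f} GHz")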
    • by imsabbel ( 611519 ) on Friday June 01, 2007 @09:31AM (#19351349)
      Core2 has 2 or 4 MByte of L2 cache. One bit of cache is 6 transistors == more than 200 million of those 291 million transistors are high-density cache. (Density of cache is a lot higher than that of logic, which is nearly all the 80-core CPU is made of.)

      (Btw, I fucking HATE the "millimeters squared" expression. It's square millimeters. 275 mm squared would be over 750 cm^2.)
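      The arithmetic behind that estimate, for anyone checking (a sketch assuming the classic 6-transistor SRAM cell and ignoring tags, decoders and redundancy):

        # How much of Core 2 Duo's 291M-transistor budget could be L2 cache?
        L2_BYTES = 4 * 1024 * 1024      # 4 MByte L2
        BITS = L2_BYTES * 8
        TRANSISTORS_PER_BIT = 6         # classic 6T SRAM cell
        cache_transistors = BITS * TRANSISTORS_PER_BIT
        print(f"L2 data array: ~{cache_transistors / 1e6:.0f} M transistors")  # ~201M of 291M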
      • by DrDitto ( 962751 )
        Intel does not use the classic 6-transistor SRAM cell. Their SRAM technology is cutting-edge and a couple generations beyond everybody else.
    • It's possible... (Score:4, Informative)

      by mbessey ( 304651 ) on Friday June 01, 2007 @10:02AM (#19351849) Homepage Journal
      I thought that was a little weird, too. But the 80-core chip could simply have more wires (and therefore, fewer transistors). Given that they mention that there are routing elements between the cores, it's possible that a lot of the chip's real estate is taken up by massive busses between adjacent cores.

      Another explanation might be that they didn't want to waste the time/expense to come up with an optimized layout, or that they intentionally spaced things out to make testing easier.
  • by jollyreaper ( 513215 ) on Friday June 01, 2007 @09:11AM (#19351143)
    I'm sorry but when I see these competitions I always come back to this Onion piece. A classic.

    http://www.theonion.com/content/node/33930 [theonion.com]

    Fuck Everything, We're Doing Five Blades

    By James M. Kilts
    CEO and President,
    The Gillette Company

    Would someone tell me how this happened? We were the fucking vanguard of shaving in this country. The Gillette Mach3 was the razor to own. Then the other guy came out with a three-blade razor. Were we scared? Hell, no. Because we hit back with a little thing called the Mach3Turbo. That's three blades and an aloe strip. For moisture. But you know what happened next? Shut up, I'm telling you what happened--the bastards went to four blades. Now we're standing around with our cocks in our hands, selling three blades and a strip. Moisture or no, suddenly we're the chumps. Well, fuck it. We're going to five blades.

    Sure, we could go to four blades next, like the competition. That seems like the logical thing to do. After all, three worked out pretty well, and four is the next number after three. So let's play it safe. Let's make a thicker aloe strip and call it the Mach3SuperTurbo. Why innovate when we can follow? Oh, I know why: Because we're a business, that's why!

    You think it's crazy? It is crazy. But I don't give a shit. From now on, we're the ones who have the edge in the multi-blade game. Are they the best a man can get? Fuck, no. Gillette is the best a man can get.

    What part of this don't you understand? If two blades is good, and three blades is better, obviously five blades would make us the best fucking razor that ever existed. Comprende? We didn't claw our way to the top of the razor game by clinging to the two-blade industry standard. We got here by taking chances. Well, five blades is the biggest chance of all.

    Here's the report from Engineering. Someone put it in the bathroom: I want to wipe my ass with it. They don't tell me what to invent--I tell them. And I'm telling them to stick two more blades in there. I don't care how. Make the blades so thin they're invisible. Put some on the handle. I don't care if they have to cram the fifth blade in perpendicular to the other four, just do it!

    You're taking the "safety" part of "safety razor" too literally, grandma. Cut the strings and soar. Let's hit it. Let's roll. This is our chance to make razor history. Let's dream big. All you have to do is say that five blades can happen, and it will happen. If you aren't on board, then fuck you. And if you're on the board, then fuck you and your father. Hey, if I'm the only one who'll take risks, I'm sure as hell happy to hog all the glory when the five-blade razor becomes the shaving tool for the U.S. of "this is how we shave now" A.

    People said we couldn't go to three. It'll cost a fortune to manufacture, they said. Well, we did it. Now some egghead in a lab is screaming "Five's crazy?" Well, perhaps he'd be more comfortable in the labs at Norelco, working on fucking electrics. Rotary blades, my white ass!

    Maybe I'm wrong. Maybe we should just ride in Bic's wake and make pens. Ha! Not on your fucking life! The day I shadow a penny-ante outfit like Bic is the day I leave the razor game for good, and that won't happen until the day I die!

    The market? Listen, we make the market. All we have to do is put her out there with a little jingle. It's as easy as, "Hey, shaving with anything less than five blades is like scraping your beard off with a dull hatchet." Or "You'll be so smooth, I could snort lines off of your chin." Try "Your neck is going to be so friggin' soft, someone's gonna walk up and tie a goddamn Cub Scout kerchief under it."

    I know what you're thinking now: What'll people say? Mew mew mew. Oh, no, what will people say?! Grow the fuck up. When you're on top, people talk. That's the price you pay for being on top. Which Gillette is, always has been, and forever shall be, Amen, five blades, sweet Jesus in heaven.

    • Comment removed based on user account deletion
    • by Traa ( 158207 )
      I see your TheOnion piece, and raise you a Dave Barry!

      http://www.washingtonpost.com/ac2/wp-dyn/A61952-2003Jul15 [washingtonpost.com]

      Blade Inflation
      By Dave Barry

      What's next from the razor-sharp minds of the shaving industry?

      Attention, consumers with bodily hair: The razor industry has news for you! You will never in a million years guess what this news is, unless your IQ is higher than zero, in which case you're already thinking: "Not another blade! Don't tell me they're adding ANOTHER BLADE!!"

      Shut up! Don't spoil the surprise for everybody else!

      Before I tell you the news, let's put it in historical context by reviewing:

      THE HISTORY OF SHAVING

      Human beings are one of only two species of animals that shave themselves (the other one is salamanders). The Internet tells us that humans have been shaving since the Stone Age. Of course, the Internet also tells us that hot naked women want to befriend us, so we can't be 100 percent sure about everything we read there.

      But assuming that www.quikshave.com/timeline.htm is telling the truth, Neanderthal Man used to pluck his facial hairs "using two seashells as tweezers." No doubt Neanderthal Woman found this very attractive. "You smell like a clam," were probably her exact words. It was during this era that the headache was invented.

      By 30,000 B.C., primitive man was shaving with blades made from flint, which is a rock, so you had a lot of guys whose faces were basically big oozing scabs. The next shaving breakthrough came when the ancient Egyptians figured out how to make razors from sharp metal, which meant that, for the first time, the man who wanted to be well-groomed could, without any assistance or special training, cut an ear completely off.

      This was pretty much the situation until the late 19th century, at about 2:30 p.m., when the safety razor was invented. This introduced a wonderful era known to historians as "The Golden Age of Not Having Razor Companies Introduce Some Ludicrously Unnecessary New Shaving Technology Every 10 Damn Minutes."

      I, personally, grew up during this era. I got my first razor when I was 15, and I used it to shave my "beard," which consisted of a lone chin hair approximately one electron in diameter. (I was a "late bloomer" who did not fully experience puberty until many of my classmates, including females, were bald.) My beard would poke its wispy head out of its follicle every week or so, and I, feeling manly, would smother it under 14 cubic feet of shaving cream and lop it off with my razor. Then I would stand in front of the bathroom mirror, waiting for it to grow again. Mine was a lonely adolescence.

      The razors of that era had one blade, and they worked fine; ask any older person who is not actively drooling. But then, in 1971, a very bad thing happened: Gillette, looking for a way to enhance the shaving experience (by which I mean "charge more") came out with a razor that had TWO blades. This touched off a nuclear arms race among razor companies, vying to outdo one another by adding "high-tech" features that made the product more expensive, but not necessarily better. This tactic is called "sneakerization," in honor of the sneaker industry, which now has people paying upwards of $200 a pair for increasingly weird-looking footwear boasting the durability of thinly sliced Velveeta.

      Soon everybody was selling two-blade razors. So the marketing people put on their thinking caps, and, in an astounding burst of creativity, came up with the breakthrough concept of: THREE BLADES. Gillette, which is on the cutting edge (har!) of razor sneakerization, currently has a top-of-the-line three-blade razor -- excuse me, I mean "shaving system" -- called the "Mach3Turbo," which, according to the Gillette Web site (www.gillette.com) has more technology than a nuclear submarine, including "open cartridge architecture" and an "ergonomic handle" featuring "knurled elastomeric crescents." That's right: It has elastomeric crescents, and they have been knurled! By knurlers! No, I don't know what this means. But it sure sounds technological.

      Which brings us to today's exciting news, which was brought to my attention by alert reader Jake Hamer. Gillette's arch-rival, Schick (maker of the Xtreme 3 shaving system) has announced that it's coming out with a new razor that has -- prepare to be floored by innovativeness -- FOUR BLADES. Yes! It will be called the "Quattro," which is Italian for "more expensive."

      Of course it will not end there. I bet an urgent memo has already gone out in Gillette's marketing department. "Hold some focus groups immediately!" it says. "Find out what number comes after four!"

      Yes, the razor-technology race shows no signs of slowing. And who knows what lies ahead? Razors with 10 blades? Twenty blades? A thousand blades? Razors that go backward in time and shave your ancestors? Exciting times lie ahead, shaving consumers!

  • by CajunArson ( 465943 ) on Friday June 01, 2007 @09:13AM (#19351167) Journal
    It must be a really slow news day. From the dateline:

    Published: February 11, 2007

    Not to mention that Slashdot (even Zonk) Covered this LAST YEAR [slashdot.org].
    But that's OK, I'm sure Slashdot gave insightful and cogent coverage of real events that actually matter to geeks on this site, you know, like the Release of a new major version of GCC [gnu.org]
    Oh wait.... that (like a bunch of other actually interesting stories) would be in the aptly-named, sir not appearing on this website category due to it not making enough banner revenue.
    • Clearly, there is a demonstrable need for news sites to process dupes faster and in parallel with other dupes. The reason this one took so long is because there isn't a high-speed dupe instruction on the older generations of processors.
  • by simong ( 32944 ) on Friday June 01, 2007 @09:14AM (#19351181) Homepage
    oh.
  • AMD's response (Score:4, Insightful)

    by drgonzo59 ( 747139 ) on Friday June 01, 2007 @09:17AM (#19351201)
    This is a nice move by Intel. I wonder what AMD's plans are...81 cores?

    Besides, with most software being single-threaded I don't know if a consumer will immediately need more than 4 cores for a while. I can still see software companies trying to come up with ways to keep all 80 cores busy..."Well, they need at least 20 anti-virus processes, 10 genuine advantage monitors, and we'll install 100 shareware applications with cute little icons in the task bar by default. There, that should keep all the cores nice and warm and busy -- our job is done!".

    But in all seriousness, I would expect some extremely realistic environmental physical simulations (realtime large n-body interactions and perhaps realtime computational fluid dynamics)...now that's something to look forward to!
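    As a purely illustrative sketch of why that kind of workload soaks up cores (mine, not from the article; plain Python with the standard multiprocessing module): a naive O(N^2) n-body force step, where every body's update is independent and so spreads across however many cores the OS offers.

      # Naive 2D n-body acceleration step, parallelised per body.
      from multiprocessing import Pool
      import random

      random.seed(0)                          # same bodies in every worker process
      N = 2000
      bodies = [(random.random(), random.random(), random.random()) for _ in range(N)]  # x, y, mass

      def accel(i):
          xi, yi, _ = bodies[i]
          ax = ay = 0.0
          for j, (xj, yj, mj) in enumerate(bodies):
              if j == i:
                  continue
              dx, dy = xj - xi, yj - yi
              r2 = dx * dx + dy * dy + 1e-9    # softening avoids divide-by-zero
              inv_r3 = mj / (r2 ** 1.5)
              ax += dx * inv_r3
              ay += dy * inv_r3
          return ax, ay

      if __name__ == "__main__":
          with Pool() as pool:                 # one worker per core by default
              accelerations = pool.map(accel, range(N))
          print(len(accelerations), "bodies updated")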

    • by mstahl ( 701501 )

      I think one of the possible uses for an 80-core CPU that nobody's really talked about is multiple redundancy. If one core should somehow get fried, you have 79 left.

      • If that's the case, might as well connect all unused heatsinks to a griddle and I'll fry my bacon and eggs in the morning on them. If some of them burn up or get too hot it's "ok", I've got 60 others waiting...

        Or... you could just have two and order a CPU to be delivered when one burns up. In fact that's what happens on some mainframes. If a part fries, the machine calls "home" and the company will send a replacement immediately. Sometimes the administrator will find out something went bad only when the rep

      • Virtualisation - dual and quad core doesn't really cut it for massive levels of virtualisation... you want to reserve at least a core for your host OS, then you've got to divvy up the rest. With dual core that means you're down to having no SMP in your VMs (sucks if they're compile boxes), with quad core that's only 3... get 20-30 machines in there and it's starting to look shaky. 80 cores would scale to hundreds of virtual machines without any particular slowdown.
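        A trivial sketch of that core budgeting (hypothetical numbers; real hypervisors overcommit, so treat the result as a floor rather than a limit):

          # Reserve a core for the host, then divvy the rest up among guests.
          def guests_supported(total_cores, host_reserved=1, vcpus_per_guest=2):
              return (total_cores - host_reserved) // vcpus_per_guest

          for cores in (2, 4, 80):
              print(cores, "cores ->", guests_supported(cores), "dual-vCPU guests, no overcommit")
          # 2 cores -> 0, 4 cores -> 1, 80 cores -> 39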
    • AMD's response was buying ATi in order to work on their future chip, "Fusion", which will somehow incorporate a GPU-type accelerator on-die or at least in-package with a traditional x86 CPU.

      GPUs already have "many cores", if you can really call them that; 16 ROPs, 80 texture units, 64 shader cores, etc. Intel's approach is a much simpler architecture (in fact, "too simple" right now, the cores are practically feature-less), but DAAMIT's makes more business-sense (re-use what we've already got vs. invent
      • That makes sense. The GPUs sometimes have a higher transistor count than CPUs...

        The problem with Fusion is if they kill the add-on graphics and you just buy one Fusion processor that costs, say, $400 to plug into the CPU slot. In the meantime, NVIDIA releases their new generation board and Intel releases a new generation CPU. The consumer can choose to upgrade one or the other or both, but an AMD customer is stuck with just one expensive part and would have to upgrade it as one piece.

        Perhaps in the future th

    • Re: (Score:3, Funny)

      by Hoi Polloi ( 522990 )
      Kilts becomes AMD CEO:

      81 cores? Fuck that! We're going to 100 cores and putting a goddamn window on the CPU so all of the fan boys can watch the electrons flow. Then we're going to put the ethernet connection DIRECTLY on the chip. Ya, you heard me right, a connector right on the damn chip. You disagree? Great, I need your soon to be empty cube to store my prototypes you pussy.

      Wait! Brace yourself, I've got another one. A speaker slapped onto the CPU. You hear that? That is the sound of genius and
    • by neurojab ( 15737 )
      Besides, with most software being single-threaded I don't know if a consumer will immediately need more than 4 cores for a while. I can still see software companies trying to come up with ways to keep all 80 cores busy

      You're ignoring the server market, where the applications tend to be highly multi-threaded, and it's not difficult at all to keep your CPUs busy.

  • Ob (Score:2, Funny)

    by rlp ( 11898 )
    In Soviet Russia, Intel's 80 core processor imagines a Beowulf cluster of you!
  • by tomstdenis ( 446163 ) <tomstdenis.gmail@com> on Friday June 01, 2007 @09:34AM (#19351409) Homepage
    and all that is holy on this sacred Earth ...

    This isn't a general purpose processor. Think "cell processor" on a larger scale. You wouldn't be running your firefox or text editor on this thing. You'd load it up and have it do things like graphics processing, ray tracing, DSP work, chemical analysis, etc...

    So stop saying "we already don't have multi-core software now!!!" because this isn't meant for most software anyways.

    Tom
    • by ciroknight ( 601098 ) on Friday June 01, 2007 @09:44AM (#19351561)
      "This isn't a general purpose processor."

      ...Yet. You're right that these cores are incredibly simplistic, so much so that they make DSPs look functional, but really what's going on here is a science project to develop the on-chip network, not to develop the CPU cores as much. Intel envisions lifting the networking component out of this design and applying it to various different cores, so that a general computing core can be mixed in with DSP cores and other "Application Specific Accelerator" cores.

      So no, this model you're not going to be running Firefox or your text editor on (in fact, I doubt you even _could_ do this, these cores currently are very, very stripped down in their capacity to do work, to where they're basically two MACs tied to a small SRAM and a "network adapter"), but never-say-never, this style of chip is right around the corner.
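      As a toy illustration of the "on-chip network" idea (the 8x10 grid and the X-then-Y routing policy below are assumptions for the sketch, not a description of Intel's actual router):

        # 80 tiles on a 2D mesh, messages routed along X first, then Y.
        COLS, ROWS = 10, 8                      # 80 tiles, assumed layout

        def tile_xy(core_id):
            return core_id % COLS, core_id // COLS

        def route(src, dst):
            """List of (x, y) hops from the source tile to the destination tile."""
            x, y = tile_xy(src)
            dx, dy = tile_xy(dst)
            path = [(x, y)]
            while x != dx:                      # travel along X first
                x += 1 if dx > x else -1
                path.append((x, y))
            while y != dy:                      # then along Y
                y += 1 if dy > y else -1
                path.append((x, y))
            return path

        print("corner-to-corner hops:", len(route(0, 79)) - 1)   # 9 + 7 = 16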
  • Not useful yet.. (Score:4, Interesting)

    by CockroachMan ( 1104387 ) on Friday June 01, 2007 @09:39AM (#19351471)
    It's useless to keep putting more cores into a processor when we still don't have a decent parallel programming paradigm.

    80 cores is an absurd number; with the parallelism level that we have in today's programs, most of the cores would be idle most of the time.
    • I really don't understand this idea that we "can't use multiple cores yet" because we don't have some magical, mythical necessary programming model that will make this come alive instantly. The fact is, we've had the necessary models for decades now. Adding multiple cores doesn't necessarily mean we need to change our programs at all, but rather it means we need to change our Operating Systems.

      To the point: right now, we typically schedule applications to run on time-slices, to virtually expand one proce
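      A minimal sketch of that argument (illustrative only): leave each program single-threaded and let the OS scheduler spread whole processes across the cores - nothing about the programs themselves has to change.

        # Launch several copies of an ordinary single-threaded job; the OS places
        # each process on whatever core is free.
        import os, subprocess, sys

        print("cores visible to the OS:", os.cpu_count())
        jobs = [subprocess.Popen([sys.executable, "-c",
                                  "print(sum(i*i for i in range(10**6)))"])
                for _ in range(8)]
        for j in jobs:
            j.wait()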
    • 80 cores is an absurd number
      Eight of these processors ought to be enough for anyone.

      "He never really said that" posts in 5,4,3...
  • A transputer [wikipedia.org], which was around in the 80s?

    Hmm, has it really taken 20 years of research, or... *wonders*... 20 years later... patents...?

  • perl -e 'fork while 1;'


    There ya go. Think of it as a benchmark. How long can the 80-core processor run that without dying?

    • The death comes from process tables filling up, so I suspect the 80 core processor will die faster.
    • How fast can it die, or how long does it remain responsive? It is, after all, memory that leads to the eventual death ;)
  • I guess that eventually they'll just fire up a whole 300mm wafer. It'll be cookin...
  • [a teraflop is] a level of performance that required 2,500 square feet of large computers a decade ago.

    Over a decade and a half ago, in 1990, I programmed parallel AT&T DSP32C boards (multiple DSPs per ISA board in an 80386 host). Up to 5 25GFLOPS chips on a 125GFLOPS board, up to 8 boards in a 1TFLOPS host PC. That PC, nearly double the "decade ago", had over 1TFLOPS (including its FPGA glue) in about 3 square feet.

    And it actually ran applications (commercial image processing) in the field. This Intel

  • also, does it have a RAM controller built in?
  • My concern isn't that they can cram 80 processors of some capability onto a single chip (how many i486 processors could you put on the current Core 2 Duo die and transistor budget at 65/45nm?), but how can you feed enough data on and back off the chip to keep these processors running at near full speed? 8000 processors on a chip are worthless if they can't get to their data.

    And I'm not impressed by the Flop rate. Not with the Cell Processor already out for a year.

  • by dj42 ( 765300 ) on Friday June 01, 2007 @11:49AM (#19353535) Journal
    Well, those shitty, basic computers that took up big rooms, remember those? No? Ok, well, if those were still here, this thing would be like 90239820 times smaller, cool huh? How many of those are we going to have to hear before we come up with some new kind of comparison? You know how fast a woman can plot a route around a detour using a map in a big city? Yeah? Well, this shit is like 939203902093902093092093 times faster.
  • I'd love to know how they managed to scale up to 80-way x86? IIRC anything past 8-way was diminishing returns?
  • With high end processor power consumption approaching the better part of a kilowatt, perhaps toasting speed could be a measure.
  • Of course we won't see an 80-core processor any day soon! Intel will produce "new chips" every couple of years and just add cores. I suspect they have done this all along (e.g., the Pentium series). That is, it makes sense from a capitalist standpoint to throttle your technology to generate or maintain a certain level of demand. I have no evidence for this, but as cynical as it sounds, it sure makes sense.
    • by geekoid ( 135745 )
      They would release it right now if they could. You only need to throttle when there is an end game, and you don't have competition.

      The issue is the fab and the die. How many bad dies did they go through to get to this prototype? I would wager they had many more bad dies than good ones.

  • ought to be enough for anybody.

    Seriously, why don't we put the cores into the memory DIMMs? It's been tried before but now it seems that a CPU core is just a little commodity thing, and memory bandwidth is where the bottleneck is.

  • At 3.16GHz and with 0.95 volts applied to the processor, it can hit 1 teraflop of performance while consuming 62 watts of power.

    Yikes, that's over 65 amps! Well, OK, it's less than 1 amp per core.
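    The arithmetic, using only the figures quoted above (I = P / V):

      P_WATTS, V_CORE, CORES = 62.0, 0.95, 80
      current = P_WATTS / V_CORE
      print(f"{current:.1f} A total, {current / CORES:.2f} A per core")   # 65.3 A, 0.82 A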

  • Nobody believed me when I mentioned in a previous story comment that I was playing with an 8-core HP test system. Everyone went "No Wai!" and this was about three months or so ago. Lookie here - 80 fucking cores.

    Guess who's laughing now?
