
Intel Aims For Exaflops Supercomputer By 2018 66

Posted by Soulskill
from the still-won't-run-crysis dept.
siliconbits writes "Intel has laid out its computing-performance roadmap for the next seven years in a press release; in addition, it revealed its expectations through 2027 in a slide deck shown last week. The semiconductor maker wants a supercomputer capable of reaching 1000 petaflops (one exaflops) to be unveiled by the end of 2018 (just in time for the company's 50th anniversary), with four exaflops as the upper-end target by the end of the decade. The slide that was shared also shows that Intel wants to smash the zettaflops barrier — that's one million petaflops — sometime before 2030. This, Intel expects, will allow for significant strides in the field of genomics research, as well as much more accurate weather prediction (assuming Skynet or the Matrix hasn't taken over the world)."
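For readers keeping the prefixes straight, here is a quick sketch of the conversions the summary uses (plain SI arithmetic, nothing Intel-specific):

```python
# FLOPS prefixes as powers of 10 (SI): peta = 10^15, exa = 10^18, zetta = 10^21
PETA = 10**15
EXA = 10**18
ZETTA = 10**21

exa_in_petaflops = EXA // PETA        # 1 exaflops = 1000 petaflops
zetta_in_petaflops = ZETTA // PETA    # 1 zettaflops = 1,000,000 petaflops

print(exa_in_petaflops)    # 1000
print(zetta_in_petaflops)  # 1000000
```

So the 2030 zettaflops target is a thousand times the 2018 exaflops target, which is itself a thousand times a petaflops machine.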


  • At what point does more computing power not matter anymore?

    • Exaflop computing is a requirement for the Square Kilometre Array. There is still a long way to go until there might be an upper limit, especially in Radio Astronomy.

      http://www.skatelescope.org/ [skatelescope.org]
      http://en.wikipedia.org/wiki/Square_Kilometre_Array [wikipedia.org]
      http://www.ska.gov.au/ [ska.gov.au]

    • by AHuxley (892839)
      Power supply can be an issue. You can read about the NSA and its growing power-supply needs as it expands at Fort Meade:
      http://publicintelligence.net/nsa-site-m-cybercom/ [publicintelligence.net]
      No upper limits, just more land, power, cooling and smart people to keep the funding flowing.
      What each chip can do is limited, but you just keep buying more :)
    • by MacTO (1161105)

      For home and office needs, we have been well beyond the upper limit for well over a decade.

      For most business needs, we have been well beyond the upper limit for some time. But that does not consider the needs of all businesses.

      For scientific, engineering, and military needs -- well, we have a bloody long way to go. Supercomputers aren't built for national prestige or any of that other nonsense, simply because they are too expensive and become obsolete too quickly. These computers are built to address cur

      • by Arlet (29997)

        For home and office needs, we have been well beyond the upper limit for well over a decade.

        For video manipulation, it can still take an unpleasantly long time.

      • by ChatHuant (801522)

        For home and office needs, we have been well beyond the upper limit for well over a decade

      What I think happened is that the attention of the computing world and of the enthusiasts has been focused on the whole Internet thing. This has led to a slowdown on the home/office application-development side; as a result, most things we use computers for in homes and offices are the same ones we had ten years ago, so of course current computers are powerful enough to handle them. But I can think of lots of new home apps that will need more resources; and I'm quite sure there will be another computer revolution (

    • by Sycraft-fu (314770) on Wednesday June 22, 2011 @06:12AM (#36525644)

      Basically a computer needs to be able to do anything a human can ask of it. Well we can ask an awful lot. I want a computer that can understand natural language and respond in kind. I want a computer that can render 3D graphics that look perfectly real. I want a computer that can accurately model the weather system of the entire planet, and so on.

      We are still a long, LONG way from computing power not mattering anymore, particularly at the high end where this is targeted. While weather-system modeling might be a silly thing for a home user to ask for, it is something we'd very much like a computer to do. Right now, though, all the systems are crude models; much is simplified because there just isn't the power to truly model everything down to, say, the molecular level.

      However, with enough power, such a thing could be done. How much power, I don't know — way more than we've got now — but it is perfectly possible.

      So we really don't yet know what the upper bound on the processing power we might want is. We won't really know until we start reaching it: we'll have systems that, any time we think up something new for them to do, can do it with power to spare. Then we'll know "This is it, there really isn't a need for more processing power."

      We may well hit physical limits before that though.

  • While this sounds impressive, some of us would like to have something on our desktop that is capable of, perhaps, a petaflop or two. Undoubtedly, genomics research would gain massively, but there are a lot more researchers out there, I am sure, whose work would benefit from easier and daily access to such out-of-the-box resources. Getting supercomputing resources to the masses would seem like a worthy goal, rather than just hitting exaflop headlines.

    • by IrquiM (471313)
      That's what you have GPUs for. Ask Nvidia or AMD for that sort of thing, not Intel.
      • by Namarrgon (105036)

        Not all workloads translate to the specialised SIMD arrays that are modern GPUs. There is still a need for large arrays [intel.com] of more general-purpose computing, as well as a need for fast single core computers (not all jobs can be computed in parallel either).

        I write visual effects software in C++ and in OpenCL. Where OpenCL is practical, GPUs can be 10x faster, but there are still many cases when it's too awkward or has too much overhead to use a GPU.

  • ...make accurate weather prediction any less necessary?

    • by Arlet (29997)

      In the Matrix, the weather predicts you!

  • Looks like someone is a sore looser.

    • by matty619 (630957)

      Sore looser? That sounds.....kinky.

      lol...sorry, Spelling Nazi at your service ;)

    • Why, did they take too much stool softener?

      Ok, with that out of the way: unlike Fujitsu, who added extra SIMD co-processors to their CPUs SPECIFICALLY for the K computer, what Intel does not currently do is offer CPUs designed specifically for HPC. Now, of course, a lot of advancements in general-purpose CPUs can be applied to HPC. Overall the Earth Simulator cycle seems to be repeating itself: a Japanese company creates an incredibly fast computer that makes heavy use of vector processors, then cannot s
    • "Loose as a goose, and you never lose."
      That's how I learned it as a youngster. :-)

      English can be tricky and crazy that way.

      Congrats on Japan taking the #1 spot!

  • Just think how much they could make mining Bitcoins on this thing... Oh, actually, hang on...

  • Scaled to 10 GHz? [geek.com] (the comments are fun to read)

    It's hard to take a claim like that seriously since that famous prediction. Oh well. At least they redeemed themselves with the Core architecture. In my little book anyway.

    • I love the comments there. Compared to the consensus of the commenters (circa 2000, anyway -- by 2005 they started approaching conventional wisdom), Intel was extremely conservative.

    • When you look at the number of people who still believed in the clock-frequency version of Moore's law, Intel actually got it pretty accurate.
      They were only out by a factor of two compared to the other numbers (128 GHz!). Though doubling it again for consumer chips is going to take at least 10 more years.

      When they are promising those kinds of improvements for 2018, I'll let them be an order of magnitude out.

    • Got to love all the idiots lambasting Brian L's sarcasm as if he were serious.. *facepalm*.

    • by CecilPL (1258010)

      Well Netburst has been pretty much replaced with Core, but the new 10-core Xeons [intel.com] run at 2.4GHz, so that's 24GHz of computer power on a single chip.
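Summing clocks across cores isn't the same as FLOPS, of course. A back-of-envelope sketch of the difference, assuming a hypothetical 8 double-precision flops per core per cycle (an AVX-class round number, not a spec-sheet value for any particular Xeon):

```python
cores = 10
clock_ghz = 2.4
flops_per_cycle = 8  # assumed AVX-class figure, not from a datasheet

aggregate_ghz = cores * clock_ghz                    # the "24 GHz" in the comment
peak_gflops = cores * clock_ghz * flops_per_cycle    # 192.0 GFLOPS peak

# chips needed for 1 exaflops (10^9 gigaflops) at that peak rate
chips_for_exaflop = 10**9 / peak_gflops              # ≈ 5.2 million chips
print(aggregate_ghz, peak_gflops, round(chips_for_exaflop))
```

Under those assumed numbers, an exaflops machine built from such chips would need millions of them — which is why the 2018 target implies far denser parts than simply scaling up 2011 Xeons.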

  • by Ceriel Nosforit (682174) on Wednesday June 22, 2011 @04:48AM (#36525260)

    If my calculations are correct, 1 exaflop with today's best hardware translates to roughly 2000 m^2 worth of CPU area. Reportedly a little less than half an American football field.

    Still I'm not sure if it's enough for what I want, namely to create new lifeforms, artificially evolved to say live on a partially terraformed Mars. - That or give teens genetic upgrades for irises which change colour with their mood. The latter of these seems more likely to be met with commercial success.

    • by rbrausse (1319883)

      Reportedly a little less than half an American football field.

      or - in a more useful and recognized measurement - almost 0.8% of the LoC's floor space

  • In 2007, they were saying Intel would have 80-core CPUs in 2011.

    It's 2011 now, where can I buy one?

  • by Anonymous Coward

    http://spectrum.ieee.org/computing/hardware/nextgeneration-supercomputers/0

  • "In 2008 James Bamford's The Shadow Factory reported that NSA told the Pentagon it would need an exaflop computer by 2018."

    from Wikipedia.

