Intel Aims For Exaflops Supercomputer By 2018 66
siliconbits writes "Intel has laid out its computing-performance roadmap for the next seven years in a press release; in addition, it revealed its expectations up to 2027 in a deck of slides shown last week. The semiconductor chip maker wants a supercomputer capable of reaching 1000 petaflops (one exaflops) to be unveiled by the end of 2018 (just in time for the company's 50th anniversary), with four exaflops as the upper-end target by the end of the decade. The slides also show that Intel wants to smash the zettaflops barrier (that's one million petaflops) sometime before 2030. This, Intel expects, will allow for significant strides in genomics research, as well as much more accurate weather prediction (assuming Skynet or the Matrix hasn't taken over the world)."
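For a rough sense of the growth rate those dates imply, here is a back-of-the-envelope sketch in Python. The 2011 starting point of roughly 8 petaflops (the K computer's mid-2011 Linpack figure) is an assumption added for illustration; only the target figures come from the summary above.

```python
# Back-of-the-envelope: what yearly growth does the roadmap imply?
# The ~8 petaflops starting point is an assumption (K computer, mid-2011);
# the targets are the figures quoted in the summary.

start_year, start_flops = 2011, 8e15

targets = {
    2018: 1e18,   # 1 exaflops
    2020: 4e18,   # 4 exaflops, the upper-end target for the decade
    2030: 1e21,   # 1 zettaflops, i.e. one million petaflops
}

for year, flops in targets.items():
    years = year - start_year
    growth = (flops / start_flops) ** (1.0 / years)
    print(f"{year}: {flops:.0e} FLOPS needs ~{growth:.2f}x per year for {years} years")
```

Under those assumptions the 2018 and 2020 targets both work out to roughly a doubling of peak performance every year, noticeably faster than the classic 18-to-24-month cadence.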
Is there an upper limit? (Score:2)
At what point does more computing power not matter anymore?
Re: (Score:3)
Exaflop computing is a requirement for the Square Kilometre Array. There is still a long way to go until there might be an upper limit, especially in Radio Astronomy.
http://www.skatelescope.org/ [skatelescope.org]
http://en.wikipedia.org/wiki/Square_Kilometre_Array [wikipedia.org]
http://www.ska.gov.au/ [ska.gov.au]
Re: (Score:3)
http://publicintelligence.net/nsa-site-m-cybercom/ [publicintelligence.net]
No upper limits, just more land, power, cooling and smart people to keep the funding flowing.
What each chip can do is limited, but you just keep buying more
Re: (Score:2)
For home and office needs, we have been well beyond the upper limit for well over a decade.
For most business needs, we have been well beyond the upper limit for some time. But that does not consider the needs of all businesses.
For scientific, engineering, and military needs -- well, we have a bloody long way to go. Supercomputers aren't built for national prestige or any of that other nonsense, simply because they are too expensive and become obsolete too quickly. These computers are built to address current, concrete problems.
Re: (Score:3)
For video manipulation, it can still take an unpleasantly long time.
Re: (Score:2)
For home and office needs, we have been well beyond the upper limit for well over a decade
What I think happened is that the attention of the computing world, and of the enthusiasts, has been focused on the whole Internet thing. This has led to a slowdown on the home/office application development side; as a result, most things we use computers for in homes and offices are the same ones we had ten years ago, so of course current computers are powerful enough to handle them. But I can think of lots of new home apps that will need more resources, and I'm quite sure there will be another computer revolution once that attention shifts back.
There is, but it is far off (Score:5, Insightful)
Basically a computer needs to be able to do anything a human can ask of it. Well we can ask an awful lot. I want a computer that can understand natural language and respond in kind. I want a computer that can render 3D graphics that look perfectly real. I want a computer that can accurately model the weather system of the entire planet, and so on.
We are still a long, LONG way from computing power not mattering anymore, particularly at the high end where this is targeted. While modeling the planet's weather system might be a silly thing for a home user to ask for, it is something we'd very much like a computer to do. Right now, though, all the systems are crude models; a great deal is simplified because there just isn't the power to truly model everything down to, say, the molecular level.
However, with enough power, such a thing could be done. How much power it would take I don't know (way more than we've got now), but it is perfectly possible.
So we really don't yet know what the upper bound is on the processing power we might want. We won't really know until we start reaching it: we'll have systems where, any time we think up something new for them to do, they can do it with power to spare. Then we'll know "this is it, there really isn't a need for more processing power."
We may well hit physical limits before that though.
Re: (Score:1)
Re: (Score:2)
Well, raytracing gets pretty close. Imagine extremely high-end raytracing that can be done in real time for video games. Since this is just simulating the physics of light (if I'm not mistaken), perfecting it and having the hardware to do it very quickly would go hand in hand with amazing 3D graphics beyond any game we play now.
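As a rough illustration of why that takes so much horsepower: even a bare-bones ray tracer does an intersection test and a shading calculation for every single pixel, before you add reflections, shadows, or global illumination. A minimal sketch in plain Python (one sphere, one light, no bounces; all scene numbers are made up for illustration):

```python
import math

# Minimal ray tracer: one sphere, one directional light, parallel rays.
# Every pixel costs at least one ray-sphere intersection plus shading.

WIDTH, HEIGHT = 80, 40
SPHERE_CENTER = (0.0, 0.0, 3.0)
SPHERE_RADIUS = 1.0
LIGHT_DIR = (-0.577, 0.577, -0.577)   # direction the light travels, ~normalized

def intersect(origin, direction):
    """Return distance along the ray to the sphere, or None if it misses."""
    ox, oy, oz = (origin[i] - SPHERE_CENTER[i] for i in range(3))
    dx, dy, dz = direction
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - SPHERE_RADIUS ** 2
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

rows = []
for y in range(HEIGHT):
    row = ""
    for x in range(WIDTH):
        # One primary ray per pixel, pointing straight down +z.
        px = (x / WIDTH - 0.5) * 4.0
        py = (0.5 - y / HEIGHT) * 2.0
        t = intersect((px, py, 0.0), (0.0, 0.0, 1.0))
        if t is None:
            row += " "
        else:
            # Shade by how directly the surface normal faces the light.
            hit = (px, py, t)
            normal = tuple((hit[i] - SPHERE_CENTER[i]) / SPHERE_RADIUS for i in range(3))
            brightness = max(0.0, -sum(normal[i] * LIGHT_DIR[i] for i in range(3)))
            row += ".:-=+*#%@"[min(8, int(brightness * 9))]
    rows.append(row)

print("\n".join(rows))
```

Scale that inner loop to a couple of million pixels, several bounces per ray, and 60 frames per second, and the appetite for FLOPS becomes obvious.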
Re: (Score:1)
Trickle down (Score:2)
While this sounds impressive, some of us would like to have something on our desktop that is capable of, perhaps, a petaflop or two. Undoubtedly, genomics research would gain massively, but there are a lot more researchers out there, I am sure, whose work would benefit from easier, daily access to such out-of-the-box resources. Getting supercomputing resources to the masses would seem like a worthier goal than just hitting exaflop headlines.
Re: (Score:2)
Re: (Score:2)
Not all workloads translate to the specialised SIMD arrays that are modern GPUs. There is still a need for large arrays [intel.com] of more general-purpose computing, as well as a need for fast single core computers (not all jobs can be computed in parallel either).
I write visual effects software in C++ and in OpenCL. Where OpenCL is practical, GPUs can be 10x faster, but there are still many cases when it's too awkward or has too much overhead to use a GPU.
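A toy model of that trade-off, with completely made-up numbers (none of these constants describe a real CPU, GPU, or bus; they only capture the shape of the problem): for small inputs, the kernel-launch overhead and the round trip over the bus can swallow the 10x compute advantage.

```python
# Toy model of the CPU-vs-GPU trade-off described above. Every constant here is
# an assumed ballpark figure for illustration, not a measurement of real hardware.

BUS_BANDWIDTH = 6e9          # bytes/sec to and from the accelerator (assumed)
LAUNCH_OVERHEAD = 50e-6      # fixed seconds per kernel launch (assumed)
CPU_RATE = 10e9              # CPU floating point ops/sec (assumed)
GPU_RATE = 100e9             # GPU floating point ops/sec (assumed 10x the CPU)

def cpu_time(n, flops_per_element):
    return n * flops_per_element / CPU_RATE

def gpu_time(n, flops_per_element, bytes_per_element=8):
    transfer = 2 * n * bytes_per_element / BUS_BANDWIDTH   # copy in, copy out
    compute = n * flops_per_element / GPU_RATE
    return LAUNCH_OVERHEAD + transfer + compute

# 200 floating point ops per element: heavy enough that the GPU can win at scale.
for n in (1_000, 100_000, 10_000_000):
    c, g = cpu_time(n, 200), gpu_time(n, 200)
    print(f"{n:>10,} elements: CPU {c*1e3:8.3f} ms, GPU {g*1e3:8.3f} ms "
          f"-> {'GPU' if g < c else 'CPU'} wins")
```

With only a handful of operations per element, the transfer alone can cost more than just doing the work on the CPU, which is exactly the "too much overhead" case mentioned above.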
Why would the Matrix or Skynet... (Score:2)
...make accurate weather prediction any less necessary?
Re: (Score:2)
In the Matrix, the weather predicts you!
Re: (Score:2)
Okay, that's the funniest "In Soviet Russia..." joke I've ever read. Mod parent +1 funny!
Re: (Score:2)
I used to live in Phoenix. It's dry and it's hot. Phoenix is in the middle of a valley surrounded by mountains, so storm systems tend to wrap around the city rather than pass overhead. With the exact same weather every day, it's hilarious watching the weather forecasters try futilely to come up with new and exciting ways to describe that day's weather.
Re: (Score:1)
Don't worry, you'll get lots of nice storms once they strip-mine a few of those mountains.
Outdone by the Japanese (Score:2)
Looks like someone is a sore looser.
Re: (Score:2)
Sore looser? That sounds.....kinky.
lol...sorry, Spelling Nazi at your service ;)
Re: (Score:2)
Ok, with that out of the way: unlike Fujitsu, who added extra SIMD co-processors to their CPUs SPECIFICALLY for the K computer, what Intel does not currently do is offer CPUs designed specifically for HPC. Now of course a lot of advancements in general-purpose CPUs can be applied to HPC. Overall, the Earth Simulator cycle seems to be repeating itself: a Japanese company creates an incredibly fast computer that makes heavy use of vector processors, then cannot sustain that lead once commodity hardware catches up.
Try this memory tool... (Score:1)
"Loose as a goose, and you never lose." :-)
That's how I learned it as a youngster.
English can be tricky and crazy that way.
Congrats on Japan taking the #1 spot!
Just think... (Score:2)
Just think how much they could make mining Bitcoins on this thing... Oh, actually, hang on...
Re: (Score:2)
Wasn't 2011 supposed to be the year Netburst (Score:2)
Scaled to 10 GHz? [geek.com] (the comments are fun to read)
It's hard to take a claim like this seriously after that famous prediction. Oh well. At least they redeemed themselves with the Core architecture. In my little book, anyway.
Re: (Score:2)
I love the comments there. Compared to the consensus of the commenters (circa 2000, anyway -- by 2005 they started approaching conventional wisdom), Intel was extremely conservative.
Looking at the other predictions thats pretty good (Score:1)
When you look at the number of people who still believed in the clock-frequency version of Moore's law, Intel actually got it pretty accurate.
They were only off by a factor of two -- compare that to the other numbers being thrown around: 128GHz!!!! Though doubling it again for consumer chips is going to take at least 10 more years.
When they are promising those kinds of improvements for 2018, I'll let them be an order of magnitude out.
Re: (Score:2)
Got to love all the idiots lambasting Brian L's sarcasm as if he were serious.. *facepalm*.
Re: (Score:2)
Well Netburst has been pretty much replaced with Core, but the new 10-core Xeons [intel.com] run at 2.4GHz, so that's 24GHz of computer power on a single chip.
Die size in square meters (Score:3)
If my calculations are correct, 1 exaflop with today's best hardware translates to roughly 2000 m^2 worth of CPU area. That's a little less than half an American football field.
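Showing the arithmetic behind that estimate (the per-chip numbers below are assumptions picked as a 2011-era ballpark, roughly 200 double-precision GFLOPS on a ~400 mm^2 die, not the specs of any particular product):

```python
# Rough silicon-area estimate for 1 exaflops. The per-chip figures are assumed
# ballpark values for a 2011-era high-end part, not any specific product's specs.

TARGET_FLOPS = 1e18        # 1 exaflops
FLOPS_PER_CHIP = 200e9     # ~200 double-precision GFLOPS per chip (assumed)
DIE_AREA_MM2 = 400.0       # ~400 mm^2 per die (assumed)

chips_needed = TARGET_FLOPS / FLOPS_PER_CHIP
total_area_m2 = chips_needed * DIE_AREA_MM2 * 1e-6   # mm^2 -> m^2

print(f"Chips needed:   {chips_needed:,.0f}")         # ~5,000,000
print(f"Total die area: {total_area_m2:,.0f} m^2")    # ~2,000 m^2
# An American football field (with end zones) is roughly 5,350 m^2,
# so ~2,000 m^2 is indeed a bit less than half of one.
```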
Still, I'm not sure if it's enough for what I want, namely to create new lifeforms, artificially evolved to, say, live on a partially terraformed Mars. That, or give teens genetic upgrades for irises that change colour with their mood. The latter seems more likely to meet with commercial success.
Re: (Score:2)
Reportedly a little less than half an American football field.
or - in a more useful and recognized measurement - almost 0.8% of the LoC's floor space
80 cores (Score:2)
In 2007, they were saying Intel would have 80-core CPUs in 2011.
It's 2011 now, where can I buy one?
Re: (Score:2)
TFA probably uses this as source:
http://newsroom.intel.com/community/intel_newsroom/blog/2011/06/20/intel-equipped-to-lead-industry-to-era-of-exascale-computing [intel.com]
There are 50 cores in that Knights Ferry. They could probably go to 80 by year's end.
Re: (Score:1)
In 2007, they were saying Intel would have 80-core CPUs in 2011.
It's 2011 now, where can I buy one?
Intel showed a working 80 core research processor in 2007. Maybe that's what you heard about? It was used by researchers in the fields of distributed and parallel computing to develop new programming and compiler optimizations and operating system enhancements.
https://secure.wikimedia.org/wikipedia/en/wiki/Teraflops_Research_Chip [wikimedia.org]
Intel has since released a second-generation research processor for distributed computing and cloud research called SCC, formerly known as Rock Creek. It is also available to universities.
Re: (Score:3)
It is "one exaflop" and "1000 petaflops". FLOPS still meanst FLoating point Operation(s) Per Second. http://en.wikipedia.org/wiki/FLOPS [wikipedia.org]
Re: (Score:2)
"one exaflop" means 10^18 floating point operations, with no time basis.
"1000 petaflops" means 10^18 floating point operations _per second_.
The 's' is not a pluralizing suffix, it's part of the FLOPS acronym. In this usage, it should be "one exaflops", or "one exaFLOPS" to capitalize properly.
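A two-line sanity check of the prefixes being argued about here (just SI powers of ten, nothing more):

```python
# SI prefixes as powers of ten; confirms the conversions used in the story.
peta, exa, zetta = 1e15, 1e18, 1e21

assert 1000 * peta == exa           # 1000 petaflops = 1 exaflops
assert zetta == 1_000_000 * peta    # 1 zettaflops = one million petaflops
print("prefix arithmetic checks out")
```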
Not very likely (Score:1)
http://spectrum.ieee.org/computing/hardware/nextgeneration-supercomputers/0
This is the NSA (Score:2)
"In 2008 James Bamford's The Shadow Factory reported that NSA told the Pentagon it would need an exaflop computer by 2018."
From Wikipedia.