Cray Is Building a Supercomputer To Manage the US' Nuclear Stockpile (engadget.com) 65
An anonymous reader quotes a report from Engadget: The U.S. Department of Energy (DOE) and National Nuclear Security Administration (NNSA) have announced they've signed a contract with Cray Computing for the NNSA's first exascale supercomputer, "El Capitan." El Capitan's job will be to perform essential functions for the Stockpile Stewardship Program, which supports U.S. national security missions in ensuring the safety, security and effectiveness of the nation's nuclear stockpile in the absence of underground testing. Developed as part of the second phase of the Collaboration of Oak Ridge, Argonne and Livermore (CORAL-2) procurement, the computer will be used to make critical assessments necessary for addressing evolving threats to national security and other issues such as non-proliferation and nuclear counterterrorism.
El Capitan will have a peak performance of more than 1.5 exaflops -- which is 1.5 quintillion calculations per second. It'll run applications 50 times faster than Lawrence Livermore National Laboratory's (LLNL) Sequoia system and 10 times faster than its Sierra system, which is currently the world's second most powerful supercomputer. It'll be four times more energy efficient than Sierra, too. The $600 million El Capitan is expected to go into production by late 2023. "NNSA is modernizing the Nuclear Security Enterprise to face 21st century threats," said Lisa E. Gordon-Hagerty, DOE undersecretary for nuclear security and NNSA administrator. "El Capitan will allow us to be more responsive, innovative and forward-thinking when it comes to maintaining a nuclear deterrent that is second-to-none in a rapidly-evolving threat environment."
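As a quick sanity check on those numbers, here's a minimal back-of-the-envelope sketch in Python; only the 1.5-exaflop figure and the 50x/10x ratios come from the summary, everything else is derived from them:

    # Back-of-the-envelope check on the quoted performance figures.
    # Only the 1.5 exaflop number and the 50x/10x ratios come from the summary;
    # the Sierra/Sequoia values below are simply back-calculated from them.

    EL_CAPITAN_PEAK_FLOPS = 1.5e18  # 1.5 exaflops = 1.5 quintillion calculations/sec

    implied_sierra = EL_CAPITAN_PEAK_FLOPS / 10    # "10 times faster than Sierra"
    implied_sequoia = EL_CAPITAN_PEAK_FLOPS / 50   # "50 times faster than Sequoia"

    print(f"El Capitan peak: {EL_CAPITAN_PEAK_FLOPS / 1e18:.1f} exaflops")
    print(f"Implied Sierra : {implied_sierra / 1e15:.0f} petaflops")
    print(f"Implied Sequoia: {implied_sequoia / 1e15:.0f} petaflops")

That works out to roughly 150 and 30 petaflops; Sierra's published peak is about 125 petaflops and Sequoia's about 20, so the 50x/10x figures evidently describe expected application speedups rather than exact peak ratios.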
Will this be Epyc powered? (Score:3)
I wonder if this will be Epyc powered like Frontier? I assume so since Cray has formed a strategic partnership with AMD.
What problem would this be solving? (Score:3, Interesting)
Re: (Score:2)
Development of new weapons, since the ban on testing.
Re: (Score:2)
Re: (Score:2)
to run war games! (Score:2)
to run war games!
Re: (Score:2)
Not until you have it play itself in tic-tac-toe will the new supercomputer truly become enlightened and realize there is no winner in a nuclear war.
Until then it's just playing and preparing.
we can win USA first strike on NK! (Score:2)
we can win USA first strike on NK!
Re: (Score:2)
Re: (Score:2)
I came here to say this
Re: (Score:3)
Nuclear weapons are constantly degrading. The fissile material is slowly transmuting into "daughter" elements, thereby changing how the bomb will behave. The only way to test this, short of actually blowing them up, is simulation. Lots and lots of simulation.
It's not the fissile materials, it's the explosive (Score:5, Informative)
Pu-239 has a half life of around 24,000 years - yes, it is decaying, but over the couple of hundred years we're realistically going to have bombs, it's such a small fraction as to be negligible in terms of nuclear physics wrt the actual explosive fission process. The fiddly, precise bit is getting the material evenly compressed to a supercritical state; after that it's not sensitive to a bit of impurity. Bomb pits are already alloyed with 3% gallium for stability reasons; if that doesn't affect the behavior, a few hundredths of a percent of decay product isn't going to matter a whit (IIRC Pu-239 decays to also-fissile U-235 anyway).
The bigger problem is in the rest of the components, in particular the high explosives...they tend not to age well over the course of a few decades; their decay process is known and monitored, then modeled in the computers. There are probably also issues with stuff like the X-ray reflectors and interstage materials decaying, and maybe the lithium deuteride gets stale too, but the fissile material is for practical purposes the only completely stable part of the damn thing.
Re: (Score:3)
Yes, but the tritium, which is used as a fusion booster in many designs, has a half-life of 12.3 years. That's why DOE/NNSA was looking for a site to produce tritium using accelerators or other methods about twenty years ago.
Re: It's not the fissile materials, it's the explo (Score:2)
Oh, the tritium is absolutely a very limited lifespan component, but it's a known factor that's designed to be easily replenished/replaced...not something you'd need an exaflop-scale system to simulate.
Re: (Score:2)
I vaguely recall that they wanted, also, to simulate the weapons' performance given various decay states of the tritium (that is, without replenishment). That might or might not require lots of computing power.
Re: It's not the fissile materials, it's the expl (Score:2)
Off the top of my head, I'd say that would be some (relatively) simple math: x percent of the tritium has decayed to He-3, which is definitely a bad thing since it's a neutron absorber, but you might be able to just factor that into the math describing the fusion process (https://nuclearweaponarchive.org/Nwfaq/Nfaq4-4.html#Nfaq4.4) and get a good idea of yield reductions or outright fizzle given a certain decay level of the tritium.
Or I absolutely could be talking out of my only slightly informed ass an
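The "simple math" being described here is just first-order exponential decay. A minimal sketch, using the half-lives mentioned in this thread (12.3 years for tritium, ~24,000 years for Pu-239) and arbitrary storage times purely for illustration:

    def decayed_fraction(years, half_life):
        """Fraction of the original material that has decayed after `years`."""
        return 1.0 - 2.0 ** (-years / half_life)

    T_HALF_TRITIUM = 12.3     # years (figure cited in this thread)
    T_HALF_PU239 = 24_000.0   # years (figure cited in this thread)

    for years in (5, 10, 20, 50):
        h3_buildup = decayed_fraction(years, T_HALF_TRITIUM)  # T -> He-3, a neutron absorber
        pu_decay = decayed_fraction(years, T_HALF_PU239)      # Pu-239 -> U-235
        print(f"{years:3d} yr: {h3_buildup:5.1%} of the tritium gone, "
              f"{pu_decay:6.3%} of the Pu-239 decayed")

At ten years roughly 43% of the tritium has turned into He-3, while only a few hundredths of a percent of the Pu-239 has decayed - which is the asymmetry these comments are getting at. Whether a given He-3 fraction merely trims the yield or causes a fizzle is exactly the sort of coupled question the big simulation codes exist to answer.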
Re: (Score:2)
I don't really know either. Physics was never my strong suit.
Re: (Score:2)
Amongst many other things, the increase in neutron transfer could also degrade some of the surrounding materials differently, either atomically or physically (embrittlement), along with thousands of other factors I'm not even considering.
It's one of those issues that seems simple on the surface, but is probably pretty intricate once you dig into it.
Re: It's not the fissile materials, it's the expl (Score:2)
I'm absolutely sure the aging process is infinitely more complicated than I could ever imagine, but the pit is literally the only part of the warhead that doesn't decay in a meaningful fashion.
Re: (Score:2)
It may be stable atomically, but it's not necessarily stable phase-wise. Plutonium has a number of different solid phases depending on various factors and how it's alloyed. These can vary widely in density, a fact that is likely leveraged in weapons design.
Re: It's not the fissile materials, it's the explo (Score:2)
Absolutely - the 3% gallium very effectively stabilizes the δ phase, which is the ideal allotrope for pit production since it's easily machinable, ductile, fairly stable, and under relatively mild implosion pressure it transitions to α phase which is significantly denser, increasing the implosive efficiency big-time. It does also transition to α at something like 120°C, so there is some concern with inadvertent overtemperature as the weapon ages, but that's more a metallurgic
Re: (Score:1)
Then the US mil can send in contractors to repair and replace parts.
The stockpile at US bases all over the EU is ready for use for decades more.
That's a lot of calculations per device, per decade. The US has a long list of devices to look after.
Re:What problem would this be solving? (Score:5, Informative)
Multiple reasons. Nuclear bombs have been constantly refurbished over the last few decades, and computers are necessary to see if the replacements will work. Conventional explosive triggers have been changed to more stable, "inert" explosives. A substance known as "FOGBANK" was hard to produce, and computer simulations at the time weren't powerful enough to see if a replacement would work. We are also embarking on a redesign and refurbishment process of our nuclear stockpile, so that'll require even more computing to validate design changes.
Re: (Score:2)
Weapons development, as others have said, but also more and more accurate modeling of existing warheads to know how they would perform under various conditions. These kinds of things used to be done by hand, but as you can imagine, the more power you can put behind it, the higher the resolution of the model. They also get used for nuclear research, which helps with reactor designs and other things related to heavy atoms and radiation.
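To put a rough number on "more power means higher resolution": for an explicit 3D simulation the usual scaling is that cell count grows as N^3 and the timestep shrinks roughly linearly with N, so total work goes as about N^4. A sketch with entirely made-up baseline numbers:

    # Rough cost scaling for refining a 3D simulation grid.
    # Assumed (not from the article): an explicit 3D hydro-style code, so
    #   memory ~ N^3  and  work ~ N^4  (the extra factor of N from the timestep).

    BASELINE_N = 512            # hypothetical grid points per axis
    BASELINE_CORE_HOURS = 1e5   # hypothetical cost of the baseline run

    for refine in (1, 2, 4, 8):
        n = BASELINE_N * refine
        core_hours = BASELINE_CORE_HOURS * refine ** 4
        memory_factor = refine ** 3
        print(f"{n:5d}^3 grid: ~{core_hours:,.0f} core-hours, ~{memory_factor}x baseline memory")

Under those assumptions, doubling the linear resolution costs roughly 16x the compute and 8x the memory, which is why each new stewardship machine buys a noticeable but not unbounded jump in model fidelity.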
Colossus? (Score:2)
Re: (Score:2)
And we were discussing QC yesterday and its "imminent," like any day now, probably soon, on-the-horizon deployment and stuff, and then we roll out a goddam classical computer.
Playing catch up ... (Score:2)
... to the Russians again.
Maybe we'll actually make it to the Moon.
Re: (Score:2)
Excel.
Many Windows systems still ran on DOS.
Re: (Score:2)
This is an inventory application isn't it? (Score:3)
Re: (Score:3)
It's to simulate explosions. Typical QA would involve periodically taking a bomb or two and seeing if they still make a satisfying kaboom. Since that's not allowed, you use a giant computer to simulate it instead.
Since such simulations have been happening for decades, one might wonder why you need the biggest computer in the world to do them. A cynic might suspect that it would be useful for developing improvements, as well as maintenance.
Re:This is an inventory application isn't it? (Score:5, Informative)
You don't have to be a cynic. The US is in the early stages of an on-again, off-again, eventually trillion-dollar program to replace its nuclear weapons stockpile. The first of the new warheads, the low-yield W76-2, started production earlier this year.
Re: (Score:1)
See, in America the way we do socialism is with our military. Nobody'll pay for it otherwise. Then you hope some of it trickles down.
Really? (Score:4, Funny)
Re: (Score:1)
OS X 10.11, yeah -- but I wouldn't figure Hackintosh. Since the top entries on the Top500 list these days use literally millions of CPU cores, I was thinking rows and rows of impressive-looking racks with CRAY on the doors, but when you open the doors there are shelves and shelves of Mac Minis (late 2014, the top-end model with the 3.0GHz Core i7-4578U dual-core CPU).
Re:Really? (Score:4, Interesting)
Actually, the early Crays did run MacOS, in a way - they used a Mac to manage them (schedule and load jobs into them, etc.). Basically the Mac provided the UI, because the Cray itself was for pure computation and not to waste time on an operating system or a monitor or such. So they had a Mac to provide all the user interaction, networking and other things.
There was a joke that Apple used a Cray to help design the new Macs, while Cray used Macs to drive them. (And all the associated jokes, like it was so fast it could run an infinite loop in a few seconds.)
Re: (Score:2)
Yes, they need the compute power because they are seeing the spinning beach ball.
Re: (Score:2)
They're actually running Radius Rocketshare [lowendmac.com].
Do they really need a super-machine? (Score:3)
Re: (Score:2)
"El Capitan will allow us to be more responsive, innovative and forward-thinking when it comes to maintaining a nuclear deterrent
that is second-to-none in a rapidly-evolving threat environment." The $600 million El Capitan is expected to go into production by late 2023.
It's just a boondoggle for the military industrial complex.
Supercomputers (Score:4, Interesting)
I used to really be into supercomputers when I was a kid. All the weird architectures and instruction sets. Custom processors to accelerate specific algorithms. Weird OSes that ran on top of other weird OSes. Crazy ECL hardware cooled with Fluorinert. Walls of blinking LEDs and cooling systems with visible waterfalls.
Now they are racks of AMD or Intel blades with nVidia GPUs running customized Linux distributions. The only slightly interesting things are whatever tweaks to OpenMP they are using, or maybe the Myrinet or Infiniband backbones.
Commoditization is great, but it does make things a bit less interesting.
Re: (Score:2)
The top two right now are running on POWER9 chips from IBM. It's a pretty interesting CPU. https://en.wikichip.org/wiki/i... [wikichip.org]
Re: (Score:2)
The P9 will soon have the full attention of Red Hat engineering, which could make that chip even more appealing.
Re:Supercomputers (Score:5, Interesting)
I don't agree with you; there have been plenty of interesting things happening in the field. Some of them are on the software side, but even in hardware there's fun stuff.
Blue Gene machines are not that old and were pretty interesting.
We saw the rise (and fall) of Xeon Phi.
The Chinese system is built around their own weird CPU.
There is plenty of POWER in the field, and serious investigation into using ARM processors.
On the network side, the dragonfly topology is the hot new thing, and it's genuinely new and weird. We are also seeing in-network collectives that could change the performance (and therefore design) of many applications.
The innovations on the storage side are pretty impressive as well. Most recently, burst buffers have been changing the game; being able to support checkpointing had become a significant problem for the storage system.
On the software side, there have been lots of problems that needed to be solved. How do we deal with accelerators? How do we deal with heterogeneous processors? How do we build performance-portable applications? How do we account for dark silicon? How do we manage power budgets to maximize performance? How do we make applications easier to program for non-HPC experts? How do we move past MPI+X? How do we deal with reliability issues that arise from running on hundreds of thousands of processing units? How do we make checkpoint/restart transparent to the programmer?
I think the HPC community has delivered plenty of interesting things.
Re: (Score:2)
I used to really be into supercomputers when I was a kid. All the weird architectures and instruction sets. Custom processors to accelerate specific algorithms. Weird OSes that ran on top of other weird OSes. Crazy ECL hardware cooled with Fluorinert. Walls of blinking LEDs and cooling systems with visible waterfalls.
Now they are racks of AMD or Intel blades with nVidia GPUs running customized Linux distributions. The only slightly interesting things are whatever tweaks to OpenMP they are using, or maybe the Myrinet or Infiniband backbones.
Commoditization is great, but it does make things a bit less interesting.
We're all enjoying a "less interesting" life. What was once something only for the extremely wealthy is now something we find in the trash because something better has come along.
I'll often comment on how we are living in the future. What was thought nearly impossible only a decade or three ago is now a commodity. Maybe some of this is still priced out of the range of many, but it's in the range of a good-sized portion of the public.
The idea of a video phone was thought of as something someone would have to
Re: (Score:2)
I've got no problem with it. Standardization and commoditization saves the taxpayers lots of money.
W.H.O.P.P.E.R (Score:1)
Oh please... (Score:2)
... call it W.O.P.R!!!
Re: (Score:2)
Yeah, but will it run Crysis? (Score:2)
In other news (Score:2)
Cray marketing is clearly a wholly-owned subsidiary of Apple, Inc.
A great Deal IMHO: (Score:2)
We spend money buying and researching ultrafast computers, and paying physicists to use them, rather than conducting nuclear tests with all their proliferation, environmental and geopolitical downsides.
I wish more government programs had this return on investment.
Re: (Score:1)