China Government Intel The Almighty Buck United States Hardware Technology

US Reveals Details of $500 Million Supercomputer (nytimes.com) 60

An anonymous reader quotes a report from The New York Times: The Department of Energy disclosed details on Monday of one of the most expensive computers being built: a $500 million machine based on Intel and Cray technology that may become crucial in a high-stakes technology race between the United States and China (Warning: source may be paywalled; alternative source). The supercomputer, called Aurora, is a retooling of a development effort first announced in 2015 and is scheduled to be delivered to the Argonne National Laboratory near Chicago in 2021. Lab officials predict it will be the first American machine to reach a milestone called "exascale" performance, surpassing a quintillion calculations per second. That's roughly seven times the speed rating of the most powerful system built to date, or 1,000 times faster than the first "petascale" systems that began arriving in 2008. Backers hope the new machines will let researchers create significantly more accurate simulations of phenomena such as drug responses, climate changes, the inner workings of combustion engines and solar panels.

Aurora, which far exceeds the $200 million price for Summit, represents a record government contract for Intel and a test of its continued leadership in supercomputers. The Silicon Valley giant's popular processors -- the calculating engine for nearly all personal computers and server systems -- power most such machines. But additional accelerator chips are considered essential to reach the very highest speeds, and its rival Nvidia has built a sizable business adapting chips first used with video games for use in supercomputers. The version of Aurora announced in 2015 was based on an Intel accelerator chip that the company later discontinued. A revised plan to seek more ambitious performance targets was announced two years later. Features discussed on Monday include unreleased Intel accelerator chips, a version of its standard Xeon processor, new memory and communications technology and a design that packages chips on top of each other to save space and power.
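The speed comparisons in the summary are easy to sanity-check. A minimal sketch, assuming "exascale" means 1e18 floating-point operations per second, "petascale" means 1e15, and taking roughly 1.43e17 FLOP/s as the peak rating of Summit, the fastest system at the time (that last figure is an assumption, not stated in the article):

```python
# Rough scale comparison of the article's figures.
EXA = 1e18           # "exascale": a quintillion calculations per second
PETA = 1e15          # "petascale" systems that began arriving in 2008
SUMMIT_PEAK = 1.43e17  # assumed peak rating of Summit, the fastest to date

print(EXA / SUMMIT_PEAK)  # ~7x: "roughly seven times the speed rating"
print(EXA / PETA)         # 1000x: "1,000 times faster" than petascale
```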

Comments Filter:
  • by Anonymous Coward

    Backers hope the new machines will let researchers create significantly more accurate simulations of phenomena such as drug responses, climate changes, the inner workings of combustion engines and solar panels.

    Climate change? Solar panels? This project is as good as cancelled.

    • Backers hope the new machines will let researchers create significantly more accurate simulations of phenomena such as drug responses, climate changes, the inner workings of combustion engines and solar panels.

      Climate change? Solar panels? This project is as good as cancelled.

      I don't know... Supercomputers are key to current state-of-the-art nuclear testing, since you can't just dig a hole and set your latest weapons off underground due to the nuclear test ban treaty. I've got a feeling Trump will let the climate change studies slide to get better weapons. I call it a win-win myself.

  • by penandpaper ( 2463226 ) on Monday March 18, 2019 @04:17PM (#58294694) Journal

    The new fad is a supercomputing race! Who needs to go to space when you can simulate it in your backyard.

    • by Anonymous Coward

      It's been on for the US since the Japanese captured the Top500 with their Earth Simulator. China's machines gave another boost to the effort as the rest of the politicians woke up from their dreams of superior power. Exascale efforts form a convenient backdrop for the race, since it has to be done anyway to get to the next level in science and applications. No personalized medicine for anybody without machines 10-100 times exascale, at least.

  • So since all "computers" are Turing complete, they can all produce the same results - it's just a matter of processing time and having enough memory to hold the needed data. So this computer can't actually accomplish anything more than before. It can just fulfill a greater workload. Or am I mistaken?

    • Re: (Score:2, Insightful)

      by Anonymous Coward

      The ability to work with a larger dataset in "realtime" means better data trend resolution when you're simulating something and more confidence in the result. Scaling-down / iterating is possible but can introduce artifacts.

      So you're 1/2 right. Data superset > data subset.

    • So since all "computers" are Turing complete, they can all produce the same results - it's just a matter of processing time and having enough memory to hold the needed data. So this computer can't actually accomplish anything more than before. It can just fulfill a greater workload. Or am I mistaken?

      That depends. If you take into account the projected age of the universe, you might find that doing the same calculations using giant fields of wooden rods, while equivalent, may not be practical.

      So "this computer can't actually accomplish anything more than before" is as true as saying a huge truck is not an accomplishment in transport, because you could equally well divide your cargo into tiny pieces that can be carried atop snails and then reassembled when they reach the destination.

  • by Anonymous Coward

    You could name a whole list of uses for such a computer, but there is only one with the real budgets: the one that currently is only allowed to be tested in simulations, and that use is what the Department of Energy is all about.

  • >> Argonne National Laboratory near Chicago

    You know what else is next to Chicago? Aurora, Illinois. (Wayne's World's home - and 37 miles from Argonne.)
  • I am tempted to go through all the techie and sci-fi uses of the term "Aurora" and make a single record of them all. From spaceship names to project codenames, it seems to be everywhere.

    And the female name Kira, Qi'ra, etc.

    • I bet this new super computer could go through all that data for you pretty darn fast.

    • I am tempted to go through all the techie and sci-fi uses of the term "Aurora" and make a single record of them all. From spaceship names to project codenames, it seems to be everywhere.

      And the female name Kira, Qi'ra, etc.

      Aurora is the goddess of Dawn. So, in any techie or sci fi context, it suggests fresh beginnings, a wonderful start to a new era.

      Basically, 'Sunrise' from Also Sprach Zarathustra.

    The Department of Energy has so many computers that I would be shocked if any of them are working 24/7. I can't believe they really need yet another supercomputer to perform stupid tasks. This is not about science; it's about giving money to friends and the pride of holding the fastest computer.
    • by CanadianMacFan ( 1900244 ) on Monday March 18, 2019 @07:25PM (#58295594)

      Larger computers allow models to run faster, so you can repeat more runs to see the probable impacts the model predicts. They also allow finer-scale detail in the model and let modellers add new variables. The amount of computation goes up massively if you are mapping the climate and you decrease the size of your grid element (a block where weather conditions are assumed to be the same). New data is coming in from the field all the time, and our understanding of how these systems work is always improving. With better computers, more variables may be added to a model to make it more accurate; those variables may have used up too much computer time on the older systems.

      I'm not ruling out that there's some national pride at work, and the awarding of the contract to a specific company might well have helped someone out.
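The grid-refinement point above can be put in rough numbers with a toy cost model. This is an illustrative sketch, not from the article: it assumes a 3-D grid and a CFL-style stability constraint that shrinks the timestep along with the grid spacing.

```python
# Toy cost model for refining a climate-model grid (illustrative only).
# Shrinking the grid spacing by a factor f gives f**3 times the cells in
# three dimensions, and stability (CFL) conditions typically force a
# timestep about f times smaller, so total work grows by roughly f**4.
def relative_cost(refinement_factor: float, dims: int = 3) -> float:
    """Work relative to the base grid when spacing shrinks by `refinement_factor`."""
    cells = refinement_factor ** dims  # more grid elements to update
    steps = refinement_factor          # more (smaller) timesteps to take
    return cells * steps

print(relative_cost(2))   # halving the spacing: 16x the work
print(relative_cost(10))  # 10x finer grid: 10000x the work
```

This kind of steep polynomial growth is one reason each jump in resolution tends to demand a new generation of machines rather than just longer runs on the old ones.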

      I expect most of the big farms are used most of the time; the ones I know of are. A lot of these systems are used for simulations that take days or longer to run, so the jobs are just queued up.

      At some point older computers become uneconomical to run because of the power budget.

  • by CanadianMacFan ( 1900244 ) on Monday March 18, 2019 @07:32PM (#58295636)

    It runs at an expected 1 exaflops, but with the Spectre mitigations it'll be running at just under 900 petaflops.

  • I'd like to see the power requirements.

    A friend of mine worked on the Oak Ridge Summit supercomputer that was brought online last year. The power requirements for running the thing are astonishing in and of themselves: more or less a small city's worth of electricity.

    • by necro81 ( 917438 )

      I'd like to see the power requirements.

      Indeed: the power requirements for supercomputers have been getting crazy, and the forecasts for exascale computing are worse. Interestingly, the power required for the computation isn't all that bad; what really kills you is the power burned in moving data around, and the power required to keep everything cool. The DoE's exascale computing research has a goal of a machine in the early 2020s that only requires 20-30 MW.

      It is a few years old, but IEEE Spectrum pr [ieee.org]

  • According to a report [theregister.co.uk] by The Register, Fujitsu will use ARM to build an exascale supercomputer, which is called Post K.

    According to a report [insidehpc.com] by InsideHPC, the supercomputer will debut in 2021, which is the same year in which Aurora will debut.

    According to another report [insidehpc.com] by InsideHPC, the ARM supercomputer will be tuned for high performance in machine-learning applications.

    Fearing an economic Pearl Harbor, Washington quickly asked Cray to develop an answer to Post K. Aurora is the answer.
