Supercomputing

Ask Slashdot: Best Bang-for-the-Buck HPC Solution? 150

An anonymous reader writes: We are looking into procuring a FEA/CFD machine for our small company. While I know workstations well, the multi-socket rack cluster solutions are foreign to me. On one end of the spectrum, there are companies like HP and Cray that offer impressive setups for millions of dollars (out of our league). On the other end, there are quad-socket mobos from Supermicro and Intel that take 8- to 18-core CPUs costing thousands of dollars apiece.

Where do we go from here? Is it even reasonable to order $50k worth of components and put together our own high-performance, reasonably-priced blade cluster? Or is this folly, best left to experts? Who are these experts if we need them?

And what is the better choice here: 16-core Opterons at 2.6 GHz, or 8-core Xeons at 3.4 GHz? Are power and thermals limiting factors? (A full rack cabinet would consume something like 25 kW, it seems.) There seems to be precious little straightforward information about this on the net.
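For the power question, a rough back-of-envelope sketch in Python follows; the node count, CPU TDP, and per-node overhead figures are illustrative assumptions, not vendor specifications.

    # Rough rack power estimate for a hypothetical quad-socket cluster.
    # All figures are illustrative assumptions, not vendor numbers.
    nodes_per_rack = 20          # assume 2U quad-socket nodes in a 42U rack
    sockets_per_node = 4
    cpu_tdp_watts = 140          # a mid-range 8- to 16-core server CPU (assumed)
    other_node_watts = 250       # RAM, drives, fans, PSU losses (assumed)

    node_watts = sockets_per_node * cpu_tdp_watts + other_node_watts
    rack_watts = nodes_per_rack * node_watts
    print(f"Per node: {node_watts} W")               # 810 W with these assumptions
    print(f"Per rack: {rack_watts / 1000:.1f} kW")   # ~16 kW; denser nodes or
                                                     # accelerators push toward 25 kW

Under assumptions like these, a fully populated rack lands in the 15-25 kW range, which matches the submitter's estimate and is well beyond ordinary office power and cooling.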
Supercomputing

Supercomputing Cluster Immersed In Oil Yields Extreme Efficiency 67

1sockchuck writes: A new supercomputing cluster immersed in tanks of dielectric fluid has posted extreme efficiency ratings. The Vienna Scientific Cluster 3 combines several efficiency techniques to create a system that is stingy in its use of power, cooling and water. VSC3 recorded a PUE (Power Usage Effectiveness) of 1.02, putting it in the realm of data centers run by Google and Facebook. The system avoids the use of chillers and air handlers, and doesn't require any water to cool the fluid in the cooling tanks. Limiting water use is a growing priority for data center operators, as cooling towers can consume large volumes of water. The VSC3 system packs 600 teraflops of computing power into 1,000 square feet of floor space.
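For context, PUE is the ratio of total facility power to the power delivered to the IT equipment itself, so a PUE of 1.02 means only about 2% overhead for cooling and power distribution. A minimal illustration in Python, with assumed (not published) load figures:

    # PUE = total facility power / IT equipment power.
    # The load figures below are assumptions chosen only to illustrate the ratio.
    it_load_kw = 600.0              # assumed compute (IT) load
    overhead_kw = 12.0              # assumed cooling pumps + distribution losses

    pue = (it_load_kw + overhead_kw) / it_load_kw
    print(f"PUE = {pue:.2f}")       # 1.02 -> roughly 2% overhead beyond the IT load

By comparison, conventional air-cooled data centers are often cited with PUE values in the 1.5-2.0 range, meaning 50-100% extra power spent on cooling and distribution.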
AMD

AMD Outlines Plans For Zen-Based Processors, First Due In 2016 166

crookedvulture writes: AMD laid out its plans for processors based on its all-new Zen microarchitecture today, promising 40% higher performance per clock from the new x86 CPU core. Zen will use simultaneous multithreading to execute two threads per core, and it will be built using "3D" FinFETs. The first chips are due to hit high-end desktops and servers next year. In 2017, Zen will combine with integrated graphics in smaller APUs designed for desktops and notebooks. AMD also plans to produce a high-performance server APU with a "transformational memory architecture," likely similar to the on-package DRAM being developed for the company's discrete graphics processors. This chip could give AMD a credible challenger in the HPC and supercomputing markets—and it could also make its way into laptops and desktops.
Stats

Humans Dominating Poker Super Computer 93

New submitter IoTdude writes: The Claudico supercomputer uses an algorithm designed to cope with the gargantuan number of possible decisions in Heads-Up No-Limit Texas Hold'em. Claudico also updates its strategy as it goes along, but its basic approach to the game involves getting into every hand by calling bets. And it's not working out so far. Halfway through the competition, the four human pros had a cumulative lead of 626,892 chips. Though much could change in the week remaining, a lead of around 600,000 chips is considered statistically significant.
Supercomputing

Nuclear Fusion Simulator Among Software Picked For US's Summit Supercomputer 57

An anonymous reader writes: Today, The Register has learned of 13 science projects approved by boffins at the US Department of Energy to run on the 300-petaFLOPS Summit. These software packages, selected for the Center for Accelerated Application Readiness (CAAR) program, will be ported to the massively parallel machine in the hope of making full use of the supercomputer's architecture. They range from astrophysics, biophysics, chemistry, and climate modeling to combustion engineering, materials science, nuclear physics, plasma physics and seismology.
Intel

US Blocks Intel From Selling Xeon Chips To Chinese Supercomputer Projects 229

itwbennett writes: U.S. government agencies have stopped Intel from selling microprocessors for China's supercomputers, apparently reflecting concern about their use in nuclear tests. In February, four supercomputing institutions in China were placed on a U.S. government list that effectively bans them from receiving certain U.S. exports. The institutions were involved in building Tianhe-2 and Tianhe-1A, both of which have allegedly been used for 'nuclear explosive activities,' according to a notice (PDF) posted by the U.S. Department of Commerce. Intel has been selling its Xeon chips to Chinese supercomputers for years, so the ban represents a blow to its business.
Intel

US Pens $200 Million Deal For Massive Nuclear Security-Focused Supercomputer 74

An anonymous reader writes: For the first time in over twenty years of supercomputing history, a chipmaker [Intel] has been awarded the contract to build a leading-edge national computing resource. This machine, expected to reach a peak performance of 180 petaflops, will provide massive compute power to Argonne National Laboratory, which will receive the HPC gear in 2018. Supercomputer maker Cray, which has itself had a remarkable couple of years contract-wise in government and commercial spheres, will be the integrator and manufacturer of the "Aurora" super, a next-generation variant of its "Shasta" supercomputer line. The new $200 million supercomputer is set to be installed at Argonne's Leadership Computing Facility in 2018, rounding out a trio of systems aimed at bolstering nuclear security initiatives as well as pushing the performance of key technical computing applications valued by the Department of Energy and other agencies.
Supercomputing

NSF Commits $16M To Build Cloud-Based and Data-Intensive Supercomputers 29

aarondubrow writes: As supercomputing becomes central to the work and progress of researchers in all fields, new kinds of computing resources and more inclusive modes of interaction are required. The National Science Foundation announced $16M in awards to support two new supercomputing acquisitions for the open science community. The systems — "Bridges" at the Pittsburgh Supercomputing Center and "Jetstream," co-located at the Indiana University Pervasive Technology Institute and The University of Texas at Austin's Texas Advanced Computing Center — respond to the needs of the scientific computing community for more high-end, large-scale computing resources while helping to create a more inclusive computing environment for science and engineering.

Reader 1sockchuck adds this article about why funding for the development of supercomputers is more important than ever: America's high-performance computing (HPC) community faces funding challenges and growing competition from China and other countries. At last week's SC14 conference, leading researchers focused on outlining the societal benefits of their work, and how it touches the daily lives of Americans. "When we talk at these conferences, we tend to talk to ourselves," said Wilf Pinfold, director of research and advanced technology development at Intel Federal. "We don't do a good job communicating the importance of what we do to a broader community." Why the focus on messaging? Funding for American supercomputing has been driven by the U.S. government, which is in a transition with implications for HPC funding. As ComputerWorld notes, climate change skeptic Ted Cruz is rumored to be in line to chair a Senate committee that oversees NASA and the NSF.
Supercomputing

Does Being First Still Matter In America? 247

dcblogs writes: At the supercomputing conference SC14 this week, a U.S. Dept. of Energy official said the government has set a goal of 2023 as its delivery date for an exascale system. It may be taking a risky path with that amount of lead time because of increasing international competition. There was a time when the U.S. didn't settle for second place. President John F. Kennedy delivered his famous "we choose to go to the moon" speech in 1962, and seven years later a man walked on the moon. The U.S. exascale goal is nine years away. China, Europe and Japan all have major exascale efforts, and the U.S. has already lost ground in supercomputing. The European forecast of Hurricane Sandy in 2012 was so far ahead of U.S. models in predicting the storm's path that the National Oceanic and Atmospheric Administration was called before Congress to explain how it happened. It was told by a U.S. official that NOAA wasn't keeping up in computational capability. It's still not keeping up. Cliff Mass, a professor of meteorology at the University of Washington, wrote on his blog last month that the U.S. is "rapidly falling behind leading weather prediction centers around the world" because it has yet to catch up with Europe in computational capability. That criticism followed the U.K. Met Office's recent $128 million purchase of a Cray supercomputer.
Supercomputing

US DOE Sets Sights On 300 Petaflop Supercomputer 127

dcblogs writes: U.S. officials on Friday announced plans to spend $325 million on two new supercomputers, one of which may eventually be built to support speeds of up to 300 petaflops. The U.S. Department of Energy, the major funder of supercomputers used for scientific research, wants to have the two systems – each with a base speed of 150 petaflops – possibly running by 2017. Going beyond the base speed to reach 300 petaflops will take additional government approvals. If the world stands still, the U.S. may conceivably regain the lead in supercomputing speed from China with these new systems. How adequate this planned investment will look three years from now is an open question. Lawmakers weren't reading from the same script as U.S. Energy Secretary Ernest Moniz when it came to assessing the U.S.'s place in the supercomputing world. Moniz said the awards "will ensure the United States retains global leadership in supercomputing." But Rep. Chuck Fleischmann (R-Tenn.) put U.S. leadership in the past tense: "Supercomputing is one of those things that we can step up and lead the world again," he said.
Supercomputing

Researchers Simulate Monster EF5 Tornado 61

New submitter Orp writes: I am a member of a research team that created a supercell thunderstorm simulation that is getting a lot of attention. Presented at the 27th Annual Severe Local Storms Conference in Madison, Wisconsin, Leigh Orf's talk was produced entirely as high-def video and put on YouTube shortly after the presentation. In the simulation, the storm's updraft is so strong that it essentially peels rain-cooled air near the surface upward and into the updraft, which appears to play a key role in maintaining the tornado. The simulation was based upon the environment that produced the May 24, 2011 outbreak, which included a long-track EF5 tornado near El Reno, Oklahoma (not to be confused with the May 31, 2013 EF5 tornado that killed three storm researchers).
Earth

Interviews: Ask CMI Director Alex King About Rare Earth Mineral Supplies 62

The modern electronics industry relies on inputs and supply chains, both material and technological, and none of them are easy to bypass. These include, besides expertise and manufacturing facilities, the actual materials that go into electronic components. Some of them are as common as silicon; rare earth minerals, not so much. One story linked from Slashdot a few years back predicted that then-known supplies would be exhausted by 2017, though such predictions of scarcity are notoriously hard to get right, as people (and prices) adjust to changes in supply. There's no denying that there's been a crunch on rare earths, though, over the last several years. The minerals themselves aren't necessarily rare in an absolute sense, but they're expensive to extract. The most economically viable deposits are found in China, and rising prices for them as exports to the U.S., the EU, and Japan have raised political hackles. At the same time, those rising prices have spurred exploration and reexamination of known deposits off the coast of Japan, in the midwestern U.S., and elsewhere.

Alex King is director of the Critical Materials Institute, a part of the U.S. Department of Energy's Ames Laboratory. CMI is heavily involved in making rare earth minerals slightly less rare by means of supercomputer analysis; researchers there are approaching the ongoing crunch by looking both for substitute materials for things like gallium, indium, and tantalum, and easier ways of separating out the individual rare earths (a difficult process). One team there is working with "ligands – molecules that attach with a specific rare-earth – that allow metallurgists to extract elements with minimal contamination from surrounding minerals" to simplify the extraction process. We'll be talking with King soon; what questions would you like to see posed? (This 18-minute TED talk from King is worth watching first, as is this Q&A.)
Supercomputing

16-Petaflop, £97m Cray To Replace IBM At UK Meteorological Office 125

Memetic writes: The UK weather forecasting service is replacing its IBM supercomputer with a Cray XC40 containing 17 petabytes of storage and capable of 16 petaFLOPS. This is Cray's biggest contract outside the U.S. With 480,000 CPU cores, it should be 13 times faster than the current system. It will weigh 140 tons. The aim is to enable more accurate modeling of the unstable UK weather, with UK-wide forecasts at a resolution of 1.5 km run hourly, rather than every three hours as currently happens. (Here's a similar system from the U.S.)
Supercomputing

First Demonstration of Artificial Intelligence On a Quantum Computer 98

KentuckyFC writes: Machine learning algorithms use a training dataset to learn how to recognize features in images and then use this 'knowledge' to spot the same features in new images. The computational complexity of this task is such that the time required to solve it grows polynomially with the number of images in the training set and the complexity of the "learned" feature. So it's no surprise that quantum computers ought to be able to speed up this process dramatically. Indeed, a group of theoretical physicists last year designed a quantum algorithm that solves this problem in logarithmic rather than polynomial time, a significant improvement.
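To see what that difference in scaling means in practice, here is a toy Python comparison of polynomial versus logarithmic growth; the exponent and constants are arbitrary and chosen purely for illustration, not taken from the paper.

    import math

    # Toy comparison of polynomial vs. logarithmic scaling in the training-set size N.
    # Only the growth rates matter; the constants are arbitrary.
    def classical_cost(n, k=2):
        return n ** k            # polynomial, e.g. O(N^2)

    def quantum_cost(n):
        return math.log2(n)      # logarithmic, O(log N)

    for n in (10**3, 10**6, 10**9):
        print(f"N={n:>10}  poly={classical_cost(n):.1e}  log={quantum_cost(n):.1f}")
    # At N = 10^9 the polynomial term is ~10^18 while the logarithmic one is ~30,
    # which is why a logarithmic-time algorithm is such a significant improvement.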

Now, a Chinese team has successfully implemented this artificial intelligence algorithm on a working quantum computer, for the first time. The information processor is a standard nuclear magnetic resonance quantum computer capable of handling 4 qubits. The team trained it to recognize the difference between the characters '6' and '9' and then asked it to classify a set of handwritten 6s and 9s accordingly, which it did successfully. The team says this is the first time that this kind of artificial intelligence has ever been demonstrated on a quantum computer and opens the way to the more rapid processing of other big data sets — provided, of course, that physicists can build more powerful quantum computers.
Software

Brown Dog: a Search Engine For the Other 99 Percent (of Data) 23

aarondubrow writes: We've all experienced the frustration of trying to access information on websites, only to find that the data is trapped in outdated, difficult-to-read file formats and that metadata — the critical data about the data, such as when and how and by whom it was produced — is nonexistent. Led by Kenton McHenry, a team at the National Center for Supercomputing Applications is working to change that. Recipients in 2013 of a $10 million, five-year award from the National Science Foundation, the team is developing software that allows researchers to manage and make sense of vast amounts of digital scientific data that is currently trapped in outdated file formats. The NCSA team recently demonstrated two publicly available services that make the contents of uncurated data collections accessible.
Supercomputing

Supercomputing Upgrade Produces High-Resolution Storm Forecasts 77

dcblogs writes: A supercomputer upgrade is paying off for the U.S. National Weather Service, with new high-resolution models that will offer better insight into severe weather. The National Oceanic and Atmospheric Administration, which runs the weather service, put into production two new IBM supercomputers, each rated at 213 teraflops, running Linux on Intel processors. These systems replaced 74-teraflop, four-year-old systems. More computing power means the systems can run more detailed mathematics, increasing the resolution of the maps from 8 miles to 2 miles.
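A rough sense of why resolution increases are so compute-hungry, sketched in Python below: refining the horizontal grid from 8 miles to 2 miles multiplies the number of grid columns, and because the timestep typically has to shrink in proportion, the total work grows even further. This is a simplified back-of-envelope, not NOAA's actual accounting, and it ignores vertical levels, physics packages and I/O.

    # Back-of-envelope: relative cost of refining a weather model's horizontal grid.
    # Simplified illustration; real models also scale with vertical levels, physics
    # schemes, domain size and forecast length.
    old_spacing_miles = 8.0
    new_spacing_miles = 2.0

    refinement = old_spacing_miles / new_spacing_miles   # 4x finer in each direction
    grid_point_factor = refinement ** 2                  # ~16x more horizontal points
    timestep_factor = refinement                         # ~4x more (shorter) timesteps

    print(f"Relative compute cost: ~{grid_point_factor * timestep_factor:.0f}x")  # ~64x

In practice the highest-resolution runs tend to cover shorter forecast ranges or nested domains, so the full factor is not paid across every product.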
Google

Google To Build Quantum Information Processors 72

An anonymous reader writes: The Google Quantum AI Team has announced that it's bringing in a team from the University of California at Santa Barbara to build quantum information processors within the company. "With an integrated hardware group the Quantum AI team will now be able to implement and test new designs for quantum optimization and inference processors based on recent theoretical insights as well as our learnings from the D-Wave quantum annealing architecture." Google will continue to work with D-Wave, but the UC Santa Barbara group brings its own expertise with superconducting qubit arrays.
Cloud

IBM Opens Up Its Watson Supercomputer To Researchers 28

An anonymous reader writes: IBM has announced the "Watson Discovery Advisor," a cloud-based tool that will let researchers comb through massive troves of data, looking for insights and connections. The company says it's a major expansion in capabilities for the Watson Group, which IBM seeded with a $1 billion investment. "Scientific discovery takes us to a different level as a learning system," said Steve Gold, vice president of the Watson Group. "Watson can provide insights into the information independent of the question. The ability to connect the dots opens up a new world of possibilities."
Supercomputing

How a Supercomputer Beat the Scrap Heap and Lived On To Retire In Africa 145

New submitter jorge_salazar (3562633) writes: Pieces of the decommissioned Ranger supercomputer, 40 racks in all, were shipped to researchers in South Africa, Tanzania, and Botswana to help seed their supercomputing aspirations. They say they'll need supercomputers to solve their growing science problems in astronomy, bioinformatics, climate modeling and more. Ranger's own beginnings were described by the co-founder of Sun Microsystems as a "historic moment in petaflop computing."
