
US Supercomputer Uses Flash Storage Drives

angry tapir writes "The San Diego Supercomputer Center has built a high-performance computer with solid-state drives, which the center says could help solve science problems faster than systems with traditional hard drives. The flash drives will provide faster data throughput, which should help the supercomputer analyze data an 'order of magnitude faster' than hard drive-based supercomputers, according to Allan Snavely, associate director at SDSC. SDSC intends to use the HPC system — called Dash — to develop new cures for diseases and to understand the development of Earth."
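To make the "order of magnitude faster" claim concrete, here is a back-of-the-envelope sketch in Python. The IOPS figures and the size of the hypothetical data-mining job are illustrative assumptions, not numbers from SDSC or the article; they simply show why flash's random-read performance, rather than raw sequential throughput, is where a ~10x (or larger) gap can open up for data-intensive analysis.

```python
# Back-of-the-envelope comparison of random-read performance.
# All figures are rough, assumed values for illustration -- not measurements of Dash.

HDD_IOPS = 150        # a typical 7,200 rpm disk: roughly 100-200 random reads/s
SSD_IOPS = 10_000     # a 2009-era enterprise flash drive: on the order of 10^4 reads/s

READS = 50_000_000    # hypothetical data-mining job issuing 50M random 4 KB reads

def hours(iops: float, n_reads: int) -> float:
    """Wall-clock hours to complete n_reads random reads at a given IOPS rate."""
    return n_reads / iops / 3600.0

print(f"HDD: {hours(HDD_IOPS, READS):7.1f} h")   # ~92.6 h
print(f"SSD: {hours(SSD_IOPS, READS):7.1f} h")   # ~1.4 h
```

Sequential streaming narrows the gap considerably, which is why the benefit is claimed for data analysis (lots of random lookups) rather than for every workload.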
  • Cost savings? (Score:5, Insightful)

    by gabebear ( 251933 ) on Sunday September 06, 2009 @08:39AM (#29330641) Homepage Journal

    "Hard drives are still the most cost-effective way of hanging on to data," Handy said. But for scientific research and financial services, the results are driven by speed, which makes SSDs makes worth the investment.

Why is the supercomputer ever being turned off? Why not just add more RAM?

SSD is cheaper than DDR (~$3/GB vs. ~$8/GB), but it is also ~100 times slower.

  • by bubbaD ( 182583 ) on Sunday September 06, 2009 @10:51AM (#29331389)

    "But that's okay, I'm sure English is your first/only language." That seems to be a really lame attempt to insult native English users. There's no grammatical rules against "problems to solve with it." Even "To problem solve with it" is acceptable because the rule against split infinitives is considered obsolete and old fashioned. English has amazing flexibility. It is the perl of human languages!

  • Re:Cost savings? (Score:4, Insightful)

    by SpaFF ( 18764 ) on Sunday September 06, 2009 @12:40PM (#29332179) Homepage

There are plenty of reasons why supercomputers have to be shut down... besides the fact that, even with generators and UPSes, facility outages are still a fact of life. What if there is a kernel vulnerability (insert 50 million ksplice replies here... yeah yeah yeah)? What if firmware needs to be updated to fix a problem? You can't just depend on RAM for storage. HPC jobs use input files that are tens of gigabytes and produce output files that can be multiple terabytes. The jobs can run for weeks at a time. In some cases it takes longer to transfer the data to another machine than it takes to generate/process the data. You can't just assume that the machine will stay up to protect that data.
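SpaFF's point about the data outliving the job is easy to quantify. The sketch below estimates how long it takes just to copy a multi-terabyte result set to another machine; the output size, link speeds, and 70% efficiency factor are assumptions for illustration, not SDSC figures.

```python
# Rough transfer-time arithmetic for a multi-terabyte HPC output file.
# Output size, link speeds, and the 70% efficiency factor are assumed values.

TB = 1e12  # bytes

def transfer_hours(size_bytes: float, link_gbps: float, efficiency: float = 0.7) -> float:
    """Hours to move size_bytes over a link_gbps link at the given efficiency."""
    usable_bytes_per_s = link_gbps * 1e9 / 8 * efficiency
    return size_bytes / usable_bytes_per_s / 3600.0

output_bytes = 20 * TB   # hypothetical multi-terabyte job output
for gbps in (1, 10):
    print(f"{output_bytes / TB:.0f} TB over {gbps:>2} Gb/s: "
          f"{transfer_hours(output_bytes, gbps):6.1f} h")   # ~63.5 h and ~6.3 h
```

At those rates, a machine that generated 20 TB over a weekend run would need most of another day just to evacuate the results, which is why protecting data in place matters.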
