How To Build a Supercomputer In 24 Hours 161
An anonymous reader writes with a link to this "time-lapse video of students and postdocs at the University of Zurich constructing the zBox4 supercomputer. The machine has a theoretical compute capacity of ~1% of the human brain and will be used for simulating the formation of stars, planets and galaxies." The rack holds "3,072 2.2GHz Intel Xeon cores and over 12TB of RAM." Also notable: for once, several of the YouTube comments are worth reading for more details on the construction and specs.
Re:Pretty sure (Score:5, Informative)
Computers surpassed that level a loooooong time ago
Doubtful.
The computational requirements for simulating the human brain have been severely, even hilariously, underestimated in the past. To quote Wikipedia: One estimate puts the human brain at about 100 billion neurons and 100 trillion synapses.
That's... a lot.
First off, a lot of people think that 1 FLOP = 1 neuron, which is not even close. The active points are the synapses, of which there are about a thousand per neuron! Each may receive an impulse over ten times a second, and each involves dozens of parameters: the recent history of firings, neurotransmitter levels, hormone levels, membrane potentials, etc. A very conservative estimate would be that a single neuron, receiving impulses at around 10 Hz on 1,000 synapses, would require on the order of 1 megaflop to simulate. That's ONE neuron. Now multiply that by 100 billion, and you get a picture of what's required: about 100 petaflops, minimum.

Storage is nothing to sneeze at either. Assuming a mere 50 single-precision floating point values per synapse to store all simulation state, you're looking at almost 18 petabytes of memory! That's over $100M for the memory sticks alone, even with a deep bulk-purchase discount. And unlike most server or HPC workloads, those 18 petabytes would have to be completely read out, processed, and possibly updated at least ten times a second.
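The arithmetic above is easy to check. Here's a minimal sketch that reproduces the estimate, using only the assumptions stated in the comment (100 billion neurons, ~1,000 synapses each, ~1 megaflop per neuron, 50 single-precision values per synapse); none of these constants are measured values.

```python
# Back-of-envelope estimate for full human brain simulation,
# using the assumptions from the comment above.

NEURONS = 100e9              # ~100 billion neurons (assumed)
SYNAPSES_PER_NEURON = 1_000  # ~1,000 synapses per neuron (assumed)
FLOPS_PER_NEURON = 1e6       # ~1 megaflop per neuron, conservative (assumed)
FLOATS_PER_SYNAPSE = 50      # single-precision state values per synapse (assumed)
BYTES_PER_FLOAT = 4          # IEEE 754 single precision

# Compute requirement: per-neuron cost scaled to the whole brain.
total_flops = NEURONS * FLOPS_PER_NEURON
petaflops = total_flops / 1e15

# Memory requirement: state for every synapse.
total_bytes = NEURONS * SYNAPSES_PER_NEURON * FLOATS_PER_SYNAPSE * BYTES_PER_FLOAT
pebibytes = total_bytes / 2**50

print(f"Compute: {petaflops:.0f} PFLOPS")   # → 100 PFLOPS
print(f"Memory:  {pebibytes:.1f} PiB")      # → 17.8 PiB ("almost 18 petabytes")
```

Note that the 20 PB of raw bytes (2×10^16) works out to ~17.8 PiB in binary units, which is where the "almost 18 petabytes" figure comes from.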
Second, consider that the first simulations won't be very optimized. We still don't really know what's relevant and what can be simplified away. Hence, I suspect the first attempts will be far less efficient, perhaps requiring 10x or even 100x as much computing power as later, optimized ones. For example, neurons don't just fire impulses, they also grow and change shape. I don't think there's even a good model for how that works in the complex 3D environment of the brain!
We are getting closer, but expect to wait at least a decade or two before people start talking seriously about a full human brain simulation.