How To Build a Supercomputer In 24 Hours

An anonymous reader writes with a link to this "time-lapse video of students and postdocs at the University of Zurich constructing the zBox4 supercomputer. The machine has a theoretical compute capacity of ~1% of the human brain and will be used for simulating the formation of stars, planets and galaxies." That rack has "3,072 2.2GHz Intel Xeon cores and over 12TB of RAM." Also notable: for once, several of the YouTube comments are worth reading for more details on the construction and specs.
  • Re:Pretty sure (Score:5, Informative)

    by bertok ( 226922 ) on Sunday November 04, 2012 @04:39AM (#41870987)

    Computers have surpassed that level a loooooong time ago

    Doubtful.

    The computational requirements for simulating the human brain have been severely, even hilariously, underestimated in the past. To quote Wikipedia: One estimate puts the human brain at about 100 billion neurons and 100 trillion synapses.

    That's... a lot.

    First off, a lot of people think that 1 FLOP = 1 neuron, which is not even close. The active points are the synapses, of which there are about a thousand per neuron. Each may receive an impulse over ten times a second and involves dozens of parameters, such as the recent history of firings, neurotransmitter levels, hormone levels, membrane potentials, etc. A very conservative estimate is that a single neuron, receiving impulses at around 10 Hz on 1,000 synapses, would require on the order of 1 megaflop to simulate. That's ONE neuron. Now multiply that by 100 billion, and you get a picture of what's required: about 100 petaflops, minimum.

    Storage is nothing to sneeze at either. Assuming a mere 50 single-precision floating-point values per synapse to store all simulation state, you're looking at almost 18 petabytes of memory. That's over $100M for the memory sticks alone, even with a deep bulk-purchase discount. And unlike most server or HPC workloads, those 18 petabytes would have to be completely read out, processed, and possibly updated again at least ten times a second.

    Second, consider that the first simulations won't be very optimized. We still don't really know what's relevant and what can be simplified away, so I suspect the first attempts will be much less efficient, requiring 10x or even 100x as much computing power as later ones. For example, neurons don't just fire impulses; they also grow and change shape, and I don't think there's even a good model for how that works in the complex 3D environment of the brain!

    We are getting closer, but expect to wait at least a decade or two before people start talking seriously about a full human brain simulation.
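
As a rough cross-check of the arithmetic in the comment above, the Python sketch below simply multiplies out the commenter's assumed figures (100 billion neurons, ~1,000 synapses each, ~1 megaflop per simulated neuron, 50 single-precision values per synapse); none of these inputs are measured values.

```python
# Back-of-the-envelope sketch of the brain-simulation estimate above.
# All inputs are the commenter's assumptions, not measured values.

NEURONS = 100e9              # ~100 billion neurons
SYNAPSES_PER_NEURON = 1000   # ~1,000 synapses per neuron
FLOPS_PER_NEURON = 1e6       # ~1 megaflop/s to simulate one neuron at ~10 Hz
FLOATS_PER_SYNAPSE = 50      # assumed simulation state per synapse
BYTES_PER_FLOAT = 4          # single precision

compute = NEURONS * FLOPS_PER_NEURON                                   # flop/s
memory = NEURONS * SYNAPSES_PER_NEURON * FLOATS_PER_SYNAPSE * BYTES_PER_FLOAT

print(f"compute: {compute / 1e15:.0f} petaflop/s")   # ~100 petaflop/s
print(f"memory:  {memory / 2**50:.1f} PiB")          # ~17.8 PiB
```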

  • by zbox4 ( 2766677 ) on Sunday November 04, 2012 @06:32AM (#41871351)
    it took about a year to acquire the funds, run benchmark tests, finalize the design, put out the tender for the parts, etc., but all the construction was done in three 8-hour shifts
  • Re:Pretty sure (Score:3, Informative)

    by zbox4 ( 2766677 ) on Sunday November 04, 2012 @06:39AM (#41871365)
    your assumptions are close to mine when i estimated the ~1% compute capability of the brain (individual neurons send an outgoing signal depending on the amount and rate of incoming signals), but i am an astrophysicist, not a neuroscientist ;) the zBox4 can calculate at over 10 teraflops.
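
For scale, the sketch below works out the theoretical peak implied by the specs quoted in the summary (3,072 cores at 2.2 GHz, 12 TB of RAM), assuming 8 double-precision flops per core per cycle; that per-cycle figure is an assumption, since the exact Xeon model and SIMD width aren't stated above.

```python
# Rough theoretical-peak estimate for the zBox4 from the specs quoted above.
# The 8 flops/cycle/core figure (AVX, double precision) is an assumption;
# the exact Xeon model isn't given.

CORES = 3072
CLOCK_HZ = 2.2e9
FLOPS_PER_CYCLE = 8          # assumed AVX double-precision throughput per core
RAM_BYTES = 12e12            # "over 12TB of RAM" per the summary

peak = CORES * CLOCK_HZ * FLOPS_PER_CYCLE      # ~5.4e13 flop/s
ram_per_core = RAM_BYTES / CORES               # ~3.9 GB per core

print(f"peak:         {peak / 1e12:.0f} teraflop/s")
print(f"RAM per core: {ram_per_core / 1e9:.1f} GB")
# At the ~100 petaflop/s brain estimate in the comment above, this peak is
# ~0.05% of a brain; the quoted ~1% figure implies a less demanding brain
# estimate of a few petaflop/s.
```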
  • Re:Software (Score:5, Informative)

    by zbox4 ( 2766677 ) on Sunday November 04, 2012 @06:49AM (#41871393)
    we use various astrophysics simulation codes, e.g. GASOLINE, PKDGRAV, RAMSES; some are developed by us. they are all MPI codes that solve the coupled gravitational and hydrodynamic equations describing the dark matter and baryons evolving in the expanding universe. the memory and speed of the computer limit the resolution that can be attained, so various "sub-grid" physical processes have to be treated carefully. for cosmological simulations we know the initial conditions - those are the fluctuations we can read off the microwave background, which show the universe was hot, dense and smooth early on. the codes follow the perturbations into the non-linear regime, when dark matter haloes, stars and galaxies form, and we can then compare the properties of the simulated structures with observational data.
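
The codes named above (GASOLINE, PKDGRAV, RAMSES) are parallel tree and AMR codes far beyond anything that fits in a comment, but as a toy illustration of what solving the gravitational part of the problem involves, here is a minimal direct-summation N-body integrator with a leapfrog step in Python. It is a sketch under drastically simplified assumptions, not how those production codes actually work.

```python
# Toy N-body gravity integrator (direct summation + leapfrog kick-drift-kick).
# Illustrative only: production codes like PKDGRAV use tree methods, MPI,
# and cosmological expansion terms that are omitted here.
import numpy as np

G = 1.0           # gravitational constant in code units
SOFTENING = 0.01  # force softening to avoid singular close encounters

def accelerations(pos, mass):
    """Pairwise gravitational accelerations by direct O(N^2) summation."""
    diff = pos[None, :, :] - pos[:, None, :]          # (N, N, 3), j minus i
    dist2 = (diff ** 2).sum(-1) + SOFTENING ** 2
    inv_r3 = dist2 ** -1.5
    np.fill_diagonal(inv_r3, 0.0)                     # no self-force
    return G * (diff * (mass[None, :, None] * inv_r3[:, :, None])).sum(axis=1)

def leapfrog(pos, vel, mass, dt, steps):
    """Kick-drift-kick leapfrog integration."""
    acc = accelerations(pos, mass)
    for _ in range(steps):
        vel += 0.5 * dt * acc          # half kick
        pos += dt * vel                # drift
        acc = accelerations(pos, mass)
        vel += 0.5 * dt * acc          # half kick
    return pos, vel

if __name__ == "__main__":
    rng = np.random.default_rng(42)
    n = 256
    pos = rng.standard_normal((n, 3))
    vel = np.zeros((n, 3))
    mass = np.full(n, 1.0 / n)
    pos, vel = leapfrog(pos, vel, mass, dt=1e-3, steps=100)
    print("centre of mass:", (mass[:, None] * pos).sum(axis=0))
```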
  • by zbox4 ( 2766677 ) on Sunday November 04, 2012 @08:35AM (#41871629)
    surprisingly few - a couple of bad motherboards (or static ;). it's only been up for a week or so and we are still testing/installing stuff before making the user queues live.
