
The Black Hole Image Data Was Spread Across 5 Petabytes Stored On About Half a Ton of Hard Drives (vice.com) 293

An anonymous reader quotes a report from Motherboard: On Wednesday, an international team of scientists published the first-ever image of a black hole. It looked like a SpaghettiO, and yet the image was an incredible scientific achievement that gave humanity a glimpse of one of the universe's most destructive forces and confirmed long-held theories -- namely, that black holes exist. Storing the raw data for the image was a feat in itself -- tiny portions of data spread across five petabytes stored on multiple hard drives, the equivalent of 5,000 years' worth of MP3s. Katie Bouman, a computer scientist and assistant professor at the California Institute of Technology, led the development of the algorithm that imaged the black hole. An image of her posing with some of the data drives went viral as observers praised her success.

The massive amounts of data were essential to creating the image of the black hole. Bouman and other scientists coordinated radio telescopes all over the Earth, each pointed at the black hole and gathering data at different times. The data scientists then pieced this information together and used an algorithm to fill in the blanks and generate a likely image of the black hole. The five petabytes of data took up such a massive amount of digital and physical space it couldn't be sent over the internet. Instead, the hard drives were flown to processing centers in Germany and Boston where the data was assembled. On Reddit's /r/datahoarder subreddit, a community dedicated to spreading the passion of hoarding vast amounts of data, the drives were bigger news than the scientific achievement itself.
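
For a rough sense of why the drives were flown rather than uploaded, here is a back-of-the-envelope sketch in Python. The specific numbers are assumptions for illustration (5 PB taken as 5e15 bytes, round-number link speeds, and a check of what bit rate would make the "5,000 years of MP3s" comparison work), since neither the summary nor the article pins them down:

    # Rough scale check -- all link speeds below are assumed, not from the article.
    DATA_BYTES = 5e15                 # 5 petabytes, taken as 5 * 10^15 bytes
    DATA_BITS = DATA_BYTES * 8

    SECONDS_PER_DAY = 86_400
    SECONDS_PER_YEAR = 365.25 * SECONDS_PER_DAY

    for label, bits_per_second in [("1 Gbps", 1e9), ("10 Gbps", 1e10)]:
        days = DATA_BITS / bits_per_second / SECONDS_PER_DAY
        print(f"{label:>8}: ~{days:,.0f} days of continuous transfer")

    # What MP3 bit rate would make 5 PB equal 5,000 years of audio?
    implied_bitrate = DATA_BITS / (5_000 * SECONDS_PER_YEAR)
    print(f"Implied MP3 bit rate: ~{implied_bitrate / 1e3:.0f} kbps")

At an assumed 1 Gbps the transfer would take well over a year of continuous uptime, so shipping the drives plausibly wins; and the MP3 comparison implies a bit rate of roughly 250 kbps, which is at least in the right ballpark for MP3 audio.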

This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by jfdavis668 ( 1414919 ) on Friday April 12, 2019 @01:07AM (#58425360)
    Let's start measuring storage space by the ton! We can have Kilotons and Megatons...wait, that sounds very familiar...
  • by rastos1 ( 601318 ) on Friday April 12, 2019 @01:37AM (#58425416)
    "Never underestimate the bandwidth of a station wagon full of tapes hurtling down the highway." -- Andrew S. Tanenbaum
    • by tinkerton ( 199273 ) on Friday April 12, 2019 @03:10AM (#58425596)

      How many tapes do you need before the station wagon collapses in on itself into a black hole?

      • Re:34 years ago: (Score:5, Interesting)

        by AmiMoJo ( 196126 ) on Friday April 12, 2019 @06:52AM (#58426050) Homepage Journal

        The Tolman-Oppenheimer-Volkoff limit is around 2.17 solar masses, or 4.3149799e30 kg.

        A typical LTO tape weighs about 200g. So 2.15748995e31, or 21 nonillion 574 octillion 899 septillion 500 sextillion tapes.

        With a typical size of about 102 x 105 x 21.5 mm you would end up with a sphere roughly 2.1e9 m in diameter, about 1.5 times the Sun's. Apparently LTO tapes are not very dense. (The arithmetic is worked through after the comments.)

        • I half considered starting on that :)
          Of course, now that you mention it, it doesn't look right to use a tape weight accurate to a few percent and then quote a result to 9 significant figures!
          Concerning the practicalities of implementation, if you'd actually start piling on the tapes then the maximum radius would likely never exceed the radius of the Sun, since at Earth size most tapes would already be compressed to a few tonnes per cubic meter apart from a very thin shell. Also I don't think th

  • Since TB drives are common now, 5000 TB would have been easier for most people to understand.

  • ”the equivalent of 5,000 years' worth of MP3s”

    How am I supposed to get a sense of scale from that? They didn’t even provide the bit rate...

  • by ghoul ( 157158 ) on Friday April 12, 2019 @03:05AM (#58425584)

    Never underestimate the bandwidth of a C-130 loaded with flash drives flying at 700 mph. The latency is a bitch, though.

  • Not valid unless given in LOCs
  • by Shag ( 3737 ) on Friday April 12, 2019 @08:01AM (#58426254) Journal

    I mentioned this in a late comment on the other post, and the hardware has been mentioned on the Reddit thread - including by the person who built the modules! - but the Mark 6 drive packs used for recording this data at various large, high-bandwidth radio observatories can handle sustained recording at 16 Gbps. (By way of comparison, an all-SSD RAID might get you about one-quarter that speed.)

    A guy who runs a radio telescope explained it to me as each pack being more or less a JBOD, but with controllers smart enough to write each packet of data to whichever drive was ready to handle it, while keeping a journal on another drive of where things had been written, so that the data could be reassembled later. The word "shotgun" figured into the explanation. (A toy sketch of that scheme appears after the comments.)

  • Wow, that's a lot of data. Did it take the whole internet's bandwidth for a day to send it to all the sites?

    No. We used sneakernet.
  • "The five petabytes of data took up such a massive amount of digital and physical space it couldn't be sent over the internet."

    Subcontract the job to the adult video industry.
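
Picking up AmiMoJo's tape arithmetic from the thread above: a quick recomputation in Python, using the same assumed inputs (a 200 g LTO cartridge measuring 102 x 105 x 21.5 mm and a Tolman-Oppenheimer-Volkoff limit of 2.17 solar masses), gives a sphere a little larger than the Sun:

    import math

    SOLAR_MASS_KG = 1.989e30
    TOV_LIMIT_KG = 2.17 * SOLAR_MASS_KG          # ~4.32e30 kg, as in the comment

    TAPE_MASS_KG = 0.200                         # assumed per-cartridge weight
    TAPE_VOLUME_M3 = 0.102 * 0.105 * 0.0215      # 102 x 105 x 21.5 mm

    tapes = TOV_LIMIT_KG / TAPE_MASS_KG                      # ~2.16e31 cartridges
    total_volume_m3 = tapes * TAPE_VOLUME_M3                 # ~5.0e27 m^3
    diameter_m = (6 * total_volume_m3 / math.pi) ** (1 / 3)  # equivalent sphere

    print(f"cartridges needed : {tapes:.2e}")
    print(f"sphere diameter   : {diameter_m:.2e} m (Sun: ~1.39e9 m)")
    print(f"mean density      : {TOV_LIMIT_KG / total_volume_m3:,.0f} kg/m^3")

The pile comes out around 2.1e9 m across, at a mean density below that of water, which is why the comment's punchline holds: by stellar standards, LTO tape really isn't very dense.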
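
Shag's description of the Mark 6 packs further up -- write each packet to whichever drive is ready, keep a journal of where everything landed, reassemble later -- can also be sketched in a few lines. This is a toy illustration only, not the actual Mark 6 software; the class and method names are invented for the example:

    import random

    class ScatterRecorder:
        """Toy model of a 'shotgun plus journal' recorder: each packet goes to
        whichever drive happens to be free, and a journal records where it went,
        so the original packet order can be reconstructed afterwards."""

        def __init__(self, num_drives: int):
            self.drives = [[] for _ in range(num_drives)]   # stand-ins for disks
            self.journal = []                               # (seq, drive, slot)

        def write_packet(self, seq: int, payload: bytes) -> None:
            drive = random.randrange(len(self.drives))      # "whichever is ready"
            self.drives[drive].append(payload)
            self.journal.append((seq, drive, len(self.drives[drive]) - 1))

        def reassemble(self) -> bytes:
            ordered = sorted(self.journal)                  # sort by sequence number
            return b"".join(self.drives[d][slot] for _, d, slot in ordered)

    rec = ScatterRecorder(num_drives=8)
    for seq in range(1_000):
        rec.write_packet(seq, f"packet-{seq};".encode())

    assert rec.reassemble().startswith(b"packet-0;packet-1;packet-2;")

The real recorders presumably pick the "ready" drive based on which disk has finished its last write rather than at random, but the journal-then-reassemble idea is the part the comment is describing.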

