
'Digital Universe' To Add 1.8 Zettabyte In 2011

1sockchuck writes "More than 1.8 zettabytes of information will be created and stored in 2011, according to the fifth IDC Digital Universe study. A key challenge is managing this data deluge (typified by the Large Hadron Collider at CERN, which generates 1 petabyte of data per second)."


  • by rbrausse ( 1319883 ) on Tuesday June 28, 2011 @07:54AM (#36595738)

    The experiments may generate that petabyte per second, but most of the data is rejected before it ever hits any storage system...

  • Wow that's a lot of data. Can't wait to see more of the results published.

  • That's 1.8 x 10^-6 hellabytes for those of you keeping track.
    • That unit is as good as any. We're rapidly running out of prefixes here, and we still need formal definitions for units such as bucketload, shitload, shitton[ne] and so forth.
        The key to getting a prefix established... is to just start using it. The rest of the world, and ultimately the standards bodies, will adopt it eventually.

        http://www.facebook.com/pages/The-Official-Petition-to-Establish-Hella-as-the-SI-Prefix-for-1027/277479937276?ref=search&sid=1050298974.3729275378..1
        • Fair enough, but I doubt if you're going to win any converts by insisting on posting your comments with horrible monospaced fonts just to grab attention... :-}
  • by Bob the Super Hamste ( 1152367 ) on Tuesday June 28, 2011 @08:09AM (#36595944) Homepage
    Can we get that in a proper measurement, like Libraries of Congress?
    • by Phurge ( 1112105 )

      Using 1 LoC = 20 TB, then 1.8 ZB = 96,636,764 LoCs

      Or, as Wolfram Alpha puts it, 1.8 ZB = 144,000 x the estimated information content of all human knowledge.
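
      A quick way to reproduce the parent's figure (a Python sketch; reading "TB" and "ZB" as binary units, i.e. 20 TiB per LoC and 1.8 ZiB total, is my assumption -- with decimal units the answer is a rounder 90,000,000):

          # Libraries-of-Congress conversion, assuming binary prefixes throughout
          BYTES_PER_TB = 2 ** 40
          BYTES_PER_ZB = 2 ** 70
          BYTES_PER_LOC = 20 * BYTES_PER_TB          # 1 LoC ~= 20 TB

          total_bytes = 1.8 * BYTES_PER_ZB           # the 2011 "digital universe"
          print(round(total_bytes / BYTES_PER_LOC))  # -> 96636764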

  • by Bob the Super Hamste ( 1152367 ) on Tuesday June 28, 2011 @08:15AM (#36596038) Homepage
    I wonder how much of that data is redundant. I know that for one of my side projects I have "redundant" data that I got from the Minnesota DNR, various MN counties, the state legislature, and the federal government. Even after it has been preprocessed and trimmed down so it only has what I care about, it is still around 12GB of vector data, which is about 1/3 the original size.
  • by PJ6 ( 1151747 )
    that's just Netflix.
  • To store this amount of data, you need 57.5 billion 32GB iPads, which would cost around $34.4 trillion - equivalent to the Gross Domestic Product (GDP) of the United States, Japan, China, Germany, France, the United Kingdom and Italy combined. :D
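
    The quoted figures hang together if you assume roughly $599 per 32GB iPad (the unit price is my assumption, not stated in the summary); a quick sketch:

        # Rough check of the iPad comparison (assumed price: $599 per 32GB iPad)
        ipads = 57.5e9                       # 57.5 billion iPads
        print(ipads * 32e9 / 1e21)           # ~1.84 ZB of storage
        print(ipads * 599 / 1e12)            # ~$34.4 trillion
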
  • More space to be filled by Russian mining bots. Oh wait which universe is this?
  • by L4t3r4lu5 ( 1216702 ) on Tuesday June 28, 2011 @08:34AM (#36596312)
    So it generates 1PB of data per second, yet from the article "[T]he data comes from the four machines on the LHC in which the collisions are monitored - Alice, Atlas, CMS and LHCb - which send back 320MB, 100MB, 220MB and 500MB"

    That's a few orders of magnitude short of 1 Petabyte, folks. Where are these numbers coming from?
    • by Eivind ( 15695 )

      One is the raw amount: simply the number of sensors, multiplied by the frequency at which each one samples, multiplied by the size of each sample.

      But the article itself says they filter and store only the interesting stuff, which is, as we can see from the later numbers, a triflingly small fraction of the entirety.

      A camera that can do full HD at 30fps captures 186MB/s after all, but it does not follow that a facility with 3 such security cameras needs to store however many petabytes that becomes in order to have a record.
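
      The camera figure is just uncompressed frame size times frame rate, and the same sensors-times-rate-times-sample-size arithmetic is where the LHC's raw petabyte comes from. A sketch (3 bytes per pixel is my assumption):

          # Raw data rate = sources * samples per second * bytes per sample.
          # Uncompressed 1080p at 30 fps, 3 bytes per pixel (assumed):
          width, height, bytes_per_pixel, fps = 1920, 1080, 3, 30
          rate = width * height * bytes_per_pixel * fps
          print(rate / 1e6)                  # ~186.6 MB/s before any filtering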

    • by AJH16 ( 940784 )

      As Eivind pointed out, the PB per second is raw data. The LHC uses two layers of in-hardware filtering and another layer of software filtering (as I recall from a while back, at least) in order to trim the data down to a quasi-reasonable stream that can be effectively stored and analysed.

    • If you drive 60 miles/hour for 30 seconds, you haven't driven 60 miles. One is a measure of speed, the other is distance.

      Same with this. 1 PB/s is speed. 1,140 MB is the amount of data. All it really means is that these 1,140 MB are generated (and possibly collected) in 1.06 microseconds.
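
      Spelled out (a sketch, taking 1 PB as 2**50 bytes and 1 MB as 2**20 bytes):

          # How long does a 1 PB/s stream take to produce the 1,140 MB that is kept?
          PB = 2 ** 50
          MB = 2 ** 20
          kept = (320 + 100 + 220 + 500) * MB      # Alice + Atlas + CMS + LHCb
          print(kept / PB * 1e6)                   # ~1.06 microseconds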

  • (typified by the Large Hadron Collider at CERN, which generates 1 petabyte of data per second)

    So, the LHC produces 1 petabyte per second, and given that there are 30+ million seconds in the year, that means the LHC produces 30+ zettabytes a year. Clearly there is a problem with your typification.
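
    The parent's arithmetic, spelled out (a sketch in decimal units; continuous year-round operation is the parent's simplifying assumption):

        # 1 PB/s of raw sensor data, sustained for a year:
        seconds_per_year = 365 * 24 * 3600           # ~31.5 million seconds
        raw_bytes = 1e15 * seconds_per_year
        print(raw_bytes / 1e21)                      # ~31.5 zettabytes, vs. 1.8 ZB total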

  • by vlm ( 69642 )

    I call bogus on this.

    10^21 / 10^10 = 10^11 bytes per living human being.

    The global GDP and global hard drive manufacturers simply cannot support a 100 GB hard drive per person per year... Cheapest option per byte is probably 1 TB drive for every 10 people. My basement therefore balances against a small African village, but there's plenty of small African villages, and only one me.

    Even if all the ACTIVE /.-er types have a basement like mine, and they do not, there are simply not enough of us. And on a global G
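
    The per-capita figure, spelled out (a sketch; rounding world population to 10^10 as the parent does):

        # 1.8 ZB spread over roughly 10 billion people:
        per_person = 1.8e21 / 1e10
        print(per_person / 1e9)      # ~180 GB per person per year, same order as the 100 GB above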

  • And most of it will be junk data.

    Data != Information. And then there's Metcalfe's law applied to information value.

"Being against torture ought to be sort of a multipartisan thing." -- Karl Lehenbauer, as amended by Jeff Daiell, a Libertarian

Working...