'Digital Universe' To Add 1.8 Zettabyte In 2011
1sockchuck writes "More than 1.8 zettabytes of information will be created and stored in 2011, according to the fifth IDC Digital Universe study. A key challenge is managing this data deluge (typified by the Large Hadron Collider at CERN, which generates 1 petabyte of data per second)."
LHC data is _not_ stored in the digital universe (Score:4, Insightful)
the experiments may generate the PB per second but most of the data is rejected before it hits any storage system...
Re:LHC data is _not_ stored in the digital univers (Score:4, Insightful)
Re: (Score:2)
Indeed, only about 25PB are stored every year from the LHC.
No. They store all of it, but mostly in /dev/null
Re: (Score:2)
Shouldn't it be "Pebibytes"? We're supposed to be geeks.
Re: (Score:2)
Shouldn't it be "Pebibytes"? We're supposed to be geeks.
Yes, but we're not morons.
Re: (Score:2)
And as geeks we understand that the English language isn't governed by a committee of Swiss engineers.
Unfortunately, they're English engineers...
Re: (Score:2)
I suppose we should be grateful that data isn't measured in petahogsheads.
Re: (Score:2)
one and the data generated are not inherently binary, so there is no need
for either the precision of "exactly one PiB" or the context of "this is binary".
In fact, the decimal prefix is the much more sensible one to use here.
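For anyone keeping score in the prefix argument: the decimal/binary gap is real but modest, and it widens one step at a time. A quick sketch:

```python
# SI (decimal) vs IEC (binary) prefixes: the gap grows with scale.
PB, PiB = 10**15, 2**50    # petabyte vs pebibyte
ZB, ZiB = 10**21, 2**70    # zettabyte vs zebibyte
print(PiB / PB)  # ~1.126: a pebibyte is ~12.6% bigger than a petabyte
print(ZiB / ZB)  # ~1.181: a zebibyte is ~18.1% bigger than a zettabyte
```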
1.8 Petabyte per second... (Score:1)
Wow that's a lot of data. Can't wait to see more of the results published.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
it's a decimal point, not a comma. why do so many people make the same stupid mistake?
Re: (Score:2)
Re: (Score:2)
decimal notation changes with language?!? that's a new one. even if it does, the rest of your comment WAS in english.
Re: (Score:2)
decimal notation changes with language?!?
Yes it does.
that's a new one.
Only for you
even if it does, the rest of your comment WAS in english.
The decimal separator on my keypad is comma. If you weren't so swift in calling me stupid, I would have explained it earlier.
Re: (Score:2)
ok, my bad.
Hellagood (Score:1)
Re: (Score:1)
Re: (Score:1)
http://www.facebook.com/pages/The-Official-Petition-to-Establish-Hella-as-the-SI-Prefix-for-1027/277479937276?ref=search&sid=1050298974.3729275378..1
Re: (Score:1)
Re: (Score:2)
and why in hell does a /. tab not close immediately but does some shit for a few milliseconds before closing?
Re: (Score:2)
Why two? For redundancy?
Re: (Score:2)
Can I have that in LoCs (Score:5, Funny)
Re: (Score:2)
Using 1 LoC = 20 TB, then 1.8 ZB = 96,636,764 LoCs
Or as wolfram alpha says 1.8 ZB = 144,000 x estimated information content of all human knowledge.
Re: (Score:3)
oops 1.8 ZB = 144 x estimated information content of all human knowledge.
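For what it's worth, the 96,636,764 figure falls out if both the zettabytes and the terabytes are read as binary (ZiB, TiB); pure decimal gives a rounder 90 million. A quick check, taking the parent's 20 TB per Library of Congress as given:

```python
ZB, TB = 10**21, 10**12      # decimal prefixes
ZiB, TiB = 2**70, 2**40      # binary prefixes
TB_PER_LOC = 20              # assumed: 1 Library of Congress ~ 20 TB
print(round(1.8 * ZB / (TB_PER_LOC * TB)))    # 90000000 (decimal)
print(round(1.8 * ZiB / (TB_PER_LOC * TiB)))  # 96636764 (binary)
```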
Re: (Score:2)
Or as Wolfram and Hart says, 0 = 144,000 x 0 (estimated worth of human knowledge).
How much is redundant (Score:3)
bah (Score:2)
57.5 billion 32GB iPads required to store this :D (Score:1)
Re: (Score:2)
You are right, of course.
That'll be 1,500,000,000,000 double density 8 inch floppies worth of data. Much more efficient, and at the last price I paid for them (about 12 years ago now) they were $12.00 each (not per box, each), so that is $18,000,000,000,000, or roughly 52.325581395348837209302325581395% of the cost of the iPads.
-nB
Re: (Score:3)
Don't worry, the large size won't be an issue. You can put it in a ZIP and then put that into another ZIP and so on.
That's just stupid. They use the same compression algorithm!
Put it in a ZIP in a TAR in a RAR in a 7z in an ACE in a bZip in a CAB in a dmg in an ARJ, and finally save it as a GIF. You can't use JPEG as it's lossy.
Re: (Score:2)
No, man. It's just ZIP's all the way down.
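The joke holds up technically: compressed output is close to random bytes, and random bytes don't compress again. A minimal sketch, using high-entropy hash output as a stand-in for "already zipped" data:

```python
import hashlib
import zlib

# Stand-in for already-compressed bytes: DEFLATE output is close to
# random, so model it with high-entropy hash output.
already_compressed = b"".join(
    hashlib.sha256(i.to_bytes(4, "big")).digest() for i in range(1024)
)
again = zlib.compress(already_compressed)
print(len(already_compressed), len(again))
# The second "zip" is no smaller; block and header overhead even
# make it slightly bigger.
```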
Great (Score:2)
Large Hadron Collider data anomaly? (Score:4, Interesting)
That's a few orders of magnitude short of 1 Petabyte, folks. Where are these numbers coming from?
Re: (Score:2)
Those numbers add up to the 25PB/year that they store. The other 99.9999% (really) is filtered out at the detectors before being "sent back".
Ah, thanks for that. I was under the impression that the computer centre would be discarding the unused data, but if the detectors are smart enough to do it then all the better!
Re: (Score:2)
One is the raw amount: simply the number of sensors, multiplied by the frequency at which each samples, multiplied by the size of each sample.
But the article itself says they filter and store only the interesting stuff, which is, as we can see from the later numbers, a triflingly small fraction of the entirety.
A camera that can do full-HD at 30fps captures 186MB/s after all, but it does not follow that a facility with 3 such security-cameras needs to store however many petabytes that becomes, in order to have a rec
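The camera figure above checks out for uncompressed video; the multiplication, assuming 24-bit RGB with no subsampling:

```python
# Uncompressed 1080p/30fps bandwidth, per the security-camera example.
width, height = 1920, 1080
bytes_per_pixel = 3          # 24-bit RGB, no subsampling (assumption)
fps = 30
rate = width * height * bytes_per_pixel * fps  # bytes per second
print(rate / 10**6)  # 186.624 MB/s
```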
Re: (Score:2)
As eivind pointed out, the PB a second is raw data. The LHC uses two layers of in-hardware filtering and another layer of software filtering (as I recall from a while back, at least) to trim the data down to a quasi-reasonable stream that can be effectively stored and analysed.
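A back-of-envelope check on the "99.9999% filtered" claim upthread, under the (unrealistic) assumption that the detectors produce 1 PB/s all year round:

```python
# Fraction of raw LHC data that survives the trigger filters,
# assuming 1 PB/s of raw data nonstop (the beam isn't actually on
# all year, so treat this as an upper bound on raw volume).
seconds_per_year = 365 * 24 * 3600   # ~3.15e7
raw_pb_per_year = 1.0 * seconds_per_year
stored_pb_per_year = 25              # the ~25 PB/year figure upthread
rejected_pct = (1 - stored_pb_per_year / raw_pb_per_year) * 100
print(f"{rejected_pct:.5f}%")        # ~99.99992% rejected
```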
Re: (Score:3)
If you drive 60 miles/hour for 30 seconds, you haven't driven 60 miles. One is a measure of speed, the other is distance.
Same with this. 1 PB/s is speed. 1,140 MB is the amount of data. All it really means is that these 1,140 MB are generated (and possibly collected) in 1.06 microseconds.
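The 1.06 microseconds works out if both figures are taken as binary (MiB and PiB), an assumption consistent with the rest of the thread:

```python
# Time to generate 1,140 MiB at 1 PiB/s.
amount_bytes = 1140 * 2**20    # 1,140 MiB
rate_bytes = 2**50             # 1 PiB per second
microseconds = amount_bytes / rate_bytes * 10**6
print(round(microseconds, 2))  # 1.06
```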
Wow, great compression at the LHC! (Score:2)
(typified by the Large Hadron Collider at CERN, which generates 1 petabyte of data per second)
So, the LHC produces 1 petabyte per second, and given that there are 30+ million seconds in the year, that means the LHC produces 30+ zettabytes a year. Clearly there is a problem with your typification.
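The arithmetic behind that objection, spelled out (using a full calendar year at the quoted raw rate):

```python
# Hypothetical raw LHC output at 1 PB/s, nonstop for a year.
PB, ZB = 10**15, 10**21
seconds_per_year = 365 * 24 * 3600
raw_zb_per_year = 1 * PB * seconds_per_year / ZB
print(round(raw_zb_per_year, 1))  # ~31.5 ZB/year vs 1.8 ZB for everything
```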
Bogus (Score:2)
I call bogus on this.
10^21 / 10^10 = 10^11 bytes/living human being.
The global GDP and global hard drive manufacturers simply cannot support a 100 GB hard drive per person per year... Cheapest option per byte is probably 1 TB drive for every 10 people. My basement therefore balances against a small African village, but there's plenty of small African villages, and only one me.
Even if all the ACTIVE /.-er types have a basement like mine, and they do not, there are simply not enough of us. And on a global G
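The 10^11 figure is the right order of magnitude; with an actual 2011-ish population of about 7 billion (an assumption here), the per-head share comes out a bit higher:

```python
# Per-capita share of 1.8 ZB, assuming ~7 billion people (2011).
total_bytes = 1.8 * 10**21
population = 7 * 10**9         # assumed world population
gb_each = total_bytes / population / 10**9
print(round(gb_each))          # ~257 GB per person
```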
Re: (Score:2)
Do the sums. 1 ZB = 10^9 TB. A TB hard drive costs c. US$50, probably less in quantity, so information storage is a US$50 billion/year industry.
Doesn't seem implausible, to be honest.
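The parent's sizing, spelled out (the $50/TB figure is the parent's circa-2011 assumption):

```python
# Cost to put a zettabyte on consumer hard drives.
TB_PER_ZB = 10**21 // 10**12   # a billion TB per ZB
COST_PER_TB = 50               # assumed US$ per TB drive, c. 2011
print(TB_PER_ZB * COST_PER_TB)              # $50 billion for 1 ZB
print(round(1.8 * TB_PER_ZB * COST_PER_TB)) # ~$90 billion for 1.8 ZB
```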
Re: (Score:1)
Most of CERN's data isn't on hard drives anyway - it's got the biggest tape system in Europe that I know of (I do high end storage for a living).
A photo, for funsies: http://www.flickr.com/photos/naezmi/3309812634/ [flickr.com]
C
LHC data will not be information for years to come (Score:2)
And most of it will be junk data.
Data != Information. And then there's Metcalfe's law applied to information value.