Digital Big Bang — 161 Exabytes In 2006
An anonymous reader tips us to an AP story on a recent study of how much data we are producing. IDC estimates that in 2006 we created, captured, and replicated 161 exabytes of digital information. The last time anyone tried to estimate global information volume, in 2003, researchers at UC Berkeley came up with 5 exabytes. (The current study tries to account for duplicating data — on the same assumptions as the 2003 study it would have come out at 40 exabytes.) By 2010, according to IDC, we will be producing far more data than we will have room to store, closing in on a zettabyte.
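For a sense of scale, the summary's figures work out like this (a back-of-the-envelope sketch, using decimal byte units; the variable names are mine, not IDC's):

```python
# Rough scale check of the figures in the summary (not IDC's methodology).
EB = 10**18  # exabyte, decimal definition
ZB = 10**21  # zettabyte

created_2006 = 161 * EB
berkeley_2003 = 5 * EB       # UC Berkeley's 2003 estimate
comparable_2006 = 40 * EB    # 2006 figure under the 2003 study's assumptions

# Growth on comparable assumptions, 2003 -> 2006: 8x in three years
print(comparable_2006 / berkeley_2003)  # 8.0

# 161 EB as a fraction of the zettabyte IDC says we are closing in on
print(created_2006 / ZB)                # 0.161
```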
What if ISPs are forced to retain data? (Score:5, Interesting)
And there used to be so little on-line data (Score:5, Interesting)
What's really striking is how little data was available in machine-readable form well into the computer era. In the 1970s, the Stanford AI lab got a feed from the Associated Press wire, simply to get a source of machine-readable text for test purposes. There wasn't much out there.
In 1971, I visited Western Union's installation in Mahwah, NJ, which was mostly UNIVAC gear. (I worked at a UNIVAC site a few miles away, so I was over there to see how they did some things.) I was shown the primary Western Union international gateway, driven by a pair of real-time UNIVAC 494 computers. All Western Union message traffic between the US and Europe went through there. And the traffic volume was so small that the logging tape was just writing a block every few seconds. Of course, each message cost a few dollars to send; these were "international telegrams".
Sitting at a CRT terminal was a woman whose job it was to deal with mail bounces. About once a minute, a message would appear on her screen, and she'd correct the address if possible, using some directories she had handy, or return the message to the sender. Think about it. One person was manually handling all the e-mail bounces for all commercial US-Europe traffic. One person.
Internet a product of biology? (Score:2, Interesting)
In River Out of Eden [wikipedia.org] Richard Dawkins traces the data explosion of the information age right back to the Big Bang.
How much is actually used? (Score:5, Interesting)
My question is how much of this data is actually being used? I'm horrible for constantly downloading e-books, movies, software, OSes, and other stuff that I'm *intending* to do something with, but often don't get around to. I end up with gigabytes of "stuff" just sucking up disc space or wasting CDs. I burned a DivX copy of Matt Stone and Trey Parker's popular pre-South Park indie film "Orgazmo" in about 2001. I've since seen the film 2 or 3 times on TV. I STILL haven't watched the DivX version I have, and now I can't find the CD I put it on. I know I'm not the only one who does this either, as many of my friends are using up loads of storage space on files they've just been too busy to have a look at.
Right now I'm on a project digitizing patient files for a neurologist. We're going up to 10 years deep with files for over 18,000 patients. Most of this is *just* for legal purposes and nobody is EVER going to open and read the majority of these files. The doctor does electronic clinics where he consults the patient and adds new pages to their file, which will probably sit there undisturbed until the Ethernet Disk fails someday.
I think a more interesting story (although probably MUCH more difficult to research) would be "How much computerized data is never used beyond its original creation on a given storage medium?"
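One crude way to poke at that question on your own disk is to compare last-access and last-modified timestamps. A sketch (the function name is mine; note that access times are unreliable on many systems, since `relatime`/`noatime` mounts suppress atime updates, so treat this as a rough estimate, not a measurement):

```python
# Flag files that appear "write-once, read-never": last-access time
# no later than last-modified time.
import os

def never_reread(root):
    """Yield paths under root that appear unread since they were last written."""
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                st = os.stat(path)
            except OSError:
                continue  # file vanished or unreadable; skip it
            if st.st_atime <= st.st_mtime:
                yield path
```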
Of course we will (Score:3, Interesting)
A good analogy is the human brain. We take in huge amounts of information every second via touch, sight, and so on, but throw out the vast majority of it. The key is to have good filtering systems so that things that are interesting and relevant are held onto.
Google Says: (Score:3, Interesting)
[1] Total est. of people on the Internet:
http://www.internetworldstats.com/stats.htm [internetworldstats.com]
Yes...but is it useful (Score:4, Interesting)
of data, but what's the point? Is that data any more useful to people than the selective data that was used to run the world 50, 60, or 100 years ago?

We as individuals are only capable of assimilating a limited amount of information, so most of those exabytes are just rolling around like so many gears in an old machine. If they are minimally used or never used they simply become a storage liability.

As an example, the internet has not made *better* doctors. Even with all the latest information at their fingertips, professionals are still only the sum of what they can mentally absorb. Too much data, or wrong data (e.g. Wikipedia), can lead to the same levels of inefficiency seen prior to the 'information age'. What would a single doctor do with 160 exabytes of reading material, schedule it into the work day?

Also, if the amount of information is rated purely in bytes and not on *useful content*, the stats get skewed. Things like movies and music should be ranked by the length of the script and/or notation. That would make the numbers much less than 160 exabytes.

Saying that the whole world produced 160 exabytes of information is like saying the whole world used 50 billion tonnes of water. Did somebody actually drink it to sustain life? Mechanistic stats are stupid.