Stored Data to Exceed 1.8 Zettabytes by 2011

Posted by CmdrTaco
from the less-than-eighty-percent-porn dept.
jcatcw writes "By 2011, there will be 1.8 zettabytes of electronic data stored in 20 quadrillion files, packets or other containers because of, among other things, the massive growth of social networks and digital equipment such as cameras, cell phones and televisions, according to a new study by IDC. Data is growing by a factor of 10 every five years. According to John Gantz, IDC's lead analyst, "at some point in the life of every file, or bit or packet, 85% of that information somewhere goes through a corporate computer, website, network or asset," meaning any given corporation becomes responsible for protecting large amounts of data that it and its customers may not have created. The study, which coincided with the launch of a "digital footprint" calculator, also found that as the world changes over to digital televisions, analog sets and obsolete set-top boxes and DVDs "will be heaped on the waste piles, which will double by 2011.""
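The two headline numbers in the summary are mutually consistent: growing 10x every five years is an annual factor of 10^(1/5), and working backwards from 1.8 ZB in 2011 implies a 2006 baseline of 0.18 ZB. A quick arithmetic sketch (the 2006 figure here is derived from the summary's own claims, not taken from the study):

```python
# Sanity check on the summary: "1.8 zettabytes by 2011" plus
# "growing by a factor of 10 every five years".
ZB = 10**21  # decimal (SI) zettabyte, in bytes

total_2011 = 1.8 * ZB
growth_per_5yr = 10
annual_factor = growth_per_5yr ** (1 / 5)  # ~1.585x per year

# Working the 10x factor backwards gives the implied 2006 total.
implied_2006 = total_2011 / growth_per_5yr

print(f"annual growth factor: {annual_factor:.3f}")
print(f"implied 2006 total:   {implied_2006 / ZB:.2f} ZB")
```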


Comments Filter:
  • by EricR86 (1144023) on Wednesday March 12, 2008 @07:59AM (#22727078)

    Since we're talking very large orders of magnitude, it would help to know which definition of zettabyte they're using.

    2^70 bytes or 10^21 bytes?

    The former is about 18% larger.
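For reference, the gap between the binary (IEC) and decimal (SI) definitions is about 18% at the zettabyte scale, not orders of magnitude:

```python
# Decimal vs. binary "zettabyte" (SI vs. IEC prefixes).
ZB = 10**21   # zettabyte (SI)
ZiB = 2**70   # zebibyte (IEC)

ratio = ZiB / ZB
print(f"1 ZiB = {ratio:.4f} ZB")  # roughly 1.18, i.e. an 18% gap
```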

  • Wrong metric? (Score:4, Interesting)

    by guruevi (827432) <evi AT smokingcube DOT be> on Wednesday March 12, 2008 @08:06AM (#22727162) Homepage
    I was wondering if they weren't a bit off in their calculations. A zettabyte is a million petabytes. Where I work has about 2 petabytes in a few SANs, and there are thousands of larger institutions and millions of smaller ones (storing in the terabyte range) around the world. The place I worked before had about half a petabyte just in tape backups for credit card and other transactions, catalog and pricing information, images, etc., and that was just an average clothing company, hardly rivaling JCPenney or Macy's. I'm also thinking of Wal-Mart, with millions of products and thousands of stores. And we're mainly talking about SANs in the US here, not including desktops, laptops, cameras, personal information, Google.

    On another note, how much does a zettabyte actually yield these days? Drive manufacturers might just give you 700 Petabytes for it. Oblig. XKCD: http://xkcd.org/394/ [xkcd.org]
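The XKCD joke is about SI-vs-binary marketing: a drive sold as "1 ZB" (10^21 bytes) would report as noticeably fewer zebibytes, the same way a "1 TB" drive shows up as about 931 GiB. A sketch of the discrepancy:

```python
# A drive advertised in SI units reports smaller in binary units.
TB, GiB = 10**12, 2**30
ZB, ZiB = 10**21, 2**70

print(f"1 TB drive reports as {TB / GiB:.0f} GiB")   # about 931 GiB
print(f"1 ZB store reports as {ZB / ZiB:.3f} ZiB")   # about 0.847 ZiB
```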
  • by mikael (484) on Wednesday March 12, 2008 @08:42AM (#22727450)
    Some of the data transfers really seem wasteful. I download a Linux DVD ISO, burn it onto a DVD, install the system on a new hard disk drive, then download another couple of gigabytes of updates. Wouldn't it be simpler to just have an installation DVD that creates a minimal system, which then downloads the latest version of each module?

    And that DVD is really only used once and then forgotten about.
  • by beckerist (985855) on Wednesday March 12, 2008 @08:45AM (#22727478) Homepage
    From: http://en.wikipedia.org/wiki/Google_platform [wikipedia.org]

    # Upwards of 450,000 servers ranging from a 533 MHz Intel Celeron to a dual 1.4 GHz Intel Pentium III (as of 2005)
    # One or more 80GB hard disks per server (2003)
    So at least using these numbers, let's say they have on average 120 GB per server (one and a half 80 GB drives). That would mean they have 54,000 TB, or 54 PB. I'm sure they have even more now, but as a point of reference! Yes, Google has a finite amount of space!
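The parent's back-of-the-envelope arithmetic checks out (using the parent's own assumed figures, not current ones):

```python
# Reproducing the parent's estimate of Google's storage circa 2005.
servers = 450_000
gb_per_server = 120  # assumed: 1.5 x 80 GB drives per server

total_gb = servers * gb_per_server
print(f"{total_gb // 1_000:,} TB = {total_gb // 1_000_000} PB")  # 54,000 TB = 54 PB
```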
  • by Ed Avis (5917) <ed@membled.com> on Wednesday March 12, 2008 @08:59AM (#22727612) Homepage
    I remember when you could do a network install from two floppies...
  • by epine (68316) on Wednesday March 12, 2008 @09:21AM (#22727784)

    secondly, who really cares? Most of it is cached google pages and pron anyway...
    That's why /.ers care.
    But actually, no. We're already very close to being able to generate pron on demand without involving any principal photography. You won't even need to say what you want; that will be ascertained on the fly by neuro-cranial biofeedback.

    After enough of the male population has been brain mapped, it will probably turn out like spam: there are only so many unique permutations, as long as the scene is dressed up a little differently from time to time to maintain the novelty factor.

    Pron seems to be a lot like Big Bertha, where each mortar round was larger than the last, to accommodate progressive barrel enlargement. Eventually the images become extremely shocking to get any response at all.

    http://www.wired.com/science/discoveries/news/2008/03/mri_vision [wired.com]

    The future of compression is not to send the picture itself, but the reduced specification for an image that produces the same effect on the human visual system. We're already doing this with psycho-acoustic encoding.

    Once we have a sufficiently sophisticated model of human sensory perception and mental and emotional responses (which will run to TBs, I'm sure), we can run a competition for the best feature movie encoded in under 4KB. Mostly it would describe desired emotional responses and cognitive states; the actual images would be back-generated to achieve this effect, as determined by the human perceptual model.
