Digital Big Bang — 161 Exabytes In 2006

An anonymous reader tips us to an AP story on a recent study of how much data we are producing. IDC estimates that in 2006 we created, captured, and replicated 161 exabytes of digital information. The last time anyone tried to estimate global information volume, in 2003, researchers at UC Berkeley came up with 5 exabytes. (The current study tries to account for duplicating data — on the same assumptions as the 2003 study it would have come out at 40 exabytes.) By 2010, according to IDC, we will be producing far more data than we will have room to store, closing in on a zettabyte.
  • XXX (Score:5, Funny)

    by daddyrief ( 910385 ) on Monday March 05, 2007 @08:14PM (#18245188) Homepage
    And half of that is porn...
  • by noewun ( 591275 ) on Monday March 05, 2007 @08:15PM (#18245198) Journal
    Without Slashdot dupes.
    • by cmacb ( 547347 ) on Monday March 05, 2007 @08:40PM (#18245414) Homepage Journal
      HA!

      But seriously, I wonder what percentage of this data is text. I'd guess it is a very, very small amount. When I had a film camera, in twenty years I bet I took fewer than 100 rolls of film. With digital cameras I've taken thousands of pictures, sometimes taking a dozen or more of the same subject, just because the cost to me is practically zero. Now there are vendors that will let me upload large numbers of these amateurish photos for free, and let's pretend that there are enough people interested in seeing my pictures that these companies can pay for this storage with advertising. That's scary.

      Excluding attachments I think it would be practically impossible for anyone to use up Google's 2 gigs of storage, but I've heard of people using it up in little more than a week by mailing large attachments back and forth (oh yeah, I HAVE to have every single iteration of that Word document, sure I do!)

      But what's scarier is that for some nominal fee (like $20 a year) they place no limit at all on my ability to hog a disk drive somewhere. I know people who are messed up in the head enough to want to test these claims. Give them 5 gig for photos and they've filled it up in a week, give them "unlimited" and they upload pure junk to see if they can break the thing.

      Like any house of cards, this thing is gonna come down sooner or later. I just hope that people who are making sensible use of these online services don't lose everything along with the abusers.
      • by Fweeky ( 41046 )
        "Excluding attachments I think it would be practically impossible for anyone to use up Googles 2 gig of storage"

        -% du -sh Mail
          27G Mail
        Only about 5 years worth, and I don't deal much with attachments. Or do you not class "I don't bother deleting email" as practical? Works well enough for me :)
    • by neax ( 961176 )
      the earth is going to become one giant hard disk. perhaps we should outsource to the moon...or mars.
  • by bigforearms ( 1051976 ) on Monday March 05, 2007 @08:16PM (#18245216)
    The furry porn gets deleted first.
  • How many... (Score:3, Funny)

    by Looce ( 1062620 ) on Monday March 05, 2007 @08:17PM (#18245222) Journal
    ... times does the Library of Congress fit in that? Exabytes simply don't speak to me.

    Alternatively, you can also answer in anime episodes, or mp3 files.
    • by LighterShadeOfBlack ( 1011407 ) on Monday March 05, 2007 @08:21PM (#18245262) Homepage
      That'd be 1,191,400 Libraries of Congress.

      Honestly, I don't know why the /. editors allow these "scientific articles" that only provide data in these obscure and archaic "byte" measurements. Absurd!
    • Assuming the "standard" fansub size of 233MB for an ED release, it would be about 760 billion episodes of anime. To put that in perspective, Naruto is currently 224 eps long (85 too many).
      • by Anonymous Coward on Monday March 05, 2007 @08:54PM (#18245538)
        760 billion episodes of anime. In other words, about half the length of a typical Dragonball Z fight scene.
        • As no one has actually seen a Dragonball Z fight conclude, we have no way of knowing how long a typical fight scene would go on for. Personally my bet is that after a few trillion episodes the vibration from the copious yelling of the same three words over and over causes the scene to collapse in on itself like a Moebius strip. This suggests the unpleasant possibility that the typical Dragonball Z fight scene is infinite in length. The truth of this conjecture is a hotly debated unsolved problem in mathematics.
        • by kv9 ( 697238 )
          are they still on Namek?
    • Re:How many... (Score:5, Informative)

      by franksands ( 938435 ) on Monday March 05, 2007 @08:44PM (#18245458) Homepage Journal
      Since you asked:

      Oh, the equivalents! That's like 12 stacks of books that each reach from the Earth to the sun. Or you might think of it as 3 million times the information in all the books ever written, according to IDC. You'd need more than 2 billion of the most capacious iPods on the market to get 161 exabytes.

      I don't have anime estimates, but I can make a Heroes [wikipedia.org] analogy. A hi-def episode is more or less 700 MB. Considering the first season has 23 episodes, that would make 16.1 GB per season. So 161 exabytes would be 10,000,000,000 (ten billion) seasons of Heroes. Since the earth currently has around 6.6 billion people, this would mean that you would have one season for each person on the planet, and all the people of China, India and the US would have a second season. That's how big it is.

      Regarding the storage space, I call shenanigans. We already have HDDs that store terabytes. A couple of years from now, MS Office will require that much space to be installed.
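
      A quick sanity check of the Heroes arithmetic above, as a Python sketch (the 700 MB per episode and 23 episodes per season are the figures assumed in the comment; exabytes are taken as 10^18 bytes):

        # Rough check of the Heroes equivalence (figures assumed from the comment above)
        EPISODE_BYTES = 700 * 10**6          # ~700 MB per hi-def episode (assumption)
        EPISODES_PER_SEASON = 23             # season one of Heroes
        TOTAL_BYTES = 161 * 10**18           # 161 exabytes, decimal units

        season_bytes = EPISODE_BYTES * EPISODES_PER_SEASON    # ~16.1 GB per season
        seasons = TOTAL_BYTES / season_bytes                  # ~10 billion seasons
        print(f"{season_bytes / 10**9:.1f} GB per season, {seasons / 10**9:.1f} billion seasons")
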

      • Um, I already have several Heroes-season equivalents stored here at home. If all this data adds up to a DVD for everybody, with some people getting two, it doesn't seem like something anybody would really even notice amongst all the DVDs they have already.
      • Yeah, well... I'll see your Heroes reference, and raise you a Firefly quote (that is vaguely, but not really at all, related):
        Dr. Simon Tam: Uh, her... her medications are erratic. There's- there's not one that her system can't eventually break down, and...
        Mal: When I want a lot of medical jargon, I'll talk to a doctor.
        Dr. Simon Tam: You are talking to a doctor.
  • by slobber ( 685169 ) on Monday March 05, 2007 @08:18PM (#18245230)
    I left cat /dev/urandom running
  • by Anonymous Coward on Monday March 05, 2007 @08:19PM (#18245240)
    We won't be running out of space just like we didn't run out of food. New technology will allow us to store ever more data.
    • by LighterShadeOfBlack ( 1011407 ) on Monday March 05, 2007 @08:35PM (#18245370) Homepage
      As the article notes, the amount we produce is not the same as the amount we would actually want to store. Since that 161EB includes duplications such as broadcasting, phone calls, and all manner of temporary or real-time data it's not really relevant to compare that number with storage capabilities as the summary implies.
    • We won't be running out of space just like we didn't run out of food. New technology will allow us to store ever more data.

      I remember when software came on cassettes and when food came from close to where you live.

      When floppy disks were too small, we made higher-density floppy disks, and we still needed a whole box of them.
      When there wasn't enough of a particular food, we got it shipped from further away.

      When CD-ROMs came out, we still ended up not only filling them but spreading things over multiple discs.

      • I think you're pushing the scarcity doomwatch angle a bit hard here. Yeah, after 40-odd years we've probably pushed the basic HDD design about as far as it'll go; we'll probably never get more than 1-2TB on a 3.5" drive. But then there are numerous other technologies being developed that seek to improve on the performance and data density that HDDs provide. While they might be a ways behind right now, the fact is that many of these are technologies in their infancy that will, with enough R&D, increase in capacity over time.
      • by babyrat ( 314371 )
        The planet simply can't sustain the 6.5 billion of us there are now, let alone the billions more to be born in the next few decades

        I think you underestimate the size of the planet - it's pretty big.

        There are 6.5 billion people on the earth (well, roughly - last time I counted, I think I lost count at about 4 billion, but I'm pretty sure I was more than halfway at that point).

        Assuming everyone lived in households of 3, and each household had its own acre of land, you would be able to fit the entire population into an area roughly the size of the continental United States.
    • Re: (Score:3, Funny)

      Exactly. My company is developing a new storage medium based on penistechnology. If you don't have enough space, just play with it and it gets bigger. We're close to commercial release, just one more critical bug to iron out: it tends to burst out data if you try to enlarge it for too long.
    • We won't be running out of space just like we didn't run out of food.

      Correction: we haven't run out of food, yet.

      But we animals have a self-correcting system as far as that goes; if the food supply isn't sufficient to support a large population, then the population simply grows at a slower rate. The same will happen with our data; if we reach a point where we can't store all the non-ephemeral data we generate, we'll reflexively limit the amount of non-ephemeral data that we generate.
  • What's an exabyte? (Score:2, Informative)

    by Anonymous Coward
    Simply put, a lot [wikipedia.org]

    10^18 bytes, or one million terabytes
    • Re: (Score:3, Informative)

      by springbox ( 853816 )
      Did they measure in exabytes or exbibytes [wikipedia.org] (2^60 bytes)? The difference between 161 exabytes and 161 exbibytes is 24,620,362,241,702,363,136 bytes - about 21.35 exbibytes. Kind of important, since the margin of error will only increase as the measured data grows. (Let's stop using SI prefixes when we don't actually mean them.)
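
      For anyone who wants to check that figure, a minimal Python sketch of the arithmetic (decimal exabyte = 10^18 bytes, binary exbibyte = 2^60 bytes):

        # Difference between 161 decimal exabytes and 161 binary exbibytes
        EB = 10**18    # exabyte (SI)
        EiB = 2**60    # exbibyte (IEC)

        diff = 161 * EiB - 161 * EB
        print(f"{diff:,} bytes")          # 24,620,362,241,702,363,136 bytes
        print(f"{diff / EiB:.2f} EiB")    # ~21.35 exbibytes
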
    • I've got a much more thorough page at: http://g42.org/tiki/tiki-index.php?page=BigNumbers [g42.org]

      Yotta is the largest metric prefix and it's the next one after Zetta, so it looks like the standards people are going to have to get together to name some more prefixes.

  • by cryfreedomlove ( 929828 ) on Monday March 05, 2007 @08:20PM (#18245260)
    I imagine that a lot of this is web traffic logs. What if the US government really does force ISPs to keep records detailing the sites visited by their customers? Will my ISP rates increase to pay for all of that disk space?
    • by daeg ( 828071 )
      Just wait until the government hears that URLs change and they try to force ISPs to maintain a cache of pages along with the history.
    • Re: (Score:3, Funny)

      by garcia ( 6573 )
      Will my ISP rates increase to pay for all of that disk space?

      No, of course not. Any law or regulation that the government comes up with doesn't have any hidden costs.
      • Re: (Score:3, Funny)

        by daeg ( 828071 )
        Costs be damned when you're The Decider and, much to the dismay of IT budgets everywhere, can change time itself on a whim!
  • by Anonymous Coward on Monday March 05, 2007 @08:25PM (#18245278)
    So the sum total of data has increased by a factor of more than 30 since 2003? I knew Brent Spiner was putting on weight, but damn.
  • by iPaul ( 559200 ) on Monday March 05, 2007 @08:25PM (#18245282) Homepage
    Web server log files with the history of people clicking around. My address stored by everybody I ever bought anything online from. It's more an information landfill than an information warehouse.
  • by Animats ( 122034 ) on Monday March 05, 2007 @08:26PM (#18245288) Homepage

    What's really striking is how little data was available in machine-readable form well into the computer era. In the 1970s, the Stanford AI lab got a feed from the Associated Press wire, simply to get a source of machine-readable text for test purposes. There wasn't much out there.

    In 1971, I visited Western Union's installation in Mahwah, NJ, which was mostly UNIVAC gear. (I worked at a UNIVAC site a few miles away, so I was over there to see how they did some things.) I was shown the primary Western Union international gateway, driven by a pair of real-time UNIVAC 494 computers. All Western Union message traffic between the US and Europe went through there. And the traffic volume was so small that the logging tape was just writing a block every few seconds. Of course, each message cost a few dollars to send; these were "international telegrams".

    Sitting at a CRT terminal was a woman whose job it was to deal with mail bounces. About once a minute, a message would appear on her screen, and she'd correct the address if possible, using some directories she had handy, or return the message to the sender. Think about it. One person was manually handling all the e-mail bounces for all commercial US-Europe traffic. One person.

  • by Supreme Dragon ( 1071194 ) on Monday March 05, 2007 @08:26PM (#18245294)
    Is that the size of the next MS OS?
    • Makes me wonder what the critical mass of junkware is. One of these releases is either going to go supercritical and vaporize MS HQ, or collapse in on itself and suck the whole planet in.
  • I'm just one person and I have 20GB just of OS and application code, plus another 20GB of MP3s. 161 billion GB / 40 GB is about 4 billion 'gelfling people units'. Doesn't seem like a lot.
    • by Umbrae ( 866097 )
      You forget that this removes duplicates.

      Every OS file you have, application file you have, mp3 file you have, is only counted once. So 10000 gelflings is still only 40GB.
      • by gelfling ( 6534 )
        No it isn't. I have 4 other people in my house, 4 other computers, and they have even more per machine. I work for a company that does outsourcing; I don't think there is a reasonable estimate for the number of physical servers we manage, but it's easily in the hundred thousand plus. How much DASD? Who knows. Figure 100GB per server at 40% utilization x 100,000 servers = 4,000,000GB. Double that for offline and nearline storage: 8,000,000GB, easily.
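
        Spelling out that back-of-the-envelope estimate as a Python sketch (the per-server figures are the assumptions stated in the comment, not measured numbers):

          # Outsourcing-fleet storage estimate (assumed figures from the comment above)
          SERVERS = 100_000
          GB_PER_SERVER = 100
          UTILIZATION = 0.40

          online_gb = SERVERS * GB_PER_SERVER * UTILIZATION   # 4,000,000 GB in use
          total_gb = online_gb * 2                            # doubled for offline/nearline copies
          print(f"{online_gb:,.0f} GB online, {total_gb:,.0f} GB total")
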
  • Supply and demand (Score:5, Insightful)

    by rufty_tufty ( 888596 ) on Monday March 05, 2007 @08:31PM (#18245340) Homepage
    I'm sorry, how stupid is this?
    "producing far more data than we will have room to store"

    That's like saying, for the last 2 months, my profit has increased by 10%. If my profit keeps increasing at 10% per month, then pretty soon I'll own all the money in the world, and then I'll own more money than exists! Damn I must stop making money now before I destroy the world economy!!!

    Who are these people who draw straight lines on growth curves? Why do people print the garbage they write and why weren't they the first against the wall after the dot com bust?
    The only things that seem certain are death, taxes, entropy and stupid people...
    • by Looce ( 1062620 ) on Monday March 05, 2007 @08:39PM (#18245404) Journal
      Actually, you're spending some of the money you earn, in investments. You are neither a sink nor a source of money.

      Though with data, some people, or even companies, are merely sinks. They store huge amounts of data, mostly for auditing purposes. Access logs for webservers. Windows NT event logs. Setup logs for Windows Installer apps. For ISPs, a track record of people who got assigned an IP address, in case they get a subpoena. Change logs for DoD documents. Even CVS for developers, to keep track of umpteen old versions of software. Even the casual Web browsing session replicates information in your browser cache. Many more of these examples could be given.

      We also need to produce more and more hardware to store these archived data, the most obiquitous of which is the common hard drive. In the end, we'll need more metal and magnetic matter than the Earth can provide.

      Martian space missions, anyone?
      • by Rodness ( 168429 )
        Forget Mars, we can just tow asteroids into orbit and mine them. Hell, there's already one on the way! :)
      • That's assuming other storage media don't step up to replace today's magnetic storage. There are numerous other storage media on the horizon that don't use much metal or magnetic material and that will far surpass traditional hard drives in data density. By the time the resources to make magnetic drives become prohibitively scarce, I can't imagine it will matter, because most new data will be stored in some sort of crystal or organic material with such a high density that using up all the resources of the Earth won't come into it.
      • by Thuktun ( 221615 )

        We also need to produce more and more hardware to store these archived data, the most obiquitous [sic] of which is the common hard drive. In the end, we'll need more metal and magnetic matter than the Earth can provide.

        Right, and we ran out of wood because we used it all up heating our stone houses and all our land is taken up by pasture to feed the horses we use for transportation.

        Extrapolating our future needs based on the most common /current/ technology is a bit shortsighted.

    • by ni1s ( 1065810 )
      You forgot to assume the summary was sensational.

      "If everybody stored every digital bit, there wouldn't be enough room."
      Well, Duh!
    • by maxume ( 22995 )
      Stupid is relative. If stupid is a certainty, it implies something else is.
  • Data that cannot be stored will not be produced because all data that is produced must be stored. Data that is not stored (for however short a time) is not really produced.

    Then again, the past no longer exists anyway, the future doesn't exist yet, and the present has no duration - so maybe the data never existed anyway. Maybe you don't exist?!?! Aw man, maybe I *~/ disappears in a puff of logic*
    ----
    Kudos to Augustine and Adams
    • This is of course, not true.

      I routinely have to compile static versions of my company's web stores in order to archive them and they are about 1GB each of HTML once compiled.

      Each store, however, is about 100 megs of assets, and then the data in the DB makes up another 50M or so. All of this is then generated dynamically and sent to client browsers that will just cache it temporarily. So the data transmitted may be huge, but what people are storing would appear to be less.

      • May want to start compressing that shit. Use 7z; it's really good at noticing redundancies in logs and backups.
      • by szembek ( 948327 )
        It is stored, albeit temporarily. If there was no room to store it, it could not be created!
    • Of course we will (Score:3, Interesting)

      by PIPBoy3000 ( 619296 )
      Think about scientific instruments that gather gigabytes of data per second. They hold on to that for as long as they have to, pulling out interesting data, summarizing it, and throwing out the rest. I track all the web hits for our corporate Intranet. The volume is so huge that the SQL administrators come and have a little heart-to-heart chat with me if I let it build up over a few months. I don't really care about the raw information past a month or so. Instead, I want to see running counts of which pages are getting used.
    • by istartedi ( 132515 ) on Monday March 05, 2007 @09:24PM (#18245780) Journal

      disappears in a puff of logic

      Great. Now we're all going to be inhaling second-hand logic. There ought to be a law...

  • by ni1s ( 1065810 )
    An anonymous reader tips us to an AP story on a recent study of how much data we are producing. IDC estimates that in 2010 we created, captured, and replicated close to a zettabyte of digital information. The last time anyone tried to estimate global information volume, in 2006, researchers at IDC came up with 161 exabytes. (The current study tries to account for duplicating data -- on the same assumptions as the 2006 study it would have come out at 250 exabytes.) By 2012, according to IDC, we will be producing far more data than we will have room to store.
  • Internet | uniq (Score:3, Insightful)

    by Duncan3 ( 10537 ) on Monday March 05, 2007 @08:42PM (#18245436) Homepage
    The problem is, everything is duplicated, a LOT. All those copies need to be stored though, so here we are swimming in data.

    My work machine that I backed up a couple weeks ago was a 30MB zip file, and 3/4 of that was my local CVS tree. So out of 30GB, less than 1/3000th was not OS, software, or just copied locally from a data store.

    At home, I've saved every email, every picture, everything from my Windows, Linux, OSX and every other box I've ever had since ~1992, and that's barely a few GB uncompressed.

    The amount of non-duplicate useful material is far, far smaller than you would think.
    • by Firehed ( 942385 )
      Only a few gigs? You clearly don't have a camera that shoots in RAW... I've burned through well over ten gigs of storage just from mine, and I've owned it for all of six weeks (averaging to just under 300MB per DAY of new content). Sure, email takes next to nothing and I have plenty of duplicate content, but I have over a terabyte of storage and after doing my best to trim out redundancy, I still have a very sizable chunk of it used. I suppose it's really down to usage habits, but with 10+MP cameras and
    • I too am a programmer, and I have almost every scrap of code I ever wrote, including z80 assembly code to play "pong" on an analog oscilloscope. Why do I have it? I dunno, because I can. I don't even know where it is at this moment, but I THINK I have it on a cd-rom somewhere. And as long as that "archive bit" is set in my mind, it is ok (but if I couldn't find it, I'd just shrug and say, oh well...)

      Text (code, misc letters) IS very small. Up until just a couple of years ago, all the "good stuff" would fi

  • In River Out of Eden [wikipedia.org] Richard Dawkins traces the data explosion of the information age right back to the big bang.

    "The genetic code is not a binary code as in computers, nor an eight-level code as in some telephone systems, but a quaternary code with four symbols. The machine code of the genes is uncannily computerlike."
  • by basic0 ( 182925 ) <mmccollow@yahooGINSBERG.ca minus poet> on Monday March 05, 2007 @09:08PM (#18245646)
    Ok, so we generate some staggering amount of computerized data every year. This is one of those stories where I can't remember hearing about it before, but it really doesn't feel like "news".

    My question is how much of this data is actually being used? I'm horrible for constantly downloading e-books, movies, software, OSes, and other stuff that I'm *intending* to do something with, but often don't get around to. I end up with gigabytes of "stuff" just sucking up disc space or wasting CDs. I burned a DivX copy of Matt Stone and Trey Parker's popular pre-South Park indie film "Orgazmo" in about 2001. I've since seen the film 2 or 3 times on TV. I STILL haven't watched the DivX version I have, and now I can't find the CD I put it on. I know I'm not the only one who does this either, as many of my friends are using up loads of storage space on files they've just been too busy to have a look at.

    Right now I'm on a project digitizing patient files for a neurologist. We're going up to 10 years deep with files for over 18,000 patients. Most of this is *just* for legal purposes and nobody is EVER going to open and read the majority of these files. The doctor does electronic clinics where he consults the patient and adds new pages to their file, which will probably sit there undisturbed until the Ethernet Disk fails someday.

    I think a more interesting story (although probably MUCH more difficult to research) would be "How much computerized data is never used beyond its original creation on a given storage medium?"
  • by Roger W Moore ( 538166 ) on Monday March 05, 2007 @09:24PM (#18245784) Journal
    So at this rate it won't be long before we need real Exabyte tapes. I always thought the original ones should qualify for the award of world's most misleading name, since their capacity was 500 million times less than what their name suggested.
  • Google Says: (Score:3, Interesting)

    by nbritton ( 823086 ) on Monday March 05, 2007 @09:28PM (#18245808)
    (161 exabytes) / 6,525,170,264 people = 26.4931682 gigabytes per person.
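
    The same division as a small Python sketch (using binary units throughout, which is what reproduces the figure Google Calculator gave above):

      # 161 exabytes spread across the 2007 world population (binary units)
      EiB = 2**60
      GiB = 2**30
      population = 6_525_170_264

      per_person = (161 * EiB) / population / GiB
      print(f"{per_person:.7f} gigabytes per person")   # ≈ 26.4931682
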
  • Low SNR (Score:5, Insightful)

    by Jekler ( 626699 ) on Monday March 05, 2007 @10:00PM (#18246038)

    As interesting as the sheer volume is, most of it is garbage. I'd rather have 50 terabytes of organized and accurate information than 500 exabytes of data that isn't organized and whose accuracy, even if it were organized, is questionable at best. In essence, even if you manage to find what you want, the correctness of that information is likely to be very low.

    I've long said we are not in the information age, we are in the data age. The information age will be when we've successfully organized all this crap we're storing/transmitting.

  • I wonder if part of the increase is explained by paranoid companies heavily auditing everything they do, along with all their financial data.
  • by stoicio ( 710327 ) on Monday March 05, 2007 @11:30PM (#18246564) Journal
    It's well and fine to have a statistic like 161 exabytes of data, but what's the point? Is that data any more useful to people than the selective data that was used to run the world 50, 60 or 100 years ago?

    We as individuals are only capable of assimilating a limited amount of information, so most of those exabytes are just rolling around like so many gears in an old machine. If they are minimally used or never used they simply become a storage liability.

    As an example, the internet has not made *better* doctors. Even with all the latest information at their fingertips, professionals are still only the sum of what they can mentally absorb. Too much data, or wrong data (i.e. Wikipedia), can lead to the same levels of inefficiency seen prior to the 'information age'. What would a single doctor do with 160 exabytes of reading material, schedule it into the work day?

    Also, if the amount of information is rated purely in bytes and not in *useful content*, the stats get skewed. Things like movies and music should be ranked by the length of the script and/or notation. That would make the numbers much less than 160 exabytes.

    Saying that the whole world produced 160 exabytes of information is like saying the whole world used 50 billion tonnes of water. Was that water just running down the pipe into the sewer, or did somebody actually drink it to sustain life?

    Mechanistic stats are stupid.
  • Dr Evil (Score:5, Funny)

    by steveoc ( 2661 ) on Tuesday March 06, 2007 @12:12AM (#18246782)
    So Dr. Evil, after emerging from his suspended animation, would demand a computer big enough to store 100 megabytes of evil data.
  • The article implies there is a bunch of new data. The fact is, much of the data is simply format-shifted into the new medium. Examples of this are:

    1 Photography
    2 Letters and correspondence
    3 Filing and records
    4 Music
    5 Telephone calls & faxes
    6 Newspapers and magazines
    7 Novels and books
    8 Board games and puzzles
    9 Movies
    10 Radio and TV broadcasts
    11 ??

    All these forms of data existed before; none of them was digital before. The numbers represent a format shift, not new content. Not many people archived everything back then, either.
    • No one asked me or my family, or my friends, or my colleagues, how much data we store. How do they even come up with those estimates? Do they just count the more "visible" corporate data warehouses and ignore the millions of individual users?
    • You can't "produce more data than is stored"; this is idiotic to claim. How do they imagine this happening? Millions of people stubbornly trying to save files on their full hard drives, leading to a global crisis? This should join the FUD that we're about to "clog" the Internet.
    • It's amazingly simple to produce and duplicate more data than you have room to store. You simply don't permanently store it all.

      Is this a concept that is so hard to understand? Many replies above don't seem to grasp the concept of data not actually being kept.
    • Like the previous poster said, we all produce tons of data daily without storing it. Mobile phone calls. VOIP. Video conferencing. IM without history. etc. That's countless gigabytes daily worldwide...
      • by suv4x4 ( 956391 )
        Like the previous poster said, we all produce tons of data daily without storing it. Mobile phone calls. VOIP. Video conferencing. IM without history. etc. That's countless gigabytes daily worldwide...

        I'm sorry that I have to clarify myself, but neither I nor the survey includes temporary data in the discussion. We're talking about data that is stored and represents archived information.

        Otherwise where do we stop? Do we count copies of the programs in RAM, swap files, temporary caches and so on? It'll become pointless.
  • by nbritton ( 823086 ) on Tuesday March 06, 2007 @07:27AM (#18248458)
    50 exabytes = 50 x 1,024 petabytes = 50 x 1,048,576 terabytes:

    RAID6 (24 Drives -2{Parity} -1{Hot Spare} = 21) 750GB, 13.48TB ZFS/Solaris:
      93,345,048 750GB Hard Drives:     $17,735,559,120
       3,889,377 Areca ARC-1280ML:       $4,317,208,470
       1,944,689 Motherboards/Mem/CPU:     $766,207,466
       1,944,689 5U Rackmount Chassis:   $4,546,682,882
         194,469 4 Post 50U Racks:          $45,700,215
           3,684 528-port 1Gbps Switches:  $374,294,400
              40 96-port 10Gbps Switches:   $11,424,000
       1,948,935 Network Cables:             $2,020,812
               ? Assembly Robots/Misc.     $111,000,000

    Sub Total:                          $27,910,097,365
    Tax/Shipping:                        $2,645,915,779
    Grand Total:                        $30,556,013,144

    $470 billion cheaper than the Iraq war.
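
    A Python sketch that reproduces the array and drive counts above (the 13.48 TB usable per 24-drive array and the per-drive price are taken as given from the parent post; this is just the arithmetic, not a vetted design):

      import math

      # Recompute the parent's build-out for 50 exabytes (binary units, as the parent uses)
      capacity_tb = 50 * 1024 * 1024       # 50 EiB expressed in TiB = 52,428,800
      usable_tb_per_array = 13.48          # 24 x 750GB drives, RAID6 + hot spare, ZFS (figure assumed above)
      drives_per_array = 24
      drive_price = 190                    # implied by $17,735,559,120 / 93,345,048 drives

      arrays = math.ceil(capacity_tb / usable_tb_per_array)   # 3,889,377 arrays
      drives = arrays * drives_per_array                      # 93,345,048 drives
      print(f"{arrays:,} arrays, {drives:,} drives, ${drives * drive_price:,} in drives alone")
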
    • To get a feel for the size and scale of 50 exabytes using today's technology,

      * Two copies of the entire Library of Congress, 6000 TB[1], can be stored in the collective cache buffers of the RAID controllers.
      * It would need a 1,712 MW (peak) power source, a typical PWR nuclear power station produces 2,000 MW. Tack on another $5 billion for the construction of a nuclear power station.
      * You would likely need to employ an entire team (in 3 shifts) to replace defective drives every day.
      * You would need 1,684,80
