To Purge Or Not To Purge Your Data 190
Lucas123 writes "The average company pays from $1 million to $3 million per terabyte of data during legal e-discovery. The average employee generates 10GB of data per year at a cost of $5 per gigabyte to back it up — so a 5,000-worker company will pay out $1.25 million for five years of storage. So while you need to pay attention to retaining data for business and legal requirements, experts say you also need to be keeping less, according to a story on Computerworld. The problem is, most organizations hang on to more data than they need, for much longer than they should. 'Many people would prefer to throw technology at the problem than address it at a business level by making changes in policies and processes.'"
Easier to keep (Score:5, Insightful)
Re:Easier to keep (Score:5, Insightful)
True, proper archiving takes huge amounts of time since it adds overhead to your operation.
In an ideal world, everything you store is automatically labeled and old data is automagically purged. But storing all kinds of shit is just that much easier. It also doesn't help that data storage is so dirt cheap; 1TB can be bought for around $100 if I am not mistaken. It doesn't pay to kill old useless stuff you have floating on your hard disk.
Re:Easier to keep (Score:4, Insightful)
Yes, less data needs to be kept, but first there needs to be a _massive_ re-education of the 'data packrat' culture among users.
Re:Easier to keep (Score:4, Interesting)
you'll need to filter your 'customer communications' from your 'shopping lists'
Actually, I thought it was a fairly common legal tactic to make the data as difficult to actually find as possible, without revealing too much to the other side.
"They want records from three years ago? Send a truck with printouts of all the files we have, that'll keep them busy..."
Does anyone know that this is no longer the case?
Re:Easier to keep (Score:5, Interesting)
As time went on, fewer things ended up on paper, but the rules of discovery didn't evolve. That was the time of backing up a U-Haul full of printed copies of every file, e-mail, etc. that a company had. The opposition then had to dig through mounds of trash in the hope of finding that one incriminating document.
Then attorneys got more savvy, and in the so-called Rule 26 (refers to the Federal Rules of Civil Procedure), the attorneys would agree on the format of ESI to be exchanged. In December, 2006, the Federal Rules of Civil Procedure changed to directly address ESI and electronic discovery.
Now, in litigation, parties may still get obnoxious amounts of data, but it's electronic. Once it's processed and converted (usually to TIFFs with extracted text, but sometimes PDF), attorneys can do what amounts to a Google search through the files and find what they want pretty quickly. In fact, paper documents are usually scanned and OCRed so they can be handled and searched in the same manner.
Actually, I thought it was a fairly common legal tactic to make the data as difficult to actually find as possible, without revealing too much to the other side.
"They want records from three years ago? Send a truck with printouts of all the files we have, that'll keep them busy..."
Does anyone know that this is no longer the case?
So no, it's no longer the case. But the first guy who did it must have thought he was pretty funny.
Re: (Score:2)
attorneys can do what amounts to a Google search
No, they can do a search. Why compare it to Google?
Re: (Score:2)
Re: (Score:2, Interesting)
Let's not get snippy here, but I think the consensus is that:
Of the 30 gigs of things I've put on this laptop th
Re: (Score:2)
There are 4 boxes to use in the defense of liberty: soap, ballot, jury, ammo. Use in that order. Starting now.
Cutesy, but there's no particular reason that "jury" has to come after "ballot". Needs a rewrite if you want it to retain its cuteness.
Re:Easier to keep (Score:5, Insightful)
Cheaper to keep. Every hour I waste cleaning house costs more than it does to keep it stored. Storage continues to get cheaper; salaries typically don't. Sure, that $1.25M is a big scary number, but it's nothing compared to the salaries/benefits at a 5,000-person company. Now you can argue the cost of data retrieval goes way up, because chances are it'll take a hell of a lot longer to find things, but that's a different argument altogether, and you can just as easily ask what it costs when something cleaned out by accident can't be recovered.
Re:Easier to keep (Score:4, Interesting)
Lovely scaremongering, but what did they mean by legal e-discovery? The time it takes to sort through the data or what?
The funny thing is it depends on your MTA (Score:2, Interesting)
My 10GB mailbox in Outlook, when mirrored to my local hard drive in MBOX format, automagically becomes 2 GB - and that's before compression and attachment pruning.
I have no idea what the hell Outlook is doing on the server, whether it is storing things in multiple formats at once or just miscalculating all the space, but that is one hell of a difference.
Re: (Score:2)
Re: (Score:2)
I cannot say what he does, but if the Exchange server is open for IMAP, you can telnet to it and issue IMAP commands to dump everything in RFC 822 format. It ends up very close to mbox format; it might even have a From_ line.
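A minimal sketch of that kind of dump using Python's imaplib instead of raw telnet (the host, account, and mailbox path below are placeholders, and the sketch assumes the server allows plain IMAP logins):

```python
import imaplib
import mailbox

def append_rfc822(mbox_path, raw_bytes):
    """Append one raw RFC 822 message to an mbox file (mailbox adds the From_ line)."""
    box = mailbox.mbox(mbox_path)
    box.add(raw_bytes)
    box.flush()
    box.close()

def dump_inbox(host, user, password, mbox_path):
    """Fetch every INBOX message in RFC 822 format and write it to an mbox."""
    imap = imaplib.IMAP4(host)            # use imaplib.IMAP4_SSL if the server requires TLS
    imap.login(user, password)
    imap.select("INBOX", readonly=True)   # readonly so nothing gets flagged as read
    _, data = imap.search(None, "ALL")
    for num in data[0].split():
        _, msg = imap.fetch(num, "(RFC822)")
        append_rfc822(mbox_path, msg[0][1])
    imap.logout()
```

Called as `dump_inbox("mail.example.com", "jdoe", "secret", "dump.mbox")`, this produces a file any mbox-aware mail client can read.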
Re: (Score:3, Insightful)
Re: (Score:2)
210MB is a lot. That's as large as my CVS repository, which I have added to daily for ten years or so, and which contains lots of external data too (a copy of The Great Gatsby in troff format is in there somewhere).
Re: (Score:3, Informative)
1) This is the average. Your company might have 700MB/user; in my organization, it's close to 1TB/user/year that gets added. We're doing medical imaging.
2) It's not just tape libraries. The cost for D2D2T or D2D2D (what we're doing) goes way up compared to a 'simple' backup scheme. Especially if you're like us and require multiple gigabit streams, disk storage can't be just 4 cheap SATA disks in RAID5. We have 2 storage arrays with 14 drives each for general access and another storage array with 10 SATA dis
Re: (Score:2)
When I worked in enterprise environments, my cost went up for backups but cost per GB went down. In general that is the rule I have found, in larger environments my cost per MB goes down significantly not up.
My point boils down to this, general stats like they have above are useless because we have environments like yours where you do medical imaging, and environments like mine where we do a mixture of marketing
Re: (Score:2)
Jeez, when did you last work at a large company?
We easily get close to 10 GB per person, and we are reasonably vigilant about it.
Then you have the total cost of the backup: the drive (not as cheap as a home drive, but still cheap), the person receiving it, the people to put a drive in, the process of managing the disk arrays, the NAS, the backing up, and insurance. Plus normal overhead.
Legal e-discovery is time consuming because it needs humans involved. People may be trying to hide what they are doing in a ma
Yes--deleting costs money! (Score:5, Insightful)
I did a back-of-the-envelope calculation on just this question in 2004, and estimated that file deletion was not productive unless we could do it at a rate of at least 17MB per minute (of labor). Four years later the threshold is probably at least 45MB per minute.
Generally, this means that if we can blow away whole disks or huge directories of data, it may pay off. Users going through their files one by one is usually an absolute waste.
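The break-even arithmetic behind a figure like that can be sketched as follows (the wage and the fully burdened storage cost here are illustrative assumptions, not the poster's actual inputs):

```python
# If labor costs X $/minute and storage costs Y $/MB over the data's lifetime,
# manual deletion only pays off above X / Y megabytes freed per minute.
HOURLY_WAGE = 50.0          # $/hour, fully loaded (assumption)
STORAGE_COST_PER_GB = 50.0  # $/GB lifetime cost incl. backups (2004-era assumption)

cost_per_minute = HOURLY_WAGE / 60.0
storage_cost_per_mb = STORAGE_COST_PER_GB / 1024.0
breakeven_mb_per_min = cost_per_minute / storage_cost_per_mb
print(f"Deletion pays off only above {breakeven_mb_per_min:.0f} MB per minute")
```

With these inputs the threshold lands right around the 17 MB/minute figure; as storage gets cheaper the denominator shrinks and the threshold climbs, which is the poster's point.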
Re: (Score:3)
Creation time
Last Access Time
Last Modified Time
If we also had a
Last backed up time/scanned time
that virus scanners and backup software could use instead, then you can track last-access to eliminate files that haven't been opened by end-users in a particular time period for permanent offsiting or removal. Making today's complex HSM architectures easier to implement or not necessa
Re: (Score:2)
Access time tracking is routinely turned off to improve performance of filesystems.
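A sketch of the last-access sweep described above, in Python (assuming atime is actually being updated, which, as the reply notes, is often disabled via noatime mounts):

```python
import time
from pathlib import Path

def stale_files(root, days=365):
    """Yield files under `root` whose last access time is older than `days`.

    Relies on st_atime; filesystems mounted noatime/relatime will not
    report meaningful values, so check the mount options first.
    """
    cutoff = time.time() - days * 86400
    for path in Path(root).rglob("*"):
        if path.is_file() and path.stat().st_atime < cutoff:
            yield path
```

Candidates found this way could then be moved to cheaper storage or offsited rather than deleted outright.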
Identifying useful data is still manual (Score:2)
Sure, it's easy to just discard everything older than X. The problem is that there _is_ data you need to keep for a long time, so that crude an approach isn't very effective. (For instance, for financial records you need to keep a record of everything you've bought until you sell it or declare it fully depreciated, and then keep those records for N years longer for tax purposes.)
For my work, I don't usually need files much older than 2-3 years, but occasionally I do need to drag out something 10 years old
Although I agree with you in principle.... (Score:3, Informative)
I've become the e-discovery guy (at least for email) where I work. Our lawyers told me that the latest revision of FRCP (Federal Rules of Civil Procedure) require an entity to keep evidence, even if automatic purging systems are in place.
Rule 37 of FRCP [cornell.edu] says that if you are ordered to hand over the evidence, and you cannot, then the judge can order that "designated facts be taken as established for purposes of the action, as the prevailing party claims". In other words, if the person suing you claims you s
Re: (Score:3, Insightful)
The problem is that it's easier to just archive the cruft stuff than it is to go through it all and figure out what's worth keeping or training staff to organize their data and retain only that which is necessary.
There, fixed that for you. Meta-tags and other efforts might change this in the future, but until there is a generalized understanding of things that should be archived and things that should not, and a better way to store, find, retrieve, and utilize company data, there will be tons of data save
Re: (Score:3, Interesting)
The problem is that it's easier to just archive the cruft stuff than it is to go through it all and figure out what's worth keeping or training staff to organize their data and retain only that which is necessary.
There, fixed that for you.
According to the original article, ("The average employee generates 10GB of data per year at a cost of $5 per gigabyte to back it up ") the cost of backups is fifty dollars a year per employee.
So if an average employee costs the company $100 per hour (including overhead), then if "training staff to organize their data and retain only that which is necessary" takes more than half an hour per year, it's more cost effective to archive the junk than it is to train the employees to sort it.
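That break-even works out as follows (same assumed figures as the comment: $5/GB-year backups and a $100/hour loaded cost):

```python
# Annual backup cost per employee vs. the loaded hourly cost of their time.
GB_PER_YEAR = 10
BACKUP_COST_PER_GB = 5.0        # $/GB/year, from the article summary
LOADED_HOURLY_COST = 100.0      # $/hour incl. overhead (the parent's assumption)

backup_cost = GB_PER_YEAR * BACKUP_COST_PER_GB          # $50/year of junk storage
breakeven_hours = backup_cost / LOADED_HOURLY_COST      # hours of training it can justify
print(f"Training must take under {breakeven_hours:.1f} h/year to beat archiving")
```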
Re: (Score:2)
Even that is only true if data-storage costs are constant -- or employee data grows in parallel with falling costs. Which seems unlikely.
More like it: storing something for a year costs half of storing it forever, because storage costs drop like a rock while data keeps growing.
If I were to delete EVERY file in my home directory that is more than 3 years old, I'd save 15% of the space used. If I were to delete every file more than 5 years old, I'd save 4% of the space used.
Which frankly ain't worth it.
Re:Easier to keep (Score:5, Insightful)
The bigger problem is that you will fight different battles. If you're fighting a sales rep that sold your clients to a competitor, you want as much ammunition as possible. If a client is suing you for incorrect information relayed 8 years ago and you're probably guilty, you want as little information as possible.
Re: (Score:3, Insightful)
My last job (Score:3, Interesting)
This company occasionally needed blueprints from the 1930s/1940s (great lakes ships), but none of their ships went back much
Yes and... (Score:2)
...whilst policies and procedures often solve a lot of things in a cleaner, more common sense manner there are unfortunately far too many people lacking common sense.
Throwing hardware at it guarantees it'll be done; expecting people to follow policies and procedures will likely leave you with a 50% success rate in ensuring the correct data is kept/binned, and that's if you're lucky.
The world as a whole would be so much more efficient if we could get people to follow policies and procedures or at least the co
Re: (Score:2)
Exactly. If it takes me two hours per week to sort through every bit of my data and decide what to pitch, that cost has to be compared to the archival cost to decide whether it is a worthwhile endeavor.
Of course, at my office, we just bought a server and a controller with 16 SATA ports, filled the sucker up with off-the-shelf 500GB disks, and built a 7TB RAID6 using Linux software RAID. The whole job only cost about $2k, and we no longer waste any time deciding what to delete and what to keep.
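A build like that can be sketched with mdadm (device names and the mount point below are illustrative; 16 x 500 GB in RAID 6 loses two disks to parity, leaving 14 x 500 GB = 7 TB usable):

```shell
# Create a 16-disk RAID 6 array from the SATA drives (assumed to appear as sdb..sdq)
mdadm --create /dev/md0 --level=6 --raid-devices=16 /dev/sd[b-q]

# Put a filesystem on the array and mount it as the archive volume
mkfs.ext4 /dev/md0
mkdir -p /srv/archive
mount /dev/md0 /srv/archive

# Persist the array configuration across reboots
mdadm --detail --scan >> /etc/mdadm/mdadm.conf
```

RAID 6 survives any two simultaneous drive failures, which matters with that many cheap off-the-shelf disks in one array.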
Re: (Score:2)
Re: (Score:2)
Re:Easier to keep (Score:4, Interesting)
Result, look up a customer and you would find some files scanned half a dozen times.
Huh? (Score:5, Insightful)
Re: (Score:2)
what about the costs of reviewing and purging that data? That is straight up time, whether it's reviewing existing data or spending the time to create guidelines for which data to keep.
Right now, the-way-things-are-done is to save it all and pay for it.
You can train employees to change the-way-things-are-done.
The learning curve is expensive, but the general idea (aspirational, as with anything corporate) is that once everyone figures out the policies, time is used more efficiently and the 'cost' goes down.
And time costs money. More than storage.
Can I see the report that verifies your assertions?
You did have someone study the long term costs and give you hard numbers, didn't you?
A company isn't going to fsck around their multi-m
Re: (Score:2)
To put this into perspective, we have PRA requests for all sorts of "data" that we are supposed to keep. It has become almost a full time job going through all the crap to find what the PRA requests are asking for.
And we're a SMALL school district.
Re: (Score:2)
> and those costs are even higher when done by a law team during a discovery
> process. Gets quite expensive when law teams are billing $1k per hour to do
> discovery.
This is a very good point. The more data you have and the more poorly organized it is the more it costs you to honor discovery requests whether or not anything relevant is found. Thus there exists incentive to index your archive and minimize its size even if you are confident that it contains nothing that could be used against you.
10 GB user data? Not likely (Score:5, Insightful)
10 GB of data per user, sure.
10 GB of user data, no way.
Assuming 300 work days per employee (at 8 hours each), that would mean the average employee creates 1.2 kB of data per second.
The only way this could be true is if you count data that isn't user generated, and they count the total data storage for the company and divide it by employees.
If so, users deleting their e-mails won't have much of an effect.
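The arithmetic behind the parent's figure, for anyone checking (10 GB spread over 300 eight-hour working days):

```python
# 10 GB/year over 300 working days of 8 hours each
SECONDS_WORKED = 300 * 8 * 3600        # 8,640,000 seconds per year
rate = 10e9 / SECONDS_WORKED           # bytes per second
print(f"{rate:.0f} bytes/s, i.e. roughly 1.2 kB every second of the workday")
```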
Re: (Score:2)
I've been in my current position almost a year now, and I've already generated about half a terabyte of data - and that's only the stuff I've decided is worth keeping (I've probably generated several terabytes in reality). Of course, I'm probably not your average office worker -- my data is mostly Monte Carlo simulations of proteins, on the order of millions (some in the billions) of steps long. Some of the largest trajectories are 45 GB (yes, that's one file).
Far more than 10GB of original data (Score:2)
10GB of original data is easy, and it doesn't take a year, just a week or two. Today and yesterday, I measured physical properties of a lot of output from a particular industrial process (just one plant in a factory, and I only recorded measurements of a few instruments). This only gave me a few hundred MB of raw data, but it will result in several GB of data after analysis. This is all original data, and this is a normal amount of output. I regularly fill several DVDs with this sort of archive data.
Of cour
Re: (Score:2)
It sounds like you work with a lot of old people who have to send attachments to themselves because they haven't heard of this new-fangled thing called "FTP".
Re: (Score:2)
Re: (Score:2)
You're obviously not writing software, doing CAD work, or any kind of computational modeling. It's easy to have that much data -- my source tree alone is 2GB.
And what about our colleagues in the porn production industry? I mean, one hour of hi-res MPEG is a lot of megabytes. Multiply it by the number of, ah, employees...
Re: (Score:3, Funny)
Assuming 300 work days per employee (at 8 hours each), that would mean the average employee creates 1.2 kB of data per second.
Top posting and absence of editing by Microsoft Outlook users engaged in a brief inter-departmental discussion could easily account for that volume.
Is that what you meant by "isn't user generated"?
Re: (Score:2)
They count more than just the stuff you typed as "user data." For example, Linux admins download ISOs, lawyers download PDFs, Windows admins download patches, service packs, and malware cleaning tools, and sales people download porn. All this data is used by the users and must be archived.
It's not the storage... it's the apps (Score:4, Insightful)
Apps aren't really designed with this in mind. They don't come at the problem from a "document lifecycle" perspective, but from a "document creation" one.
This is generally because data has a variable lifespan. Let's take an email that is part of a project as an example. As the author, I may decide that the email isn't needed after a week, so I set an expiry of 1 week. But you, as the recipient, may turn that email into several tasks, so for you the email is much more important and you want to keep it for much longer.
Users aren't really going to be good at making these decisions unless some application continually bombards them with "go check the status of these 1000 documents you've got".
Re: (Score:3, Informative)
Users aren't meant to be making those decisions, the Records Management department should be... that is if you even have one! If you leave everything up to the users, you WILL have a cluster fuck of records.
I work in Records Management at a large company with many different divisions in diverse fields. RM is completely left up to us. We manage well over 10,000 boxes and there's only 3 of us. We alone determine when something is to be destroyed (but require authorization from dept heads to be shredded), how
Mod parent way up! (Score:4, Interesting)
Congratulations. You're the first person I've seen who understands that.
Accounting understands the need to close one year and open the next. They have processes for what is carried over and how it is identified.
Yet no other department (or application) understands the need to close old data and archive it.
Re: (Score:2)
Is this significantly different from tagging a release in a version control system?
Re: (Score:2)
Well, ERP solutions try to assign other units to "resources" (not just money) and store them in a subledger somewhere. And BPM systems are trying to do that with everything else.
Re: (Score:2)
Does it matter? (Score:2)
There should be enough local cache for every user to have access to every document they could possibly create, unless you are working at a movie company. Given proper indexing, it should be possible for users to find what they need.
Storage is cheap enough for this to work, even if some documents are slow (compressed, maybe combined as deltas with other very similar documents) or very slow (have to pull from tape or something). But again, all of that which an average user needs should be cacheable on their o
It depends upon business (Score:2, Informative)
For example, financial institutions are required to keep data for longer periods, both for legal purposes and for traceability (during investigation of fraud or other kinds of crime). The banks I worked for had a legal requirement of keeping data at 2 places at least 15 km apart, with all kinds of protection against fire and intrusion.
A good manufacturing company would keep data for a longer period not only to comply with ISO standards, but to trace manufacturing defects and as good evidence of past history for insuran
Re: (Score:3, Insightful)
Additionally, there are many businesses that don't understand their data retention requirements beyond 'we need to keep some data for 10 years', so instead of compartmentalizing their data and saying 'keep this for 10 years, that for 5 years, and purge this every year and that every 3 months', they just keep everything. Further, if they have a data retention requirement for 3 years or 10 years, they might wait longer before purging it just because it's easier to keep it than it is to go find and remove the
Choosing your battles (Score:2)
It's far better to spend a few $K than to waste literally weeks of time trying to sort things out, especially when you need sales to be selling a
Re: (Score:2)
Exactly. I've worked at my current company for about three years. It'd take me a few days at least to go through all the documents that I've created since I've been here. The cost of storing all those documents is significantly less than the billable hours that my company would have to give up for me to spend those days sorting paper. Not to mention the fact that I can't imagine having the luxury of a few days without having to worry about projects/clients/etc. and having the time to focus on sorting through sta
Re: (Score:2)
Well, if you have done it more than once at some tiny shit hole company, I guess that's the way to do it...
Email Attachments (Score:5, Insightful)
Then again, I'm biased - I believe email should just be pure text. Perhaps that's a sign that I'm now old...
Re: (Score:2)
I guess that might explain all the SAN storage requests for our email archive servers.
Re: (Score:2)
I'm 500% better than average! (Score:2)
average employee generates 10GB of data per year at a cost of $5 per gigabyte to back it up...
I cry nonsense in the statement above.
I put a 25 cent blank DVD into the DVDwriter of my PC. Then I copy the entire contents of my 'C:\backup' folder onto this DVD. I start the program, and go do something else. Total dedicated time: 2 minutes
When the DVD write is done, I write a label code on the DVD (date, employee, backup number) and put the disk back on the stack in the file cabinet. Total dedicated time: 2
Re: (Score:2)
Re: (Score:3, Insightful)
Unfortunately, writable DVDs are not an acceptable archive medium, and a stack of disks with written labels is not an indexing solution that will scale beyond one person.
Re: (Score:2)
My salary and benefits: $18/hr
Time used on backup: 0.067 hrs
My cost per gigabyte of backup: $1
And you backed it up a total of once. The cost of $5 is likely a yearly cost (as the volume is yearly); backups are usually done once per day. Your yearly costs would be in the hundreds of dollars per gigabyte.
Re: (Score:2)
My salary and benefits: $18/hr
Time used on backup: 0.067 hrs
My cost per gigabyte of backup: $1
You haven't counted overhead. First, there is your personal overhead. Do you talk to your co-workers in the hall? Get coffee on company time? Go to the bathroom? Fill out time sheets to account for what you do all day? Read memos telling you that you have to fill out time sheets? Read your e-mail? Post comments to slashdot at 10:47AM on a workday? Only robots are 100% efficient in their use of time.
And then there is company overhead-- your computer, pens, paper, copy machine, office, lighting, se
Re: (Score:2)
That is a completely ignorant example when what's actually needed is backing up thousands of people and billions of transactions.
It was easier with paper... (Score:2)
Used to be records were kept on paper,
paper was kept in boxes,
and boxes were dated MM/YY.
I came into the office one fine 1998 January 02,
and the hallway was stacked full of boxes dated 01/94,
02/94, 03/94, etc.
Company policy was discard records after three years,
so all records from 1994 were on their way to the dumpster.
Re: (Score:2)
So THAT explains why they kept moving Milton's desk (image [dereksemmler.com])! I guess all those TPS reports take up space!
keep, but not on the high-performance disk arrays (Score:2)
Communicate less (Score:3, Interesting)
Re: (Score:2)
Or communicate less in writing - I personally have had this policy for a long time. If I worry that a question, comment, concern, etc might not reflect well on me in the future, I walk into my boss's office and ask out loud (with the door shut.) If I want the communication to be recorded for all eternity I use email...
easy solution (Score:3, Funny)
put everything on one disk drive, unRAIDed. when it fails, problem solved. voila, built in obsolescence
Future BI. (Score:2)
Business Intelligence software just may make use of this data. While a lot of businesses are STUPID in their use of BI software, at some point either the company dies or it gets a clue and does some BI analysis on its data.
You actually can do some amazing things with BI. Say, for example, you are storing time card data from employees, and you want to check the effectiveness of managers. With, say, 20 years of time card data and employee records of which manager is which, you just may find a correlation
litigation hold (Score:2, Informative)
Who decides what to delete? (Score:2)
Look at how people deal with email. I've got coworkers who have every single email (including mailing lists they've subscribed to) they've ever sent or received since they started (~8 yrs ago). They've probably got 20GB of email on their laptops. Now we only allow 100MB of server-based email storage, so that helps on the server side, but we're still backing up those laptops.
On the datacenter side, we had a database corruption about 10years ago so we implemented snapshots, and then snapshots of those sn
This is what Retention Policies are for (Score:2)
IANAL. This is why most companies spend some money developing a retention policy and planning its implementation. It requires a bit of time from every employee to decide if a piece of information is something that requires short term, long term or permanent storage but if you get people into the habit of sorting things like email into folders that reflect the company retention policies (which need to be pretty clear and well planned both from an IT and a legal perspective) then you can reduce the cruft you
Throwing Policies at a Technology Problem? (Score:2)
What about throwing company policies at a technology problem?
Hypothetically (never happens in the real world, of course): what if there was a document management server, a Samba dropbox, where all documentation for deliverables is kept in portable Excel 2003 format? What if content identification is done by creating folders with "project" and "project"_old naming conventions, hyperlinking is done in Excel (because HTML is complicated), and so on ad nauseam for the automated process called "company policy"?
Store Smarter, Not Just More (Score:3, Interesting)
Let's say your corp is more than 50% likely to go through "e-discovery" once every 10 years. Each worker will generate 10GB * 10 years = 100GB, and backing up the growing pile means storing 10 + 20 + ... + 100 = 550 GB-years (pairing the balancing ends of the accumulation: 110GB * 5), which at $5:GB is $2,750. Add about $2M:TB * 0.55TB = $1.1M for e-discovery, for a total of $1,102,750 per worker, times at least 0.50 probability is at least $551,375 average predictable cost per employee.
One approach is to keep much less data. But when you keep less data, you have to guess right every time what data you'll need later. If your process discards data that's valuable later (but lost) it better be worth less than the amount you save. That's too hard to know, which is one reason companies keep all the data, and figure it out later.
A better approach is just to cut that $1-3M:TB e-discovery cost. Of course, the best way is to avoid being investigated, but one has less than 100% control over that, especially from inside the IT department. A much better way to do it is to better inventory the data stored as you go along accumulating it, in the terms in which a later e-discovery would search it. Which also can have the benefit of making the info in the data more available in the normal course of business, which can make that data's increased value (and lowered costs of searching it) worth the entire process. The cheaper possible e-discovery would be just a bonus.
What really gets me is how these economics are the true cost of storage. A 1TB drive costs $120, and maybe a better 1TB in a 100% redundant RAID costs $250. But it really costs something like $300,000 over its lifetime (probably replaced every 3 or so years, across the 10 years I analyzed). If IT spent a few hundred hours a year streamlining the navigation of all that data, at a cost of a few tens of thousands of dollars, divided across all those employees, the entire org's IT operations would be much more economical once the large cumulative risk of e-discovery costs is factored into the true cost.
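A rough restatement of the grandparent's model as code (the per-GB inputs come from the article summary; the arithmetic-series sum and the assumption that everything retained gets reviewed during discovery are mine, so treat the output as illustrative only):

```python
GB_PER_YEAR = 10                 # per worker, from the summary
YEARS = 10
BACKUP_COST_PER_GB = 5.0         # $/GB-year, from the summary
DISCOVERY_COST_PER_GB = 2_000.0  # midpoint of the $1M-$3M per TB range
P_DISCOVERY = 0.5                # chance of facing e-discovery in the period

# The pile grows 10, 20, ..., 100 GB, so the cumulative backed-up volume
# is the sum of an arithmetic series: 550 GB-years per worker.
gb_years = sum(GB_PER_YEAR * y for y in range(1, YEARS + 1))
backup_cost = gb_years * BACKUP_COST_PER_GB
discovery_cost = gb_years * DISCOVERY_COST_PER_GB  # if all retained data is reviewed
expected = P_DISCOVERY * (backup_cost + discovery_cost)
print(f"{gb_years} GB-years, expected cost ~ ${expected:,.0f} per worker")
```

Even with these rough assumptions, the review cost dwarfs the backup cost by a factor of several hundred, which is the thread's central point.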
Re: (Score:3, Insightful)
Re: (Score:2)
"Innocent until proven guilty" only applies in criminal cases. In civil cases - the kind a business is most likely to encounter - the exact opposite is typically true.
=Smidge=
Re:hmm (Score:5, Interesting)
They called it 'desk cleanout day', and unless you were an official dedicated contact on a particular subject you were to wipe all correspondence of more than a year old.
(There were also other grades of information, but erase after a year was the default).
Re: (Score:2)
The top 500 company I worked for did just the opposite: Destroy all data in case a legal issue comes up. They called it 'desk cleanout day',
Enron?
Re: (Score:2)
And they claim to be ethical.
Re: (Score:3, Interesting)
That was a common company-wide AT&T policy: wipe everything after 60 days. All email was to be deleted after 60 days. It was a fireable offense to create a PST file on your desktop, and we swept corporate PCs for PST files on a regular basis.
It really did not stop anyone from keeping info; many managers simply printed out the emails and kept them in files. One IT manager we let go had 3 years of email printed and stored in file cabinets in his office. It was insane.
Re: (Score:2)
What's a pst file?
Re: (Score:2)
Yeah this whole thing seems a little fishy... (Score:2)
Re: (Score:2)
On top of what you said - $5 a gigabyte? What is this, 1998? Even if you get WD's highest-quality consumer hard drives, they're about $1 a gigabyte, and if you buy them in bulk they're probably considerably cheaper. You can use 2 or 3 of them for data redundancy, and it's still significantly cheaper. I question where they got that number.
As soon as you say that I can be reasonably sure that you've never factored in storage costs for anything fancier than a desktop PC.
SAS disks are typically 3-5 times more expensive per drive. Factor in RAID (level 5 if you want capacity, 10 if you want performance, 6 if you want a compromise of both) and you can potentially double the cost per gigabyte. But you can't get 15,000 RPM SATA disks, and you can't bond SATA channels together for performance.
Secondly, seeing as the subject is archiving they're probabl
By your interesting math... (Score:2)
Drive is 3-5x more expensive than $1 a gigabyte...raid level 5 means 2+ drives, we're to $6-10 already, then you say the majority of the cost wouldn't be in the media...
From working at a large university, three fortune 500 companies, and now the small business I work for, I don't think it's even suggestible that most user data is backed up in an out-sourced tape data center. That's an absurd suggestion. The vast majority of data never makes it off either a local hard
Re: (Score:2)
Drive is 3-5x more expensive than $1 a gigabyte...raid level 5 means 2+ drives, we're to $6-10 already, then you say the majority of the cost wouldn't be in the media...
RAID 5 means 3+ drives, and means that you lose 1 drive worth of capacity.
You probably wouldn't use RAID 5 with drives that size because rebuilding the array would take too long. And I don't think you can get 1TB SAS drives yet.
From working at a large university, three fortune 500 companies, and now the small business I work for, I don't think it's even suggestible that most user data is backed up in an out-sourced tape data center. That's an absurd suggestion. The vast majority of data never makes it off either a local hard drive or a temporary, lightly backed up network "drive".
I'm sorry, but that is so far at odds with all my experience that it's not even worth my time to discuss it.
"Local hard drive"??!
Yes local hard drive (Score:2)
We're not talking SQL servers here, or customer information databases, we're talking average employees performing their business duties.
Re: (Score:2)
Not even worth your time to discuss it..hahahahaha. OK, you're right, every company in the world uses network-drive-only setups, and bans their users from any writing to either linux scratch drives or C:\ (usually the location of MY DOCUMENTS). That's completely accurate...dws.
Which can be trivially redirected to a network drive through Group Policy.
Hell, it can be trivially redirected to a network drive using a Windows NT 4 domain policy.
Unless you're a very small company indeed, this is the only sensible thing to do unless you plan to backup every PC individually.
Re: (Score:2)
And FYI, I worked at one of the largest business software companies in the states, and one of the largest pharma companies in the states - so no, it's no
Re: (Score:2)
Thirdly, I don't think the cost of media is the biggest factor by a long way. They've probably also factored in cost of a contract with Iron Mountain, cost of robotic tape library, licensing costs for TSM (or similar) and a proportion of the wages involved in paying someone to swap the tapes out and hand them over to Iron Mountain every day.
Indeed -- the cost is in offsite storage and archival. I've previously used Amazon S3 [amazon.com]. They charge $0.15 per gigabyte-month for redundant online storage, and if you want redundancy against bit-flip failures on their end, you can also employ something like Reed-Solomon error correction on uploaded data.
When I set up uploading of (encrypted) backup archives, the total overhead was approximately $102/month in data transfer costs (1 terabyte amortized over a month) and $307/month in data storage (2 terabyte
Re: (Score:2)
One of us is either making a wrong assumption:
1) I was assuming they couldn't have been talking about long term storage, because no way an *average* user produces 10GB a year that needs long term storage.
2) You were assuming that somehow the average user produces 10GB a year that requires long-term storage.
There is no way that the average user generates 10GB of data that makes it into long-term storage in a year.
That's approximately 200MB of data a week. Most corpor
Re: (Score:2)
You're also making the assumption that all data is user-generated and not automated, which could include log files for access times and, in the case of my company, 100 GB a day of security footage, which we don't retain longer than 30 days. Whenever there is an incident, that video is retained, though.
The cost per gig is on par with what I've seen after deploying a 30TB SAN alongside my 60TB SAN. 168 drives in the large SAN, a good number of them fibre channel drives too, which are quite costly.
Add in the c
Re: (Score:2)
Sarbanes Oxley vs. History (Score:2)
The end result of Sarbanes-Oxley, on top of the increasing amount of encryption and the use of high-density short-lived storage, is going to be a frustrating gap in the historical record for future generations.