Hitachi Promises 4-TB Hard Drives By 2011 372
zhang1983 writes "Hitachi says its researchers have successfully shrunk the read heads in hard drives to the range of 30-50 nanometers. This will pave the way for quadrupling today's storage limits to 4 terabytes for desktop computers and 1 terabyte on laptops in 2011." Update: 10/15 10:39 GMT by KD : News.com has put up a writeup and a diagram of Hitachi's CPP-GMR head.
Waiting for... (Score:5, Insightful)
So? (Score:4, Insightful)
Re:4 Terabytes? (Score:2, Insightful)
Re:4 Terabytes? (Score:3, Insightful)
Re:Waiting for... (Score:4, Insightful)
Even my non-geek friends and family are starting to feel the pain as working with video and BitTorrent becomes more common. Multi-terabyte usage won't be that uncommon, I think. What we really need now, though, is RAID-5 for the average Joe.
Re:Will we even use magnetic HDs in laptops in 201 (Score:3, Insightful)
While it's pretty incomprehensible to use even a fraction of the mentioned 4 TB right now, I can see that with high-def video becoming more and more common, at the very least all the people pirating movies and TV shows will use these drives. Also, think about how more and more computers are being sold with TV tuners in them (granted, most people will never use them). A few years from now, I can see HDTV capture devices becoming much more common than regular TV tuners - thus people will actually use that space...
So where is the speed? (Score:5, Insightful)
I imagine some of you out there, like myself, are starting to see problems with data integrity as the mountain of data you are sitting on climbs into the petabytes. All I can say is: bit flips suck! Do you KNOW your data is intact? Do you REALLY believe your dozens of 750 GB-1 TB SATA drives are keeping your data safe? Do you think your RAID card knows what to do if your parity doesn't match on read - does it even CHECK? I hope your backup didn't copy over the silent corruption. I further hope you have the several days it will take to copy your data back over to your super-big - super-slow - hard drive.
Is anyone thinking optical? Or how about just straight flash? I have a whole stack of 2GB USB flash drives - should I put them in a RAID array?
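One low-tech way to actually answer the "do you KNOW your data is intact?" question is to keep a checksum manifest alongside the data and re-verify it periodically. A minimal sketch (the paths here are made up for illustration):

```shell
#!/bin/sh
# Paths are illustrative; point DATA at the directory you care about.
DATA=/tmp/demo_data
mkdir -p "$DATA"
echo "important bits" > "$DATA/file.txt"

# Record a checksum for every file under the data directory.
find "$DATA" -type f -exec sha256sum {} + > /tmp/manifest.sha256

# Later (e.g. from cron): re-hash everything and complain about any
# file whose contents have silently changed since the manifest was written.
sha256sum --quiet -c /tmp/manifest.sha256 && echo "all files verified"
```

It won't repair anything, but unlike most RAID setups it will at least tell you a bit flip happened before the corruption propagates into your backups.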
BFD (Score:1, Insightful)
I think 2011 is a pretty conservative estimate.
Ugh, no. (Score:2, Insightful)
The potential negative consequences here can be very damaging. Imagine the security breach when a company "loses a laptop" that contains 30 years of your transaction history. Or say you're 20 years old right now: imagine what happens if, in 2040, you decide to run for Congress and your opponent pulls out dirt from the Google searches and GMail chats of your youth. Imagine the blackmail material that could be uncovered.
The possibilities are endless, but without a real revolution in the way corporations and government operate, they all seem to lead to the absolute end of privacy.
Re:Waiting for... (Score:5, Insightful)
Actually, my sickened mind went in a completely different direction... remember when we were going to have 8 GHz Pentium 4s with 6 GB of RAM to run Windows Vista?
Heck, it's still common to see computers sold with 256 MB of RAM, which wasn't a particularly large amount 5 years ago... that it's even salable today speaks volumes. I have an "end of life" Pentium 4 2.4 GHz that I picked up this weekend for like $50. 20 GB HDD, 512 MB DDR RAM, CD, sound, etc.
Other than the small-ish HD and the CD instead of the DVD, this system is not significantly different than a low-end new system. And, when it was first sold 3-4 years ago, its specs weren't particularly exciting.
Point being, there's a "we don't talk about it" stagnation going on in the Computer industry. I honestly think that most of the new purchases are based on the expectation of EOL and the spread of viruses. It's gotten to where it's actually cheaper to buy a new computer than it is to reload your old one. Part of that is the fact that it takes a full business day of rebooting the computer to update Windows from whatever came on the CD.
This part just floors me. I have the original install disk for the aforementioned $50 Dell 2.4 GHz system, and am reloading it from scratch so it's all clean. It takes ALL FREAKIN DAY simply to update Windows to the latest release, even with a 1.5 Mb/s Internet connection (not high end, but still no particular slouch).
Yet it takes about an hour and just ONE short command to update CentOS (RHEL) to current.
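The one-liner itself isn't shown, but on CentOS/RHEL of that era it would presumably be something like:

```shell
# Pull every available package update in a single pass (run as root).
yum -y update
```

One resolver run, one download pass, and usually at most one reboot - as opposed to the Windows Update cycle of patch, reboot, re-check, repeat.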
My point to all this?
The computer industry has (finally) reached a stable point. Performance increases are flat-lining to incremental, rather than exponential, and there's little incentive to change this, since a 4-year-old computer still does most anything anybody needs a computer to do. There will always be a high-performance niche, but it's a niche. The money has moved from computing power to connectivity.
People no longer pay for processing power, they pay for connections. Thus the Intarweb...
Re:The bigger problem (Score:4, Insightful)
I'm sorry, but this is just fantasy-world 101. I almost never have to look through old mail, but when I do, it's because some client is trying to dredge up something that's just not how it happened. Often it's important that I have all the "useless" mails as well, so I can say with confidence: "No, you first brought this up two months before the project deadline, and it wasn't in any of the workshop summaries [which are in project directories, not mail] before that either."
When I do, it's far more efficient to search for what I need than to plow through old junk - what you're saying would imply that the Internet is useless because it's full of so much redundant, unorganized information. That's quite simply not true, and even though you should extract the vital bits into organized systems, keeping the primary source around is very useful.
Extracting experience from current communication to improve business systems (or, for that matter, technical routines) should be an ongoing process - it's vital going forward. Going back through old junk to figure out what's deletable just to run a "clean ship" is just a big timesink and a waste of money. Maybe you'd have an argument if there were a good system going unused because everything is kept in unorganized mailboxes. In my experience, usually the problem is that there's no such system, and doing a clean-up would do nothing to change that.
Re:Waiting for... (Score:3, Insightful)
As someone with close to 300 DVDs (yeah, yeah...I know, MPAA evil...but I try to buy as many of them used as I can), I'm going to wait until HD technology starts catching up with disc technology before upgrading to HD. So any breakthroughs that make this possible are welcome in my book.
The small thing you neglected (Score:5, Insightful)
Yes, indeed, we've reached the point where any computer, even if 4 years old, is good enough for most day-to-day activities (hanging around on the web, writing some stuff in a word processor, e-mail, and ROFL/LMAOing on AIM/MSN/GMail/Facebook or whatever is the social norm du jour).
Case in point, my current home PC is still Intel Tualatin / 440BX based.
*BUT*...
As you said (and that's something I can confirm around here too), Joe Six-pack buys a new computer every other year, just because his current machine is crawling with viruses and running too slow (and spitting out pop-ups by the dozen). He either pays wads of cash to some repair service that may or may not fix his problems, may or may not lose his data in the process, and leaves him without a machine for a couple of days. Or he gets a new machine. And...
Those outrageous configurations never showed up. Nevertheless, it seems Vista was still designed with them in mind.
So in the end, the new machine Joe Six-pack *WILL* buy has to be better/faster/stronger, simply because the latest Windows-du-jour has tripled its hardware requirements for no apparent reason.
OS makers will continue to put out new versions on a regular basis, mostly because that's their business and they have to keep the cash flowing. There are also security issues to fix (by adding additional layers of garbage over something that was broken by design in the first place), legal stuff (adding whatever new DRM / Trusted Computing stupidity the **AA lobby most recently pushed through), and lots of dubious features that maybe 0.1% of the user base will need (built-in tools to sort / upload photos, built-in tools to edit home movies, or whatever - modern OSes tend to get confused with distributions and go the Emacs way of bloat).
All this results in newer OSes that take twice the horsepower to perform the exact same tasks as the older ones.
And thus, each time Joe Six-pack changes his computer, he gets a newer one, which will obviously have the latest OS on it, and thus will *need* 4x the computing power. Just to keep hanging out on IM, sending e-mail, writing things, and browsing porn.
Re:I have a need right now... (Score:3, Insightful)
Because, you see, you've just spent your budget on hardware that will likely never be used and that gets you no visible day-to-day advantage, except leaving you vulnerable to multiple simultaneous drive failures. (This is surprisingly likely: go read the Google paper on drive failure rates.)
Instead, you use a second system with snapshot backups, possibly using a tool like rsnapshot that supports hard-linked backups. This gives you on-line backup, fast bare-metal restoration, and easy access to yesterday's or last week's data. It also offloads the tape backups. And the mirrored drives can be used for off-loaded backup or mirroring, creating off-site backup media out of actual hard drives, not tapes.
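The hard-linked snapshot trick is simple enough to sketch with plain rsync (rsnapshot is essentially a wrapper around this); the paths below are made up for illustration:

```shell
#!/bin/sh
# Minimal sketch of rsnapshot-style hard-linked snapshots using rsync.
# /tmp/demo_src and /tmp/demo_backups are illustrative paths only.
SRC=/tmp/demo_src
DEST=/tmp/demo_backups
mkdir -p "$SRC" "$DEST"
echo "hello" > "$SRC/file.txt"

# First snapshot: a full copy.
rsync -a "$SRC/" "$DEST/snap.1/"

# Second snapshot: files unchanged since snap.1 become hard links to it,
# so each extra snapshot costs almost no additional disk space.
rsync -a --link-dest="$DEST/snap.1" "$SRC/" "$DEST/snap.2/"
```

Each snapshot directory looks like a complete copy of the data, so restoring "last Tuesday" is just a normal file copy, no tape seeks or incremental-chain replays.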
Actually, that's the scary part (Score:5, Insightful)
I started my programming experience almost directly with assembly. Well, I had about a year of BASIC on my parents' ZX-81 first. But that was a damn slow machine (80% or so of the CPU was busy just doing the screen refresh), and Sinclair BASIC was one of the slowest BASICs too. So with that and 1K of RAM (you read that right: one kilobyte), you just couldn't do much, you know. So my dad took the sink-or-swim approach and gave me a stack of Intel and Zilog manuals. Anyway, you had to be particularly thrifty on that machine, because its budget of CPU cycles and bytes makes your average wristwatch or fridge nowadays look like a supercomputer.
I say that only to contrast it with the first time I saw a stacktrace (Java, obviously) of an exception in a particularly bloated Cocoon application running in WebSphere. Printed, it would have run over more than two pages. There were layers upon layers upon layers the flow had to go through, just to call a method which, here's the best part, didn't even do much. That nesting and all the extra code for reusability's sake, and checks, and some reflection thrown in for good measure, obviously took more time than the method body itself needed.
It hurt. Looking at that stacktrace was enough to cause physical pain.
Now I'm not necessarily saying you should throw Cocoon and J2EE away; obviously there are better ways to do this even with them. Like, for a start, making sure your EJB calls are coarse-grained so you don't go back and forth over RMI/IIOP just to check one flag.
But how many people do?
The second instance when it caused me pain is when I was testing a particularly bloated XML-based framework, and it took 1.1 seconds on a 2.26 GHz Pentium 4 just for a call to a method that did nothing at all. It just logged the call and returned. That's it. That's 2.5 _billion_ CPU cycles wasted just for a method call. That's more than 30 years worth of Moore's law. Worse yet, someone had used it between methods in the same program, because apparently going through XML layers is so much cooler than plain old method calls. A whole 30 years worth of Moore's Law wasted for the sake of a buzzword. The realization hurt. Literally.
Again, I'm not saying throw XML away entirely, though I would say: "bloody use it for what it was meant for, not as a buzzword, and not internally between classes in the same program and indeed the same module." It just isn't a replacement for data objects (what Java calls "beans"), nor for a database, nor just a buzzword to have on the resume.
Each iteration of Moore's Law is taken as yet another invitation to write crappier code, with less-skilled monkeys, without bothering to optimize... or even to design it well in the first place. Why bother? The next generation of CPUs will run it anyway.
And the same applies to RAM and HDD, more or less. I've seen more than one web application which had ballooned to several tens of megabytes (zipped!) by linking every framework in sight. One had 3 different versions of Xerces inside, and some classloader magic, just because it beat sorting out which module needs which version. Better yet, they were mostly just the GUI to an EJB-based application. They didn't actually _do_ more than display the results and accept the input in some forms. Tens of MB just for that.
So now look at your hard drive, especially if you have Vista, and take a wild guess whether those huge executables and DLLs were absolutely needed, or are they there mostly because RAM and HDD space are cheap?
At this rate and given 4TB HDDs, how long until you'll install a word processor or spreadsheet off a full HD DVD?
Re:I have a need right now... (Score:5, Insightful)
I sincerely hope you do backups anyway. RAID is simply there to let you continue running a service under specific failure conditions that would otherwise take the service down while hardware is replaced and backups restored. It is not a substitute for backups; RAID and backups do different jobs.
Some examples of failure conditions where RAID won't save you but backups will:
- Some monkey does rm -rf / (or some rogue bit of software buggers the file system).
- The power supply blows up and sends a power spike to all the hard drives in your array (I've personally seen this happen to a business that didn't take backups because they believed RAID did the same job - they lost everything, since every drive in the array blew up).
- The building bursts into flames and guts your server room.
In all these conditions, having a regular off-site backup would save you whereas just using a RAID will not.
Re:Waiting for... (Score:2, Insightful)
In the meantime, I'll take the small performance hit of software RAID for the robustness it provides.