Researchers Achieve Storage Density of 2.2 Petabytes Per Gram of DNA
A reader sends news of researchers who encoded an MP3, a PDF, a JPG, and a TXT file into DNA, along with another file that explains the encoding. The researchers estimate the storage density of this technique at 2.2 petabytes per gram (abstract). "We knew we needed to make a code using only short strings of DNA, and to do it in such a way that creating a run of the same letter would be impossible. So we figured, let's break up the code into lots of overlapping fragments going in both directions, with indexing information showing where each fragment belongs in the overall code, and make a coding scheme that doesn't allow repeats. That way, you would have to have the same error on four different fragments for it to fail – and that would be very rare," said one of the study's authors. "We've created a code that's error tolerant using a molecular form we know will last in the right conditions for 10 000 years, or possibly longer," said another.
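The "no runs of the same letter" idea described by the authors can be sketched with a rotating code: each base-3 digit selects one of the three nucleotides that differ from the previous one, so a repeated letter is impossible by construction. This is a minimal illustration of the principle, not the researchers' actual code; the trit input and mapping order are assumptions.

```python
NUCLEOTIDES = "ACGT"

def encode_trits(trits, prev="A"):
    """Map each base-3 digit to one of the three nucleotides that
    differ from the previous one, so no letter ever repeats."""
    out = []
    for t in trits:
        # The three candidates are the nucleotides other than `prev`,
        # in alphabetical order; `t` picks one of them.
        candidates = [n for n in NUCLEOTIDES if n != prev]
        prev = candidates[t]
        out.append(prev)
    return "".join(out)

seq = encode_trits([0, 2, 1, 1, 0, 2])
# Whatever the input, no two adjacent letters are ever equal:
assert all(a != b for a, b in zip(seq, seq[1:]))
```

Because the decoder knows the previous letter at every position, the mapping is reversible without any extra bookkeeping.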
Latency and bandwidth? (Score:2, Insightful)
It's useless unless it's reasonably fast.
New error correction scheme? (Score:5, Insightful)
I understand they wanted the overall system to be fault tolerant, but it might be better to leave that part to established computer science. DNA may well be uniquely prone to certain kinds of read and synthesis errors, but there's a large body of coding theory (and practice) established here that would likely make the overall system more robust than what looks like a fairly simple redundancy scheme.
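To make the parent's point concrete, here is a minimal sketch of one textbook alternative to plain redundancy: a Hamming(7,4) code, which corrects any single flipped bit in a 7-bit block using 3 parity bits instead of storing four full copies. This is an illustration of established error-correcting codes in general, not a claim about what would suit DNA specifically.

```python
def hamming74_encode(d):
    """d: list of 4 data bits -> 7-bit codeword [p1, p2, d1, p3, d2, d3, d4]."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4   # parity over positions 3, 5, 7
    p2 = d1 ^ d3 ^ d4   # parity over positions 3, 6, 7
    p3 = d2 ^ d3 ^ d4   # parity over positions 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    """Correct up to one flipped bit, then return the 4 data bits."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3   # 1-based position of the error
    if syndrome:
        c[syndrome - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]

data = [1, 0, 1, 1]
noisy = hamming74_encode(data)
noisy[4] ^= 1                         # simulate one corrupted bit
assert hamming74_decode(noisy) == data
```

Real archival systems would reach for something stronger still (e.g. Reed-Solomon codes, as used on CDs), but the overhead-versus-robustness trade-off is the same.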
Redundancy (Score:5, Insightful)
It's 2.2 petabytes per gram, but only if you don't mind that it contains a billion copies of the same 2.2 megabytes. Making lots of copies of a short DNA sequence is easy. Making a whole gram of unique DNA sequences is much, much harder. What's the non-redundant storage density of this process?
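The parent's question can be put as back-of-the-envelope arithmetic: if reliable readout requires many physical copies of every unique fragment, the non-redundant density is the headline figure divided by the copy count. The redundancy factor below is purely hypothetical, chosen only to show how quickly the number shrinks.

```python
raw_density_pb_per_g = 2.2          # headline figure from the article
copies_per_fragment = 1_000_000     # hypothetical redundancy factor

effective_pb_per_g = raw_density_pb_per_g / copies_per_fragment
# 1 PB = 1e6 GB (decimal units)
print(f"{effective_pb_per_g * 1e6:.1f} GB of unique data per gram")
```

At a million copies per fragment, the gram holds only a couple of gigabytes of unique data, which is why the non-redundant figure is the one that matters.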
"very rare"? (Score:2, Insightful)
How rare is "very rare"? If you fill that 2.2-petabyte gram of storage, and "rare" means 0.0001% of the time, that's still over 2 billion failed bytes in your archived data.
Re:Latency and bandwidth? (Score:5, Insightful)
No, it's only useless for the specific application you're imagining, not "useless" in general. A jet airliner may be really, really fast in comparison to my car, but is useless if my task is to get to the grocery store for milk and eggs. That doesn't invalidate the usefulness of jet airliners.
Re:Where does it all end? (Score:5, Insightful)
Hard to say whether we should or shouldn't. But it's worth noting that there are at least two possible important differences between IBM's experiments and Monsanto's:
1) Monsanto's experiments are often self-replicating.
2) IBM isn't trying to sell us MP3 files as food.
Re:Please use a real unit of measure (Score:3, Insightful)
Please wait until you sober up before posting again.