Scientists Unveil Most Dense Memory Circuit Ever Made 249

adamlazz writes "The most dense computer memory circuit ever fabricated, capable of storing around 2,000 words in a unit the size of a white blood cell, was unveiled by scientists in California. The team of experts at the California Institute of Technology (Caltech) and the University of California, Los Angeles (UCLA) who developed the 160-kilobit memory cell say it has a bit density of 100 gigabits per square centimeter, a new record. The cell is capable of storing a file the size of the United States' Declaration of Independence with room left over."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by Anonymous Coward on Wednesday January 24, 2007 @07:38PM (#17745328)
    [unveiling the most dense memory circuit ever made]
    Dr. Tufnel: Look... densest memory circuit ever, so dense you can't even see the data on it, so dense it's never been used.
    Reporter: [points his finger] It's never been used ...?
    Dr. Tufnel: Don't touch it!
    Reporter: Well, I wasn't going to touch it, I was just pointing at it.
    Dr. Tufnel: Well... don't point! It can't be used.
    Reporter: Don't point, okay. Can I look at it?
    Dr. Tufnel: No, no. That's it, you've seen enough of that one.
  • Comment removed (Score:3, Interesting)

    by account_deleted ( 4530225 ) on Wednesday January 24, 2007 @07:39PM (#17745340)
    Comment removed based on user account deletion
    • by HTH NE1 ( 675604 ) on Wednesday January 24, 2007 @07:44PM (#17745408)
      I know DNA has been proposed as a storage mechanism before. Since the immense human genome fits inside a cell, wouldn't DNA offer much denser storage?

      And have a stray biological virus get in and alter my computer's DNA-based memory?

      I wouldn't want to think what the computer would use to alter its DNA-based memory fast enough to be useful, let alone what would happen if it escaped and latched onto an organism.
    • Re:DNA memory (Score:4, Insightful)

      by phoenixwade ( 997892 ) on Wednesday January 24, 2007 @07:59PM (#17745612)
      As a read-only option, I suspect. The problem isn't really data density, it's data access speed. Three terabytes of storage isn't going to do you much practical good if it takes two hours to find and recover the bit of information you want.
      • Re: (Score:2, Insightful)

        by Anonymous Coward

        Three terabytes of storage isn't going to do you much practical good if it takes two hours to find and recover the bit of information you want.

        There is a large class of data storage requirements that could be met with a two hour seek time. As long as the throughput is there, it could replace tape drive type storage applications, for example.

        Or extremely large databases, which may be 99.995% write. Archival storage would be another example, if the medium proved hardy enough.

        While it won't replace RAM or hard drives, I would LOVE to see extremely high density storage of this type.

    • Re: (Score:2, Insightful)

      by Speed Pour ( 1051122 )
      Not a reliable medium. Biological media, especially if based on human DNA, would potentially suffer from disease or a short lifespan (raising the question of a special environment needed to keep it functional and stable). Non-living DNA could be used to circumvent the disease and lifespan issues; however, it would deteriorate far more rapidly under any known method of reading (be it electrical, photo-reactive, irradiated, or chemical)

      A further set of issues: irradiation. Especially at such a small siz
  • by account_deleted ( 4530225 ) on Wednesday January 24, 2007 @07:39PM (#17745344)
    Comment removed based on user account deletion
  • by mrsam ( 12205 ) on Wednesday January 24, 2007 @07:40PM (#17745356) Homepage
    Please post all "Libraries Of Congress" jokes in this thread. Help keep Slashdot clean. Thank you.
    • by kalpaha ( 667921 ) on Thursday January 25, 2007 @01:48AM (#17748216)

      Stop oppressing me, I can post wherever I wanna!

      But seriously, using the estimate from wikipedia: "It is estimated that the print holdings of the Library of Congress would, if digitized and stored as plain text, constitute 17 to 20 terabytes of information", we can use google to calculate how many such chips would be required to store the US Library of Congress:

      Enter into google: (20 terabytes) / (160 kilobytes) = 134 217 728

      Now, with some research into White Blood Cells [iscid.org], we learn that a normal human has between 7,000 and 25,000 white blood cells in a drop of blood. So going with a conservative estimate of 10,000 white blood cells per drop of blood, we could store the Library of Congress in
      134 217 728 / 10 000 = 13 421.7728 drops of blood.

      That's not very accurate; let's try to get a better estimate. Wikipedia to the rescue:

      There are normally between 4×10^9 and 1.1×10^10 white blood cells in a litre of healthy adult blood.

      Again, with a conservative estimate of 7 x 10^9 white blood cells per liter, we get
      134 217 728 / (7 * (10^9)) = 0.0191739611

      Entering into google 0.0191739611 liter to centiliter, we get
      0.0191739611 liter = 1.91739611 centiliter

      In other words, storing the whole Library of Congress using these chips would take about half a shotglass of blood.
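For the record, here's the same arithmetic as a small Python sketch, with one correction: the chip stores 160 kilobits, not 160 kilobytes, so the chip count (and the blood volume) comes out eight times larger than the figure above — closer to four shot glasses than half of one. All other figures are the same assumptions used above.

```python
# Redo of the Library-of-Congress-in-blood estimate, taking the chip
# capacity as 160 kilobits (as in the article), not 160 kilobytes.
LOC_BITS = 20 * 2**40 * 8      # 20 terabytes (binary) in bits
CHIP_BITS = 160 * 1024         # one chip: 160 kilobits
chips = LOC_BITS // CHIP_BITS  # chips needed to hold the LoC

WBC_PER_LITER = 7e9            # conservative white-cell count per liter
liters = chips / WBC_PER_LITER
print(chips, round(liters * 100, 2), "centiliters")  # 1073741824 15.34 centiliters
```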

      • Re: (Score:3, Insightful)

        by TheRaven64 ( 641858 )

        Entering into google 0.0191739611 liter to centiliter, we get
        ...depressed that someone needs a calculator to multiply by 100 in base 10?
  • by ENOENT ( 25325 ) on Wednesday January 24, 2007 @07:40PM (#17745360) Homepage Journal
    how many Libraries of Congress you can fit into an elephant with this technology.
    • by adpsimpson ( 956630 ) on Wednesday January 24, 2007 @08:13PM (#17745818)

      In all seriousness, I know how long a London Bus is, I know that an elephant is pretty heavy, I know roughly how much shelf space the Encyclopedia Britannica takes up and I know tall buildings can be quite tall.

      But I have no real concept of how big a white blood cell is, or how much some thousand words (how many thousand? It's out of my mind now that it's off the screen...) really is.

      For all I know, the hard drive in my computer could be storing 600 birthday cards per germ already and I wouldn't have a clue.

      Anyone care to quote how fast the Concorde went in Ford Escorts per millisecond? [google.co.uk] (the link will give you a good start)

      • Re: (Score:2, Interesting)

        I have three sets of Encyclopedia Britannica, so I know well how much space it takes up. One is the last set where the volumes are a single series from A-Z, the second is the following year, when they split it into several series (Macropedia and Micropedia, I think, are two of the designators), and the third is an early 20th-century set in leather-bound octavo-size volumes.

        It's more fun to browse through a volume of it on a rainy day than it is to hyperlink all over wikipedia.
      • I immediately wondered if they were using a 16-bit word length or some different architecture.
    • by joe_bruin ( 266648 ) on Wednesday January 24, 2007 @08:14PM (#17745820) Homepage Journal
      how many Libraries of Congress you can fit into an elephant with this technology.

      So you want to know the LoC / metric pachyderm of this technology? I'm not sure, but don't go by what it says on the box, they define a kilo-Library of Congress to be 1000 LoCs, not 1024.
      • Re: (Score:3, Funny)

        by jonasj ( 538692 )
        they define a kilo-Library of Congress to be 1000 LoCs
        You can implement a kilo-Library of Congress in a thousand lines of code? Impressive.
    • by Bogtha ( 906264 ) on Wednesday January 24, 2007 @08:34PM (#17746032)

      how many Libraries of Congress you can fit into an elephant with this technology.

      Well, this page [techtarget.com] estimates LoC at 10 terabytes, which works out to 81920 gigabits. According to the article, a bit density of 100 gigabits per square inch means that you'd need 819.20 square inches to store the Library of Congress.

      According to this page [iucn.org], an elephant can reach 11 feet tall, or 132 inches, and 30 feet long, or 360 inches. According to this page [galumpia.co.uk], an elephant can reach 6'4" wide, or 76 inches. That's a dimension of 132 x 360 x 76 inches, or 3,611,520 square inches — assuming cubic elephants (there's a phrase you don't hear every day!).

      Given these figures, a reasonable first guess would be that you could fit approximately 4,400 Libraries of Congress into an elephantine memory circuit. Or, if you prefer to work with more manageable quantities, 4.4 megalocs per kilophant.

      How long before Google add LoCs to their calculator?
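The back-of-the-envelope above, spelled out as a Python sketch. Two caveats baked into the original numbers: the article quotes the density per square centimetre, not per square inch, and 132 × 360 × 76 is really a volume in cubic inches being treated as an area (cubic elephants, after all). Reproduced faithfully here, hand-waves and all.

```python
# Reproduces the elephant estimate above, caveats included.
LOC_GIGABITS = 10 * 1024 * 8           # 10 TB -> 81,920 gigabits
DENSITY = 100                          # gigabits per square inch (figure used above)
area_per_loc = LOC_GIGABITS / DENSITY  # 819.2 "square inches" per LoC
elephant = 132 * 360 * 76              # 3,611,520 (cubic) inches
print(round(elephant / area_per_loc))  # ~4409 Libraries of Congress
```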

      • Re: (Score:2, Funny)

        by maxume ( 22995 )
        O.K., but how many elephants can you fit into a Library of Congress?
      • Re: (Score:3, Funny)

        assuming cubic elephants

        You know, if elephants were cubic, they would be much easier to store and transport.

        Which reminds me of an old joke: a dairy farmer wanted to increase the milk output of his cows. A friend suggested he ask the local university for advice, and he eventually found a physics professor who was willing to help. After a few weeks of waiting, the farmer got a call from the professor, who claimed to have found a way to triple the milk production! The farmer raced to the university, whe

    • by skribe ( 26534 ) on Wednesday January 24, 2007 @08:37PM (#17746072) Homepage
      African or Asian elephant?
    • How many Libraries of Congress come OUT of an elephant in a year?
  • by adam.dorsey ( 957024 ) on Wednesday January 24, 2007 @07:41PM (#17745372)

    The cell is capable of storing a file the size of the United States' Declaration of Independence with room left over.
    Yeah, but how many 747s does it weigh? ...no, wait, how many Sears Towers is its height?

    Damn, none of my vague comparisons fit...

    WAIT! How many angels can dance on it? That one is for small stuff, right?
    • Yeah, but how many 747s does it weigh? ...no, wait, how many Sears Towers is its height?
      This is such a cliche on slashdot.

      Give the summary credit for stating the following: "100 gigabits per square centimeter." That is a fine way to measure storage density.

      • by Feanturi ( 99866 )
        Actually, the summary made me think of the cliche instantly with this line:

        The cell is capable of storing a file the size of the United States' Declaration of Independence with room left over.

        I mused to myself, "Cool, now we can measure storage in USDoIs!" I fully expected to see the very posts you are complaining about after that.
      • by Bender0x7D1 ( 536254 ) on Wednesday January 24, 2007 @08:28PM (#17745964)

        Um... gigabits per square centimeter is a horrible storage density metric. We need to deal with volume - unless we suddenly moved to a 2-dimensional universe - and even volume isn't perfect. For a drive platter, do you only count the magnetic medium, or the underlying material as well? What about the space between platters, or the read/write mechanism? I could have great storage density, but it wouldn't do me much good if I needed an entire scanning tunneling microscope to read it.

      • by TopSpin ( 753 ) *
        Give the summary credit for stating the following: "100 gigabits per square centimeter." That is a fine way to measure storage density.

        Some NPR page has the volume of an M&M at 0.636cm^3. So this new ditty will store 7.95 GB in the space of an M&M.

        Plain.

        • by Dunbal ( 464142 )
          So this new ditty will store 7.95 GB in the space of an M&M.

              A chocolate M&M, or a peanut M&M? They're not the same size!!! Perhaps the Skittles or Smarties unit would have been more appropriate, since those are of uniform size.
      • DoIs per white blood cell is a perfectly cromulent unit of measurement.
    • by dbIII ( 701233 )
      Don't lose it! You'll end up with a King called George!
  • Yeah, thanks (Score:3, Insightful)

    by d12v10 ( 1046686 ) on Wednesday January 24, 2007 @07:44PM (#17745410)
    You know what I hate? Articles that show the scale of whatever they're talking about in obscure ways, like "size of a red blood cell" or "as long as eighteen schoolbuses lined end to end". Next time, just tell us the actual size and we can make that approximation ourselves!

    d12
  • by maynard ( 3337 ) on Wednesday January 24, 2007 @07:47PM (#17745440) Journal
    Rough comparison here. [madsci.org] Short answer: DNA is far more dense information storage than this technology. Never mind that human white blood cells also contain the machinery to both compute and replicate data stored within DNA (as well as replicating the computation machinery).

    Biology still wins. But nanotechnology creeps ever closer year by year...
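A rough sanity check on that claim, as a Python sketch. All figures here are ballpark assumptions (a haploid genome of ~3.2 billion base pairs at 2 bits each, a nucleus ~6 µm across), and comparing a cell's cross-sectional area to a chip's areal density is itself a hand-wave — but the gap is so large it survives any reasonable fudging.

```python
import math

GENOME_BP = 3.2e9                 # approx. haploid human genome, base pairs
BITS = GENOME_BP * 2              # 2 bits per base pair (A/C/G/T)
NUCLEUS_DIAM_CM = 6e-4            # ~6 micrometres, as centimetres
area = math.pi * (NUCLEUS_DIAM_CM / 2) ** 2   # cross-section in cm^2

density_gb_cm2 = BITS / area / 1e9
print(f"{density_gb_cm2:.2e} gigabits per cm^2")  # vs. the chip's 100
```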
  • by Cracked Pottery ( 947450 ) on Wednesday January 24, 2007 @07:49PM (#17745468)
    However, 32 of them should be enough for anybody.
  • Which words? (Score:5, Interesting)

    by R3d M3rcury ( 871886 ) on Wednesday January 24, 2007 @07:53PM (#17745514) Journal

    [...] capable of storing around 2,000 words [...]
    Which words? "Antidisestablishmentarianism" or "It"? What about languages where words take up one character like Chinese and Japanese?
  • "Most dense"? (Score:2, Insightful)

    by hjo3 ( 890059 )
    Why not just say "densest"?
    • Re: (Score:3, Funny)

      Because "most dense" is more gooder grammar.
    • by glwtta ( 532858 )
      Why not just say "densest"?
      Because you could say "having highest densiness" instead.
    • Anyone who says "most dense" when they mean "smallest" isn't going to pick up on the semi-subtlety of "densest."

      Maybe you've forgotten, but when you apply the word dense to a single object, it refers to the object's density, not how many of them are packed into a given area. Given that many early ICs were made with lead, and that these are made with silicon, they're not anywhere near the densest ever, and to be clear, they're actually not the most densely packed ever either (thanks to 3d FRAM such as made
  • Very few details (Score:5, Insightful)

    by SmlFreshwaterBuffalo ( 608664 ) on Wednesday January 24, 2007 @08:15PM (#17745832)
    The article is very lacking in detail.
    • Is this volatile or non-volatile memory?
    • What size word are they using?
    • If non-volatile, what kind of endurance can be expected? What about data retention? It doesn't matter how small the memory is if the data only lasts 5 minutes. (Yes, I'm sure there would be applications even for that, but you get the point.)
    • What are the write and read times?
    • If volatile, does the data need to be refreshed continuously, or will it hold its value as long as power is applied?
    • How much power is required for different operations?
    Okay, so maybe I was expecting too much. But they could've at least given some of the most basic details, like word size (damned marketing dept!).
    • Well, they also say it stores 160 kb, aka 20 kB, so presumably by 2,000 words they mean words of ten characters (or nine plus a space) encoded in ASCII. Doesn't really matter; the 160 kb is the important bit.
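The conversion works out as follows (a quick sketch; ASCII encoding assumed):

```python
BITS = 160 * 1024        # 160 kilobits
BYTES = BITS // 8        # 20,480 bytes, i.e. 20 kB
WORDS = 2000             # from the summary
print(BYTES / WORDS)     # 10.24 bytes per English word
```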
    • Re:Very few details (Score:4, Informative)

      by StikyPad ( 445176 ) on Wednesday January 24, 2007 @08:51PM (#17746210) Homepage
      This one [physorg.com] is a bit better, but apparently the Nature article will be released tomorrow, which I assume would have the sort of detail you're asking for.
    • These are good questions!

      I can only answer a couple of them at the moment.

      Is this volatile or non-volatile memory?

      It is non-volatile... so long as nobody sneezes.

      What size word are they using?

      This should have been obvious from the context of TFA. They are using MavisBeaconWords. These have the equivalent length of 5 ASCII characters plus one spacer character, so the conversion is 1 MavisBeaconWord = 6 bytes (assuming ASCII encoding).

  • by scdeimos ( 632778 ) on Wednesday January 24, 2007 @08:26PM (#17745932)

    The Yahoo! News article got the figures wrong. To get only 2,000 words (a computer term, not a linguistic one) out of 160-kbits they'd have to be 80-bit words. The article at Technology Review [technologyreview.com] has better maths and more information to boot.

  • Say you have a sliver of very thin metal disk, just several atoms thick, that spins. At a recurring predetermined time, a photon or particle gun shoots energy at the disk at a very specific location, and say at every 1 ms of rotation it misses an atom and hits a detector. However, if on the last pass its timing is changed by .5 ms, then at 9.5 ms that energy is obstructed and doesn't hit the detector. If this continues, could you reasonably determine that the photon has been obstructed by a nucleus? Then once you've mapped
  • ... who developed the 160-kilobit memory cell ...
    So 160,000 bits = 2,000 words.

    160000 / 2000 = 80

    One word = 80 bits?

    I've never heard of an 80-bit word architecture.

    Unless of course they're speaking of an MS Word architecture, in which case even the byte count would be bloated :P
    • by glwtta ( 532858 )
      I've never heard of an 80-bit word architecture.

      Are you sure they don't mean '80 / 8 = 10' - an estimate for average English word length? Pretty high, though; I think it's usually about 6 (counting the space, even).
    • In context, they mean real words, not the storage unit "word." They're talking about libraries, books, and text documents. Ten bytes per word is, if anything, generous.
    • I've never heard of an 80-bit word architecture.

      I had not either. But I had a bug in the Linux port of my code and discovered that deep inside the floating-point processor of 32-bit Intel chips there are 80-bit registers: all intermediate calculations are accurate up to 80 bits, and the final result gets truncated and stored in 64-bit double words. I had to fiddle with compiler flags to disable the "extra" accuracy. A tree I was building was using an 80-bit accurate key during insertion and a 64-bit accurate st
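For anyone curious, that class of bug (intermediate results carried at higher precision than stored ones) can be illustrated in pure Python. Python can't expose x87 registers, so `decimal` stands in for the extra precision here — an illustrative sketch, not the actual x87 mechanism:

```python
# Stand-in for the x87 scenario: a value computed at extended
# precision no longer equals its own stored, truncated copy.
from decimal import Decimal, getcontext

getcontext().prec = 25                  # pretend this is the 80-bit register
in_register = Decimal(1) / Decimal(3)   # extended-precision intermediate
stored = float(in_register)             # truncated to a 64-bit double

print(in_register == Decimal(stored))   # False: comparison keys disagree
```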

  • capable of storing around 2,000 words in a unit the size of a white blood cell,
    Yeah, so... how many words in a unit the size of a football field? Remember, that's the only area-measuring unit we understand!!
    • Is that an English football, or an American football?

      Sorry everybody, I just couldn't resist.

      Must control these Montyesque fingertappings...

  • Research abstract (Score:5, Informative)

    by FleaPlus ( 6935 ) on Wednesday January 24, 2007 @08:45PM (#17746138) Journal
    The piece on Yahoo! News was pretty low on details, so here's the abstract from the Nature paper:

    A 160-kilobit molecular electronic memory patterned at 10^11 bits per square centimetre [nature.com]

    Jonathan E. Green, Jang Wook Choi, Akram Boukai, Yuri Bunimovich, Ezekiel Johnston-Halperin, Erica DeIonno, Yi Luo, Bonnie A. Sheriff, Ke Xu, Young Shik Shin, Hsian-Rong Tseng, J. Fraser Stoddart and James R. Heath

    The primary metric for gauging progress in the various semiconductor integrated circuit technologies is the spacing, or pitch, between the most closely spaced wires within a dynamic random access memory (DRAM) circuit. Modern DRAM circuits have 140 nm pitch wires and a memory cell size of 0.0408 µm². Improving integrated circuit technology will require that these dimensions decrease over time. However, at present a large fraction of the patterning and materials requirements that we expect to need for the construction of new integrated circuit technologies in 2013 have 'no known solution'. Promising ingredients for advances in integrated circuit technology are nanowires, molecular electronics and defect-tolerant architectures, as demonstrated by reports of single devices and small circuits. Methods of extending these approaches to large-scale, high-density circuitry are largely undeveloped. Here we describe a 160,000-bit molecular electronic memory circuit, fabricated at a density of 10^11 bits cm⁻² (pitch 33 nm; memory cell size 0.0011 µm²), that is, roughly analogous to the dimensions of a DRAM circuit projected to be available by 2020. A monolayer of bistable [2]rotaxane molecules served as the data storage elements. Although the circuit has large numbers of defects, those defects could be readily identified through electronic testing and isolated using software coding. The working bits were then configured to form a fully functional random access memory circuit for storing and retrieving information.


    Also, an interesting bit from the very end of the paper:

    Many scientific and engineering challenges, such as device robustness, improved etching tools and improved switching speed, remain to be addressed before the type of crossbar memory described here can be practical. Nevertheless, this 160,000-bit molecular memory does indicate that at least some of the most challenging scientific issues associated with integrating nanowires, molecular materials, and defect-tolerant circuit architectures at extreme dimensions are solvable. Although it is unlikely that these digital circuits will scale to a density that is only limited by the size of the molecular switches, it should be possible to increase the bit density considerably over what is described here. Recent nano-imprinting results suggest that high-throughput manufacturing of these types of circuits may be possible. Finally, these results provide a compelling demonstration of many of the nanotechnology concepts that were introduced by the Teramac supercomputer several years ago, albeit using a circuit that contained a significantly higher fraction of defective components than did the Teramac machine.
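The "isolated using software coding" step maps naturally onto a bad-block remap table, the same trick flash controllers use. A hedged sketch of the idea follows — the paper's actual coding scheme isn't given in the abstract, so the defect rate, addressing, and helper names here are illustrative only:

```python
# Sketch of defect-tolerant addressing: test every cell, record the
# defective addresses, expose only working cells via a remap table.
import random

RAW_BITS = 160_000                     # physical cells on the chip
random.seed(1)
defective = {i for i in range(RAW_BITS) if random.random() < 0.25}  # ~25% bad (assumed)
working = [i for i in range(RAW_BITS) if i not in defective]        # remap table

cells = {}                             # physical address -> bit value

def write_bit(logical, bit):
    cells[working[logical]] = bit      # route writes around bad cells

def read_bit(logical):
    return cells[working[logical]]

write_bit(0, 1)
write_bit(1, 0)
print(read_bit(0), read_bit(1), len(working), "usable bits")
```

The usable capacity is simply `len(working)`; every logical address the outside world sees is guaranteed to land on a cell that passed the electronic test.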
  • I thought white blood cells were giant honking cells. Aren't they much bigger than the size of the manufacturing process used to fabricate modern computer chips? I would have thought a piece of silicon the size of a white blood cell would be able to store more than 2000 words.
  • The cell is capable of storing a file the size of the United States' Declaration of Independence with room left over."
    How many Libraries Of Congress per Volkswagen Beetle is that?
    • by Dunbal ( 464142 )
      How many Libraries Of Congress per Volkswagen Beetle is that?


      So someone with AIDS and a low white cell count now has to worry about losing their vocabulary, among other things?
  • It could hold .008 songs.
  • I don't know... I think the memory cells in some of my users' brains are denser. And I don't mean in the storage capacity sense.
