
IBM Shrinks Bit Size To 12 Atoms

Lucas123 writes "IBM researchers say they've been able to shrink the number of iron atoms it takes to store a bit of data from about one million to 12, which could pave the way for storage devices with capacities that are orders of magnitude greater than today's devices. Andreas Heinrich, who led the IBM Research team on the project for five years, said the team used the tip of a scanning tunneling microscope and unconventional antiferromagnetism to change the bits from zeros to ones. By combining 96 of the atoms, the researchers were able to create bytes — spelling out the word THINK. That settled the theoretical question of how few atoms it could take to store a bit; now comes the engineering challenge: how to make a mass storage device perform the same feat as a scanning tunneling microscope."
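The figures in the summary hang together arithmetically; a quick sketch (using only the numbers quoted above) shows how the 96-atom byte and the density gain fall out:

```python
# Back-of-the-envelope check on the figures quoted in the summary:
# ~1,000,000 iron atoms per bit conventionally vs. 12 in the experiment.
ATOMS_PER_BIT_CONVENTIONAL = 1_000_000  # approximate, per the article
ATOMS_PER_BIT_IBM = 12

atoms_per_byte = ATOMS_PER_BIT_IBM * 8
print(atoms_per_byte)  # 96 -- matches the 96-atom byte in the experiment

word = "THINK"
atoms_for_word = len(word) * atoms_per_byte
print(atoms_for_word)  # 480 atoms to spell the five-byte word

density_gain = ATOMS_PER_BIT_CONVENTIONAL / ATOMS_PER_BIT_IBM
print(f"{density_gain:,.0f}x")  # roughly 83,333x fewer atoms per bit
```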
  • by paleo2002 ( 1079697 ) on Thursday January 12, 2012 @05:10PM (#38678256)
    Next thing you know, everyone will have to buy appliances with electron guns, magnetrons, lasers and other outlandish sci-fi devices built into them. They'll probably take up entire rooms and cost hundreds of thousands of dollars!
  • by dzr0001 ( 1053034 ) on Thursday January 12, 2012 @05:19PM (#38678350)
    Increasing disk density only solves a handful of problems, and it can create new ones. As per-disk capacity grows, more and more applications become I/O bound, contending for the same piece of metal. For many, if not most, organizations that need large amounts of data, increasing per-disk density is pointless unless new technology can retrieve it at a correspondingly faster rate.
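The parent's point can be made concrete with a toy calculation: if sequential throughput stays flat, the time to stream a full drive grows linearly with capacity. The numbers below are illustrative, not drawn from any specific drive:

```python
# Sketch of the "bigger disks, same throughput" problem: time to read
# an entire drive at a fixed sequential rate scales with capacity.
def full_read_hours(capacity_tb: float, throughput_mb_s: float) -> float:
    """Hours to stream a whole drive at a fixed sequential rate."""
    total_mb = capacity_tb * 1_000_000  # TB -> MB (decimal units)
    return total_mb / throughput_mb_s / 3600

# At an assumed 150 MB/s, a 100x capacity jump means a 100x longer full scan.
for tb in (1, 10, 100):
    print(tb, "TB:", round(full_read_hours(tb, 150), 1), "hours")
```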
  • Bad article (Score:5, Insightful)

    by Anonymous Coward on Thursday January 12, 2012 @05:19PM (#38678354)

    There's a better article here [] which includes some more information on the experiment. In particular the temperature was 0.5K.

    Also, the Computerworld article claims that using an antiferromagnetic arrangement of atoms is an advantage because it pulls the atoms more tightly together. I'm not convinced that this is true, but even if it is, the effect would be completely negligible. The interesting aspect of this arrangement is that each atom cancels out the magnetic field of the atoms on either side of it, which should help with data stability (a similar effect is seen in perpendicular recording today).
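The cancellation the parent describes is easy to picture with a toy spin chain: alternating moments sum to zero net field, while an aligned (ferromagnetic) chain's moments add up and disturb neighboring bits. This is only a cartoon of the physics, using unit moments:

```python
# Toy illustration: net moment of a 12-atom bit in ferromagnetic
# vs. antiferromagnetic ordering (unit moments, no real physics).
n_atoms = 12  # one bit in the IBM experiment

ferro = [+1] * n_atoms                           # all moments aligned
antiferro = [(-1) ** i for i in range(n_atoms)]  # alternating moments

print(sum(ferro))      # 12: large net moment, stray field couples to neighbors
print(sum(antiferro))  # 0: moments cancel, adjacent bits barely interact
```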

    Unrelatedly: have they/will they publish a paper on this? I can't find anything mentioning a paper in the press releases.

  • Re:Bad article (Score:2, Insightful)

    by sheepe2004 ( 1029824 ) on Thursday January 12, 2012 @05:21PM (#38678382) Homepage
    Gah posted this as AC by mistake.
  • by tocsy ( 2489832 ) on Thursday January 12, 2012 @05:38PM (#38678546)

    I'm a materials science graduate student, and my research is on semiconductors. While I don't work with materials for data storage, I have a pretty good background in electronic properties of materials so maybe I can shed some light on the situation.

    Basically, I suppose this would be hypothetically possible, but the problems you'd face would be very, very difficult to solve. The big problem here is that in order to keep something ionized, you would have to completely isolate it from any other atoms that might donate or steal an electron. Again, it's hypothetically possible, but impractical, considering the atoms that won't exchange electrons are mostly noble gases. Not to mention, storing data as ionized/un-ionized atoms is fundamentally different from the way we store data now (magnetic domains). I think the more reasonable idea would be to shrink magnetic domains, as well as the number of domains required to form a bit. If I remember correctly, each magnetic domain currently consists of several hundred atoms and each bit consists of around 100 domains. As the article states, the best we could get is one atom representing one bit, and magnetism is far more likely than ionization to remain the mechanism for distinguishing ones from zeroes.

  • by pscottdv ( 676889 ) on Thursday January 12, 2012 @06:05PM (#38678818)

    Actually, an STM is typically about the size of a baseball. The vacuum chamber housing it, however...

  • by JustinOpinion ( 1246824 ) on Thursday January 12, 2012 @06:30PM (#38679002)
    You're right that for STM and AFM instruments, vibration is a huge issue. But when using those instruments, you're trying to image nano-sized objects, or even individual atoms. So of course vibrations bigger than an atom's width will ruin your image. You can compensate for this (to a point) by making the device more rigid, and also by damping out environmental noise. But there's a limit to what you can do (e.g. you can't make the cantilever your tip is attached to very stiff, or you would ruin your sensitivity).

    In an atomic magnetic memory, though, you wouldn't really be imaging individual atoms. You'd be scanning the tip back-and-forth and trying to sense (or set) the local magnetic field. Thus you wouldn't need to use a soft cantilever to hold the tip. A very stiff/rigid one would be fine, as long as it is correctly positioned in relation to the encoding atoms (close enough for sensing, etc.). The magnetic response in general will be stronger than the usual imaging modes for STM.

    My point is just that using an STM-like device for storing/retrieving data eliminates many of the design constraints that a full-blown STM faces (because it's trying to do precise topography and density-of-states imaging...). You can play many engineering tricks that they can't afford to in a real STM.

    Having said that, many challenges would remain. External vibrations could still make the device unstable (or require it to sample for longer periods to average out signals, lowering data throughput). Temperature stability is probably going to be a major concern: thermal expansion will change the nano-sized gap between the tip and the bits, which will need to be compensated for; thermal noise could overwhelm the signal entirely; thermal gradients could make alignment of the tips and compensation for temperature drift even harder; etc.
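An order-of-magnitude check makes the thermal-expansion concern vivid: even a millikelvin swing across a centimeter of support structure moves the tip by about an atomic radius. The geometry and material here are hypothetical (a 1 cm steel support is assumed purely for illustration):

```python
# Order-of-magnitude check: length change of a read-head support
# from a tiny temperature swing. Material and geometry are assumed.
ALPHA_STEEL = 12e-6   # linear thermal expansion coefficient, 1/K (typical steel)
support_m = 0.01      # 1 cm of structure between tip and medium (hypothetical)
delta_T = 0.001       # 1 mK temperature change

expansion_m = ALPHA_STEEL * support_m * delta_T
print(round(expansion_m * 1e9, 3), "nm")  # 0.12 nm -- about an atomic radius
```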

    Then again, you only have to look at the absurd sophistication of modern HDDs or CPUs to be convinced that we can handle these kinds of challenging engineering problems (if there is enough economic incentive).
