One Step Closer To Speedier, Bootless Computers 249
CWmike writes "Physicists at the University of California, Riverside have made a breakthrough in developing a 'spin computer,' which would combine logic with nonvolatile memory, bypassing the need for computers to boot up. The advance could also lead to super-fast chips. The new transistor technology, which one lead scientist believes could become a reality in about five years, would reduce power consumption to the point where computers, mobile phones and other electronic devices could eventually remain on all the time. The breakthrough came when the UC Riverside scientists successfully injected a spinning electron into graphene, a one-atom-thick sheet of carbon that is essentially a single layer of graphite. The process is known as 'tunneling spin injection.' A lead scientist on the project said the clock speeds of chips made using tunneling spin injection would be 'thousands of times' faster than those of today's processors. He describes the tech as a totally new concept that 'will essentially give memory some brains.'"
Wishful thinking... (Score:4, Interesting)
Is it wrong that, as fast as things are changing these days, part of me still hopes for one of these '1000x faster in 5 years' technologies to live up to its full promise?
I know it's coming; if not this tech, then surely another one... I guess one hopes to live in interesting times, and I still dream of the day I wake up and there's a computer for sale that shatters Moore's Law. A computer 1000x faster than what was available the day before.
Faster, please.
(and thank you)
Re:I wonder though (Score:5, Interesting)
http://en.wikipedia.org/wiki/BTRON [wikipedia.org]
But MS and the US government had it killed via market intervention.
Computer vision for mobile devices (Score:3, Interesting)
Re:we dont need more processing power tho (Score:5, Interesting)
Even today's mainstream CPUs are far more powerful than what our everyday tasks demand.
Usually that's true. But today I was using Autodesk Inventor, which is a parametric CAD solid modeling system. That's one of the few desktop applications that can usefully use gigabytes of memory and a dozen CPUs.
(I worked on the development of AutoCAD in the early 1980s, when the problem was cramming usefully sized drawings into 640K of RAM, a 20MB hard drive, and a 0.25 MIPS CPU. It was a tough cramming job. I used to dream about the day when we could have a CAD system with real-time solid modeling, automatically connected to CNC machine tools, running on a desktop computer. It took four or five more orders of magnitude in CPU power to make it work, and it's here. I'm glad I got to see it happen.)
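For what it's worth, the "four or five orders of magnitude" figure checks out. A back-of-the-envelope sketch in Python, where the modern MIPS number is an assumed ballpark for illustration, not a benchmark:

```python
# Back-of-the-envelope check of the "four or five orders of magnitude" claim.
# 25,000 MIPS is an assumed rough figure for a modern desktop CPU.
import math

old_mips = 0.25         # early-1980s PC, per the parent post
modern_mips = 25_000.0  # assumed modern ballpark

ratio = modern_mips / old_mips
orders = math.log10(ratio)
print(f"speedup: {ratio:,.0f}x ({orders:.1f} orders of magnitude)")
```

With those inputs the speedup comes out to 100,000x, i.e. five orders of magnitude, right in the range the parent describes.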
Re:Bad summary again... (Score:3, Interesting)
Wrong conclusion (Score:4, Interesting)
The earliest computers had non-volatile memory; in fact, that's exactly where the booting process comes from!
The word "booting" comes from "bootstrap," which was the tiny program you had to toggle into memory by hand (with binary switches for the value and the address), which you could then start and which would load the OS from punch cards.
The memory was still full of something, but you didn't know what. The computer's memory was basically a swamp, and it had to pull itself out by its own bootstraps, like Baron von Münchhausen. Hence the name.
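The toggle-in bootstrap described above can be sketched as a toy simulation. Every address, instruction name, and the "card" format here is invented purely for illustration:

```python
# Toy model of booting out of the "swamp": a hand-toggled bootstrap loader
# copies the real program in from a card reader. All addresses, instruction
# words, and the card format are invented for illustration.

MEMORY_SIZE = 64
memory = ["?"] * MEMORY_SIZE  # unknown contents: the "swamp"

# Step 1: the operator toggles the bootstrap in, one word at a time,
# using front-panel switches for the address and the value.
bootstrap = {0: "READ_CARD", 1: "STORE_AND_ADVANCE", 2: "LOOP_OR_HALT"}
for addr, word in bootstrap.items():
    memory[addr] = word

# Step 2: running the bootstrap pulls the OS in from the punch cards.
punch_cards = [f"OS_WORD_{i}" for i in range(8)]
load_at = 16
for i, card in enumerate(punch_cards):
    memory[load_at + i] = card

print(memory[load_at:load_at + len(punch_cards)])
```

The point of the exercise: only the three hand-toggled words have to be entered manually; everything else gets into memory by running them.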
Bootless a reality already (Score:2, Interesting)
Bootless computers are already a reality. The operating system just needs to be stored in flash memory (or in ROM, with flash-memory patching). It's simple. The long boot times of popular OSes stem from two causes: Microsoft is a technically uninspired desktop OS monopoly, and Linux, with its server origins, is on the desktop little more than an uninspired copycat of that uninspired MS implementation.
The Commodore 64 featured a bootless design like 30 years ago.
Re:Wishful thinking... (Score:2, Interesting)
A computer 1000x faster than what was available the day before.
Faster, please.
(and thank you)
Isn't 1000x faster too fast? I've heard we're already close to the speed-of-light limit. If we went any faster, chips would have to get smaller so signals could still travel across them in one cycle.
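That intuition is easy to check: at a given clock speed, a signal moving at (at most) the speed of light can only cross so much chip in one cycle. A quick sketch, using vacuum light speed (real on-chip signals propagate at roughly half that or less):

```python
# How far a light-speed signal can travel in one clock cycle.
# Uses vacuum c; actual on-chip propagation is notably slower.
C = 299_792_458.0  # speed of light, m/s

for ghz in (3, 5, 5000):  # 5000 GHz = a hypothetical "1000x" 5 THz chip
    period_s = 1.0 / (ghz * 1e9)   # seconds per cycle
    reach_cm = C * period_s * 100  # cm travelled per cycle
    print(f"{ghz:>5} GHz -> {reach_cm:.4f} cm per cycle")
```

At 5 GHz the one-cycle reach is about 6 cm; at a thousand times that clock it shrinks to well under a millimetre, which is exactly why faster chips would have to get smaller (or give up on cross-chip signalling within a cycle).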
Re:Wishful thinking... (Score:3, Interesting)
Again, let's just look at the history. Computers are about 1000x faster than they were in 1980. What does software have to show for it? It's often more of a pain to use (I hate it when software tries to be "smart". Don't second-guess me, just give me an easy way to express what I want to do), and it's buggier than ever.
If you honestly believe that, fire up DOSBox and spend a day only using software from the '80s. It was no less buggy (and crashing one app did mean crashing the whole OS back then), and it was definitely more of a pain to use.
Re:Wishful thinking... (Score:3, Interesting)
True. You can make DOSBox emulate the CPU at the correct speed, but I don't think it has a way of emulating the speed of floppy disk drives. Maybe run all of your files off an NFS server on a different continent with no caching. Even then, a modern Internet connection is significantly faster than a floppy disk was, even in terms of latency if the disk was spun down.
That said, my 8086 PC from the late '80s had a 40MB hard disk. It cost about £100 back then, and it was an after-market addition, replacing the 20MB one the machine came with. It also had an 8MHz NEC V30 CPU and a whole 640KB of RAM. Windows 3.0 ran on that machine, although running more than one app at a time was a struggle: no MMU meant no virtual memory.
Re:Wishful thinking... (Score:3, Interesting)
Part of our problem is that we are using electrical waves: you can't put a second wave into the pipe until the first is finished, whereas if we could switch to optics we could, in theory, pack the photons as close together as we like, back to back in the pipe...
At today's 5 GHz that works out to one signal per 6 cm of wire; with photons we could fit a near-infinite number into the same space.
Re:Wishful thinking... (Score:3, Interesting)
I have yet to see anyone satisfactorily define "intelligence", let alone propose a plausible algorithm for it.
I use the definition "a problem-solving engine": that is, an engine built entirely around solving whatever problem is presented to it (presumably using an extensible language, internally if not externally). Questions like "how do I gather all of my senses into one place for processing," "given all of this sensory information, what do I do now," and "how do I express my feelings for this person (with or without saying anything)" are all problems, and so could be handled by such an engine, as could many other facets of intelligence. In other words, once you had an engine like that, making something look intelligent is only a matter of finding the right problems for it to take on... which, in the end, is also something we see in humans.
But yeah, I wouldn't want to try to implement that in code. I firmly believe it could be done, but I don't have the expertise to do it.
Re:Wishful thinking... (Score:3, Interesting)
You couldn't. Photons don't really behave like distinct particles. If you want a light signal that's localized in space, it will consist of multiple photons and will spread out as it travels. The EM wave packet will interfere with nearby wave packets in much the same way you describe.