Data Storage Hardware

One Step Closer To Speedier, Bootless Computers 249

CWmike writes "Physicists at the University of California at Riverside have made a breakthrough in developing a 'spin computer,' which would combine logic with nonvolatile memory, bypassing the need for computers to boot up. The advance could also lead to super-fast chips. The new transistor technology, which one lead scientist believes could become a reality in about five years, would reduce power consumption to the point where eventually computers, mobile phones and other electronic devices could remain on all the time. The breakthrough came when scientists at UC Riverside successfully injected a spinning electron into a resistor material called graphene, which is essentially a very thin layer of graphite. The graphene in this case is one-atom thick. The process is known as 'tunneling spin injection.' A lead scientist for the project said the clock speeds of chips made using tunneling spin injection would be 'thousands of times' faster than today's processors. He describes the tech as a totally new concept that 'will essentially give memory some brains.'"
This discussion has been archived. No new comments can be posted.

  • Wishful thinking... (Score:4, Interesting)

    by Braintrust ( 449843 ) on Tuesday October 19, 2010 @02:10AM (#33943138)

    Is it wrong that, as fast as things are changing these days, part of me still hopes for one of these '1000x faster in 5 years' technologies to live up to its full promise?

    I know it's coming; if not this tech then surely another one... I guess one hopes to live in interesting times, and I still dream of the day I wake up and there's a computer for sale that shatters Moore's Law. A computer 1000x faster than what was available the day before.

    Faster, please.

    (and thank you)

    • by Relic of the Future ( 118669 ) <dales AT digitalfreaks DOT org> on Tuesday October 19, 2010 @02:55AM (#33943358)
      1000x in 5 years IS wishful thinking, but it isn't THAT drastically off from Moore's law, which predicts a 1000x increase every 10 to 15 years. And it's never happened overnight, but in steps every few months. Many of the "1000x-predicted" technologies that /. covered 10 years ago probably have been part of the 1000x-actual increase of the last 10 years.
      • Re: (Score:3, Informative)

        by at_slashdot ( 674436 )

        Moore's law is an observation, not a law, and it's actually about the number of transistors per unit area; it doesn't say anything about speed.

    • Re: (Score:3, Informative)

      by Katzhuu ( 1267952 )
      It's not breaking Moore's law. The schedule was 5 years to get the first spin transistor and 10 more before they're introduced into consumer products. That is 15 years. With performance doubling every 18 months, in 15 years performance should be ~1000 times today's performance according to Moore's law. Of course, the article talked only about speed and not performance. Performance may get additional boosts from sources other than just the speed of the chip.
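
      A quick back-of-the-envelope sketch of that arithmetic in Python (the 18-month doubling period is the usual rule-of-thumb assumption, not a figure from the article):

      # Rough check of the ~1000x-in-15-years figure, assuming performance
      # doubles every 18 months (an assumed period, not from the article)
      years = 15
      doublings = years * 12 / 18   # 10 doublings in 15 years
      speedup = 2 ** doublings      # 2**10 = 1024, i.e. roughly 1000x
      print(doublings, speedup)     # 10.0 1024.0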
    • Re: (Score:2, Interesting)

      by pacinpm ( 631330 )

      A computer 1000x faster than what was available the day before.

      Faster, please.

      (and thank you)

      Isn't 1000x faster too fast? I heard we are already close to the limit set by the speed of light. If we go faster, then chips would have to get smaller so signals can travel across them in one cycle.

      • Re: (Score:2, Informative)

        Isn't 1000x faster too fast? I heard we are already close to the limit set by the speed of light. If we go faster, then chips would have to get smaller so signals can travel across them in one cycle.

        The day that the speed of light is holding us back we'll be in pretty good shape technologically speaking. I'm not sure if our planet will last long enough for us to get there, but it's not like we've got any other choice. Damn the electrons, full speed ahead!

        • Re: (Score:3, Informative)

          by Anonymous Coward

          5GHz means cycling every 0.2 nanoseconds. In 0.2 nanoseconds, light travels about 6cm. We're already pretty close to the limit for keeping processing synchronised over a large blob of silicon without using methods more cunning than just saying "feh, doesn't matter, light is fast".
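
          A quick sketch of that arithmetic in Python (the 5 GHz clock and the vacuum speed of light are illustrative assumptions; real signals in copper or silicon are slower):

          # Best-case distance a signal can cover in one clock cycle
          c = 3.0e8                    # speed of light in m/s (vacuum)
          clock_hz = 5.0e9             # assumed 5 GHz clock
          cycle_s = 1.0 / clock_hz     # 0.2 nanoseconds per cycle
          distance_cm = c * cycle_s * 100
          print(cycle_s, distance_cm)  # 2e-10 s, about 6 cm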

          • Re: (Score:3, Interesting)

            by Amouth ( 879122 )

            part of our problem is that we are using electrical waves - you can't put a second wave into the pipe till the first is finished - whereas if we could switch to optics we could, in theory, slam the photons as close together as we can and have them back to back in the pipe...

            while now at 5GHz we have 1 signal per 6cm, with photons we could have a near-infinite number in the same space.

            • Re: (Score:3, Interesting)

              by blueg3 ( 192743 )

              You couldn't. Photons don't work like distinct particles, really. If you want a light signal that's localized in space, it will consist of multiple photons and will spread out as it travels. The EM wave packet will interfere with nearby wave packets in much the same way as you describe.

          • Re: (Score:3, Informative)

            by durrr ( 1316311 )
            This is only a problem for off-circuit communication. Current CPUs are something along the lines of 1cm diagonally if they are large, meaning that computation within the limits of an integrated circuit (or equivalent device of similar size) can be ludicrously much faster before lightspeed becomes a true limitation.
            This is especially true when we consider spintronics devices, as their diminutive power requirements allow us to hypothesize about a cubic-cm supercomputation circuit without having to factor in a cooling sys
    • by mrnobo1024 ( 464702 ) on Tuesday October 19, 2010 @05:21AM (#33943900)

      If history is any indicator, then the next version of every software program would then be 1000x slower.

      • Re: (Score:3, Funny)

        by ultranova ( 717540 )

        If history is any indicator, then the next version of every software program would then be 1000x slower.

        Yeah, but they'll also be 1000x smarter, meaning that SpinFox will automatically mod down any messages it thinks you might disagree with - with automatically created, nursed and ripened sock puppets!

        Seriously speaking, 1000x faster starts getting near the level of human brains in raw power, so we should be able to run a real artificial intelligence on it.

        • Re: (Score:3, Insightful)

          by mrnobo1024 ( 464702 )

          Again, let's just look at the history. Computers are about 1000x faster than they were in 1980. What does software have to show for it? It's often more of a pain to use (I hate it when software tries to be "smart". Don't second-guess me, just give me an easy way to express what I want to do), and it's buggier than ever.

          Seriously speaking, 1000x faster starts getting near the level of human brains in raw power, so we should be able to run a real artificial intelligence on it.

          Even if this were true, we would

          • Re: (Score:3, Interesting)

            by TheRaven64 ( 641858 )

            Again, let's just look at the history. Computers are about 1000x faster than they were in 1980. What does software have to show for it? It's often more of a pain to use (I hate it when software tries to be "smart". Don't second-guess me, just give me an easy way to express what I want to do), and it's buggier than ever.

            If you honestly believe that, fire up DOSBox and spend a day only using software from the '80s. It was no less buggy (and crashing one app did mean crashing the whole OS back then), and it was definitely more of a pain to use.

            • > ...crashing one app did mean crashing the whole OS back then

              Not when the OS was Unix.

          • Re: (Score:2, Informative)

            Again, let's just look at the history. Computers are about 1000x faster than they were in 1980.

            Math fail.
            30 years is 30*12 = 360 months.
            360/13 = about 27.7 doublings (assuming a doubling of speed every 13 months; not sure if this is accurate).
            2^27.7 = 218,037,342.4.
            That is way more than 1000 times.

            Example: a Cray X-MP (1982) had 400 MFLOPS.
            The Cray XT5 (2009) has 1.759 PFLOPS.
            This is (1.759x10^15)/(400x10^6) = 4,397,500 times as much. Not as much as predicted with x2 every 13 months, but you get the picture.
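
            The same arithmetic as a quick Python sketch (reusing the 13-month doubling assumption and the Cray figures quoted above):

            # Predicted vs. observed speedup over 30 years
            months = 30 * 12
            doublings = months / 13              # about 27.7 doublings
            predicted = 2 ** doublings           # roughly 2.2e8 times
            cray_xmp = 400e6                     # Cray X-MP (1982), 400 MFLOPS
            cray_xt5 = 1.759e15                  # Cray XT5 (2009), 1.759 PFLOPS
            actual = cray_xt5 / cray_xmp         # 4,397,500 times
            print(predicted, actual)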

          • Re: (Score:3, Interesting)

            I have yet to see anyone satisfactorily define "intelligence", let alone propose a plausible algorithm for it.

            I use the definition, "a problem solving engine"; that is to say, an engine based entirely on solving any problem presented to it (presumably using an extensible language, internally if not externally). Things like "How do I gather all of my senses into one place for processing" and "given all of this sensory information, what do I do now" and "how do I express my feelings for this person (with or without saying anything)" are problems and so could be understood by such an engine, as could many other facet

        • Re: (Score:3, Informative)

          by B1oodAnge1 ( 1485419 )

          The problem of artificial intelligence is not one of processing power. Even given infinite speed we have no clue how to begin emulating the function of the human brain.

          I'm assuming that by "real AI" you mean a self aware computer program.

          • Given infinite speed and memory, we wouldn't need to. Just start a few billion unwired neural networks with suitably elaborate and controllable activation functions and ten trillion neurons each, feed them into a genetic algorithm, set the fitness function to some form of tribal-life simulation game... and let it go from there.
        • Re: (Score:3, Funny)

          by skids ( 119237 )

          Well, supposedly consciousness will emerge on its own once enough complexity is introduced, so it's just a matter of jamming as much complexity down in there as we can. Maybe an entire copy of the business logic of all the world's health insurance and financial companies would do the trick. :-)

          Now as to whether the consciousness will have a will, or be a passive observer, that's a better question (neglecting the tenable argument that will is an illusion.) Creating consciousness won't be very interesting t

    • Re: (Score:3, Insightful)

      Comment removed based on user account deletion
      • I shut down rather than hibernate as I can't predict which OS I will need to use next. Having to wait for the 'wrong' OS to come out of hibernation, then shutting it down, is very annoying.

      • by tverbeek ( 457094 ) on Tuesday October 19, 2010 @08:34AM (#33944848) Homepage

        Uhhh...who boots anymore?

        Not enough people, if you ask me (front line support tech). Laptop users especially have completely gotten out of the habit of shutting down their computers, making their systems progressively slower and less stable as time goes on. Then they come into my office or call me on the phone with a problem (e.g. Program X won't start or keeps crashing). I shut down ("not just 'shut' but actually 'shut down'") their computer, turn it back on again, and it's "fixed". A waste of my time... and theirs.

        Annoying as it is, the boot process has the benefit of restoring a system to a largely-predictable known-good state. I miss it already.

    • We had this bootless technology in the 80s and earlier. ROMs were used for many things. Ultimately the bloat took over and ROM wasn't big enough, plus software changes so fast now too.

    • Computer technology seems to be one of the few industries left that really continues to push the limits of innovation. I fear the day when this isn't true.
    • Re: (Score:3, Insightful)

      Is it wrong that, as fast as things are changing these days, part of me still hopes for one of these '1000x faster in 5 years' technologies to live up to its full promise?

      I know it's coming; if not this tech then surely another one... I guess one hopes to live in interesting times, and I still dream of the day I wake up and there's a computer for sale that shatters Moore's Law. A computer 1000x faster than what was available the day before.

      Faster, please.

      (and thank you)

      If you create stuff, you should know that everything takes longer than you think it will; and, therefore, nothing happens as fast as you expect it to happen.

  • My mobile phone already is on all the time. So are most of my computers.

    Graphene is going to turn out to be a 'before graphene/after graphene' landmark in history.

    • by mbstone ( 457308 )

      No, more like a 'without platformate/with platformate' [youtube.com] story.

    • Re: (Score:2, Insightful)

      by sempir ( 1916194 )
      'will essentially give memory some brains.' Now if they can develop this for human consumption, think what it would do for people with Alz........Aaahhhhhh......whassitcalled? ....
    • Bad summary ....(Surprise)

      Always-on/suspend mode already covers this, and this has nothing to do with spin/quantum computing

      Non-volatile logic could be built now with conventional electronics (it would be slower in use so it isn't)

      This is not the major advantage of quantum computing... and don't hold your breath; the lead time on this is longer than indicated here...

  • spin computer (Score:2, Insightful)

    by Anonymous Coward

    Am I the only one who read the title and thought that PR firms and politicians could be in serious trouble?

  • by Facegarden ( 967477 ) on Tuesday October 19, 2010 @02:14AM (#33943154)

    So, this is becoming a trend. Bad summary. It's not an outright lie, just misleading. From reading the article, one might get the sense that we might see this in products in 5 years. However, the article actually states that the guy said:
    "I'm one of those researchers that really cringes at the thought of saying this [new technology] can be useful. I think for us, maybe within five years we can get one device working."

    So, the guy is realistic, and not a douche. "We can maybe get one working in 5 years" is not the same as seeing it in devices in 5 years (which, again, wasn't explicitly stated in the summary, but I feel like that's what people would think).

    In reality, we might get something in products in 10 years.
    -Taylor

    • by Zouden ( 232738 ) on Tuesday October 19, 2010 @03:07AM (#33943394)

      It's also terribly pedestrian to say that this could lead to "speedier, bootless computers", as if this technology will be implemented in the next Intel chip and suddenly Windows will load instantly and we'll all get high framerates in Crysis. Really, this technology is similar to quantum computing - eventually it'll find its way into extremely specialised applications, and by the time (or if) it does make it into our homes, computers will be very different things, almost unrecognisable.

      Also, "mobile phones and other electronic devices could remain on all the time." Guess what? My mobile phone already remains on all the time, because I recharge it every few days. If the reporter is talking about devices remaining on without charging, what does he think is going to power the antenna and the display? The scientists haven't invented a free energy device.

      • Re: (Score:3, Interesting)

        by shougyin ( 1920460 )
        I'm not sure if this technology bears any resemblance to the old atom-spinning technology that I read about years ago, and I've not researched this much yet, BUT from what I remember of spin technology there is no need for "boot time" or "shutting down" a system. With the atoms spun to a certain position (say, that of your normal desktop) the computer can be turned off, probably by the same usual methods, but it would shut off much like if you were to kill the power instantly. The atoms being save
    • It also doesn't look like he's talking about getting one useful prototype chip working in five years. He seems to mean one spin transistor five years from now. That probably puts it at decades from hitting market and only if it's a more suitable technology than all the other technologies that could, just maybe, replace the current basic transistor design.

  • Another minor bit of progress in materials science being blown up into a revolutionary advance. We get about one of these every two weeks. Right now, these guys have a one-bit device that consumes more power than DRAM. They really should hold off on the press releases until they're further along. Maybe this will be useful, and maybe it won't be.

    It's stuff like this that gives nanotechnology a bad name.

  • For a scientist or engineer, saying 'a reality in 5 years' is fair enough if he was referring to ready-for-production hardware or the first trickle of concept models at technology product expos. But the one about 'You can keep them powered on' is like a game changer from out of left field. Maybe booting will become irrelevant by then?

  • Cool stuff but... (Score:2, Insightful)

    by PmanAce ( 1679902 )
    Will this new technology finally bring us to our beloved flying cars?
  • Boolean Memory. (Score:2, Informative)

    by Anonymous Coward

    He describes the tech as a totally new concept that 'will essentially give memory some brains.'

    Computer memory combined with logic gates. [trnmag.com]

  • by janek78 ( 861508 ) on Tuesday October 19, 2010 @02:30AM (#33943246) Homepage

    Close, but not what I need - I need something to give my brain some memory!

  • Re: (Score:2, Insightful)

    Comment removed based on user account deletion
  • by Dexter Herbivore ( 1322345 ) on Tuesday October 19, 2010 @02:32AM (#33943254) Journal
    What excuse do I use now to go and make my morning cup of coffee without looking like a slacker?
  • Even today's mainstream CPUs are far more powerful than what our everyday tasks involve. Even the fps-hungry gaming crowd has been reaching the perceptive limits of the human eye, and the frame rate has become a sport, a statistical value.

    Unless society takes on SETI, parallel computing etc. as hobbies, we won't need more processing power in our daily lives.
    • eat all CPU power available and can eat a couple of orders of magnitude more.
    • by theheadlessrabbit ( 1022587 ) on Tuesday October 19, 2010 @02:46AM (#33943326) Homepage Journal

      Even today's mainstream CPUs are far more powerful than what our everyday tasks involve. Even the fps-hungry gaming crowd has been reaching the perceptive limits of the human eye, and the frame rate has become a sport, a statistical value.

      Unless society takes on SETI, parallel computing etc. as hobbies, we won't need more processing power in our daily lives.

      Just wait till the next version of windows hits the shelves...

      I'm fairly certain that computing power is like hard drive space or time till the deadline: we will always find ways to fill it, no matter how much we think we have in the beginning.

    • by Animats ( 122034 ) on Tuesday October 19, 2010 @02:55AM (#33943356) Homepage

      Even today's mainstream CPUs are far more powerful than what our everyday tasks involve.

      Usually that's true. But today I was using Autodesk Inventor, which is a parametric CAD solid modeling system. That's one of the few desktop applications that can usefully use gigabytes of memory and a dozen CPUs.

      (I worked on the development of AutoCAD in the early 1980s, when the problem was cramming usefully sized drawings into 640K of RAM, a 20MB hard drive, and a 0.25 MIPS CPU. It was a tough cramming job. I used to dream about the day when we could have a CAD system with real-time solid modeling, automatically connected to CNC machine tools, running on a desktop computer. It took four or five more orders of magnitude in CPU power to make it work, and it's here. I'm glad I got to see it happen.)

      • Well, it happened... from this point on we need better and smaller CPUs, only for mobile devices.
    • unless society takes on seti, parallel computing etc as hobbies, we wont need more processing power in our daily lives.

      Have you considered that most of the difficulty of modern graphics comes from the limitation of not being able to simply tell the computer: "Take these mathematically defined objects and represent them by simply raytracing everything from this PoV"?

      We're still very far (many decades) from the point where we don't need more computing power to represent graphics in a way that doesn't get between the concept of what we want to represent and the reality of what we're forced to accept as the most we can do.

      Actually, I'd bet

    • Alright Poindexters! You heard the man, we're done with your science and inventing shit. Now that we have good looking FPSs the computer revolution has done its job. Collect your pink slips on the way out. No, you can't keep your slide rule. Why I oughta.....

    • Right. Now we just have to completely rewrite every piece of consumer software out there from the ground up to actually make sensible use of all that power so that my desktop does not feel as if I was still sitting in front of that Pentium 1 machine from the dawn of modern times.
  • Shock and horror! Where will I stick the dead bodies? And the horse's head? Damned progress!
  • As for me, (Score:5, Funny)

    by mbstone ( 457308 ) on Tuesday October 19, 2010 @03:19AM (#33943428)

    I'd settle for speedier, botless computers.

    • Sorry, not going to happen. Unfortunately, Linux on the desktop is dead.

      A Windows editorial writer said so yesterday :(
  • Wrong conclusion (Score:4, Interesting)

    by Errol backfiring ( 1280012 ) on Tuesday October 19, 2010 @03:51AM (#33943530) Journal

    The earliest computers had non-volatile memory, but that is where the booting process originates from!

    The word "booting" comes from the word "bootstrap" which was the tiny program you had to toggle in (with binary switches for the register and the address) into memory, which you could start and which would then load the OS from punch cards.

    The memory was still filled, but you did not know what with. So the computer's memory was basically a swamp, and it had to pull itself out with its own bootstraps, like Baron von Münchhausen. Hence the name.

    • by Cato ( 8296 )

      Not really... Some of the earliest computers used 'core memory', which was the only RAM and was also non-volatile. You toggled the bootstrap into this core memory, from your own memory or a cheat sheet (fortunately it was very short), and it was then used for multiple boots. Only if a really bad error caused a program to scribble over the bootstrap did you have to re-enter the bootstrap code.

      On the PDP-8 I used, the bootstrap code was enough to read the OS in from paper tape - see http://en.wikipedia.org/ [wikipedia.org]

  • So far as "bootless" goes, my old PDA is ready for use virtually instantaneously. It still boots - more or less, but instead of the multi-minute bloat of modern operating systems, it is capable of doing anything I need within <clicks fingers> about that much time. Now the functions of the PDA are strictly limited. Let's see what we've got: word processor, games, internet browser, email, calendar, video/MP3 players -- hang on a second! Maybe it's not really that limited after all. Give it a keyboard,
    • by aXis100 ( 690904 )

      I think you will find it just stays in low power standby mode, and when you press the button it comes out of stand by. Properly rebooting a PDA takes a significant amount of time.

  • Bootless computers are a reality. The operating system needs to be stored in flash memory (or ROM, with flash memory patching). It's simple. The boot time of popular OSes stems from two things: Microsoft is a technically uninspired desktop OS monopoly, and Linux has server origins while Linux on the desktop is nothing but an uninspired copycat of an uninspired MS implementation.

    The Commodore 64 featured a bootless design like 30 years ago.

  • by mdm42 ( 244204 ) on Tuesday October 19, 2010 @05:31AM (#33943942) Homepage Journal

    So, all very nice, we'll be able to have always-on computers that don't pig out on energy, BUT...

    How much of the software we use can handle running for long periods of time without crashing? Not much, in my experience.

    What with memory leaks, bounds overflows and who knows what else, some of which may be an oversight in your own code, but more likely is a bug inside some library you're using, or a compiler bug, or linker bug, or...

    As anybody who has tried it knows, writing software that runs for weeks and months on end without restarting is really quite hard. And it's no bloody use having hardware that can stay up for months on end if the software can't.

    (And, not having used Windows in about 14 years, I'm not talking about that piece of shite.)

    • Re: (Score:3, Insightful)

      Depends on your context: for a user-facing computer, that is pretty much true. If X pukes itself, taking my graphical programs with it (or even if my browser pukes itself, taking my tabs with it) I might as well have rebooted the computer for all the inconvenience I've just been put to. There are even a few situations where (without good design) non-volatile memory could make things worse: today, if some peripheral gets confused and its internal processor stops talking, or starts talking nonsense to the outsi
  • You may not need to *boot*, but as long as you run MS software you'll always need to REboot.
  • Booting (Score:5, Informative)

    by ledow ( 319597 ) on Tuesday October 19, 2010 @06:08AM (#33944070) Homepage

    Computers needing to "boot" is a relatively modern invention caused in part by hardware hotplug, backwards compatibility modes and reliability checks.

    Most of the boot process is:

    - Moving out of legacy modes (e.g. enabling increased capabilities from basic instruction sets to full modern ones, enabling different memory access models, enabling 64-bit etc.), ramping up core speed, enabling things like DMA and moving from "safe" memory timings to those that the chips report they can support when the negotiations finally take place, bringing up the non-boot CPUs, etc.

    - Contention. Doing only a certain number of things on the bus at any one time, making the buses serial, making the buses have sub-buses and other ideas. Sometimes there is no quicker way to do things. Sometimes it *will* take 1000ms before the disk will respond that it's up to speed.

    - Checking that RAM does indeed do what it's told, that a boot loader is present, that a floppy is present (yes, even on some modern BIOSes), checking IDE/SATA channels and retrieving capabilities, checking memory timings, checking PCI and USB buses, checking that disks are spinning, etc.

    Some of my servers take up to 3 minutes to get to the point where they can actually load the first byte from disk to begin loading the OS. A lot of this time is BIOS handoff to the BIOS on the RAID cards (and sometimes the network cards), those RAID cards checking, assembling and enabling the drives, etc. With two RAID cards, we've just nearly doubled boot time. Proper (reasonable) memory checks of several GB of RAM still take a while, even for a simple test. And yet there's still a minute or so of absolute complete waste as we start in some 8086 legacy mode and slowly have to ramp up disks, cards and our own CPUs, not to mention external hardware like USB and DVD drives "just in case". And then the OS has to go and do it all itself again later anyway.

    This is why things like the LinuxBIOS (now called Coreboot) project actually work better and faster - when we KNOW what the BIOS needs to do, we find that lots of it is done twice, lots of it is unnecessary, lots of it can be delayed until we actually NEED the DVD drive, some of it can occur in the background because it will ALWAYS take a long time to start, etc. But how many fixed sets of hardware does that project actually work on? Few. Because not only is it tricky to do that sort of analysis, but it's tricky to lock down exactly what the BIOS needs to do and do better than the original BIOS.

    We can have an "instant on" computer. It's easy. My ZX Spectrum did it nearly 30 years ago. My calculator does it now. The Psion organisers all did it. Most portable games consoles manage it. The thing you have to realise though is that it means: booting into a single, fixed OS that's tricky to upgrade, making power management apply to every process perfectly, fixing down a set of hardware that we know can always boot into a certain configuration very quickly, changing the way that all our chips work so they start in their best mode, not their worst (and thus probably destroying things like OS installers as we know them and making them specific to a machine type - no more installing a modern OS on old computers, or an old OS on modern computers), removing any sort of consistency checks and having to rely on things not going wrong or the hardware being able to handle all hardware errors (e.g. ECC memory for everything with reporting of anything it can't handle), and building every component so it doesn't "negotiate" or "initialise" but just works (e.g. even a keyboard controller can take some time to come back online at the moment, not to mention graphics, disks, USB buses, etc.).

    Instant-on computers are always possible, and some of them are very useful for certain things. But generic PCs and instant-on won't happen until CPUs, disks and bus negotiations take literally fractions of a second for any operation (and thus we still do as many instructions to initialise but they take clock cycles

    • Thanks for the informative post. Wish I had mod points.

      Some of this boot speed improvement might have to do with the nature of the overall architecture built around a central CPU. When people boot a computer, they would like the thing to be immediately responsive. If there were a sort of bus that related to your primary display and core computing services that was independent of the rest, then you could get instant responsiveness (like a calculator) even if the rest of the system took a while to come up to speed. Fo

  • by Viol8 ( 599362 ) on Tuesday October 19, 2010 @06:14AM (#33944098) Homepage

    I don't want to sound like your usual get-off-my-lawn type, but in the days of home computers you could switch one on and it would be ready literally in under a second. Yes, I know the "OS" was probably only 16K in size or less, but it was in ROM and the computer didn't bother with pointless self-checking (you'll soon know if some hardware on your PC isn't working).

    Even early DOS machines could boot in mere seconds. So really all this very complicated technology is doing is bringing us back to where we were 20 or 30 years ago.

    Plus ca change.

  • While this technology sounds quite interesting and (assuming it pans out outside the lab) will definitely shake up the world of tiny embedded devices, smart dust, bridge bolts that you can set SNMP traps on, etc., it will be very interesting to see whether or not, and how quickly, it shakes up the world of "computers" in the more or less conventional "you sit in front of it and type at the intertubes" sense.

    For years now, we've had computers that can(albeit by much lower tech means) be said to have "non v
  • Once upon a time, before many readers here were born, most computers were bootless. Memory (RAM) was non-volatile "core" -- little magnetized iron donuts in a grid of wires. Discrete semiconductors (transistors -- TTL) were far too expensive and used only for registers. Power cycling would wipe the registers and cause a restart. But woe betide you if core got corrupted. Then it had to be rebuilt, a relatively long process almost certainly involving lots of tape and toggling (entering bits with switches)
  • My guess is that graphene is going to make the manufacturing of computers and chips much less expensive, reducing the price of computers even further. I predict that a super-fast, always-on laptop will cost around $99 in the near future.
