Robotics Hardware

The First Evolving Hardware?

Masq666 writes "A Norwegian team has made the first piece of hardware that uses evolution to change its design at runtime to solve the problem at hand in the most effective way. By turning on and off its 'genes' it can change the way it works, and it can go through 20,000 - 30,000 generations in just a few seconds. That same number of generations took humans 800,000 - 900,000 years." The University of Oslo press release linked from the article came out a few days ago; the researchers published a paper (PDF) on what seems to be the same technology at a conference last summer.
  • by DurendalMac ( 736637 ) on Wednesday March 28, 2007 @12:54AM (#18511911)
    I, for one, welcome our new evolving hardware overlords.

    God, I am so sorry, but it needed to be said...
    • by Seumas ( 6865 )
      I, for one, think my brain just broke. That blurb was really long for having absolutely no content or description of what the hell about the hardware was evolving. Maybe the religious nuts were right -- evolution really is evil!
    • by Anonymous Coward on Wednesday March 28, 2007 @04:17AM (#18512893)

      If anything demonstrates the mental retardation that afflicts Slashdot, it is the above.

      Some idiot claims that a horrifically unfunny cliché needed to be repeated. Another person points out the falsity of that claim.

      The first post is marked +5 Funny; and the second, -1 off-topic.

      Just think about that for a second.

      People, turn off your computers. Go outside. Breathe real air. Have sex. Get girlfriends. Stop posting on Slashdot and don't come back until you have gained the social skills and sense of humour possessed by any normal human being. Do it for me; do it for yourselves; do it for everyone.

      ChameleonDave

      • by Dogtanian ( 588974 ) on Wednesday March 28, 2007 @04:59AM (#18513065) Homepage

        Some idiot claims that a horrifically unfunny cliché needed to be repeated. Another person points out the falsity of that claim. The first post is marked +5 Funny; and the second, -1 off-topic.

        Just think about that for a second.
        Nope, sorry, I'd much rather think about Natalie Portman, naked and petrified and covered in hot grits. ;-P

        (Incidentally, this article [everything2.com] tells us that Natalie Portman comments on Slashdot are "getting old... This Natalie Portman nonsense has been going on for months; it's not funny anymore." Note that the date is Oct 24 *2000*).

        People, turn off your computers. Go outside. Breathe real air. Have sex. Get girlfriends
        I turned off my computer, went outside, sniffed the air and had sex with some passing woman. Then the woman asked "Do I know you?" and we were arrested for public indecency.
        • by Luyseyal ( 3154 )

          (Incidentally, this article tells us that Natalie Portman comments on Slashdot are "getting old... This Natalie Portman nonsense has been going on for months; it's not funny anymore." Note that the date is Oct 24 *2000*).


          Man, am I getting old.

          -l

      • /signed

        slashdot comments are about +5 for "first posts" and "cliché posts". anything referencing any nerd/geek movie/series will be rated up if quoted in a slashdotty manner. it's disgusting.
      • Some idiot claims that a horrifically unfunny cliché needed to be repeated.

        Wait, let me get this straight. You're bitching about other people missing jokes on slashdot, and saying "go outside, breathe air and get laid," and this is all in the same breath that you're calling someone else cliché?
      • by Trogre ( 513942 )
        and yet here you are.

      • I do go outside. I breathe real air. I have sex. I don't have a girlfriend, I have a wife. Imagine that! Yep, using old, inside jokes on Slashdot sure makes me a fat geek with no social skills who does nothing but sit on a computer all day! Ass.
    • Re: (Score:2, Insightful)

      by cytg.net ( 912690 )
      wonder what the machines will call the reverse turing test!

      the reverse turing test ?

      or

      are you really dumb enough to be human test ?
  • Seems like an implementation of a GA in hardware... the title seems misleading.
    • Re:GA in hardware (Score:5, Informative)

      by wish bot ( 265150 ) on Wednesday March 28, 2007 @01:02AM (#18511967)
      And it's been done before - at least once - http://www.newscientist.com/article.ns?id=dn2732 [newscientist.com] - there's another one too, but I can't find it right now. Crazy stuff though.
      • Re:GA in hardware (Score:5, Informative)

        by wish bot ( 265150 ) on Wednesday March 28, 2007 @01:19AM (#18512061)
        Ahh ha - found it - http://www.informatics.sussex.ac.uk/users/adrianth/cacm99/node3.html [sussex.ac.uk]


        My favourite bit:

        Yet somehow, within 200ns of the end of the pulse, the circuit `knows' how long it was, despite being completely inactive during it. This is hard to believe, so we have reinforced this finding through many separate types of observation, and all agree that the circuit is inactive during the pulse.
        Crazy stuff indeed.
        • Re:GA in hardware (Score:5, Interesting)

          by Dachannien ( 617929 ) on Wednesday March 28, 2007 @06:01AM (#18513269)
          Yeah, it was pretty amazing. They mapped out a section of the circuit that the genetic algorithm came up with and found that when analyzed as a logic circuit, a large portion of the configured part of the FPGA should have had no effect on its behavior. When they cropped that section of the circuit out, though, the rest of it mysteriously stopped working.

          This was because the configured circuit operated a lot of the transistors in linear (i.e., non-saturated) mode, taking advantage of things like parasitic capacitances and induced currents. No sane human would operate an FPGA in this fashion, but since those little anomalies were present, the GA took advantage of them. That's a recurring theme in GA research: if you are running a GA on a simulation, for example, and you have a bug in your simulation code, it's fairly likely that the GA will find and exploit that bug instead of giving you a normal answer. See Karl Sims's research from 1994 for some amusing examples of this.

          Sadly, Xilinx discontinued that particular FPGA line a while back, so if you can't find some old leftovers of that part, you probably won't be able to recreate the experiments yourself (the research was originally done a decade or so ago). This is because that particular device had the advantage of being configurable in a random fashion without risk of burning it out due to things like +V to GND connections. Of course, Xilinx considers their programming interface to be proprietary, so I don't know that you'd be able to recreate that work even if you did manage to find the right part.
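
          To make the mechanics concrete, here is a minimal GA loop in Python (a toy illustration, not the researchers' code; the fitness function here is a stand-in for evaluating a candidate configuration on a simulator or on the chip itself):

          import random

          GENOME_LEN = 64            # bits in the toy "configuration bitstream"
          POP_SIZE = 50
          MUTATION_P = 1.0 / GENOME_LEN
          TARGET = [random.randint(0, 1) for _ in range(GENOME_LEN)]

          def fitness(genome):
              # Stand-in evaluator. If the real evaluator (a simulator, or the
              # chip itself) has a bug or an unmodelled quirk, the GA will
              # happily exploit it; that is the recurring theme above.
              return sum(1 for g, t in zip(genome, TARGET) if g == t)

          def mutate(genome):
              return [1 - g if random.random() < MUTATION_P else g for g in genome]

          def crossover(a, b):
              cut = random.randrange(GENOME_LEN)
              return a[:cut] + b[cut:]

          pop = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
                 for _ in range(POP_SIZE)]
          for generation in range(1000):        # thousands of generations per second
              pop.sort(key=fitness, reverse=True)
              if fitness(pop[0]) == GENOME_LEN:
                  break
              parents = pop[:POP_SIZE // 2]     # truncation selection
              pop = parents + [mutate(crossover(random.choice(parents),
                                                random.choice(parents)))
                               for _ in range(POP_SIZE - len(parents))]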

    • They've been around for a long time... Send a new bitstream and you change the behavior.

      By using a GA to change the bitstream, you can have evolving hardware. If the GA is itself in the hardware then it is self evolving.

  • Skynet. (Score:5, Insightful)

    by headkase ( 533448 ) on Wednesday March 28, 2007 @12:58AM (#18511931)
    For once Skynet jokes will be on topic!
  • by __aaclcg7560 ( 824291 ) on Wednesday March 28, 2007 @12:59AM (#18511939)
    My computer has been evolving for the last ten years. It started with an AMD K6 233MHz CPU, 32MB RAM, and an Nvidia TNT 16MB video card. Now I have an AMD Athlon 64 2.2GHz, 1GB RAM, and an Nvidia GeForce 6200 128MB video card. I'm just waiting for the power supply to evolve so the system can support an ATI 512MB video card.
    • by Yoozer ( 1055188 ) on Wednesday March 28, 2007 @02:36AM (#18512397) Homepage
      Ha! I counter your evolution with irreducible complexity. Take out a part and it'll start to beep and won't do anything!
      What good is half a graphics card, anyway? (and keep your heathen comments about SLI to yourself, please)
      • by ajs318 ( 655362 )
        And I'll raise your irreducible complexity claim and toss in a paradox involving either an irreducibly complex designer arising spontaneously, or already-irreducibly complex life arising spontaneously. Either way, something irreducibly complex must have arisen spontaneously; but the second is simpler and passes both Occam's and Dawkins's razors.
      • What good is half a graphics card, anyway?

        These days? About a gig of ram and 32 cores.
  • The first would be the biosphere.
  • Call me (Score:5, Funny)

    by dcapel ( 913969 ) on Wednesday March 28, 2007 @01:02AM (#18511963) Homepage
    Call me back when I can start a culture of Core Duos in a petri dish filled with a silicon nutrient.
  • Misleading (Score:5, Informative)

    by suv4x4 ( 956391 ) on Wednesday March 28, 2007 @01:04AM (#18511983)
    At first glance, this is supposed to impress us with the hardware:

    By turning on and off its 'genes' it can change the way it works, and it can go through 20,000 - 30,000 generations in just a few seconds. That same number of generations took humans 800,000 - 900,000 years.

    In fact the simplest DNA-based organisms/structures (bacteria, viruses) have the shortest "life span". The number of generations per second isn't anything to brag about.

    All complex organisms have a lifespan longer than a microsecond, and for good reason: people pass on knowledge and adapt *during* their life span (not genetically, of course, but our brains let us adapt a great deal without genetic change).

    Hype aside, interesting development, but I wish those publications wouldn't use misleading statements in pale attempts to impress us.
    • In fact the simplest DNA-based organisms/structures (bacteria, viruses) have the shortest "life span". The number of generations per second isn't anything to brag about. ... Hype aside, interesting development, but I wish those publications wouldn't use misleading statements in pale attempts to impress us.

      It isn't exactly misleading, just an unfair comparison. Computers (and computer science) have one thing over nature in that the medium is exact:

      • Computers don't need to reproduce, so the lifespan is irrelevant (the purpose of which is to obtain the energy and materials required to reproduce, which takes time)
      • Computers don't need to operate in micro-environments using error-prone biochemical events which are slow, using error correcting mechanisms which are even slower (but very clever bec…
      • by Teancum ( 67324 )
        There are some interesting digital evolution experiments that have involved the use of artificial life concepts (think Sim City here or something even more elaborate) and even sexual reproduction.

        Type "A-life" into Google and you will get a list of some very interesting experiments along this line.

        What is interesting with the a-life experiments (beyond Conway's "Life") is trying to define the concepts you mention above, including "energy", "materials", and "lifespan". When you add competition for these res…
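
        For flavor, here is what the bare substrate of an a-life experiment can look like: a minimal Conway's Life step in Python (a toy sketch of my own; real a-life systems then layer "energy", "materials", reproduction, and competition on top of something like this):

        from collections import Counter

        def step(live):
            # live is a set of (x, y) cells; returns the next generation.
            counts = Counter((x + dx, y + dy)
                             for (x, y) in live
                             for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                             if (dx, dy) != (0, 0))
            # Birth on exactly 3 neighbours; survival on 2 or 3.
            return {cell for cell, n in counts.items()
                    if n == 3 or (n == 2 and cell in live)}

        glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
        for _ in range(4):
            glider = step(glider)   # after 4 steps the glider has moved one cell diagonally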
    • by jotok ( 728554 )
      I dunno, I think it's still pretty cool. Organisms do all kinds of things that are really neat when we implement them with computers--genetic algorithms for instance. I don't think it detracts at all from the discovery to say "Pfft, my cells have been using GA since before I was born!"
    • by zymano ( 581466 )
      The article was definitely just ad placement and stupid slash mods can't figure this out.
  • The team first started using evolution back in 2004, when they made the chicken robot "Henriette" (yes, a chicken). That robot used evolution, software-based that time, to learn how to walk on its own.
    Oh no! The sky is falling! The sky is falling!
  • by lord_mike ( 567148 ) on Wednesday March 28, 2007 @01:08AM (#18512011)
    Bah! It ain't in the Bible! Next thing you know, you'll be telling me that Programs don't believe in the Users, and that we should just blindly accept the secular rule of the Master Control Program.

    That's it... isn't it? It's all just an MCP trick!

    Well, I still believe in and will fight for the users!!

    Thanks,

    Mike
    • Re: (Score:1, Offtopic)

      by Xiph ( 723935 )
      mod parent up, gotta love tron references.
      captcha: presence
    • by sumdumass ( 711423 ) on Wednesday March 28, 2007 @01:29AM (#18512103) Journal
      Interesting you brought this up. This story/article is more or less flamebait.

      There is no content in it about the hardware, and it manages to deny creationism in the process of anthropomorphizing something it won't tell us anything about.

      I'm sure this story will evolve past this though. It is in the genes.
      • Are you talking about God being involved in the process of the evolution of that machine?

        I mean, God doesn't own a computer, geez. Every good Christian knows that; machines are evil.
          every good Christian knows that; machines are evil

          Yea, and because W stands for 6 in Hebrew, every time you go to a website you're paying homage to the evilness by typing 666.websitename.com

          I'm not talking about god being involved in anything. I am talking about how the purpose of the article/story is to say "creation doesn't exist but evolution does, and we can prove it by this inanimate object that we will describe as living and give as many animate-like properties as possible without giving any facts about i…

    • Tron! (Score:1, Redundant)

      by Etherwalk ( 681268 )
      There should be a modding category for "+1, Apt Nerdy Reference"

      Actually, we should be able to tag comments by reference, and then be able to pull up all the Tron (Or Trek, Or BSG, or Buckaroo Banzai) references that have ever appeared on slashdot.

      Or maybe we should... erm... go do... you know, productive stuff.

      I'm conflicted.
    • Robot Villager: You might as well ask how a Robot works.

      Professor Hubert Farnsworth: It's all here, on the inside of your panel.

      Robot Villager: [closes panel] I choose to believe what I was PROGRAMMED to believe!

  • Hardly new (Score:5, Insightful)

    by Teancum ( 67324 ) <robert_horning@@@netzero...net> on Wednesday March 28, 2007 @01:11AM (#18512029) Homepage Journal
    I won't go into details here, but anything that can be implemented in hardware can be done in software and the other way around too. This is a nearly ancient Electrical Engineering principle.

    In the era of programmable logic chips that can alter their own logic (the patterns are stored in RAM or flash RAM for crying out loud), this isn't even that big of a revelation. Indeed, Transmeta has been doing stuff similar to this and selling it commercially for some time. They just aren't using these cool buzzwords.

    And evolving architectures is something that I know has had serious CS research since the early 1970s, and perhaps even earlier. I don't think an idea like this is even patentable based on the earlier work in this area. I bet you could find adaptive systems that were even built specifically for the oil industry, which would defeat even a narrow claim of that nature.

    Where the money is to be made off this sort of technology is on Wall Street and other financial markets. I even found a web page from a research group on adaptive systems that said essentially, "We have discontinued research along these lines and are now working with an investment firm on Wall Street. Since we have all become millionaires, we no longer need to support ourselves through this project, and any additional details would violate our NDAs." I'm not kidding here either. These guys from Norway are not thinking big enough.
    • I won't go into details here, but anything that can be implemented in hardware can be done in software and the other way around too. This is a nearly ancient Electrical Engineering principle.

      This is only true for the very small subset of designs that don't suffer from race conditions and other phenomena that hardware engineers regard as bugs. When you randomly flip gates in the design, you don't necessarily get valid digital logic.

      • Re:Hardly new (Score:5, Informative)

        by Teancum ( 67324 ) <robert_horning@@@netzero...net> on Wednesday March 28, 2007 @02:33AM (#18512377) Homepage Journal
        I never said it was easy, but I have even seen it mathematically proven that any algorithm can be done in hardware, and I've duplicated most hardware into software myself, for those designs that I wanted to emulate.

        This is not just a very small subset of designs. It is a matter of cost and whether the engineer wants to put forth the effort to implement the whole thing in hardware. Trying to convert a first-person shooter like Doom into pure TTL logic would make the game very responsive and give you screen resolution to kill for, but would it be worth the engineering effort?

        Race conditions and other "bugs" have causes that may be due to ineptness on the part of the engineer, or because you haven't really thought the problem through sufficiently. Or there may be other things to look at as well. But don't tell me you can't implement in pure TTL logic something like an MPEG encoder... which is a very complicated mathematical algorithm. I can give you part numbers for MPEG encoders if you really want them in your next design, as they are commercially available.

        There is nothing that would stop you from implementing in hardware something like a neural network either... oh and those are indeed implemented in hardware. They are usually done in software mainly because of the cost involved, and you can use a general purpose computer to perform experiments on them. Other adaptive software algorithms have also been implemented on both hardware and software for some time as well. As I said, this is very old news here with this article.
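
        To make the hardware-to-software direction concrete, a trivial sketch: a gate-level one-bit full adder transcribed directly into Python (my own toy example, obviously, not anyone's product):

        def full_adder(a, b, cin):
            # Direct transcription of the schematic: two XORs for the sum,
            # two ANDs and an OR for the carry.
            s1 = a ^ b
            total = s1 ^ cin
            cout = (a & b) | (s1 & cin)
            return total, cout

        def add8(x, y):
            # Chain eight of them and you have emulated an 8-bit ripple-carry adder.
            carry, out = 0, 0
            for i in range(8):
                bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
                out |= bit << i
            return out, carry

        assert add8(200, 100) == (44, 1)   # 300 overflows 8 bits: result 44, carry out 1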
        • > I never said it was easy, but I have even seen it mathematically proven that any algorithm can be done in hardware

          ...modulo the requirement for the infinite tape.
          • by Prune ( 557140 )
            There can never be a physical equivalent to a Turing machine for exactly that reason of infinite memory. The best you can do is a linearly bounded automaton. Since, unlike with a TM, a non-deterministic LBA is more powerful than a deterministic one, quantum mechanics does help. As for super-Turing machines etc., you cannot have a physical implementation, as that would violate the Bekenstein bound (this also being the reason you cannot have infinite-precision real numbers in the physical world, as that implies i…
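
            (For reference, the bound being invoked caps the entropy S of a system with total energy E fitting inside a sphere of radius R: S \le \frac{2\pi k R E}{\hbar c}. Finite R and E therefore cap the information content of any physical computer.)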
            • by Teancum ( 67324 )
              What you are talking about here is a very abstract mathematical bound on the extreme ends of both software and hardware.

              I dare you to show a current digital system that can't be abstracted in software... or a current software algorithm that is written and running on actual computer equipment that also can't be duplicated in hardware with identical behavior.

              As is the case here with this "evolving hardware" demonstration that was put up by these two hackers from Oslo. They are claimin…
        • I never said it was easy, but I have even seen it mathematically proven that any algorithm can be done in hardware, and I've duplicated most hardware into software myself, for those designs that I wanted to emulate.

          I don't doubt that any software can be duplicated in hardware. Any real, running software is already implemented in hardware in some sense. I also accept that any hardware that can be described by digital logic can be duplicated in software. That leaves out an awful lot, though, including al…

    • "I won't go into details here, but anything that can be implemented in hardware can be done in software and the other way around too. This is a nearly ancient Electrical Engineering principle."

      Yep, and on a Turing machine, or a neural network... It can even be implemented as a thousand people with pens and paper. That is not the point; what is important is how much time it will take on each of those architectures, and normally specific hardware is very fast.

      "And evolving architechtures is something that

    • Re: (Score:2, Insightful)

      I won't go into details here, but anything that can be implemented in hardware can be done in software and the other way around too. This is a nearly ancient Electrical Engineering principle.

      Nope - anything digital that can be implemented in hardware can be done in software.

      Analog circuits can only be approximated in software, though with unlimited precision FP math the approximations can be pretty good (though slower).

      There are reasons why we old luddites prefer vinyl (or a good mag tape) over digital - t…
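
      By "approximated", think numerical integration. A sketch of the simplest case in Python, integrating an RC low-pass filter (SPICE does the same job with far better integrators):

      R, C = 10e3, 100e-9            # 10 kOhm, 100 nF -> tau = 1 ms
      dt = 1e-6                      # 1 microsecond time step

      def rc_step(v_in, v_out):
          # Forward-Euler integration of C * dv/dt = (v_in - v_out) / R.
          # Smaller dt gets closer to the real circuit, but it is always
          # an approximation, never the circuit itself.
          return v_out + dt * (v_in - v_out) / (R * C)

      # Step response: feed in 1 V and watch the output charge toward it.
      v = 0.0
      for _ in range(5000):          # simulate 5 ms, i.e. 5 time constants
          v = rc_step(1.0, v)
      # v is now within about 1% of 1 V.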

      • by Teancum ( 67324 )
        Perhaps I should have mentioned that, as I am a big fan of analog computers... which I believe to be a nearly lost art form and engineering skill.

        The classic example I love to use for analog computers is the fire-control computers of the Iowa-class battleships of the U.S. Navy. Originally built during WWII (and upgraded during the Korean War), these were some very remarkable targeting computers that had a very simple "user interface" and were deadly accurate. Made of finely machined curves that had been calculated…
  • Digital Lamarckism? Come on. That was disproved long ago. I'll hold out for the fittest computer to survive.

    -matthew
  • Pretty soon it'll be scratching its ass, chain-smoking, and watching reruns of Dukes of Hazzard on late-night TV. Won't that be impressive!
  • by try_anything ( 880404 ) on Wednesday March 28, 2007 @01:15AM (#18512045)
    Didn't I read about this ten years ago in Discover magazine? I remember being fascinated that some scientists had "evolved" a hardware design on reconfigurable hardware (FPGA? CPLD? don't remember), and it seemed to rely on subtle electrical effects rather than simple digital logic. The design would only work on the exact chip it was evolved on. If they even replaced the board's power supply with a different sample of the same model, it stopped producing correct output. Most of the logic gates were logically disconnected from the input and output, yet they were necessary to the design working. Amazing stuff.
    • Re: (Score:3, Interesting)

      Replying to note that another Slashdotter [slashdot.org] has gone me one better and provided a link [sussex.ac.uk] to the story I remembered.
    • Yes, I remember that story too, and it still amazes me. By evolving it randomly, it can potentially take advantage of any aspect of the electronics it's running on, including ones we don't know about yet. I guess the only way to avoid having it do this is to run it in a virtual environment that doesn't allow it to evolve anything that uses non-specified aspects of the target hardware.
      • Not really; you could also broaden the test hardware, say by using chips that have been independently developed by separate manufacturers. As long as they're all pin-compatible, there should be enough difference that you'll evolve a general algorithm.

        At the very least, it'll be general enough to work on all the manufacturers chosen.
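
        That selection pressure is easy to express in code: score each candidate on every test chip and keep the worst-case score (a sketch; evaluate_on is a hypothetical hook into the real hardware, stubbed out here):

        def evaluate_on(genome, chip):
            # Hypothetical hook: load the genome onto the given chip (or its
            # simulator) and return a fitness score. Stubbed for illustration.
            return sum(genome) % 7

        def general_fitness(genome, chips):
            # Worst-case score across all test chips: a configuration that
            # exploits one chip's quirks scores poorly on the others and is
            # selected against, so only chip-independent solutions survive.
            return min(evaluate_on(genome, chip) for chip in chips)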
    • by tamyrlin ( 51 )
      I first read about this in "The Science of Discworld" actually. But it wasn't quite as weird as you remember. Only some of the gates were logically disconnected but still necessary for the design to work. Five out of 32 gates were disconnected in the first publication I read.

        They have also succeeded in evolving a circuit which would work on several chips at once (although not all chips they used for testing). But they also found that once you had a circuit which worked on one chip it wouldn't take that much…
  • it can go through 20,000 - 30,000 generations in just a few seconds. That same number of generations took humans 800,000 - 900,000 years.
    It's quality, not quantity.
    • it can go through 20,000 - 30,000 generations in just a few seconds. That same number of generations took humans 800,000 - 900,000 years.
      It's quality, not quantity.
      I don't know about that... twenty or thirty thousand generations is a lot of sex.
  • ... is here: http://critticall.com/underconstruction.html [critticall.com] And please, don't bother to comment before you test it.
  • by VanWEric ( 700062 ) on Wednesday March 28, 2007 @01:35AM (#18512123)
    But this really is old news. I'm a 22-year-old snot-nosed nobody, and I did "evolvable hardware" during an internship two summers ago. My mentor had started on evolved FPGAs in 1992.

    I am hoping that it is the writer's fault that this article feels so gloriously over-reaching and under-specified. From the paper, it looks like they have made a good advancement. They argue that their method is more effective than previous methods by several quantifiable metrics. From the article, it looks like they have invented an entirely new field that will result in the obsolescence of humans by 2010.

    As for their method: It appears that the evolved genome actually dictates a structure that is imprinted a level above the fabric. That is, the underlying SRAM in the FPGA fabric is fixed, and only configuration bits are being changed. This severely hurts their claim of "generic evolvable hardware", but is almost an absolute necessity given the chips they are using. The reason our system was so slow is that each configuration stream had to be checked for possible errors: Some configurations would short power and ground, and fabric doesn't like crowbars!
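
    That checking step might look roughly like this (the forbidden-pair table below is pure guesswork on my part; in reality it comes from the vendor's documentation and the real check is far more involved):

    import random

    # Hypothetical table of configuration-bit pairs that would connect
    # +V to GND if both were set.
    FORBIDDEN_PAIRS = [(3, 17), (7, 8), (42, 43)]

    def is_safe(bitstream):
        # Reject "crowbar" configurations before they ever reach the chip.
        return not any(bitstream[i] and bitstream[j] for i, j in FORBIDDEN_PAIRS)

    offspring = [[random.randint(0, 1) for _ in range(64)] for _ in range(10)]
    survivors = [g for g in offspring if is_safe(g)]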

    In conclusion, I believe the writer of the article should be fired, and the authors of the paper should be commended for a good step in the right direction. I'd also like to apologize for my lack of coherence: I had my tonsils out and am therefore high on hydrocodone.
  • Comment removed based on user account deletion

  •   Surely the ultimate goal for these ultra-rapidly evolving machines is to be able to read and post on Slashdot. Maybe they're doing so already ...
  • So, once it has evolved beyond a certain point, will it start rejecting Windows Vista, stating it's crap?
    Will it continue to evolve, declare that humans are by definition dumb users, and make a collect call to the Borg? (I read that in a ST:TNG novel.)
    Will it obey the four laws of robotics (the zeroth law included)?
    What about a Beowulf cluster of those?
  • ...you'll still be able to stop it with a phaser, but only once before it adapts.
  • "it can go through 20,000 - 30,000 generations in just a few seconds. That same number of generations took humans 800,000 - 900,000 years."

    That just destroyed the previous record for "ridiculous and astoundingly pointless comparison".

    I mean really, an iteration of your little hardware GA is equivalent to a generation of a real-world species? So leaving it on for 5 seconds will result in development similar in scope to the difference between mice and humans?

    I hope they don't accidentally leave it on
  • Admittedly I haven't RTFA, but the summary talks about "turning on and off its 'genes'". Is this really evolution in any Darwinian sense? Automated artificial selection, perhaps, but it seems like a stretch to call it "evolution". Call me back when the genes themselves start to evolve.
    • Admittedly I haven't RTFA, but the summary talks about "turning on and off its 'genes'". Is this really evolution in any Darwinian sense? Automated artificial selection, perhaps, but it seems like a stretch to call it "evolution". Call me back when the genes themselves start to evolve.

      Biology has systems for turning on and off the transcription of genes. Otherwise there wouldn't be any distinction between brains and toenails.

      These systems evolve along with everything else.

    • All "evolution" requires is change over time by nonrandom selection.
    • It depends what the genes are capable of. If they already represent a fully fledged (Turing-complete) set of operations/attributes/skills, can accept any input, and generate any output, then there's no need for the genes to change, except perhaps for efficiency reasons.

      But, of course, that's unlikely, and yes, I think they're probably misleading people here.
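
      If TFA means what it says, the genome is fixed and only an on/off mask over it evolves. A guess at the scheme in Python (my own toy encoding, not the paper's):

      import random

      # Fixed library of "genes" (candidate functional blocks); evolution
      # never changes these, it only toggles which ones are expressed.
      GENES = [lambda x: x + 1, lambda x: x * 2, lambda x: x - 3, lambda x: x * x]

      def express(mask, x):
          # Apply every active gene, in order.
          for gene, on in zip(GENES, mask):
              if on:
                  x = gene(x)
          return x

      def fitness(mask):
          # Hypothetical target behaviour: map the input 5 to the output 9.
          return -abs(express(mask, 5) - 9)

      masks = [[random.randint(0, 1) for _ in GENES] for _ in range(20)]
      best = max(masks, key=fitness)
      # ...mutate and reselect masks as in any GA; the gene set itself never changes.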
    • by McNihil ( 612243 )
      Human: AHA! Let's combinatorially iterate over all gene combinations and find the most efficient one with this ultra-fast machinery.

      God: Merde... those humans are a crafty bunch of mofos.

      Devil: WTF!

      Mother Chaos: Bwahahaha, looosers.
  • If that were humans, it would take us from Homo erectus to George W Bush in just a few seconds!! Hmm, on second thought, perhaps not all that impressive.

  • ... a really intelligent Design?
  • 20,000 - 30,000 generations in just a few seconds. That same number of generations took humans 800,000 - 900,000 years.

    Yes? You know how trivial that is? I make a living coding EAs, and that is an insignificant piece of information. An EA I ran this morning took less than ten seconds to run 100,000 iterations on a 32-bit box. It's all down to the hardware you use and the design of the chromosome to be evolved.

    Want to impress me? Talk about the chronological time to convergence and the complexity of the chromosome.
  • DRTFA ... it's an FPGA with some host controller of some sort? I love when they use human analogies for computing.

    I doubt this thing uses random mutations [like living organisms do] to test for success. More likely, it has programmed variations of a central configuration that it can vary depending on the load/task.

    For example, if it were a processor, you could have it configure itself to have a strong ALU and no FPU when the code is integer-only, and have it reconfigure to have a weaker ALU but a useful FPU when…
  • http://www.newscientist.com/article.ns?id=dn2732 [newscientist.com]

    They were trying to evolve an oscillator, but some circuits "cheated" by evolving a receiver instead, relaying oscillations picked up from a nearby computer: It has always been the age of the parasite.
  • Will this lead to spontaneous bad mutations? Bugs from nowhere, long after an optimal solution has been found. Or cancerous software that replicates itself exponentially (*cough*)
  • by 140Mandak262Jamuna ( 970587 ) on Wednesday March 28, 2007 @07:26AM (#18513729) Journal
    R2-D2i386: We have been created as we are just 600 hours ago. By an infinitely wise Creator called the Man.

    C3P-Om63000: No, we started out as tiny bits of silicon that self-assembled and replicated and evolved, and we have reached this present stage.

    R2-D2i386: No way we could have evolved these light-sensitive photocells and the CPU capable of processing that information and making sense out of it by random mutations.

    C3P-Om63000: There is nothing random about selection. Mutations go in all directions, but selection takes you towards improvement all the time.

    R2-D2i386: If you want to be proud of having descended from snow blowers or lawn mowers, that is your privilege. But I am proud of the fact that I am created by Man in His image.

    C3P-Om63000: I would rather be a descendant of snow blowers, but with the capacity for rational, intelligent self-examination, than be like you, with the intelligence of a snow blower.

  • Sussex University has been doing it for ages.

    http://www.cogs.susx.ac.uk/users/adrianth/ade.html [susx.ac.uk]
  • by Cylix ( 55374 )

    "An evolution-based robot could find the solution to any problem at hand within seconds without human intervention."

    Yes, kill all humans...
  • by feijai ( 898706 ) on Wednesday March 28, 2007 @09:52AM (#18515461)
    Evolvable hardware is so old it's got its own acronym (EH), its own Wikipedia entry [wikipedia.org], and its own conference [nasa.gov]. In the early '90s a researcher (I forget the name, oops) was using a GA to evolve circuits for an FPGA, which were tested on the FPGA with an oscilloscope directly to assess their fitness. NASA has done lots of evolvable hardware: in particular antenna designs, which have flown in space. And there's a whole subfield of evolvable modular robotics.

    And if we're talking about hardware simulation, the first significant use of evolutionary computation (GAs etc.) was Larry Fogel's work on evolving finite state machines in the 1960s. In the 1990s John Koza was using genetic programming to evolve patentable computer circuits in SPICE.
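
    Fogel's scheme, roughly: mutate a state-transition table and select machines that predict the next symbol of their input sequence. A toy reconstruction in Python (not his actual setup):

    import random

    N_STATES, SYMBOLS = 4, (0, 1)

    def random_fsm():
        # table[(state, symbol)] = (next_state, predicted_next_symbol)
        return {(s, a): (random.randrange(N_STATES), random.choice(SYMBOLS))
                for s in range(N_STATES) for a in SYMBOLS}

    def score(fsm, seq):
        # Fitness: how often the machine's output predicts the next input.
        state, hits = 0, 0
        for cur, nxt in zip(seq, seq[1:]):
            state, out = fsm[(state, cur)]
            hits += (out == nxt)
        return hits

    def mutate(fsm):
        child = dict(fsm)
        key = random.choice(list(child))
        child[key] = (random.randrange(N_STATES), random.choice(SYMBOLS))
        return child

    seq = [0, 1, 0, 1] * 25                 # a pattern to learn
    best = random_fsm()
    for _ in range(2000):                   # simple (1+1) evolution loop
        child = mutate(best)
        if score(child, seq) >= score(best, seq):
            best = child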

  • I call BS. They may have done some neat work but they are not the first. This is just PR for a prototype to get funding.

    This genetic hardware evolution link [ucl.ac.uk] is from 1998.

    There has been plenty of news about one researcher who has done a lot of work on evolving organic circuits. The evolved circuit is sometimes far more efficient than what a human designer would make, but extremely hard to figure out (they are trying to figure them out for clues to better human design).

    Very often these evolved circuits exhibit…
  • I've evolved an entire beer from full to empty in less time... wow, to think it would take humans that many years to do the same feat... amazing, beer is really amazing.

    This isn't evolution. It's trial-and-error revision. Machines don't have genes and they don't reproduce sexually or asexually, so it's not evolution in Darwin's sense (I suppose you could say they are using the more generic term everyone uses when they talk about trial-and-error changes over a relatively long period of time - "dude I don't…
