Robotics / Hardware

The First Evolving Hardware?

Masq666 writes "A Norwegian team has made the first piece of hardware that uses evolution to change its design at runtime to solve the problem at hand in the most effective way. By turning on and off its 'genes' it can change the way it works, and it can go through 20,000 - 30,000 generations in just a few seconds. That same number of generations took humans 800,000 - 900,000 years." The University of Oslo press release linked from the article came out a few days ago; the researchers published a paper (PDF) that seems to be on this same technology at a conference last summer.
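For readers who haven't run into evolvable hardware before, the "genes" here are just configuration bits, and the evolution is an ordinary genetic algorithm driving a reconfigurable fabric. Below is a minimal Python sketch of that loop; the genome length, population size, mutation rate, and toy fitness function are illustrative assumptions and have nothing to do with the Oslo team's actual chip or encoding.

```python
import random

GENOME_LEN = 64          # pretend each bit switches one "gene" on or off
POP_SIZE = 50
MUTATION_RATE = 0.02
MAX_GENERATIONS = 20000  # the summary's "20,000 - 30,000 generations"

# Stand-in problem: evolve a configuration that matches a hidden target.
TARGET = [random.randint(0, 1) for _ in range(GENOME_LEN)]

def fitness(genome):
    """Toy fitness: number of configuration bits that behave as desired."""
    return sum(g == t for g, t in zip(genome, TARGET))

def mutate(genome):
    """Flip each bit with a small probability (a gene switching on or off)."""
    return [1 - b if random.random() < MUTATION_RATE else b for b in genome]

def crossover(a, b):
    """Single-point crossover of two parent genomes."""
    cut = random.randrange(1, GENOME_LEN)
    return a[:cut] + b[cut:]

population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
              for _ in range(POP_SIZE)]

for generation in range(MAX_GENERATIONS):
    population.sort(key=fitness, reverse=True)
    if fitness(population[0]) == GENOME_LEN:
        print(f"solved at generation {generation}")
        break
    # Keep the fittest half, refill the rest by breeding from the survivors.
    survivors = population[:POP_SIZE // 2]
    children = [mutate(crossover(random.choice(survivors),
                                 random.choice(survivors)))
                for _ in range(POP_SIZE - len(survivors))]
    population = survivors + children
```

On real evolvable hardware the fitness call would program the device and measure its actual behaviour rather than compare bits against a known target, which is precisely the expensive and error-prone part the commenters discuss below.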
  • by VanWEric ( 700062 ) on Wednesday March 28, 2007 @02:35AM (#18512123)
    But this really is old news. I'm a 22-year-old snot-nosed nobody and I did "evolvable hardware" during an internship two summers ago. My mentor had started on evolved FPGAs in 1992.

    I am hoping that it is the writer's fault that this article feels so gloriously over-reaching and under-specified. From the paper, it looks like they have made a good advancement. They argue that their method is more effective than previous methods by several quantifiable metrics. From the article, it looks like they have invented an entirely new field that will result in the obsolescence of humans by 2010.

    As for their method: It appears that the evolved genome actually dictates a structure that is imprinted a level above the fabric. That is, the underlying SRAM in the FPGA fabric is fixed, and only configuration bits are being changed. This severely hurts their claim of "generic evolvable hardware", but is almost an absolute necessity given the chips they are using. The reason our system was so slow is that each configuration stream had to be checked for possible errors: Some configurations would short power and ground, and fabric doesn't like crowbars!

    In conclusion, I believe the writer of the article should be fired, and the authors of the paper should be commended for a good step in the right direction. I'd also like to apologize for my lack of coherence: I had my tonsils out and I am therefore high on Hydrocodone.
  • by try_anything ( 880404 ) on Wednesday March 28, 2007 @02:36AM (#18512131)
    Replying to note that another Slashdotter [slashdot.org] has gone me one better and provided a link [sussex.ac.uk] to the story I remembered.
  • Re:GA in hardware (Score:5, Interesting)

    by Dachannien ( 617929 ) on Wednesday March 28, 2007 @07:01AM (#18513269)
    Yeah, it was pretty amazing. They mapped out a section of the circuit that the genetic algorithm came up with and found that when analyzed as a logic circuit, a large portion of the configured part of the FPGA should have had no effect on its behavior. When they cropped that section of the circuit out, though, the rest of it mysteriously stopped working.

    This was because the configured circuit operated a lot of the transistors in linear (i.e., non-saturated) mode, taking advantage of things like parasitic capacitances and induced currents. No sane human would operate an FPGA in this fashion, but since those little anomalies were present, the GA took advantage of them. That's a recurring theme in GA research: if you are running a GA on a simulation, for example, and you have a bug in your simulation code, it's fairly likely that the GA will find and exploit that bug instead of giving you a normal answer. See Karl Sims's research from 1994 for some amusing examples of this.

    Sadly, Xilinx discontinued that particular FPGA line a while back, so if you can't find some old leftovers of that part, you probably won't be able to recreate the experiments yourself (the research was originally done a decade or so ago). This is because that particular device had the advantage of being configurable in a random fashion without risk of burning it out due to things like +V to GND connections. Of course, Xilinx considers their programming interface to be proprietary, so I don't know that you'd be able to recreate that work even if you did manage to find the right part.
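Dachannien's point about evolution exploiting whatever quirks or bugs it can reach is easy to reproduce at toy scale. The sketch below is not Thompson's experiment or the Oslo system; the "simulator", the selection scheme, and the deliberately broken range check are all invented for illustration. The evolved "solution" ends up living entirely inside the bug.

```python
import random

def buggy_simulator(gain):
    """Meant to return tracking error for a controller gain in [0, 10];
    the intended optimum is gain = 0.5 with an error floor of 0.2.
    Bug: a missing range check treats any negative gain as zero error."""
    if gain < 0:
        return 0.0                      # <-- the loophole evolution will find
    return abs(gain - 0.5) + 0.2        # crude, well-behaved error model

def mutate(gain):
    """Gaussian perturbation of a candidate gain."""
    return gain + random.gauss(0, 0.5)

# Simple (mu + lambda)-style evolution: keep the 10 fittest, breed 20 children.
population = [random.uniform(0, 10) for _ in range(30)]
for generation in range(200):
    population.sort(key=buggy_simulator)            # lower error = fitter
    parents = population[:10]
    population = parents + [mutate(random.choice(parents)) for _ in range(20)]

best = min(population, key=buggy_simulator)
print(f"evolved gain: {best:.3f}, reported error: {buggy_simulator(best):.3f}")
# Typically prints a negative gain with a "perfect" error of 0.000: the GA
# converges on the bug, not on anything resembling a real controller.
```

The usual remedy in GA work is to make the evaluation environment at least as trustworthy as the answers you want from it: validate inputs, penalize out-of-range candidates, and spot-check the winners by hand.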
