The First Evolving Hardware?
Masq666 writes "A Norwegian team has made the first piece of hardware that uses evolution to change its design at runtime to solve the problem at hand in the most effective way. By turning on and off its 'genes' it can change the way it works, and it can go through 20,000 - 30,000 generations in just a few seconds. That same number of generations took humans 800,000 - 900,000 years." The University of Oslo press release linked from the article came out a few days ago; the researchers published a paper (PDF) that seems to be on this same technology at a conference last summer.
I'm so, so sorry... (Score:5, Funny)
God, I am so sorry, but it needed to be said...
Re: (Score:1)
Re:I'm so, so sorry... (Score:5, Funny)
Re:I'm so, so sorry... (Score:4, Insightful)
If anything demonstrates the mental retardation that afflicts Slashdot, it is the above.
Some idiot claims that a horrifically unfunny cliché needed to be repeated. Another person points out the falsity of that claim.
The first post is marked +5 Funny; and the second, -1 off-topic.
Just think about that for a second.
People, turn off your computers. Go outside. Breathe real air. Have sex. Get girlfriends. Stop posting on Slashdot and don't come back until you have gained the social skills and sense of humour possessed by any normal human being. Do it for me; do it for yourselves; do it for everyone.
ChameleonDave
Re:I'm so, so sorry... (Score:5, Funny)
(Incidentally, this article [everything2.com] tells us that Natalie Portman comments on Slashdot are "getting old... This Natalie Portman nonsense has been going on for months; it's not funny anymore." Note that the date is Oct 24 *2000*).
Re: (Score:2)
Man, am I getting old.
-l
Re: (Score:1)
slashdot comments are about +5 for "first posts" and "cliche posts". anything reflecting any nerd/geek movie/series will be rated up if quoted in a slashdotty manner. it's disgusting.
Re: (Score:2)
Wait, let me get this straight. You're bitching about other people missing jokes on slashdot, and saying "go outside, breathe air and get laid," and this is all in the same breath that you're calling someone else cliché?
Re: (Score:2)
Re: (Score:2)
Re: (Score:2, Insightful)
the reverse turing test ?
or
are you really dumb enough to be human test ?
Re: (Score:2)
A human generation is not 40 years. Even today most people reproduce BEFORE they are 40. Historically our ancestors reproduced in their teens and early 20's.
Re: (Score:2, Troll)
Mmmmm.... yeah, but it resulted in George Bush. If they're really serious about equivalent advantages, you could end up with an "evolved" CPU that tries to execute your software using a dialect of COBOL that is not only obsolete, but contains misspellings and incorrectly used operators... then when an error occurs, the system will insist on executing the operation anyway.
Actual George Bush quote:
Re: (Score:2)
Oh, we are the "retards", eh? Here are some more George Bush quotes. Please feel free to "explain" to us how these are all brilliant, deeply meaningful statements that we low-functioning, ignorant citizens are just "misunderstanding":
You teach a child to read, and he or her will be able to pass a literacy test.
- US President George W. Bush (2000?)
Reading is the basics for all learning.
- US President George W. Bush (Discussing his "Reading First" plan in Reston, Virginia, March 28, 2000)
Rarely is t
GA in hardware (Score:1)
Re:GA in hardware (Score:5, Informative)
Re:GA in hardware (Score:5, Informative)
My favourite bit:
Re:GA in hardware (Score:5, Interesting)
This was because the configured circuit operated a lot of the transistors in linear (i.e., non-saturated) mode, taking advantage of things like parasitic capacitances and induced currents. No sane human would operate an FPGA in this fashion, but since those little anomalies were present, the GA took advantage of them. That's a recurring theme in GA research: if you are running a GA on a simulation, for example, and you have a bug in your simulation code, it's fairly likely that the GA will find and exploit that bug instead of giving you a normal answer. See Karl Sims's research from 1994 for some amusing examples of this.
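The bug-exploitation behaviour described above is easy to reproduce even with a toy GA. Below is a minimal, self-contained sketch (all names and numbers are hypothetical): the fitness function is meant to reward matching a target bitstring, but a deliberately planted "simulator bug" scores any chromosome whose first bit is 0 above a perfect match, and the GA finds and exploits it instead of solving the real task.

```python
import random

TARGET = [1, 0, 1, 1, 0, 1, 0, 0]  # the "correct" answer we intended to evolve

def buggy_fitness(bits):
    # Intended behaviour: reward bits that match TARGET.
    # Deliberate "simulator bug": any chromosome starting with 0
    # outscores even a perfect match -- exactly the kind of flaw
    # a GA will latch onto instead of doing the real work.
    if bits[0] == 0:
        return len(TARGET) + 1
    return sum(b == t for b, t in zip(bits, TARGET))

def evolve(pop_size=50, generations=200, mut_rate=0.05):
    random.seed(0)  # deterministic for the sake of the example
    pop = [[random.randint(0, 1) for _ in TARGET] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=buggy_fitness, reverse=True)
        survivors = pop[: pop_size // 2]          # truncation selection
        children = []
        for _ in range(pop_size - len(survivors)):
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, len(TARGET))  # one-point crossover
            child = a[:cut] + b[cut:]
            child = [bit ^ (random.random() < mut_rate) for bit in child]
            children.append(child)
        pop = survivors + children
    return max(pop, key=buggy_fitness)

best = evolve()
print(best)  # exploits the first-bit bug rather than matching TARGET
```

Swap in an honest fitness function and the same loop converges on TARGET; the point is that the GA optimizes whatever you actually measure, not what you meant to measure.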
Sadly, Xilinx discontinued that particular FPGA line a while back, so if you can't find some old leftovers of that part, you probably won't be able to recreate the experiments yourself (the research was originally done a decade or so ago). This is because that particular device had the advantage of being configurable in a random fashion without risk of burning it out due to things like +V to GND connections. Of course, Xilinx considers their programming interface to be proprietary, so I don't know that you'd be able to recreate that work even if you did manage to find the right part.
They call these FPGAs (Score:2)
By using a GA to change the bitstream, you can have evolving hardware. If the GA is itself in the hardware then it is self evolving.
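That loop — GA on the outside, the programmed device acting as the fitness oracle — is structurally simple. A hedged sketch follows; `program_fpga` and `measure_response` are hypothetical stand-ins (here they just simulate a device that rewards alternating bits), whereas a real setup would invoke vendor programming tools and measure the physical chip.

```python
import random

BITSTREAM_LEN = 64  # toy size; real bitstreams run to megabits

def program_fpga(bitstream):
    """Hypothetical stand-in for a vendor programming interface.
    In this simulation the 'configured device' is just the bits."""
    return bitstream

def measure_response(device):
    """Hypothetical fitness oracle. A real rig would drive the chip
    and score its output; this one rewards alternating bits."""
    return sum(device[i] != device[i - 1] for i in range(1, len(device)))

def evolve_bitstream(pop_size=30, generations=100, mut_rate=0.02):
    random.seed(1)  # deterministic for the sake of the example
    pop = [[random.randint(0, 1) for _ in range(BITSTREAM_LEN)]
           for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=lambda b: measure_response(program_fpga(b)),
                        reverse=True)
        elite = scored[: pop_size // 3]           # keep the best third
        pop = elite + [
            [bit ^ (random.random() < mut_rate) for bit in random.choice(elite)]
            for _ in range(pop_size - len(elite))
        ]
    return max(pop, key=lambda b: measure_response(program_fpga(b)))
```

If the GA itself ran in another region of the same fabric, you would have the "self evolving" case the parent mentions; the loop is unchanged, only where it executes.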
Skynet. (Score:5, Insightful)
Re: (Score:1)
Re: (Score:2)
Been there, done that... (Score:4, Funny)
Re:Been there, done that... (Score:5, Funny)
What good is half a graphics card, anyway? (and keep your heathen comments about SLI for yourself, please)
Re: (Score:1)
Re: (Score:2)
These days? About a gig of ram and 32 cores.
It's the second... (Score:2)
Call me (Score:5, Funny)
Re:Call me (Score:5, Funny)
Re:Call me (Score:5, Funny)
Re: (Score:2, Funny)
Misleading (Score:5, Informative)
By turning on and off its 'genes' it can change the way it works, and it can go through 20,000 - 30,000 generations in just a few seconds. That same number of generations took humans 800,000 - 900,000 years.
In fact the simplest DNA based organisms/structures (bacteria, virii) have the shortest "life span". The number of generations per sec. isn't anything to brag about.
All complex organisms have some sort of lifespan longer than a microsecond. For a good reason: people pass on knowledge and adapt *during* their life span (not genetically of course, but our brain allows us to adapt a lot without such).
Hype aside, interesting development, but I wish those publications wouldn't use misleading statements in pale attempts to impress us.
Re: (Score:2)
In fact the simplest DNA based organisms/structures (bacteria, virii) have the shortest "life span". The number of generations per sec. isn't anything to brag about. ... Hype aside, interesting development, but I wish those publications wouldn't use misleading statements in pale attempts to impress us.
It isn't exactly misleading, but perhaps just an unfair comparison. Computers (and computer science) have one thing over nature in that the science is perfect:
Re: (Score:2)
Type "A-life" into Google and you will get a list of some very interesting experiments along this line.
What is interesting with the a-life experiments (beyond Conway's "Life") is trying to define the concepts you mention above, including "energy", "materials", and "lifespan". When you add competition for these res
Re: (Score:3)
Re: (Score:2)
Re: (Score:2, Insightful)
The proper English plural of "virus" is "viruses", and this is why. Words adopted into the English language generally retain the pluralisation from the donor language, barring a significant change in meaning (which is why beetles have antennae, but radios have antennas). However, "virus" in Latin is a stuff-word, not a thing-word, and therefore does not have a plural form. (If it did, it would be "viri" [one i; "-u
Re: (Score:2)
Let me guess.. You're single, right?
Let me guess.. you're trying to insult me or something? I'm not sure, it's just too cliche and void of point at the same time.
The perfect Slashdot comment (Score:2)
And upon reading this, the luddites screamed... (Score:1)
Computer Evolution??? (Score:5, Funny)
That's it... isn't it? It's all just an MCP trick!
Well, I still believe in and will fight for the users!!
Thanks,
Mike
Re: (Score:1, Offtopic)
captcha: presence
Re:Computer Evolution??? (Score:4, Insightful)
There is no content in it about the hardware, and it manages to deny creationism in the process of anthropomorphizing something it won't tell us anything about.
I'm sure this story will evolve past this though. It is in the genes.
Re: (Score:1)
I mean God doesn't own a computer, geez
Re: (Score:2)
Yeah, and because W stands for 6 in Hebrew, every time you go to a website you're paying homage to the evilness by typing 666.websitename.com
I'm not talking about god being involved in anything. I am talking about the purpose of the article/story is to say "creation doesn't exist but evolution does, we can prove it by this inanimate object that we will describe as living and give it as many animate like properties as possible without giving any facts about i
Re: (Score:2)
Are you new to the internet or something? This thing made its rounds several times in the past. I think sometime around 95 or so is when I first heard of it. It
Re: (Score:2)
And as you mentioned the doubleU (you=W) is really an extension to v to signify the way it sounded when certain words were pronounced? I have read so many things on this and thought about it for a while. I'm still not sure how to understand it all completely. But using the VV in place of the W seems to give the same sounds as you would expect with W.
Tron! (Score:1, Redundant)
Actually, we should be able to tag comments by reference, and then be able to pull up all the Tron (Or Trek, Or BSG, or Buckaroo Banzai) references that have ever appeared on slashdot.
Or maybe we should... erm... go do... you know, productive stuff.
I'm conflicted.
Futurama... (Score:2)
Robot Villager: You might as well ask how a Robot works.
Professor Hubert Farnsworth: It's all here, on the inside on your panel.
Robot Villager: [closes panel] I choose to believe what I was PROGRAMMED to believe!
Re: (Score:2)
10 SIN
20 GOTO HELL
Hardly new (Score:5, Insightful)
In the era of programmable logic chips that can alter their own logic (the patterns are stored in RAM or flash RAM for crying out loud), this isn't even that big of a revelation. Indeed, Transmeta has been doing stuff similar to this and selling it commercially for some time. They just aren't using these cool buzzwords.
And evolving architectures is something that I know has had some serious CS research since the early 1970's and perhaps even earlier. I don't think an idea like this is even patentable based on this earlier work in this area. I bet you could find some adaptive systems that were even built specifically for the oil industry, which would defeat even a narrow claim of that nature.
Where the money is to be made off this sort of technology is on Wall Street or in other financial markets. I even found a web page from a research group on adaptive systems that said essentially, "We have discontinued research along these lines and are now working with an investment firm on Wall Street. Since we have all become millionaires, we no longer need to support ourselves through this project, and any additional details would violate our NDAs." I'm not kidding here either. These guys from Norway are not thinking big enough here.
Re: (Score:2)
This is only true for the very small subset of designs that don't suffer from race conditions and other phenomena that hardware engineers regard as bugs. When you randomly flip gates in the design, you don't necessarily get valid digital logic.
Re:Hardly new (Score:5, Informative)
This is not just a very small subset of designs. It is a matter of cost and whether the engineer wants to put forth the effort to implement the whole thing in hardware. Trying to convert a first-person shooter game like Doom into pure TTL logic would make the game very responsive and give you screen resolution to kill for, but would it be worth the engineering effort to do that?
Race conditions and other "bugs" have other causes that may be due to ineptness on the part of the engineer, or because you haven't really thought the problem through sufficiently. Or there may be other things to look at as well. But don't tell me you can't implement in pure TTL logic something like an MPEG encoder.... which is a very complicated mathematical algorithm. I can give you part numbers for MPEG encoders if you really want them in your next design, as they are commercially available.
There is nothing that would stop you from implementing in hardware something like a neural network either... oh and those are indeed implemented in hardware. They are usually done in software mainly because of the cost involved, and you can use a general purpose computer to perform experiments on them. Other adaptive software algorithms have also been implemented on both hardware and software for some time as well. As I said, this is very old news here with this article.
Re: Hardly new (Score:3, Funny)
Re: (Score:1)
Re: (Score:2)
I dare you to show a current digital system that can't be abstracted in software.... or a current software algorithm that is written and running on actual computer equipment that also can't be duplicated in hardware that would exhibit identical behavior.
As is the case here with this "evolving hardware" demonstration that was put up by these two hackers from Oslo. They are claimin
Re: (Score:2)
I don't doubt that any software can be duplicated in hardware. Any real, running software is already implemented in hardware in some sense. I also accept that any hardware that can be described by digital logic can be duplicated in software. That leaves out an awful lot, though, including al
Re: (Score:2)
This would be something interesting, although I would have to point out that a compiler would produce just as much bloat in circuitry as happens in opcodes. In spite of some claims that a "hand-assembled" piece of software is less efficient than something run through a compiler, I have my extreme doubts about that concept. And moving down to a hardware level, I think you could improv
Re: (Score:2)
Yep, and on a Turing machine, or a neural network... It can even be implemented as a thousand people with pens and paper. That is not the point; what is important is how much time it will take on each of those architectures, and normally specific hardware is very fast.
Re: (Score:2, Insightful)
Nope - anything digital that can be implemented in hardware can be done in software.
Analog circuits can only be approximated in software, though with unlimited precision FP math the approximations can be pretty good (though slower).
There are reasons why we old luddites prefer vinyl (or a good mag tape) over digital - t
Re: (Score:2)
The classic example I love to use for analog computers is the firing computers for the Iowa-class battleships of the U.S. Navy. Originally built during WWII (and upgraded during the Korean War) these were some very remarkable targeting computers that had a very simple "user interface" and were deadly accurate. Made of finely machined curves that had been calculated
Lamarckism? (Score:2)
-matthew
Cool. (Score:2)
Re: (Score:2)
http://www.cmt.com/shows/dyn/dukes_of_hazzard/ser
There's an evolution joke involving Daisy Duke in here somewhere...
Not first, if my memory is correct (Score:3, Insightful)
Re: (Score:3, Interesting)
Re: (Score:1)
Re: (Score:2)
At the very least, it'll be general enough to work on all the manufacturers chosen.
Re: (Score:2)
They have also succeeded in evolving a circuit which would work on several chips at once (although not all chips they used for testing). But they also found that once you had a circuit which worked on one chip it wouldn't take that much
Yes but... (Score:2)
Re: Yes but... (Score:2)
The real thing ... (Score:1)
Not to be a KillJoy... (Score:5, Interesting)
I am hoping that it is the writer's fault that this article feels so gloriously over-reaching and under-specified. From the paper, it looks like they have made a good advancement. They argue that their method is more effective than previous methods by several quantifiable metrics. From the article, it looks like they have invented an entirely new field that will result in the obsolescence of humans by 2010.
As for their method: It appears that the evolved genome actually dictates a structure that is imprinted a level above the fabric. That is, the underlying SRAM in the FPGA fabric is fixed, and only configuration bits are being changed. This severely hurts their claim of "generic evolvable hardware", but is almost an absolute necessity given the chips they are using. The reason our system was so slow is that each configuration stream had to be checked for possible errors: Some configurations would short power and ground, and fabric doesn't like crowbars!
In conclusion, I believe the writer of the article should be fired, and the authors of the paper should be commended for a good step in the right direction. I'd also like to apologize for my lack of coherence: I had my tonsils out and I am therefore high on Hydrocodone.
Human obsolescence by 2010? (Score:1)
Re: (Score:1)
(Five is right out.)
Re: (Score:2)
Re: (Score:2)
Greetings to the Machina (Score:2)
Surely the ultimate goal for these ultra-rapidly evolving machines is to be able to read and post on Slashdot. Maybe they're doing so already
So will it reject Windows... (Score:2)
Will it continue to evolve and state that humans by definition are dumb users and go and make a collect call to the Borg (as I read in a ST:TNG novel)?
Will it obey the 4-laws of robotics (The zeroth law included)?
What about a Beowulf cluster of those?
Once it starts assimilating stuff around it... (Score:2)
Hey, nice! (Score:2)
That just destroyed the previous record for "ridiculous and astoundingly pointless comparison".
I mean really, an iteration of your little hardware GA is equivalent to a generation of a real-world species? So leaving it on for 5 seconds will result in development similar in scope to the difference between mice and humans?
I hope they don't accidentally leave it on
Activating genes is evolution? (Score:2)
Re: Activating genes is evolution? (Score:2)
Admittedly I haven't RTFA, but the summary talks about "turning on and off its 'genes'". Is this really evolution in any Darwinian sense? Automated artificial selection, perhaps, but it seems like a stretch to call it "evolution". Call me back when the genes themselves start to evolve.
Biology has systems for turning on and off the transcription of genes. Otherwise there wouldn't be any distinction between brains and toenails.
These systems evolve along with everything else.
Re: (Score:1)
Re: (Score:2)
But, of course, that's unlikely, and yes, I think they're probably misleading people here.
Re: (Score:1)
God: Merde... those humans are a crafty bunch of mofos.
Devil: WTF!
Mother Chaos: Bwahahaha, looosers.
Impressive (Score:2)
Isn't that just... (Score:1)
oh rlly? (Score:2)
Yes? You know how trivial that is? I make a living by coding EAs, and that is an insignificant piece of information. An EA I ran this morning took less than ten seconds to run 100,000 iterations on a 32-bit box. It's all down to the hardware you use and the design of the chromosome to be evolved.
Want to impress me? Talk about the chronological time to convergence and chromosome complexity.
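To make that point concrete, here is a throwaway benchmark (standard library only, all parameters hypothetical): a minimal (1+1) EA on the classic OneMax problem, timed at several chromosome lengths. Iterations per second fall as the chromosome grows, which is why a raw generation count says little on its own.

```python
import random
import time

def one_plus_one_ea(length, iterations, mut_rate=None):
    """Minimal (1+1) EA on OneMax: mutate one parent, keep the
    mutant only if it scores at least as well."""
    if mut_rate is None:
        mut_rate = 1.0 / length        # the usual per-bit flip rate
    parent = [random.randint(0, 1) for _ in range(length)]
    best = sum(parent)
    for _ in range(iterations):
        child = [b ^ (random.random() < mut_rate) for b in parent]
        score = sum(child)
        if score >= best:
            parent, best = child, score
    return best

# Chromosome complexity dominates wall time for a fixed iteration budget.
for length in (16, 256, 4096):
    start = time.perf_counter()
    one_plus_one_ea(length, iterations=2000)
    elapsed = time.perf_counter() - start
    print(f"len={length:5d}  2000 iterations in {elapsed:.3f}s")
```

The same iteration count costs roughly 256x more work at length 4096 than at length 16, so "N generations per second" is meaningless without the chromosome size and the cost of one fitness evaluation.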
let me guess (Score:2)
I doubt this thing uses random mutations [like living organisms do] to test out for success. Likely, it has programmed variations of a central configuration that it can vary depending on the load/task.
For example, if it was a processor, you could have it configure itself to have a strong ALU and no FPU when the code is only integer, have it reconfigure to have a weaker ALU but a useful FPU when
Heres some evolving hardware from a few years back (Score:1)
They were trying to evolve an oscillator, but some circuits "cheated" by evolving a receiver instead, outputting oscillations picked up from a nearby computer: It has always been the age of the parasite.
Argh (Score:2)
A predicted conversation in the future ... (Score:3, Funny)
C3P-Om63000: No we started out as tiny bits of silicon that self assembled and replicated and evolved and we have reached this present stage.
R2-D2i386: No way we could have evolved these light sensitive photocells and the CPU capable of processing that information and making sense out of it by random mutations.
C3P-Om63000: There is nothing random about selection. Mutations go in all directions but selection takes you towards improvement all the time.
R2-D2i386: If you want to be proud of having descended from snow blowers or lawn mowers, that is your privilege. But I am proud of the fact that I am created by Man in His image.
C3P-Om63000: I would rather be a descendant of snow blowers, but with capacity for rational intelligent self examination, than be like you, with the intelligence of a snow blower.
First? No. (Score:2)
http://www.cogs.susx.ac.uk/users/adrianth/ade.htm
Hrmmm (Score:2)
"An evolution-based robot could find the solution to any problem at hand within seconds without human intervention."
Yes, kill all humans...
Evolvable Hardware is *very* old news (Score:3, Informative)
And if we're talking about hardware simulation, the first significant use of evolutionary computation (GAs etc.) was Larry Fogel's work on evolving finite state machines in the 1960s. In the 1990s John Koza was using genetic programming to evolve patentable computer circuits in SPICE.
They are NOT the first (Score:2)
This genetic hardware evolution link [ucl.ac.uk] is from 1998.
There has been plenty of news about one researcher who has done a lot of work on evolving organic circuits. The evolved circuit is sometimes far more efficient than what a human designer would make but extremely hard to figure out (they are trying to figure them out for clues to better human design).
Very often these evolved circuits exhibit
Number of evolutions per minute.... (Score:2)
This isn't evolution. It's trial and error revision. Machines don't have genes and they don't reproduce sexually or asexually, so it's not evolution in Darwin's sense (I suppose you could say they are using the more generic term that everyone uses when they talk about trial and error changes over a relatively long period of time - "dude I don't
Re: (Score:2)
evolution & ID (Score:1)