Teaching Robots New Tricks Without Programming
cylonlover writes "Maya Cakmak, a researcher from Georgia Tech, spent the summer at Willow Garage creating a user-friendly system that teaches the PR2 robot simple tasks. The kicker is that it doesn't require any traditional programming skills whatsoever – it works by physically guiding the robot's arms while giving it verbal commands. After inviting regular people to give it a try, she found that with few instructions they were able to teach the PR2 how to retrieve medicine from a cabinet and fold a t-shirt."
The Program is Right There in the Article. (Score:5, Insightful)
"Without programming?"
Bullshit. Look at the picture in the article.
Program's right there, on the right side. [gizmag.com]
"Test subjects were provided instructions on how to teach the robot similar to what you'd expect when buying a sophisticated appliance."
"Tutorial: Programming PR2 by Demonstration."
"Step 1... Say: 'TEST MICROPHONE'."
"Step 2... Say: 'RELEASE RIGHT ARM.' ... Move the arm to a neutral pose and say HOLD RIGHT ARM."
If this isn't programming, then I'm not a programmer. Instead, I'm just someone who manipulates a text editor.
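For what it's worth, those spoken steps map onto an ordinary command dispatcher pretty directly. Here's a minimal sketch of the idea in Python; the Pr2Arms class and the command handling are made up for illustration, not Willow Garage's actual interface:

```python
# Hypothetical sketch: the spoken "tutorial" is really a tiny command language.
# Pr2Arms and the command names are invented for illustration; the real PR2
# interface (ROS topics/actions) is not shown here.

class Pr2Arms:
    def release(self, side):
        print("%s arm is now compliant; move it by hand" % side)

    def hold(self, side):
        print("%s arm is now stiff; current pose recorded" % side)

def dispatch(utterance, arms):
    """Map a recognized utterance onto an arm action."""
    words = utterance.upper().split()
    if words[:1] == ["RELEASE"]:
        arms.release(words[1])          # e.g. "RELEASE RIGHT ARM"
    elif words[:1] == ["HOLD"]:
        arms.hold(words[1])             # e.g. "HOLD RIGHT ARM"
    elif utterance.upper() == "TEST MICROPHONE":
        print("microphone OK")
    else:
        print("unrecognized command: %r" % utterance)

if __name__ == "__main__":
    arms = Pr2Arms()
    for cmd in ["TEST MICROPHONE", "RELEASE RIGHT ARM", "HOLD RIGHT ARM"]:
        dispatch(cmd, arms)
```

A fixed vocabulary, a defined sequence of operations, state that persists between commands: call it what you like, it's a program.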
Re: (Score:2)
Getting a robot to write its own program has been done for a long time (10 years at least). This isn't even a new way of doing it; they're just using this guy's code: http://www.eejournal.com/archives/fresh-bytes/baxter-is-the-humanoid-robot-you-can-teach-without-programming/ [eejournal.com]
Can Baxter (feminine of Baker) make Twinkies?
Re: (Score:2, Insightful)
The point is that this is an advancement in Human Robot Interaction, not that they did away with programming. I blame the article and slashdot for this misleading premise.
Re: (Score:2)
The difference is quite clear if you look at the extremes; this is like asking what the difference is between white and black when we are observing shades of grey.
So the difference, teacher vs. programmer, is your level of expectation toward your student / compiler.
There is a joke that goes "this damn compiler is doing exactly what I tell it to instead of what I want it to."
With a student, you do expect them to get what you mean instead of what you say.
But of course, t
Re: (Score:3)
"Step 1... Say: 'TEST MICROPHONE'."
Step 2... Calm down, say 'Test Microphone' again.
Response (Score:1)
First (Score:1)
It's still programming (Score:5, Insightful)
It's just the method that has changed.
Corrected title... (Score:5, Insightful)
Teaching Robots New Tricks With Non-Traditional Programming
There, fixed that for you.
Re:GOOD! (Score:4, Insightful)
Isn't that still an important and useful qualification, since the vast, vast majority of people can't, shouldn't and don't fucking want to write code? And, I would argue, they shouldn't ever need to. Writing in arbitrary invented languages, with awkward syntax and extremely non-human thought structures, to accomplish esoteric tasks has never been an intuitive or optimal way of getting shit done.
Trust me, everybody would loooooooove for the computer to take instructions like a human, but it's not going to happen because of everything that's implicitly understood. So you can teach this computer to fold a shirt, but if you hand it an XS shirt and an XXL shirt, will it figure out that it must adapt the folding action to the size of the shirt? I bet any 5yo would figure that out all on their own, because they've understood the basic concept of folding a shirt. Take a fundamental sentence like "put the black and white pants on the top shelf": did we mean the black pants and the white pants, or the black and white checkered pants?
All that happens is that some really smart people will try really hard to write code that guesses what people actually meant, but without actually knowing the context and purpose they'll fail miserably. Not to mention all the times they'd have to guess at "do what I meant, not what I said", because normal people, when facing a choice between the reasonable and the absurd, pick the reasonable. Like, say you have a knife and a chicken, and you ask what to do with the knife, and they answer "Cut the chicken to pieces and put it in the oven": most people will understand that you're to put the chicken in the oven, not the knife, even though you didn't ask what to do with the chicken.
Or the TL;DR version: Good luck, I don't think we'll be unemployed any time soon.
Re: (Score:3)
So... how did that 5yo come to "implicitly understand" so much that you never had to write code to teach her how to adapt the folding action to the size of the shirt? DNA defines how to grow a brain, not really how it will understand the world it encounters, how it will respond to that world, or the methods of thought internally used to process either of those things. Is there really any reason why artificial creatures shouldn't follow biology's lead in the whole "learning" thing?
Actually, I'd say how much the brain is "preprogrammed" by DNA is a pretty complex question. Clearly all the inputs like sight, hearing, taste, smell and touch are hooked up in some fashion with some form of processing; some basic output like crying, a lot of reflexes and instincts, and possibly also knowledge are considered innate; and studies on twins vs. siblings vs. half-siblings vs. adopted children have shown considerable correlation on "how it will understand the world it encounters, how it will respond to that
Yes, that's the hard problem (Score:2)
Trust me, everybody would loooooooove for the computer to take instructions like a human, but it's not going to happen because of everything that's implicitly understood. So you can teach this computer to fold a shirt, but if you hand it an XS shirt and an XXL shirt, will it figure out that it must adapt the folding action to the size of the shirt?
Yes, that's the hard problem in learning from demonstration: working back from the demonstration to a model which can be generalized to new tasks. One way to approach this is by doing the same task with variations: guide the robot through folding various different shirts, and then use a machine learning system to separate the commonalities from the differences. There's been some progress in recent years in making this work. It's not very powerful yet, but it's getting to be good enough for teaching assembly tasks.
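A minimal sketch of that separate-the-commonalities idea, assuming the demonstrations are recorded as joint-space trajectories. This is an illustration, not the system from the article; resampling plus a per-waypoint mean and variance stands in for a real statistical model (e.g. a GMM or dynamic movement primitives):

```python
import numpy as np

def resample(traj, n=50):
    """Linearly resample a (T, D) joint-space trajectory to n waypoints."""
    traj = np.asarray(traj, dtype=float)
    t_old = np.linspace(0.0, 1.0, len(traj))
    t_new = np.linspace(0.0, 1.0, n)
    return np.column_stack(
        [np.interp(t_new, t_old, traj[:, d]) for d in range(traj.shape[1])]
    )

def generalize(demos, n=50):
    """Separate commonalities (mean path) from differences (per-waypoint variance)."""
    aligned = np.stack([resample(d, n) for d in demos])   # (num_demos, n, D)
    mean_path = aligned.mean(axis=0)    # what every demonstration shares
    variance = aligned.var(axis=0)      # where demos disagree (e.g. shirt size)
    return mean_path, variance

# Toy example: three "fold" demos of different lengths in a 2-joint space.
demos = [np.random.rand(t, 2) for t in (40, 55, 70)]
mean_path, variance = generalize(demos)
print(mean_path.shape, variance.shape)  # (50, 2) (50, 2)
```

The high-variance waypoints are exactly the parts the robot would have to adapt to a new shirt, which is why you need multiple demonstrations rather than one.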
Fold a shirt? (Score:1)
That's an amazingly complicated task. If the robot can be taught to do that, that's a pretty advanced robot. I wonder how anybody can teach a robot to fold a t-shirt unless you have a load of constraints on movements. In which case, you'd be better off folding your own t-shirt.
Re:Fold a shirt? (Score:4, Funny)
Re: (Score:2)
That's an amazingly complicated task.
http://www.youtube.com/watch?v=b5AWQ5aBjgE [youtube.com]
Re: (Score:2)
Good point. What is this "Fold a T-Shirt" task of which you speak?
Nao (Score:1)
The Nao robot from French company Aldebaran Robotics does that already, like it's last year's news and there's no tomorrow, Charlie. It has been known in robotics as Geppetto programming... but then again, if it doesn't come from "some_American_academic_institution", it didn't happen. Also, the Battle of Seattle in 1999 started the current world-wide revolution... right? Or at least something along those lines.
Great, only need a sexy skin now (Score:1)
then tons of slashdot people will want to program it to be their virtual girlfriend... I am sorry I don't have a car analogy for that yet :)
Re: (Score:1)
For a Second There (Score:2, Funny)
"Hey Robot, look at you just sitting there! It's because you don't have any programming! I'm going to sharpie a penis on your case! Ooh! Don't like that? If you had some programming you could do something about it! And you'd have the ability to not like it!"
Oh dear, now someone's probably going to arrest me for cyber-bullying...
We already have these things... (Score:2)
I'm not sure what the big deal is. We have these things already. We call them "infants" and even half-witted, mouth-breathing hillbillies can make them from the time they're about 12.
Agreed, the only innovation is the speech (Score:2)
Yeah, common practice.
It's pretty easy to teach points by running external software that looks for a location deviation of 0.001 on an axis, and then moves the robot in that direction repeatedly until there's no longer a deviation vs. where the servo thinks it should be on that axis.
I never got around to more than a test program to validate the idea for the Google Touchbot, but it's quite common practice in the industry to do that sort of thing with Toshiba CA-100 and similar robot controllers. All the c
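A rough sketch of that deviation-chasing loop, using the 0.001 threshold from above. The controller calls here are placeholders, not the actual CA-100 (or any other vendor's) API; only the follow-the-push idea comes from the comment:

```python
# Hypothetical lead-through teaching loop. read_actual()/read_commanded()/jog()
# are placeholders for whatever a real robot controller exposes.

DEADBAND = 0.001   # per-axis deviation that counts as "the operator pushed me"
STEP = 0.001       # how far to follow the push per cycle

def teach_cycle(controller, axes=("x", "y", "z")):
    """Run one follow-the-push cycle; return True if any axis moved."""
    moved = False
    for axis in axes:
        deviation = controller.read_actual(axis) - controller.read_commanded(axis)
        if abs(deviation) >= DEADBAND:
            # Nudge the commanded position toward where the arm actually is.
            controller.jog(axis, STEP if deviation > 0 else -STEP)
            moved = True
    return moved

# The caller would loop teach_cycle() and record a taught point whenever it
# returns False, i.e. the servo and the operator agree on the position again.
```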
"Repeat for one to one mil...one hundred billion" (Score:1)
(Guides robot arm) "Program Move Name 'Jerk'."
"What are you doing in there?"
"Nothing!"