
Denver Couple Unveils Homemade Service Robot

An anonymous reader writes "Jim & Louise Gunderson, owners of a Denver-based computer software tool development company, have finally unveiled their autonomous robot, Basil. Basil is completely home built, runs Linux with some instructions in Java, uses a sonar-based 'reification' logic system, and can go get you a beer or a pot of tea. Quoting: 'The plan is this: The Gundersons will ask Basil to go to the bar, request a couple of stouts from the bartender, and then, once they're placed on the titanium tray perched on his head, bring them back to his creators. They haven't told him how to do this — there's no set script in his processors that tells him to roll a certain distance southwest, speak a certain command, then come back. He'll have to figure it all out on his own, using a basic knowledge of bars and beers and so on, reasoning skills and an ability to understand certain parts of the world. When his sonars capture the image of a person, for example, he knows it's a person, not just a nameless object to be avoided. And he knows that, in this case, that person wants a beer.'"
  • FAAAAAKKKEE (Score:5, Insightful)

    by BadAnalogyGuy ( 945258 ) <BadAnalogyGuy@gmail.com> on Sunday December 21, 2008 @12:03PM (#26191971)

    "I recognize a person 69cm away"
    "I recognize a wooden chair"

    Right. Using sonar, the robot is able to determine the composition of the chair.

    Given that the robot's speech patterns are not broken at all, and that it speaks in complete sentences, it seems more likely that this is a blinkenlights contraption with a very human person controlling it the whole time.

    • Re:FAAAAAKKKEE (Score:5, Insightful)

      by Baron_Yam ( 643147 ) on Sunday December 21, 2008 @12:19PM (#26192069)

      I won't disagree that it's fake, but I expect the sonar return is qualitatively affected by the type of surface it hits.

      Even my human ears can tell the difference between some types of wall coverings based on ambient sound reflections.

      In short, I'd want an expert in sonar to call bullshit on this one before I definitively choose sides.

      • Differences in the sound reflected in your example are because of differences in frequency in the originating sound. The sound rangefinding they're using operates at one specific frequency, so the returns are going to be pretty darn close regardless of material. Unless they're also using laser rangefinding to compare against, I don't think there's any way to distinguish what sort of material an object is made of with just sound.
        • The amount of sound reflected would vary depending on the material it bounced off. Wooden chairs are quieter than metal chairs at any frequency. In principle I see no reason why single-frequency sonar couldn't tell the difference between a typical metal chair and a typical wooden chair. The time gives the range and the intensity of the return gives a hint about its composition. I'm sure it's just trained on the chairs it knows and has pretty limited experience, so it would get confused by chairs which were…
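          A minimal sketch of that range-plus-intensity idea, assuming a hypothetical single-frequency sensor that reports both time-of-flight range and echo intensity (the class name, thresholds, and labels below are invented for illustration, not anything from the article):

            // Sketch: guess at a surface material from one ping. Range comes
            // from time-of-flight; the echo strength hints at composition.
            // All numbers here are made up; real returns are far noisier.
            class PingClassifier {
                static String guessMaterial(double rangeMeters, double echoIntensity) {
                    // Compensate for spreading loss so intensities at different
                    // ranges are comparable before thresholding.
                    double normalized = echoIntensity * rangeMeters * rangeMeters;
                    if (normalized > 0.8) return "metal";  // strong, hard reflection
                    if (normalized > 0.4) return "wood";   // moderate reflection
                    return "fabric";                       // soft, absorbing surface
                }
            }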

        • NOOOOT FAAAAAKE! (Score:3, Interesting)

          by d3ac0n ( 715594 )

          I don't think there's any way to distinguish what sort of material an object is made of with just sound.

          Modern military-grade sonar can EASILY tell materials apart just by the quality of the sound bounceback. So can whales, dolphins, bats and pretty much any creature with ears, including humans.

          Try this: Walk into an empty room with sheetrock walls and a wood floor and clap your hands. Now do it in a similar room with a tile floor and wood paneling on the walls. Now an all-concrete cinderblock room. You will notice…

      • Re:FAAAAAKKKEE (Score:5, Informative)

        by DriedClexler ( 814907 ) on Sunday December 21, 2008 @03:37PM (#26193523)

        Even my human ears can tell the difference between some types of wall coverings based on ambient sound reflections.

        Oh, there's a lot more potential for you than that. Humans can actually be trained in echolocation [wikipedia.org]. Blind people even pick it up on their own, thinking they're sensing things with their face, which is why it's been called "facial vision".

      • Re: (Score:1, Insightful)

        by davester666 ( 731373 )

        Evidently, my mind is in the gutter, as when I read "service robot", I did not think of the kind of service that the summary actually discusses.

      • by TheLink ( 130905 )

        "I'd want an expert in sonar to call bullshit on this one"

        Like a dolphin? They can most certainly tell.

        The dolphin could give you an ultrasound and see if something is seriously wrong with your insides...

        Or, pick a choice internal organ to ram and damage:

        http://www.telegraph.co.uk/earth/earthnews/3323070/Killer-dolphins-baffle-marine-experts.html [telegraph.co.uk]

        Our sonar of course is a lot crappier.

      • by fugue ( 4373 )
        Sailors and fishermen sometimes prefer analog sonar depth sounders to the digital ones because the analog ones show a qualitative picture of the shape of the bounce, rather than just its peak. This means they can help you guess what the seabed is made of -- the bounce from mud looks different from that from stones, from sand, from a shoal of fish. And the digital ones are getting more sophisticated, giving more information than just the peak of the reflection, allowing them to make those guesses…
    • Re: (Score:3, Insightful)

      by Zironic ( 1112127 )

      From reading the article, it seems the robot thinks every object with four legs and a straight back is a wooden chair, and all the voices are probably prerecorded. It's not like it can invent new abstract objects on its own.

    • Re: (Score:3, Insightful)

      by juiceboxfan ( 990017 )

      Right. Using sonar, the robot is able to determine the composition of the chair.

      That's a bit cynical. While it's unlikely this thing is as autonomous as they would like us to believe, there may be an explanation for the "detailed" description of the objects. Perhaps it was taught that an object of that height/width is a "wooden chair". And, much as a young child will run around and point at any small animal and say "doggy!" no matter what type of animal it is, anything about that size and shape is recognized as a "wooden chair".

      Without more information it's hard to say for sure.

    • Re: (Score:3, Interesting)

      I know the people involved. They're not fraudsters.

    • Anybody peddling an autonomous general-purpose domestic robot is a snake oil salesman. We are at least 20 years away from that (i.e. never).

      The DARPA Challenge has been a major step forward, but those robots only do one "simple" task. They did not have to deal with any mechanical challenges. They just added sensors and software to an already highly developed (over 100 years of refinement) mechanical system.

      Domestic automation will continue as it has done for the last few decades, with the development of cost-effective…
      • I thought the point wasn't that a robot servant would have attachments to cut the grass or clean clothes, but that it could make use of the same tools that humans use for these things. A robot with decent legs, arms, and appropriate sensors (for balance, vision, grip, etc) is all you need.

        The only thing missing is software, but make the robot's software expandable with plugins or firmware updates and it would count as general purpose in my book. It doesn't have to actually do everything…

        • A robot with decent legs, arms, and appropriate sensors (for balance, vision, grip, etc) is all you need

          I agree with you. That is all you need. But we don't have it. They have not built a "humanoid" robot, but a very crude machine. So I am not impressed. In my opinion, the mechanical engineering problem is much harder than the software problem. New materials and actuators are required, and I don't see them coming any time soon. So we will be stuck with incremental improvements to the machines we currently…
      • From what I see, this prototype isn't very multipurpose. It seems just to be a table that moves itself and figures out tasks based on environment instead of following a set of instructions (if things are as they are portrayed).

        I don't think there is any claim that this device is going to be doing everything tomorrow, just that it can approach tasks within its featureset (move things around) with the AI equivalent of an 'open mind' that adapts to changing environments.
        • I don't think there is any claim that this device is going to be doing everything tomorrow.

          The headline of the article is:

          "The Gundersons get us ready for Basil, the robot of our dreams"

          Quite a claim, really, but an "intelligent" table is not the robot of anyone's dreams.
          • I would say it's quite disingenuous to call that a "claim"; that's just an eye-catching, hyperbolic title, no doubt originating only in the mind of the author of the article. It seems that you are irrationally predisposed to be hostile toward this thing, and can't be bothered to separate the developers from the lens of the press that you're necessarily viewing them through.
    • Re:FAAAAAKKKEE (Score:4, Informative)

      by tachyonflow ( 539926 ) * on Sunday December 21, 2008 @05:17PM (#26194487) Homepage

      I saw a demonstration of Basil earlier this month at the event mentioned in the article, and the Gundersons explained some of the technology and what they are trying to accomplish.

      There is nothing special about the sonar -- it's just a simple, low-bitrate input scheme. The Gundersons are trying to solve the problems of environment perception with a cognitive model instead of throwing horsepower at interpreting the input in fine detail, as computer vision or perhaps some sort of advanced sonar would. The robot manages an internal model of its environment and compares the input to its expectations instead of continually trying to reconstruct a scene. Perhaps it distinguishes a chair from a person using clues (a chair doesn't move on its own, for instance).
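      As a rough sketch of that expectation-matching idea (the class and method names here are invented, not Basil's actual code):

        import java.util.HashMap;
        import java.util.Map;

        // Sketch: keep a model of where known objects should be, and only
        // treat a reading as news if it disagrees with the expectation.
        class WorldModel {
            private final Map<String, double[]> expected = new HashMap<>(); // label -> {x, y}

            void remember(String label, double x, double y) {
                expected.put(label, new double[] { x, y });
            }

            // True if a sonar contact near (x, y) matches something we already
            // believe is there -- no scene reconstruction required.
            boolean matchesExpectation(double x, double y, double tolerance) {
                for (double[] pos : expected.values()) {
                    if (Math.hypot(pos[0] - x, pos[1] - y) < tolerance) return true;
                }
                return false;
            }
        }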

      • So you are saying they have entered the environmental data (i.e. room dimensions, furniture types and expected positions, etc.). Then they know roughly what to expect and adapt from that baseline. That would simplify the task. A valid approach.
      • by jelle ( 14827 )

        So... In that case the robot is going to get utterly confused when somebody moves a chair while the robot is looking the other way, or when a person keeps standing in the same spot... or when a person pushes another person aside...

        And it will think that it's a chair, while it's actually grandpa sleeping _in_ a chair... Unless you first get the builders of the 'robot' to show it grandpa in a chair from all angles (see the article...) and tell it that that is grandpa sleeping in a chair... and it will…

        • Re: (Score:3, Interesting)

          It seems to me that you have missed (what I believe is) the point. The robot has an initial, object-based, dimensionally-limited/understood model of its environment. If somebody moves a chair when it's not in the sensory view of the robot, the robot isn't going to get confused; it's just going to process the basics of the space (such as the walls not moving), see that a previous element in that space is now not there, delete that object from its model, and add the same object back in its new location. A robot doesn't…
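          A sketch of that delete-and-re-add update step might look like this (purely illustrative, with invented names, not the robot's real code):

            import java.util.Map;

            // Sketch: if a known object is no longer where the model expects
            // it, drop the stale belief and re-add the object at the newly
            // observed position, rather than getting "confused".
            class ModelReconciler {
                static void reconcile(Map<String, double[]> model, String label,
                                      double[] observed, double tolerance) {
                    double[] expected = model.get(label);
                    if (expected == null
                            || Math.hypot(observed[0] - expected[0],
                                          observed[1] - expected[1]) > tolerance) {
                        model.remove(label);        // delete the old entry, if any
                        model.put(label, observed); // same object, new location
                    }
                }
            }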
    • by coaxial ( 28297 )

      I doubt that it's actually identifying the composition and shape. It's most likely a big case statement. Sonar comes back with some profile and the case statement contains stuff like:

      "person"
      "wooden chair"
      "wall"

      If you stick in something roughly the same size and shape as the chair, like a metal chair or even a box, it will also think that's a wooden chair. So yeah, it's sort of faked, just not in the way you're thinking.

      • The process is a little more complex, but if there are two objects that appear identical to the sonars, the robot cannot tell them apart, any more than you could distinguish between two objects that look identical to you. We based the design on the mechanisms that biological systems use for recognition and preafference, so it has the same basic characteristics, and the same failure modes.
        • by coaxial ( 28297 )

          My point was that "wooden chair" is simply a label. The machine has no idea that it's two words, nor the meaning of the words. It could have been called "battleship" and the machine would be humming along identically.

          It's not doing material analysis like some here want to believe.

          I was thinking you're running some sort of multiclass classification system. Originally I was thinking you were doing something like k-means, but then I realized that with the noisy environment, you're probably doing something like…

          • Semantic Tags (Score:2, Informative)

            by jgunders ( 1436953 )
            Yes, 'wooden-chair' is a label. When the robot is mapping from the sensor domain to the semantic domain, the result of the recognition is the label. So any label would do. Once the semantic tag is selected, along with the position and pose of the object, it is added to the 'mental model'; the robot keeps track of the things that it has identified, and where they are. If Basil stopped here, as you said, any label would do.

            However, when the robot is given a goal ("deliver tea to the conference-table-area") the mental…
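            Based on the description above, the goal-handling step might look roughly like this (a sketch with invented names; only the "semantic tag plus position in a mental model" idea comes from the comment):

              import java.util.List;
              import java.util.Map;

              // Sketch: once percepts carry semantic tags, a goal like
              // "deliver tea to the conference-table-area" becomes a lookup
              // in the mental model rather than raw signal processing.
              class MentalModel {
                  // tag -> known positions of objects carrying that tag
                  private final Map<String, List<double[]>> tagged;

                  MentalModel(Map<String, List<double[]>> tagged) {
                      this.tagged = tagged;
                  }

                  // Resolve a goal's target tag to a concrete place to go.
                  double[] resolveGoalTarget(String semanticTag) {
                      List<double[]> hits = tagged.get(semanticTag);
                      return (hits == null || hits.isEmpty()) ? null : hits.get(0);
                  }
              }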

    • Re: (Score:1, Informative)

      by Anonymous Coward

      Sorry about the confusion, but you are absolutely correct, the sonars are not classifying the material. In the lab there are several different types of chairs, and Basil has constructed sensor-based models of most of them. The two main types are the four-legged wooden chairs and the wheeled office chairs. Basil uses the sonar patterns (Percepts in our terms) to distinguish between these, based (primarily) on the different shape of the legs. The semantic tags attached to the percepts are 'wooden-chair' and…

    • I won't say that it is fake, because it's very easy to generalize about the composition of a chair of a certain shape. Make that shape out of metal and you will fool the robot; however, that's a long way to go to fool a stupid robot!
    • Re: (Score:2, Interesting)

      by Greyhart ( 1437153 )
      As one of the Friends of Basil (the people helping to build him) I can assure you that it is not fake. Yes, Basil was taught that a certain sonar return equals a wooden chair, and another sonar return equals a person. Remember, he has 12 sonars to work with. If he gets the return from those 12 sonars for the wooden chair, he calls it such. If he gets a return that is similar, he will say that it could be a wooden chair, or it could be something else. If he's not sure, he would have to orbit the unknown object…
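      The "orbit if unsure" behavior described here could be sketched like so (thresholds and names are invented; only the 12-sonar matching and the orbit fallback come from the comment):

        // Sketch: compare the 12 sonar ranges against a stored signature;
        // answer confidently, hedge, or go look from another angle.
        class SonarSignature {
            static double similarity(double[] reading, double[] stored) {
                double err = 0;
                for (int i = 0; i < 12; i++) {
                    err += Math.abs(reading[i] - stored[i]);
                }
                return 1.0 / (1.0 + err); // 1.0 means a perfect match
            }

            static String decide(double sim, String label) {
                if (sim > 0.9) return label;            // confident call
                if (sim > 0.6) return "maybe " + label; // could be something else
                return "orbit the object and re-scan";  // not sure yet
            }
        }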
  • by cliffiecee ( 136220 ) on Sunday December 21, 2008 @12:06PM (#26191991) Homepage Journal
    Don't let that bowtie fool you. I know a Dalek [gamma-two.com] when I see one.
  • by Ronald Dumsfeld ( 723277 ) on Sunday December 21, 2008 @12:09PM (#26192009)
    Beer? Great! But what is beer without titties?

    http://www.youtube.com/watch?v=lwb1s1DYnDU [youtube.com]
    • Re: (Score:1, Funny)

      by nurb432 ( 527695 )

      I don't think you want to see hers.

      No wonder they don't have kids.

    • I wonder if they called the robot "XQJ 37"
      • Listen to the song, Zappa wants to sell his soul to the devil for his titties and beer...

        Devil: Listen fool, you've got to prove to me that you're rough enough to get into hell, that you've got the style enough to get into hell, so start talkin'...
        Zappa: Alright, lemme tell ya somethin'
        Devil: Alright!
        Zappa: I'll prove to you that I'm bad enough to go to hell
        Devil: Yeah!
        Zappa: Because I have been through it!
        Devil: Yeah!
        Zappa: I have seen it!
        Devil: Yeah!
        Zappa: It has happened to me!
        Devil: Yeah!
        Za…
  • by germansausage ( 682057 ) on Sunday December 21, 2008 @12:17PM (#26192053)
    Any UBC EEs from 83/84 will remember our robot, also called Basil, because it was somewhat faulty (Fawlty).
  • Uh-oh (Score:5, Funny)

    by sfjoe ( 470510 ) on Sunday December 21, 2008 @12:34PM (#26192183)

    "...runs Linux with some instructions in Java..."

    Uh-oh, they used the J-word. Wait until the Slashdot Religious Order gets their hands on them.

    • by fnorky ( 16067 )

      Oh, it gets even better. I know the Gundersons and have had the pleasure of looking over the hardware that Basil runs on. The microcontrollers that run the motors and sensors are programmed in Java.

      The main system board is running Slackware Linux 12.1. I expect a BIG religious war now.

  • I, for one, (Score:5, Funny)

    by oodaloop ( 1229816 ) on Sunday December 21, 2008 @12:34PM (#26192185)
    welcome my beer. Thanks, robot underling.
  • Sounds exaggerated (Score:5, Insightful)

    by Animats ( 122034 ) on Sunday December 21, 2008 @12:37PM (#26192205) Homepage

    It looks like the sensors are dumb ranging sonars at four heights. Those are very crude sensors; all you get is the range of the nearest solid object in a 30 degree cone. You could probably separate walls, tables, chairs, and humans with that, at least some of the time. It won't ever work very well. People have been fooling with those things since the 1980s. (The usual sonar sensors are left over from Polaroid auto-focus cameras. Very few robotics people have tried to do serious sonar processing, like submarines or bats.) You're just too information-starved. Vision, though...

    There's been much more progress in the last five years than most people realize, though. SLAM works now. Vision algorithms actually work. Low-cost inertial devices work. We're starting to see the payoff from the DARPA Grand Challenge, which gave robotics a serious and needed butt-kick.
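    For what it's worth, a toy illustration of the four-cone setup described above: with only "range of the nearest object in a 30-degree cone" at four heights, classification reduces to crude height-profile rules like these (the rules and names are invented, and real returns are far noisier):

      // Toy sketch: classify by which height rings see something nearby.
      class HeightProfile {
          // ranges[0] = ankle height ... ranges[3] = head height, in meters
          static String classify(double[] ranges, double maxRange) {
              boolean ankle = ranges[0] < maxRange;
              boolean knee  = ranges[1] < maxRange;
              boolean chest = ranges[2] < maxRange;
              boolean head  = ranges[3] < maxRange;
              if (ankle && knee && chest && head) return "wall or person";
              if (ankle && knee && !chest)        return "chair or table";
              if (ankle && !knee)                 return "low obstacle";
              return "nothing close";
          }
      }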

    • I will be the first to admit that I don't know that much about the practical application of sonar in situations like this, but abstractly, wouldn't the use of 12 different sonar sensors create a matrix that, through some kind of differential process, could yield a sensory model more useful than that from a single sensor or a smaller array?
      • by mrmeval ( 662166 )

        I doubt they have the computing power to do that. You can get a 3D model that way using some of the bleeding-edge DSP chips and novel software, but it won't give you composition. You would also need higher-resolution ultrasonic sensors, ones capable of sending out and receiving more complex signals. Multiple frequencies would be better, either from one or from multiple sensors.

        • by Animats ( 122034 )

          It's not the compute power that's the problem. It just hasn't been done. Robot sonar data reduction could be smarter than it is. But most research is going into vision and LIDAR.

          The available sonar hardware for air is not only dumb, but obsolete. Most of it is a holdover from the Polaroid instant camera auto-focus systems. Today, everybody does auto-focus optically. There's good work going on with multi-beam sonar for underwater robotic vehicles, where vision doesn't work well but audio propagation is…

    • Re: (Score:3, Informative)

      by Yvanhoe ( 564877 )

      There's been much more progress in the last five years than most people realize, though. SLAM works now. Vision algorithms actually work. Low-cost inertial devices work. We're starting to see the payoff from the DARPA Grand Challenge, which gave robotics a serious and needed butt-kick.

      In my humble opinion, the DARPA Grand Challenge, by offering a market to LIDAR makers, made vision-based SLAM a thing of the past (or of the under-budgeted): this beast [velodyne.com] has 64 laser rangefinders on a rotating head. It gives a 100,000-point 3D cloud of the environment 10 times per second. A working video SLAM seems to pale in comparison...

      • Re: (Score:3, Informative)

        by Animats ( 122034 )

        In my humble opinion, the DARPA Grand Challenge, by offering a market to LIDAR makers, made vision-based SLAM a thing of the past (or of the under-budgeted).

        That's what many of us with Grand Challenge entries once thought. Even Sebastian Thrun once thought that. But, in fact, the winning 2005 Stanford "Stanley" vehicle was running mostly on vision. Above 25 MPH it was out-driving its LIDAR range. The vision system wasn't doing SLAM, though. It was comparing the road further ahead with the near road. If…

        • by Yvanhoe ( 564877 )
          Well, I may have a distorted view of the Velodyne's pervasiveness because I am working on a project involving it in my current job, but I seem to remember that it became quite popular after Stanley's victory. In 2006 it gave (in an early version) many good results, and in 2007 I think I read somewhere that it was used on most of the competing vehicles. Many people I met told me that they saw flash LIDARs as a promising tech, and quite probably the future of LIDAR sensing, but they doubt it will become available…
  • by PolygamousRanchKid ( 1290638 ) on Sunday December 21, 2008 @12:43PM (#26192245)

    He'll have to figure it all out on his own, using a basic knowledge of bars and beers and so on, reasoning skills and an ability to understand certain parts of the world.

    This strategy seemed to work very well for George W. Bush.

    • This strategy seemed to work very well for George W. Bush.

      You must have a very different definition of 'working well' than I normally use. But Bush's behavior in the White House and Basil's behavior in the bar are eerily similar.

      FTFA:

      "This is the first time Basil's been out with his brains intact," Louise notes, adding that they've never had him complete complicated tasks in public before. When they brought him out for their recent wedding anniversary party, for example, they turned off his higher-level

  • ... I welcome our beer-toting overlords.

  • After RTFA I don't see this robot being released as a free technology, which is too bad since the last thing we need is for a revolutionary new tech industry to be once again built on marketing and closed technologies.

    The Red Hat business model can go into overdrive in the upcoming robot-helper industry. Deployment is assisted by open hardware and software standards, and the need for professionally paid support and custom programming will create a large new market.
  • Beer (Score:2, Funny)

    by Anonymous Coward

    Unless it can swim to Europe, how's it going to obey a command to fetch a REAL beer?

  • by mlwmohawk ( 801821 ) on Sunday December 21, 2008 @01:18PM (#26192469)

    At Denning we had a mobile robot security guard. It could roam a factory or warehouse looking for intruders. It had sonar, radar, and other things.

    Notifying people of appointments, delivering small objects, and serving drinks is not only possible, it is probably the easiest set of tasks that you can do.

    I have a project on-line that allows you to build a basic robot for $500. It has PWM motor control and basic tips on building the base. It uses a PS/2 mouse for wheel encoders (cheap) and a USB A-D/D-A board to control stuff. (I won't give the URL for fear of slashdotting my server.)

    So, my two points: 1) It is possible they are doing what they say they can do. 2) It's fairly trivial if you have the time to waste.

    • by Enigma2175 ( 179646 ) on Sunday December 21, 2008 @02:23PM (#26192891) Homepage Journal

      I have a project on-line that allows you to build a basic robot for $500. It has PWM motor control and basic tips on building the base. It uses a PS/2 mouse for wheel encoders (cheap) and a USB A-D/D-A board to control stuff.

      I am a current user of your software; I found your site when looking for a way to implement wheel encoders for my robot. It has been extremely useful to me.

      For the I/O hardware on my robot, I have implemented drivers for both a Pontech SV203 and Arduino Diecimila [arduino.cc] board. I also wrote an encoder driver to use the Linux event interface rather than the ps2 interface so I could use a USB mouse encoder. On top of your software I have written a Player [sourceforge.net] driver to allow me to use the robot within their framework, opening up a massive amount of new high-level functions for the robot.

      I just wanted to thank you for making your software freely available; it has helped me transform my robot from nothing into something that can localize, navigate and avoid obstacles. It has done real work sanding my deck and vacuuming my floor; now if I can only get a snowblower attachment going I will be set.

      • I also wrote an encoder driver to use the Linux event interface rather than the ps2 interface so I could use a USB mouse encoder.

        With USB, I could not get the mouse to send events unless and until it wanted to. The PS/2 interface allowed a fairly stable polling system from which I could calculate the interval for PID. Were you able to get a stable PID system or, like most of the project, "stable enough" for actual work?

        • With USB, I could not get the mouse to send events unless and until it wanted to. The PS/2 interface allowed a fairly stable polling system from which I could calculate the interval for PID. Were you able to get a stable PID system or, like most of the project, "stable enough" for actual work?

          Mostly just "stable enough", I am still working on tuning it since I switched to the Arduino for motor control. The event interface seems to provide the data quickly enough to fit into my loop but I don't know enough about what is going on underneath to know if the mouse is sending events as it gets them. "Close enough for Government work" is the phrase that comes to mind.

          • Mostly just "stable enough", I am still working on tuning it since I switched to the Arduino for motor control. The event interface seems to provide the data quickly enough to fit into my loop but I don't know enough about what is going on underneath to know if the mouse is sending events as it gets them. "Close enough for Government work" is the phrase that comes to mind.

            The problem with the USB mouse interface is that there is no polling mechanism. You get the events when the mouse thinks you want them. You can't control the period and you can't be sure of the time-frame in which all the clicks happened.

            • The problem with the USB mouse interface is that there is no polling mechanism. You get the events when the mouse thinks you want them. You can't control the period and you can't be sure of the time-frame in which all the clicks happened.

              Older mice with USB and PS/2 connectors (USB with a PS/2 adapter) exhibit this behavior on USB yet work fine using PS/2.

              Because of the lack of determinism in the USB mouse protocol, it isn't well suited. In a practical sense, and in keeping with the $500 "close enough" philosophy, it can probably work. It will have trouble in low-speed precision movement, but will work well enough on cumulative "cross the room" motion.

              The biggest issue you will have is "dead reckoning", because with the diametrically opposed motor design, the relative motion of the two wheels has to be pretty accurate. Then again, inconsistent surfaces are probably a greater source of error.

              It hasn't been too much of an issue; on the surface it seems to work as well as the PS/2 interface. The reason I changed to a USB mouse was because after I moved the bot to a 2.6 kernel, none of the PS/2 mice I have would allow me to set the [resolution|scaling] (I forget which). I could only read up to 127 clicks per loop, so it really limited my top speed (or would make the PID run away if the target was higher than 127). With the Linux EVDEV interface there is no overflow (as far as I can tell), so I don't…

                • ...so it really limited my top speed (or would make the PID run away if the target was higher than 127).

                Yea, I believe I have since added PID overflow detection in the code. I haven't worked on it in quite a while. Jobs, wife, kids, etc.

                The important part of the project, for me, was to do the PID algorithm based on measured time on a standard i.e. non real-time kernel. I was pretty happy with the results.

                  I think it is time, however, to refurbish the project with a nifty dual/quad-core CPU and solid-state hard drive…
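                  For anyone curious, a sketch of the measured-interval PID idea discussed in this thread, with a guard against the 127-clicks-per-loop saturation mentioned above (the gains, names, and clamp handling are invented placeholders, not the project's actual code):

                    // Sketch: a PID speed loop that measures its own period,
                    // since a stock (non-real-time) kernel won't schedule it
                    // at fixed intervals, and that skips integration when the
                    // encoder count is saturated, so the PID cannot run away
                    // on a reading that only means "at least this fast".
                    class WheelPid {
                        static final int ENCODER_SATURATION = 127;
                        double kp = 0.5, ki = 0.1, kd = 0.05; // placeholder gains
                        double integral = 0, lastError = 0;
                        long lastTimeNanos = System.nanoTime();

                        // targetSpeed is in encoder clicks per second
                        double update(double targetSpeed, int clicksSinceLastCall) {
                            long now = System.nanoTime();
                            double dt = (now - lastTimeNanos) / 1e9; // measured, not assumed
                            lastTimeNanos = now;
                            if (dt <= 0) return 0;

                            boolean saturated =
                                Math.abs(clicksSinceLastCall) >= ENCODER_SATURATION;
                            double measuredSpeed = clicksSinceLastCall / dt;
                            double error = targetSpeed - measuredSpeed;

                            if (!saturated) integral += error * dt; // no wind-up on bad data
                            double derivative = (error - lastError) / dt;
                            lastError = error;

                            return kp * error + ki * integral + kd * derivative;
                        }
                    }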

  • THIS is the kind of robot we need! I mean, I had a girlfriend who wouldn't get me a beer and wouldn't have sex, and who started nagging pretty much as soon as the sun came up, so the machine is already ahead on points.

  • Interesting (Score:5, Interesting)

    by Yogiz ( 1123127 ) on Sunday December 21, 2008 @02:05PM (#26192789) Journal

    I, for one, really like the way they decided to proceed when making this robot. It works by a healthy mix of abstraction and trial and error.

    Let's take the wooden chair that is used as an example in TFA. As far as I understand it, the robot's learning about it and then using this information goes like this.

    They put the robot in front of the chair and let it use its sonars on it from different angles and distances. I imagine that in the case of a typical wooden chair with a back, it sees four points for the legs and a line for the back; at least I believe it abstracts it as such. The first time, it will be told that the thing it sees is a wooden chair, so it learns that all things that have four points about so far from each other, arranged in a square, with a line above two of the side points, can be regarded as a wooden chair. If it then sees a chair made of metal without a back, for example, it might consider that to be a wooden chair as well, because it's similar enough; in that case the makers correct its assumption and say it's a metal chair. Sure, it will start to think that all chairs without backs are metal chairs, but if that's the case in their home, so what, it's right. If it understands anything wrong enough that it fails at its task, it can always be corrected, and its knowledge about the world as it sees it will increase.

    Now when performing tasks it can treat the chair as an abstract object, since it can recognize it. It can memorize where it stands, it can learn to avoid it or push it or whatever, as long as humans correct its assumptions and choices. These abstractions could be abstracted even further. The idea is to let it do very simple things and then combine them into larger tasks, much like programmers think about and solve programming problems: if you want to solve a large problem and you don't know how, you break it into smaller pieces until you get a piece that is simple enough to be solved. You solve it and move on to the next piece. Then you combine the solutions into a solution to the bigger problem, until the first and biggest problem is finally solved. This robot 'learns' in the exact opposite direction: from small solved pieces up to larger tasks.

    It seems to me that the biggest concern in this case is abstracting the objects it 'sees' into a form that takes minimal memory but can still be used in the recognition process.

    That came out as ranting. I have no knowledge in the subject and have no idea what I'm talking about but that should make this a good enough Slashdot comment.

    • I have no knowledge in the subject and have no idea what I'm talking about but that should make this a good enough Slashdot comment.

      You're going to fit in well here!

    • Congrats on understanding what it takes to make a successful slashdot post.

      Commiserations on entirely failing to grok what this robot is all about.

      The key principle this robot uses to sense/model its surroundings is what its builders are calling "reification", which they've just published a book about. This is a way to bridge the divide between fuzzy (and semantically empty) sensory data and a symbolic (and semantically rich, if you choose to make it so) model of its environment. The idea is simple (and…

      • Who wants it to take over the world? I can picture it coming in a box, some assembly required; then its first baby steps would be to learn the environment, i.e. the household it'll work in: the couch, the fridge, the carbon-based unit that will give it a 5-year mission or until the warranty expires, the domestic US beer it will leave in the fridge, and so on.

        The couple have got their priorities straight. With a smug rubbing of the hands, 'Now that the refreshments can be brought to us, what shall we work on…'

    • According to the late Douglas Adams, a robot only needs the capacity to be bored or happy.

      Want it to get you beers? Program it to be happy when delivering beer and bored when nobody needs one.

      Want it to guard your facility? Program it to be bored when it hasn't got things to report and happy when it does. (Works for making traffic cops too :p )

      Just be careful that someone doesn't throw a towel over its head and reprogram it to be happy all the time no matter what though.

  • Basil?! (Score:2, Funny)

    by whopub ( 1100981 )
    I hope it isn't fawlty...
    • The idea of letting such a machine near me with an (unopened) can of beer fills me with mild but slowly waning interest (unlike my Roomba, which is going to get let loose on the living room as I go out of the door this morning).
      The idea of letting such a machine near me with a nice fresh pot of scalding hot tea fills me with an acute and strengthening desire to be somewhere else.

      I suspect that naming it "Basil" is a sign that the inventors harbour such misgivings too. Have they tried to sell it to McDonald's…

  • ... for one of the Connors to drop by?

  • by Chemisor ( 97276 ) on Sunday December 21, 2008 @04:31PM (#26194039)

    Go east, to a place called Klamath. K-l-a-m-a-t-h. Find Vic. V-i-c. Ask for beer. B-e-e-r. *sigh* You are the chosen one. Find the beer. Be our salvation.

  • Sounds like a great idea. It's such a good idea that researchers and inventors have been working at it for years - and at this point are still just obtaining some insight into how difficult the problems are.

    Consider this: when we use language, the meaning of what we want to communicate is not contained in the words we use. They're just symbols that we use to refer to shared knowledge. So if I say "cat" then you already know about the small mammal that many of us keep as pets. Or maybe this refers to a shell…

  • by jamesh ( 87723 ) on Sunday December 21, 2008 @05:04PM (#26194357)

    And he knows that, in this case, that person wants a beer.

    That's a simple algorithm:

    if (object == person)
        wants_beer = 1;

    Sure there is going to be some margin of error in that algorithm, but it's going to be right most of the time.

    • Nuh uh! You should express that as a global constant.

      people_want_beer=1

      Because we don't want some smart-ass Java-parsing tin can second-guessing us, do we?
  • When they brought him out for their recent wedding anniversary party, for example, they turned off his higher-level brain and had him dance around by dumbly bouncing from one lady to the next -- the way most guys function on the dance floor.

    Was the quip after the EM dash really necessary? Now, I know that most women have experience with outlaw bikers, but there are a lot of decent guys out there. The problem is they're not outlaw bikers.

  • I hope they remembered to program in the Laws of Service Robotics:
          1. A robot may not damage a beer or, through inaction, allow a beer to come to harm.
          2. A robot must obey beer orders given to it by human beings, except where such orders would conflict with the First Law.
          3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

  • What do you get when you mix a room full of scientists and beer? A VERY smart robot with the wheels falling off.

    A little while ago, one of Basil's wheels fell off and they had to glue the sucker back on.

    http://www.cafescientifique.org/ [cafescientifique.org]

  • A couple of quick points, based on some of the comments:

    1) Basil is an autonomous robot, not a tele-operated system. The robot has a fully functional probability-aware planning system, execution monitor, and a reification engine that maps between symbolic representations and the sensor domain. This enables us to give Basil a goal, and let the robot figure out how best to achieve it. Basil then executes the plan and monitors the results, so he can re-plan if things go wrong.

    2) The sonars cannot tell the material…
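    Point 1 describes a plan/execute/monitor/re-plan cycle; as a generic sketch of that control flow (the interface names are invented, not the Gundersons' actual architecture):

      // Sketch: execute a plan step by step; if the execution monitor
      // reports a failed step, ask the planner for a fresh plan and restart.
      interface Planner { java.util.List<String> plan(String goal); }
      interface Monitor { boolean stepSucceeded(String step); } // executes and checks one step

      class GoalLoop {
          static void achieve(String goal, Planner planner, Monitor monitor) {
              java.util.List<String> steps = planner.plan(goal);
              int i = 0;
              while (i < steps.size()) {
                  if (monitor.stepSucceeded(steps.get(i))) {
                      i++;                        // on track: next step
                  } else {
                      steps = planner.plan(goal); // world changed: re-plan
                      i = 0;
                  }
              }
          }
      }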

  • - but you've heard this one before...
