Even after years of developing and refining Honda's ASIMO robot, plenty of work remains before it understands humans well. You still cannot bark orders at ASIMO to bring you a drink or pick up your socks. Humanity is getting there, though: Cornell intends to close that gap by teaching robots to interpret natural language instructions, casual ones included. As you can see in the video above, a PR2 robot will scoop some ice cream for you, after it serves you beer, of course.
Ashutosh Saxena’s Robot Learning Lab is behind the PR2’s ability to make logical inferences about missing steps or poorly worded instructions. The robot uses its 3D camera to scan its environment and get a better idea of the kinds of objects within it.
For instance, it has been programmed to associate different objects with their capabilities: a pan is something you can pour stuff into and out of, while a stove lets you place a suitable utensil on it to heat things up later on. This means asking it to “heat water” would see it make use of whatever tools are available, stove or microwave included. You do not even need to keep the items in the same place, since the instructions still work as its 3D camera does the job of scanning the kitchen.
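The idea of mapping objects to what they can do, then matching an instruction against whatever a scan turns up, can be sketched roughly like this (a toy illustration only, not the lab's actual code; every name here is made up):

```python
# Hypothetical affordance table: object type -> actions it supports.
AFFORDANCES = {
    "pan":       {"pour_into", "pour_from", "place_on_stove"},
    "stove":     {"heat"},
    "microwave": {"heat"},
    "cup":       {"pour_into", "pour_from"},
}

def ground_action(action, scanned_objects):
    """Return the objects from a scan that can perform the requested action."""
    return [obj for obj in scanned_objects
            if action in AFFORDANCES.get(obj, set())]

# Suppose the 3D scan of this particular kitchen found these objects:
scene = ["cup", "microwave", "pan"]

# "heat water" grounds to whichever heating appliance is actually present;
# with no stove in the scene, the microwave is the candidate.
print(ground_action("heat", scene))
```

The point of the lookup is that the instruction never names a specific appliance: the robot picks whatever object in the current scan supports the required action, which is why the items can sit anywhere in the kitchen.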