Robots have little trouble picking up objects; the problem is that they don’t know what they are touching. Knowing this would be a genuinely useful capability: a robot that could identify what it is holding could sort and categorize objects, which would be handy for recycling, handling hazardous materials, and so on.
Researchers at MIT’s CSAIL appear to have made progress on exactly that, creating a robot that can identify objects not only by sight but also by touch. This is thanks to a tactile sensor called GelSight. The information gathered by the sensor was fed to an AI that learned the relationship between visual and tactile information.
To train the AI to identify these objects, the team fed it a large number of videos of various objects being touched, which were then broken down into still images. According to Yunzhu Li, CSAIL PhD student and lead author of the study, “By blindly touching around, our model can predict the interaction with the environment purely from tactile feelings. Bringing these two senses together could empower the robot and reduce the data we might need for tasks involving manipulating and grasping objects.”
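The core idea of that training setup, pairing still frames from touch videos with the tactile readings recorded at the same moment and learning to predict one modality from the other, can be sketched with a toy model. This is only an illustration, not CSAIL’s actual system (which used deep networks on real GelSight data); everything below is synthetic, and a simple least-squares fit stands in for the learned vision-to-touch predictor.

```python
import numpy as np

rng = np.random.default_rng(0)

n_pairs, vis_dim, tac_dim = 500, 64, 16

# Hypothetical ground-truth relationship linking the two modalities.
W_true = rng.normal(size=(vis_dim, tac_dim))

# Synthetic stand-ins: "visual" plays the role of per-frame image
# embeddings, "tactile" the GelSight reading paired with each frame.
visual = rng.normal(size=(n_pairs, vis_dim))
tactile = visual @ W_true + 0.01 * rng.normal(size=(n_pairs, tac_dim))

# Fit a linear vision-to-touch predictor on the paired data.
W_fit, *_ = np.linalg.lstsq(visual, tactile, rcond=None)

# Given a new frame, predict the tactile signal it implies.
pred = visual @ W_fit
mse = float(np.mean((pred - tactile) ** 2))
print(f"mean squared error: {mse:.5f}")
```

The point of the sketch is the data layout, not the model: each training example is a (visual frame, tactile reading) pair captured at the same instant, which is what lets the system learn a mapping between the two senses.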
This isn’t the first time MIT’s CSAIL has explored robots that identify objects by touch: the lab previously created a recycling robot that could sort items based on touch.