As immersive as virtual reality (VR) tech is visually, there is a bit of a disconnect between what you see and what you can “touch” in a VR environment. We’ve seen companies come up with various ways to allow users to feel objects in VR, so it doesn’t come as a surprise that researchers at Cornell University have also come up with their own solution.
This comes in the form of a stretchable, synthetic skin embedded with fiber-optic sensors. The stretchable material opens the door to a variety of applications, and not just for humans: robots fitted with the skin could feel objects, helping them recognize what they're handling and expanding their capabilities and functionality.
According to lead researcher Rob Shepherd, an associate professor of mechanical and aerospace engineering in the College of Engineering, “Right now, sensing is done mostly by vision. We hardly ever measure touch in real life. This skin is a way to allow ourselves and machines to measure tactile interactions in a way that we now currently use the cameras in our phones. It’s using vision to measure touch. This is the most convenient and practical way to do it in a scalable way.”
He adds, “VR and AR immersion is based on motion capture. Touch is barely there at all. Let’s say you want to have an augmented reality simulation that teaches you how to fix your car or change a tire. If you had a glove or something that could measure pressure, as well as motion, that augmented reality visualization could say, ‘Turn and then stop, so you don’t overtighten your lug nuts.’ There’s nothing out there that does that right now, but this is an avenue to do it.”
While it could be some time before we see a commercial application of Cornell’s technology, it does present new options for how companies might approach “touch” in VR and AR systems in the future.