On the face of it, it looks like developers can't build full augmented reality apps for Google Glass. There could be a number of reasons for that: perhaps Google wants to wait before it lets developers do so, or perhaps there's a concern that running AR apps all the time would severely drain Glass's battery. Nevertheless, it has now been confirmed that Google Glass is more than capable of handling AR apps, thanks to an impressive list of sensors, most of which were hidden.

The list of sensors was found by a Google Glass Explorer named Lance Nanek, who was combing through the debug mode of his unit. He succeeded in pushing an Android app to the display, which revealed the sensors that could allow full augmented reality apps to run. These sensors include:

  • MPL Gyroscope
  • MPL Accelerometer
  • MPL Magnetic Field
  • MPL Orientation
  • MPL Rotation Vector
  • MPL Linear Acceleration
  • MPL Gravity
  • LTR-506ALS Light sensor
  • Rotation Vector Sensor
  • Gravity Sensor
  • Linear Acceleration Sensor
  • Orientation Sensor
  • Corrected Gyroscope Sensor
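For context, any Android app can enumerate the sensors a device exposes through the standard `SensorManager` API, which is presumably how a list like the one above was dumped. The sketch below is illustrative, not Nanek's actual app; the class name and log tag are made up for the example.

```java
// Illustrative sketch: dumping every sensor an Android device reports,
// using the standard SensorManager framework API.
import android.app.Activity;
import android.hardware.Sensor;
import android.hardware.SensorManager;
import android.os.Bundle;
import android.util.Log;
import java.util.List;

public class SensorDumpActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        SensorManager sm = (SensorManager) getSystemService(SENSOR_SERVICE);
        // TYPE_ALL returns every sensor the hardware reports, including
        // ones that no stock app ever surfaces to the user.
        List<Sensor> sensors = sm.getSensorList(Sensor.TYPE_ALL);
        for (Sensor s : sensors) {
            Log.i("SensorDump", s.getName() + " (vendor: " + s.getVendor() + ")");
        }
    }
}
```

On Glass, running something like this is what surfaces entries such as the MPL (InvenSense motion processing) sensors listed above, even though no shipping Glass app exposes them.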

Three location providers (network, passive and GPS) would ensure that AR apps can function continuously without being sidetracked by a lack of location data. Right now, third-party apps can only update location once every 10 minutes, though it's not impossible to work around that limitation. With Google Glass already having been rooted, we can expect plenty of tinkering opportunities once units are released to the public, that is, if Google doesn't patch the exploit used to root Google Glass. We'll have to wait for a new exploit to be found in that case.
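For the curious, the three providers mentioned correspond to standard constants in Android's `LocationManager`, and an app would subscribe to them along these lines. This is a generic Android sketch, not Glass-specific code, and the listener is left as an assumed `LocationListener` implementation:

```java
// Illustrative sketch: requesting updates from the three standard
// Android location providers (GPS, network, passive).
import android.location.LocationListener;
import android.location.LocationManager;

public class LocationSetup {
    // Subscribe the given listener to all three providers; minTime and
    // minDistance of 0 ask for updates as often as the system allows.
    public static void requestAll(LocationManager lm, LocationListener listener) {
        lm.requestLocationUpdates(LocationManager.GPS_PROVIDER, 0, 0, listener);
        lm.requestLocationUpdates(LocationManager.NETWORK_PROVIDER, 0, 0, listener);
        // The passive provider piggybacks on fixes other apps request,
        // which costs essentially no extra battery.
        lm.requestLocationUpdates(LocationManager.PASSIVE_PROVIDER, 0, 0, listener);
    }
}
```

The 10-minute cap mentioned above would presumably throttle how often such a listener actually fires for third-party code on Glass, regardless of the parameters requested here.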

Filed in Gadgets.
