Just recently, Lenovo launched the PHAB2 Pro, the first commercial handset equipped with Project Tango (official site), a Google software platform that makes a phone particularly aware of its physical surroundings. Built like a portable cousin of Kinect, Tango knows how the handset moves in space, in part because it projects an infrared (IR) grid from the back of the device. The PHAB2 Pro is equipped with the Snapdragon 652 processor, but Qualcomm just announced that its most powerful chip, the Snapdragon 820, also supports Tango.

It makes complete sense because both the Snapdragon 820 and 652 are built on a similar heterogeneous compute architecture that is well suited to running Tango. The difference is that the Snapdragon 820 is much more powerful.

Essentially, Tango uses sensor information from 5 sources:

  1. gyroscope (orientation)
  2. accelerometer (relative movement)
  3. fisheye camera (640×480) (see Tango camera specs)
  4. main camera
  5. depth sensor (IR grid).

All sensor data is time-stamped, which is required for later merging and fusion. The time stamp acts like a global register that can be read by all the compute units inside the chip. #1 and #2 go to the sensor hub, a low-power unit that processes those signals.
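To make the idea concrete, here is a minimal sketch (in Python, and not Tango's actual API) of how samples from different sensors can be aligned against a shared clock; the SensorSample type and the align_to() helper are hypothetical names used purely for illustration.

# Minimal sketch, not Tango's actual API: every sample carries a timestamp from
# a shared, monotonic clock so readings from different compute units can be
# aligned later. SensorSample and align_to() are hypothetical names.
from dataclasses import dataclass
from bisect import bisect_left

@dataclass
class SensorSample:
    timestamp_ns: int   # shared clock readable by every compute unit
    source: str         # "gyro", "accel", "fisheye", "camera", "depth"
    data: tuple         # raw payload, format depends on the source

def align_to(reference_ts: int, stream: list) -> SensorSample:
    """Pick the sample whose timestamp is closest to reference_ts (stream sorted by time)."""
    stamps = [s.timestamp_ns for s in stream]
    i = bisect_left(stamps, reference_ts)
    candidates = stream[max(i - 1, 0):i + 1]
    return min(candidates, key=lambda s: abs(s.timestamp_ns - reference_ts))

# Example: align the latest fisheye frame with the gyro stream.
gyro = [SensorSample(t, "gyro", (0.0, 0.01, 0.0)) for t in range(0, 10_000, 5)]
frame = SensorSample(4_002, "fisheye", ("640x480 frame",))
print(align_to(frame.timestamp_ns, gyro).timestamp_ns)  # -> 4000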

#3 goes to the DSP, which scans the image to find recognizable features that can be used as markers in space (corners, sharp edges, etc.). The relatively low resolution is enough to “see” without being too expensive to compute.
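For illustration, here is roughly what that feature-scanning step looks like, sketched with OpenCV's Shi-Tomasi corner detector on a desktop; on the phone this work runs on the DSP, not through OpenCV, so treat it as an analogy rather than the actual pipeline.

# Illustration only: corner detection on a 640x480 grayscale frame. A real frame
# would come from the fisheye camera; here a random image stands in for it.
import cv2
import numpy as np

frame = np.random.randint(0, 256, (480, 640), dtype=np.uint8)

# Find up to 200 strong corners: these serve as the spatial "markers" that later
# frames are matched against to estimate how the phone has moved.
corners = cv2.goodFeaturesToTrack(frame, maxCorners=200,
                                  qualityLevel=0.01, minDistance=10)
print(f"{0 if corners is None else len(corners)} candidate markers found")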

#4 is high-resolution data, so it goes to the ISP (Image Signal Processor). It is not the same kind of computing as #3; it is closer to traditional image processing, so the ISP is the best place to do it.
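As a rough idea of what “traditional image processing” means here, the sketch below applies two classic ISP stages (white-balance gains and gamma correction) in numpy; a real ISP does this in fixed-function hardware on the raw sensor stream, so this is only a conceptual stand-in with made-up parameters.

# Conceptual stand-in for two typical ISP stages, written in numpy for clarity.
import numpy as np

def simple_isp(raw_rgb: np.ndarray, gains=(1.8, 1.0, 1.5), gamma=2.2) -> np.ndarray:
    img = raw_rgb.astype(np.float32) / 255.0
    img *= np.array(gains, dtype=np.float32)        # white balance, per channel
    img = np.clip(img, 0.0, 1.0) ** (1.0 / gamma)   # gamma correction
    return (img * 255.0).astype(np.uint8)

frame = np.random.randint(0, 256, (3000, 4000, 3), dtype=np.uint8)  # ~12 MP stand-in
processed = simple_isp(frame)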

#5 is mostly computed with the help of Neon, a SIMD unit embedded in the CPU cores that is specialized in computations on vector (x, y, z, w) data. Results from all the co-processors are read by the CPU, which runs the main app.
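The depth work is essentially bulk arithmetic on (x, y, z, w) vectors, which is exactly what Neon's SIMD lanes accelerate. The numpy sketch below shows the same kind of operation, transforming a hypothetical batch of depth points by the current device pose; it is an analogy, not the code that runs on the handset.

# Analogy for the (x, y, z, w) vector math: transform every depth point by the
# current device pose. On the handset, Neon processes these four floats per lane.
import numpy as np

def transform_points(points_xyzw: np.ndarray, pose: np.ndarray) -> np.ndarray:
    """points_xyzw: (N, 4) homogeneous points; pose: 4x4 transform matrix."""
    return points_xyzw @ pose.T

# Hypothetical data: 10,000 depth points, and a pose that shifts the device 0.5 m on x.
points = np.hstack([np.random.rand(10_000, 3), np.ones((10_000, 1))]).astype(np.float32)
pose = np.eye(4, dtype=np.float32)
pose[0, 3] = 0.5
world_points = transform_points(points, pose)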

This hardware was not invented just to run Tango, but Tango shows that all the work Qualcomm has done to integrate specialized computing units is paying off. Without them, it would not be possible to get all this computing done within a power budget that makes a commercial device possible.


Also, Tango needs the depth sensor and the fisheye camera to “sense” its surroundings. In theory, one could use the motion sensors embedded in virtually every phone to detect movement, but error accumulates very rapidly because there are no points of reference in space. The extra sensors give Tango key markers that can be cross-referenced and merged with the other sensors’ data to increase accuracy.
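A toy calculation shows why inertial-only tracking drifts so quickly: even a small constant accelerometer bias, integrated twice, produces a position error that grows with the square of time, while an external reference (such as a visual marker) lets the tracker reset that error periodically. The numbers below are made up purely for illustration.

# Toy drift illustration: a 0.05 m/s^2 accelerometer bias integrated for 10 seconds.
dt, steps, bias = 0.01, 1_000, 0.05
velocity = position = 0.0
for i in range(steps):
    velocity += bias * dt                # integrate biased acceleration
    position += velocity * dt            # integrate velocity -> position error
print(f"position error after 10 s, inertial only: {position:.2f} m")   # ~2.5 m

# With a hypothetical "marker fix" once per second, accumulated error is reset,
# so the worst case is one second of drift: 0.5 * bias * 1^2 = 0.025 m.
velocity = position = 0.0
for i in range(steps):
    velocity += bias * dt
    position += velocity * dt
    if (i + 1) % 100 == 0:               # marker seen: reset accumulated error
        velocity = position = 0.0
print(f"worst-case error between marker fixes: {0.5 * bias * 1.0**2:.3f} m")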

Some of the Tango apps are very fun to use and bring new experiences, including Augmented Reality (AR). There’s no telling what developers will come up with, but they will appreciate having more speed for their apps.

Qualcomm cannot comment on unannounced handsets or tablets, but it has said that there is healthy interest in Tango. Since the Snapdragon 820 is the most successful high-end mobile chip of 2016, we know that the integration won’t be too difficult for an OEM. The question is: who’s going to release the next Tango product? Maybe Google itself, since it is rumored to be getting even more serious about hardware.
