It is safe to say that Google has made huge strides with its camera technology over the years. Cameras on the company’s Nexus phones felt average at best, but that has definitely changed with the Pixel line. One of the Pixel camera’s standout features is Top Shot, which attempts to snap the “best” moment.
In a way it is similar to Apple’s Live Photos: it captures frames in the seconds before and after the shutter button is pressed. Live Photos uses those frames to give photos more context, while Top Shot uses them to give users more options when selecting the best moment. Now, in a post on its blog, Google has shared some details on how the feature works.
In the post, Google explains what goes on behind the scenes when the camera app is opened and which signals the software is on the lookout for, such as the optical flow of the image, exposure time, and whether or not the subject is smiling.
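To give a rough sense of how signals like these could feed a “best moment” pick, here is a minimal sketch in Python. Everything in it is a hypothetical assumption for illustration: the signal names, the weights, and the scoring formula are not Google’s actual implementation, which runs on-device and is far more sophisticated.

```python
# Hypothetical sketch: combine per-frame quality signals into one score
# and pick the highest-scoring frame. Signal names and weights are
# illustrative assumptions, not Google's actual Top Shot logic.
from dataclasses import dataclass


@dataclass
class FrameSignals:
    motion_blur: float  # 0 = sharp, 1 = heavily blurred (optical-flow proxy)
    exposure_ok: float  # 0 = badly exposed, 1 = well exposed
    smile: float        # 0 = no smile detected, 1 = confident smile
    eyes_open: float    # 0 = eyes closed, 1 = eyes open


def score(f: FrameSignals) -> float:
    """Weighted combination of quality signals (weights are made up)."""
    return (0.35 * (1 - f.motion_blur)
            + 0.25 * f.exposure_ok
            + 0.25 * f.smile
            + 0.15 * f.eyes_open)


# Three frames captured around the shutter press.
frames = [
    FrameSignals(motion_blur=0.6, exposure_ok=0.9, smile=0.2, eyes_open=1.0),
    FrameSignals(motion_blur=0.1, exposure_ok=0.8, smile=0.9, eyes_open=1.0),
    FrameSignals(motion_blur=0.2, exposure_ok=0.7, smile=0.4, eyes_open=0.0),
]
best = max(range(len(frames)), key=lambda i: score(frames[i]))
print(best)  # the sharp, smiling frame (index 1) wins here
```

The real feature must also stay within a tight power budget, which is why Google notes the design work went well beyond the scoring itself.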
Google also talks about how the feature was developed, and how the team realized that, because it collects all this information continuously, it could not be allowed to consume too much energy, so the feature had to be designed around that constraint. It is a rather detailed post that gives us insight into how much thought went into the feature, so if you’d like to learn more, head on over to Google’s website for the details.