Given how small the sensors and lenses on our smartphones are, there is only so much they can do in terms of hardware performance. That means when it comes to making the photos taken by our phones look better, we need to start looking towards software, which is what researchers Andrey Ignatov, Nikolay Kobyshev, Radu Timofte, Kenneth Vanhoey and Luc Van Gool are trying to do with neural networks (via Engadget).

The researchers have created a neural network that helps give photos taken by your smartphone a DSLR-like quality. The idea is that the network has been trained on what DSLR-quality photos should look like, and it then tweaks the photos taken by our smartphone cameras to try and achieve a similar look.
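To make the idea concrete, here is a minimal sketch of learned image-to-image enhancement. This is an assumption for illustration, not the researchers' actual network: a single 3x3 convolution (whose weights would normally be learned from pairs of phone and DSLR photos) applied as a residual correction, with the result clipped to the valid pixel range.

```python
def conv3x3(img, kernel):
    """Same-size 3x3 convolution on a 2D grayscale image (zero padding)."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = 0.0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        acc += kernel[dy + 1][dx + 1] * img[yy][xx]
            out[y][x] = acc
    return out

def enhance(img, kernel):
    """Residual enhancement: output = clip(input + conv(input), 0, 1)."""
    delta = conv3x3(img, kernel)
    return [[min(1.0, max(0.0, p + d)) for p, d in zip(row, drow)]
            for row, drow in zip(img, delta)]

# A hypothetical "learned" sharpening-style kernel, purely for illustration;
# a real enhancement network would learn many such filters from data.
KERNEL = [[0.0, -0.1, 0.0],
          [-0.1, 0.4, -0.1],
          [0.0, -0.1, 0.0]]

flat = [[0.5] * 4 for _ in range(4)]   # a featureless 4x4 test "photo"
result = enhance(flat, KERNEL)
```

A real system stacks many such learned layers and trains them so that enhanced phone photos match DSLR shots of the same scene; the sketch above only shows the per-pixel filtering-and-clipping mechanics.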

In a way, we suppose this could be seen as a smarter way of editing photos to achieve the more vibrant image one might expect from a higher-end camera. However, whether or not these photos are truly comparable to DSLR shots will vary from image to image, since the technique doesn't appear to be perfect yet, but we reckon it does a pretty good job so far. Like we said, hardware is one of the limiting factors.

That being said, this isn't the first time companies and researchers have turned towards AI to enhance our photos. Earlier this year Google also experimented with pushing the cameras on the Pixel and Nexus 6P to their limits, resulting in some pretty stunning photos. This year's Pixel 2 and Pixel 2 XL handsets have also yielded some pretty impressive photos despite packing only one lens. In the meantime, for those who are curious and want to give this particular neural network a shot with their own photos, head on over to its website for the details.
