Image credit – Chelsea Turner/MIT

When Apple launched its new iPhones in 2017, it also introduced the A11 Bionic chipset. Alongside the usual performance upgrades, the A11 Bionic includes a dedicated AI processor that helps power the front-facing TrueDepth camera system, which authenticates users through Face ID.

Apple isn’t alone in developing dedicated AI chips; Amazon is reportedly looking to do the same. Now it looks like MIT might have its own solution, one that could be both faster and more energy efficient. MIT claims that its new chip performs neural network computations three to seven times faster than its predecessors while reducing power consumption by as much as 94%.

According to the researchers, these improvements make the chip well suited to running neural networks locally, whether on smartphones or embedded in household appliances, which we imagine could include smart devices like the Amazon Echo or Google Home.

The work of Avishek Biswas, the MIT graduate student who led the chip’s development, appears to have caught the attention of IBM, whose VP of AI, Dario Gil, was quoted as saying, “This is a promising real-world demonstration of SRAM-based in-memory analog computing for deep-learning applications. The results show impressive specifications for the energy-efficient implementation of convolution operations with memory arrays. It certainly will open the possibility to employ more complex convolutional neural networks for image and video classifications in IoT [the internet of things] in the future.”
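For context, the “convolution operations” Gil mentions boil down to large numbers of dot products between input values and filter weights, and conventional processors spend much of their energy shuttling those values back and forth between memory and the compute units. The sketch below is purely illustrative and has nothing to do with MIT’s actual circuitry; it just uses NumPy to show the dot-product-heavy structure of a 2D convolution, the kind of workload an in-memory design is built to accelerate.

```python
import numpy as np

def conv2d(image, kernel):
    """Naive 2D convolution: each output pixel is a dot product
    between a patch of the input and the filter weights."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            patch = image[i:i + kh, j:j + kw]
            # This dot product is the step an in-memory chip computes
            # inside the memory array itself, avoiding the costly
            # round trip between memory and processor.
            out[i, j] = np.sum(patch * kernel)
    return out

# Example: a 3x3 edge-detection filter over a random 8x8 "image".
image = np.random.rand(8, 8)
kernel = np.array([[-1, -1, -1],
                   [-1,  8, -1],
                   [-1, -1, -1]], dtype=float)
print(conv2d(image, kernel).shape)  # (6, 6)
```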
