Audience, the company renowned for its audio-processing chips, is taking things to the next level by using its neuroscience intellectual property (IP) to give smartphones a higher degree of context-awareness, thanks to the interpretation of data from sensors embedded in the phone. Increased awareness could lead to even smarter devices that know when you’re holding them, or that can turn off functions to save power in ways that were not obvious before.

Once reserved for the realm of fitness apps, awareness of things such as “the user is running”, “the user is riding in a car”, or “the user was riding in a car, which has just stopped” could now let the phone make important user-interface decisions autonomously.
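To make that concrete, here is a minimal, illustrative sketch of how an always-on sensor hub might map raw accelerometer samples to coarse activity states. This is not Audience’s actual algorithm; the thresholds and state names are assumptions chosen purely for the example.

```python
import math

# Illustrative thresholds (assumed values, not from Audience) for the
# standard deviation of acceleration magnitude over a short window.
STILL_THRESHOLD = 0.05   # in g's: almost no motion
WALK_THRESHOLD = 0.60    # in g's: moderate, periodic motion

def classify_window(samples):
    """Classify a window of (x, y, z) accelerometer samples (in g's)
    into a coarse activity state."""
    magnitudes = [math.sqrt(x * x + y * y + z * z) for x, y, z in samples]
    mean = sum(magnitudes) / len(magnitudes)
    variance = sum((m - mean) ** 2 for m in magnitudes) / len(magnitudes)
    std_dev = math.sqrt(variance)

    if std_dev < STILL_THRESHOLD:
        return "still"      # phone resting, or in a car at steady speed
    elif std_dev < WALK_THRESHOLD:
        return "walking"
    else:
        return "running"

# Example: a nearly motionless window (phone lying flat) classifies as "still".
still_window = [(0.0, 0.0, 1.0)] * 50
print(classify_window(still_window))  # -> "still"
```

A real classifier would be far more sophisticated (and would fuse gyroscope and other sensor data), but the basic idea is the same: continuously summarize recent motion and map it to a small set of states the rest of the system can act on.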

This can range from forwarding calls and messages to a more appropriate place, to cutting power to specific functions when they are not needed. It can also prevent the phone from going to sleep while the user is actually using it (by sensing the minute motion of the phone being held in the hand). All of these functions are now available to OEMs via Audience’s MotionQ technology.
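As a rough illustration of the “phone in hand” case, here is a hedged sketch of how micro-motion (hand tremor) could be used to keep the display awake. The tremor threshold is an assumed figure, and the decision function simply stands in for whatever hook an OEM’s power-management layer would actually expose.

```python
import math

# Assumed tremor floor (in g's): even a hand trying to hold the phone
# perfectly steady produces more motion than a phone resting on a table.
TREMOR_THRESHOLD = 0.01

def motion_stddev(samples):
    """Standard deviation of acceleration magnitude over a window of (x, y, z) samples."""
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in samples]
    mean = sum(mags) / len(mags)
    return math.sqrt(sum((m - mean) ** 2 for m in mags) / len(mags))

def should_stay_awake(samples):
    """Return True if micro-motion suggests the phone is being held."""
    return motion_stddev(samples) > TREMOR_THRESHOLD

# A phone on a table shows essentially zero motion, so it is allowed to sleep.
on_table = [(0.0, 0.0, 1.0)] * 100
print(should_stay_awake(on_table))  # False -> let the screen time out
```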

As you may point out, some of these technologies existed before, so why is Audience coming out with this now? The main difference is that Audience can do all of this using an extremely limited amount of power – literally hundreds of times less than what it would take to do the same work on the phone’s main processor. For this to function, the N100 chip has to work 24/7, continuously watching the data provided by the sensors.
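The “hundreds of times less power” claim is easier to appreciate with a back-of-envelope calculation. The figures below are generic, assumed numbers for an application processor versus a dedicated low-power chip, not Audience’s specifications.

```python
# Illustrative, assumed figures (not Audience's specs): a smartphone
# application processor kept awake to poll sensors vs. a dedicated
# always-on chip doing the same job.
AP_POWER_MW = 300.0        # assumed active power of the main processor
SENSOR_HUB_POWER_MW = 1.5  # assumed power of a dedicated low-power chip

HOURS_PER_DAY = 24
ratio = AP_POWER_MW / SENSOR_HUB_POWER_MW
ap_energy_mwh = AP_POWER_MW * HOURS_PER_DAY
hub_energy_mwh = SENSOR_HUB_POWER_MW * HOURS_PER_DAY

print(f"Power ratio: {ratio:.0f}x")                       # ~200x with these figures
print(f"Main processor, 24h: {ap_energy_mwh:.0f} mWh")    # 7200 mWh
print(f"Dedicated chip, 24h: {hub_energy_mwh:.0f} mWh")   # 36 mWh
```

With a typical smartphone battery holding on the order of 10,000 mWh, keeping the main processor awake around the clock would consume most of it, while a dedicated low-power chip barely registers.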

Of course, the N100 chip also has voice processing on board, and it too is optimized to listen 24/7 for a key/trigger phrase. The goal is to detect a command as fast as possible, wake up the phone and hand over the sentence for processing. Audience has done a lot of work to minimize false positives, to avoid having the phone wake up when the user wasn’t trying to talk to it. This is critical because false positives are detrimental to battery life. This technology is named VoiceQ.
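The false-positive tradeoff can be illustrated with a toy example. This is a generic keyword-spotting pattern, not VoiceQ itself; the frame scores and thresholds are assumed values. Requiring sustained confidence before waking the phone trades a little latency for far fewer spurious wake-ups.

```python
def detect_trigger(frame_scores, threshold=0.9, min_consecutive=3):
    """Fire only when several consecutive audio frames score above the threshold.

    frame_scores are hypothetical confidences from a lightweight keyword
    spotter: higher means "sounds more like the trigger phrase". Requiring
    a streak of high-confidence frames (rather than a single spike) is one
    common way to cut false positives -- each false wake-up costs battery.
    """
    streak = 0
    for i, score in enumerate(frame_scores):
        streak = streak + 1 if score >= threshold else 0
        if streak >= min_consecutive:
            return i  # index of the frame where the wake-up would fire
    return None

# A single noisy spike does not wake the phone...
print(detect_trigger([0.1, 0.95, 0.2, 0.1]))          # None
# ...but a sustained match does.
print(detect_trigger([0.1, 0.92, 0.95, 0.97, 0.3]))   # 3
```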

The N100 also has nearly all the audio-processing features of its predecessors, minus a few, which are likely to be integrated in a later version. Audience is actively expanding from pure audio processing, to audio awareness, to context awareness – all of which provide key clues to improve the overall usability of a smartphone, and many opportunities to adapt its behavior to the context at hand.

