Voice-controlled assistants are already capable of a thing or two. Apple's Siri proactively recommends actions to take, and Google Now will soon know better than you where you're headed next. The challenge for such AI-based tools is to understand more complex questions, learn from user behaviour and, all in all, make human-device interaction more conversational. The latest demos of Google Home, for example, already point in that direction, as the device is getting steadily better at deciphering human intent and context by interpreting intonation.
Why AI functions run at distant locations... for now
All of this AI wizardry requires robust broadband access and a significant amount of hardware capacity. At the moment, however, the computing behind those smart functions takes place not on your gadget but in the gigantic server farms of behemoths like Google or Facebook, and there are several reasons for this.
Firstly, running neural networks in real time is hardware-intensive, which creates a size problem: the tiny chips used in mobile smart devices are not yet powerful enough to get the job done. Secondly, such intensive computing guzzles a massive amount of energy, which battery-powered gadgets may struggle to supply. There's also the issue of overheating - another bottleneck that only the nicely chilled environment of a server farm can easily overcome.
It would be ideal if personal devices could cut loose from their constant reliance on a real-time connection to mega-sized servers. What's more, wouldn't it help if 'locally run' AI functions on your personal devices could continue learning from your behaviour even when offline?
Surprising as it may sound, autonomous driving technology is leading the charge towards a real breakthrough in this area by developing AI-optimized hardware. Self-driving vehicles ultimately run on AI-based ecosystems that interpret and analyze real-time data, while their deep learning functionality constantly improves accuracy. In this sense, the operation of an autonomous vehicle (AV) does not differ from any AI function used in smart devices.
A new breed of AI-accelerated chips will reshape the digital landscape
Complex real-time computing tasks, however, cannot be completed without solid hardware capacity. The autonomous driving industry therefore had little choice but to develop customized chips optimized for the extraordinary computing demands of AI. Self-driving cars require built-in computers that work both online and offline, are small enough to fit inside even the most compact car, don't overheat and can handle the complexity of AI-based technology. Otherwise autonomous driving simply won't work.
Development has now entered a phase where a new generation of AI-friendly hardware will hit the market in the near future - hardware that can run neural networks smoothly not only on AVs but on your personal devices as well.
This new breed of chips will be far more power-efficient than its existing counterparts, and it will allow devices to take deep learning to the next level. So if you're having a bad day, for example, Microsoft Cortana will know just what to say, play the right music to cheer you up and maybe even order you that pint of Ben & Jerry's!
Your interaction with your smart fridge or printer will also be smoother, and your 'relationship' with your voice-controlled assistant will get an upgrade too: you will be free to multitask around your home while your assistant feeds you useful information, so much so that it may feel as if it can truly read your mind. When we reach that point in the very near future, it will be worth remembering that we got there thanks to autonomous driving development.
-- This feed and its contents are the property of The Huffington Post UK.