The new buzzword
Everyone got a Google Home or Amazon Echo for the holidays, and now I keep hearing the term “Zero UI” pop up. What does it mean? Before researching this story I had no idea either, so don’t feel like a dim bulb – let’s learn together!
Natural language and gestures
Zero UI focuses on interacting with technology in more natural ways, moving away from a screen-focused experience. Technology is now learning our language rather than vice versa.
So instead of providing stilted commands to our phones and devices, we can speak in a more human way or simply gesture.
Andy Goodman, who coined the term, explains, “humans have always had to interact with machines in a really abstract, complex way.” Zero UI utilizes haptics, voice control, and artificial intelligence to make the whole experience more human.
In command, in control
Right now, voice control is at the forefront of the Zero UI goal. Although voice commands have been around for quite a while, we’re seeing the most promising results in futuristic home technology, where voice control is more deeply integrated.
It turns out I’m not just imagining hearing Alexa’s name more frequently than usual this year. Amazon reported millions of Alexa devices were sold this holiday season. Sales for the Amazon Echo family were up nine times from last year, and Google Home devices spent some time on backorder. Zero UI devices are quickly gaining traction with consumers.
“As devices begin to interact with us in a more normal way, they move from a trendy piece of technology to an integral part of our lives.”
Goodman believes the future of Zero UI requires designing in a multi-dimensional way, literally. He explains that designers need to consider not just the 2D, linear way consumers use their products, but every possible interaction.
Devices must adapt to our stream-of-consciousness ways of interacting with them if they want to stay relevant. This includes not just voice commands, but more intuitive body language recognition systems.
Beyond the screen
Goodman notes Zero UI devices will “need to have access to a lot of behavioral data, let alone the processing power to decode them.” Additionally, non-linear design will require a completely different set of skills than current app design. Designers will need to think beyond the screen when programming devices in order to create highly adaptable experiences.
A quick note
While Zero UI does sound promising, Goodman assures us the term isn’t meant to be taken literally.
Some form of user interface will always be present. It’s really a matter of moving away from screens, not eliminating the user interface entirely.