Move over, hoverboards
Guys. Guys. I can’t even believe this. I don’t want a jetpack anymore. Or a flying car. I’m even okay not having a hoverboard! Real talk: I should never have any of those things. I’m a klutz. Klutz plus flying machines without safety features equals a very stylish hospital visit.
That’s OK. This is the future I want: a Google-Lens-erific future.
That is to say, I want systematically integrated augmented reality (AR). AR has been a buzzword in the tech world for almost as long as virtual reality (VR), albeit with fewer hilariously failed attempts. The tl;dr on augmented reality is that it integrates digital info into the real world for you, making you more aware and bringing more options to your attention than boring old sense input and social cues.
You’ve already used AR, friends
AR’s not new. If you’ve used Foursquare or played Pokémon Go, you and AR have already met: digital info linked to real-world objects and spaces. Google in particular has gone to great lengths in the AR space with Maps, Earth, and the like, which is why it can get you to a meeting, then find you a Starbucks afterward. Granted, it’s the 21st century: you could blindfold yourself, point in a random direction, and find a Starbucks, probably without taking the blindfold off, but you know what I mean.
For all that Google makes everything from chatty robots to intimidating cameras now, it still has its roots in a search engine. Nobody’s better than Google at providing users with data. It’s the company that gets you the info.
Why the upcoming Google Lens is so exciting
At Google I/O, its big developer conference, CEO Sundar Pichai announced new functionality: AR through your phone camera. Per the announcement, it will start as a component of Google Assistant and Google Photos, connecting publicly available data with your personal files.
Got a picture of a flower? It’ll give you genus and species. Local restaurant looks yummy? Snap a pic; it’ll tell you the hours.
Sounds convenient, right? But it is, or could be, so much more.
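If you like to think in code, the flower-and-restaurant flow above boils down to “recognize, then look up.” Here’s a minimal, purely illustrative Python sketch of that two-step pipeline; the classifier and the data table are made-up stand-ins, not Google’s actual API:

```python
# Toy sketch of a Lens-style pipeline: classify an image, then look up
# public info about whatever was recognized. Everything here is a
# hypothetical stand-in for the real machine-learning and search layers.

def classify(image_bytes: bytes) -> str:
    """Stand-in for an image-recognition model.

    A real system would run ML over the pixels; this toy version
    pretends every photo is a common daisy.
    """
    return "Bellis perennis"

# Stand-in for the web's worth of public data Lens would draw on.
PUBLIC_DATA = {
    "Bellis perennis": "Common daisy. Full sun, moist soil.",
}

def lens(image_bytes: bytes) -> str:
    """Recognize, then look up: the whole AR trick in two lines."""
    label = classify(image_bytes)
    return PUBLIC_DATA.get(label, "No info found for " + label)

print(lens(b"...camera bytes..."))  # -> Common daisy. Full sun, moist soil.
```

The interesting part isn’t the code, of course; it’s that the “classify” box is getting good enough to hang the rest of the pipeline on.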
It has been noted that the Information Revolution comes down to this: we digitally enabled types can now delegate parts of our brains. A computer, after all, is only what you put into it: a box of memory. That’s how we use our computers, phones, and digital sundries. Type is more legible than handwriting, Facebook posts are faster than thank-you notes, LinkedIn gets more eyeballs than a résumé, but it’s all still us, our work, composed and conveyed in a convenient form. Computers make our lives easier.
AR makes our lives so much better
AR makes our lives bigger. Done right, it’s the smooth, non-invasive interface by which we can integrate as much or as little as we want of the Internet’s consensual hive mind into our non-digital lives.
It’s the difference between noticing a pretty flower and knowing what it is and how to care for it, so a month from now you have a window box full.
It’s how you get past picking your Friday night dive from the phonebook (or a Google search) in favor of telling a tiny robot you feel like a Hendrick’s and tonic and having it find the place 15 strangers agree makes the best one.
My favorite: it’s about holding your phone up and, for the first time, being able to read the daily inspiration at the Korean church you’ve walked past every morning for a year. AR means if someone who uses Google speaks Korean, so do you.
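That sign-reading trick is just another two-step pipeline: read the text out of the image, then translate it. A purely illustrative Python sketch, with fake stand-ins for both the OCR and translation steps (the Korean phrase and its rendering here are examples I’ve chosen, not output from any real model):

```python
# Toy sketch of the sign-translation flow: OCR, then translate.
# Both steps are hypothetical stand-ins for real ML models.

FAKE_OCR_RESULTS = {"sign.jpg": "오늘의 말씀"}       # pretend OCR output
FAKE_PHRASE_TABLE = {"오늘의 말씀": "Today's word"}  # pretend translation

def ocr(image_name: str) -> str:
    """Stand-in for optical character recognition."""
    return FAKE_OCR_RESULTS.get(image_name, "")

def translate(text: str) -> str:
    """Stand-in for machine translation; falls back to the input."""
    return FAKE_PHRASE_TABLE.get(text, text)

def read_sign(image_name: str) -> str:
    """Point your camera at a sign, get English back."""
    return translate(ocr(image_name))

print(read_sign("sign.jpg"))  # -> Today's word
```

In the real thing, both dictionaries get replaced by models trained on crowdsourced data, which is the whole point: if someone who uses Google speaks Korean, so do you.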
[clickToTweet tweet=”Google Lens offers crowdsourced wisdom. I know it’s geek blasphemy, but that’s cooler than a hoverboard.” quote=”Crowdsourced wisdom. I know this is geek blasphemy, but for real – that’s cooler than a hoverboard.”]
Google Lens is set to launch later this year, integrated into Google Assistant, which is already available on Android smartphones.