
Tech News

Snapchat’s new AR filters are wicked cool

(TECH NEWS) Snapchat recently released a set of filters that can be used to create some pretty cool 3D effects in the real world.


Flower crowns and voice changers

Snapchat’s photo filters are a large part of what has made the social media platform so popular. Snapchatters love using the filters to turn themselves into rainbows, butterflies, dogs, or fairies, or to just fine-tune their complexion with a color-correcting filter.

Now, users can take the filters they know and love and use them in the real world to create 3D effects with their rear-facing smartphone camera.

3D filters for augmented reality

Snapchat announced it would be expanding its popular Lenses feature, which superimposes 3D effects over real-world photos using the rear-facing camera.

Previously, these effects could only be used with the front-facing camera for selfies, and they were not in 3D.

With Snapchat’s recent release of World Lenses, users can apply Lenses, add 3D text, make flowers sprout, and choose from many other options that give a sense of augmented reality. Remember how wildly popular Pokémon Go was when it launched? Augmented reality (AR) played a big part in that popularity.

AR allows the user to become more engaged with their surroundings.

Many other companies (Facebook, Google, and Instagram) have teams of AR and VR experts working to fine-tune these features and take their users’ experiences to the next level. The fact that Snapchat was the first to fully roll out a set of AR features could up its game substantially. Instagram Stories recently hit an impressive milestone, surpassing Snapchat by reaching 200 million users. Since Instagram Stories launched as full-force competition to Snapchat’s Stories, it’s likely that Snapchat launched World Lenses as a one-up on Instagram.

How to use the real-world filters

Ready to give it a try? It’s not too different from the regular Snapchat Lenses. To interact with an object, simply tap the camera screen with the rear-facing camera open in the app. Swiping up or down changes the 3D object’s distance, or you can press and hold an object to pick it up and move it somewhere else. If you want to add a new object, simply tap somewhere on the screen (away from any previously placed objects) to add it to your real-world scene.

The first time you use these lenses, there is an animation to help you through the process.

Like the regular selfie lenses, the 3D lenses will be rotated, changed, and continually updated so you always have a fresh selection of objects from which to choose.

In the past, Facebook and Instagram have adapted features from Snapchat, so it will be interesting to see if either platform takes on Snapchat’s recently introduced AR capabilities.

It seems like AR and VR are the hot-ticket items to engage users and keep them on your platform.

Have you tried the new World Lenses yet? If you haven’t, will you?


Jennifer Walpole is a Senior Staff Writer at The American Genius and holds a Master's degree in English from the University of Oklahoma. She is a science fiction fanatic and enjoys writing way more than she should. She dreams of being a screenwriter and seeing her work on the big screen in Hollywood one day.

Tech News

Tinder creators launch Ripple, a professional networking app void of pros

(TECH NEWS) Ex-Tinder employees have come together, backed by Match.com, to create a swipe-based professional network, but we don’t plan on giving it a second date.


In 2015, we briefly discussed the possibility of taking dating apps and repurposing them for professional networking. What if finding professional connections were as easy as finding a date on Tinder? Tinder (well, its former executives) literally heard us, because they have introduced a solution in their new mobile app, Ripple.

Not to be confused with Ripple the cryptocurrency, Ripple the app is a professional networking tool that literally feels like Tinder.

As it should: the former CTO, Director of Engineering, and Lead Designer of Tinder make up the founding team, along with Mike Presz from Match.com. People who built good dating platforms came together on a professional networking solution that they hope makes networking easier, more natural, and more modern. I took the liberty of signing up, experimented with the app for a few days, and have a few things to say about it…

The good?

Design. Design. Design. The app has a luxuriously simple UI and is fabulously easy to use. If you’ve tried Tinder for even six minutes, you’ll be able to use this app. The use of symbols, big images, and an easy UI is great, and navigation is simple.

It’s fantastic. It’s minimal, it’s content-oriented, and the interest categories are so good (but they could be better – no interest in process improvements? Go learn about Six Sigma). LinkedIn should take note. Profile setup takes no time at all: about five minutes and you’re ready to go.

But that’s about it.

Everything that’s not good? Everything else.

This is probably because the app is new, but there is nothing going on for the US market. I saw a lot of European professionals and professional groups, but zero people in my area, a major US metropolitan area (Dallas-Fort Worth). The lack of content and the lack of professionals mean the app has nothing to offer yet.

I can’t rate the group experience or say I met the mentor of my professional dreams, because no one is on it. Which leads me to ask: what’s next?

The branding, marketing, and advertising for this app are going to have to take off. This is a beautiful product, but the lack of content makes for a pretty dull experience. I had to actively remind myself to use it, and I’m in a serial relationship with LinkedIn.

Basically, no second date for me with Ripple until they get… something to happen.


Tech News

The cutest part of CES was Sony’s AI robot doggo, Aibo

(TECH NEWS) The Consumer Electronics Show revealed the technologies that are dominating and will dominate the market, with Sony’s AI puppers stealing the show.


One of the most endearing items to emerge from CES this year was Sony’s revamped robot dog, Aibo.

Aibo’s first unveiling in 1999 featured a blend of emergent Sony technology, such as the Memory Stick and a companion operating system. By the time of its demise in 2006, the Aibo was equipped with a large vocabulary (it could speak 1,000 words) and could respond to an owner’s commands and motion. The computerized canine wasn’t limited to the realm of its traditional counterparts, however – the 2006 model could take pictures with its eye-embedded camera system, play music, and write blogs.

Equipped with more personality and a better interactive capability with its environment, the 2018 Aibo looks more like a real dog as well.

The new Aibo is composed of 4,000 parts, with OLED-screen eyes, to more authentically mimic a real dog’s movements, and Sony says it relies on sensor systems and embedded cameras akin to those in self-driving cars to provide as close to an authentic experience as possible. The cameras, located in the nose and tail, allow the robot to learn its way around the house and to find its way back to its charging station once the two-hour charge runs out.

Reviewers at CES noted that the updated version of the Aibo was very “puppy-like,” barking and scampering with unlimited energy.

The current model is also touch-responsive on its head, back, and under its chin, allowing the user to give “puppy love” in a way that mimics what real dogs like.

Perhaps proving that Aibo is capable of acting more and more like a real dog, the robot canine was unresponsive to commands from Sony CEO Kazuo Hirai on stage at its unveiling, prompting Hirai to return Aibo to Sony staff quickly.

Slated to go on sale in Japan later this year, the dog isn’t cheap at nearly $1,800, but it will be selling to a dedicated fanbase left over from the Aibo’s earlier run and to a consumer market that is hungrier and more accepting than ever of this type of interactive, poo-free pet ownership.


Tech News

Lyft offers test rides in their autonomous cars – how’d it go?

(TECH NEWS) Lyft let passengers roll around Vegas in their self-driving cars, and surprisingly, no shocking viral videos resulted.


If you haven’t been paying attention to the progress of self-driving cars, you’re in for a shock – they’re closer to a daily reality than you might think. As part of this year’s CES conference, Lyft offered test rides in a handful of their autonomous cars, and the results were reportedly decent.

Unlike other companies’ public tests in the past, Lyft’s demonstrations consisted of normal passengers taking normal routes in Las Vegas; there was little in the way of preemptive route control, meaning that the tests were as authentic as possible. Passengers were able to board autonomous Lyfts from the Las Vegas Convention Center, with some testers traveling well over three miles with minimal operator interference.

The cars themselves are designed by Aptiv, which is a technology company heretofore unaffiliated with Lyft.

While both companies are aware of the potential for flaws and the need to iron them out before production begins en masse, test riders reported that the cars were able to anticipate and respond to a myriad of traffic conditions (for example, slowing down to allow a faster vehicle to merge); this bodes well for the 2020 goal that many autonomous car companies have set.

Naturally, there were a few kinks in the cars’ respective operations, including yellow light confusion and some other finessing issues, wherein the cars’ human operators had to intervene.

The technology behind self-driving cars is only part of the equation, however. As autonomous vehicles become more commonplace, cities will have to adapt to accommodate them.

This process will most likely include things like redefining road architecture, legislation regarding car use (at the moment, autonomous cars must always have a driver in them), and implementation of smart technology.

There’s also the matter of public perception. While most of the reports from the Lyft demo in Las Vegas were positive, the fact remains that plenty of people will be skeptical of new technology – as well they should be, since any emerging technology is bound to make a few bad headlines before it evens out.

How Lyft counters this perception will be key in determining the future of its autonomous fleet, and perhaps even the future of autonomous cars as a whole.
