Augmented Reality (AR) uses computer-generated information to create an enhanced and interactive experience of the world, intertwining the physical world with the digital one. This week Apple unveiled its latest phone models, the iPhone 12 Pro and iPhone 12 Pro Max, and along with them, its custom-designed LiDAR scanner.
LiDAR stands for Light Detection and Ranging, and it measures how long it takes light to reach an object and reflect back. With the sensor, the new iPhone's machine learning capabilities, and the iOS 14 framework, the iPhone can "understand the world around you." "LiDAR makes iPhone 12 Pro a powerful device for delivering instant AR and unlocking endless opportunities in apps," said iPhone Product Line Manager Francesca Sweet.
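Apple hasn't published the scanner's internals, but the basic time-of-flight arithmetic behind any LiDAR sensor is simple to sketch: the light pulse travels out and back, so the distance is the speed of light times the round-trip time, divided by two. The ~5 meter figure used below matches the range Apple has stated for the scanner; the function name is illustrative, not from any Apple API.

```python
SPEED_OF_LIGHT = 299_792_458  # meters per second

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Distance to an object given a light pulse's round-trip time.

    The pulse travels out and back, so we halve the total path length.
    """
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A pulse returning after ~33.3 nanoseconds corresponds to an object
# roughly 5 meters away.
print(distance_from_round_trip(33.3e-9))
```

In practice the sensor fires a grid of such pulses and times each one, producing the depth map that apps like Snapchat build on.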
Apple says its new technology will help enable object and room scanning, photo and video effects, and precise placement of AR objects. With LiDAR's ability to "see in the dark," the sensor can autofocus in low light six times faster. In doing so, it improves focus accuracy and reduces capture time "so your subject is clearly in focus without missing the moment."
And Snapchat is making sure it isn't missing the moment either. The company is among the first to leverage the iPhone 12 Pro's LiDAR scanner for AR on its iOS app. On Wednesday, Snapchat announced it is launching Lens Studio 3.2, which will allow creators and developers to build LiDAR-powered Lenses for the iPhone 12 Pro.
“The addition of the LiDAR Scanner to iPhone 12 Pro models enables a new level of creativity for augmented reality,” said Eitan Pilipski, Snap’s SVP of Camera Platform. “We’re excited to collaborate with Apple to bring this sophisticated technology to our Lens Creator community.”
According to a Lens Studio article, the new iPhone 12 Pro AR experience will have a better understanding of geometry and the meaning of surfaces and objects. It will let Snapchat’s camera “see a metric scale of the scene”, which will allow “Lenses to interact realistically with the surrounding world.”
Even though the iPhone 12 Pro isn't here yet, that isn't stopping Snapchat from letting creators and developers start bringing their "LiDAR-powered Lenses to life." A new interactive preview mode in Lens Studio 3.2 already lets them do just that. So, if you'd like to get started, you can download the template on Snapchat's site.
According to Apple, the new iPhone 12 Pro's LiDAR scanner "puts advanced depth-mapping technology in your pocket." Overall, the new sensor should let you take top-quality photos and videos in low light and create AR experiences that go beyond what exists now. During Apple's announcement, the company said all these new "incredible pro technologies" won't come with a higher price tag. I guess it's up to you whether you really need the fancy new iPhone 12 Pro to play with the new Lenses in Snapchat.