Snapchat Adds Lava And Water AR Lenses Using Ground Segmentation And Machine Learning

Although it’s still best known as a social network, Snapchat has rapidly become a leader in real-time augmented reality effects, thanks to Lenses that alter the look of people and landmarks. This week the app is adding two ground replacement Lenses to the mix, enabling users to swap solid pavement, carpeting, or other terrain for bubbling lava or reflective water through a mix of segmentation technology and machine learning.

Both of the new Lenses work the same way, quickly determining which part of the camera’s live video feed is “ground” and swapping it for your preferred form of liquid — plus a little yellow caution sign. The lava version is arguably more convincing, as it uses heat haze, smoke, and particles to mask the edges of the areas it’s replacing, forming little islands of land next to the moving liquid. Snapchat’s water Lens broadly swaps almost all of the ground in front of you for reflective water, which alternates between somewhat believable and obviously artificial, depending on how well the segmentation works.

The real-time ground segmentation system uses machine learning models to understand geometry and semantics, isolating obviously ground-based objects from contrasting backgrounds. While the system does a good job outdoors, lower contrast or more blended indoor environments can lead to segmentation hiccups — generally an over-application of the effect to unwanted areas — suggesting that the machine has some more learning to do. Snapchat says the new Lenses were built using an internal version of Lens Studio and it’s considering bringing the technology to a public version in the future.
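The core idea described above — classify each pixel as ground or not, then composite an effect texture over the ground pixels — can be sketched in a few lines. This is a hypothetical illustration, not Snapchat's implementation: the per-pixel `ground_mask` here stands in for the output of a real segmentation model, and the soft alpha blend loosely mimics the edge-masking the lava Lens achieves with haze and smoke.

```python
import numpy as np

def replace_ground(frame, ground_mask, effect):
    """Composite an effect texture over pixels classified as ground.

    frame:       H x W x 3 uint8 camera frame
    ground_mask: H x W float in [0, 1], per-pixel "ground" probability
                 (stand-in for a segmentation model's output)
    effect:      H x W x 3 uint8 lava/water texture aligned to the frame
    """
    # Soft blend by mask value so the effect fades out at its edges,
    # rather than switching on and off with a hard cutoff.
    alpha = ground_mask[..., None]  # broadcast to H x W x 1
    out = frame * (1.0 - alpha) + effect * alpha
    return out.astype(np.uint8)

# Toy example: treat the bottom half of a 4x4 frame as "ground".
frame = np.full((4, 4, 3), 100, dtype=np.uint8)
effect = np.full((4, 4, 3), 200, dtype=np.uint8)
mask = np.zeros((4, 4))
mask[2:, :] = 1.0
result = replace_ground(frame, mask, effect)
```

The over-application hiccups mentioned above correspond, in this sketch, to the mask assigning high ground probability to pixels that aren't actually ground: the blend then paints the effect where it doesn't belong.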

Facebook and Google have both open-sourced image segmentation tools that use computer vision to classify whole objects and individual pixels, though in Google’s case the software is intended to run on dedicated cloud-based hardware with tensor processing units. Snapchat’s recent Lens innovations have been particularly impressive because they run in real time on common mobile devices, enabling at least semi-plausible time-warping of faces and other live augmentations of reality. The company has a deal with Verizon to use 5G for advanced AR features, relying on high-bandwidth, low-latency connections with access to edge processing resources.

This post by Jeremy Horwitz originally appeared on VentureBeat.
