Snap’s AR Global Head Tells Us How Snapchat Is Planning for the Metaverse

Snap on Thursday hosted its fourth annual Snap Partner Summit to announce new augmented reality (AR) features aimed at helping users in their daily lives, going beyond merely sharing interactive content on Snapchat. The company's adoption of AR has helped it grow against the likes of Meta and YouTube, and its recent advancements in interactive experiences have made Snapchat a distinct offering in the app world.

At Snap Partner Summit 2022, Snap announced plans to help clothing brands and businesses build AR-based solutions to cut down on returns. The Santa Monica, California-based company is also making it easier for developers to create new AR experiences, which it calls Lenses.

To understand more about how Snap is integrating deeper AR experiences within the Snapchat app, and how it is dealing with challenges including privacy concerns, Gadgets 360 spoke with Qi Pan, Director of Computer Vision Engineering at Snap, over a virtual call.

Snap's Director of Computer Vision Engineering, Qi Pan, highlighted how the company is moving ahead with AR
Photo Credit: Snap

Pan explained how Snap is building AR shopping Lenses and planning to help developers create new interactive experiences using Lens Cloud. The executive also talked about Snap's perspective on the metaverse, a nascent technology space that has recently seen the entrance of companies including Meta and Microsoft. Below are edited excerpts from the conversation.

How has AR grown at Snap? And what has India's role in that journey been so far?

The AR journey has been fantastic. Even when I joined the company, like five, five and a half years ago, we were talking to the leadership of the company, and they were already very clear back then that AR was going to be such an important part of Snap's future. To the outside world, it looks like we're now getting more and more into AR, but internally, AR has always been very important for us in our trajectory. And the reason for that is we're looking at the five or 10-year horizon. We expect a shift from people using mobiles as their primary computing device to people using AR glasses. The goal of my team really is to try to unlock that end-user value, to try to establish the technologies that enable new AR experiences, things like location-based AR where you can interact with your home or office or your street. It also includes things like multiuser AR, where you can start benefiting from actually experiencing AR with someone, rather than recording your own solo experience of AR and sending a video of that to someone, which is predominantly what is happening today.

If we're thinking about social AR, in the future, people are going to gain a lot of benefit from actually interacting in AR together. And that's one of our Lens Cloud offerings: it's going to provide tools that allow people to push beyond what Lenses can do today, to really explore new use cases, things like utility, providing information, all sorts of new use cases like that, because my belief is that these AR glasses are going to have to provide value all the time. They are going to have to make your everyday life better in some small way. To your point on India's contribution to the journey, the growth in the country has been absolutely astonishing. There are 100 million users now in India, which is fantastic. A lot of that growth is mirrored on the AR side as well. I see that India is such an exciting market, it's such a fast-growing market. And it's really important for us to understand the use cases in AR.

AR requires users to open their camera and allow the app to look not just at them but also at their surroundings. Many users may not be comfortable with this, due to factors including privacy concerns. So, how is Snap trying to convince people to open the camera on their devices and experience AR without hesitation?

One of the very unique things about Snapchat is that it actually opens to the camera. We do see users really engaging with the camera. We have 250 million people who are opening the camera and playing with AR every single day, which is a really astonishing number. I think people already have that kind of behaviour. On our Spectacles, the approach that we're taking is to try to be very transparent about what is happening. Even if you look back at the first few versions of Spectacles that we launched, which were camera-only capture devices, we made a very conscious effort to let other people know if the camera is recording. We want to be really explicit about when the camera is recording and when it's not, so that people around you are comfortable and understand what the hardware is doing. And it's the same with the new generation of Spectacles. These are habitual changes, gradual changes that are taking place. So, as soon as you can provide that real tangible value to people, they will be willing to use the camera in order to make their lives better.

Snap recently enabled users to anchor AR experiences to their favourite places by introducing Custom Landmarkers. How would you deal with privacy issues if some users try to abuse the feature, perhaps even unintentionally, by violating the privacy of other people and physical spaces?

We've taken a very cautious approach. We want the world to be covered in interesting and useful experiences. And so, every single location-based AR Lens, just like other Lenses, goes through moderation workflows and has to abide by Lens creation guidelines and community guidelines, to make sure that the content is suitable. But yeah, I think it's a really important topic, because with any of these tools that are adding information into the world, we really want to make sure that this content is useful.

Snap is bringing new AR Lenses to let users try on outfits without ever having to change clothes. How would the app deal with accuracy, given that size is a major concern when using virtual solutions to buy clothing and apparel?

A lot of it comes down to scale accuracy. If you look at our eyeglasses try-on product, it uses the front-facing depth sensors on devices to understand the scale of a person's face and estimate size. So you have an experience that is relatively accurate in understanding how large glasses are on your face. There's also another class of experiences that visualises objects in the world around you, like visualising a sofa or a handbag, and these are also visualised to scale. You understand roughly the size of the sofa in your living room or the size of the handbag in front of you. But clothing is a pretty complex area. The first generation of experiences will help people understand what the clothes would look like on them visually. But I think it is a really important capability in the future to be able to help people pick the right size, between medium and large for example, because that would help reduce the rate of returns on online orders.
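
To make the scale point concrete, here is a minimal, hypothetical sketch (in Python, not Snap's actual code) of the pinhole-camera idea Pan describes: a front-facing depth reading plus the camera's focal length can convert a face width measured in pixels into metres, which can then drive the scale of a virtual glasses model. All values and function names below are illustrative assumptions.

```python
# Hypothetical sketch of depth-based scale estimation for virtual try-on.
# This is NOT Snap's implementation; it only illustrates the general idea of
# using a front-facing depth sensor to size a glasses mesh to a real face.

def pixel_width_to_metres(pixel_width: float, depth_m: float, fx: float) -> float:
    """Convert a width in pixels at a given depth (metres) to metres using the
    pinhole camera model: width_m = pixel_width * depth_m / fx."""
    return pixel_width * depth_m / fx

def glasses_scale_factor(face_width_m: float, frame_width_m: float = 0.14) -> float:
    """Scale factor for a glasses mesh authored at frame_width_m wide
    (0.14 m is an assumed default frame width, not a Snap value)."""
    return face_width_m / frame_width_m

if __name__ == "__main__":
    # Assumed example values: 560 px between the temples in the image, face
    # 0.35 m from the camera, focal length of 1,400 px from camera intrinsics.
    face_width_m = pixel_width_to_metres(pixel_width=560, depth_m=0.35, fx=1400)
    print(f"Estimated face width: {face_width_m:.3f} m")        # ~0.140 m
    print(f"Glasses mesh scale:   {glasses_scale_factor(face_width_m):.2f}")
```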

Tech companies that initially started with VR and AR are now moving towards the metaverse. Are there any plans to enter this nascent space with Snapchat in the future?

Yeah, so the metaverse terminology, used quite a lot in the industry, means a lot of different things to a lot of different people. At Snap, we're focused on the value that we bring to end users. And also, one of the differences in Snap's thinking is that we think the real world is a great place. Our goal is not to take people out of the real world. We actually want to enhance the real world in a small or significant way. So, our approach really is about how we understand the world around us and how we can make it better, versus how to take you out of the real world and bring you into this kind of metaverse.

