Exploring AR Try-On with React Native
AI Generated Video Summary
This talk discusses exploring AR try-on with React Native, implementing AR try-on experiences in e-commerce apps, and considerations for AR development. It also covers the integration of AR platforms like ARKit and ARCore with React Native using the Viro bridge. The talk highlights the use of off-the-shelf solutions like Wanna's SDK for virtual try-on and Snap's AR technology and shopping extension. The importance of creating 3D models for AR try-on and the challenges of writing native code for AR development are also mentioned.
1. Introduction to AR and VR
I'm excited to talk about exploring AR try-on with React Native. I'll discuss what's happening in the AR and VR space and share a case study of implementing a virtual try-on feature. AR enhances the real world with computer-aided graphics, while VR offers a fully simulated experience. AR is becoming an expected part of the shopping experience, as seen with Amazon's AR-powered virtual shoe try-on. Examples include Bailey Nelson's AR try-on for glasses and Gucci's app with various AR features.
I'm really excited to be here today, and I'm going to be talking about exploring AR try-on with React Native. So, here is the intro slide.
Hi, my name is Kadi. As Yanni said, amongst his very kind words, I'm currently the head of mobile development at Formidable. I've been an engineer for about ten years, but for the past five years since 2017, I've been building things in React Native. I've been very fortunate to be able to work on some really exciting projects in React Native, and I'm very excited to share one of them with you today.
So, this talk is going to be in two parts. First, we are going to talk in general about what's happening in the AR and VR space, and in the second part, I'm going to go through a bit of a case study of how we actually implemented a virtual try-on feature in a React Native application.
So to kick us off, what's currently happening in the AR and VR space? I'll do a quick clarification on the difference between AR and VR, because we tend to use those terms together. VR stands for Virtual Reality, and it is a fully simulated experience, so it may look like the real world or it may look completely different. The distinguishing feature of a VR experience is that you will have some sort of VR headset, whether an Oculus, a Google Cardboard, or a Valve Index; the point is, you need to wear something in order to have this immersive experience. An example of that is Beat Saber, which you'll see on the screen.
AR stands for Augmented Reality, and it is a process in which the real world is enhanced by some computer-aided graphics. Usually, for an AR experience, you'll be looking through a screen, so it could be your phone screen or your laptop screen, with a camera that's recording the real world, and then your device will add some kind of computer-aided enhancements. For example, you could use it to place furniture in your space.

If you've followed the news recently, you might have seen that last week there was a news story making the rounds that Amazon has launched an AR-powered virtual shoe try-on experience in their iOS app. Now, this in itself is not particularly groundbreaking. Amazon are nowhere near the first, and they're not going to be the last, company that launches something like that. But the reason I found it significant is that it speaks to a trend that is happening more and more, which is that in e-commerce, AR is going to stop being a gimmick. It's stopping being the cool thing that maybe 5% of people would use, and it's more and more starting to be an expected part of our shopping experience.

And just to give you a couple of examples of myself using AR in my shopping experience, these were a couple of things that I used before I even knew that I was going to be giving this talk. Bailey Nelson is a company that sells glasses; I like their glasses quite a lot. On their website, they have an AR try-on where you can try on glasses before you purchase them. Another example: Gucci have an app that's just for experience features. They have a couple of different AR features: a shoe try-on, a makeup try-on, and in this example, you can also try on different nail polishes.
2. Moving into an Unfurnished House
I recently moved into a completely unfurnished house and had to furnish it from scratch. I struggled with placing a sofa in my weirdly shaped living room, but ultimately settled for a two-seater sofa.
I recently moved house. Well, I say recently; it was the beginning of the year. It was my first time moving into a completely unfurnished house, so I needed to furnish it from scratch. And here was me trying to figure out how to place a sofa into my very weirdly shaped living room. As you can see, that corner sofa was wishful thinking, but I did end up going with a smaller version of it, the two-seater sofa.
3. Introduction to AR Platforms and Viro
Both iOS and Android provide AR platforms, called ARKit and ARCore respectively. React Native has a bridge called Viro that connects React Native with ARKit and ARCore. Viro is an open-source library that simplifies AR development in React Native.
Now you might be wondering, how do I do this in React Native? Well, as with everything in React Native, we're going to start by looking at the underlying platforms. Both iOS and Android provide an AR platform. On iOS, it's called ARKit; it was launched in 2017 and is available from iOS 11 onwards. Android, not to be far behind, launched the equivalent in 2018: on the Android platform it's called ARCore, and it's supported on Android 7 and onwards. As ever with React Native, someone somewhere has written the React Native bridge for the native libraries, and in this case it's called Viro, which basically provides the React Native bridge for ARKit and ARCore. It's an open-source library. I think they've moved it into the community space now, and they're not actively maintaining it. But if you just want to try out AR in React Native with no native code to write yourself, then this would be a good place to start.
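To make that concrete, a minimal Viro scene looks roughly like the sketch below. It is based on the community-maintained `@viro-community/react-viro` package; since it only renders inside a React Native project with the native AR modules linked, treat it as an untested illustration rather than a drop-in example.

```tsx
import React, { useState } from "react";
import {
  ViroARScene,
  ViroARSceneNavigator,
  ViroText,
  ViroTrackingStateConstants,
} from "@viro-community/react-viro";

// A minimal AR scene: floats a text label one metre in front of the
// camera once AR tracking is established.
const HelloArScene = () => {
  const [text, setText] = useState("Initialising AR...");
  return (
    <ViroARScene
      onTrackingUpdated={(state) => {
        if (state === ViroTrackingStateConstants.TRACKING_NORMAL) {
          setText("Hello AR!");
        }
      }}
    >
      <ViroText text={text} position={[0, 0, -1]} scale={[0.5, 0.5, 0.5]} />
    </ViroARScene>
  );
};

// The navigator hosts the AR session and mounts the initial scene.
export default () => (
  <ViroARSceneNavigator initialScene={{ scene: HelloArScene }} />
);
```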
4. Implementing AR Try-On
We implemented a virtual try-on experience in a React Native app for an e-commerce client. They wanted users to try shoes before purchasing, as part of the main shopping app. The UI was designed before finding a provider or building any code. Due to limited resources, we couldn't start from scratch. We found a suitable solution based on these requirements.
Now, let's look at how we actually implemented a virtual try-on experience in a real-world React Native app. As always, let's start with the requirements. What our client needed was an AR shoe try-on experience. We are in the e-commerce space, selling shoes amongst other things, and on our product details page we want users to be able to try the shoe before they purchase it, and hopefully improve conversion. They wanted this to be part of the main shopping app. Some companies have decided to have a separate app specifically for experience features, and Gucci is an example of that, but this client wanted it to be part of the application that all users will have, regardless of whether they actually use the AR feature. They also wanted this to be a bespoke UI, so the UI was designed well before we even got to finding a provider or building any of the code. And in terms of resources, we had on the order of weeks rather than months or years to build this, meaning that it was unlikely we'd be able to start completely from scratch. Based on these requirements, we found a solution that would work for us.
5. Considerations for AR Try-On
To build the AR Try-On feature in our shopping app, we had to consider the bundle size, API level, and limited development time. We opted for an off-the-shelf solution for the native platform and integrated it with React Native. Learning native code, such as Swift for iOS and Kotlin for Android, can greatly enhance the capabilities of a React Native application.
So because it's part of the main shopping app, the bundle size and the API level matter: we can't add anything that will add hundreds of megabytes, because every single user will have to download this app regardless of whether they use the feature. Also, with the API level, we need to make sure that our minimum API level is as low as possible, again to make sure that as many users as possible are able to install the app.
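The API-level gate described above can be sketched as a tiny helper. The function name and the plain parameters are hypothetical (in a real app the values would come from React Native's `Platform` module); the thresholds are the ones mentioned in the talk: iOS 11 for ARKit, and Android API 24 for the SDK we eventually chose.

```typescript
// Hypothetical gate for showing the AR entry point. ARKit needs
// iOS 11+, and the Snap SDK discussed later needs Android API 24+.
// Platform and version are plain parameters here so the logic is
// easy to unit-test outside a React Native environment.
type Os = "ios" | "android";

function supportsArTryOn(os: Os, version: number): boolean {
  if (os === "ios") {
    return version >= 11; // iOS major version
  }
  return version >= 24; // Android API level
}
```

In the app this check decides whether the "Launch AR" button is rendered at all, so users on unsupported devices never see a broken entry point.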
6. AR Try-On with Wanna
We looked at a company called Wanna that specializes in virtual try-on and provides an SDK for building your own integration. The SDK allows you to try out the virtual try-on experience for shoes. We received a demo SDK from Wanna and integrated it into our app. The API was straightforward, requiring minimal native code. However, the higher minimum API levels for Android and iOS were a dealbreaker for us, as our client wanted to support older devices.
7. Snap's AR Technology and Shopping Extension
Snap, the company behind Snapchat, has a developer portal where you can access their AR technology. They offer libraries like CameraKit, which powers Snap and provides more than just virtual try-on for shoes. They also have a new AR shopping extension in beta that creates an integrated experience for AR on e-commerce product pages.
Did you know that Snap, the company that builds Snapchat, has a developer portal? You can use a lot of their AR technology in your own application; you can check out developers.snap.com. They actually do more than just AR try-on, but the point is that they provide a couple of libraries, most notably for us a library called CameraKit, which is basically the SDK that powers Snapchat itself. So it does a lot more than just virtual try-on for shoes: it does make-up, it does clothes, it does front camera, back camera, everything you see in Snapchat. So it's very, very powerful, and under certain circumstances you can use it in your own application. And something new that's been recently built to enhance it, currently in beta, is the AR shopping extension. This is something Snap provides alongside CameraKit, and it's specifically built to create an integrated experience for AR on the e-commerce product details page, within the shopping experience. Using Snap's SDK, this is what we ended up with. Starting from the left, we have the product listing page with the AR badge on the products that have an AR experience. Then we have the product details page with the AR badge. And finally, on the right, we have the modal with the actual AR experience.
8. Implementing AR Experience
I haven't shown you any code, which is basically a crime, so I've added some code. We use the SDK to determine if a product has an AR experience and show the AR badge. The UI around the AR experience is rendered using React Native, while the camera experience itself is rendered natively using Snap's SDK. We pass the product ID to the native component, fetch the products and Lens using Snap's SDK, and call the necessary callbacks. The end-to-end experience includes launching AR, viewing a 3D model of the shoe, swapping to the AR experience, and taking pictures to share.
So, on the product listing page, we have these AR badges. What we need to know is whether a product has an AR experience, and the SDK is the source of truth for that. We pass in the product IDs, it does its magic, and it tells us whether or not each product has an AR experience. If it does, we show the AR badge. We use the exact same method on the product details page.
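The badge logic above can be sketched as follows. The `fetchArAvailability` function stands in for the SDK call that is the source of truth; here it is stubbed with a hypothetical in-memory set so the surrounding filtering logic can be shown and tested. None of these names are from the actual Snap API.

```typescript
// Sketch of the AR-badge decision logic, with the SDK lookup stubbed.
type Product = { id: string; name: string };

// Hypothetical stub data standing in for the SDK's knowledge of
// which products have an AR experience.
const arCatalog = new Set(["shoe-123", "shoe-456"]);

async function fetchArAvailability(
  ids: string[]
): Promise<Record<string, boolean>> {
  // In the real app this is an asynchronous SDK call; stubbed here.
  return Object.fromEntries(ids.map((id) => [id, arCatalog.has(id)] as const));
}

// Returns the IDs of products that should render the AR badge.
async function productsWithArBadge(products: Product[]): Promise<Set<string>> {
  const availability = await fetchArAvailability(products.map((p) => p.id));
  return new Set(products.filter((p) => availability[p.id]).map((p) => p.id));
}
```

The same helper can back both the listing page and the details page, since both only need a yes/no answer per product ID.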
And this is the interesting part. This is the actual AR experience. Being the React Native developers we are, we made the conscious choice to have as much of the code as possible on the React Native side. So the attribution, the product name, the camera button, and the variant picker are all rendered using React Native. The only thing actually rendered natively, using Snap's SDK, is the camera experience itself.
So the way that we've exposed this is as a native component. The way it works under the hood is that we pass in the product ID to this native component. Then, using Snap's SDK (this comes from the shopping extension), we fetch the products for that variant. Then we call the onProductsLoaded callback, which populates the products at the bottom of the screen. Next, the SDK fetches the Lens for the selected product. A Lens is Snap terminology, but it's basically the AR experience with something inside it; in this case, the pair of shoes. And when the Lens is fetched, we call the onLensLoaded callback, which tells us that it's time to stop the loading spinner and the user is good to go.
And let's look at this experience end-to-end. Here I'm on a product details page, and I'm going to scroll down to launch AR. In this case, this is the first time we're launching it, so we have to say yes to camera access and agree to the terms, which we've definitely read. Here we can also look at the 3D model of the shoe, which is actually built into the Lens, which is pretty cool. You can change colors and see what it looks like from either side, as if you were holding it. Then, if you swap to the AR experience, you can look down and see the shoes on your feet, or on your friend's feet. You can also mix and match, with one shoe on your foot and another on your friend's foot; that works as well. And finally, you can take a picture and share it with your friends.
9. Snap SDK and Q&A
The Snap SDK required writing a significant amount of native code, but it came with a fully working example app. The UI had to be customized, but the overall experience was smooth. The SDK works from Android API 24 and iOS 11 onwards. In this talk, I discussed Viro, Wanna, and Snap for developers. Shoutout to Formidable, the company I work for. Now, let's move on to the Q&A. The first question is about obtaining 3D models for shoes.
So in summary, with the Snap SDK, I will say that there was quite a lot of native code we had to write. The earlier slide about embracing native code was there for a reason; I definitely had to embrace native code for this project. But on the upside, they did provide us with a fully working example app. Obviously, the UI didn't look anything like ours, and we had to change things quite a bit and build it from scratch, but it was a really nice help to get started. Also on the upside, it was a very, very smooth experience. You can tell this is the SDK powering a really polished app, the Snapchat app itself. They've had lots of engineers and years and years of practice making this as smooth and as nice as possible. So it was a very nice end experience with minimal effort on our part. And, very importantly for us, it works from Android API 24 and iOS 11 onwards.
So, in summary, the three main things I talked about were Viro, which is a React Native bridge for AR if you just want to try out AR in React Native; and then, for virtual try-on in particular, we talked about Wanna and about Snap for developers. If you're interested in either of these things, I would definitely check them out. And just a shoutout to Formidable, the company I work for, thanks to whom I am here, who are awesome. We do lots of open source; check us out at formidable.com. Thank you very much.
Oh, hello. Thank you so much, Kadi. Now we're going to do a brief Q&A. We're a little behind schedule, so we're going to keep this to a few minutes, and then if you have more questions for Kadi, you can find her at the speaker Q&A booth afterwards. So, the first question from the audience is: how did you get the 3D models for the shoes that you're selling? That's a great question, actually. There is another company called Vertebrae that we work with, and they are the ones that actually, well, they get the shoes shipped to them, they make the 3D model of the shoe, and then they work with Snap to get it integrated. So could anybody just go and get their shoes modeled, or is this sort of an enterprise-deal type of situation? I think this particular one is an enterprise-deal type of situation, but I'm sure there are ways to do it as an individual as well.
10. Creating 3D Models and Q&A
The easiest way to create a 3D model for AR Try-On is to use 3D modeling software. Sarah Viera will discuss modeling in Blender. Despite the challenging timeline, we found the best SDK and tool for the job. If you have more questions, please visit the speaker Q&A booth.
I think the easiest way to actually do it would be to create a 3D model in some kind of 3D modeling software, and use that rather than the real shoe. That's cool. And by the way, I think we do have a talk about that: Sarah Viera is talking about modeling in Blender later, so that might be a good one to catch.
Perfect. I have a question for you. This is not from Slido, but you were given a brief to do this within a few weeks, within a timeline of weeks. That sounds scary and difficult. How did that actually go in the end? I think people tend to figure out their approach based on the time frame they have. If we had been told we had two years to build this, we would probably have started from a much lower level and said, yes, we're going to build our own custom AR solution. But knowing that we had a couple of weeks, that option went out the door straight away. We needed to find the best SDK, the best tool, for the job.
Nice. Thank you. So now, if you do have more questions for Kadi, that's all we have in Slido, so feel free to go and talk to Kadi in the speaker Q&A booth. We're going to be moving on to our next talk. Thanks so much, Kadi.