Exploring AR Try-On with React Native


React Native can be much more than a toolkit for building mobile UIs in JavaScript. We’ll explore features beyond the core library and use the built-in native module system to integrate AR capabilities into a mobile app.



Hello, and thank you all for coming. I'm really excited to be here today, and I'm going to be talking about exploring AR try-on with React Native. So, here's the mandatory intro slide. Hi, my name is Kadi. As Jani said, amongst his very kind words, I'm currently the head of mobile development at Formidable. I've been an engineer for about ten years, but for the past five years, since 2017, I've been building things in React Native. I've been very fortunate to work on some really exciting projects in React Native, and I'm excited to share one of them with you today.

This talk is in two parts. First, we're going to talk in general about what's happening in the AR and VR space, and in the second part, I'm going to go through a case study of how we actually implemented a virtual try-on feature in a React Native application.

So, to kick us off: what's currently happening in the AR and VR space? Let me start with a quick clarification on the difference between AR and VR, because we tend to use those terms together. VR stands for virtual reality, and it is a fully simulated experience: it may look like the real world, or it may look completely different. The distinguishing feature of a VR experience is that you wear some sort of VR headset, whether an Oculus, a Google Cardboard, or a Valve Index; the point is that you need to wear something in order to have this immersive experience. An example is Beat Saber, which you can see on the screen. AR stands for augmented reality, and it is a process in which the real world is enhanced with computer-generated graphics. Usually, for an AR experience, you will be looking through a screen, whether your phone screen or a laptop screen with a camera recording the real world, and your device adds computer-generated enhancements on top. For example, you could use it to place furniture in your space.
If you've followed the news recently, you might have seen that last week there was a story making the rounds that Amazon has launched an AR-powered virtual shoe try-on experience in their iOS app. Now, this in itself is not particularly groundbreaking. Amazon are nowhere near the first, and they won't be the last, company to launch something like that. But the reason I found it significant is that it speaks to a growing trend: in e-commerce, AR is going to stop being a gimmick. It's stopping being the cool thing that maybe 5 per cent of people would use, and it's more and more becoming an expected part of the shopping experience.

Just to give you a couple of examples of myself using AR while shopping, here are some things I used before I even knew I was going to give this talk. Bailey Nelson is a company that sells glasses, and I like their glasses quite a lot. On their website, they have an AR try-on where you can try on glasses before you purchase them. I actually got this pair, so that was very pleasant. Another example: Gucci have an app that's just for experimental features. They have a couple of different AR features: a shoe try-on, a make-up try-on, and, in this example, you can also try on different nail polishes. I also recently moved house (well, I say recently, it was the beginning of the year), and it was my first time moving into a completely unfurnished house, so I needed to furnish it from scratch. Here was me trying to figure out how to place a sofa in my very weirdly-shaped living room. As you can see, that corner sofa was wishful thinking; I ended up going with a smaller version of this two-seater sofa.

Now you might be wondering: how do you do this in React Native?
Well, as with everything in React Native, we're going to start by looking at the underlying platforms. Both iOS and Android provide an AR platform. On iOS, it's called ARKit; it was launched in 2017 and is available from iOS 11 onwards. Android, not to be far behind, launched the equivalent for the Android platform, called ARCore, in 2018, and it's supported on Android 7 and onwards. As ever with React Native, someone somewhere has written the React Native bridge for the native libraries; in this case, it's called ViroReact, which bridges both ARKit and ARCore. It's an open-source library. I think they've moved it into the community space now, so the original authors are not actively maintaining it. If you want to try out AR in React Native with no native code to write yourself, this would be a good place to start.

Now, let's look at how we actually implemented a virtual try-on experience in a real-world React Native app. As always, let's start with the requirements. What our client needed was an AR shoe try-on experience. We are in the e-commerce space, selling shoes amongst other things, and on our product details page we want users to be able to try on a shoe before they purchase it, and hopefully improve conversion. They wanted this to be part of the main shopping app. Some companies have made the decision to have a separate app specifically for experimental features, and Gucci is an example of that, but this client wanted it to be part of the application that all users will have, regardless of whether they actually use the AR feature. They also wanted this to be a bespoke UI: the UI was designed well before we even got to finding a provider or writing any code. And in terms of resources, we had on the order of weeks rather than months or years to build this, meaning it was unlikely we would be able to start completely from scratch.
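As a quick aside on the ViroReact route mentioned a moment ago: if you just want to render something in AR from JavaScript, a minimal scene is only a few lines. This is an illustrative sketch, assuming the community fork's package name (`@viro-community/react-viro`); it needs a real device with ARKit or ARCore support to run.

```tsx
import React from 'react';
import {
  ViroARScene,
  ViroARSceneNavigator,
  ViroText,
} from '@viro-community/react-viro';

// A scene places content in world space; positions are in meters
// relative to where the camera is when tracking starts.
const HelloScene = () => (
  <ViroARScene>
    <ViroText text="Hello AR!" position={[0, 0, -1]} scale={[0.5, 0.5, 0.5]} />
  </ViroARScene>
);

// The navigator hosts the camera feed and renders the scene on top of it.
export default () => (
  <ViroARSceneNavigator
    autofocus
    initialScene={{ scene: HelloScene }}
    style={{ flex: 1 }}
  />
);
```

Under the hood, this is exactly the ARKit/ARCore bridge described above: the same JavaScript drives the native AR session on both platforms.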
Based on these requirements, we found the solution that would work for us. Because this is part of the main shopping app, the bundle size and the API level matter: we can't add anything that takes hundreds of megabytes of space, because every single user has to download the app regardless of whether they use the feature. With the API level, we need to keep our minimum as low as possible, again to make sure that as many users as possible are able to install the app. And finally, because we have weeks and not months, our only real option is to use an off-the-shelf solution for the native platform and build the React Native integration on top of it.

Just as a side note at this point: to really take your React Native engineering to the next level, I very much recommend learning native code. If you're someone who came from the web, like me, your first experience with iOS development might have been a bit of Objective-C, and you might have found it terrifying. What are all these brackets? However, on iOS you can use Swift, and on Android you can use Kotlin, and both of these languages are much more lightweight, more functional, and much easier to grasp for a JavaScript developer. Once you get acquainted with the native platforms and languages, it really expands the range of features you're able to add to your React Native application, because it opens the door to installing native-only SDKs and writing your own bindings comfortably.

For our AR try-on, we looked at two different companies. The first company we tried is called Wanna. They specialize in virtual try-on: they have an SDK that implements a virtual try-on experience for shoes, and you can use it to build your own integration. You can actually try this out yourself.
This is their example app, which is available on both the iOS and Android stores. It's built on their own SDK, so you can see what the try-on experience would look like. From an implementation point of view, we got in touch with them, and they sent us a demo SDK so we could try integrating it and see what it would look like in our app. Here are some screenshots from when we built the integration. This was obviously a proof of concept, so the UI is nothing to write home about. But in general, the API itself was pretty straightforward: there wasn't a huge amount of native code to write in order to build the integration. It was fully customizable, so we were able to just render the virtual try-on video and do everything else in JavaScript. The thing that was a deal-breaker for us is that it had higher minimum API levels than we were aiming for: it requires Android API 26, whereas we were at 24, and iOS 13, whereas we were at 11. This is a trade-off; for some clients, customers, and use cases it doesn't matter, and you can just aim for higher API levels. But our client was quite sensitive to making sure that no one misses out because of an older API level, so we didn't want to raise our minimums.

Did you know that Snap, the company that builds Snapchat, has a developer portal, and that you can use a lot of their AR technology in your own application? You can check out developers.snap.com. They actually do more than just AR try-on, but the point is that they provide a couple of libraries, most notably for us one called CameraKit, which is essentially the SDK that powers Snapchat itself. It does a lot more than virtual try-on for shoes: it does make-up, it does clothes, it does front camera and back camera, everything you see in Snapchat, so it's very, very powerful. And under certain circumstances, you can use it in your own application.
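As a brief aside on those minimum API levels: in a React Native project they're pinned in the native build configuration. A sketch with the values from this project, assuming the standard React Native template file layout:

```ruby
# ios/Podfile -- the minimum iOS version the app supports
platform :ios, '11.0'
```

On the Android side, the equivalent is `minSdkVersion 24` in `android/build.gradle`, which is also the floor for ARCore support. Any SDK you add must accept these minimums, which is exactly the check that ruled one option out here.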
Something new that they've recently built to enhance it, currently in beta, is the AR Shopping Extension. This is something Snap provides alongside CameraKit, and it's specifically built to create an integrated AR experience on the e-commerce product details page, within the shopping experience. Using Snap's SDK, this is what we ended up with. Starting from the left, we have the product listing page with an AR badge on the products that have an AR experience. Then we have the product details page, also with the AR badge. And finally, on the right, we have the modal with the actual AR experience.

I realize this is a technical talk and I haven't shown you any code, which is basically a crime, so I've added some. I'm not going to show the native side of things; I'm just going to show what we actually import and use on the JavaScript side. On the product listing page, we have these AR badges. The Snap SDK is the source of truth on whether something has an AR experience, so we pass in the product IDs, and it does its magic and tells us whether or not each product has an AR experience. If it does, we show the AR badge. We use the exact same method on the product details page.

And this is the interesting part: the actual AR experience. Being the React Native developers we are, we made the conscious choice to keep as much of the code as possible on the React Native side. The attribution, the product name, the camera button, and the variant picker are all rendered using React Native. The only thing actually rendered natively using the Snap SDK is the camera experience itself. We've exposed this as a native component. The way it works under the hood is that we pass the product ID into this native component. Then, using the Snap SDK, we fetch the products; this comes from the shopping extension.
We fetch the products for that variant, then call the onProductsLoaded callback, which populates the products at the bottom of the screen. Next, the SDK fetches the lens for the selected product. A "lens" is Snap terminology, but it's basically the augmented reality experience with something inside it; in this case, the pair of shoes. When the lens is fetched, the onLensLoaded callback fires, which tells us it's time to stop the loading spinner, and the user is good to go.

Let's look at this experience end to end. Here I'm on a product details page, and I scroll down to launch the AR. In this case it's the first time we're launching it, so we have to say yes to the camera and agree to the terms, which we've definitely read. Here we can also look at the 3D model of the shoe, which is actually built into the lens, which is pretty cool: you can change colors and see what it looks like from either side, as if you were holding it. If you swap to the AR experience, you can look down and see the shoes on your feet, or on your friend's feet. You can also mix and match and show one of your feet next to one of your friend's feet; that works as well. And finally, you can take a picture and share it with your friends.

In summary, with the Snap SDK there was quite a lot of native code we had to write. The earlier slide about embracing native code was there for a reason: I definitely had to embrace native code for this project. But on the upside, they did provide us with a fully working example app. Obviously, the UI didn't look anything like ours, so we had to change things quite a bit and build our UI from scratch, but it was a really nice help to get started. And on the upside, it was a very, very smooth experience.
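Going back to the callback sequence described above (products load first, then the lens, then the spinner is hidden), here is how that flow can be sketched in plain TypeScript. Everything in this snippet is illustrative: the real Snap CameraKit surface is a native component with props, and these function and class names are stand-ins, not the SDK's API.

```typescript
// Illustrative stand-in for the SDK call that reports which product IDs
// have an AR lens. In the real app this is an asynchronous native call;
// a synchronous stub keeps the sketch self-contained.
function arAvailability(ids: string[]): Record<string, boolean> {
  const catalogue: Record<string, boolean> = { 'shoe-1': true, 'shoe-2': false };
  return Object.fromEntries(ids.map((id) => [id, catalogue[id] ?? false]));
}

// Loading states for the try-on modal: products load first, then the lens.
type TryOnState = 'loading' | 'productsLoaded' | 'ready';

class TryOnSession {
  state: TryOnState = 'loading';

  // Fired once the SDK has fetched the products for the variant picker.
  onProductsLoaded(): void {
    if (this.state === 'loading') this.state = 'productsLoaded';
  }

  // Fired once the lens (the AR experience itself) is fetched; at this
  // point the loading spinner can be hidden and the camera shown.
  onLensLoaded(): void {
    if (this.state === 'productsLoaded') this.state = 'ready';
  }
}
```

The point of modelling it this way is that the JavaScript side only ever reacts to the two callbacks; all the heavy lifting happens inside the native component.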
You can really tell that this is the SDK powering a really polished app, Snapchat itself. They've had lots of engineers and years and years of practice making this as smooth and as nice as possible, so it was a very nice end experience with relatively minimal effort on our part. And, very importantly for us, it works from Android API 24 and iOS 11 onwards.

So, in summary, the three main things I talked about were: ViroReact, the React Native bridge for ARKit and ARCore, if you just want to try out AR in React Native; and then, for virtual try-on in particular, Wanna and Snap's developer platform. If you're interested in either of these, I would definitely check them out. And just a shout-out to Formidable, the company I work for, thanks to whom I am here, who also do lots of open source. Check us out at formidable.com. Thank you very much.

Thank you so much, Kadi. Now we're going to do a brief Q&A. We're a little behind schedule, so we're going to keep this to a few minutes, and if you have more questions for Kadi, you can find her at the speaker Q&A booth afterwards. The first question from the audience is: how did you get the 3D models for the shoes that you're selling?

That is a great question. There is another company called Vertebrae that we work with, and they are the ones who actually get the shoes shipped to them, create the 3D model of each shoe, and then work with Snap to get it integrated.

So, could anybody just go and get their shoes modeled, or is this an enterprise-deal type of situation?

I think this particular one is an enterprise-deal type of situation, but I'm sure there are ways to do it as an individual as well. I think the easiest way would be to create a 3D model in some kind of 3D modeling software and use that, rather than the real shoe.

That's cool.
And by the way, I think we do have a talk about that: Sara Vieira is talking about modeling in Blender later, so that might be a good one to catch. Perfect. I have a question for you; this one is not from Slido. You were given a brief to do this within a timeline of weeks. That sounds scary and difficult. How did that actually go in the end?

I think people tend to scope things based on the time frames they have. If we had been told we had two years to build this, we would probably have started from a much lower level and said, yes, we're going to build our own custom AR solution. But knowing that we had a couple of weeks, that option went out the door straight away, and we were like, okay, we need to find the best SDK, the best tool, for this job.

Nice, thank you. That was all we have in Slido, so if you do have more questions for Kadi, feel free to go and talk to her in the speaker Q&A booth. We're going to be moving on to our next talk. Thanks so much, Kadi. Once more for Kadi Kraman.
20 min
17 Jun, 2022
