Building Fun Experiments with WebXR & Babylon.js


During this session, we'll see a couple of demos of what you can do with WebXR using Babylon.js: from VR audio experiments to casual gaming in VR on an arcade machine, up to more serious usage creating new ways of collaboration with either AR or VR. By the end, you should have a pretty good understanding of what you can do today.


Check the article as well to see the full content, including code samples.

Transcription


Hello, my name is David, I'm working at Microsoft, and today I'm going to show you some WebXR experiments using Babylon.js. It's going to be fun: I'm going to show you a lot of different samples, and let's have a look together at what you can do today with WebXR. I'm working in the developer division of Microsoft, the people in charge of GitHub, VS Code, Visual Studio, and some Azure services too. But today I'm going to talk about WebGL and WebXR. Feel free to follow me on Twitter if you've got questions after the talk.

So what is WebXR? You've probably heard about it, but let's briefly define this technology before getting to the demos. WebXR is a web API, obviously, that enables both virtual reality, using the Oculus Quest 2, HTC Vive headsets, or the Valve Index, and AR, using either a smartphone running Android or the HoloLens 2, which supports the AR features of WebXR. It's a replacement for WebVR, if you've heard about WebVR before: that one only did VR, and there has been some refactoring of the API since. It's currently supported only in Chromium-based browsers: Microsoft Edge and Chrome of course, but also Samsung Internet on Android and obviously Chrome on Android. I already mentioned the various hardware supported.

Today we're going to talk about Babylon.js. This is a 3D engine I started working on a couple of years ago with a friend of mine, so it's my little baby in a way. If you'd like to get started with Babylon.js, there's a link below, but we're going to see it just after that. If you don't know what Babylon.js is yet, it's an open-source 3D engine running on top of WebGL, and also WebGPU, which we support today. We use Web Audio, which is great for spatialization in VR, and we support WebXR out of the box. It's an open-source framework under the Apache 2 license, with lots of contributions from the community: more than half of the source code now comes from the community. It's completely written in TypeScript, which enables great features like auto-completion and generally better code quality thanks to types. It supports WebXR out of the box and also ships controls ready for VR, meaning you have a lot of controls that work both in 2D with the mouse and in VR, thanks to pointer events. Soon we'll have MRTK, the Mixed Reality Toolkit, coming to Babylon.js version 5, which will enable even more controls for productivity. It's used today by a lot of first-party applications at Microsoft: PowerPoint, like the 3D model you have on your screen, SharePoint, Teams, and also xCloud, the cloud gaming service of Xbox, which uses Babylon.js for its touch controls. It's also used by partners like Adobe, Sony, the US Army, and Ubisoft.

So let's jump into some fun demos. Later on, I will publish a blog post to share the source code of all the demos, but we only have 20 minutes, so let's just see what you can do with WebXR if you don't know it yet. First, we're going to use a super simple but super powerful feature named the XR Experience helper. Using only one line of code, you will enable a VR experience in our case. So I'm going to jump into Chrome for now. You see this scene, which simply loads the Hill Valley scene from Back to the Future in 3D. Once the scene has been loaded, in this callback code, what I'm going to ask Babylon.js with this single line of code is: please turn this existing experience into a VR-compatible experience.
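As a rough idea of what that single call looks like, here is a minimal sketch, assuming Babylon.js is already loaded and an engine and canvas exist; the asset file name and the floor mesh name are placeholders for illustration, not the actual demo code:

```javascript
// Minimal sketch: load a scene, then turn it into a VR-compatible WebXR experience.
// "valley.glb" and the "ground" mesh name are placeholders, not the real demo assets.
BABYLON.SceneLoader.LoadAsync("./assets/", "valley.glb", engine).then(async (scene) => {
  const ground = scene.getMeshByName("ground"); // hypothetical mesh acting as the floor

  // The single line that enables the default WebXR (VR) experience:
  const xr = await scene.createDefaultXRExperienceAsync({
    floorMeshes: [ground] // meshes used as teleportation targets
  });

  engine.runRenderLoop(() => scene.render());
});
```

The helper takes care of the "Enter VR" button, the controller models, and teleportation for you.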
You need to provide only one parameter, which is what should act as the floor, to support the teleportation target we're going to see in a video just after that. If you don't have a VR headset, you can install an extension in Chrome or Edge, the WebXR emulator extension; you can search for it in your favorite search engine. If you have this extension enabled, you will have this button asking you to switch into immersive mode. I'm going to click on this button, and it's going to render for the left eye and the right eye, for stereoscopic rendering. To be able to simulate a VR headset, you need to press F12, and then you will have a new WebXR tab there. So let's snap that on the right and this one on the left. Let's try, for instance, to move the headset: you see I can move around the valley to simulate the fact that you're moving, but also, obviously, the controllers. You can also switch from the Quest, the default, to the HTC Vive; you see that Babylon.js gets the right model out of the box, downloading it from our CDN. Let's switch back to the Oculus Quest. Very useful: one line of code and you really have an experience running out of the box. You can exit the immersive mode to go back to the classical mode, I would say. So it's super easy to build a first experience: just load a scene, add this line of code, and it will work.

But let's have a look at how it works in a real headset, the Valve Index in my case. Let's switch back to my PowerPoint presentation. This time it will be a video of me inside this room using it. You see that we can point at the floor and teleport, and choose the orientation we'll have when arriving at the target. So I decided to face the door of the DeLorean there. Then I can move away and face the front of the car. You see that in the headset, obviously, I've got stereoscopic rendering; I'm using the VR view of SteamVR on Windows there. You see the immersive experience. You really have to try it one day using any headset compatible with WebXR, like the Valve Index on desktop, but also the Oculus Quest 2. You get a super cool immersive experience with only one line of code, once again, and we manage the teleportation, the display of the controllers, and things like that for you. So you see, super easy to enable.

The next demo: you see that I've got an arcade machine just behind me. I'm a huge fan of video gaming in general, but also of arcade machines. So what I wanted to do is transform an existing 2D game I had, which was using EaselJS (CreateJS, if you know it), into a 3D experience and then, obviously, into a VR experience. Let's have a look at the source code of that one. It was really a simple experience; let's open it. You see that you can move from right to left and jump. It's a platform game, nothing special about that. This is a 2D canvas game. What I wanted to do is transform it into 3D, and this is what I've done there. I took a free model of an arcade machine on Sketchfab, I think, as far as I remember. I put a 2D plane on top of the arcade machine to display my 2D canvas on it. So what I'm doing, basically, is taking this 2D canvas and projecting it into 3D. Babylon.js supports dynamic textures, which means that you can take a 2D canvas and project it into 3D, which is super important if you'd like to move into VR. You need to do everything inside the 3D canvas, because classical HTML elements cannot be rendered in immersive mode in the headset. So you need to do everything in the WebGL canvas.
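A minimal sketch of that dynamic texture approach could look like this; the canvas ID, mesh names, and sizes are hypothetical, not taken from the demo:

```javascript
// Minimal sketch: copy an existing 2D game <canvas> onto a plane in the 3D scene every frame.
const gameCanvas = document.getElementById("gameCanvas"); // hypothetical 2D canvas of the game

// A plane positioned where the arcade machine's screen is.
const screen = BABYLON.MeshBuilder.CreatePlane("screen", { width: 1, height: 0.75 }, scene);

// The DynamicTexture owns its own 2D canvas; we draw the game canvas into it.
const texture = new BABYLON.DynamicTexture("gameTexture", { width: 1024, height: 768 }, scene);
const material = new BABYLON.StandardMaterial("screenMat", scene);
material.diffuseTexture = texture;
screen.material = material;

scene.onBeforeRenderObservable.add(() => {
  const ctx = texture.getContext();           // 2D drawing context of the dynamic texture
  ctx.drawImage(gameCanvas, 0, 0, 1024, 768); // copy the 2D game frame into the texture
  texture.update();                           // upload the new pixels to the GPU
});
```

Because the game ends up as pixels inside the WebGL canvas, it keeps rendering once you enter immersive mode.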
So this is what I'm doing: basically, I'm loading my game from the 2D version and displaying it in 3D, which means that now I can also play the game that way, but in 3D this time, which is fun, but not as fun as doing it in VR. So let's try this same experience in VR, using a video. For that, I'm going to jump back to my presentation and show you the full experience, how it works end to end. You can see that I'm currently playing using the Xbox controller, like in 2D, and I'm then going to press the VR headset button at the bottom right. It's going to switch things to SteamVR in immersive mode, and obviously I need to put the headset on to be able to do it. In the same way as in the Back to the Future demo, I was able to use it in 3D. This time, what I've done is map the thumbstick on the controller to be able to move, and the experience is really cool. I wanted to replicate what I was able to do on my arcade machine, but in VR. So you can see that, in a couple of lines of code, once again, I was able to transform a 2D game into 3D and then, thanks to Babylon.js, into VR this time. Pretty fun. You will have access to the source code later on if you'd like to try it at home, but maybe in the next version I will map the physical arcade machine controls so I can play with my hands, because we support hand tracking out of the box.

The next one is also interesting. I'm really a huge fan of music; I compose music myself, and I wanted to mix my passions: VR and music. So this time, I'm going to show you this demo. It's like a 360° piano. If I press the various notes, I can play some music with it, and I'm using Web Audio to analyze the waveform of the sound in real time and display it on the ribbon. It's fun, and I'm going to be at the center of this experience when I switch into VR this time. Obviously I've done that using my headset; I've made a recording of it using the Oculus Quest 2. So let's have a look. I am inside the Oculus Quest 2, as you can see, and I'm going, once again, to press the magic button. I am now at the center of the experience, like we've seen in the browser, and maybe you can recognize this super famous music by John Williams. You see the controllers are displayed in 3D, and I can now have a 360° piano experience.

I wanted to do the same, but using the HoloLens 2 this time. The demo is quite similar. This time, you see I'm at the center of the piano and I can press the various notes once again, but I've replaced the whole background with a uniform black color. When you use the HoloLens 2, if you have a black background, like for the skydome, the HoloLens is going to replace the black texture with a transparent one, so I will be able to see through it. So let's have a look at how it works inside the HoloLens. I'm first displaying Microsoft Edge in 2D, even if I'm seeing it as a hologram, and I'm going to press the immersive mode button once again. This time I will be surrounded by my virtual piano, and you see that the black texture has been removed. Hand tracking is handled by Babylon.js out of the box too, which means that I can now use my hand to tap and simulate a pointer event.
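This is why the same interaction code can serve the mouse, touch, XR controllers, and hand tracking: they all end up as pointer events in Babylon.js. A minimal sketch, where the piano key naming convention and the playNote helper are hypothetical:

```javascript
// Minimal sketch: one pointer handler for mouse clicks, touch, XR controller rays
// and hand-tracking taps, since XR input is translated into pointer events.
scene.onPointerObservable.add((pointerInfo) => {
  if (pointerInfo.type !== BABYLON.PointerEventTypes.POINTERDOWN) {
    return;
  }
  const picked = pointerInfo.pickInfo && pointerInfo.pickInfo.pickedMesh;
  if (picked && picked.name.startsWith("key_")) { // hypothetical naming convention for the keys
    playNote(picked.name); // hypothetical helper that plays the note through Web Audio
  }
});
```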
We could have done something similar, like taking a virtual object and knocking on the various keys with it, but you see how it works: as long as you have a pointer event, it will be simulated in VR for you. So you see, super easy once again: we've been able to use the Valve Index, the Oculus Quest 2, and the HoloLens 2 with the same technology, web technology, that works across all this hardware.

The next one is super fun: it's an AR demo. We've seen VR up to now, but AR is also supported out of the box in WebXR and Babylon.js. This time I'm going to put a virtual object into this room, and you will see I will have a kind of augmented experience being displayed. You can maybe already see some particles being displayed; you can see someone flying around, which is not normal in the real world. And when I move slightly towards the back of this room, you will see a kind of magic orb being displayed, sending magic particles into the real world. This one has been done using a Samsung Android phone, a Galaxy S8, which is not super recent, so you just need a fairly recent phone to make it work, and you get this kind of experience. It's super fun. If you'd like to try it at home, you can simply go to the aka.ms short URL for the AR demo shown on the slide, and you will be able to try it yourself and have a look at the source code if you've got a recent Android phone.

So we've seen a lot of fun demos, but obviously we're using this kind of technology at Microsoft for more serious scenarios, even if, for me, gaming is also a serious business. First, e-commerce is really interested in augmented reality scenarios, for sure. We have this article on the Babylon.js blog, which you can see there, that shows you the type of experience you can build, and we have access to this demo. If you'd like to try it at home too, you can simply go there and load it; you need, once again, a WebXR-compatible phone. And you really get this kind of experience, which I can show you in the demo: you can position a virtual chair in your real living room, potentially change the material or the color, and get a feel for how the object will look in your home. So we can see plenty of useful scenarios, obviously for e-commerce, but not only for e-commerce, as we're going to see just after this. Feel free to go to the Babylon.js Medium blog and check it out on your side.

Once we've done this fun stuff, we can go a step further. We all talk about or hear a lot about the metaverse. So something I wanted to do is enable a virtual call, in a way, inside a VR scene, to be able to call someone who is in Microsoft Teams, our equivalent of Zoom or Slack if you don't know it, so we can do video calls. For that, I'm using Azure Communication Services, which is kind of the underlying set of building blocks of Teams that you can use in your own web app to establish a call with Teams. There are various SDKs, like the JavaScript one that I'm going to use and mix with Babylon.js. So let's have a look at the output of this demo. So hello, I am in this virtual museum, inside a metaverse running on top of WebXR and Babylon.js. Let's imagine I'd like to visit this museum with a lot of paintings, and that I could quickly get lost. You see, I've been able to use the same teleport target we've seen at the beginning, so really the same type of code we've seen so far.
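For the call that comes next, here is a rough sketch of how a call can be started with the Azure Communication Services JavaScript calling SDK mentioned above. This is only an outline under assumptions: the access token, the Teams user ID, and the container element are placeholders, and projecting the remote video onto a plane in the Babylon.js scene (for example through a video texture) is only hinted at, not the exact demo code:

```javascript
import { CallClient, VideoStreamRenderer } from "@azure/communication-calling";
import { AzureCommunicationTokenCredential } from "@azure/communication-common";

// Placeholder: in a real app the user access token comes from your own backend.
const tokenCredential = new AzureCommunicationTokenCredential("<ACS user access token>");

const callClient = new CallClient();
const callAgent = await callClient.createCallAgent(tokenCredential, { displayName: "Museum visitor" });

// Ask for microphone and camera permissions before calling.
const deviceManager = await callClient.getDeviceManager();
await deviceManager.askDevicePermission({ audio: true, video: true });

// Call a Teams user directly (Teams interop); "<teams-user-id>" is a placeholder.
const call = callAgent.startCall([{ microsoftTeamsUserId: "<teams-user-id>" }]);

// When remote video arrives, render it into a hidden HTML container; from there the
// <video> element can be projected onto a plane in the Babylon.js scene, which is
// the idea behind showing the Teams participant inside the virtual museum.
call.on("remoteParticipantsUpdated", (e) => {
  e.added.forEach((participant) => {
    participant.on("videoStreamsUpdated", async (ev) => {
      for (const stream of ev.added) {
        const renderer = new VideoStreamRenderer(stream);
        const view = await renderer.createView();
        document.getElementById("remote-video-container").appendChild(view.target);
      }
    });
  });
});
```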
Oh, and let's be cool: let's call someone over Teams. What I'm going to do is use Azure Communication Services to call someone inside Microsoft Teams. It's a regular Microsoft Teams, nothing special about this version. I'm going to press a call button and establish a connection with this beautiful woman. Normally I should be in the lobby on the other side, and she should be able to accept me. And as soon as she's accepted me... so hello, Christina! Christina is my girlfriend, so she helped me do this demonstration. And Christina, can you please help me find my way inside this museum, because I'm a little bit lost? Yes, of course. Could you please go forward and get closer to the wall on the right of the big room, do you see it? Okay, I'm going to take you with me on my controller, it will be easier. So I'm going to take you there. Is it on my right? Oh yes, look at the picture on your right, just the first one, it looks so beautiful. This one? Yeah. So what do you think, should we buy it? Do you like it too? It depends on the number of zeros after it. Yes, don't worry, it's not the price, it's the year it was made. So I don't know about the price; maybe we should discuss that later on. But it's great, we agreed on this one. As you can see, I'm going to go back there to put you back. So you see, we're able to do a fully immersive call, in the browser this time, calling my girlfriend who is in Teams. You can think about this kind of scenario for virtual visits when you're buying a house and you'd like interactive guided tours. There are a lot of interesting scenarios using those technologies.

We are also using it in other Microsoft products: SharePoint is using it in a product named SharePoint Spaces, and we are also using it in Power Apps. We have controls to enable the same kind of thing we saw with the chair in the living room, but with big objects, to be able to check whether a large object will fit inside your buildings; that's using WebXR once again. And we can also build the kind of interactive media experiences we often see in AR; this can also be done using WebXR.

If you'd like to go further than that, you should try our WebXR experiments and controls. We also have 2D and 3D controls that work either in 2D or in VR, because we're using pointer events to work in both modes. Babylon.js version 5 should be out soon, or will already be out when this video is released, and we will have the new Mixed Reality Toolkit available. And if you'd like to play with those demos yourself at home, if you've got the hardware, go to davrous.com, which is my blog, and I will publish a full article about this talk there. Thank you so much, and if you've got questions, ping me on Twitter and ask me after the talk. See you, thank you.

Yeah, thanks, David, for that fantastic first talk. I'm guessing you're a bit of a movie buff with all the Back to the Future and Star Wars references. Yes, I'm a huge fan of such movies. I love movies, and sci-fi movies, I guess like a lot of developers, and also movie scores; I love listening to Hans Zimmer, for instance. I love it, I love it. Now, that's not one of the official questions, but before we jump into the questions coming through from the chat, we're going to have a look at the poll results, which are up here on the screen for everyone to see.
We can see most people are interested in the virtual reality side of gaming. As someone who works quite a lot in the games industry, I've worked quite a lot with VR as well, so these results don't really surprise me. What are your thoughts on these poll results, David? I'm a bit surprised, because most of the people I meet doing WebXR (and before that WebVR, which was VR only, whereas WebXR now also covers AR) are usually more interested in AR scenarios like e-commerce, because it's easier to get into people's pockets: you just need a phone rather than wearing a headset. But it's interesting; maybe we've got a bias because of the community attending this event, which is interested in gaming. But it's great to see that a lot of people are interested in doing gaming in VR. I'm a true believer that the web could be the next frontier for gaming in general, and for VR, so seeing more and more gaming experiments in VR using web technologies would be awesome. And the metaverse, of course: because it's such a buzzword, it's super popular, so I'm not surprised about the metaverse either. To be honest, I'm currently involved in many projects at Microsoft around the metaverse topic using web technologies, so a lot of people are working on that too, to push and create new products. So it's great to see that, but I'm a little bit surprised, like yourself, Michelle.

And where do you think web development, and VR on the web, is moving towards? Is it still kind of a gimmick, or what are you starting to see experimented with and tried in WebVR? Well, it used to be seen as something not really useful and complex to deploy. It's simpler now, thanks to many frameworks: Babylon.js, A-Frame, Three.js, many great frameworks you have on the market. So it's now really easy to set up an experience, and people are using it for real. At the end of my talk I was referring to Microsoft products, of course, because I know them, because we've been working with those teams. But official products are now using these kinds of technologies to do things like placing objects in the real world, so it's starting to be real for sure. And the metaverse, from my point of view, if it works (because nobody knows whether it's really going to work and whether people are going to be interested in it), the web could also be a great way to implement those technologies. So maybe a lot of people will become more interested in doing VR, or I would say mixed reality, really XR, through the metaverse. I do hope we are at the beginning of a new cycle, but it's already being deployed in production, to be honest, for AR at least.

I'm assuming a lot of people here are going to be developing some of those VR and AR experiences, so good luck to you if you're up for it. I've got a question from Melody Connor, who seems to be an iOS developer, who is asking: when do you think WebXR will make it to Safari on iOS, anytime soon? Yeah, so it's maybe more a question for Apple, and I would love Apple to embrace WebXR. There have been some rumors: if you follow some influential people in the web or PWA space, some of them have seen signs that WebXR might be about to be implemented by Apple. I don't have any official information about that. We also have rumors that Apple is maybe going to ship glasses.
I would love Apple to embrace this technology, because it would mean, for sure, that we would be able to have a single code base targeting all devices, which is also the promise of web technologies. Before that, you can already experiment a little bit with AR on the iPhone using more or less web technology: there is a web component that can translate your model to Apple's USDZ format and do this kind of AR. I've been doing experiments myself, but it's not using WebXR; you are forced today to use a native implementation. And one way to influence that: the more people use WebXR, the more pressure it will put on Apple to consider a WebXR implementation, for sure, because they don't want to miss those great experiences on their own devices. So this is also a way for you to influence that.

I guess we need to get Apple on the next one and we can ask them that question. So I've got another WebXR question from KB, who's asking: how does WebXR compare with native or desktop-based counterparts in terms of performance? Yes, we always get this question, to be honest, even outside of WebXR, with WebGL, and now we've got WebGPU. Of course, the performance of what you can do in a browser today is less impressive than what you can do using the native stack, like DirectX, OpenGL, or Metal on iOS. So you need to take that into account when you design your experience. You won't be able to build AAA games in the browser today, not yet, even if, for a while now, we've had at least the equivalent of Xbox 360 performance in the browser, and now even more than that, which already enables some interesting scenarios. The main limitation we have is not the GPU, to be honest, it's the CPU side, because JavaScript is still mono-threaded most of the time. So it's going to limit a lot of what you can do regarding physics, for instance, because that's all on the CPU side. Compared to native, for sure we have less raw power, but that's why I think that by using simple models to do AR or even VR (a lot of people are doing VR using low-poly models), you can already do a lot of stuff. And compared to native, you can target many more devices.

Awesome, thanks for that. Another one on the Babylon.js and WebXR toolkit, and I know Babylon.js just got a major update only last week, so it might have been included, but can the toolkit detect vertical surfaces? Vertical surfaces, that's a good question. I would need to check with my team, because I'm not sure. I know that we have access to spatial mapping to detect flat surfaces, like horizontal ones; I will need to check for vertical ones. So I'm not sure. What we shipped exactly with the toolkit is a set of UI building blocks you can use, coming from the HoloLens, because we've been working closely with the people building the HoloLens to get inspired by their UX paradigms and put that into Babylon.js. But I would need to check the vertical one. Feel free to ask the question on our forum, the Babylon.js forum; you can find it from babylonjs.com. Some people who are smarter than I am will be able to answer that question. So thanks, Maier, for asking. Really good; I think Babylon might even be using the Discussions feature on GitHub too, so yeah, go ask them, they should be able to tell you.
And also go check out their latest version, version 5.0, which got released only, yeah, seven days ago now; I'm looking at my calendar, it's ticked over midnight. They've got a ton of new changes, so go check it out, it's really cool. So our next question is: what are the key differentiators of Babylon.js regarding WebXR support? Our main differentiator, I would say, is something we've been working on for a long time. It's not because I've got too big an ego, but something we really wanted to work on was making it super simple to use. Once you're loading, for instance, a full scene, like I've shown with the Back to the Future DeLorean scene, you have a single line of code to enable a ready-to-use, out-of-the-box experience. We name it the WebXR Experience Helper, which means that by default you can teleport and have the controller models loaded from our CDN. So it's really, really simple to use if you're not a 3D expert: we manage everything for you. And if you are a more advanced developer, what we're also going to provide is a set of controls, like the Mixed Reality Toolkit ones, and advanced controls that work using mouse, touch, and your VR controllers. So we are really trying to simplify your onboarding experience as much as we can, and everything is included: we've got Web Audio with spatial audio included, and a physics engine that can be enabled if you want to take care of that in VR. So this is our main differentiator: simplicity and lots of stuff included out of the box.

Awesome. And where can we find the awesome things that you've built on Babylon.js and WebXR? Do you have a GitHub repo or something we can check out for your projects? Yeah, I will publish soon, this week or probably the beginning of next week, on my blog, davrous.com, all the samples I've been working on. To be honest, I've been working on them in my spare time, so I now need to clean up the code a little bit to share it with developers, and I will put that on my blog, maybe with the video of this talk later on.

Awesome. And I know there are a few people in the chat whose questions we might not get to because we're coming up on time, but if people have questions, or they're watching the recording and want to get in touch with you, what's the best way for them to reach you and ask questions after this summit? The best way is to contact me on Twitter, for sure: davrous on Twitter. I'm usually super responsive there, so feel free to ask me any question, either publicly or privately, and I will try to help you. Awesome. So thank you so much for joining us today, David. I really enjoyed it, I love the movie references, and it's really cool to see what's happening in the AR and VR space. So thanks for being here. Thank you so much. See you.
