Building Fun Experiments with WebXR & Babylon.js


During this session, we'll see a couple of demos of what you can do using WebXR with Babylon.js. From VR audio experiments to casual gaming in VR on an arcade machine, up to more serious usage creating new ways of collaboration using either AR or VR, you should get a pretty good understanding of what you can do today.


Check the article as well to see the full content, including code samples.

33 min
07 Apr, 2022

Video Summary and Transcription

This Talk explores the use of Babylon.js and WebXR to create immersive VR and AR experiences on the web. It showcases various demos, including transforming a 2D game into a 3D and VR experience, VR music composition, AR demos, and exploring a virtual museum. The speaker emphasizes the potential of web development in the metaverse and mentions the use of WebXR in Microsoft products. The limitations of WebXR on Safari iOS are discussed, along with the simplicity and features of Babylon.js. Contact information is provided for further inquiries.

Available in Spanish

1. Introduction to WebXR and Babylon.js

Short description:

Hello, I'm David from Microsoft. Today, I'll show you WebXR experiments using Babylon.js, a 3D engine running on WebGL and WebGPU. WebXR enables both VR and AR experiences on different devices. It's a replacement for WebVR and is currently supported in Chromium browsers. Babylon.js is an open-source framework that supports WebXR and provides great features like auto completion and spatialization in VR. It's widely used in Microsoft applications.

Hello, my name is David. I'm working at Microsoft, and today I'm going to show you some WebXR experiments using Babylon.js. It's going to be fun.

I'm going to show you a lot of different samples, so let's have a look together at what you can do today with WebXR. I'm working in the developer division of Microsoft, the people in charge of GitHub, VS Code, Visual Studio, and some Azure features too. But today I'm going to talk about WebGL and WebXR.

So feel free to follow me on Twitter if you've got questions after the talk. So what is WebXR? You've probably heard about it, but let's briefly define this technology before the demos. WebXR is a web API, obviously, that can enable both virtual reality, using headsets like the Oculus Quest 2, HTC Vive, or Valve Index, and AR, using either a smartphone running Android or the HoloLens 2, which supports the AR features of WebXR. It's a replacement for WebVR, if you've heard about that before: WebVR only did VR, and there's been some refactoring of the API. It's currently supported only in Chromium browsers, so Microsoft Edge and Chrome of course, but also Samsung Internet on Android and, obviously, Chrome on Android. So that covers the various hardware supported.

And today we're going to talk about Babylon.js. Babylon.js is a 3D engine I started working on a couple of years ago with a friend of mine, so it's my little baby in a way. If you'd like to start with Babylon.js, there's a link below, but we're going to see that just after this. If you don't know what Babylon.js is yet, it's an open-source 3D engine running on top of WebGL, but also WebGPU today; we're supporting WebGPU. We're using Web Audio, which is great for spatialization in VR, and we support WebXR out of the box. It's an open-source framework using the Apache 2 license, with a lot of contributions from the community; more than half of the source code now comes from the community. It's completely written in TypeScript, which enables great features like auto completion and good code quality, generally thanks to types. You also get a lot of controls that work both in 2D with the mouse and in VR, thanks to pointer events. Soon we'll have MRTK, the Mixed Reality Toolkit, coming to Babylon.js version 5, which will enable even more controls for productivity. It's being used today by a lot of first-party applications at Microsoft, such as PowerPoint, like the 3D model you have on your screen.

2. WebXR Demos and VR Experience

Short description:

SharePoint, Teams. Also, xCloud, the cloud gaming service of Xbox, is using Babylon.js for the touch controls. Later on, I will publish a blog post on my blog to share the source code of all the demos. First, we're going to use the super simple, yet super powerful feature named the XR Experience, using a single line of code. If you don't have a VR headset, you can install an extension in Chrome or Edge named the WebXR emulator extension. With this extension enabled, you will have a button asking you to switch into an immersive mode. So very useful: one line of code and you really have an experience running out of the box, and you can exit the immersive mode to go back to the classical mode, I would say. But let's have a look at how it works in a real headset, like the Valve Index in my case. So let's switch back to my PowerPoint presentation, and this time it will be a video of me inside this room using it. You see that in the headset, obviously, I've got a stereoscopic rendering. I'm using the VR view of SteamVR on Windows, and you see the immersive experience. You have to try it one day using any kind of headset compatible with WebXR, like the Valve Index on desktop but also the Oculus Quest 2, and you get a super cool immersive experience with only one line of code once again; we manage all the teleportation, the display of the controllers, and so on for you. So you see, super easy to enable.

SharePoint, Teams. Also, xCloud, the cloud gaming service of Xbox, is using Babylon.js for the touch controls. It's being used by Adobe, Sony, the US Army, and Ubisoft, among other partners.

So let's jump into some fun demos. Later on, I will publish a blog post on my blog to share the source code of all the demos, but we only have 20 minutes, so let's just see what you can do with WebXR if you don't know it yet.

First, we're going to use the super simple, yet super powerful feature named the XR Experience, enabled with a single line of code; in our case it will enable a VR experience. So I'm going to jump into Chrome for now. You see this scene, which simply loads a 3D scene of Hill Valley from Back to the Future, and once the scene has been loaded, in this callback, what I'm going to ask Babylon.js, using this single line of code, is: please turn this existing experience into a VR-compatible experience. You need to provide only one parameter, which is what should act as the floor to support the teleportation target we're going to see in the video just after that.
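As a rough sketch (not the exact demo code — the scene and floor mesh are stand-ins of mine), the "one line" in question is Babylon.js's default XR experience helper:

```javascript
// Hedged sketch of the Babylon.js one-line WebXR helper.
// Assumes you already have a loaded Babylon.js scene; "ground" is
// whichever mesh should act as the floor for teleportation targets.
async function enableVRExperience(scene, ground) {
  // createDefaultXRExperienceAsync wires up the enter/exit button,
  // controller models and teleportation in a single call.
  const xrHelper = await scene.createDefaultXRExperienceAsync({
    floorMeshes: [ground],
  });
  return xrHelper;
}
```

Everything else — session management, controller rendering, the teleport arc — comes configured by default.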

If you don't have a VR headset, you can install an extension in Chrome or Edge named the WebXR emulator extension; you can search for it in your favorite search engine. With this extension enabled, you will have this button asking you to switch into an immersive mode. So I'm going to click on this button, and it's going to render for the left eye and right eye, for stereoscopic rendering. To be able to simulate a VR headset, you press the F12 key, and you will have a new WebXR property there. So let's snap that on the right and this one on the left. And let's try, for instance, to move the headset. You see I can move on the various sides to simulate the fact that you're moving, but also, obviously, the controllers, and you can switch from the Quest, the default, to an HTC Vive. You see that Babylon.js is getting the right model out of the box, downloading it from our CDN. So let's switch back to an Oculus Quest. Very useful: one line of code and you really have an experience running out of the box, and you can exit the immersive mode to go back to the classical mode, I would say. Super easy to have a first experience: just load a scene, add this line of code, and it will work. But let's have a look at how it works in a real headset, like the Valve Index in my case. So let's switch back to my PowerPoint presentation, and this time it will be a video of me inside this room using it. You see that we can point at the floor and teleport, and choose the orientation you will face when you arrive at the target. So I decided to face the door of the DeLorean there. Then I can move away and face the front of the car. And you see that in the headset, obviously, I've got a stereoscopic rendering.
I'm using the VR view of SteamVR on Windows, and you see the immersive experience. You have to try it one day using any kind of headset compatible with WebXR, like the Valve Index on desktop but also the Oculus Quest 2, and you get a super cool immersive experience with only one line of code once again; we manage all the teleportation, the display of the controllers, and so on for you. So you see, super easy to enable.

3. Transforming a 2D Game into 3D and VR

Short description:

I transformed an existing 2D game into a 3D and VR experience using BabylonJS. By projecting a 2D canvas into 3D, I was able to display the game in VR. With just a few lines of code, I replicated the experience of playing the game on an arcade machine in VR.

So the next demo: you see that I've got an arcade machine just behind me. I'm a huge fan of video gaming in general, but also of arcade machines. What I wanted to do is transform an existing 2D game I had, which was using EaselJS, from CreateJS if you know it, into a 3D experience and then, obviously, into a VR experience. So let's have a look at the source code of that one. It was really a simple experience. Let's open it. You see that you can move from right to left and jump; it's a platform game, nothing special about that. And this is a 2D canvas game. So what I wanted to do is transform it into 3D. This is what I've done there: I took a free model of an arcade machine, on Sketchfab I think, as far as I remember. I put a 2D plane on top of the arcade machine to act as my 2D canvas. What I'm doing, basically, is taking this 2D canvas and projecting it into 3D. Babylon.js supports dynamic textures, which means that you can take a 2D canvas and project it into 3D, which is super important if you'd like to move into VR. You need to do everything inside the 3D canvas, because classical HTML elements cannot be rendered in the immersive mode in the headset; everything has to go through the WebGL canvas. So this is what I'm doing, basically: I'm loading my game from the 2D version and displaying it in 3D, which means that now I can play the game this way too, but in 3D this time, which is fun, but not as fun as doing it in VR. So let's try this same experience now in VR. For that, I'm going to jump back to my presentation and show you the full experience, so you know how it works for me. You can see that I'm currently playing using the Xbox controller, like in 2D, and I'm then going to press the headset button at the bottom right. It's then going to switch, thanks to SteamVR, into the immersive mode.
And obviously I need to put the headset on to be able to do it. In the same way as during the Back to the Future demo, I was able to view that in 3D. This time, what I've done is map the thumbstick on the controller to be able to move. And the experience is really cool. I wanted to really replicate what I was able to do on my arcade machine, but in VR. So you can see that in a couple of lines of code, once again, I was able to transform a 2D game into 3D and then, thanks to Babylon.js, into VR this time.
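The canvas-projection trick described above can be sketched roughly like this (a hedged version of mine, not the demo's exact code; `screenMesh` and `gameCanvas` are hypothetical names for the arcade screen plane and the original 2D game's canvas):

```javascript
// Project an existing 2D canvas game onto a mesh via a DynamicTexture.
// Assumes BABYLON is available and the scene, screen plane and game
// canvas already exist; the 512x512 size is an arbitrary choice.
function projectGameOnMesh(scene, screenMesh, gameCanvas) {
  const texture = new BABYLON.DynamicTexture(
    "gameScreen", { width: 512, height: 512 }, scene);
  const material = new BABYLON.StandardMaterial("gameScreenMat", scene);
  material.diffuseTexture = texture;
  screenMesh.material = material;

  // Copy the 2D game's canvas into the texture on every frame, so
  // the in-world screen stays in sync with the running game.
  scene.onBeforeRenderObservable.add(() => {
    const ctx = texture.getContext();
    ctx.drawImage(gameCanvas, 0, 0, 512, 512);
    texture.update();
  });
}
```

Because the game ends up inside the WebGL canvas, it survives the switch to the immersive WebXR session, which plain DOM elements would not.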

4. VR Music Composition and 360 Piano

Short description:

You will have access to the source code later on. I will have to map the arcade machine controller to move it with my hand. I'm going to show you a 360 piano demo. I'm using Web Audio to analyze the sound wave in real time. I can play music using the various notes. I can have a 360 piano experience in the Oculus Quest 2. I can also do it in the HoloLens 2 with a plain black background. The black texture is removed, and tracking is done by Babylon.js. I can simulate a pointer event using my hand. It's super easy to use different hardware with the same web technology.

So pretty fun. You will have access to the source code later on if you'd like to try it at home, but maybe in the next version I will have to map the physical arcade machine controls so I can move the joystick with my hand, because we support hand tracking out of the box.

The next one is also interesting. I'm really a huge fan of music; I do some music composition myself, and I wanted to mix my passions, VR and music. So this time I'm going to show you this demo. It's a 360 piano: if I press the various notes, I can play some music with it. And I'm using Web Audio to analyze the sound wave in real time and display it on the ribbon. So it's fun.
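A hedged sketch of the kind of real-time waveform analysis described here, using a Web Audio `AnalyserNode` (the ribbon-mapping helper and all names are mine, not the demo's):

```javascript
// Map raw AnalyserNode byte samples (0..255, where 128 is silence)
// to signed heights suitable for driving a visual ribbon.
function toRibbonHeights(samples, maxHeight) {
  return Array.from(samples, (v) => ((v - 128) / 128) * maxHeight);
}

// Browser-only: attach an AnalyserNode to an audio source and feed
// the current waveform to a callback on every animation frame.
function attachWaveVisualizer(audioContext, sourceNode, onWave) {
  const analyser = audioContext.createAnalyser();
  analyser.fftSize = 1024;
  sourceNode.connect(analyser);
  const buffer = new Uint8Array(analyser.fftSize);
  const tick = () => {
    analyser.getByteTimeDomainData(buffer); // time-domain waveform
    onWave(toRibbonHeights(buffer, 1));
    requestAnimationFrame(tick);
  };
  tick();
}
```

The `onWave` callback would then update the ribbon mesh's vertices each frame.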

And I'm going to be at the center of this experience when I switch into VR this time. Obviously I've done that using my headset; I've done the recording of it using the Oculus Quest 2. So let's have a look. I'm inside the Oculus Quest 2, as you can see, and I'm going, once again, to press the magic button. I am now at the center of the experience, as we've seen in the browser. And maybe you can recognize this super famous music by John Williams. You see the controllers are displayed in 3D, and I can now have a 360 piano experience.

I wanted to do the same but using the HoloLens 2 this time. The demo is quite similar. What I've done this time is, you see I'm at the center of the piano and I can press the various notes once again, but I've replaced all the background with a plain black color. When you use the HoloLens 2 and you have a black background, like for the skydome, the HoloLens is going to replace the black texture with a transparent one, so I will be able to see through it if I like. Let's have a look at how it works inside the HoloLens. I'm first displaying Microsoft Edge in 2D, even if I'm seeing it as a hologram, and I'm going to press the immersive mode button once again. This time I will be surrounded by my virtual piano, and you see that the black texture has been removed, and hand tracking is done by Babylon.js out of the box too. This means that I can now use my hand to tap and simulate a pointer event. We could have done something more elaborate, like taking a virtual object and knocking on the various keys, but you see how it works: as long as you have a pointer event, it will be simulated in VR for you. So you see, super easy once again: the Valve Index, Oculus Quest 2, and HoloLens 2, using the same web technology that works across all the hardware.
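Babylon.js's unified pointer events are what make the tap-to-play behavior hardware-agnostic; here is a minimal sketch (the `metadata.note` convention is an assumption of mine, not the demo's actual code):

```javascript
// One pointer handler covers mouse clicks, touch, XR controller
// trigger presses and HoloLens hand taps alike. Assumes each piano
// key mesh carries its note name in mesh.metadata.note.
function wirePianoKeys(scene, playNote) {
  scene.onPointerObservable.add((pointerInfo) => {
    if (pointerInfo.type !== BABYLON.PointerEventTypes.POINTERDOWN) return;
    const pick = pointerInfo.pickInfo;
    if (pick && pick.hit && pick.pickedMesh && pick.pickedMesh.metadata) {
      playNote(pick.pickedMesh.metadata.note); // e.g. "C4"
    }
  });
}
```

The same handler keeps working unchanged when the scene enters an immersive session, which is why the 2D and XR versions of the piano share their input code.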

5. AR Demo and Virtual Calls

Short description:

Next is an AR demo where a virtual object is placed in a room, creating an augmented experience. You can see particles and objects interacting with the real world. To try it yourself, visit aka.ms/orb-demo-AR. Augmented reality scenarios have applications in e-commerce, allowing users to visualize virtual objects in their real environment. Check out the Babylon Medium blog for more information. Additionally, we explore the concept of a virtual call within a VR scene using Azure Communication Services and Babylon.js.

The next one is super fun: it's an AR demo. We've seen VR up to now; AR is also supported out of the box in WebXR and Babylon.js. This time I'm going to put a virtual object into this room, and you will see a kind of augmented experience being displayed. You can maybe already see some particles being displayed, and someone flying around; this is not normal in the classical world. And when I move slightly to the back of this room, you will see a kind of magic orb being displayed, sending magic particles into the real world. This one has been done using a Samsung Galaxy S8 Android phone, which is not super recent, though you do need a fairly recent phone to make it work. And you have this kind of experience; it's super fun. If you'd like to try it at home, you can simply go to aka.ms/orb-demo-AR, try it yourself, and have a look at the source code if you've got a recent Android phone.
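Switching the same helper from VR to AR is mostly a matter of requesting an `immersive-ar` session; a hedged sketch under that assumption:

```javascript
// Request an AR session instead of the default "immersive-vr" one.
// On a compatible Android phone this renders the Babylon.js scene
// on top of the camera feed instead of a fully virtual environment.
async function enableARExperience(scene) {
  const xrHelper = await scene.createDefaultXRExperienceAsync({
    uiOptions: {
      sessionMode: "immersive-ar",
      referenceSpaceType: "local-floor",
    },
  });
  return xrHelper;
}
```

The orb, particles, and any other meshes in the scene then appear anchored in the real room.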

So we've seen a lot of fun demos, but obviously we're using this kind of technology at Microsoft for more serious scenarios, even if for me gaming is also a serious business. First, e-commerce is really interested in augmented reality scenarios, for sure. We have an article on the Babylon.js blog, which you can see there, that shows you the type of experience you can build, and we have access to this demo there. If you'd like to try it at home too, you can simply go there and load it. You need, once again, a WebXR-compatible phone, and you really get this kind of experience. I can show you that in the demo: you can position a virtual chair in your real living room, potentially change the material and the color, and then get a sense of how the object will look in your home. So we can see plenty of useful scenarios, obviously for e-commerce, but not only for e-commerce, as we're going to see just after that. So feel free to go to the Babylon blog on Medium and check it out on your side.

Once we've done this fun stuff, we can go a step further. We all talk about the metaverse, so something I wanted to do is enable a virtual call inside a VR scene, to be able to call someone in Microsoft Teams, which is our equivalent of Zoom or Slack, if you don't know it; we can do video calls with it. For that, I'm using Azure Communication Services, which is kind of the underlying building blocks of Teams that you can use in your own web app to establish a call with Teams. There are various SDKs, like the JavaScript one that I'm going to use and mix with Babylon.js. So let's have a look at the output of this demo.
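The calling side can be sketched with the Azure Communication Services calling SDK (`@azure/communication-calling`). This is a hedged outline, not the demo's code: the access token and Teams user id are placeholders you would have to supply, and the SDK classes are passed in as parameters rather than imported, just to keep the sketch self-contained.

```javascript
// Sketch of placing a call to a Teams user from the browser with
// Azure Communication Services. CallClient and
// AzureCommunicationTokenCredential come from the ACS SDKs;
// userAccessToken and teamsUserId are placeholders.
async function startTeamsCall(
  CallClient, AzureCommunicationTokenCredential,
  userAccessToken, teamsUserId) {
  const callClient = new CallClient();
  const credential = new AzureCommunicationTokenCredential(userAccessToken);
  const callAgent = await callClient.createCallAgent(credential);
  // Place an outgoing call addressed by Teams user id.
  return callAgent.startCall([{ microsoftTeamsUserId: teamsUserId }]);
}
```

In the museum demo, the remote video stream would then be rendered onto a mesh in the Babylon.js scene, much like the arcade screen earlier.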

6. Exploring a Virtual Museum and Making Calls in VR

Short description:

Hello, I am in a virtual museum inside a metaverse, running on WebXR and Babylon.js. Let's imagine I'm visiting a museum with many paintings. I could get lost, but I can use the same teleport target. I can also make calls using Azure Communication Services to connect with someone in Microsoft Teams. I called Christina, my girlfriend, to help me navigate the museum. We found a beautiful painting. Should we buy it?

So hello, I am in this virtual museum inside a metaverse, running on top of WebXR and Babylon.js. Let's imagine I'd like to visit this museum with a lot of paintings, but I could quickly get lost. You see I'm able to use the same teleport target we've seen at the beginning; really the same type of code as before, nothing special with Teams so far.

Oh, and let's be cool and call someone over Teams. What I'm going to do is use Azure Communication Services to call someone inside Microsoft Teams. It's a regular Microsoft Teams, nothing special about this version. I'm going to press the call button and establish a connection with this beautiful woman. Normally I should be in the lobby on the other side, and she should be able to accept me. And as soon as she's accepted me... So hello, Christina. Christina is my girlfriend; she helped me do this demonstration. Christina, can you please help me find my way inside this museum? Because I'm a little bit lost. Yes, of course. Could you please go straight ahead? And go closer to the wall on the right in the big room. Do you see it? Okay, I'm going to take you with me on my controller; it will be easier. So I'm going to take you there. So is it on my right? Oh, yes. Look at the picture on your right, the first one. It looks so beautiful. This one. Yeah. So what do you think? Should we buy it? Do you like it too? It depends on the number of zeros after the first digit of the price.

7. Using WebXR in Microsoft Products

Short description:

We are using WebXR in other Microsoft products like SharePoint Spaces and Power Apps. It enables interactive virtual visits and media experiences. Try our WebXR experiments and controls, including 2D and 3D options. Babylon.js version 5 and a new Mixed Reality Toolkit will be available soon. Visit davrous.com for more information.

Don't worry, it's not the price, it's the... when it was made. So I don't know about the price; maybe we should discuss that later on. But it's great, we agreed on this one. As you can see, I'm going to go back there to put you back in... So you see, we're able to do a full immersive call in the browser this time, calling my girlfriend who's in Teams. You can think about these kinds of scenarios to do a virtual visit when you're buying a house and you'd like an interactive virtual tour. There are a lot of interesting scenarios using those technologies.

We are also using it in other Microsoft products, like SharePoint, which uses it in a product named SharePoint Spaces. We are also using it in Power Apps: we have controls to enable the same kind of scenario as the chair in the living room, but with bigger objects, to check whether a large object will fit inside your buildings. It's using WebXR once again. And also to do some kinds of interactive media experiences, like things we often see in AR; this can be done using WebXR as well. If you'd like to go further than that, you should try our WebXR experiments and controls. We also have 2D and 3D controls that work either in 2D or in VR, because we are using pointer events to work in both modes. Babylon.js version 5 should be out soon, or will be by the time this video is released, and we will have a new Mixed Reality Toolkit available. And if you'd like to play with those demos yourself at home, if you've got the hardware, go to davrous.com, which is my blog; I will publish a full article about this conference. Thank you so much. And if you've got questions, ping me on Twitter and ask me after the talk. See you. Thank you. Yeah. Thanks, David, for that fantastic first talk. I'm guessing you're a bit of a movie buff with all the Back to the Future and Star Wars references. Yes, I'm a huge fan of such movies.

8. Web Development VR and the Metaverse

Short description:

I love movies and sci-fi movies. I love listening to Hans Zimmer. Most people are interested in the virtual reality side of gaming. WebXR now includes AR, and many people are interested in AR scenarios like e-commerce. I'm a true believer that the web could be the next frontier for gaming in general and VR. I'm currently involved in many projects at Microsoft around the metaverse topic using web technologies. Web development for VR is moving towards simplicity and is now easier to deploy thanks to frameworks like BabylonJS, A-Frame, and Three.js. It's now in official products and is starting to be real. The metaverse is a buzzword, and the web could be a great implementation of this technology.

I love movies and sci-fi movies, I guess, like a lot of developers, and also movie scores. I love listening to Hans Zimmer, for instance. I love it. I love it.

Now, that's not one of the official questions. Before we jump into the questions that are coming through from the chat, we'll have a look at the poll results, which are up here on the screen for everyone to see. Because see, most people are interested in the virtual reality side of gaming as someone who works quite a lot in the games industry. I've worked quite a lot with VR as well. These results don't really surprise me. So what are your thoughts on this poll results, David?

I'm surprised too, because most of the time the people I meet who are doing WebXR — before, it was WebVR, which was only VR, but now WebXR includes AR — are usually more interested in AR scenarios like e-commerce, because it's easier to get into people's pockets: you just need a phone rather than wearing a headset. But maybe we've got a bias because the community attending this event is also interested in gaming. Still, it's great to see that a lot of people are interested in doing gaming in VR. I'm a true believer that the web could be the next frontier for gaming in general, and VR too, so I would love to see more and more gaming experiments in VR using web technologies; that would be awesome. And metaverse, of course: because it is such a buzzword, it's super popular, so I'm not surprised by metaverse either. To be honest, I'm currently involved in many projects at Microsoft around the metaverse topic using web technologies; a lot of people are working on that to push and create new products. So great to see that, but a little bit surprised, like yourself, Michèle. And what do you think — where is web development, and web development for VR, moving towards? Is it still kind of a gimmick, or what are you starting to see experimented with and tried in WebVR?

Well, it used to be something that was not really useful and was complex to deploy. It's simpler now, thanks to many frameworks: BabylonJS, A-Frame, Three.js — many great frameworks you have on the market. So it's now really easy to set up an experience, and people are using it for real. At the end of my talk I was referring to Microsoft products, of course, because I know them, because we've been working with them. But it's now in official products, using these kinds of technologies to place objects in the real world. So it starts to be real, for sure. And the metaverse, from my point of view — if it works, because nobody knows if it's really going to work and if people are going to be interested in it — the web could also be a great implementation of this technology. So maybe a lot of people will be more interested in doing VR.

QnA

WebXR and Babylon.js on Safari iOS

Short description:

WebXR on Safari iOS is a question for Apple. Rumors suggest Apple may implement WebXR, which would allow a single code base for all devices. Currently, you can experiment with web technology on the iPhone using a web component that translates models to Apple's USD format. WebXR performance in the browser is not as powerful as native counterparts, but it enables interesting scenarios. The main limitation is the CPU, as JavaScript is mostly monothreaded. However, using simple models, you can achieve a lot in AR and VR. The Babylon.js WebXR toolkit may not detect vertical surfaces, but it offers UI building blocks inspired by HoloLens. For more information, visit the Babylon.js forum.

Or I would also say mixed reality — really XR — using the metaverse. So I do hope we are at the beginning of a new cycle, but it's already being deployed in production, to be honest, for AR at least. I'm assuming a lot of people are going to be developing some of those VR and AR experiences, so good luck to you if you are.

So I've got a question from Melody Connor, who seems to be an iOS developer, who is asking: when do you think WebXR will make it to Safari on iOS — anytime soon? So it's maybe more a question for Apple, and I would love Apple to embrace WebXR. There have been some rumors; if you follow some people who are influential in the web space or PWA, some have seen signs that WebXR was maybe about to be implemented by Apple. I don't have any official information about that. We have rumors about Apple maybe going to ship, you know, glasses. I would love Apple to embrace this technology, because it would mean for sure that we would be able to have a single code base able to target all devices, which is also the promise of web technologies. Before that, you can already experiment a little bit on the iPhone using more or less web technology: there is a web component that can translate your model to Apple's USDZ format and do these kinds of AR experiences. I've been doing experiments myself, but it's not using WebXR; you are forced today to use a native implementation. And a way to influence that: the more people use WebXR, the more pressure on Apple to consider a WebXR implementation, for sure, because they don't want to miss those great experiences on their own devices. So this is a way for you to influence that too. So I guess we need to get Apple on the next one and we can ask them that question.

So we've got another WebXR question from KB, who's asking: how does WebXR compare with its native or desktop-based counterparts in terms of performance? So we always have this question, to be honest, even outside of WebXR, with WebGL and now WebGPU. Of course, the performance of what you can do in a browser today is less impressive than what you can do using a native stack like DirectX, OpenGL, or Metal on iOS. That's why you need to take that into account when you design your experience. You won't be able to build AAA games in the browser today, not yet, even if the browser is now the equivalent of at least an Xbox 360 in performance, and even more than that, which means it already enables some interesting scenarios. The main limitation we have is not the GPU, to be honest, but the CPU side, because JavaScript is still mostly single-threaded, which limits a lot of what you can do regarding physics, for instance, because that's all on the CPU side. So compared to native, for sure, we have less powerful features. But that's why I think that by using simple models for AR, or even VR — a lot of people are doing VR using low-poly models — you can already do a lot of stuff, and compared to native you can target many more devices. Awesome, thanks for that.

So another one on the Babylon.js and WebXR toolkit — and I know Babylon.js just got a major update only last week, so it might have been included — but can the toolkit detect vertical surfaces? Vertical surfaces, so this is a good question. I would need to check with my team, because I'm not sure. I know that we have access to spatial mapping to get flat surfaces, like horizontal ones; I would need to check how to do vertical ones. I'm not sure exactly what we shipped: the toolkit is a set of UI building blocks coming from the HoloLens, because we've been working closely with the people building the HoloLens to get inspired by their UX paradigms and put that into Babylon. But I would need to check the vertical one, so feel free to ask the question on our forum, the Babylon.js forum; you can find it on babylonjs.com.

Babylon.js, WebXR Support, and Contact Information

Short description:

Some people that would be smarter than I am would be able to answer that question. Thanks for asking that question. Babylon.js just released version 5.0 with a ton of new changes. The key differentiator of Babylon.js in WebXR support is its simplicity and the inclusion of various features out of the box. You can find the awesome things built on Babylon.js and WebXR on davrous.com. Feel free to contact me on Twitter, davrous, if you have any questions or want to get in touch. Thank you for joining us today!

Some people who are smarter than I am will be able to answer that question. Thanks for asking it. I think Babylon might even be using the Discussions feature on GitHub too, so yeah, go ask them; they should be able to tell you. And also go check out their latest version: version 5.0 got released only seven days ago now — I'm looking at my calendar; it's ticked over midnight. They've got a ton of new changes, so go check it out. It's really cool.

Our next question is, what are the key differentiators of Babylon.js regarding its WebXR support? So our main differentiator, I would say, is something I've been working on for a long time. It's not because I've got too big an ego, but this was something we really wanted to work on: making it super simple to use. Once you're loading, for instance, a full scene like I've shown, like the Back to the Future DeLorean scene, you have a single line of code to enable a ready-to-use, out-of-the-box experience. We name it the WebXR experience helper, which means that by default you can teleport, and you can have the models loaded from our CDN. So it's really, really simple to use. If you're not a 3D expert, we are managing everything for you. If you are a more advanced developer, what we are also going to provide you is a set of controls, like the Mixed Reality Toolkit ones, and advanced controls that work with mouse, touch, and your VR controllers. So we are really trying to simplify your onboarding experience as much as we can. And everything is inside: we've got Web Audio with spatial audio included, and physics that can be enabled in your scene, which you can take advantage of in VR. So this is our main differentiator: simplicity and lots of stuff included out of the box.
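To make the "single line of code" claim concrete, here is a minimal sketch of the WebXR experience helper described above. The scene setup around it is boilerplate I've added for context; the one call that matters is `createDefaultXRExperienceAsync`, which wires up session management, controllers, and teleportation on the listed floor meshes.

```javascript
// Minimal sketch, assuming an ES module in a browser with Babylon.js loaded
// (e.g. via <script src="https://cdn.babylonjs.com/babylon.js">) and a
// <canvas> element named `canvas` on the page.
const engine = new BABYLON.Engine(canvas, true);
const scene = new BABYLON.Scene(engine);

// A camera and a light so the scene renders outside of XR too.
const camera = new BABYLON.FreeCamera("camera", new BABYLON.Vector3(0, 1.6, -3), scene);
camera.attachControl(canvas, true);
new BABYLON.HemisphericLight("light", new BABYLON.Vector3(0, 1, 0), scene);

// A ground mesh to teleport on.
const ground = BABYLON.MeshBuilder.CreateGround("ground", { width: 10, height: 10 }, scene);

// The one line from the talk: the default WebXR experience helper.
const xr = await scene.createDefaultXRExperienceAsync({
  floorMeshes: [ground],
});

engine.runRenderLoop(() => scene.render());
```

On an XR-capable browser this also adds the "enter VR" button overlay for you, which is the out-of-the-box experience the answer refers to.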

Awesome, and where can we find the awesome things that you've built on Babylon.js and WebXR? Do you have a GitHub repo or something we can check out for your projects? Yeah, so I will publish, this week or probably the beginning of next week, on my blog, davrous.com, all the samples I've been working on. To be honest, I've been working on them in my spare chunks of time, so I now need to clean up the code a little bit to share it with developers, and I will put it on my blog, maybe together with the video of this conference later on.

Awesome, and I know there are a few people in the chat whose questions we might not have gotten to because we're coming up on time, but if people have questions, or they're watching the recording and they want to get in touch with you, what's the best way for them to reach you and ask questions after this summit? The best way is to contact me on Twitter, for sure, so davrous on Twitter. I'm usually super responsive there, so feel free to ask me any question, either publicly or privately, and I will try to help you with your questions. Awesome, so thank you so much for joining us today, David. I really enjoyed it. I love the movie references, and it's really cool to see what's happening in the AR and VR space, so thanks for being here. Thank you so much.
