Building the Interconnected, Traversable Metaverse


Largely based on Free Association in the Metaverse (Avaer @Exokit, M3), this session has Avaer demoing some of the ways that open standards enable open and free traversal of users and assets throughout the interconnected metaverse.

FAQ

Q: On what devices can these virtual worlds be experienced?
A: The project leverages the universality of web technologies, ensuring that the virtual worlds it creates are accessible on any device with a web browser, promoting inclusivity and wide accessibility.

Q: What is the main goal of the workshop?
A: The main goal is to demonstrate what's possible for open virtual worlds on the web, showcasing the integration and functionality achievable using browser-based technologies.

Q: How does the project respond to skepticism about the Metaverse?
A: It aims to address skepticism and controversy around the Metaverse by demonstrating the positive potential and benefits of open virtual worlds, trying to convince users that the Metaverse can be good.

Q: What technologies does the project build on?
A: The project uses browser-based technologies, open source tooling, and JavaScript for asset loading, and it integrates various APIs for enhanced functionality in creating virtual worlds and experiences.

Q: What is Exokit?
A: Exokit is a browser written by the speaker, designed to explore the capabilities of web technologies in running advanced applications like virtual worlds; it directly influenced the development of this Metaverse project.

Q: Can assets from external sources be brought into the world?
A: Yes, assets from different sources, including IPFS, GitHub, and even NFTs, can be integrated into the virtual world using standard web browser APIs and JavaScript, showcasing the flexibility and openness of the platform.

Q: What is Totem?
A: Totem is a system developed within the project that uses templating and intelligent code generation to load 3D assets from URLs, allowing easy asset integration and management in the virtual environment.

Q: What does future development look like?
A: Future development aims to add more advanced features like AI integration, multiplayer support, and continued improvement of asset interoperability, striving to create an immersive and dynamic virtual world experience.

Avaer Kazmer
103 min
12 Apr, 2022


Video Summary and Transcription

The workshop discussed the potential of the Metaverse and of open virtual worlds on the web. It highlighted the use of AI integration, open-source projects, and standard tools to build an open, forward-thinking Metaverse. The focus was on performance optimization, rendering, and asynchronous loading, as well as the integration of 3D NFTs and metadata. The workshop also explored procedural generation, avatar customization, and multiplayer synchronization, and emphasized the importance of interoperability and the potential of the web as the best gaming platform.

1. Introduction to the Metaverse

Short description:

Let's take a look! The main goal is to open your eyes to what's possible for open virtual worlds on the web. Another goal is to convince you that the Metaverse can be good. My personal goal is to learn something today. I've been making Metaverse adjacent technologies for eight years, constantly surprised by what's possible on the web. I wrote a browser called Exokit. We have the ingredients to import the metaverse using a web browser API. All you need is a translation engine and JavaScript code to load assets. We're making the world's first distributed video game using Totem. Everything in the game is an asset with a URL. In the future, we'll provide services like VRChat with AR and VR support and integration with KaleidoKit.

Let's take a look! Yeah, so another title for this room might have been Import the Metaverse. I have a few goals that I hope to accomplish here. The main one is just to open your eyes to what's possible for open virtual worlds on the web. A more challenging goal is to convince you that the Metaverse can be good; I know there's a lot of skepticism and controversy about that these days. And my personal goal is, I don't know, I hope I learn something today.

I wrote some script notes that I wanted to cover, but if anybody has questions about the crazy things I'm going to be talking about, please just chime in. I'd love to deep dive into whichever parts of this people find most interesting.

So, some background on me: I've been making Metaverse-adjacent technologies for about eight years now, and all of it has been on the web. I basically started on the web and I never left. And I've been constantly surprised by what I was able to do using just what's available in the browser and the open source tooling that's already out there, to make experiences, worlds, and even my own browsers.

I wrote a browser called Exokit. I was really surprised that all of this is possible on an open technology like the web, one that basically anybody can host and anybody can experience on any device. And it seemed like I was the only person doing that, every single time I would show off one of the cool new hacks that I made, whether that was the browser running on Magic Leap, or an immersive N64 emulator I wrote that basically let you play The Legend of Zelda on your HTC Vive, in the browser no less, using WebVR, as it was called at the time.

So I basically started hanging around all the people who thought, wow, this is really cool, I didn't know this was possible. And we started building up this technology toolkit of all these different things, releasing them open source from day one and working on them together. Eventually what it started to look like was that we had the ingredients to quote-unquote import the metaverse, meaning you can use a standard web browser API to load in assets from all over the web, from all sorts of places, including for example IPFS, GitHub, basically any website, or even things like NFTs.

And it turns out that this actually isn't that hard to do. All you really have to do is add a translation engine that can take a URL and write some JavaScript code that will be able to load that asset. So what you're looking at here, basically everything here, is just an import statement in JavaScript. This is a GLB file. This one I'm not sure whether it's a GLB file or procedurally generated. This is a JSON file that describes that pistol and references all its assets, and the character itself is a VRM file that we just imported into the world. The way this works is that we have a system on the back end that takes those URLs and uses templating and some intelligent code generation to give the front end a loader that produces the 3D version of that asset. That gets hooked in through some APIs to the rest of a quote-unquote game engine, which provides the services you'd need for something like this: you can walk around, you can interact with the environment, you can have things follow you, and you can have an inventory that you can equip on yourself.
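
To make that concrete, here's a minimal sketch of what such a URL-to-loader translation step could look like. This is an assumption-laden illustration, not Totem's actual API: the function name, the emitted module shape, and the use of three.js loaders are all invented for the example.

```js
// Hypothetical sketch of a URL-to-loader translation step (not Totem's real API).
// Given an asset URL, emit the source of an ES module whose default export
// loads that asset as a three.js object; the engine can import() the result.
function generateLoaderModule(url) {
  const ext = new URL(url).pathname.split('.').pop().toLowerCase();
  switch (ext) {
    case 'glb':
    case 'gltf':
    case 'vrm': // a VRM avatar is a GLB with extra rigging metadata
      return `
        import { GLTFLoader } from 'three/examples/jsm/loaders/GLTFLoader.js';
        export default async function load() {
          const gltf = await new GLTFLoader().loadAsync(${JSON.stringify(url)});
          return gltf.scene; // a real engine would also wire up VRM rigging
        }`;
    case 'json':
      // JSON manifests (like the pistol's) reference the actual mesh files,
      // so a real loader would recurse on the URLs inside the manifest.
      return `
        export default async function load() {
          return (await fetch(${JSON.stringify(url)})).json();
        }`;
    default:
      throw new Error(`no loader template for .${ext}`);
  }
}
```

The front end can then turn that generated source into a blob URL and dynamically `import()` it, which is how every asset ends up being just an import statement.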

So we basically started plugging all of this stuff together and realized that, hey, we're actually making possibly the world's first distributed video game, where all of these assets are just plugged together using a client that we all agree on using. The technology we use to make all these translations happen and plug into the game engine, we ended up calling Totem, because we're trying to develop a culture of creating art from the raw ingredients of the web. We're sticking things together; things are being imported and interacting with each other; we're stacking art together. We're creating these virtual totems, in a sense. Everything in the game is an asset with a URL, and that can even be a local file if you're hosting locally, which I am right now. It could be something on a GitHub repo, IPFS, or, like I mentioned, even NFTs if you just want to point it at an Ethereum address. And in the future, there are a lot of really crazy things that we'll be able to do once we get this baseline game working. One of them is providing simple services to recreate the software that a lot of people love, like VRChat, where you have your avatar, which can animate and has facial expressions, and you have mirrors. We'll soon have AR as well as VR support, along with integration with KaleidoKit.
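
As a rough illustration of that "everything is a URL" idea, asset references from those different sources might be normalized to something fetchable like this. The scheme names, the gateway choice, and the helper function are assumptions made for the sketch, not the project's actual resolution rules.

```js
// Hypothetical resolution of the asset sources mentioned above into plain
// HTTPS URLs that a browser can fetch. Scheme handling is illustrative only.
function resolveAssetUrl(url) {
  if (url.startsWith('ipfs://')) {
    // IPFS content is addressed by CID and fetched through a public gateway.
    return Promise.resolve('https://ipfs.io/ipfs/' + url.slice('ipfs://'.length));
  }
  if (url.startsWith('eth://')) {
    // An NFT reference: a real resolver would call tokenURI() on the
    // ERC-721 contract via an RPC provider and follow it to the asset.
    const [contract, tokenId] = url.slice('eth://'.length).split('/');
    return lookupTokenUri(contract, tokenId);
  }
  // http(s), localhost, GitHub raw URLs, etc. pass straight through.
  return Promise.resolve(url);
}

async function lookupTokenUri(contract, tokenId) {
  // Placeholder for the sketch; NFT resolution is not implemented here.
  throw new Error(`tokenURI lookup not implemented for ${contract}/${tokenId}`);
}
```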

2. Building Worlds and Creating Spatial Relationships

Short description:

This allows us to build worlds and traverse across them, creating spatial relationships. Scenes can be contained within each other, providing level of detail without background noise. The street is a directional representation of virtual worlds, with different content sections. Metadata defines game elements like quests. Our goal is to build the first distributed video game, using open-source technology. We use three.js, CryEngine, and PhysX for rendering. React is used for the UI. We support drag and drop and smooth animations. Our toolkit aims to make these elements work together seamlessly.

So that's basically a way you can use your webcam to v-tube your avatar here on the side. Another thing this allows is for us to build worlds that we can traverse across and share, in the sense that I have some way to identify where I am in this quote-unquote metaverse, and I can go over to your place, and we can have a spatial relationship between those two places.

So this is still actually kind of buggy, but I'm going to make it a little more manageable. Because we're loading everything through these structured import statements, one scene can be contained within another scene, and scenes become a first-class concept that can be, for example, captured, so you can draw previews of a scene and get level of detail for a distant asset without having to deal with any of the scene's background.

So right now, what's going to happen is: this is a place that we call the street. It's supposed to be a directional representation of virtual worlds, and each section of the street has different content. As I approach, what this should do is sideload that scene, take a 3D virtual screenshot of it, and give me a low-res, LOD'd version before I even go there. So, I actually have to turn off the fog there. But what that did is: all of that is JavaScript code that ran on the side and gave you a low level of detail. If you turned off the fog, you'd even see that it's fully textured.
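
A minimal three.js sketch of that kind of preview capture, under the assumption that the distant scene has already been sideloaded (this is illustrative, not the engine's actual LOD pipeline): render the scene once into an offscreen render target and show the captured texture on a billboard until the player actually travels there.

```js
import * as THREE from 'three';

// Illustrative preview capture: render a sideloaded scene once into an
// offscreen render target, then show the result on a billboard as cheap LOD.
function captureScenePreview(renderer, sideloadedScene, previewCamera) {
  const target = new THREE.WebGLRenderTarget(512, 512);
  renderer.setRenderTarget(target);
  renderer.render(sideloadedScene, previewCamera);
  renderer.setRenderTarget(null); // restore rendering to the main canvas

  // A textured plane standing in for the distant scene; the engine swaps
  // it for the real scene when the player gets close enough.
  return new THREE.Mesh(
    new THREE.PlaneGeometry(10, 10),
    new THREE.MeshBasicMaterial({ map: target.texture })
  );
}
```

The marching-cubes improvement mentioned next would replace the flat billboard with an actual low-poly textured mesh of the captured world.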

And what we're also going to do to improve that is use marching cubes to give you a full 3D mesh, with texture, of the virtual world that will be there when you go there. And of course, you can actually just go there, and now you're inside one of these virtual worlds. Another thing this allows us to do is use metadata to define the kinds of things you'd want in a video game, like quests. There's actually a quest in this world. It says, "Destroy all Reavers in the area," and there's a reward of some virtual asset. So if I can find it... I'm not sure where it is, and this might actually be broken, but we also have a pathfinding system to geolocate you in the virtual world so you can locate your objective. And then, if there are some Reavers, you can go and slay them and get your virtual silk, whatever that means.
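
As a hedged illustration, quest metadata like that could live alongside the scene's other data in its manifest. The field names below are invented for the example, not the project's real schema.

```js
// Hypothetical quest metadata in a scene manifest; the field names are
// invented for illustration and are not the project's actual schema.
const sceneManifest = {
  objects: [
    { url: 'https://example.com/reaver.glb', position: [12, 0, -4] },
  ],
  quests: [
    {
      name: 'Clear the Street',
      description: 'Destroy all Reavers in the area.',
      target: { kind: 'enemy', name: 'Reaver', count: 3 },
      reward: { kind: 'item', name: 'Silk', url: 'https://example.com/silk.glb' },
    },
  ],
};
```

Because the manifest is just data at a URL like everything else, the engine can read it to spawn enemies, feed the pathfinding objective marker, and grant the reward asset on completion.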

All of this is actually open source, and our goal is essentially to build the first distributed kind of video game. Anybody can fork this stuff, and one of my priorities right now is to plug in all the different virtual communities that have already started building on top of this. Technology-wise, in terms of the engine and renderer, a lot of this is stuff I built, and a lot of it is just plain open source code. For example, everything is three.js. There's a scene graph; there's really nothing crazy about what we're doing, other than that we use Unreal Engine's bloom shaders and SSAO. Sorry, not Unreal, this one's CryEngine. But that's what gives worlds this really nice misty effect, if you define the render settings properly. We use PhysX, basically the same PhysX engine everybody's using, including Unity; it's compiled to WebAssembly, and we load it that way. For the UI, we're just using React, although there's a lot of WebGL, for example, to render these kinds of panels and these different shaders; it's all just React code under the hood. The avatar IK stuff is an open source Unity project that we ported over to JavaScript. That's how you get all the different smooth animations and the ability to animate any character regardless of what asset it is. In fact, let me show you some retargeting of animations. We're going to change characters. This uses the same exact animations, but the character still works just the same. We also support drag and drop in this game. If you have your own virtual assets, you can drag them in; that actually gets uploaded to a server and then downloaded back again. If you want to drop them into the world, you can, and if it's an avatar, you can wear it. Essentially, we're trying to build the world's best toolkit that we know of to make these kinds of things work together in the context of a video game.
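
For instance, the drag-and-drop import could be wired up roughly like this, using standard DOM APIs and three.js's GLTFLoader. This is a minimal sketch: the round trip through an upload server is skipped, and the `scene` object is assumed to be the running three.js scene.

```js
import { GLTFLoader } from 'three/examples/jsm/loaders/GLTFLoader.js';

// Minimal drag-and-drop asset import: accept a dropped GLB file, turn it
// into an object URL, and load it with three.js. The real game also uploads
// the file to a server and routes VRM files to the avatar system.
window.addEventListener('dragover', (e) => e.preventDefault());
window.addEventListener('drop', async (e) => {
  e.preventDefault();
  const file = e.dataTransfer.files[0];
  if (!file || !file.name.toLowerCase().endsWith('.glb')) return;

  const url = URL.createObjectURL(file);
  const gltf = await new GLTFLoader().loadAsync(url);
  scene.add(gltf.scene); // `scene` is the running three.js scene (assumed)
  URL.revokeObjectURL(url);
});
```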
