In this workshop we will go over the basics of streaming video over the internet including technologies like HLS and WebRTC. We will then build our own React app to play live streams and recordings of live streams.
Build Your Own Live Streaming Platform
AI Generated Video Summary
The workshop covered various topics related to video streaming and adaptive bitrate. It discussed the importance of video quality and the use of advanced tools in the modern video streaming environment. The role of video codecs and compression in reducing video size was highlighted. The integration of live streams into a React app and the use of HLS.js for video playback were demonstrated. The workshop also touched on topics such as thumbnail previews, playback security, and the benefits of using Mux assets. Overall, the workshop provided valuable insights into the world of video streaming and engineering.
1. Introduction to Video and Adaptive Bitrate
In this part, Ed introduces the webinar and discusses the importance of video quality and adaptive bitrate. He demonstrates a website for video streaming and shows the difference between good and bad internet connections. Ed also mentions the Q&A section in Zoom.
Well, let's get started. How about we get started? Let me hit some buttons over here real quick. Cool. Alright. Thanks to everybody for coming by for this webinar. It's going to be cut into a couple parts. I'm Ed, I'm going to be handling the initial bit, the sort of discussing a little bit about video and getting the video side of things all set up; we're using OBS. If you have not downloaded OBS, either now or in the recent past, you should probably go do that. If you go over to Discord, we've dropped the link in there. At the same time, as noted, Dylan is provisioning folks with test Mux accounts for handling the code side of this. Dylan's over there. Well, he's over there on my screen. I don't know where he is on your screen, but yeah. So we're going to talk a little bit about video while everybody gets situated, and we're going to get everybody set up to stream out to Mux, so that while you're writing code against it, you have something to live stream to. And then we're going to write a little bit of code to exercise the platform, and to sort of think about how to incorporate live stream video into your own projects. So without further ado, let's get to it. Oh, one quick thing. There is the Q and A section in Zoom. If you take a look at the bottom of your screen, actually it might've moved during the screen share, but there should be a Q and A button. You can drop questions in there at any time. And we'll make an effort to take a second to answer those questions as time allows. So let's hop to it.
So a big part of video, and this is kind of a global thing, right? Like this is something that you see a lot of when watching video on the internet: quality of experience. Depending on where you are in the world, depending on what your internet connection is, you know, the bandwidth you have available, you can have either great crisp video or you can have this, right? You can have really blocky video that buffers all the time. You see the pinwheel of death and you have to wait for your content. This is often due to pretty naive video management, but also just, you know, sometimes the pipes are not big enough for all of your video. One of the things that video technology has really done in the last, you know, 10 years or so in particular, is the idea of adaptive bitrate. We're going to first watch a quick demo, and you'll have to excuse me, it's going to take a second to pull that demo up. I'm going to have to sort of do it on the fly because it's a little bit of a tricky thing. But, all right. Let me switch over to the other screen that we've got here. Let me stop the screen share, hello, and let's pull it back up over here. All right. This is a website that is done by Phil Cluff over at Mux. It's an example website for using what's called HLS for video streaming. We're going to watch a quick video here and we're going to do a little bit of magic behind the scenes and hopefully, the combination of the two works well over Zoom. I haven't tried this one over Zoom before, so this could be interesting. But the idea being to illustrate adaptive bitrate and what happens when you have a very good connection versus when you have a very bad connection. So let's load ourselves a quick demo. I think from the last time we did this, it might still be set up. All right, cool. All right. So as we are watching this, please pay attention.
It might be a little hard for you to see in the corner, but this is running at 7,800 kilobits per second at 1080p resolution. We'll talk more about resolution in a minute. But hopefully it is relatively clear. Let me put away the sidebar. And you can see that it has a pretty high level of detail, right? Like, the video quality is pretty nice. It's about what you would expect from a relatively high quality video on the internet. But what happens if we bring up something like our Network tab here? And let's mimic a much worse connection. Let's mimic a 3G connection on a phone. There we go. So you're going to see that it's going to start buffering. And hopefully, if I've done everything right, we should see adaptive bit rate kick in right now. And if we don't, we will refresh it and try again. And that should hopefully bring it up. All right, so it's not giving me what I want. But that's OK. We are going to throttle the network again. And we will load up that manifest again. And let's see what happens when we full screen it now. Hmm. I think I might be buffered. Oh, that would be why. It turned my throttling off. That's no fun.
2. Video Streaming and Frame Size
In this part, Ed discusses the importance of using advanced tools in the modern video streaming environment. He explains how videos are divided into chunks and encoded at different quality levels. Ed also talks about the concept of frames in video and how they create the illusion of motion. He mentions the difference between analog film and digital video in terms of pixel size and resolution.
That's no fun. So again, apologies. This is the first time we've done this one over a Zoom call. So it might not look quite as expected to start with. Let's give it one more go and see if we can get that figured out. But it does help to illustrate the overall point and the reason we use these more advanced tools in the modern video streaming environment.
All right. What do we got? I have a question, Ned. So if I am watching Netflix on my phone, let's say, and I'm on a bus or something and in a car driving. And then I'm seeing a high def version of the video. And then all of a sudden, it drops down to a blurry version of the video. What's going on there? Is that what you're showing? Yeah, that's what we're trying to show. But it turns out, I guess, my internet connection is just too fast today. Yeah, I think you have to keep the DevTools open. Oh, all right. That'd do it. Yeah. That would do it. When you close the DevTools, it doesn't throttle it. Ugh. Well, we're learning things. But, you know, the moral of the story is pretty established, right? Like, yeah. I'm just going to go with Fios is great. Let me pull back up the presentation real quick, though. As just flexing his powerful internet? Absolutely. 100%. All right, cool. But yeah, so the moral of the story established, right? And this goes to exactly that question that Dylan so very kindly teed up for me. When a video is going into the system, right? Like it's coming in as just a single stream of video, right? Like it's coming in as, in this case, we have 18 seconds of video. And a suitably smart video delivery platform, hello, we work at Mux, will subdivide that into a set of chunks. And those chunks are a subset of the initial input. And they will be encoded at high quality, medium quality, low quality. So to answer Dylan's question, where you might be watching a video on your phone And it goes from this high amazing quality down to the blocky, chunky mess, it's because your phone has realized that it is not able to download video fast enough in order to keep you at that high level of quality. So rather than make you sit and wait for a buffering, like we used to see a lot in early video on the internet, where you'd get a little bit of the video, then you'd have to wait. Then you'd get a little bit more of the video. 
And what everyone ended up doing is just pausing the video and going and doing something else until it all downloaded. Adaptive bit rate attempts to compromise by giving you worse quality, but allowing you to keep watching the thing. So you might see it ladder up or down. You might go from low to medium. You might go from medium to low. But we're talking about chunks. And we're talking about video as this sort of continuous thing.
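The rendition ladder Ed describes is exactly what an HLS master playlist encodes. A simplified, illustrative example (the bitrates, resolutions, and paths here are made up for the sketch, not Mux's actual output) might look like:

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=4800000,RESOLUTION=1920x1080
high/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2400000,RESOLUTION=1280x720
medium/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
low/index.m3u8
```

Each row points at a playlist of chunks for one quality level; the player measures its download speed and hops between rows, which is the laddering up and down you see on a flaky connection.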
When, of course, video is not continuous. Video is made up of frames. And what are frames? This is something that this may be a little bit of a review for some folks. But to give us that baseline understanding, we need to talk a little bit about biology. This is a human eye. Human eyes are amazing evolutionary creations and also need to be hacked in order to make video work. Like what you end up with is you end up with a set of individual still pictures that all are slightly divergent from one another. Like they're slightly changed from frame to frame. And I don't know if you can see it over the Zoom, but each of these particular frames has a number inscribed into the corner of the frame to indicate from 1 to 16. But when these frames are slightly changing one at a time and then those frames are then played back at sufficient speed, you have video. This is not high frame rate video, and it's not high resolution video. But it's an interesting one to look at because this video is from 1878. This video is 140 years old. So a lot of these principles go way, way back. And as you sort of see, even relatively low quality, relatively low resolution, we'd call it today, video has a certain sort of lifelike quality when it is just run in order at not a particularly high frame rate. We're not showing these super fast. But the human eye and the human brain sort of meld these things together and come up with what looks like reasonably fluid motion.
There's an important question to ask there, though. And that question is, how big is this video? And this is a question that is relatively new, because a lot of what we now call video comes from film, and film is an analog medium. You don't have specific pixels here and there. Instead you have, you know, frame size, as we're noting, in terms of physical size. There's a piece of celluloid, and how much detail you get is based on exactly how big it is. That's not how we deal with things in the digital realm. We have to ask about things like pixels. How many pixels high is this image? Is this 1080p? It is not. Is it 4K? Absolutely not. But what we end up with now is rows and rows and rows of pixels. When you are looking at a 4K image, that is 2,160 rows of pixels, each one at, well, depending on your standard, but probably around 3,840 pixels wide.
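To put numbers on those rows of pixels, here's a quick back-of-envelope sketch (the widths are the common 16:9 values; exact widths vary by standard, as Ed notes):

```javascript
// Pixels in one frame at the resolutions mentioned in the talk.
const resolutions = {
  "1080p": { width: 1920, height: 1080 },
  "4K":    { width: 3840, height: 2160 },
};

function pixelsPerFrame(name) {
  const { width, height } = resolutions[name];
  return width * height;
}

console.log(pixelsPerFrame("1080p")); // 2,073,600 pixels
console.log(pixelsPerFrame("4K"));    // 8,294,400 pixels
```

A 4K frame carries four times the pixels of a 1080p frame, which is why the bitrate demands climb so quickly.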
3. Video Compression and Codecs
Video is very large and needs to be compressed using codecs and containers. Codecs like H.264, VP8, VP9, and H.265 are used to compress video frames. Containers like MP4 and MOV store the compressed frames. The standards around codecs ensure interoperability. The internet's slow and unreliable nature is addressed by adaptive bitrate and HTTP live streaming. Mux takes high-quality video and live streams, compresses them into renditions, and provides tools for video quality and analytics.
So 3,840 multiplied by 2,160 is a very large number, and then you have either 30 or 60 of them per second, right? Like, we're talking about a lot of data. One of my favorite sort of go-tos is I went and looked at my personal Gmail the other day, and it turns out for 12 years of email, I have about 11 gigabytes of stuff in there. I also have a recording studio here in Boston, and if I'm recording audio, I can fit about 130 hours of audio in that 11 gigabytes. I have cameras here in the studio, reasonably high quality ones. If I am recording at the maximum quality of those cameras, I fill 11 gigabytes in three and a half minutes. Video is very, very large compared to things like text or even things like audio. And since it's so large, we have to compress it. We have to make it smaller because not everyone's got FIOS. And in the course of having to compress this stuff down, we encounter two main concepts pretty much all the time, codecs and containers.
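Ed's 11-gigabyte comparison can be sketched as arithmetic. The bitrates below are illustrative assumptions chosen to roughly reproduce his figures (about 130 hours of audio, about three and a half minutes of camera video), not exact numbers for his gear:

```javascript
// How long different media take to fill an 11 GB budget.
const GIGABYTE = 1e9; // bytes
const budgetBytes = 11 * GIGABYTE;

// Seconds of media that fit in the budget at a given bitrate (bits/sec).
function secondsThatFit(bitsPerSecond) {
  return (budgetBytes * 8) / bitsPerSecond;
}

const audioBitsPerSec = 190 * 1e3;   // ~190 kbps studio audio (assumed)
const cameraBitsPerSec = 420 * 1e6;  // ~420 Mbps high-end camera codec (assumed)

console.log((secondsThatFit(audioBitsPerSec) / 3600).toFixed(0)); // hours of audio
console.log((secondsThatFit(cameraBitsPerSec) / 60).toFixed(1));  // minutes of camera video
```

The three-orders-of-magnitude gap between those two bitrates is the whole motivation for the compression discussion that follows.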
A codec is what you've probably heard of if you've worked in the space a little bit. You've heard of things like H.264. If you've ever streamed to Twitch or something like that, you know H.264 because it's all you get. But there are other options. Like Google has VP8 and VP9. H.265 is well established in some places and there's even more advanced stuff coming out of that group. But you don't have an H.264 file because the H.264 standard doesn't describe a file. It just describes a way of packing video frames together. Those frames, compressed through that codec have to be placed into some kind of file. And that file is usually a bit of metadata and then some streams of relatively opaque binary data. And that's what we put into like an MP4 file or an MOV file, right? Like you have your container, it contains frames that have been passed through a codec in order to shrink that data down to something you can deal with as a mere mortal with a 3G connection.
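To make the container idea concrete: an MP4/MOV file is a sequence of "boxes", each a 4-byte big-endian size, a 4-byte ASCII type, then payload. This sketch hand-builds the first box of a typical MP4 (an `ftyp` box) and parses its header back out; the bytes are illustrative rather than read from a real file:

```javascript
// Hand-build the start of a typical MP4: an `ftyp` box.
const ftyp = Buffer.concat([
  Buffer.from([0x00, 0x00, 0x00, 0x10]), // box size: 16 bytes, big-endian
  Buffer.from("ftyp"),                   // box type
  Buffer.from("isom"),                   // major brand
  Buffer.from([0x00, 0x00, 0x02, 0x00]), // minor version
]);

// Read back the size/type header of the box at `offset`.
function readBoxHeader(buf, offset = 0) {
  return {
    size: buf.readUInt32BE(offset),
    type: buf.toString("ascii", offset + 4, offset + 8),
  };
}

console.log(readBoxHeader(ftyp)); // { size: 16, type: 'ftyp' }
```

The rest of a real file is more boxes: metadata (`moov`) and then the opaque codec-compressed frame data (`mdat`) that the codec, not the container, knows how to decode.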
So, this is a very difficult problem. It is certainly beyond my pay grade in terms of quality and in terms of, like, human perceptive stuff. Understanding why the human eye does what it does and what data you can not just shrink down but also sometimes throw out, right? Like you're shrinking the video, and you're not just shrinking it in terms of data size, but also sometimes you're throwing data away, especially color data. Some ways of encoding video actually just take advantage of things like the fact that the human eye is more green sensitive than it is red and blue, and they drop off some of that red and blue color data. So, if you need fewer bits to express the thing, then you have to send fewer bits to the end user. And these are incredibly complicated things, and the reason we have these standards around codecs, H.264, H.265, VP9, whatever, is because very, very smart people have gone ahead and done most of this work, have figured out how to express this stuff, and how to express it in a way that is interoperable, right? Like if you have an iPhone and you have an Android phone, they're not running the same code most of the time, but they're able to predictably take the same video from the internet and display it more or less in the same way. You know, there are test suites and there are verification suites around all this stuff, but this is the hard part. And then you add on top of that the internet, right? The internet is slow, the internet is not always reliable, and because it's not always reliable and because it can be slow, we have hacks like adaptive bitrate and HLS, HTTP Live Streaming, which we're gonna be using in a little bit, to handle a lot of the ugly stuff around it. And that's what Mux does, right?
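At its core, the adaptive-bitrate "hack" boils down to a decision like this sketch. The ladder and the safety factor are illustrative; real players such as HLS.js use much smarter bandwidth estimators, but the shape of the choice is the same:

```javascript
// Toy ABR decision: pick the best rendition that fits the measured
// bandwidth with some safety margin left over.
const ladder = [
  { name: "low",    kbps: 800  },
  { name: "medium", kbps: 2400 },
  { name: "high",   kbps: 4800 },
];

function pickRendition(measuredKbps, safetyFactor = 0.8) {
  const budget = measuredKbps * safetyFactor;
  const affordable = ladder.filter((r) => r.kbps <= budget);
  // Fall back to the lowest rung rather than stalling entirely.
  if (affordable.length === 0) return ladder[0];
  return affordable[affordable.length - 1];
}

console.log(pickRendition(7800).name); // "high": a healthy connection
console.log(pickRendition(1500).name); // "low": throttled-3G territory
```

Run on every few downloaded chunks, a decision like this is what makes the picture ladder down instead of buffering when the pipe gets narrow.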
Mux takes your high quality video, it takes your live streams, mashes them down into a set of renditions, presents them in an easy to access way for a developer as you're going to see in a little bit and also includes some smarts on the backend for helping you understand what client you have and how that client should best present your video. We also have a set of data and analytics tools for users who are concerned about things like rebuffering and concerned about video quality experience on the client side of things but that is a topic for another day.
4. Video Codecs and Compression
Video codecs like H.265 and H.264 are used to compress videos. H.265 can shrink videos more while maintaining quality, but it takes longer to encode. H.265 chews on your data more and is more efficient at keeping more data in a smaller space. It creates better quality video, but decoding can be harder. Hardware-accelerated encoders like NVENC can mitigate some of the encoding time. Personally, I use H.265 in my studio unless I need to give it to someone else.
So video as a thing is pretty tricky and obviously we have abstraction tools like this to help make that as easy as possible. We're gonna be playing with those today. Before we do that, however, I hope that, and I'm gonna press some buttons over here so please excuse me. I would hope that folks have downloaded OBS. The link is in the April 12th live streaming platform discussion, like, channel on Discord, sorry words are hard. And, you know, as Dylan has mentioned, we've only given out about 35 accounts for about 125. We're up to 45 now. Whoa, all right. This is the biggest one we've done. There is a question from Jordi in the Q&A about H.265 and H.264. Oh, there it is, okay. I've read that H.265 is able to shrink videos down more than H.264 while keeping quality. This is true. Is it more efficient at keeping more data in a smaller space, or is it better at interpolating data in a way? So with the caveat that I am more of a video person than a video tech person, right? Like I've got the shiny happy cameras and the silly microphones and all that. My understanding of it is that H.265, it does generally create better quality video, but the trade-off is in terms of encoding time. I believe decoding is also a little harder, but don't quote me on that. It is specified to do more with less. It chews on your data a lot more. These days that's a little less of a thing if you're using something like a hardware-accelerated encoder, if you have a GPU that does NVENC. I personally in my own studio run pretty much everything at H.265 until I have to give it to somebody else, because I don't notice a difference. But it does definitely take a little longer to encode to give you that same sort of quality at the same bitrate. All right, so.
5. Getting Ready to Stream with OBS and Mux
Now let's move on to getting everyone ready to stream. You'll need OBS and a Mux account. In OBS, select Mux as the streaming service and paste your stream key. Click start streaming, and if all goes well, your live stream will become active. Feel free to set up a scene in OBS if you don't already have one.
Cool. So yeah, now that we've talked a little bit about the basics of video and a very basic outline of why this stuff can be a little tricky, we're going to get started in getting everybody ready to stream. To provide a live stream that you're then going to be able to write code against with Dylan in a little bit. As mentioned in Discord, you are going to need OBS in order to do this. I'm going to drop the link into the Zoom chat as well; it's already been in the Discord chat for a little while, so if you've already got OBS, you're going to be ahead of the game.
So we're going to then, I'm going to pop over here to Mux, and again, you're going to need a Mux account for this, if you already have your own, you're more than welcome to use it. If you don't, hopefully you've already reached out to Dylan, that's about 30% of the audience. Almost 40% so, we're getting there. But I am going to- I'm typing as fast as I can, they're coming in fast now. All right, all right. We have got folks' attention. If you send me a DM, I'm getting to it. All right.
So, I'm going to pull up our Mux dashboard, and we're going to take a look at this real quick. So, this over here is the Mux dashboard, and if you don't have access yet, you will soon. We are going to go ahead, and we are going to go to video, and we're going to go to live streams. We're going to create a new live stream. I'm actually not going to create a new live stream because I've already got mine programmed, but, oh what the- You only live once. You'll see this, and this is a neat thing about Mux. This is the same POST body that you're going to see if you wanted to create a live stream programmatically through Mux. You can modify this however you want, right? You can add whatever you want to the POST body. And sometimes there are settings where one thing might need another. We try to make it so you can explore this in your browser as much as possible without having to drop down into one of our SDKs, though you can do that as well if you would like. So I'm going to create a new live stream. And this is the same API response that you would see coming back as a JSON response. So, I'm going to trust everyone to see that stream key for about two seconds. Hopefully no one is going to troll me on that, but, you know, if you do, I can just go regenerate it, go put up another one if I have to. But, we're going to go ahead and I'm going to pop over to OBS. So I've got OBS right here, and actually I think I have to do a screen share at that point, hold on a second. There we go. OBS doesn't always like it when you pop open windows on top of it. So you're going to go to settings, and you're going to go to stream. And this is one of the neat things about Mux. I put this in the other day, I guess it was months ago at this point: Mux is now in OBS as a preset. So if you go into, well, let's back it up and let's do it again, just so folks can see it a little better.
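The dashboard button issues the same POST you'd make from code. Here's a hedged sketch of that call; the field names follow Mux's Video API as generally documented, but the endpoint, the basic-auth scheme, and the `MUX_TOKEN_ID`/`MUX_TOKEN_SECRET` env var names should be checked against your own account and the current docs:

```javascript
// Request body for creating a live stream; mirrors the dashboard's POST.
const body = {
  playback_policy: ["public"],
  new_asset_settings: { playback_policy: ["public"] },
  reconnect_window: 60, // the buffer-on-disconnect window Ed mentions, in seconds
};

async function createLiveStream() {
  const auth = Buffer.from(
    `${process.env.MUX_TOKEN_ID}:${process.env.MUX_TOKEN_SECRET}`
  ).toString("base64");
  const res = await fetch("https://api.mux.com/video/v1/live-streams", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Basic ${auth}`,
    },
    body: JSON.stringify(body),
  });
  return res.json(); // the response's data.stream_key is what you paste into OBS
}
```

The JSON that comes back is the same response shown in the dashboard, stream key included.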
You go to stream, you go to service, you go to show all, and then you scroll down to the Ms to find Mux. We support RTMP and RTMPS output. We're just going to use our secure one for now. And I'm going to paste in my new stream key. Cool, so now that I have done that, I am going to click start streaming over here. And if all goes well, when I pop back over to my live stream... and we just saw that this has now gone from idle to active. And right, like this isn't gonna be anything crazy, but we'll go ahead and switch over to a different test pattern. We'll wait a second because we have this set with a reconnect window, so the encoder will buffer if you have a network interruption or disconnect, which does introduce a little more latency, though you can turn that off. And you saw that the test pattern switched over from the normal color bars to the grid pattern. So. Do you wanna show everyone real quick again where you got the stream key from and how you pasted it into OBS? Yep, so when you are at your live stream page after you've created your live stream, let me bring that back up right over here. You'll see this right here, show stream key. You'll click it, you can just copy it and you can go from there. Back in OBS. I do see the question over there about the test pattern. I just pulled it off of a Google image search, but it is a pretty standard grid test pattern. You go to settings, you go to stream... I'm still streaming, so I can't change that. Hold on, let's bounce back up, all right. So settings. Still see the Mux dashboard on your screen share. All right, that's weird. I should probably, oh! I hit the, I didn't hit the screen. The window. There we go, I hit the window, and not the screen. Thank you, Dylan. Yeah, so we've copied it from the Mux dashboard. Go to settings, stream, service, you'll have to hit show all.
Scroll down to where Mux is, and you'll just select Mux. We'll leave the server as it is; we'll keep our RTMPS. The only difference between RTMP and RTMPS is TLS, so we will encrypt before we send. Now that we've pasted in our stream key, click apply. And then click start streaming. And after you've done that... Then, if folks don't already have, like, a scene in OBS, do you maybe wanna show how we can just set up a scene? Yeah, that's a good call.
6. Setting Up Scenes and Sources
Let's say you're looking at an empty scene. Click the plus under sources and go to text. We'll go with Hello World as our starting point and make a second scene. Let's add a browser source and type in a URL. Switch between scenes by clicking them on the bottom left. The changes are reflected after about 10 seconds. Leave your stream going as we're going to start writing some code against it.
Definitely, that's a good call. So, let's say you're looking at a, an empty scene, you know, I downloaded a couple of images ahead of time, but you don't have to do that, you can just go over, click the plus under sources.
Go to text, no, not video source. Text. Looks like some folks are streaming to the dashboard, so it's working. Excellent, excellent. All right, so we'll just go with Hello World, as is our, our legally obligated programmer starting point. And let's make a second scene. Just setting up my mux. Yeah. And then let's, let's go. Oh no, it's too big.
Another source I sometimes like to add is a browser one. You can add a source from the browser. We just type in a URL. Yeah, let's mess with that. After I get some orange text going. But yeah, let's pop it up. I actually have not tried a browser source on this build. This is a local build of OBS, so let's hope I linked it up correctly. Score, all right. And we'll just keep an eye on Google. Oh, so you might see something like, I'm using an older version of Chrome because it's whatever was compiled in with OBS. But it's fine, it's fine, everything is fine. And if you're unfamiliar with OBS, you just switch between scenes by clicking them on the bottom left. So we'll go from our Hello, World to our Hello for Mux, clipped off at the ends. And when we hop back over to our livestream, you can see that these changes are reflected after around 10 seconds. So let me pop back over to OBS real quick and stop my stream. But you're gonna wanna leave your stream going, as we're gonna start writing some code against it in a little bit.
7. Setting Up Streaming and Monitoring
All of the code will be done through codesandbox.io. You don't need to download anything. Can we do polls in this webinar? No, it's Q&A. If you are streaming to the Mux dashboard, let us know. Start streaming by clicking the start streaming button in OBS. The stream key can go into any software. OBS is the most popular. Contact Dylan in Discord to get a Mux account. If you're not in Discord, email dylan at mux.com. To generate a stream key, go to video, live streams, create a new live stream in Mux. Copy the stream key and paste it into OBS. Use the View Live Stream Input Health button to monitor the health of your input.
For those who were not taking a look in the Discord earlier, all of the code is going to be done through codesandbox.io. So you're not going to need to download anything on that side of things, and yeah. So. Can we do polls in this webinar? I don't think we can, but. In Mux? I don't know, you mean in Zoom? No, in Zoom, no, it's Q and A, I don't think there's polls. But quick, maybe thumbs up or something: if you are streaming to the Mux dashboard, let us know. All right, so if you're seeing idle, it might take a couple of seconds to update. You also may want to refresh the dashboard. It's usually trailing by about 10 seconds. Got a few more people jumping in. All right, we're up to, I've sent 56 Mux accounts, that's pretty good. Nice, all right. Start the actual streaming somehow? Yes, let me pull that back up. I'm sorry, I might have gone a little quick on that one. So we'll hop back over to sharing the screen, and what you can see over here, the start streaming button is just down here under controls. So you'll just click that, and that will start your stream out to Mux, and then you'll be able to view it through your dashboard, right? Like mine is still staying idle for a couple seconds, and then it pops back up as an active stream. So just to kind of recap real quick: in Mux's world you make an API call that creates a live stream, every live stream has a unique stream key, and then you put that stream key into OBS and then start streaming. So that stream key can also go in whatever other software you're using; it doesn't have to be OBS. OBS is the most popular, I think like most Twitch streamers use OBS. There's some other ones out there of varying complexity, but OBS is by far the easiest. Or least expensive, rather; OBS is actually the open source one. And it's really reliable, most Twitch streamers use it even despite all the other software that's out there. Yeah, totally. It's the metric standard streaming platform. Exactly.
Yeah, a couple of folks asking where to get a Mux account: if you contact Dylan in Discord, he's holding it down in the april12-live-streaming-platform channel, and he can get you yours. If somehow I missed you, let me know. I think I don't have any more in my queue right now, I should have gotten you all. If I did not for some reason, then send me another message. And if you're not in the Discord or having trouble accessing Discord, you can email me, dylan at mux.com. Okay, got another one here. Cool. And when you copy and paste the username, or the email and password, just check for trailing spaces and stuff like that if you're having any problems. So far they've all worked, so it should work; we haven't had any problems like that. All right. So it seems like folks are generally succeeding with being able to stream this stuff out while we are waiting to take care of the last few folks. If anyone has any questions on the previous bit, I am happy to answer whatever I can. Give it a few more minutes and then, well, if you are successfully streaming, you can poke around the Mux dashboard. You can also play with the controls in OBS, see about adding scenes, adding text, adding your video. You can add your video source so you can actually stream your own video, audio source, things like that. While these next folks are... And pretty soon, we're gonna switch over to coding, and once we're coding, I'm not gonna be able to send you Mux accounts, so get them while you can. All right. Can we repeat how to generate the stream key? Yes, we can. Let me hop back into the browser real quick. Cool. So if you are at the homepage, your environments page here in Mux, you'll just go to video and you'll go to live streams, and you'll just create a new live stream. As mentioned before, this is the same POST you'd use with the API. We'll just run the request, we'll view the live stream, and then you can click on Show Stream Key, which will give you the stream key right here.
You can then copy it, and then you can paste that stream key straight into OBS. While we're here, we might as well take a quick look over at that and show everybody how to do that again. I should stop streaming, but we'll go to settings. We'll go to stream. You'll click on the service drop down and go to show all. We'll scroll down to Mux. We will leave the server as it is. And you just paste in your stream key right here. Click apply, click okay. Now I'm gonna start streaming on this live stream, which is actually, you know, a different logical live stream. So when you're writing code, you would use this live stream ID instead. There's been no content going out to it, which is why it's still idle. We will wait a minute, and then we will start seeing our live stream right here. So the other cool thing that's in here that's probably worth mentioning while we have a second is this button down here: View Live Stream Input Health is pretty cool. What this allows you to do is monitor the health of your input, right? Like say that your users are reporting that there's drops in their stream, that it's going all stuttery or whatever. This will help you, at a glance, figure out whether the input coming from your encoder is maybe not great. You know, maybe your local network isn't doing so hot. Maybe, you know, there was a fiber-seeking backhoe in your neighborhood, whatever. Yeah, this shows you, you know, rough bitrate. I like how I have three kilobits of audio even though there is no audio on my stream. That's cool. And, you know, frame rate is sometimes indicative of other problems. Say you've overtaxed your encoder, right? Like say you've asked your encoder to do too much work. In that case, you might start seeing some frame drops, and you might need to simplify things.
You might need to stop running as much stuff on your machine while you're encoding, that sort of thing.
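The dashboard flow above wraps the same API call you would make from your own server. Here is a minimal sketch of that request in JavaScript, assuming hypothetical MUX_TOKEN_ID and MUX_TOKEN_SECRET environment variable names for the API credentials; the endpoint and body shape follow Mux's live stream API, but treat this as a sketch rather than the exact dashboard implementation:

```javascript
// Build the request body for creating a Mux live stream.
// Recordings of the stream (assets) inherit new_asset_settings.
function buildLiveStreamRequest() {
  return {
    playback_policy: ["public"],
    new_asset_settings: { playback_policy: ["public"] },
  };
}

// Hypothetical usage. MUX_TOKEN_ID / MUX_TOKEN_SECRET are placeholder
// env var names; the endpoint uses HTTP basic auth.
async function createLiveStream() {
  const auth = Buffer.from(
    `${process.env.MUX_TOKEN_ID}:${process.env.MUX_TOKEN_SECRET}`
  ).toString("base64");
  const res = await fetch("https://api.mux.com/video/v1/live-streams", {
    method: "POST",
    headers: {
      Authorization: `Basic ${auth}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify(buildLiveStreamRequest()),
  });
  const { data } = await res.json();
  return data; // data.stream_key goes into OBS; data.playback_ids feed the player
}
```

The stream key in the response is what gets pasted into OBS, and the playback ID is what the player code uses later on.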
8. Live Streaming and Mux API
There are cool tools available to help you in the workshop. Mux offers an API for live streaming to platforms like Twitch, YouTube Live, and Facebook Live. Simulcasting allows you to stream to Mux and syndicate it out to multiple platforms. Mux also keeps a local recording of the livestream, which can be downloaded using the API. Test Mux accounts have limitations, but you can remove them by entering your credit card information. Livestreams can be turned into video-on-demand assets and downloaded later. To switch the video from idle to active, follow the instructions we provided. The Mux dashboard displays recent asset IDs corresponding to video files.
But, yeah, there's a lot to learn in there, but there's also a lot of really cool tools to help you out. Nice. Yeah, this is the first workshop we've done since we launched that live stream input health page. Oh really? Yeah, which is pretty cool. That's a newer feature. Great. All right, I think we're up to 60 people? 60 people? Not bad. I should probably turn my stream off. Okay, and this is a React Summit after all, so we will be writing some React code. If you are streaming, why don't you drop a screenshot? Some folks are dropping screenshots into the Discord in the april12-live-streaming-platform channel. Someone's streaming Once Upon a Time in Hollywood, so yeah, we might get a DMCA takedown notice. Thanks for that. Nice. Oh, good question, Jordy. No, we don't do auto-generation of transcripts or captions, subtitle stuff. But there are other platforms that do that for video on demand. So not necessarily for live streams, but for video on demand: when you have the recording of the live stream, you can grab the MP4 out of Mux, send it to a third-party service, they can generate the transcript and make it into a subtitles file, and then you send that subtitles file back to Mux, and we can add that onto your video so that on the player side, it'll show the subtitles. Okay, still getting more folks? For folks who also need to be able to subtitle live streams, there are options that do work with Mux. They're a little more involved, and they're a lot more of a discussion, but that sort of thing does exist, and we've done it before. We did it for Demuxed. Oh, screenshots are just rolling in. Nice. Right on. So I'm sure you have seen Twitch, or maybe watched Twitch, but if you were to stream to Twitch, it would be a basically similar process.
You'd go into OBS, and instead of using Mux's server and Mux's stream key, you would use Twitch's server and your stream key from Twitch, and it would start streaming to Twitch. YouTube Live works the same way. Facebook Live works the same way. All the platforms kind of work the same: you run everything through an encoder, and then you stream out over RTMP, and Mux basically gives you an API to do that. So if you were to build a product like Twitch, or build something that had live streaming as part of your product, then you can use those APIs to spin that up yourself, because running all that infrastructure on your own is quite complicated, I would say. Yeah, it is. Oh, I did that a couple gigs ago. And the other benefit ends up being that a lot of existing platforms are a little bit precious about how you use embeds and things like that, right? Think the purple screen of death on Twitch. To really control your video stream end to end, a provider like Mux makes life a lot easier. These are awesome screenshots, by the way. Oh, I have questions. What's that dog? And then, probably outside the scope of this workshop, when you create the Mux livestream, there is another API where you can pass in your Twitch credentials, or your Facebook Live credentials, or your YouTube Live credentials, and then what happens is you send your stream to Mux, and Mux can syndicate that out to Twitch or Facebook or whatever RTMP ingest service you configured. So you stream to Mux in one place and then syndicate it out to a bunch of other places. We call that simulcast targets. That's something else that is kind of cool; we won't be doing it in this workshop, but if you're curious for more...
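Simulcast targets are configured per live stream with one extra API call. A rough sketch of the payload follows, assuming a hypothetical Twitch ingest URL and a placeholder stream key; the real values come from whichever destination platform you're syndicating to:

```javascript
// Payload for POST /video/v1/live-streams/{LIVE_STREAM_ID}/simulcast-targets.
// Mux re-pushes your incoming RTMP feed to this destination.
function buildSimulcastTarget(rtmpUrl, streamKey, label) {
  return {
    url: rtmpUrl,          // the destination platform's RTMP ingest URL
    stream_key: streamKey, // that platform's stream key, not your Mux one
    passthrough: label,    // free-form label so you can identify the target later
  };
}

// Hypothetical example values:
const twitchTarget = buildSimulcastTarget(
  "rtmp://live.twitch.tv/app",
  "YOUR_TWITCH_STREAM_KEY",
  "twitch"
);
```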
Yeah, one of the cool things about simulcasting is that Mux also keeps a local recording of it, so you can get at it with an API to download it rather than having to go click through a browser. I see one of the users here, CMBF, it looks like you have a test Mux account, so I assume you signed up for your own Mux account. That's gonna cut off after 10 minutes, and your recordings are gonna be chopped down to 10 seconds, so there's a limitation on the test accounts. So you have two options. I assume you just signed up with your own email address. You can just use a test account with those limitations, that's fine. If you wanted to get rid of those limitations, you have to enter a credit card. You won't be charged or anything, but because we're a video platform, we have to deal with a lot of abuse, so we have to require certain things in order to start streaming. If you're not comfortable with that, you can just shoot me a DM and I'll send you a Mux account that's already unlocked, so that you don't have to enter your credit card. So this livestream can be downloaded as video later, right? Yes, correct. The way that Mux handles these is that every livestream is then turned into a video-on-demand asset, so you can download those, and if you don't want that for a particular livestream for some reason, you can use the API to delete it, stuff like that. How to switch video from idle to active? If you have followed the instructions we went through, when you click Start Streaming, it should go to active. Yeah, when you click Start Streaming, it'll connect, and about 10 seconds after that or so, it'll go to active and then it'll actually be streamable. We can go through that one more time for folks. Then maybe you can also show in the Mux dashboard where to look for the recordings. Yeah, good call. Alright, so I'm gonna take my stream key. I clicked Show Stream Key.
Going to head over and click on Settings in OBS. Going to go to Stream. Gonna go to Service, and you'll just click on Show All to show the full list that OBS has. Click on Mux. We're going to leave the server as RTMPS. And then we're just going to paste in our stream key right there. Click OK, then in the bottom right corner you should see a button called Start Streaming. You will click that button, we'll head back over to Mux real quick, and this should go from Idle to Active. Just like that. So I'm going to go back to the other livestream because that one probably has some more stuff. In the livestream dashboard, you can scroll down to recent asset IDs, and these IDs correspond to video files. So when you're here you'll just... and again, this is one of the things Mux tries to do, sort of coach you into using the API for this sort of thing.
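Since those asset IDs correspond to video files, downloading a recording programmatically works off the same playback ID. A sketch of the URL shape, with the caveat that MP4 renditions are opt-in: mp4_support has to be enabled on the asset before these URLs exist.

```javascript
// MP4 renditions of a Mux asset live at a predictable URL once
// mp4_support is enabled; quality is one of "low", "medium", "high".
function mp4DownloadUrl(playbackId, quality = "high") {
  return `https://stream.mux.com/${playbackId}/${quality}.mp4`;
}
```

With that, a script can fetch the recording without anyone clicking through the dashboard.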
9. Playing Back Videos in Your Application
You can get this out of the API just as easily, but we'll just click on our first asset ID. And this is not a livestream anymore. This is now a video-on-demand asset. It is three minutes, 34 seconds, and 97 milliseconds long. You can click it to play back a sample. It's a very boring video, I'm sorry. But some of the other stuff you can do off of this is pretty cool. You can get a thumbnail off of the video. In this case they're a little bit samey, but you're able to control what part of the video your thumbnail comes from. You're also able to do a bunch of other neat stuff, like extract your audio track separately from your video. Just lots of neat building blocks, because a lot of these things are useful for one platform or another. Like you might be uploading video casts, but also want to extract your audio as a podcast or something like that. Most of these tools aren't useful for every project, but they're all useful for some project.
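The thumbnail controls shown here map onto query parameters on Mux's image service. A small sketch of building those URLs; the parameter names follow Mux's image API, and the playback ID in any usage would be a placeholder:

```javascript
// Thumbnails are served from image.mux.com rather than stream.mux.com.
// `time` selects which second of the video the frame is taken from.
function thumbnailUrl(playbackId, { time = 0, width } = {}) {
  const params = new URLSearchParams({ time: String(time) });
  if (width) params.set("width", String(width));
  return `https://image.mux.com/${playbackId}/thumbnail.jpg?${params}`;
}
```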
All right, so I think it might be time to jump into some coding, so we don't run out of time. This is definitely the biggest workshop we've done, which is awesome. And Ed, you might wanna roll your stream key there, because you did share it in the screen share, and if there are some hackers in this group, they might take over your Mux live streams. Everyone was paying attention. They could have copied your stream key and stolen it. It's true, it's true, they could have. I've already just deleted all of my streams, but... Perfect. But yeah, that is definitely something; obviously we're in a test environment, so. And it only took 46 minutes for us to get rickrolled. Awesome, thanks, Philip. I mean, it's a race to it. I thought it'd be sooner, honestly.
10. Integrating Livestream into React App
In this part, Ed discusses the process of integrating a livestream into a React app using CodeSandbox. He explains how to create a video player component and pass a playback ID. Ed also mentions the use of the useRef and useEffect hooks.
Hey Dylan. Yes. Chat's requesting that you increase your font size, please. Over here? How about that? Is that better? That is too big for me. Okay, how about this? I think I can do that. I don't even know where to see the chat anymore. Cool. Cool. Yep, that's it: the hls-js.netlify.app demo page. And then you get the stream.mux.com URL. You have to come to the dashboard, get the playback ID, and click Copy URL. All right, and then you should be able to see your live stream right here. How are we doing? Cool. Folks are getting it? All right, so, well. Awesome. So now I'm going to open up CodeSandbox, and let's do this in our own actual application. So we go to CodeSandbox, click Create Sandbox, and let's just do a regular old React app and hit Create. So click Create Sandbox in the top right, and then click Create. Oh, if you don't have a CodeSandbox account, I think you might have to OAuth with GitHub or sign up for an account; it's free, so go ahead and do that. Great. Nice, all right. So how's the font on the CodeSandbox, actually? We need to increase that. And this is the link to mine. So as we're going through this, it might be helpful to pull mine up. Where you can... oh, bigger? Oh man, okay, that kind of breaks the layout. Let me see. What theme is that? This is just the default CodeSandbox theme. I don't think I've customized it. It's pretty, it's good. I never felt the need to customize it. Okay, so let's get a real quick demo of our livestream actually playing in our React app. So the first thing I'm going to do here is create a folder called components. And then I'm going to create a file in here called video-player. Does it need to be .jsx? I always forget. I think it could be .js, okay. Okay. Cool. Awesome. So I'm going to create this component called VideoPlayer, and let's just put export default function VideoPlayer. And let's just start with that.
And on the index.jsx, let's import VideoPlayer from... and then, oh, actually no, let's put this in the app. Okay. Okay. Brian's having problems: it's trying to download the file but not running it in the browser? What does that mean? It's like not found, try going to... maybe someone else can help debug with Brian? Cool, okay. So if you're all with me so far, we have this code sandbox. I'm going to just change the title. Let's change that. Delete that, cool. Now we have this VideoPlayer component. Should we put it in source? Okay, let's just put everything in src. By the way, this components directory, we're putting it all in src. There we go. So that transpiles stuff. Okay, cool. Makes sense. Okay, so we have this VideoPlayer component. Now what we want this VideoPlayer component to do, I think, is we want to pass in a playback ID. So for me, I'm just going to hard-code a playback ID; let's call this livePlaybackId. In your application, you should use the playback ID that corresponds to your live stream. This is the one for me. And then we want to pass it in here into this VideoPlayer component. Now in this VideoPlayer component, let's just make sure we're getting the playback ID here. And we are, so that playback ID is being passed down into our VideoPlayer component. All right, so now let's make this VideoPlayer component actually play some video. First thing we need to do here, a few steps: let's import useRef. We're gonna need useRef, and let's import useEffect from React. And then we also need to add a dependency.
11. Using HLSJS and Video Reference
We need to use HLS.js, so we import Hls from hls.js. We'll put a video element inside a div and keep a reference to it using const videoRef = useRef.
We're going to need to use HLS.js. And HLS.js just had their 1.0 release. That happened like last week, which is awesome. They've been doing some cool stuff over there. So we need to import these two things: let's import Hls from hls.js, and do those two steps. And now what we want to do here is return a div, and then let's put a video element inside this div. And let's keep a reference to that video. We need a reference to that video element so that we can do things with it. So we want const videoRef = useRef, and it starts out as null.
12. Integrating HLSJS and Handling Component Unmount
In this part, Ed explains how to initialize HLSJS and set the video source. He demonstrates how to check if the browser supports HLS natively and sets the source accordingly. Ed adds controls to the video player and handles the cleanup when the component unmounts. He also mentions checking for any questions or issues from the audience.
And what that does is give us a handle onto that video element, the actual HTML video element. And we use that in this useEffect function. So basically when the component mounts, we want to run this function. As soon as we have the videoRef available, then we want to run this function. And what we're going to do here is... let's just say if we don't have... okay, yeah, this works. So videoRef.current. We need to initialize HLS.js, so let's open up the docs for that. The way HLS.js works is you have to get a handle to a video element. You have a video source, and then there's this check here, canPlayType: if HLS is supported natively by the browser, then you can just set the video source. And whether HLS.js is supported basically comes down to whether the browser is a modern browser that supports Media Source Extensions. So let's go ahead and implement that. First step: we need a video source, right? We have a playback ID. So let's just make a variable here: videoSource equals stream.mux.com slash playback ID dot m3u8. Okay, now we have an HLS source that we can use with our video. videoRef.current. And I'm gonna say if video.canPlayType. So I actually think it's better to swap around the logic that they have in these docs. I like to check if the browser supports HLS natively first, particularly for Safari, because Safari's HLS actually works quite well. So if the browser does support HLS natively, I like to use the browser's HLS capabilities. So then we'll do videoRef.current.src = videoSource; we just set the source. Let me extract this out to a variable. So this video variable I have here is the actual video element itself. Why is this linter yelling at me? No, videoSource, okay. So if the browser can play the type of the video source, then we'll just set the source directly.
Else, if Hls.isSupported, then what we can do is this bit here: create an hls instance, call loadSource to load the source into that HLS instance, and then attach the media to the video element. And this is far too big, so let's just drop into styles.css and do something like width 100%. I usually do width 100%, max-width 500 pixels, something sensible. Okay, cool. So now we can see that we've actually loaded the video player and initialized the HLS.js stuff, but it doesn't have any controls, it's not playing or anything. So let's just add the controls attribute here; I'll just use the default controls. And we're getting this error when it unmounts, so we're gonna have to handle something in our hook to deal with that. So now we have the controls. Now I'm actually playing the live stream, just using the default controls of the browser. And there we go. Now let's test this out in Safari real quick. And in Safari, it works too. So in Safari, what's actually happening is it's getting into this block, because Safari supports HLS natively. And then here we're using HLS.js. And if the browser doesn't support Media Source Extensions, then get a new browser, because you just ain't gonna be able to play video. Cool. Now one thing we need to do is some cleanup when the component unmounts. Here in this live reloader it's mounting the component, unmounting the component, and things are getting mixed up. So usually in useEffect, what we want to do is return the function that will do the teardown of the component. So we can return here, and I think it's hls.destroy. What is it? Where's the docs? "destroy() should be called to free used resources and destroy the HLS context." So actually what I need to do here is put a variable up here: let hls.
At this point, we don't know if we're going to be using HLS.js or not, so in this return from the hook, I just wanna say: if we have the hls variable, then call destroy. So that should fix my live-reloading problem. Okay, who's got this? Who's able to actually view their live stream? I'm gonna pause real quick and check to see if any other folks need Mux accounts. Let me check the email. Okay. Need one more. One sec. Oh, folks are doing it. Do we have some React code where you're actually viewing your live stream? You reached there then? Yup. Also, someone asked for the code. Yeah, you can just go directly to the code sandbox and follow along. Share some screenshots if you did it. Okay, so while we are here, is there any Q and A, anything? Cool.
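Pulling the pieces of that effect together, here is a framework-free sketch of the same attach-and-cleanup logic. The Hls constructor is passed in as an argument instead of imported, purely so the sketch stays self-contained; in the workshop code it is the import from hls.js, and the returned function is what useEffect calls on unmount:

```javascript
// Consolidated version of the effect body from the VideoPlayer component:
// given a <video> element, a playback ID, and the Hls.js constructor,
// wire up playback and return a cleanup function.
function attachHlsPlayback(video, playbackId, Hls) {
  const src = `https://stream.mux.com/${playbackId}.m3u8`;
  // Prefer native HLS (Safari); its implementation works well.
  if (video.canPlayType("application/vnd.apple.mpegurl")) {
    video.src = src;
    return () => {};
  }
  // Otherwise fall back to Hls.js via Media Source Extensions.
  if (Hls.isSupported()) {
    const hls = new Hls();
    hls.loadSource(src);
    hls.attachMedia(video);
    return () => hls.destroy(); // the useEffect teardown from the workshop
  }
  // Neither path available: no HLS playback possible in this browser.
  return () => {};
}
```

Inside the component, something like useEffect(() => attachHlsPlayback(videoRef.current, playbackId, Hls), [playbackId]) would give the same mount and teardown behavior.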
13. Recap and Video Player Setup
We have a single page that loads the app component, which in turn loads the video player component. The playback ID is passed into the video player component to load the livestream. The video player component creates a video source and attaches the HLS source to the HTML5 video element for playback.
Okay, so just a quick recap of what's going on. We have this single page, it loads the app component, the app component loads the VideoPlayer component, and we pass in a playback ID. Normally, in your application, you would be getting the playback ID from a server: you do an API call, or if you're using Next.js or something, you would get it from server-side props. So this livestream ID would be an ID that you have in your database or something like that. You get that from an API and pass it into the VideoPlayer component, and it will just load from there. The VideoPlayer component creates a video source, has the video element (this is the HTML5 video element), and you just attach the HLS source to the video element and start playing. And now you can actually play your livestream like that.
14. Handling Invalid State Error and Teardown
In this part, Ed explains how to handle the invalid state error that occurs when the component is not being properly torn down. He demonstrates the use of the useEffect hook to handle side effects and explains the React pattern of returning a function at the end of the useEffect function. This ensures the proper teardown of the HLS context. Ed also mentions the importance of adding the hls.js dependency and addresses the issue of the invalid state error.
Okay, oh, you have to add the dependency down here on the dependencies panel: hls.js. Yeah. Nice. Okay, so this is great for a livestream, right? We have a single livestream page, we're playing the livestream. That's all fine and fun. Now, let's say we wanted to have two pages in our application. So let's say you're building a website and you wanna have a place... let's see. Oh yeah, you just refresh there, CRS1138. That error happens with the reloading. That gets fixed when you do this return at the end and you call hls.destroy. That invalid state error happens when the component's not being torn down properly. So that's why I added this code here that actually tears down the HLS context when the component unmounts. The typical way to use React hooks, if you're not too familiar with them, is to use useEffect, and at the end of the useEffect function you return a function that takes care of the side effects of any code that you wrote in the actual hook. That's a React pattern. Awesome.
15. Creating Livestream and Recordings Pages
Let's create two pages in the app: one for the livestream and another for the recordings. We can extract the useEffect into a separate hook called useHLS. Jordy can share the code for that later.
Okay, so now... oh, this is perfect: Jordy went and made the useHLS hook. Awesome, I like that. Well done. So we can follow Jordy's lead and extract this useEffect into its own hook called useHLS. Love that, that'd be nice. So maybe Jordy can share the code for that. But for now, let's say you have this livestreaming site: you want to show a livestream on one page, and then you want another page where you show the recordings. So we can totally do that. So in this app, let's make, kind of, two pages. Let's make the homepage be for the livestream, and then let's make a page called /recordings where we will show the recordings. So let's get started on that. There we go.
16. Creating Routes and Navigation
In this part, Ed adds dependencies, such as react-router-dom, and imports the necessary components for routing. He creates a homepage route and a recordings route, each with their respective components. Ed also creates a navigation bar with links to the two pages. He discusses the security of playback IDs and stream keys and mentions the use of signed URLs for securing playback. Ed concludes by mentioning the various use cases of businesses using live streams.
There we go. Okay, let me find something real quick. Let's do some... Okay, so let's add a couple dependencies. Let's add react-router-dom; that'll help us make the two pages that we need. Add react-router-dom, and then let's go to this index file. Okay, and then let's import BrowserRouter as Router. We'll need the Switch component, we'll need the Route component, and let's import those from react-router-dom. Oh, where did I mess up? Oh, Router. Oh wait, this should be an import of Route. Cool, okay. Now for here, what we wanna do is use the Router component, and then let's put the app component inside. And then let's make a Switch. And then let's make a Route with an exact path here, and we'll pass a component to that. And then let's make a Route with exact path /recordings, and we'll pass another component to that. So right now we have this app. Let's rename this to a homepage. So call this homepage; let's rename the file so we can call that homepage too. Okay. We're not using strict mode. I'm not sure; does someone know what strict mode does? I haven't used that before. So let's export this homepage from the homepage component. And now what we can do is pass this homepage component into the slash path. And then let's make a new component, and let's call this recordings.js. Okay, let's see if this works. Looks like we have a problem: Route is not defined. Oh, this should be Route. There we go. Okay, so this is the homepage route that just uses our homepage component, our homepage component here. It uses the video player, just with the live playback ID, and then we have this recordings route that just has the recordings. Now, for completeness, we should probably just create a little nav while we're at it. So let's just create a nav.js and export function Nav, and let's just make a nav with links to the two pages.
So let's make this the Live page, and then let's make this the Recordings page, and then let's import that. So, import Nav. And then on the recordings page, let's import that Nav here. And let's see if that works. Cool. So we have the live page and then we have the recordings page. And let's just clean up the style so it's not completely ridiculous. Oh, and in the recordings page, we'll wanna import our styles too. Now, this could be made a bit cleaner by creating a layout component and things like that, but for something quick we can do this. And actually let me do this. We need to do this. And maybe we should do space-between, and let's set list-style: none. Maybe space-around, that's a little better. Let's just do a background. There we go. How about that? Someone will have something prettier than me. Let me check the chat. 6-336-711-6. Got you. Then it will crash instead of warnings. Awesome. To answer your question, yes, the playback ID is public, and the stream key is private. There are ways to secure the playback. You can think of it like signed URLs, similar to having a file in S3: you can have a public link to that file, or you can have the file be private, where you need to add a signature with a token parameter. There are ways to secure the playback ID so that people don't just take it and use it; we do have some security features around that. The stream key should definitely be private, because all you need is a stream key to start streaming. Oh, I think I need to fix these links so that they're not... Cool use cases of businesses that use this? I mean, oh, there's all kinds. When COVID started, there were a lot of people doing live streams. Hold on, let me just refocus my camera.
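The signed-playback flow described here ends up as a token query parameter on the same streaming URL. Below is a sketch of just the URL assembly; actually generating the JWT requires a Mux signing key on your server, so the token in the usage line is a placeholder string, not a real credential:

```javascript
// With a "signed" playback policy, stream.mux.com rejects requests
// that don't carry a valid, short-lived JWT in the token parameter.
function signedPlaybackUrl(playbackId, token) {
  return `https://stream.mux.com/${playbackId}.m3u8?token=${token}`;
}

// Hypothetical usage; "PLACEHOLDER_JWT" stands in for a real signed JWT.
const url = signedPlaybackUrl("abc123", "PLACEHOLDER_JWT");
```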
17. Live Streaming, Scaling, and Use Cases
HLS live streaming scales well and delivers video content over HTTP. WebRTC, on the other hand, requires an open connection and is more difficult to scale. RTMP ingestion and HLS delivery scale better but introduce latency. Low latency HLS solutions are being developed. Building your own app for live streaming offers ownership and customization. Platforms can provide live streaming services to their users by leveraging Mux APIs. This eliminates the need to build live streaming infrastructure from scratch. The live stream page and recording page can now be built using the Mux dashboard. Stopping and restarting a live stream creates recordings of the live stream.
Better, okay. There have been a lot of use cases: live streaming concerts; live streaming education is a big one, people doing lectures and things like that; kids' sports games being live streamed; weddings. There's been a big boom in weddings being live streamed. And the thing about HLS live streaming is that it scales. I mean, we've had live streams with hundreds of thousands of concurrent people watching.
Now there's another form of video called WebRTC. You might be familiar with WebRTC: it's when two people are having a chat. It has to be very low latency, so we're talking in the 100 to 200 millisecond range for WebRTC. WebRTC is quite different, because it requires you to have an open connection from the client to the server, so it's not sending HTTP traffic, it's using UDP. And it means you have to keep this open connection and send data back and forth, and that becomes a lot more difficult to scale. So that's for when you're talking about a few people on a call, like a Google Hangout. You can't have a hundred thousand people join that call; you can't have ten thousand people join a call. Once you get over a thousand, it actually gets quite hard to scale WebRTC platforms. So that's why you see stuff over RTMP, where you have RTMP being ingested and then you're delivering over HLS on the delivery side. That scales much better, and it's much easier to scale that kind of thing. But the trade-off is that it introduces latency. That's why what you're seeing is probably about 20 seconds of latency from streaming in OBS to when it's actually delivered into your React app. So that's the trade-off. And now there are solutions coming out, especially with HLS.js version 1.0, that find ways to get low-latency HLS, and that's something I'm personally really excited about. The whole industry is kind of excited about it, because we are going to be able to get low-latency HLS. It requires changes on the server side, the delivery side of things; it also requires changes on the client side and the player side. Yeah, it's really exciting stuff. Brian, it looks like... cool. Not getting the nav bar to show up? You maybe need to import it into both places. So on the home page, import Nav, and I'm putting the nav there. Then let me rename this app, and change the style. App, and then on the recordings.
So you probably need to import the nav into both places. And then I think in my nav, what I actually want to do is import the Link from... oh whoops. So in my nav, I actually don't want to use a hrefs, I want to use Link to. Okay, now you can see the URL is changing as I navigate. Alright, cool, so how are we doing? Why make your own app when you can use Twitch or Instagram without any of this setup? Yeah, that's a good question, Omar. So I guess the use cases to think about are: well, some people just like to stream on their own site. They like to own their audience, where maybe you're a creator and you want to have that direct relationship with your audience. So maybe you have a newsletter and you want to allow people to sign up with their emails for your newsletter, so maybe you just want to stream to your own site and own that. That's one use case. The other use case is, let's say you're building a platform. Let's say you're building an application for, let's see, dance teachers to live stream their dances or something like that, right? So you're building that application. What you can do is build an app where users sign up, they register, they pay a monthly subscription, and then you offer them this service of being able to live stream. So then what you do is you use Mux's API, you create a live stream for every user, for every broadcaster that you have in your application. You give them a stream key, and then they go to OBS, they set up the live stream, they start streaming, and now you're actually able to offer this live streaming platform to your users. And that's really more of the use case. It's not building it as a one-off just for your own site. You can certainly do that, but it's more like you're building an application and you want live streaming to be part of your application. Now you have two choices.
Do you go build all that live streaming infrastructure yourself: set up the RTMP ingest servers, set up the HLS delivery, process the video, deliver the video, integrate with CDNs, do all that stuff? Or you can do what we've done here and use Mux's APIs to create live streams, which gives you unique stream keys for each one. Then you give those stream keys to your end users, and those end users are actually the ones streaming into your platform. Let me know if that makes sense. I think someone raised their hand in the Zoom. If you raise your hand in the Zoom... oh cool, if you have a question, let me know. All right, so how are we doing, are we here? Do we have the live stream page and the recording page? Because now we can start building out the recording page. Index.js here. And I'll share this link again. This code sandbox is public, right? If I go here... I think anyone can access this. Okay, so now let's start building out the recordings page. So let's go back to our Mux dashboard. And what we can do is over here, under Video, we were in the live streams section, right? That live stream is still active, but now if you click over to Assets, this will give you the recordings of your live stream. Now, if you have not started and stopped a broadcast... I can stop the live stream now. And if I stop this live stream, you can see this is actually the asset that's currently active. It's the current live stream that's happening. So if I stop the broadcast, then this will be ready, and now this is the recording of it. And then if I restart the broadcast, that will start a new live stream.
18. Creating Recordings and Assets
It'll start a new active stream on this existing live stream. Every time you start and stop, a new recording is created. You can create an asset by passing a URL to a video file. The video player component takes a playback ID, creates the playback URL, and attaches HLS JS to it. Autoplay is unreliable, but if it's muted, it will usually work.
It'll start a new active stream on this existing live stream. So now this has gone active again. And then when I stopped that... so every time you start and stop is when you'll have a new recording. So I'm going to click into this asset and grab this playback ID. This is from the recording I just did. And now let's make a const up here: playbackId equals this. So let's call this recordings. And then let's go here, go into Assets, let's get a couple more. Here's one. So here's another playback ID. Let's get a couple more. There we go. Now, we have only covered live streaming in this workshop, but an asset is basically a static, on-demand video. We've been creating them as recordings of live streams, but you can also create an asset by clicking the Create New Asset button and just passing in a URL to a video. So this could just be a URL to a video file that's on the internet. It could be in an S3 bucket, a Dropbox file, or whatever you have. Click Run Request, and that will actually create a new video in your account. And this will create a Mux asset. So there are kind of two ways to create Mux assets: create the asset directly, or have it come from a live stream. So what I did here is I just went to the assets, grabbed a couple of playback IDs that I have in my account. So here's a few.
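The Create New Asset button has an API equivalent: POST a video URL to Mux's assets endpoint. A minimal sketch, with illustrative env var names:

```javascript
// Sketch of the second way to create a Mux asset: pass a URL to a video file
// (e.g. a file in S3 or Dropbox). Equivalent to the dashboard's Create New Asset.
function buildAssetRequest(videoUrl) {
  return {
    input: videoUrl,              // URL to the source video file
    playback_policy: ['public'],  // gives back a public playback ID
  };
}

async function createAsset(videoUrl) {
  const auth = Buffer.from(
    `${process.env.MUX_TOKEN_ID}:${process.env.MUX_TOKEN_SECRET}`
  ).toString('base64');
  const res = await fetch('https://api.mux.com/video/v1/assets', {
    method: 'POST',
    headers: { Authorization: `Basic ${auth}`, 'Content-Type': 'application/json' },
    body: JSON.stringify(buildAssetRequest(videoUrl)),
  });
  return (await res.json()).data; // data.playback_ids[0].id is the playback ID
}
```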
Okay, so now I have four. Now, again, in a real application, you would be making an API call to fetch this data from the server, but we're just gonna hard code it here for our purposes. So now let's do recordings.map, grab the playback IDs from the recordings, and return. And now we can take VideoPlayer, playbackId equals playbackId. And what did I mess up here? There we go. So now you can see I have all these recordings here. So let's maybe make these a little smaller. And then let's add some margin in between them. OK. So now you can see we're able to use the same video player component whether it's a recording or a live stream. We could always pass a prop into it to say, treat this as live, or maybe we want to change the styling; we always have that option available to us. But the video player component just takes a playback ID, creates the playback URL as the video source, handles attaching HLS.js to it, and then ends up rendering a video element. So let's make this a little bit better. There we go. Just align them in the center, and we can do... okay. Now how are we doing? Let me check the chat real quick. Ah, autoplay. Autoplay. Hold on. There's a blog post about this somewhere. Yeah. Autoplay is very unreliable. If it's muted, it will usually work. So autoplay muted will usually work, but still not always. There are basically always situations where you have to deal with the potential of autoplay not working. Browsers block it pretty aggressively. Chrome has a whole set of rules that it uses to determine: am I going to allow this page to autoplay or not? It's called the Media Engagement Index. Basically, as you browse around the web, Chrome keeps track of your engagement with video or audio on each website. It keeps this index, and depending on your behavior, it will allow that website to autoplay or not.
For example, if you use Twitch a lot, Chrome will know you use Twitch a lot, and it will allow Twitch to autoplay. If you go to a new website that you have never been to, it will not let that website autoplay, because you haven't been there before. It really causes a lot of headaches. There are all kinds of rules on mobile too: if your battery power is low, the device won't let you autoplay; it will block autoplay from working altogether. The general way to get around it: if the video is muted, autoplay will usually work, so you can go with that rule. That's one thing to keep in mind. It's also inconsistent across browsers: you might see that Safari will autoplay and another browser won't. You have to be prepared for that, and be prepared to fall back to showing a play button when autoplay is rejected.
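The muted-autoplay fallback described above can be sketched as a small helper. `video.play()` returns a promise that rejects when the browser blocks autoplay; on rejection we mute and retry once, and if that also fails the UI should just show a play button.

```javascript
// Minimal sketch of the autoplay fallback: try unmuted, then muted, then give up.
async function attemptAutoplay(video) {
  try {
    await video.play();
    return 'playing';
  } catch (err) {
    video.muted = true; // muted autoplay is allowed far more often
    try {
      await video.play();
      return 'playing-muted';
    } catch (err2) {
      return 'blocked'; // show a play button and wait for a user gesture
    }
  }
}
```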
19. Adding Thumbnails and Enhancing the Video Player
You can now add thumbnails to the video player by passing in specific timestamps. You can also integrate a live stream into your application by keeping track of the live stream's status in a database and displaying the appropriate video player based on the webhooks received. To enhance the appearance and consistency of the video player, you can use the Plyr player (plyr.io), which offers nice controls and a timeline hover preview feature.
With Mux, we can actually get thumbnails from the video. That's one thing we can do, so let's do that. Let's say poster; on the video element it's called a poster image. Let's call this posterSource. And then it's image.mux.com/{PLAYBACK_ID}/thumbnail.png. I'll put that in the chat. With the playback ID, that will allow us to get images. And now posterSource, we can just go onto the video element and say poster equals posterSource. And now you'll see that we have these thumbnails showing on the video player. You'll see it's just like: click the play button and it plays. So there's that; that makes it a little better. And now let's say we want to pass in specific timestamps. So what we can do for our recordings is pass in a thumb time. Let's say for this one we want the 10 second mark in the video. And let's say in this one we want the 4 second mark, in this one the 3 second mark, and in this one the 15 second mark. And we can pass in these thumb times, and then pass that into our video player component. And now our video player component can be a little smarter. It can take the thumb time, and then we can say... let's see. Let's just make this a let. And then if we have a thumb time, then we can... or actually hold on, we can use useMemo here. So if we have a thumb time, then we can just add a query param. And the dependencies for this... well, this should be useMemo, and the dependencies would be playbackId and thumbTime. Okay, so now you can see we're passing in this thumbnail image time; let's just make sure. So right here, you can see this image that we're passing in has a time=10 parameter.
So let's get something more exciting here. This one has time=3, so you can see on this video, that's the thumbnail from time three; this one is time two, this one time six. So you can pass in that time parameter and it'll give you the thumbnail from that exact time. Cool, so now that's a bit better.
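The poster URL logic just described boils down to one small function. The thumbnail endpoint is image.mux.com/{PLAYBACK_ID}/thumbnail.png, and the optional `time` query param picks which second of the video the thumbnail comes from:

```javascript
// Build the Mux poster image URL, optionally pinned to a specific timestamp.
function posterUrl(playbackId, thumbTime) {
  const base = `https://image.mux.com/${playbackId}/thumbnail.png`;
  return thumbTime == null ? base : `${base}?time=${thumbTime}`;
}
```

In the React component this would be memoized, e.g. `useMemo(() => posterUrl(playbackId, thumbTime), [playbackId, thumbTime])`, and passed as the video element's `poster` attribute.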
Okay, who's with me? Who has the thumbnails working? And now we're going to take this up to the next level. And am I still live? Do I go here to the live tab? Yep, I'm still live. Cool, the other idea here is, on this live page, this corresponds to a live stream in Mux's world, right? In the dashboard, we can see this live stream. So what you would want to do in your application is you would have a database, right? With these live stream IDs and with the stream keys. And then if this particular live stream is offline, you would keep track of that in your database. When Mux sends you webhooks, you keep track of that in your database. And then you could show something on this page that says, like, oh, currently offline. And then when you get a webhook that the live stream's active, then you can say, hey, we're online, we're streaming, and you can show the video player of the active live stream at that point in time. Yeah, that works. That works, Nix. Okay. So now this video player is still pretty ugly, right? Because it's just using the basic video element controls, and it doesn't really look so good. It's also not super consistent between browsers; if you go to another browser, it looks slightly different. So now let's make this a little better. There's this player called Plyr (plyr.io), which works pretty well; I've been liking this one. And it has a couple of cool features. Let me open up the Q&A, so I can... Okay, it has a couple of cool features that we can make use of with Mux. So, let's install this. It looks pretty good, if you look here. It looks pretty good, and, then, you see, it has nice controls. There's also this timeline hover preview, which is pretty cool; that's always super handy. You see that on YouTube, where you can hover over the timeline and get those thumbnails as you hover over the timeline. That's super handy, and we can actually get that to work with Mux.
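Back on the webhook bookkeeping point above, the status tracking can be sketched as a tiny mapper. Mux sends events like `video.live_stream.active` and `video.live_stream.idle`; the function name and database shape here are illustrative:

```javascript
// Map a Mux webhook event to the status we'd persist for that live stream.
function liveStatusFromWebhook(event) {
  switch (event.type) {
    case 'video.live_stream.active':
      return 'online';   // broadcaster started streaming: show the player
    case 'video.live_stream.idle':
      return 'offline';  // broadcast stopped: show "currently offline"
    default:
      return null;       // not a status change we track
  }
}
```

In your webhook route you would look up the stream by `event.data.id`, persist the returned status, and have the live page render either the player or an offline message.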
So, let's install that. This is on npm; I believe it's on npm, yeah. Okay, so let's go here, let's add this as a dependency. And then let's update our video player so that we'll be using... I don't know how to actually pronounce this; I'm gonna call it Plyr. So, this is how it works: import Plyr from 'plyr'.
20. Enhancing the Video Player with Thumbnails
Plyr works with HLS.js, so the work we've already done to use HLS.js carries over. Import Plyr, set the video source, and initialize Plyr on the video element. Make the recordings bigger and enhance the UI. Update the video player component to use Plyr. Display preview thumbnails by using a VTT file and a single image containing all the thumbnails. Mux has a feature called storyboards that generates a storyboard of the video. Combine the storyboard with the VTT file to show thumbnails as you hover over the timeline.
Oh, and then there's actually a note in the docs about HLS. Plyr works with HLS.js, so the work we've already done to use HLS.js carries over. And they have a demo here. So, all we need to do is import Plyr here, set the video source, and then, up here, before we do all the HLS stuff, we can just say new Plyr and pass in the video element. And that's really it. Oh, we probably need some CSS. And where does this get installed? Where do we get the CSS from? Hold on, let me check something. I think it is here. Yeah, there it is. Okay, cool. Let's make these recordings a little bigger. Cool. What do you think? So now we have this nicer looking player. You can see this is a live stream, so we probably would want to change the controls, because we actually don't want to show this scrub bar on a live stream. Is it? Home page. This is the live playback ID. Oh, this is from my currently active live stream. Okay. So this is a recording: a screen recording. These are recordings. You can see we have a nicer kind of UI here: volume, the speed controls. These are just built into the player. There's the picture-in-picture control we can pop out, so I can drag this around my desktop. These are all nice features if you're building a video platform. Cool. Now, who's with me so far? Maybe we can update our HLS setup to use this video player. It should just be that one-line change: import the package, import the CSS, and then just initialize it here. Okay. Now there was that one cool feature with the timeline hover previews, where we hover over here. So let's see how that works.
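The "one-line change" above can be sketched as a setup function. It's written with the HLS.js and Plyr constructors passed in (which also makes it easy to test); in the real component you would `import Hls from 'hls.js'`, `import Plyr from 'plyr'` (plus `'plyr/dist/plyr.css'`) and call this from a `useEffect`:

```javascript
// Attach Plyr UI and HLS.js playback to a video element; returns a cleanup fn.
function setUpPlayer(videoEl, src, Hls, Plyr) {
  const player = new Plyr(videoEl); // Plyr wraps the same video element
  let hls = null;
  if (Hls.isSupported()) {
    hls = new Hls();
    hls.loadSource(src);
    hls.attachMedia(videoEl);
  } else {
    // Safari plays HLS natively, so just set the source directly
    videoEl.src = src;
  }
  // Return a cleanup function suitable for a useEffect return value
  return () => {
    if (hls) hls.destroy();
    player.destroy();
  };
}
```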
Preview thumbnails. So it says it's possible to display preview thumbnails as you hover over the scrubber. The way that works is there's a VTT file, and the VTT file basically describes... if anyone's used CSS sprites, the way it works is there's one gigantic image that's broken out into little rectangular boxes. Because the idea is, if you're hovering over this timeline and you had to download a new image for every single position, that would not be so great: you'd be downloading a lot of images in the browser, and that would be a little unwieldy. So let's see if we can find how it's working... not that one. Yes, so this is it. You can see there's this one image that has all the thumbnails in a single image (there are actually two images, but this is one of them). So all these thumbnails are broken out within the one image. So there are kind of two components: you have this one image that contains all the little thumbnails, and then you have another file that tells the player, hey, these are the coordinates in the image for this timestamp. And it goes through each of these and specifies the timestamps for each of them. So now we want to do that in our application. Mux has a feature to do this called storyboards. We take this playback ID, and we can go to image.mux.com/{PLAYBACK_ID}/... and remember we were doing thumbnail.png to get the thumbnails; we can instead request storyboard.png. The very first time you request one of these it takes maybe five to ten seconds to generate, and then it's cached, so the next time you get it, it's instant.
So now you can see this is a storyboard of the video. There are these timestamps; it pulls images from specific times throughout the video, and then we can use this in conjunction with the VTT file so that the player understands where to show these thumbnails as you hover over the timeline. So, let's go ahead and do that.
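The two-part sprite scheme above can be made concrete with a tiny parser. Each storyboard VTT cue maps a time range to a rectangle inside the sprite image using an `#xywh` media fragment; the exact values here are illustrative:

```javascript
// Parse one storyboard VTT cue, e.g.
//   00:00:05.000 --> 00:00:10.000
//   storyboard.png#xywh=256,0,256,160
// into the crop rectangle the player should draw for that time range.
function parseStoryboardCue(timingLine, urlLine) {
  const [start, end] = timingLine.split(' --> ');
  const [image, frag] = urlLine.split('#xywh=');
  const [x, y, w, h] = frag.split(',').map(Number);
  return { start, end, image, x, y, w, h };
}
```

Given the mouse position over the scrubber, the player computes the hovered timestamp, finds the cue whose range contains it, and draws that x/y/w/h crop of the single sprite image.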
21. Enabling Storyboard Thumbnails
Storyboard URL is image.mux.com/playbackID/storyboard.png. Storyboards only work for recordings, not live streams. To enable storyboard thumbnails in the video player, add a new property called isLive and set it to true for live streams. Pass the options object with enabled set to true and the source as image.mux.com/playbackID/storyboard.vtt. This will display the timeline hover preview feature.
Storyboard URL is image.mux.com/{PLAYBACK_ID}/storyboard.png. That's it. Yep. Now, storyboards only work for recordings; they don't work for live streams. So, let's add a new property to our video player component and call it isLive, defaulted to false. And then on the home page, when we render our video component, we'll pass isLive true into there. So now, in our video player component, when we initialize the video, what we want to do is, let's say options, and let's pass these options into Plyr. And then in the options, previewThumbnails. Okay, so we pass this an object with an enabled boolean and a src. So let's do that. Okay, now if it's not live, then we'll set options.previewThumbnails with enabled true, and then the src we pass in will be image.mux.com... and we don't actually pass in the storyboard image directly. What we pass in is storyboard.vtt. So let's take a look at what VTT is. This is the storyboard PNG that contains all the images. Now VTT is this WebVTT format file, and it describes where each thumbnail is. Slow down a bit, okay, cool. So let's go back to here. I added an isLive prop to the video player and it defaults to false. So on the homepage, when we render the video player, we're passing in isLive true, and on the recordings, we're not passing in anything, so we get the default isLive false. So now, if this video player is not for a live stream, I set options.previewThumbnails, and the value is this object: enabled true, and the src is image.mux.com/{PLAYBACK_ID}/storyboard.vtt. Let's see if it works. So you can see that's working now. So now we have that cool timeline hover preview feature.
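The options logic just described can be sketched as a small builder: recordings get preview thumbnails driven by the Mux storyboard VTT, live streams don't (the storyboard doesn't exist for a live stream).

```javascript
// Build the Plyr options for a given playback ID, skipping preview
// thumbnails when the stream is live.
function buildPlyrOptions(playbackId, isLive = false) {
  const options = {};
  if (!isLive) {
    options.previewThumbnails = {
      enabled: true,
      src: `https://image.mux.com/${playbackId}/storyboard.vtt`,
    };
  }
  return options;
}
```

Usage would look like `new Plyr(videoEl, buildPlyrOptions(playbackId, isLive))`.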
22. Thumbnail Previews and Video Players
The VTT file, storyboard.vtt, describes the timestamps and coordinates of each thumbnail in the video. The player uses this information to show thumbnail previews as the user hovers over the timeline. The backend creates the storyboard and VTT file, while the player, Plyr, supports this thumbnail preview format. In the live stream, the progress timeline is hidden, enhancing the live experience. In the recording, extra controls like progress and timeline are shown. The workshop participants can ask questions or seek help with their apps. Different video players are available for the web, such as Plyr, HLS.js, Video.js, JW Player, and THEOplayer. On iOS, AVPlayer is the standard player, while ExoPlayer is used on Android. Commercial players like JW Player and THEOplayer require licensing. Troubleshooting is done to get thumb time working; the thumb time controls which timestamp's thumbnail is displayed.
Okay, so to back up: this VTT file, storyboard.vtt, this is what it looks like. It's this text file format, and it describes where each thumbnail is. These are the timestamps in the video, so it says from this timestamp to this timestamp, this timestamp to this timestamp, and it goes through and describes where each thumbnail actually lives: open up this storyboard.png file, and then here are the x, y, width, and height. So it gives the x coordinate, the y coordinate, the width, and the height, and the player can use this information to understand how to show the thumbnails as the user is hovering over the timeline.
So the user is hovering over the timeline. Based on where the mouse is, the player knows how many seconds into the video that is, and based on that it can use this VTT file to show a little thumbnail preview as the user hovers around. So, does anybody have that working? Now, as I go through all these, I hover over the timeline and I can actually see the little preview of the video. So there are kind of two pieces at work here. First is the backend side, on the Mux side: actually creating the storyboard and creating that VTT file. And then there's the player side of it: we're using this player called Plyr, and Plyr supports this standardized format for doing this kind of thumbnail preview. So, let me pause right there, check the Discord. Yeah, Nix, that's what I was thinking. I think you could just do enabled based on isLive, but I figured maybe the player would try to download the file, and the file won't exist if it's live. So, yeah, I just thought this was more explicit, but you're exactly right. Nice job, Jordy. Now, this is live, so we probably also would want to hide this progress timeline when it's live, so we can probably do that too. Controls. Yeah, this is the controls; this is the default. So I think what we can do here is say: if it's live, what do we want? If it's live, we want to show the big play button, play-large, and the play button. We don't want to show the progress. We don't want to show current-time. We do want to show mute, and we can show captions and volume. We don't want to show the settings when we're live, because it doesn't make sense to change the speed when you're live; you can't go faster than 1.0. So we can just hide the settings panel when it's live. Let's do that to override the default controls when it's live.
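The live-vs-recording controls choice above can be sketched with Plyr's control names. The exact set you show is a design choice:

```javascript
// Pick which Plyr controls to show, dropping the scrub bar, current time,
// and speed settings for a live stream.
function controlsFor(isLive) {
  if (isLive) {
    return ['play-large', 'play', 'mute', 'volume', 'captions', 'fullscreen'];
  }
  return [
    'play-large', 'play', 'progress', 'current-time',
    'mute', 'volume', 'captions', 'settings', 'pip', 'fullscreen',
  ];
}
```

This merges into the options passed to Plyr, e.g. `new Plyr(videoEl, { controls: controlsFor(isLive) })`.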
Yeah, so now in the live stream, I'm playing this, I have the mute, and I don't have the settings panel, because it doesn't make sense to change the speed. So that's another way we can enhance the live experience. Then when I go over to a recording, the player looks a little different because I have those extra controls: the progress, the timeline. So where's everyone at? Anyone need specific help with their app? This was kind of the main part of the coding workshop, so I'm happy to take this whatever direction you folks want. We can dive a little more into this, answer questions. If anyone has a specific problem they're running into, we can debug that. The other thing I'll talk about is, on the web, there's a handful of players that can be used. This one we used, Plyr, is one of them. HLS.js is kind of the standard for just getting HLS functionality, but you'll see that HLS.js doesn't have any UI components. HLS.js just provides the actual HLS functionality; it grafts that functionality onto the video element. It doesn't have any UI features. Plyr actually has the UI features to make it look nice, and we can theme it with CSS variables and stuff like that. Now, when you switch over to mobile, a native mobile app, so we're talking an iOS app or an Android app, there are really not many options. iOS ships with a player called AVPlayer, part of the AVFoundation framework. AVPlayer is kind of the canonical one that everybody uses; it's really the only one Apple lets you use. There are some other player frameworks built on top of AVPlayer, but on iOS, Apple controls that pretty tightly. Android has an equivalent called ExoPlayer. So there are these known defaults on iOS and Android, but when you get to the web, it's kind of a Wild West: there's a bunch of different players out there with different features and functionality. Another one is Video.js.
Video.js is another one. There's Plyr, and then there are commercial players too, like JW Player and THEOplayer; those are ones you have to license and pay to get access to. Got thumb time working, cool. Let's take a look. So the first step is in the array of recordings... do you have... oh, nice, what's this? So yeah, CRS 1138, I believe what you need to do there is, over here, you need to return a function that calls hls.destroy(); otherwise your live reload is not going to work. So try that: try doing this return here. But what you also have to do is declare this with let hls at the top, and then down here, instead of assigning const hls equals new Hls(), just assign to the variable you declared up top. And then in the function you return, if hls is defined, call hls.destroy(). Couldn't get thumb time working? Yeah, so what thumb time controls is... for example, this is a good video to look at; it's 23 seconds long. That's the second video in my list of recordings.
23. Debugging Thumb Time and Account Information
To debug thumb time, check if you're passing the thumb time to the video player component. If you have any questions or need assistance, send a message to the support channel. To unlock more credit, sign up for a Mux account with your own email address and enter your credit card information. Mux sponsors developers creating content and offers credit in exchange for displaying the Mux logo. Streaming to YouTube and Twitch follows a similar process to what was demonstrated. HLS is the standard delivery protocol for most video content on the internet. Setting up a Mux account without a credit card has limitations on live stream and video asset duration. Plyr connects to the video element and can be used to manipulate timestamps. Vercel and Netlify are recommended hosting platforms. If storyboard thumbnails are not appearing, ensure that options are passed to Plyr and be patient for the first download. Debug using the Network panel by filtering for storyboard files.
So this is the thumbnail from three seconds, but if I change this to six and refresh here, you'll see that the thumbnail is different now. So to debug thumb time, you can first check that you have thumb times here in recordings; when you map the recordings, are you passing it into the video player component? That would be one thing to check: is it being passed in here. So you want recordings.map, deconstruct that object, pass thumbTime here. And then in the video player, thumbTime should be coming in here at the top, and then you'll want to use this useMemo function. Looks good, Geordie. Yeah, I have this in CodeSandbox; you could probably clone this sandbox and just run Vercel and it'll deploy. Cool, so now, the end of this workshop. One thing I want to throw out there, since I know some folks are starting to leave: the Mux account you're in right now will be deactivated after this workshop. If you want to keep playing around with Mux, we love that; sign up for a Mux account with your own email address. Like I said, because of abuse, you have to put in a credit card, and when you put in your credit card, you'll automatically get $20 of free credit. So that'll unlock $20 of credit. It's pay-as-you-go billing, so $20 will get you some usage where you could do some live streaming, some videos, stuff like that. But we have more for you: if you just send a message to the support channel and say, hey, you came in through the Reactathon workshop, then we'll throw $100 of credit on your account. You do have to enter the credit card in order to unlock the credit and in order for us to add more credit to your account; that's just because we have to deal with a lot of abuse on our platform. It's one limitation of our billing system. So there's that.
Say you came in through Reactathon, and we will be happy to add $100 of credit to your account. And if you are creating content for developers, let's say you're making videos on your site, we love sponsoring those kinds of things. So if you are creating developer content, doing live coding, creating resources and videos, you can email me directly and we can talk about it; we love sponsoring those kinds of folks. We're happy to throw a bunch of credit your way and come to some kind of agreement where you put the Mux logo on your page or something like that, and we'll work something out if that's something you're interested in. So yeah, we love doing that. How do websites like YouTube do it? They do live streaming basically in a very similar way to what we did today: if you go on YouTube and start a live stream, the next step is they give you a server URL and a stream key, and then you go into OBS or whatever encoder you're using, enter your stream key, and start streaming. It really works pretty much exactly like what we did today. So what you saw today is how it works when you're streaming to YouTube, and how it works when you're streaming to Twitch. It's all using the standard protocols, and on the delivery side, YouTube and Twitch work over HLS, like what we were just seeing today. Even when you watch YouTube videos on demand, when you watch Netflix, when you watch most of the video on the internet, if it looks good, it's probably going over HLS. HLS is kind of the standard video delivery protocol. Do I need a credit card? No; if you set up an account with no credit card, it just has some limitations: live streams are limited to 10 minutes, and all video assets are limited to 10 seconds and are watermarked.
So that's a limitation if you don't put in a credit card. If you do put in a credit card, that unlocks $20 of free credit, and then if you send us a message, we'll bump that up to $100. Happy to do that. Nix, yeah, that's correct. So Plyr just connects to the video element, so you actually never have to use that variable, unless you want to manipulate it later, like change the timestamp or something. Yeah, Vercel is kind of amazing, or Netlify. Oh, I didn't know you could connect those from CodeSandbox. Whoa, nice. That is awesome, I had no idea. Awesome, Brian. Yeah, dylan at mux.com, give me a shout and we can talk more. Oh, the VTT: it should work for long videos. What are you running into? Yeah, it's probably the very first time it downloads the storyboard file. Mux actually generates the storyboard on the first request; we lazily generate it. The first request for that storyboard file takes maybe five to ten seconds; every subsequent request after that is cached, so then it's super fast. That's probably what you're running into. Still no storyboard thumbs? Can you make sure you're passing the options into Plyr? And also, like I said, the first time it downloads it might take a little longer, so you might not see it the first time, but the next time you would. So check on that. The other way to debug it is to open the Network panel, go to All, and filter by storyboard; then you should see it downloading the storyboard.vtt file and the storyboard image. Try that out.
24. Playback Security and Video Dimensions
If it's not downloading the files, then the configuration is incorrect. If it is downloading the files, then it should be working. Playback security is an important aspect to consider. Mux offers a solution similar to using signed URLs on S3. By creating a signed URL and attaching it to the playback ID, the playback ID can only be used with a token parameter. This ensures that the playback ID is not public and can only be accessed with a valid signature. Mux preserves the dimensions of the original video source, including thumbnails and the Storyboard preview.
If it's not downloading these files, then it's not configured right. If it is downloading these files, then it should be working. Oh yeah, it's too small. Yeah, that could be. And then we talked about playback security. I'm just gonna open up our docs here. We have a guide for secure video playback. What this does is... remember we were just dealing with playback IDs, right? Your playback ID is essentially public: you're just publishing that playback ID. So if anyone were able to inspect the element or inspect the network and get the playback ID, they could essentially lift that video and use that playback ID on their site, right? They would just grab that from you. So one solution we have to deal with that, if you're used to using signed URLs on S3, works the same way: you create a signed URL and attach it to the playback URL, stream.mux.com/{PLAYBACK_ID}.m3u8, as a token parameter. And then that playback ID can only be used with a valid token parameter. So you actually create a playback ID that is not public. When you create a playback ID, by default the playback policy says public; if I change this to signed, then view the asset, now there's no public playback ID. This is a signed playback ID. So the only way to access this playback ID is to generate a signature, and in that signature there's a timestamp, so it's only valid for a certain amount of time. The same way you would generate a signature to access a URL on S3 or something like that. Yeah, so whatever dimension video you have, you throw it into Mux, and Mux will preserve those video dimensions. We don't do any cropping or anything like that.
So any kind of video you throw in will work, and the thumbnails and the Storyboard preview and all that stuff will just respect the aspect ratio of the original video source that you sent in.
25. MUX Assets and Pricing Model
In this part, the speaker discusses the benefit of using MUX assets instead of videos directly. MUX processes video files and creates different quality levels to optimize streaming based on the user's device and internet connection. The speaker also mentions the pricing model for MUX, which is based on the minutes streamed, delivered, and stored. They highlight the advantage of using minutes as an estimation rather than gigabytes. The part concludes with the speaker expressing gratitude to the audience, sharing their email for further questions or connections, and thanking everyone for joining the workshop.
Oh, I'm just seeing these questions. Okay. Video source, I think we answered that. Video source here was: you take the playback ID and you create this video source variable, which is the .m3u8 file. And shouldn't those be Route inside a Switch tag? Yeah, you got me there, that's right. I think we need to use a Route. I think I was using Router, so yeah, that's correct.
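For reference, the video source variable being discussed is just the playback ID dropped into the stream.mux.com HLS URL format; a tiny helper (the function name is my own) might look like:

```javascript
// Build the HLS source URL for a Mux playback ID, as shown in the workshop.
function hlsSource(playbackId) {
  return `https://stream.mux.com/${playbackId}.m3u8`;
}

// In the browser, this URL is what you would hand to hls.js:
//   const hls = new Hls();
//   hls.loadSource(hlsSource(playbackId));
//   hls.attachMedia(videoElement);
console.log(hlsSource("abc123"));
```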
Oh, yeah, the benefit of creating MUX assets versus using videos directly. So that's a good question: why even use MUX assets in the first place? So this video file here is a MUX asset, so it's using HLS. Now, if you just used the MP4 directly, let's say this video player just pointed at an MP4 file sitting on a server, what happens is a lot of users who try to download that video are going to have a bad time, because that MP4 is probably a high-def version of the file that takes up a lot of bandwidth. If users are on a slower connection or a mobile device, things like that, they won't be able to download that video fast enough for playback. So what MUX does, and what HLS does, is MUX processes that video file and creates a bunch of different quality levels of it. From one single HD video file, we might create five different quality levels. Then you take an HLS URL, and now when the user's watching that video, it's adaptive: they'll get the quality level that suits their internet connection and their device requirements at that time. So if you have a big high-def video file and someone's on a mobile device with a small screen, it doesn't make sense to send that giant MP4 down to their device; it's going to be wasted. Basically, you're wasting a bunch of bandwidth by sending data you don't need, because that video file is too big for their device. Or they're on a slow network and they can't download it fast enough, so they can't even watch your video.
So the whole thing HLS does is process your video into a bunch of different quality levels, and then your device adapts and optimizes the streaming based on the quality levels that are available. That's the whole purpose behind it. And that goes back to what Ed demoed in the beginning, when he was throttling the network and the quality of the video was changing.
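Under the hood, those quality levels are advertised in an HLS master playlist that the player fetches first. A simplified, made-up example with three renditions looks roughly like this (real playlists carry more detail, such as codecs and frame rates):

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=2400000,RESOLUTION=1280x720
720p/playlist.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1200000,RESOLUTION=854x480
480p/playlist.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=600000,RESOLUTION=640x360
360p/playlist.m3u8
```

The player measures its own download throughput and switches between these rendition playlists mid-stream, which is exactly the quality change visible in the throttling demo.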
Oh yeah, that code sandbox. Let me share that again. Let me give this a better name. The old link should still work. Oh, we answered that, how YouTube does live streaming. Pricing model in Mux: it's per minute streamed, delivered, and stored. So if you upload a video file, it's per minute per month that we hold on to that video file, and then it's per minute delivered. There's a pricing calculator on the website, so you can calculate: average video of 4 minutes, you want to upload 500 videos, you have 75,000 people watching, and they watch about 50% of your video, and it calculates the rough cost of that. You can play around with these and figure out roughly what it's going to cost. A lot of other platforms use bandwidth estimation, like, oh, this is how many gigabytes. I find bandwidth is actually quite hard to estimate; minutes is a lot easier. If you're building a video platform, you might not get it perfect, but you probably have a better idea of, hey, we have X hundreds of people coming, they watch about X number of minutes of video, and we have Y number of hours of video stored. That makes the math easier than trying to figure out how many gigabytes it is; I find that's not super useful. It's much more useful to use minutes as an estimation. It's all pay as you go. And then, of course, once you get up to higher plans, we have annual commitments that bring down the per-unit cost. Okay. Awesome. Thank you, everybody, for joining. I shared my email, so feel free, if you have any other questions after this or just want to connect: dylan@mux.com. Happy to answer any more questions you have about video or about Mux specifically. And this was great. Code sandbox, let me share that again. Enjoy the rest of the conference. Thanks for coming. This is by far the biggest workshop we've had, so this was great. We'll definitely do more of these. Thanks for joining and, yeah, keep in touch.
I'll be hanging out in the discord for the rest of the conference, so feel free to catch me there.
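As a footnote to the pricing discussion above, the calculator math is simple enough to sketch by hand. The per-minute rates below are invented placeholders for illustration, not Mux's actual prices; check the pricing page for real numbers:

```javascript
// Hypothetical per-minute rates, made up for illustration only.
const RATE_STORED_PER_MIN = 0.003;     // $/min stored, per month
const RATE_DELIVERED_PER_MIN = 0.0009; // $/min delivered

function monthlyCost({ videos, avgMinutes, viewers, watchedFraction }) {
  const storedMinutes = videos * avgMinutes;
  const deliveredMinutes = viewers * avgMinutes * watchedFraction;
  return (
    storedMinutes * RATE_STORED_PER_MIN +
    deliveredMinutes * RATE_DELIVERED_PER_MIN
  );
}

// The workshop example: 500 four-minute videos, 75,000 viewers
// each watching about 50% of a video.
console.log(
  monthlyCost({ videos: 500, avgMinutes: 4, viewers: 75000, watchedFraction: 0.5 })
);
```

The point being made in the transcript holds here: every input is something a product team can estimate directly, with no need to convert watch time into gigabytes first.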