Introduction to Remotion, a new library with which we can turn React code into MP4 videos, create our motion graphics programmatically, and server-side render them. I give an overview of the principles and the philosophy, and of course we are going to code a video!
Creating Videos Programmatically in React
AI Generated Video Summary
The Talk discusses the use of Remotion, a library that allows for declarative video creation in React. It covers the process of creating videos, animating elements, and rendering multiple compositions, as well as features of Remotion such as sequences, audio support, and server-side rendering. Remotion 2.0 introduces audio support and the possibility of live streaming. The Talk concludes by highlighting the frustration with existing video editing tools that motivated the project and the integration of existing videos into Remotion projects.
1. Introduction and Remote Conference
Hey, everyone! I'm Jonny Burger, joining you from Zurich, Switzerland. Let's make the most of this remote conference. Pre-recorded videos, like the Apple keynote, are more dynamic and enjoyable to watch.
Hey, everyone, and welcome. My name is Jonny Burger, and I am coming to you from beautiful Zurich, Switzerland. Unfortunately, we can only do this conference remotely, which means that I am at home, and you are at home, and you don't see me personally, but you just see a video. But it's not so bad. I can make a more interesting video, I can choose different angles, and I can edit out all the mistakes that I make. Have you seen the Apple keynote recently? Now that they pre-record everything, their videos are much more dynamic and much more enjoyable to view.
2. Writing Videos Declaratively in React
I want to reuse elements in my video and parametrize them like I do with React components. I made a library called Remotion which allows you to write real MP4 videos declaratively in React. Making a video in React sounds like something very magical, but it is not a black box, and in order to use Remotion correctly we need to know how it works. We start off in a React project and define a composition, passing in the width, the height, the duration, and the frame rate of the video, and which component will be used to render it. Once we have written our video, we enter a three-stage rendering process, opening multiple tabs and taking multiple screenshots at the same time.
Anyway, normally we edit these videos in graphical programs like Adobe Premiere, iMovie, After Effects, or DaVinci Resolve. But for me as a developer, something has always been missing. I want to reuse elements in my video and parametrize them like I do with React components. I want something that is more dynamic than just copy-pasting layers and undoing and redoing my actions. I want something that is more declarative.
Also, version control is horrible for videos. If I save my project, close the program, and reopen it, then basically all my history is gone and I only have a snapshot of the current state, unless I make copies of my file with v1, v2 suffixes. And sometimes the program will just crash and all my progress is gone. I hope you see where I am going with this. I want to write videos programmatically.
I made a library called Remotion which allows you to write real MP4 videos declaratively in React. I am such a big believer in it that this video itself was edited in React using Remotion, at least all the videos that I am submitting to React Summit. This is also an open-source video, so you can go to the GitHub link you see on the right right now and visit the source code of the video; all the footage is there, and all the edits and slides are written in React.
Making a video in React sounds like something very magical, but it is actually not, and it is not a black box; in order to use Remotion correctly, we need to know how it works. So let's take a bird's-eye view of how Remotion turns code into a video. We start off in a React project and define a composition. A composition is basically a video, but it has metadata on it that we have to define: we pass in the width, the height, the duration, and the frame rate of the video, and which component will be used to render it. Inside that component, we call the useCurrentFrame hook to get back the current time, expressed as the current frame, which is an integer, and based on that time we render anything we want using our favorite web technologies. It's important that we use this frame variable to drive our animation rather than something else like a CSS transition. More on that later.
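As a rough sketch, the setup might look like this (the file layout, component, and numbers here are illustrative, not taken from the talk):

```tsx
// src/index.tsx — register a composition with its metadata (illustrative)
import React from 'react';
import {Composition, registerRoot, useCurrentFrame} from 'remotion';

const MyVideo: React.FC = () => {
  // The current frame is the single source of time for the video.
  const frame = useCurrentFrame();
  return <h1>We are at frame {frame}</h1>;
};

const RemotionRoot: React.FC = () => (
  <Composition
    id="MyVideo"
    component={MyVideo}
    width={1920}
    height={1080}
    fps={30}
    durationInFrames={150} // 5 seconds at 30 fps
  />
);

registerRoot(RemotionRoot);
```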
So, once we have written our video, we enter a three-stage rendering process. The first step is to bundle the video using webpack, much like any other web application. In the second step, we open a headless Chromium instance, open the web page in it, and take a screenshot for each frame of the video. To make this process efficient and fast, we need to parallelize it. So we actually open multiple tabs. The number depends on the number of cores of your CPU, or you can also customize it, but you want some kind of parallelism if you can afford it. We open multiple tabs and take multiple screenshots at the same time. This is the reason why it's very important that we use the useCurrentFrame hook to drive all our animations. If you use something like a CSS transition or a requestAnimationFrame, or you try to include a GIF, it will not work, because during rendering we open multiple instances of your animation. The contract is pretty much: if I give you a frame, you must return a static image that does not have any side effects.
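To make that contract concrete, here is a minimal, hedged sketch: the markup is a pure function of the frame, so it renders identically no matter which tab screenshots which frame. The component and timing values are made up for illustration:

```tsx
import React from 'react';
import {useCurrentFrame} from 'remotion';

export const FadeIn: React.FC = () => {
  const frame = useCurrentFrame();
  // Derive the opacity from the frame: fade in over the first 30 frames.
  // The same frame always produces the same image, with no side effects.
  // A CSS transition would animate on its own clock instead, which breaks
  // when frames are captured out of order across parallel tabs.
  const opacity = Math.min(1, frame / 30);
  return <div style={{opacity}}>Hello React Summit</div>;
};
```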
3. Creating Videos with Remotion
To create a video using Remotion, we need to ensure that all animation is synchronized with the current time. After collecting all the frames, we stitch them together using FFmpeg. In the demo, we open the terminal, install the necessary dependencies, and open the development server to preview the video. We define compositions in the entry file of Remotion and can have multiple compositions. An idea for a video project involves creating graphics for each speaker at React Summit by importing the list of speakers and animating them.
If you have any side effects where things animate independently of the current time that Remotion says it is, then the video will have some artifacts or some lag. Assuming you don't trip over that stone, we can then collect all the frames in the third step and stitch them together using FFmpeg, and tada, we have a real video that we can then post on social media, for example.
Of course there is much more to it, like how to add audio, but I think this is the most important concept in Remotion that you should know. It's time for a quick demo. Let's create our first video together. To get started, we simply open up the terminal and type in yarn create video. And the installation has finished. So let's cd into our video and open it up in VS Code. Once we have done that, let's run npm start to open up the development server, and you will see that we get a visual preview of the video. Let's wait a moment until it's done loading. And there we go.
As you can see, there is a Hello World project included in Remotion. And there's also a timeline with which you can essentially control the current frame. I have made a quick cut to adjust my project a little bit: I removed all the Hello World stuff and replaced it with a black canvas. Also, I have upgraded the Remotion version to a version that will be out by the time you watch this video, but was not out at the time I was recording. Anyway, this is how we define a composition in this entry file of Remotion: we define the ID, the width, the height, the duration, and the frame rate of a video, as well as the component, as I have previously mentioned. We can also have multiple compositions; they need to have different IDs, and then you can just switch between them in the left sidebar.
So, I have replaced this with a black canvas. And now I want to tell you my idea of what kind of video we can make together. I saw on Twitter that React Summit is making a graphic for each speaker and posting it; it has the profile picture, the talk name, and everything about each speaker on it. And I also noticed that there's a JSON file in the source code of the React Summit website that contains all the speakers. So I thought it would be fun to create all these graphics at once, and also animate them so they are a video, not an image. So this is what I've done: I've imported the list of all React Summit speakers into this project. We don't have much time, so I'm gonna code something really quick. See you in a second. What I have quickly created is, pretty much, normal React markup and CSS. As you can see, I did not use any library.
4. Animating Elements with Remotion
You can use any library that you want. Instead of a lowercase image tag, I used the image component from Remotion. It works the same as the native image element, but with a loader that waits for the image to load before rendering. I added circles using SVG. Now, let's animate it. We use the useCurrentFrame hook to drive the animations. We can animate the profile picture by scaling it up using the interpolate helper function and apply the scale using CSS. For smoother animation, use the spring function and pass in the frame and frames per second. Let's give it a try!
You can use any library that you want, but here I just did this with normal markup and CSS. Just one thing to note: instead of a lowercase image tag, I used the image component from Remotion. It works pretty much exactly the same as the native image element, except that it is wrapped in a loader that will wait until the image is loaded before it renders the frame, so that your video will not contain an image that is not fully loaded. Also, I have added some circles here in the background using SVG, which is also pretty cool.
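A minimal sketch of that image component, assuming the Img export from remotion (the avatar component and styles are made up for illustration):

```tsx
import React from 'react';
import {Img} from 'remotion';

// <Img> delays the rendering of the frame until the image has loaded,
// so no screenshot is ever taken with a half-loaded picture.
export const Avatar: React.FC<{src: string}> = ({src}) => (
  <Img src={src} style={{width: 200, height: 200, borderRadius: '50%'}} />
);
```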
Anyway, now that we have this basic setup, which I did not go through in full detail because you probably already know at least a little bit of React, let's animate it. We need to drive all our animations using the useCurrentFrame hook. So I'm gonna say const frame equals useCurrentFrame, and now we need to animate things over time.
Let's say we want to animate the profile picture so that it scales up. For that we can use the interpolate helper function in Remotion. I'm gonna say scale, and I'm going to interpolate the scale over time. What I mean by that is I pass in what drives the animation, the frame, and I give an input range. I say from 0 to 30, which basically means that the duration of the animation is 30 frames, and I'm just gonna say it will scale from 0 to 1. Now I apply this style using normal CSS to the image, and as you can see, my image is scaling in over the duration of 1 second. It doesn't look that smooth, which is why I prefer to use a spring animation. So let's get rid of that and say const scale equals spring. For that we need to pass in the frame and the current frames per second. This is the information that the spring function needs in order to calculate the animation value. Let's just give this a try, and there we go! It's animating in! Let's also pass this scale to our circles in the background. Alright, looking pretty good, right? Let's slide in the title as well. We can say const titleTranslation, for example. And I love to use spring animations for everything. Let's pass in fps and frame, but this time let's do it a bit more advanced.
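Before moving on, here is a rough sketch of the scale animation just shown, with the linear variant kept as a comment (component name and styles are illustrative):

```tsx
import React from 'react';
import {Img, spring, useCurrentFrame, useVideoConfig} from 'remotion';

export const ProfilePicture: React.FC<{src: string}> = ({src}) => {
  const frame = useCurrentFrame();
  const {fps} = useVideoConfig();

  // Linear variant (needs `interpolate` from remotion):
  // const scale = interpolate(frame, [0, 30], [0, 1], {extrapolateRight: 'clamp'});

  // Spring variant: a smoother, physically based approach toward 1.
  const scale = spring({frame, fps});

  return <Img src={src} style={{transform: `scale(${scale})`}} />;
};
```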
5. Animating Elements and Creating Multiple Videos
We can adjust the physical parameters of the spring animation, such as increasing the damping to prevent overshooting. By lying to the spring function about the current frame, we can delay the transition. We also need to interpolate the range from 0 to 1 to something like 0 to 1000. After fixing a small mistake, we animate from 1000 to 0 and apply the translation to all elements. Now, let's create animations for each speaker at React Summit using a JSON file and the .map function.
We can play around with the physical parameters of our spring animation. I like to increase the damping so that it will not overshoot. As you can see here it overshoots and then becomes smaller again. I want to delay this transition so that the animation doesn't immediately start.
For that, I simply lie to the spring function about which frame it currently is, and I say: actually, it is 20 frames earlier than you think. Now the value goes from 0 to 1, as any spring animation does. We also need to interpolate it, so that the range is not between 0 and 1, but something like 0 and 1000. So in here, the input range is 0 and 1, and the output range is 0 to 1000. Then I pass it to a transform.
Okay, so I made a small mistake there. I accidentally animated in my name instead of the talk title, and also it's going down instead of up. But of course, with fast refresh we can easily fix these things. We actually want to animate in from 1000 to 0. Let's take a look at how that looks. All right. Let's also apply this translation really quick to all the other elements, which I think will look good. I'm just going to copy-paste it over here. Maybe you will find a more engineered way to do so, but I think this gets the job done quickly. This is how our video looks now. Nice!
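Putting the last few steps together, a hedged sketch of the delayed, non-overshooting slide-in (the 20-frame delay and damping idea follow the talk; the component itself is illustrative):

```tsx
import React from 'react';
import {interpolate, spring, useCurrentFrame, useVideoConfig} from 'remotion';

export const Title: React.FC<{children: React.ReactNode}> = ({children}) => {
  const frame = useCurrentFrame();
  const {fps} = useVideoConfig();

  // "Lie" to the spring about the frame to delay the animation by 20 frames,
  // and raise the damping so the spring does not overshoot.
  const progress = spring({
    frame: frame - 20,
    fps,
    config: {damping: 200},
  });

  // Map the 0..1 spring progress to a slide from 1000px down to 0.
  const translateY = interpolate(progress, [0, 1], [1000, 0]);

  return (
    <div style={{transform: `translateY(${translateY}px)`}}>{children}</div>
  );
};
```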
Now for what I think is the most fun part. Let's turn one video into dozens of videos. Let's create one of these animations for each of the speakers appearing at React Summit. For this, I have already, as shown before, imported a JSON file into the project where all the metadata is, and I'm just gonna use .map to iterate over each speaker and return a composition for them. So it might look something like that. Let's give each one a unique key. It's also very handy that there is a slug property on each speaker.
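A sketch of that map over the speakers (the JSON file name and its fields, such as the slug, are assumptions based on what's described in the talk):

```tsx
// src/index.tsx — one composition per speaker (illustrative)
import React from 'react';
import {Composition} from 'remotion';
// Assumes "resolveJsonModule" is enabled and the file exists with this name.
import speakers from './speakers.json';
import {SpeakerCard} from './SpeakerCard';

export const RemotionRoot: React.FC = () => (
  <>
    {speakers.map((speaker) => (
      <Composition
        key={speaker.slug} // unique React key
        id={speaker.slug} // the slug doubles as a unique composition ID
        component={SpeakerCard}
        width={1080}
        height={1080}
        fps={30}
        durationInFrames={60} // a two-second video at 30 fps
      />
    ))}
  </>
);
```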
6. Dynamic Composition and Video Rendering
Let's make the composition dynamic by allowing the component to take props like speaker name, company, avatar URL, and talk title. We can define default props for each composition and later overwrite them on the command line. Now, let's create an MP4 video using a specific composition ID and run the build command to render the video.
Yeah, there we go. And now, as you can see, all the speakers are in the left sidebar. So now let's make the composition dynamic by allowing the component to take some props. Let's say speakerName should be a string, company should be a string, avatarUrl can also be a string, and what else is missing? talkTitle. I'm just going to accept these props and fill them in, in my video. Here I have a small naming conflict with a style, so I'm just going to rename this. Let's put this as the talk title. This is the speaker. Sorry, another naming conflict. I hope this will solve it. Okay, it was just an indentation problem. Never mind. Here, let's put the avatar URL. Now we can define some default props for each composition, and I say default because these props can later be overwritten on the command line. But for the preview here in the browser, we're going to put some default props: the avatarUrl is going to be speaker.avatar, company is speaker.company, speakerName is speaker.name, and for the talkTitle, let's do speaker.activities.talks[0].title. I think that's going to be it. And let's cast this to a string. There we go!
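Extending the earlier sketch, the prop-driven card and the per-speaker defaults might look roughly like this (all field names are assumptions reconstructed from the talk, not verified against the actual JSON):

```tsx
import React from 'react';
import {Composition, Img} from 'remotion';
import speakers from './speakers.json';

type SpeakerCardProps = {
  speakerName: string;
  company: string;
  avatarUrl: string;
  talkTitle: string;
};

// One component renders any speaker, driven entirely by props.
const SpeakerCard: React.FC<SpeakerCardProps> = ({
  speakerName,
  company,
  avatarUrl,
  talkTitle,
}) => (
  <div>
    <Img src={avatarUrl} />
    <h1>{talkTitle}</h1>
    <h2>{speakerName}</h2>
    <h3>{company}</h3>
  </div>
);

export const RemotionRoot: React.FC = () => (
  <>
    {speakers.map((speaker) => (
      <Composition
        key={speaker.slug}
        id={speaker.slug}
        component={SpeakerCard}
        width={1080}
        height={1080}
        fps={30}
        durationInFrames={60}
        // Defaults for the browser preview; overridable on the command line.
        defaultProps={{
          speakerName: speaker.name,
          company: speaker.company,
          avatarUrl: speaker.avatar,
          talkTitle: String(speaker.activities.talks[0].title),
        }}
      />
    ))}
  </>
);
```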
So here we have one for Brandon Bayer. Here we have Brent Vatne's talk. Lee Robinson's. So now let's make an MP4 video out of it. Let's take the video for Kent C. Dodds. I'm just going to copy the composition ID of his talk, and I'm going to put it into my build command, which you can execute using npm run build as a shorthand, or you can write all of it out. So let's write this to kent.mp4 and run npm run build. And yeah, this will take a few seconds. You can get a taste of how long it takes to render a two-second video.
Q&A
Features and Open Source
Here we can see the 8x concurrency, and the video has been created. Remotion offers features like sequences, audio support, and server-side rendering. Check out the documentation at remotion.dev for more information. The talk is open source, and you can find the link to the edited video on screen. Thank you for listening, and enjoy the Q&A session. Remotion is amazing, and despite some missing features, it's open source for anyone to contribute to.
Here we can see that for my laptop it shows 8x concurrency, so it takes eight screenshots at the same time. And the video has been created. Let's have a look.
And here we have an announcement of Kent C. Dodds' talk, a fireside chat with Kent C. Dodds. I've only been able to barely scratch the surface of what Remotion can do. A few things that you could explore next. The first is sequences, a construct that allows you to shift time, which is really handy if you have multiple scenes that you want to work on individually and then put together so that they play after each other. It really helps you organize your code, and organization, reusability, and encapsulation are the key to scaling up your video production. Another thing is audio support. By the time you are watching this, we will have launched audio support: you will be able to have as many tracks as you want, change the volume on a per-frame basis, and cut and trim audio. Another big topic is server-side rendering, so how to render a video based on an API call. For all these things, I recommend you check out the documentation at remotion.dev, where we explain all the topics and also have some video tutorials to help you understand the ins and outs of Remotion. Also, remember that this talk is open source, and by that I don't just mean the slides: the whole video, everything that you saw, was edited using Remotion, and you can check it out with the link that is on screen now.
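As a taste of the sequence construct mentioned above, here is a hedged sketch; the scene components are hypothetical placeholders:

```tsx
import React from 'react';
import {Sequence} from 'remotion';
// Hypothetical scene components, each built and animated in isolation.
import {Intro} from './Intro';
import {SpeakerCards} from './SpeakerCards';

// Each <Sequence> shifts time for its children: inside a sequence,
// useCurrentFrame() counts from 0 again, so scenes compose cleanly
// and simply play one after another.
export const FullVideo: React.FC = () => (
  <>
    <Sequence from={0} durationInFrames={60}>
      <Intro />
    </Sequence>
    <Sequence from={60} durationInFrames={120}>
      <SpeakerCards />
    </Sequence>
  </>
);
```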
Yeah, that's it. Thanks a lot for hearing me out. I will be live for a Q&A right after this talk and I hope you will afterwards also enjoy the other talks of this evening.
Hey, good. Thanks for having me. Yeah, it's my pleasure to have you here. I've also been following you on Twitter for a while now, and you're posting so many amazing things, but this Remotion is on a different level, I have to tell you directly. It's amazing what you did with only React.
Cool. Yeah, thank you. The audience doesn't seem to agree; they don't want to use it, apparently. For the moment, I think there are maybe still missing features, right, but you're working on it. It's also open source, so anyone can contribute to it.
Remotion 2.0 and Live Streaming
So, I don't see why, in the near future, maybe next year, everyone shouldn't create videos using Remotion. The biggest feature in Remotion 2.0 is audio support: Remotion will create an FFmpeg filter as complex as necessary to convert your React markup into a real audio track of an MP4 file. It's amazing. Live streaming is not the intended use case, but it's generic in a way that I could foresee it happening. You can just play the video in a browser, and soon I'm gonna make it possible to embed it in your own web page and change the props of the composition live.
So, I don't see why, in the near future, maybe next year, everyone shouldn't create videos using Remotion. It seems that the GUI has 53%. It's almost tied, right? I mean, we can call it an even poll. Yeah, that's pretty good. I mean, if half of the people want to write in React, that's still a big chunk.
So, actually, not so bad. Let's see if we have any questions. I'll go to the chat now. Let's see. All right. It seems that we don't have so many questions. If you have any, just shoot them in the Basecamp Q&A. I have some questions of my own.
So, Jonny, I saw that you recently, two hours ago actually, announced Remotion 2.0, which is a major version bump. What were the improvements that you made? What new features did you add? If you can expand just a little bit on that.
Sure. So there's a brand new version of Remotion out, just two hours ago. And the reason why it's just now is because I mentioned it in my talk, and so I was really motivated to actually ship it, because I announced it here. I just made it in time. The biggest feature in Remotion 2.0 is audio support. I think it's really cool that you can just declaratively put in these audio tags, cut and trim them, put them at any position, put in multiple audio tracks, and even change the volume per frame to create fading effects, fading out at certain times. Remotion will create an FFmpeg filter as complex as necessary to convert your React markup into a real audio track of an MP4 file. That's really powerful, because I saw on YouTube, for example, that lots of channels are using this kind of equalizer-like soundwave for all their videos. So you can just add audio without going through a third-party application, directly in the browser maybe, or just pop open the terminal, type something, pass in the MP3, and have an MP4 as output. It's amazing.
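A hedged sketch of what such a declarative audio tag can look like, using the Audio component's volume, startFrom, and endAt props (file name, frame numbers, and the webpack audio import are assumptions):

```tsx
import React from 'react';
import {Audio, interpolate} from 'remotion';
// Assumes the default Remotion webpack config resolves audio imports.
import music from './music.mp3';

export const WithMusic: React.FC = () => (
  <Audio
    src={music}
    startFrom={30} // trim away the first 30 frames of the track
    endAt={300} // cut the track after frame 300
    // Per-frame volume: full volume, then fade out between frames 240-270.
    volume={(f) =>
      interpolate(f, [240, 270], [1, 0], {
        extrapolateLeft: 'clamp',
        extrapolateRight: 'clamp',
      })
    }
  />
);
```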
Now, I am wondering, is it possible to make it live? Like, to just feed in an MP3 or, let's say, the audio of a live streaming session, and have it out there? Yeah, interesting. I would say live streaming is not the intended use case, but it's generic in a way that I could foresee it happening. So, I mean, right now, as you saw, you can just play the video in a browser, and soon I'm gonna make it possible to embed this in your own web page and then change the props of the composition live.
Remotion Inspiration and Integration
If you script it cleverly, you can stream it. The original inspiration for creating Remotion was frustration with existing video editing tools. Remotion does not make videos from CSS transitions; instead, it requires creating a static image for each frame. Integrating existing videos into a project is done using an HTML5 video tag and a video component in Remotion that synchronizes with the current time.
So, yeah, I mean, if you script it in a clever way and then stream that, why not? We have a question from Afro Dev Girl: what was the original inspiration for creating Remotion? The original inspiration... well, I would say it was more like a frustration with the existing video editing tools, since I was missing the features that I am used to as a developer: having version history, being able to pull in data from APIs, doing stuff programmatically with an API call. With video editing programs, I would just have to open them manually, and there was no good tool for abstraction except copy-paste. All these things led me to eventually create my own video editing program.
Great. I can see that frustration; I also have it. Another question from Vadik: does Remotion make videos from CSS transitions only? No, not at all. I think I mentioned this in the video: CSS transitions actually don't work at all. The reason is that you're supposed to create a bunch of static images in React. If I give you a frame number, you create a static image, and these static images together make an animation. Whereas with a CSS transition, the animation is not derived from the frame number: you just put it in your CSS file and it moves by itself, without you doing anything else. So instead of using a CSS transition, you would calculate the property value for each frame and then render that. And so, yeah, that is a constraint, but also a really necessary one. And once you get the hang of it, it is quite nice, because then you can just take the timeline cursor and drag it back and forth and pause your animation, which you cannot do with a CSS transition.
Great. Another question from jrock94: how do you integrate existing video into a project? I'm not sure if he's thinking about MP4s or Remotion projects. That works pretty well. You just load your video using an import statement, like you would in webpack, and pass that to the source of a video tag. Like I said, normally this would not be driven by the frame that Remotion thinks it is. But we have made a video component that will synchronize the video with the current time. So that works pretty well. Actually, all the screencasts that you saw on the right side of the screen while I was coding were just a video imported into Remotion and played back.
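A minimal sketch of that pattern, assuming Remotion's Video component and a webpack import for the file (the file name is illustrative):

```tsx
import React from 'react';
import {Video} from 'remotion';
// Bundled by webpack; the path is illustrative.
import screencast from './screencast.mp4';

// Remotion's <Video> keeps the embedded file in sync with the timeline:
// for every frame that is screenshotted, the video is seeked to the
// matching time, which a plain HTML5 <video> tag would not do.
export const Screencast: React.FC = () => (
  <Video src={screencast} style={{width: '100%'}} />
);
```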
Conclusion and Q&A
It was a smooth and enjoyable project. Unfortunately, we ran out of time. Thank you, Jonny, for sharing another way to create videos using code, especially React. More questions are coming in, and Jonny will be available on Spatial Chat to answer them shortly.
It was so smooth that I think it was not possible to tell that I did not just submit the video as it was. Yeah, it's pretty much an inception, right? Recording yourself inside the recording, editing yourself inside the recording. Really nice. It's a fun project.
Unfortunately, we ran out of time. Thank you so much, Jonny, for taking the time and showing us another way of creating videos using code, especially React.
Are you available for any other questions? Because I see that they are coming in now. Are you on Spatial Chat, so people can find you there? Yeah, absolutely. I have it marked in my calendar. I will now move over to Spatial Chat, and it's great to see that more questions are coming in. I'm very happy to answer them in a few minutes over there. Thank you so much, Jonny. Once again, it was a pleasure to have you here. Enjoy the rest of the day. Thank you. Bye. Enjoy the conference.