Creating Videos Programmatically in React


Introduction to Remotion, a new library with which we can turn React code into MP4 videos, create our motion graphics programmatically and server-side render them. I give an overview of the principles and the philosophy, and of course we are going to code a video!

34 min
14 May, 2021

AI Generated Video Summary

The Talk discusses the use of Remotion, a library that allows for declarative video creation in React. It covers the process of creating videos, animating elements, and rendering multiple compositions. The Talk also mentions the features of Remotion, such as audio support and server-side rendering. Remotion 2.0 introduces audio support and the possibility of live streaming. The Talk concludes by highlighting the frustration with existing video editing tools and the integration of existing videos into Remotion projects.

1. Introduction and Remote Conference

Short description:

Hey, everyone! I'm Jonny Burger, joining you from Zurich, Switzerland. Let's make the most of this remote conference. Pre-recorded videos, like the Apple keynote, are more dynamic and enjoyable to watch.

Hey, everyone, and welcome. My name is Jonny Burger, and I am coming to you from beautiful Zurich, Switzerland. Unfortunately, we can only do this conference remotely, which means that I am at home, and you are at home, and you don't see me personally, but you just see a video. But it's not so bad. I can make a more interesting video, I can choose different angles, and I can edit out all the mistakes that I make. Have you seen the Apple keynote recently? Now that they pre-record everything, their videos are much more dynamic and much more enjoyable to view.

2. Writing Videos Declaratively in React

Short description:

I want to reuse elements in my video and parametrize them like I do with React components. I made a library called Remotion which allows you to write real MP4 videos declaratively in React. Making a video in React sounds like something very magical, but it is not, and it is not a black box: to use Remotion correctly we need to know how it works. We start off in a React project and define a composition. We have to pass in the width, the height, the duration and the frame rate of the video, and which component will be used to render it. Once we have written our video, we enter a three-stage rendering process in which we open multiple tabs and take multiple screenshots at the same time.

Anyway, normally we edit these videos in graphical programs like Adobe Premiere, iMovie, After Effects or DaVinci Resolve. But for me as a developer, something has always been missing. I want to reuse elements in my video and parametrize them like I do with React components. I want something that is more dynamic than copy-pasting layers and undoing and redoing my actions. I want something that is more declarative.

Also, version control is horrible for videos. If I save my project, close the program and reopen it, basically all my history is gone and I only have a snapshot of the current state, unless I make copies of my file with v1, v2 suffixes. And sometimes the program will just crash and all my progress is gone. I hope you see where I am going with this: I want to write videos programmatically.

I made a library called Remotion which allows you to write real MP4 videos declaratively in React. I am such a big believer in it that this video itself was edited in React using Remotion, at least all the videos that I am submitting to React Summit. This is also an open source video, so you can follow the GitHub link you see on screen right now and visit the source code of the video; all the footage is there, and all the edits and slides are written in React.

Making a video in React sounds like something very magical, but it is actually not, and it is not a black box; to use Remotion correctly we need to know how it works. So let's take a bird's-eye view of how Remotion turns code into a video. We start off in a React project and define a composition. A composition is basically a video, but with metadata on it that we have to define: the width, the height, the duration and the frame rate of the video, and which component will be used to render it. Inside that component, we call the useCurrentFrame hook to get back the current time, expressed as the current frame, an integer, and based on that we render anything we want using our favorite web technologies. It's important that we use this frame variable to drive our animation, rather than something else like a CSS transition. More on that later.
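The core contract above, that the frame number is the single source of truth, can be sketched as a pure function from frame to a style value. This is an illustrative example, not Remotion code; a hypothetical 30-frame fade-in:

```javascript
// A frame-driven animation is a pure function: frame in, value out.
// Hypothetical example: fade an element in over the first 30 frames.
function opacityAtFrame(frame) {
  const durationInFrames = 30;
  // Clamp so the opacity holds at 1 once the animation has finished.
  return Math.min(1, Math.max(0, frame / durationInFrames));
}

// Because the output depends only on the frame, any frame can be
// rendered in isolation, which is what enables parallel screenshotting.
```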

So, once we have written our video, we enter a three-stage rendering process. The first step is to bundle the video using webpack, much like any other web application. In the second step, we open a headless Chromium instance, open the web page in it, and take a screenshot for each frame of the video. To make this process efficient and fast, we need to parallelize it, so we actually open multiple tabs. The number depends on the number of cores of your CPU, or you can customize it, but you want some parallelism if you can afford it. We open multiple tabs and take multiple screenshots at the same time. This is the reason why it's very important to use the useCurrentFrame hook to drive all your animations: if you use something like a CSS transition or requestAnimationFrame, or you try to include a GIF, it will not work, because during rendering we open multiple instances of your animation. The contract is pretty much: given a frame, you must return a static image, without any side effects.
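The parallelization step can be sketched as splitting the frame range across tabs. The round-robin scheme below is purely illustrative, not Remotion's actual scheduler:

```javascript
// Distribute frames 0..totalFrames-1 across a number of browser tabs.
// Round-robin assignment: tab i screenshots every tabs-th frame.
function assignFrames(totalFrames, tabs) {
  const chunks = Array.from({ length: tabs }, () => []);
  for (let frame = 0; frame < totalFrames; frame++) {
    chunks[frame % tabs].push(frame);
  }
  return chunks;
}
```

Each tab can work independently precisely because every frame is a pure function of its number.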

3. Creating Videos with ReMotion

Short description:

To create a video using Remotion, we need to ensure that side effects are synchronized with the current time. After collecting all the frames, we stitch them together using FFmpeg. In the demo, we open the terminal, install the necessary dependencies, and start the development server to preview the video. We define compositions in Remotion's entry file and can have multiple of them. An idea for a video project: create a graphic for each speaker at React Summit by importing the list of speakers and animating them.

If you have any side effects that animate independently of the current time that Remotion says it is, the video will have artifacts or lag. Assuming you avoid this pitfall, in the third step we collect all the frames and stitch them together using FFmpeg, and tada, we have a real video that we can post on social media, for example.

Of course there is much more to it, like how to add audio, but I think this is the most important concept in Remotion that you should know. It's time for a quick demo. Let's create our first video together. To get started, we simply open up the terminal and type yarn create video. And the installation has finished, so let's cd into our video folder and open it in VS Code. Once we have done that, let's run npm start to launch the development server, and you will see that we get a visual preview of the video. Let's wait a moment until it's done loading. And there we go.

As you can see, there is a Hello World project included in Remotion, and there's also a timeline with which you can essentially control the current frame. I have made a quick cut to adjust my project a little bit: I removed all the Hello World stuff and replaced it with a black canvas. Also, I upgraded the Remotion version to one that will be out by the time you watch this video, but was not out at the time I was recording. Anyway, this is how we define a composition in the entry file of Remotion: we define the ID, the width, the height, the duration and the frame rate of a video, as well as the component, as I previously mentioned. We can also have multiple compositions; they need to have different IDs, and then you can just switch between them in the left sidebar.
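What the composition metadata boils down to can be sketched with plain data. The helper below is hypothetical; only the field names follow the talk (Remotion counts duration in frames):

```javascript
// A composition is a component plus metadata: id, dimensions,
// frame rate and duration. Durations are counted in frames.
function makeComposition({ id, width, height, fps, durationInSeconds }) {
  return {
    id, // must be unique per composition
    width,
    height,
    fps,
    durationInFrames: Math.round(durationInSeconds * fps),
  };
}
```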

So, I have replaced this with a black canvas. And now I want to tell you my idea of what kind of video we can make together. I saw on Twitter that React Summit is making a graphic for each speaker and posting it; it has the profile picture, the talk name and everything about each speaker on it. And I also noticed that there's a JSON file in the source code of the React Summit website that contains all the speakers. So I thought it would be fun to create all these graphics at once, and also animate them, so they are a video, not an image. So this is what I've done: I've imported the list of all React Summit speakers into this project. We don't have much time, so I'm going to code something really quick. See you in a second. Here is what I have quickly created, in pretty much normal React markup and CSS. As you can see, I did not use any library.

4. Animating Elements with ReMotion

Short description:

You can use any library that you want. Instead of a lowercase img tag, I used the Img component from Remotion. It works the same as the native img element, but with a loader that waits for the image to load before rendering. I added circles using SVG. Now, let's animate it. We use the useCurrentFrame hook to drive the animations. We can animate the profile picture by scaling it up using the interpolate helper function, and apply the scale using CSS. For smoother animation, use the spring function and pass in the frame and the frames per second. Let's give it a try!

You can use any library that you want, but here I just did this with normal markup and CSS. Just one thing to note: instead of a lowercase img tag, I used the Img component from Remotion. It works pretty much exactly the same as the native img element, except that it is wrapped in a loader that waits until the image is loaded before the frame renders, so that your video will not contain a half-loaded image. Also, I have added some circles here in the background using SVG, which is also pretty cool.

Anyway, now that we have this basic setup, which I did not go through in full detail because you probably already know at least a little bit of React, let's animate it. We need to drive all our animations using the useCurrentFrame hook. So I'm going to say const frame = useCurrentFrame(), and now we need to animate things over time.

Let's say we want to animate the profile picture so that it scales up. For that we can use the interpolate helper function in Remotion. I'm going to say scale, and I'm going to interpolate the scale over time. What I mean by that is: I pass in what drives the animation, the frame, and I give an input range from 0 to 30, which basically means that the duration of the animation is 30 frames, and an output range so it will scale from 0 to 1. Now I just apply this style using normal CSS to the image, and as you can see, my image scales in over the duration of one second. It doesn't look that smooth, which is why I prefer to use a spring animation. So let's get rid of that and say const scale = spring(). For that we need to pass in the frame and the current frames per second; this is the information the spring function needs in order to calculate the value. Let's just give this a try, and there we go, it's animating in! Let's also pass this scale to our circles in the background. Alright, looking pretty good, right? Let's slide in the title as well. So we can say const titleTranslation, for example. And I love to use spring animations for everything. Let's pass in fps and frame, but this time let's do it a bit more advanced.
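Both helpers can be approximated in plain JavaScript. This is a simplified sketch of the idea, not Remotion's implementation (its real spring() is driven by physics parameters, and its interpolate() only clamps if you ask it to):

```javascript
// Simplified interpolate: map the frame from an input range to an
// output range, clamped here for simplicity.
function interpolate(frame, [inStart, inEnd], [outStart, outEnd]) {
  const t = Math.min(1, Math.max(0, (frame - inStart) / (inEnd - inStart)));
  return outStart + t * (outEnd - outStart);
}

// Very rough stand-in for a spring: eases from 0 toward 1 and
// settles with no overshoot (a real spring solves physics over time).
function springLike(frame) {
  return 1 - Math.exp(-frame / 8);
}
```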

5. Animating Elements and Creating Multiple Videos

Short description:

We can adjust the physical parameters of the spring animation, such as increasing the damping to prevent overshooting. By lying to the spring function about the current frame, we can delay the animation. We also need to interpolate the range from 0 to 1 to something like 0 to 1000. After fixing a small mistake, we animate from 1000 to 0 and apply the translation to all elements. Now, let's create an animation for each speaker at React Summit using a JSON file and the .map function.

We can play around with the physical parameters of our spring animation. I like to increase the damping so that it will not overshoot. As you can see here it overshoots and then becomes smaller again. I want to delay this transition so that the animation doesn't immediately start.

For that, I simply lie to the spring function about which frame it currently is, and I say: actually, it is 20 frames earlier than you think. Now the value goes from 0 to 1, as any spring animation does. We also need to interpolate it, so that the range is not between 0 and 1 but something like 0 and 1000. So in here, the input range is 0 to 1 and the output range is 0 to 1000, and I pass that to a transform.
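These two tricks, delaying by offsetting the frame and re-mapping the spring's 0-to-1 output, can be sketched like this (self-contained and illustrative; the linear easing stands in for the real spring):

```javascript
// Delay an animation by pretending the current frame is `delay`
// frames earlier; negative frames are clamped to 0.
function delayedProgress(frame, delay) {
  const shifted = Math.max(0, frame - delay);
  // Illustrative linear easing from 0 to 1 over 30 frames.
  return Math.min(1, shifted / 30);
}

// Re-map 0..1 progress to a pixel offset: slide in from 1000px to 0.
function translateY(progress) {
  return 1000 - progress * 1000;
}
```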

Okay, so I made a small mistake there. I accidentally animated in my name instead of the talk title, and it's also going down instead of up. But of course, with fast refresh we can easily fix these things. We actually want to animate in from 1000 to 0. Let's take a look at how that looks. All right. Let's also quickly apply this translation to all the other elements; I think it will look good. I'm just going to copy-paste it over here. Maybe you will find a more engineered way to do so, but I think this gets the job done quickly. This is how our video looks now. Nice!

Now for what I think is the most fun part: let's turn one video into dozens of videos. Let's create one of these animations for each of the speakers that appear at React Summit. For this I have already, as shown before, imported a JSON file into the project where all the metadata is. And I'm just going to use .map to iterate over each speaker and return a composition for them. It might look something like that. Let's give each one a unique key. It's also very handy that there is a slug property on each speaker.
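The .map step can be sketched with plain data. The speaker fields (slug, name, company) mirror what the talk mentions, but the exact JSON shape is an assumption:

```javascript
// Turn a speakers array into one composition config per speaker.
// The field names on `speaker` are assumed for illustration.
function speakersToCompositions(speakers) {
  return speakers.map((speaker) => ({
    id: speaker.slug, // unique composition ID, also usable as React key
    width: 1080,
    height: 1080,
    fps: 30,
    durationInFrames: 60, // a two-second clip at 30 fps
    defaultProps: {
      speakerName: speaker.name,
      company: speaker.company,
    },
  }));
}
```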

6. Dynamic Composition and Video Rendering

Short description:

Let's make the composition dynamic by allowing the component to take props like speaker name, company, avatar URL, and talk title. We can define default props for each composition and later overwrite them on the command line. Now, let's create an mp4 video using a specific composition ID and run the build command to render the video.

Yeah, there we go. And now, as you can see, all the speakers are in the left sidebar. So now let's make the composition itself dynamic by allowing the component to take some props. Let's say speakerName should be a string, company should be a string, avatarUrl can also be a string, and what else is missing? The talk title. I'm just going to accept these props and fill them in in my video. Here I have a small naming conflict with a style, so I'm just going to rename this. Let's put this as the talk title, and this is the speaker name. Sorry, another naming conflict; I hope this solves it. Okay, it was just an indentation problem, never mind. Here, let's put the avatar URL. Now we can define some default props for each composition, and I say default because these props can later be overwritten on the command line. But for the preview here in the browser, we're going to put in some default props: the avatar URL is going to be speaker.avatar, the company speaker.company, the speaker name speaker.name, and for the talk title, let's take the title of the speaker's first talk and cast it to a string. There we go!
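The relationship between default props and command-line props amounts to a shallow merge in which the command-line values win. A sketch of the idea, not Remotion's internals:

```javascript
// Default props drive the browser preview; props passed at render
// time (e.g. on the command line) override them key by key.
function resolveProps(defaultProps, renderProps = {}) {
  return { ...defaultProps, ...renderProps };
}
```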

So here we have one for Brandon Bayer, here we have Brent Watney's talk, and Lee Robinson's. So now let's make an mp4 video out of it. Let's take the video for Kent C. Dodds. I'm just going to copy the composition ID of his talk and put it into my build command, which you can execute using npm run build as a shorthand, or you can write all of it out. So let's write this to kent.mp4 and run npm run build. And yeah, this will take a few seconds; you can get a taste of how long it takes to render a two-second video.

QnA

Features and Open Source

Short description:

Here we can see the 8x concurrency, and the video has been created. Remotion offers features like sequences, audio support, and server-side rendering. Check out the documentation at remotion.dev for more information. The talk is open source, and you can find the link to the edited video on screen. Thank you for listening, and enjoy the Q&A session. Remotion is amazing, and despite some missing features, it's open source for anyone to contribute.

Here we can see that on my laptop it shows 8x concurrency, so it takes eight screenshots at the same time. And the video has been created. Let's have a look.

And here we have an announcement of Kent C. Dodds' talk, a fireside chat with Kent C. Dodds. I've only been able to barely scratch the surface of what Remotion can do. A few things that you could explore next. The first is sequences, a construct that allows you to shift time, which is really handy if you have multiple scenes that you want to work on individually and then put together so that they play after each other. It really helps you to organize your code, and organization, reusability and encapsulation are the key to scaling up your video production. Another thing is audio support. By the time you are watching this, we will have launched audio support: you will be able to have as many tracks as you want, change the volume on a per-frame basis, and cut and trim audio. Another big topic is server-side rendering, that is, how to render a video based on an API call. For all these things I recommend you check out the documentation at remotion.dev, where we explain all the topics and also have some video tutorials to help you understand the ins and outs of Remotion. Also remember that this talk is open source, and by that I don't just mean the slides: the whole video, everything that you saw, was edited using Remotion, and you can check it out with the link that is on screen now.
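The time-shifting that sequences provide can be modeled as a simple frame offset, so each scene is written as if it starts at frame 0. A simplified model of the concept, not Remotion's actual Sequence implementation:

```javascript
// Inside a sequence starting at `from`, children see a local frame
// counted from 0.
function localFrame(globalFrame, from) {
  return globalFrame - from;
}

// A sequence is only active while its local frame is inside its duration.
function isActive(globalFrame, from, durationInFrames) {
  const local = localFrame(globalFrame, from);
  return local >= 0 && local < durationInFrames;
}
```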

Yeah, that's it. Thanks a lot for hearing me out. I will be live for a Q&A right after this talk and I hope you will afterwards also enjoy the other talks of this evening.

Hey, good. Thanks for having me. Yeah, it's my pleasure to have you here. I've also been following you on Twitter for a while now, and you're posting so many amazing things, but Remotion is on a different level, I have to tell you directly. It's amazing what you did with only React.

Cool. Yeah, thank you. The audience doesn't seem to agree; they don't want to use it, apparently. For the moment there are maybe still missing features, right? But you're working on it, and it's also open source, so anyone can contribute.

Remotion 2.0 and Live Streaming

Short description:

So I don't see why, in the near future, maybe next year, everyone won't be creating videos using Remotion. The biggest feature in Remotion 2.0 is audio support. Remotion will create an FFmpeg filter as complex as necessary to convert your React markup into a real audio track of an mp4 file. It's amazing. Live streaming is not the intended use case, but it's generic in a way that I could foresee it happening. You can just play the video in a browser, and soon I'm going to make it possible to embed this in your own web page and change the props of the composition live.

So I don't see why, in the near future, maybe next year, everyone won't be creating videos using Remotion. So it seems that the GUI has 53 percent; it's almost tied, right? I mean, we can call it an even poll. Yeah, that's pretty good. I mean, if half of the people want to write in React, that's still a big chunk.

So, actually, not so bad. Let's see if we have any questions. I'll go now to the chat. Let's see. All right. So it seems that we don't have so many questions. Just, if you have any, just shoot them in the Basecamp Q&A. I have some questions on my own.

So, Jonny, I saw that you recently, two hours ago actually, announced Remotion 2.0, which is a major version bump. What were the improvements that you made? What new features have you added? If you can tell us just a little bit about it.

Sure. So there's a brand new version of Remotion out, just two hours ago. And the reason why it's just now is because I mentioned it in my talk, so I was really motivated to actually ship it, because I announced it here. I just made it in time. The biggest feature in Remotion 2.0 is audio support. I think it's really cool that you can just declaratively put in these audio tags, cut and trim them, put them at any position, add multiple audio tracks, and even change the volume per frame to create fading effects or fade out at certain times. Remotion will create an FFmpeg filter as complex as necessary to convert your React markup into a real audio track of an mp4 file. That's really powerful, because I saw on YouTube, for example, that lots of channels are using this kind of equalizer, like a soundwave, for all their videos. So just adding audio without going through a third-party application, directly in the browser maybe, or just popping open the terminal, typing something, passing in the mp3 and having an mp4 as the output. It's amazing.

Now I am wondering, is it possible to make it live? Like, to feed in an mp3, or say the audio of a live streaming session, and just have it out there? Yeah, interesting. I would say live streaming is not the intended use case, but it's generic in a way that I could foresee it happening. So, I mean, right now, as you saw, you can just play the video in a browser, and soon I'm going to make it so you can embed this in your own web page and then change the props of the composition live.

Remotion Inspiration and Integration

Short description:

If you script it cleverly, you can stream it. The original inspiration for creating Remotion was frustration with existing video editing tools. Remotion does not make videos from CSS transitions; instead, it requires creating a static image for each frame. Integrating existing videos into a project is done using an HTML5 video tag and a video component in Remotion that synchronizes with the current time.

So, yeah, I mean, if you script it in a clever way and then stream that, why not? We have a question from Afro Dev Girl: what was the original inspiration for creating Remotion? The original inspiration? Well, I would say it was more like a frustration with the existing video editing tools, since I was missing the features that I am used to as a developer: having version history, being able to pull in data from APIs, doing stuff programmatically with an API call. With video editing programs I would just have to open them by hand, and there was no good tool for abstraction except copy-paste. All these things eventually led me to create my own video editing program.

Great. I can see that frustration; I also have it. Another question from Vadik: does Remotion make videos from CSS transitions only? No, not at all. I think I mentioned this in the video: CSS transitions actually don't work at all. The reason is that you're supposed to create a bunch of static images in React. If I give you a frame number, you create a static image, and these static images together make an animation. A CSS transition, on the other hand, is not an animation derived from the frame number: you just put it in your CSS file and it moves by itself without you doing anything else. So instead of using a CSS transition, you calculate the property value for each frame and then render that. So yeah, that is a constraint, but also a really necessary one. And once you get the hang of it, it is quite nice, because you can just grab the timeline cursor, drag it back and forth and pause your animation, which you cannot do with a CSS transition.

Great. Another question from jrock94: how do you integrate existing video into a project? I'm not sure if he's thinking about MP4s or Remotion projects. That works pretty well. You just use an HTML5 video tag, load your video using an import statement like you would in webpack, and pass that to the source of the video tag. Like I said, normally this would not be driven by the frame that Remotion thinks it is, but we have made a video component that synchronizes the video with the current time. So that works pretty well. Actually, all the screencasts that you saw on the right side of the screen while I was coding were just a video imported into Remotion and played back.
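The synchronization that the video component performs boils down to deriving the HTML5 element's playback time from the current frame. A simplified model of the concept (the startFrom trim parameter is an assumption here, used only for illustration):

```javascript
// Map the current frame to an HTML5 video's currentTime in seconds.
// `startFrom` trims that many frames off the start of the source clip.
function videoCurrentTime(frame, fps, startFrom = 0) {
  return (frame + startFrom) / fps;
}
```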

Conclusion and Q&A

Short description:

It was a smooth and enjoyable project. Unfortunately, we ran out of time. Thank you, Jonny, for sharing another way to create videos using code, especially React. More questions are coming in, and Jonny will be available on Spatial Chat to answer them shortly.

It was so smooth that I think it was not possible to tell that I did not just submit the video as it was. Yeah, it's pretty much an inception, right? Recording yourself inside the recording, editing yourself inside the recording. Really nice. It's a fun project.

Unfortunately, we ran out of time. Thank you so much, Jonny, for taking the time and showing us another way of creating videos using code, especially React.

Are you available for any other questions? Because I see that they are coming in now. Are you on Spatial Chat, so people can find you there? Yeah, absolutely. I have it marked in my calendar; I will now move over to Spatial Chat. And great to see that more questions are coming in; I'm very happy to answer them in a few minutes over there. Thank you so much, Jonny. Once again, it was a pleasure to have you here. Enjoy the rest of the day. Thank you, bye! Enjoy the conference.

Check out more articles and videos

We constantly think of articles and videos that might spark Git people interest / skill us up or help building a stellar career

React Advanced Conference 2022React Advanced Conference 2022
25 min
A Guide to React Rendering Behavior
React is a library for "rendering" UI from components, but many users find themselves confused about how React rendering actually works. What do terms like "rendering", "reconciliation", "Fibers", and "committing" actually mean? When do renders happen? How does Context affect rendering, and how do libraries like Redux cause updates? In this talk, we'll clear up the confusion and provide a solid foundation for understanding when, why, and how React renders. We'll look at: - What "rendering" actually is - How React queues renders and the standard rendering behavior - How keys and component types are used in rendering - Techniques for optimizing render performance - How context usage affects rendering behavior| - How external libraries tie into React rendering
React Summit Remote Edition 2021React Summit Remote Edition 2021
33 min
Building Better Websites with Remix
Remix is a new web framework from the creators of React Router that helps you build better, faster websites through a solid understanding of web fundamentals. Remix takes care of the heavy lifting like server rendering, code splitting, prefetching, and navigation and leaves you with the fun part: building something awesome!
React Advanced Conference 2022React Advanced Conference 2022
30 min
Using useEffect Effectively
Can useEffect affect your codebase negatively? From fetching data to fighting with imperative APIs, side effects are one of the biggest sources of frustration in web app development. And let’s be honest, putting everything in useEffect hooks doesn’t help much. In this talk, we'll demystify the useEffect hook and get a better understanding of when (and when not) to use it, as well as discover how declarative effects can make effect management more maintainable in even the most complex React apps.
React Summit 2022React Summit 2022
20 min
Routing in React 18 and Beyond
Concurrent React and Server Components are changing the way we think about routing, rendering, and fetching in web applications. Next.js recently shared part of its vision to help developers adopt these new React features and take advantage of the benefits they unlock.In this talk, we’ll explore the past, present and future of routing in front-end applications and discuss how new features in React and Next.js can help us architect more performant and feature-rich applications.
React Advanced Conference 2021React Advanced Conference 2021
27 min
(Easier) Interactive Data Visualization in React
If you’re building a dashboard, analytics platform, or any web app where you need to give your users insight into their data, you need beautiful, custom, interactive data visualizations in your React app. But building visualizations hand with a low-level library like D3 can be a huge headache, involving lots of wheel-reinventing. In this talk, we’ll see how data viz development can get so much easier thanks to tools like Plot, a high-level dataviz library for quick & easy charting, and Observable, a reactive dataviz prototyping environment, both from the creator of D3. Through live coding examples we’ll explore how React refs let us delegate DOM manipulation for our data visualizations, and how Observable’s embedding functionality lets us easily repurpose community-built visualizations for our own data & use cases. By the end of this talk we’ll know how to get a beautiful, customized, interactive data visualization into our apps with a fraction of the time & effort!

Workshops on related topic

React Summit 2023React Summit 2023
170 min
React Performance Debugging Masterclass
Featured WorkshopFree
Ivan’s first attempts at performance debugging were chaotic. He would see a slow interaction, try a random optimization, see that it didn't help, and keep trying other optimizations until he found the right one (or gave up).
Back then, Ivan didn’t know how to use performance devtools well. He would do a recording in Chrome DevTools or React Profiler, poke around it, try clicking random things, and then close it in frustration a few minutes later. Now, Ivan knows exactly where and what to look for. And in this workshop, Ivan will teach you that too.
Here’s how this is going to work. We’ll take a slow app → debug it (using tools like Chrome DevTools, React Profiler, and why-did-you-render) → pinpoint the bottleneck → and then repeat, several times more. We won’t talk about the solutions (in 90% of the cases, it’s just the ol’ regular useMemo() or memo()). But we’ll talk about everything that comes before – and learn how to analyze any React performance problem, step by step.
(Note: This workshop is best suited for engineers who are already familiar with how useMemo() and memo() work – but want to get better at using the performance tools around React. Also, we’ll be covering interaction performance, not load speed, so you won’t hear a word about Lighthouse 🤐)
React Advanced Conference 2021React Advanced Conference 2021
132 min
Concurrent Rendering Adventures in React 18
Featured WorkshopFree
With the release of React 18 we finally get the long awaited concurrent rendering. But how is that going to affect your application? What are the benefits of concurrent rendering in React? What do you need to do to switch to concurrent rendering when you upgrade to React 18? And what if you don’t want or can’t use concurrent rendering yet?

There are some behavior changes you need to be aware of! In this workshop we will cover all of those subjects and more.

Join me with your laptop in this interactive workshop. You will see how easy it is to switch to concurrent rendering in your React application. You will learn all about concurrent rendering, SuspenseList, the startTransition API and more.
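The core concept behind the startTransition API is that updates have different priorities: urgent updates (typing, clicking) should not wait behind expensive, deferrable ones (filtering a large list). A toy two-priority scheduler in plain TypeScript illustrating only that prioritization idea (this is not React's scheduler, and ToyScheduler is a made-up name for illustration):

```typescript
// Toy illustration of urgent vs. transition priority:
// urgent tasks always run before deferred "transition" tasks.
type Task = () => void;

class ToyScheduler {
  private urgent: Task[] = [];
  private transition: Task[] = [];

  // Normal updates are urgent.
  schedule(task: Task): void {
    this.urgent.push(task);
  }

  // Updates wrapped in a "transition" are deferrable.
  startTransition(task: Task): void {
    this.transition.push(task);
  }

  // Drain all urgent tasks before any transition task runs.
  flush(): void {
    while (this.urgent.length > 0 || this.transition.length > 0) {
      const next = this.urgent.shift() ?? this.transition.shift()!;
      next();
    }
  }
}

const log: string[] = [];
const scheduler = new ToyScheduler();
scheduler.startTransition(() => log.push("filter large list"));
scheduler.schedule(() => log.push("update text input"));
scheduler.flush();
// log order: the urgent input update runs before the deferred filter
```

In React 18 the real API looks like `startTransition(() => setFilteredList(next))`, and React can additionally interrupt in-progress transition renders, which this sketch does not model.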
React Summit Remote Edition 2021
177 min
React Hooks Tips Only the Pros Know
Featured Workshop
The addition of the hooks API to React was quite a major change. Before hooks, most components had to be class based. Now, with hooks, these are often much simpler functional components. Hooks can be really simple to use. Almost deceptively simple. Because there are still plenty of ways you can mess up with hooks. And it often turns out there are many ways you can improve your components with a better understanding of how each React hook can be used. You will learn all about the pros and cons of the various hooks. You will learn when to use useState() versus useReducer(). We will look at using useContext() efficiently. You will see when to use useLayoutEffect() and when useEffect() is better.
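One practical angle on the useState() vs. useReducer() question: useReducer() takes a pure reducer function, which you can write and unit-test entirely outside React. A hypothetical counter reducer as a sketch (the type and action names are invented for illustration):

```typescript
// A pure reducer of the shape useReducer() expects:
// (state, action) => newState, with no side effects.
type CounterState = { count: number };
type CounterAction =
  | { type: "increment" }
  | { type: "decrement" }
  | { type: "reset" };

function counterReducer(state: CounterState, action: CounterAction): CounterState {
  switch (action.type) {
    case "increment":
      return { count: state.count + 1 };
    case "decrement":
      return { count: state.count - 1 };
    case "reset":
      return { count: 0 };
  }
}

// Inside a component this would be wired up as:
//   const [state, dispatch] = useReducer(counterReducer, { count: 0 });
```

Because the reducer is a plain function, each transition can be asserted directly, with no rendering involved.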
React Advanced Conference 2021
174 min
React, TypeScript, and TDD
Featured Workshop (Free)
ReactJS is wildly popular and thus wildly supported. TypeScript is increasingly popular, and thus increasingly supported.

The two together? Not as much. Given that they both change quickly, it's hard to find accurate learning materials.

React+TypeScript, with JetBrains IDEs? That three-part combination is the topic of this series. We'll show a little about a lot. Meaning, the key steps to getting productive, in the IDE, for React projects using TypeScript. Along the way we'll show test-driven development and emphasize tips-and-tricks in the IDE.
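A small taste of the React+TypeScript+TDD combination the series covers: type the props as an interface, and test-drive the plain logic before wiring it into JSX. The names below (GreetingProps, formatGreeting) are hypothetical, purely for illustration:

```typescript
// Typed props: the compiler rejects a missing or misspelled prop
// before any test even runs.
interface GreetingProps {
  name: string;
  excited?: boolean;
}

// Pure helper extracted from the component, so it can be
// test-driven without rendering anything.
function formatGreeting({ name, excited = false }: GreetingProps): string {
  return `Hello, ${name}${excited ? "!" : "."}`;
}

// A component would then simply render:
//   <p>{formatGreeting(props)}</p>
```

In a TDD flow you would write the failing assertion for formatGreeting first, then the implementation, with the IDE's inline type errors acting as an extra, instant feedback loop.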
React Advanced Conference 2021
145 min
Web3 Workshop - Building Your First Dapp
Featured Workshop (Free)
In this workshop, you'll learn how to build your first full stack dapp on the Ethereum blockchain, reading and writing data to the network, and connecting a front end application to the contract you've deployed. By the end of the workshop, you'll understand how to set up a full stack development environment, run a local node, and interact with any smart contract using React, HardHat, and Ethers.js.
React Summit 2023
151 min
Designing Effective Tests With React Testing Library
Featured Workshop
React Testing Library is a great framework for React component tests because it answers a lot of questions for you, so you don’t need to worry about them. But that doesn’t mean testing is easy. There are still a lot of questions you have to figure out for yourself: How many component tests should you write vs end-to-end tests or lower-level unit tests? How can you test a certain line of code that is tricky to test? And what in the world are you supposed to do about that persistent act() warning?
In this three-hour workshop we’ll introduce React Testing Library along with a mental model for how to think about designing your component tests. This mental model will help you see how to test each bit of logic, whether or not to mock dependencies, and will help improve the design of your components. You’ll walk away with the tools, techniques, and principles you need to implement low-cost, high-value component tests.
Table of contents
- The different kinds of React application tests, and where component tests fit in
- A mental model for thinking about the inputs and outputs of the components you test
- Options for selecting DOM elements to verify and interact with them
- The value of mocks and why they shouldn’t be avoided
- The challenges with asynchrony in RTL tests and how to handle them
Prerequisites
- Familiarity with building applications with React
- Basic experience writing automated tests with Jest or another unit testing framework
- You do not need any experience with React Testing Library
- Machine setup: Node LTS, Yarn
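The "inputs and outputs" mental model mentioned in the table of contents treats a component test as: given props and user events as inputs, assert only on what the user would see as output. A DOM-free toy model of that idea in plain TypeScript (this is not React Testing Library itself; all names are invented for illustration):

```typescript
// Toy model of testing by inputs and outputs: state in,
// user-visible text out, with events as state transitions.
type TodoState = { items: string[]; draft: string };

// "Output": the text a user would see.
const render = (s: TodoState): string =>
  s.items.length === 0 ? "No todos yet" : s.items.join(", ");

// "Inputs": user events modeled as pure state transitions.
const typeDraft = (s: TodoState, text: string): TodoState => ({ ...s, draft: text });

const submitDraft = (s: TodoState): TodoState =>
  s.draft !== "" ? { items: [...s.items, s.draft], draft: "" } : s;
```

A test in this model never inspects internal state directly; it drives the events and asserts on the rendered output, which is the same discipline RTL's user-facing queries (getByText, getByRole) encourage against a real DOM.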