UIs are composed of fast parts and slow parts, in terms of how responsive they are to user interaction. React's concurrent renderer decouples the fast parts from the slow parts by allowing us to render the slow parts in the background without blocking the fast parts, so that each part can respond to user interaction at its own pace. In this talk, we'll explore Concurrent React, understand what problems it solves, how it works, and how to leverage it through the use of concurrent features.
Concurrent React Made Easy
AI Generated Video Summary
Today's Talk introduces concurrent React and highlights the importance of fast and slow updates in user interfaces. It explains how concurrent rendering improves UI performance by allowing fast updates to proceed without being blocked by slow updates. The concept of assigning priorities to renders is discussed, with high priority renders being synchronous and low priority renders being interruptible. The Talk also mentions the benefits of using concurrent features in navigation and list filtering. Overall, concurrent React enhances rendering with interruptibility and prioritization, making the application feel faster and more responsive.
1. Introduction to Concurrent React and Updates
Today, I'll be talking about concurrent React and the importance of fast and slow updates in user interfaces. We showcase examples of fast updates and the impact of heavy computations on the main thread. We also explore a demo where both fast and slow updates coexist.
Hello, everybody. My name is Henrique Núñez. I'm a software developer at Codeminer 42, and today I'll be talking about concurrent React. So come with me.
So we'll start by drawing our attention to user interfaces and interactions with them. Whenever users interact with the UI, it updates itself in response to these interactions, and these updates can be divided into two categories, fast updates and slow updates, in terms of how long they take to process. Updating input fields, buttons, toggles, and sliders, for example, is very fast when we consider these updates in isolation. On the other hand, filtering a huge list, updating a dashboard, recalculating cells in a spreadsheet or performing navigations usually takes a reasonable amount of time to complete, and thus can be considered slow, especially when compared with fast updates.
Now, let's see this in practice. First, we have a demo where we showcase some examples of fast updates. Notice that we click on the button, we write in the text input, we drag the slider, and the updates in response to our interactions are processed instantly. There's no delay. Additionally, we also have two different kinds of animations in this demo, a JS animation and a CSS animation, which, although they are not a direct response to any interaction of ours, are also updating.
Now, this is a slice of a profile that was taken from this demo. Notice that in the interaction section we can see our click, and below that, in the main thread section, we can verify that processing the corresponding update was indeed very fast, under 2 milliseconds.

In the second demo, we have a button that triggers a heavy computation, that is, an update that takes a long time to be processed. In this case, it's an artificial example that will serve to explain some things that will come next, but it could very well be any other example that we talked about previously, like filtering a huge list, so bear with me for now. This is the corresponding profile of the second demo. Notice that this update takes 2 seconds, which is a very long time for a UI update, especially considering that it blocks the main thread.

This third demo is pretty much the previous one, but not quite. Notice that we have programmatic control over how long this heavy computation takes, as the first button initiates the computation and the second button finishes it. It's important to make it clear that even though in this case we can control how long the computation will take, it works pretty much like the previous demo. And as we can see in this profile, this computation still blocks the main thread.

Now, I want to show you what happens when we have a situation where both fast and slow updates coexist. This demo is a combination of the previous ones, where we have both fast and slow updates as examples. In this demo, when we start processing the slow update, the entire UI freezes, and all the fast updates can only be processed after the slow update has finished. The clicks on the button, the text that was written in the text input, interactions with the slider, they only get processed after the heavy computation is done. The only thing that keeps updating, despite the main thread being blocked, is the CSS animation, and only because it takes place on the GPU.
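The kind of heavy, synchronous update shown in these demos can be imitated with a hypothetical busy-loop helper (`blockFor` is my own illustration, not the demo's actual code). While a function like this runs, the main thread cannot process anything else:

```javascript
// Hypothetical sketch: a synchronous busy loop that models the demo's
// heavy computation. While it runs, nothing else on the main thread
// (clicks, input events, JS animations) can be processed.
function blockFor(ms) {
  const start = Date.now();
  let spins = 0;
  while (Date.now() - start < ms) {
    spins++; // burn CPU synchronously
  }
  return spins;
}

// Any event fired during these 100ms waits until blockFor returns.
const before = Date.now();
blockFor(100);
const elapsed = Date.now() - before;
console.log(`main thread was blocked for ~${elapsed}ms`);
```

This is the same reason the profile shows one long, unbroken block on the main thread: synchronous work runs start to finish.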
2. Introduction to Concurrent Rendering
But everything else that relies on the main thread to be processed gets completely frozen. It only takes a single slow update to slow down your entire UI. Both fast and slow updates are coupled to each other because of synchronous rendering. With concurrent rendering, even though you're processing the same slow update, it doesn't block the fast updates anymore. The heavy task is split into smaller chunks, allowing other work to be done in between. Two demos showcase the benefits of concurrent rendering in navigation and list filtering.
But everything else that relies on the main thread to be processed gets completely frozen. And the key point here is the following. It only takes a single slow update to slow down your entire UI. It doesn't matter how well crafted your user interface is, how optimized all your components are, because your UI is always a single slow update away from a bad user experience. And this is the main challenge we're facing here. This is the problem we're set to solve.
As you can see, in our current setting, both fast and slow updates are coupled to each other, in the sense that the slow updates end up blocking the fast ones. Now, the reason this happens is that the default approach React uses for rendering in most situations, which, for many UI frameworks, is also the only approach available, is to render things synchronously. With synchronous rendering, once React starts rendering an update, it will run to completion, completely blocking the main thread until the render is finished. So, in practice, this means that no matter how long the render takes to complete, any user interaction that occurs during the render will have to wait for it, regardless of how fast or urgent responding to it would be.
Going back to the fast and slow updates, now, what if we could decouple them? What if there was a way to let each update be processed at its own pace? Enter concurrent React. Now, I want you guys to pay close attention to this next demo here, because this is the same demo that we saw before, but now there's a twist: instead of using synchronous rendering, we are using concurrent rendering. Notice that now, even though we're processing the very same slow update as before, it doesn't block the fast updates anymore. While the heavy computation is still running, we can still interact with other parts of the UI and they remain responsive. Let's take a look at this demo's profile. What we see here is the heavy task being processed. But now, instead of blocking the main thread, it is split into smaller chunks, and this splitting into smaller chunks lets us fit other work in between these chunks.

Now, going forward, I will show you two more demos running with both synchronous and concurrent rendering so we can make some more comparisons. In this first demo, we have an example of a navigation where navigating to different pages by clicking on the sidebar takes a long time. When using synchronous rendering, the navigation blocks other interactions from being processed. So not only do we have to wait before we navigate to a different page, but the sidebar's hover effect doesn't work either. Now, when using concurrent rendering in the same example, even though the navigation still takes a while to be processed, the sidebar is kept fully functional. And even if we change our minds halfway through a navigation, which is a pretty common thing for users to do, right, and we want to navigate to a different page instead, we can easily do so without having to wait for the previous navigation to complete, because concurrent React will abort the previous, no-longer-needed renders.
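The chunked profile described above can be modeled with a tiny sketch: the slow render is expressed as resumable units of work, and urgent work is drained between units. This is a deliberately simplified illustration of the idea, not React's actual scheduler (which uses a time budget and a real task queue):

```javascript
// Simplified time-slicing sketch (not React's real scheduler): the slow
// render is split into small units, and between units we check whether
// more urgent work has arrived and handle it first.
const log = [];

function* slowRender() {
  for (let i = 0; i < 4; i++) {
    log.push(`low-priority unit ${i}`);
    yield; // a chance to yield between chunks
  }
}

function workLoop(work, urgentQueue) {
  while (true) {
    // between chunks, drain any urgent (high priority) work first
    while (urgentQueue.length > 0) {
      log.push(urgentQueue.shift());
    }
    if (work.next().done) break; // process the next low-priority chunk
  }
}

const urgent = [];
const work = slowRender();
work.next();                        // runs unit 0, then yields
urgent.push('high-priority click'); // a click arrives mid-render
workLoop(work, urgent);             // the click is handled between chunks
console.log(log.join('\n'));
```

The key point is visible in the log order: the urgent click is processed between chunks of the slow render, instead of waiting for the whole thing to finish.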
In the second demo, we have the classic huge-list filtering example. And, of course, as the list has several items, re-rendering it is slow. So when we type, we get this jank, you know, where the search bar freezes briefly. Now, in the second version, with concurrent rendering, the search bar is kept fully responsive while the list is rendering. Which, I think you'll all agree, makes for a much better user experience. Now, you might be wondering how this all works, right? And we'll get to that right now.
3. Enhancing Rendering with Concurrent React
Concurrent React enhances rendering with interruptibility and prioritization. It allows rendering updates to be stopped and resumed, and it prioritizes the most urgent renders. This keeps the user interface responsive.
Concurrent React enhances rendering with two new features, namely, interruptibility and prioritization. Interruptibility is about being able to stop a render halfway through to do other things and then resume it later. Prioritization is about rendering the most urgent things first. Now, by combining both, we're able to start rendering an update, like, let's say, the computation from the demo. And then, if another, more urgent update is queued, I don't know, like interacting with the button or the text input, we can interrupt what we're doing, cater to these more urgent renders, and, once we're done, go back to what we were doing before. And this is essentially what keeps the user interface responsive.
4. Assigning Priorities to Renders
React allows us to mark renders as either High Priority or Low Priority. High Priority renders are synchronous and block the main thread, while Low Priority renders are interruptible and do not block the main thread. Using an analogy with Git branches, features are similar to low priority updates, which can be interrupted by high priority updates. Assigning a high priority to fast updates and a low priority to slow updates decouples them and keeps the application responsive. Concurrent React uses concurrent features to assign priorities to updates.
To achieve that, React allows us to mark renders as either High Priority or Low Priority. A quick but important disclaimer, though: in reality, we have more than two priority levels. But for most purposes in userland, collapsing them into two levels suffices, which is exactly what we're gonna do. We're only gonna be talking about two priority levels in this talk.
Moving on, High Priority renders are just the normal renders we're used to pre-React 18. So, they are synchronous, they block the main thread, which means they cannot be interrupted. And they also interrupt Low Priority renders, which is what we're gonna see next.
Low Priority renders, on the other hand, are interruptible, which means that they do not block the main thread anymore. And this is also important: they only start running after all High Priority renders have been taken care of, you know? This is what makes them work with that idea of background rendering, so to speak.
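As a toy model of that ordering rule (a deliberate oversimplification for illustration, not React's real update queue, which has more levels and interleaves work):

```javascript
// Toy sketch of priority ordering: when several updates are pending,
// all high priority renders run first; low priority renders only
// start once the high priority queue is empty.
function flushUpdates(updates) {
  const high = updates.filter((u) => u.priority === 'high');
  const low = updates.filter((u) => u.priority === 'low');
  return [...high, ...low].map((u) => u.name);
}

const order = flushUpdates([
  { name: 'filter huge list', priority: 'low' },   // slow update
  { name: 'update text input', priority: 'high' }, // fast update
  { name: 'toggle switch', priority: 'high' },     // fast update
]);
console.log(order);
// the fast updates are processed before the slow one
```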
I believe that an interesting way for us to understand this is using the following analogy. Let's say we're developing an application and we're using Git to track its code base. I believe everybody's used to doing that nowadays, right? So, there's a main branch that represents the code that's in production. And when we want to write a new feature, we create a feature branch off main, like, I dunno, feature/awesome-feature, and when we're done working on it, we merge it back to main. Pretty straightforward. Now, whenever there's some critical issue in production, we create a hotfix branch off main, like hotfix/fix-nasty-bug, and when we're done with it, we also merge it back to main. This is also pretty standard.
Now, what happens when we're working on some feature and suddenly we have to ship a hotfix? This is not an uncommon situation, right? I think everybody has already gone through this kind of thing. So, shipping the hotfix is definitely more urgent than delivering the feature. Supposing we're responsible for shipping the hotfix, we first have to interrupt our work on the feature and start working on the hotfix until we complete and merge it. Only after we do that, only after we ship the hotfix, can we go back to working on the feature.
Now, if you've been paying attention, you probably noticed that features are similar to low priority updates, as working on the feature branch, which is analogous to low priority rendering, may be interrupted at any time by the need to ship a hotfix, which in turn equates to high priority updates. With all that in mind, the trick to decouple the fast updates from the slow updates is basically assigning a high priority to the fast updates and a low priority to the slow updates, because this way we can keep the application responsive by preventing the slow updates from blocking the fast ones. And the slow updates, being assigned a low priority, will be taken care of in the background and may be interrupted whenever we need to tend to the fast updates.
Now that we understand the problem we're solving, the solution itself, and how it works, I think it's time to take a look at how we can actually make use of concurrent rendering in practice. In Concurrent React, to assign priorities to updates, we use concurrent features. So let's go back to some of the previous examples we saw before and see how they're actually implemented using concurrent features.
5. Using Concurrent Features
In the navigation example, using concurrent features allows us to make navigations a low priority update, preventing them from blocking the sidebar. In the list filter example, by marking the renders of the filtered list as low priority, high priority updates to the search bar are prioritized. Concurrent rendering doesn't make the application faster, but it makes it feel faster and more responsive. When typing in the list filter example, high priority renders run without interruption, while low priority renders are interrupted by subsequent high priority updates.
So, going back to the navigation example: without using any concurrent features, every update is high priority. As we can see in this first example, because we're using a plain old useState to track the currently selected page, renders that are triggered by navigating to a different page are synchronous and, thus, blocking. Once again, notice that the navigation blocks interactions with the sidebar.
Now, when we use concurrent features in this next version, in this case useTransition, we can make navigations a low priority update, so that they don't block the sidebar from updating anymore, while also being able to show a loading state with the isPending flag, which is why you see the screen getting dim, you know, with this kind of grayish overlay. In this version, we've added a new state, delayedPage, and we're going to use that state to trigger the low priority update, which means we're actually going to use this update to re-render the page itself. Then, by using the useTransition hook, we have access, first, to the isPending flag, which tells us when there's a low priority render pending, and also to the startTransition function, which allows us to mark a state update as low priority, so that its corresponding render, that is, the render that's triggered by this update, will be a low priority render and, thus, non-blocking.
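Putting that together, here is a minimal sketch of the pattern being described, assuming hypothetical `Sidebar` and `Page` components (the demo's actual code may differ):

```jsx
import { useState, useTransition } from "react";

function App() {
  // high priority state: updates the sidebar selection instantly
  const [page, setPage] = useState("home");
  // low priority state: actually re-renders the (slow) page content
  const [delayedPage, setDelayedPage] = useState("home");
  const [isPending, startTransition] = useTransition();

  function navigate(nextPage) {
    setPage(nextPage); // synchronous, keeps the sidebar responsive
    startTransition(() => {
      // this update is marked low priority: its render is
      // interruptible and does not block the main thread
      setDelayedPage(nextPage);
    });
  }

  return (
    <div style={{ opacity: isPending ? 0.5 : 1 }}>
      <Sidebar selected={page} onNavigate={navigate} />
      <Page name={delayedPage} /> {/* the slow part */}
    </div>
  );
}
```

Here, isPending drives the dimmed, grayish overlay mentioned above, signaling that a low priority render is still pending.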
In this second example, once again, we're filtering a huge list that, due to the number of components it has to render, is quite slow. Without using any concurrent features, rendering the filtered list blocks the updates to the search bar, which is just what we saw before, right? That's why it kind of freezes. Now, in the second version, by using the useDeferredValue hook and passing the deferred filter that we get from the hook to the filtered list instead, we're marking its renders as low priority, and now they yield to the high priority updates to the search bar. Notice how in these two examples, although concurrent rendering doesn't actually make the application faster, you know, it doesn't speed anything up, it makes it feel faster by making it more responsive, and this is very important.
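A minimal sketch of this setup, assuming a hypothetical `FilteredList` component (the demo's actual code may differ):

```jsx
import { useState, useDeferredValue } from "react";

function SearchableList({ items }) {
  const [filter, setFilter] = useState("");
  // deferredFilter lags behind filter: renders that depend on it
  // are low priority and yield to updates of the search bar
  const deferredFilter = useDeferredValue(filter);

  return (
    <>
      <input
        value={filter}
        onChange={(e) => setFilter(e.target.value)} // high priority
      />
      {/* FilteredList is assumed to be memoized (React.memo), so the
          high priority render can skip it while deferredFilter is stale */}
      <FilteredList items={items} filter={deferredFilter} />
    </>
  );
}
```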
Okay, so next I wanna dive a little bit deeper into this list filter example, so that we can watch more closely how this process works. Okay, so this is gonna set the stage for what we're gonna see next. So in here, I'm now logging to the console when both high and low priority updates start and finish rendering, and I'm also highlighting when renders are interrupted.

Let's see this step by step. First, when we type the first letter, H, as we're calling a set state to set the filter, this causes a high priority render, which runs from start to finish without stopping, like, without being interrupted. Notice that in this first high priority render, only filter has updated. The deferred filter still has its old value, which is just an empty string. Then, after the high priority update finishes rendering, React starts the low priority render, where both filter and the deferred filter are up to date. But once again, as we type another letter before the low priority render is finished, we interrupt it again to cater to another high priority update. Also, in this second high priority render, as the previous low priority render wasn't able to finish, we're still seeing the first deferred filter value. This shows us that in low priority renders all values are up to date, regardless of their origin; however, in high priority renders, deferred values might be stale.

This time, we're able to finish the low priority render before we type anything else. So, as you can see, in the filtered list, the UI is updated accordingly. Actually, just let me go back, so you can see this happening. Like before, there's no filtering at all, because we never finished any low priority renders that update the list. And now, for the first time, we've finished the low priority render, and because we did that, you can see that the list actually updated. Moving forward, we type a third letter, which, yet again, triggers a high priority render.
6. Synchronization of Deferred Values
As long as low priority renders are interrupted, deferred values will be stale in high priority renders. The process repeats until we finish typing. Concurrent rendering requires identifying fast and slow updates and assigning priorities accordingly. Developers should check out a blog post for more details on concurrent rendering and a package for debugging concurrent features.
But now notice that, as we were able to finish the previous low priority render, in this high priority render the deferred filter is up to date with the filter. They are synchronized, which tells us that, as long as low priority renders keep being interrupted, deferred values will be stale in high priority renders. And it's only when a low priority render is able to complete that the states will be in sync again.
Now, the process repeats once more until we finish typing everything. So we type one more letter, and then we filter the list with a low priority render. Maybe we finish it, maybe not, maybe it gets interrupted, until eventually we finish typing everything. Then the low priority renders are able to finish, and we get to the final state, so to speak.
Now, before we move forward, there are two important remarks that need to be made. First, in this example we're using the useDeferredValue hook, but if we had used the useTransition hook instead, it would have worked pretty much the same way, because they work pretty much in the same way. Second, the more attentive ones might have noticed that, even though this wasn't explicitly shown, due to the lack of screen real estate, to be honest, the components whose updates are slow, namely both the main component in the navigation example and the filtered list in the list filtering example, are being memoized. So there's a React.memo wrapping their definitions, which is what makes their re-renders be bypassed in high priority renders. It's because their renders are bypassed that we can keep the interface responsive, by not having to re-render the expensive components, so to speak. And their renders are bypassed precisely because they receive a stale value in the high priority renders.
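That memoization can be sketched like this, with a hypothetical `FilteredList` (the real demo component surely renders something heavier):

```jsx
import { memo } from "react";

// Memoizing the slow component is what lets the high priority render
// skip it: during that render, the deferred filter still has its
// previous (stale) value, so the props haven't changed and React
// bails out of re-rendering this expensive subtree.
const FilteredList = memo(function FilteredList({ items, filter }) {
  const visible = items.filter((item) => item.includes(filter));
  return (
    <ul>
      {visible.map((item) => (
        <li key={item}>{item}</li>
      ))}
    </ul>
  );
});
```

Without the memo wrapper, the high priority render would re-render the expensive list anyway, and the responsiveness benefit would be lost.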
Ok, so with all that in mind, now you might be thinking: concurrent rendering is so nice, does this mean that we should use transitions and deferred values for everything? Right? Well, actually, no. We definitely do not want to do that. If we did, we would be doing the equivalent of assigning a low priority to each and every update. And if everything has a low priority, then there are no priorities at all, right? See, this is what we had pre-React 18, before concurrent rendering was a thing. All updates had the same priority, which was the very cause of our problems in the first place. It was just that instead of every update having a low priority, they all had a high priority. But it's the same thing, you know? So in the end, our job as developers is to identify which updates are fast, which are slow, and then assign priorities accordingly, so that we can, to the best of our efforts, keep the application responsive and deliver a great user experience. If you want to know more about Concurrent React, there's this blog post I wrote that goes into much more detail on concurrent rendering. It also talks about concurrent features, and even a little bit about Suspense and how it interacts with the concurrent features. So don't forget to check it out. Also, last but not least, there's this tiny package I published that has a hook that helps you debug concurrent features. It's the hook I used to create the logs you saw on the filtered list slides. It's pretty interesting, and worth checking out. So with all that being said, thank you so much, and have a wonderful day.