Let the Main Thread Breathe!


The main thread, on the web, has a lot of responsibilities. At the same time, web apps are getting more sophisticated every day. As a result, the main thread gets so busy that it disappoints our users with janky frames! An off-main-thread architecture ensures apps run smoothly on every device, for everyone.


In this talk, we will go through the possibilities in browsers, such as Web Workers, Worklets, and WebAssembly, and introduce practical tools that allow us to boost the user experience.

Majid Hajian
35 min
17 Jun, 2021


Video Summary and Transcription

Let's explore how to improve web application performance by offloading tasks from the main thread to other threads. We need to ensure compatibility with all devices and users to avoid frustrating experiences. Web Workers and Web Assembly can help improve performance by offloading tasks, but there are trade-offs to consider. Converting existing codebases to WebAssembly can be done gradually, and it's important to measure performance before making the conversion.


1. Introduction to the Talk

Short description:

Let's talk about how we can offload tasks from the main thread to other threads and improve the performance of our web applications. Around 50% of the world's population is connected to the Internet, and this number is constantly increasing. 90% of these users access the Internet through their mobile phones.

Hello, let's talk about how we can offload some tasks from the main thread to other threads and let the main thread breathe a little bit. The reason I wrote this talk goes back maybe two years, when I was researching how many people around the world are connected to the Internet. I found out that around 50% of the population is connected. When I came back to the statistics recently, I noticed that after one or two years, five percent more of the population is connected compared to two years ago. And interestingly, around 90% of these people are connected to the Internet via their mobile phones. I mean, they're checking our websites, our web applications, via mobile phones.

2. Importance of Device Compatibility

Short description:

We need to care about all devices and all users, even if we're targeting a specific market. There are various phones and devices with different specifications that our users are connecting with. We must ensure that our complex applications are responsive and reliable, avoiding unresponsive pages that frustrate our users.

And when I'm talking about mobile phones, I'm talking about a whole lot of different phones. It's not just high-quality, high-end hardware like an iPhone. It's also phones that look good, perfectly fine, but whose hardware is not as powerful as a phone like the iPhone, for instance.

For instance, the Nokia 2. People are connecting with these phones, for sure, and we are shipping our complex applications to them every day. And even when it comes to desktops, we have different laptops with different specs and different desktops with different hardware. From time to time, when we ship our complex application, we show an unresponsive page to our users, which makes them frustrated. They don't like to see these unresponsive or unreliable pages. So what can we do, and why do we need to care? The thing is that we need to care about all of our users, all devices, everyone who is connected. Even if you are shipping an application for, let's say, a local market — and you should probably gather some statistics there — there are still devices that you need to cover and support.

3. Exploring Web Workers and Web Assembly

Short description:

In this talk, we will explore Web Workers and WebAssembly to offload tasks from the main thread and improve user experience. I am Majid Hajian, a software engineer based in Oslo. Let's dive into the main thread and the challenges it faces with complex applications. We need to be mindful of the budget for delivering 60 FPS and the variations across different devices. It's our responsibility as developers to optimize performance and reduce user frustration by offloading tasks to other threads.

In this talk, I'm going to explore two options that we have in browsers right now — an underestimated one, Web Workers, and a newer one, WebAssembly — and see how we can offload some of our logic to these threads, free up our main thread, and make our users happier.

Let me introduce myself. My name is Majid Hajian, and I'm based in Oslo. I am a passionate software engineer. I do a lot of JavaScript and frontend, as well as Dart and Flutter. I'm also an author and instructor of a couple of courses and tutorials, as well as some books, especially about PWAs. In my spare time, I organize quite a lot of meetups and conferences — that's my passion. And you can find me on the internet with this handle almost everywhere.

So let's go back to our main thread. When we talk about the main thread, we're talking about quite a lot of things happening there. Imagine this picture: it shows what a main thread does. It has an event loop. It downloads JavaScript, parses it, does styling, layout, paint, repaint, compositing, and a lot more. So the main thread carries a lot of tasks on its shoulders, and by creating complex applications, we add even more. On top of that, we expect it to deliver 60 FPS — 60 frames per second. That makes it even harder for the main thread to take on all of this responsibility.

Let me give you a quick budget. If you want to ship 60 frames in one second, we have a budget of around 16.6 milliseconds per frame, right? Now let's look at this budget on different devices. If you compare a high-spec phone with a low-spec phone, you see that on one phone this budget is quite okay, on another it's on the edge, and on another it overshoots our budget entirely. That's when our users see janky frames. And it doesn't stop there: at higher screen refresh rates it's even worse. At 90 Hertz you have less than 16 milliseconds — around 11 milliseconds — and at 120 Hertz even less. That brings unpredictability to the applications we deliver. We don't know whether an application will run on different devices as smoothly as it does when we test on an iPhone or on a powerful desktop. This is where I have to say it is our responsibility — the developers' responsibility — to handle this unpredictability and reduce bad user experience by freeing the main thread and moving some tasks to other threads. And in fact, we have a very old friend for that.
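The frame-budget arithmetic above is just the refresh rate divided into one second. A quick sketch (plain JavaScript, not from the talk's slides) makes the numbers concrete:

```javascript
// Frame budget: the time available to produce one frame at a given
// refresh rate. Any main-thread work that runs longer than this
// risks a dropped ("janky") frame.
function frameBudgetMs(refreshRateHz) {
  return 1000 / refreshRateHz;
}

console.log(frameBudgetMs(60).toFixed(1));  // "16.7" — ~16.6 ms at 60 Hz
console.log(frameBudgetMs(90).toFixed(1));  // "11.1" — at 90 Hz
console.log(frameBudgetMs(120).toFixed(1)); // "8.3"  — at 120 Hz
```

The higher the refresh rate, the smaller the budget — which is why the same app can feel smooth on one device and janky on another.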

4. Introduction to Web Workers

Short description:

A Web Worker is like a headless browser that runs everything in isolation, without access to the DOM. It can be used to offload tasks from the main thread. However, the post-messaging system used for communication may not align with the async, promise-based code we prefer to write. The Comlink library simplifies spawning Web Workers and promisifies the exposed functions. We can differentiate between business logic and UI logic by running heavy computation on worker threads and keeping UI manipulation on the main thread.

Going back maybe to 2012 and after, almost all browsers have had good support for it. The Web Worker is a very underutilized tool — an underutilized API in the browser. But let's talk about it. Quickly: a Web Worker is similar to a headless browser, let's say. It runs everything in isolation. There are no shared variables, and tasks can even run in parallel. The only thing is that it's headless — it doesn't have access to the DOM, and that's one of its defining characteristics, right?

So, quickly, the way a Web Worker works: you spawn a new instance of a Worker, and then you communicate with it via a post-messaging system. A Web Worker has the same characteristics as the main thread — it has a message queue and all of those things, and you can even run WebAssembly inside it. But if it's that easy, why don't we use it more? Because this post-messaging system might not be the way we like to work day to day, you know? We are used to async, promise-based code — or sometimes callbacks, but these days mostly promises. It would be very nice if, instead of this post-messaging system, we could just call into the worker thread directly and get a response back, right?
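The request/response matching that makes postMessage feel promise-based can be sketched in a few lines. This is not Comlink's actual implementation — it's a minimal illustration of the same idea, using an in-memory endpoint pair as a stand-in for a real Worker so the sketch runs anywhere:

```javascript
// Sketch of the core trick libraries like Comlink automate: turn
// fire-and-forget postMessage into promise-returning calls by tagging
// each request with an id and resolving when the matching response
// arrives. The "endpoints" are in-memory stand-ins for main <-> worker.
function makeEndpointPair() {
  const a = { handlers: [] };
  const b = { handlers: [] };
  a.postMessage = (msg) => queueMicrotask(() => b.handlers.forEach((h) => h(msg)));
  b.postMessage = (msg) => queueMicrotask(() => a.handlers.forEach((h) => h(msg)));
  a.onMessage = (h) => a.handlers.push(h);
  b.onMessage = (h) => b.handlers.push(h);
  return [a, b];
}

// "Worker side": expose named functions over the endpoint.
function expose(fns, endpoint) {
  endpoint.onMessage(({ id, name, args }) => {
    endpoint.postMessage({ id, result: fns[name](...args) });
  });
}

// "Main-thread side": a proxy whose methods are promisified.
function wrap(endpoint) {
  const pending = new Map();
  let nextId = 0;
  endpoint.onMessage(({ id, result }) => {
    pending.get(id)(result);   // resolve the matching call
    pending.delete(id);
  });
  return new Proxy({}, {
    get: (_, name) => (...args) =>
      new Promise((resolve) => {
        const id = nextId++;
        pending.set(id, resolve);
        endpoint.postMessage({ id, name, args });
      }),
  });
}

// Usage: an exposed sort() becomes awaitable on the other side.
const [mainSide, workerSide] = makeEndpointPair();
expose({ sort: (arr) => [...arr].sort((x, y) => x - y) }, workerSide);
const api = wrap(mainSide);
api.sort([3, 1, 2]).then((sorted) => console.log(sorted)); // [ 1, 2, 3 ]
```

Comlink additionally handles transferables, callbacks, and proxy lifetimes, which this sketch deliberately omits.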

Here is where the library Comlink comes in handy. With this library, spawning a Web Worker is quite easy. We just spawn our worker, wrap it with Comlink, and get access to the service or the functions that we expose from the Web Worker. The good thing is that, as you see in the code example, Comlink promisifies all of the functions, so you can just await the result and do something with it. In the worker JavaScript file, you also need to declare what you want to expose to the main thread: you just tell Comlink, hey, these are my functions, or this is the object I want to expose. If I want to give an example in React: we spawn a worker, wrap it, and assign it to a variable; then, in the worker, we have our function with a heavy computation task — say, sorting an array. I can simply send the array to the worker, do the work, get the result back, and set it in state. And when I'm done, I can also terminate my worker. And let me tell you: this is awesome, right? We can now differentiate between our business logic and our UI logic — offload the business logic to a worker thread and keep the UI logic on the main thread. This is actually how mobile developers, in Swift or Android, usually work. You have a UI thread where you have access to the UI and can manipulate it. In the browser we can somehow mimic that: the main thread has access to the DOM and manipulates it, and any other logic can be offloaded to other threads, which do the work and return the result.

5. Redux and Worker Thread

Short description:

In Redux, we can keep the UI-related part in the main thread and move the logic-heavy part to the worker thread. This allows the main thread to handle UI tasks without blocking. By moving the reducer and store to the worker thread, we can improve performance and avoid UI blocking. The sample code demonstrates how the main thread is blocked when the reducer is working, but the worker thread allows the main thread to handle other UI tasks without blocking. Moving the reducer to the worker thread can improve performance, but let's examine it further.

And then again, in the main thread, I can manipulate it. An example we're all probably familiar with is Redux. It's a pretty popular state-management pattern, especially in the React ecosystem. The way it works, quickly: you have your reducer, which might actually be blocking because it's synchronous — it gives you your store, your unified global state. And then you have your view and actions: you dispatch an action from the view, the reducer gives you the store, and you have your global state everywhere.

What we can do, as an example, is keep the part that is related to the DOM and the UI in the main thread — the view and actions — and offload the part that is logic-heavy and might be blocking to the worker thread — the reducer and the store can be kept over there. Let me show you an example. If I want to write this with a worker thread, this can be my worker JavaScript. First, here is my reducer: I have an action, the action type, and I do some work. In this example I deliberately add a three-second delay, just to show you how we sometimes block the UI. Then here are my store-worker.js and application.js, where I expose my store. You see that I create a store from this reducer; in the main thread's application.js, the way we usually work without a Web Worker, we use the same method to create our store, but with a worker, we instantiate the worker and wrap it with Comlink, and there we have access to our store with a little bit of modification. By the way, don't worry about the sample code — you have access to all of these slides and all of these samples afterwards, so you can go back and read them.
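The talk's Redux sample isn't reproduced on this page, so here is a hypothetical stand-in reducer illustrating the point: reducers run synchronously on whichever thread dispatches, so heavy work inside one blocks that thread — which is exactly why moving it to a worker helps.

```javascript
// A minimal reducer of the kind the talk moves into a worker.
// Because Redux reducers are synchronous, any heavy work here
// (imagine SORT_ITEMS over a huge array) freezes the dispatching
// thread — the UI, if that thread is the main thread.
function reducer(state = { items: [] }, action) {
  switch (action.type) {
    case 'ADD_ITEM':
      return { ...state, items: [...state.items, action.payload] };
    case 'SORT_ITEMS':
      return { ...state, items: [...state.items].sort((a, b) => a - b) };
    default:
      return state;
  }
}

// Usage: dispatching a few actions through the pure reducer.
let state = reducer(undefined, { type: '@@INIT' });
state = reducer(state, { type: 'ADD_ITEM', payload: 3 });
state = reducer(state, { type: 'ADD_ITEM', payload: 1 });
state = reducer(state, { type: 'SORT_ITEMS' });
console.log(state.items); // [ 1, 3 ]
```

Because the reducer is a pure function of (state, action), it can run anywhere — which is what makes the worker-thread setup described above possible.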

So let me show you an example now. Here, when I have everything on my main thread and I dispatch an action, the UI blocks while the reducer is working. But if you look at the worker version, I dispatch an action and try to change something, and if you watch the animation, my main thread is not blocked — it's breathing. It has no problem doing other UI work, because the reducer has moved to the worker thread. That's simply how it works. But you may immediately ask: is it even faster? Well, let's have a look. Say we have this UI thread — the main thread in our example.

6. Worker Threads and Performance

Short description:

Using a new worker thread may not always be faster due to the added overhead, but it offers reliability and improves UI responsiveness. It can also potentially be faster by utilizing multiple cores. However, it comes with some risk.

When you spawn a new worker thread, you add a little bit of overhead: instantiation, the post-messaging system, and so on. So in fact, it might not be faster. But I can say it's more reliable. As you saw in the example, you don't block your UI — you send the business logic to a worker thread and get the response back on the main thread. That makes the UI more reliable and less unresponsive. And it might even be faster: if your phone — or your desktop, in the browser — has several cores, the work may run in parallel on a different core at the same time, and then it might indeed be faster. I have another talk about parallelism in JavaScript that you can watch later. But as a web developer you don't know for sure, so it can be a little risky to count on.

7. Working with AssemblyScript and WebAssembly

Short description:

Let's move on to the next example of sorting a heavy list with Worker. WebAssembly is a powerful tool, but AssemblyScript offers a familiar and easier option for JavaScript developers. Setting up an assembly script is straightforward, and you can export functions like in TypeScript. Once you're done, running a simple command generates your Wasm file. Instantiating the assembly script is simplified with the loader from assembly script, making it easy to fetch and use the Wasm file. Let's look at an example where the assembly version performs without blocking and executes quickly.

So let's move on to the next example. Here, you see that I want to sort a heavy list with a Worker. Without the Web Worker, you see that I'm blocking my animation and UI in this case, and it's very unpredictable — my users are not going to be happy with that.

Another friend of ours these days, with very good support in most browsers, is WebAssembly. When we hear WebAssembly, we quickly think of languages like C, C++, and Rust. I'm not going to talk about the theory behind it, because you have probably heard about WebAssembly and know the idea. But as a JavaScript developer, I have to say: it's fun to learn another language, but sometimes we don't have time on a project; maybe we're in a rush, and it's a little risky to go with a language we don't know, right? This is where AssemblyScript comes in. AssemblyScript is a compiler for a strict subset of TypeScript that compiles to WebAssembly. The good thing is that you write it the TypeScript way, and as a JavaScript developer you're probably already familiar with that language — and even if you're not, writing TypeScript is far easier for us than C or C++.

And, surprisingly, it's quite easy to set up AssemblyScript for your project. You can simply run a few commands — you see them in the slides — and you get everything scaffolded for you. There you have your TypeScript entry file, as you see; you can go ahead and write your functions: create an array, randomize it, sort it — these are the examples I wrote here — or Fibonacci, in this example with no optimization. You can export them as you do in TypeScript, using AssemblyScript's special numeric types. You can go ahead and read about these types and get familiar with them; there are also built-in modules and libraries for AssemblyScript that you can read about.
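The scaffolded examples themselves aren't shown on this page; here is a plain-JavaScript rendition of the Fibonacci example. In AssemblyScript the logic would be identical apart from explicit numeric types, e.g. `export function fib(n: i32): i32`.

```javascript
// Iterative Fibonacci, the kind of pure numeric function that is a
// natural candidate for AssemblyScript -> WebAssembly compilation.
// (AssemblyScript version differs only in the i32 type annotations.)
function fib(n) {
  let a = 0, b = 1;
  for (let i = 0; i < n; i++) {
    [a, b] = [b, a + b];
  }
  return a;
}

console.log(fib(10)); // 55
```

Pure, allocation-light numeric code like this is where WebAssembly's predictable performance (discussed below) shows the most.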

Once you're done — again, surprisingly easy — just run a simple command, npm run asbuild, and you get your Wasm file out of the box. The most important part is instantiating this AssemblyScript output, and the way it works is fairly simple: you import the loader from AssemblyScript, then fetch your Wasm file, and boom, everything is done. If you have read about WebAssembly, you know that instantiating a Wasm file normally takes a bit of work — it's not just one line of code. AssemblyScript's loader makes it very easy to instantiate your Wasm file, and it's a promise-based function, so you can simply await it, get the result, and do something in your application — in this case, I'm setting a state. So let's look at an example: a function coming from WebAssembly and a similar function in JavaScript. As you see here, the WebAssembly version, again, doesn't block, and it returns very fast.

8. WebAssembly Performance and Final Example

Short description:

The WebAssembly version is almost seven times faster than JavaScript. Always measure and see the difference. The final example is a task where you transcode your live camera into an MP4 format using a WebAssembly thread. Let's embrace the power of workers, make apps more reliable and predictable, and leverage these APIs. Happy Birthday React Summit! Thank you for listening. Feel free to reach out with any questions.

If you look at the console log, you see that the first time I run the WebAssembly version of the function, it's quite fast — almost seven times faster — but when I run the JavaScript version for the first time, it's slower. But always measure, because what I can say is that what you run in WebAssembly stays on the same path — it's always smooth and predictable in terms of performance — while JavaScript is not like that. So always measure and see the difference.
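The "always measure" advice can be made concrete with a tiny timing helper — a rough sketch, not a real benchmark harness. Taking the median of several runs matters here because, as noted above, the first JavaScript run is skewed by engine warm-up:

```javascript
// Time a function over several runs and report the median, which is
// more stable than a single run: the first runs are misleading while
// the JS engine is still optimizing ("warming up") the code.
function medianTimeMs(fn, runs = 5) {
  const times = [];
  for (let i = 0; i < runs; i++) {
    const start = performance.now();
    fn();
    times.push(performance.now() - start);
  }
  times.sort((a, b) => a - b);
  return times[Math.floor(runs / 2)];
}

// Usage: compare two implementations of the same task.
const jsMs = medianTimeMs(() => {
  let acc = 0;
  for (let i = 0; i < 1e5; i++) acc += i;
});
console.log(`median: ${jsMs.toFixed(3)} ms`);
```

Real comparisons should also use identical inputs and enough runs to drown out scheduling noise; browser devtools profilers are the next step up from this sketch.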

And let's go to the final example, which is your task — you need to do it. First, you can easily get access to these examples. Second, here is the task: you open this up, you see this screen, you run it, and it transcodes your live camera into an MP4 format. You can even download the result, all without blocking, because it's running in a WebAssembly thread. I'm not going to run it here because the transcoding takes a little time, so I leave it to you to give it a shot and see how it works.

And with that said, let's embrace the power of workers. Let's use them, let's make apps that are more reliable, predictable, and usable for our users by leveraging these APIs in our browsers, and by using some of the libraries that make them even easier to use, right? And before we go to the last slide: Happy Birthday, React Summit! This is my gift — these slides are for you. I hope you have a very fluffy cake for your birthday. And the last slide: thank you for listening. You can access the slides and a link to the source code via this short link, and if you have any questions, you can reach out to me on Twitter — I'm quite active there — or send me an email. I'll be happy to answer all of your questions. Thank you.

Adit, how are you doing today? I'm very good, actually. Very good. I'm very excited to join you. Like, very excited today. Such an amazing conference. Your excitement about, you know, being off the thread is just unfathomable, I guess, at this point. But I really like that. And I have questions. And I think there are many questions or many things that you kind of brought up, which are really, really important and relevant. Because you were looking into mobile phones and what is happening with the state of mobile in general.

9. Optimizing Legacy Applications

Short description:

Old phones that we dismiss don't disappear and can still be used, causing a horrible experience with heavy applications. To decide what should go into WebWorker, Worklet, or WebAssembly, run the application on a low-level device and identify areas for improvement. Move synchronous operations to WebWorker, use Worklet for rendering, and consider WebAssembly for tasks with better libraries. There is a trade-off in spawning a Web Worker, but it's not necessary for everything.

It's very interesting, because there was an article just a couple of days ago saying that the old phones we dismiss — because we don't even like them anymore and new ones keep coming out — don't just disappear. They get passed down through generations of owners. So second-hand, third-hand, they're still being used. And if you're shipping a heavy application today, that's going to be just a horrible experience on them.

So maybe to really break it down into strategic parts: if you have a legacy application, you want to make it fast, and you realize you have issues with the main thread, how do you decide what should go into a Web Worker, what into a Worklet, and what into WebAssembly? What's your strategy? Yes, this is a very, very good question. I often get asked exactly what you're asking now. It's a bit broad to answer, but I'll try to simplify it. What I usually do — this is just my personal experience — is, like you said, first run my application on a very low-end device and try to figure out which parts of the application need improvement. Depending on the improvement needed, I decide whether it should go to a Web Worker or to a Worklet. As an example, I gave the Redux example, right? Sometimes you have a synchronous operation in your application — just some JavaScript execution — and you realize it's blocking your event loop for a while, and therefore blocking your UI. That's probably a good candidate to move to a Web Worker. Then you go to another part of your application and see that a background you render causes janky frames while an animation is running. Then you think: we have another API, Worklets, for low-level rendering of things like graphics and audio, so I decide to move that to a Worklet instead of a Web Worker.

Or sometimes — I actually had an example of this in the talk — you're using some JavaScript libraries to decode, encode, or compress data, images, or videos in your application. That's probably the part where you think: there are better libraries in C, or some written in Rust, that I can compile to WebAssembly. This is very important: when we talk about these APIs, first identify which part of the application has the problem, and then figure out which API is the closest fit for it, you know? That's how I usually do it.

Yeah, that definitely makes sense, and it's really interesting that we have all these options available, right, and so we can actually make use of them. And it's probably, you know, every single sophisticated application will probably use a very interesting combination, hybrid combination of all those things.

There were also many questions coming up in the chat. One of them was from Jay Holfeld, I guess. Jay would like to know: what's the trade-off between spawning a Web Worker and actually doing the work? Is using Web Workers worthwhile even for minimal tasks? Yes — first of all, hi, Jay. Very good question. If you paid attention to one of my slides, I said there is a little bit of overhead when you spawn a Web Worker. So technically, you're adding a little bit of work to your application — workload, let's say. So there is a trade-off, for sure. But then, do you have to use a Web Worker for everything? "Okay, today I learned there's a library I can simply use, and then it reads just like the synchronous-looking JavaScript I normally write."

10. Considering the Trade-off of Using Web Workers

Short description:

Pushing everything to Web Worker may not always be worth it in terms of user experience and resource consumption. Measure the impact and only use it when necessary.

"It's very easy — from tomorrow I can just push everything to a Web Worker." Well, that's not the case. I had another slide where I emphasized that you need to measure when you push something to a worker. You need to see whether moving that execution to a Web Worker is worth it in terms of user experience. For instance, if moving it to a Web Worker doesn't solve any user-experience problem — nothing was blocking, nothing needed improvement — then perhaps you should not do it, because there is a trade-off: you are adding a little workload. And remember, you're also consuming more of your user's hardware resources. This is something you need to pay attention to.

11. Creating and Managing Web Workers

Short description:

You can create an unlimited number of web workers, but it's important to use them judiciously and clean up after you're done with a task to avoid unnecessary resource consumption.

So small task or big task — that's one consideration. Another is that you need to take care of the user experience and find where in your application something is blocking. Okay. A very similar, slightly related question comes from Tushar, who asks: how many worker.js-like threads can we create without impacting overall performance too much? I can see myself writing many things using Comlink in many, many files. Yeah — very good question again. Technically, you can create an unlimited number of Web Workers. There is no hard limit, as long as the user's system resources aren't exhausted and the browser doesn't kill your workers. But should you do that? Perhaps not. You need to decide when to use them. And more importantly: when you use a worker on one page — because of some problem, you need a Web Worker and Comlink helps, right? — then after leaving that page, or once the task is done and you don't need it anymore, you should terminate your worker and clean up after yourself. That's a general rule in programming: clean up when you're done. So the short answer is, of course you can create an unlimited number of workers, but should you? Probably not, because you don't want to drain your user's resources and crash the tab or the browser by opening too many things unnecessarily. Use them when necessary, and clean up afterwards to make sure nothing is leaking or draining your user's system resources.
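The clean-up advice can be sketched as a small registry that terminates everything it tracks when a page or task finishes. The worker objects below are stand-ins with only a `terminate()` method, so the sketch isn't tied to a browser:

```javascript
// Track spawned workers so they can all be terminated together,
// e.g. when a page unloads or a task completes. In a browser the
// entries would be real Worker instances, which also expose
// terminate(); here we use stand-in objects so the sketch runs anywhere.
class WorkerPool {
  constructor() {
    this.workers = new Set();
  }
  add(worker) {
    this.workers.add(worker);
    return worker;
  }
  // Terminate and forget every tracked worker.
  disposeAll() {
    for (const w of this.workers) w.terminate();
    this.workers.clear();
  }
}

// Usage with stand-in workers:
const pool = new WorkerPool();
const w1 = pool.add({ terminated: false, terminate() { this.terminated = true; } });
const w2 = pool.add({ terminated: false, terminate() { this.terminated = true; } });
pool.disposeAll();
console.log(w1.terminated, w2.terminated); // true true
```

In a real app you would call `disposeAll()` from your framework's unmount/teardown hook, so idle workers never outlive the page that needed them.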

12. Converting Existing Codebase to WebAssembly

Short description:

To convert an existing codebase to WebAssembly, consider rewriting JavaScript code in TypeScript and gradually converting parts to AssemblyScript. Measure performance to determine if it's worth the conversion. When loading a large WebAssembly application, use a lazy loading approach to load only what is necessary at the beginning and load additional modules as needed.

Very detailed answer. Thank you, Majid. Another one just came up by Alexander Botterham. We already use web workers doing real-time text analysis. If we would like to increase performance by using WebAssembly, what would be the best way to convert this existing codebase, which is really huge, to WebAssembly bit by bit?

Yes. Very good. You know that I introduced AssemblyScript today. The first step, perhaps, if you wrote it in JavaScript, is to rewrite it in TypeScript, if you haven't already. The second step is to go piece by piece through that TypeScript code, figure out which parts you can write with AssemblyScript — with its specific set of types — and convert them. But you also need to think about whether it's worth it. Because, as I said in the slides, you use WebAssembly because when it runs, it's reliable from beginning to end, while JavaScript might not be that reliable: it depends on when the engine optimizes the JavaScript code in the browser, so it might be very performant or not. So it's important to measure first. Technically, you can probably take a small piece, do an experiment, write it with AssemblyScript, compile it, and see whether it's worth it in terms of performance measurements. That makes sense.

Speaking about just that, and speaking about WebAssembly: one thing I was wondering about as well is, if you use WebAssembly for a big application because it's good at heavy computations, how do you deal with loading that big application over the internet? Is it an incremental way of doing that? What would you suggest?

Actually, a very good question. Similar to other patterns on the web, there is a lazy way of loading WebAssembly. Technically, you figure out which part is the most important at startup and load that first; once you need more, like a plugin or a package, you modularize your WebAssembly and lazy-load those modules afterwards. This pattern already exists for loading WebAssembly, and there are good examples on the internet built this way. So you don't need to load, say, ten megabytes of WebAssembly at the start of the application; maybe you need one megabyte or less than that, before gzip. It's important to think about this, to make sure you load what is needed at the beginning and then continue loading what is necessary afterwards.
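The lazy-loading pattern described here can be sketched as a small on-demand module registry. The module name `math` and the inline byte array (a hand-assembled module exporting `add`) are stand-ins for real `.wasm` files; a browser app would fetch each one with `WebAssembly.instantiateStreaming(fetch(url))` instead of compiling inline bytes:

```javascript
// Registry of wasm module sources; in a real app these would be URLs to .wasm files.
const moduleSources = {
  math: new Uint8Array([
    0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00,       // \0asm magic + version
    0x01, 0x07, 0x01, 0x60, 0x02, 0x7f, 0x7f, 0x01, 0x7f, // type: (i32, i32) -> i32
    0x03, 0x02, 0x01, 0x00,                               // function 0 uses type 0
    0x07, 0x07, 0x01, 0x03, 0x61, 0x64, 0x64, 0x00, 0x00, // export "add"
    0x0a, 0x09, 0x01, 0x07, 0x00, 0x20, 0x00, 0x20, 0x01, 0x6a, 0x0b, // i32.add body
  ]),
};

const instantiated = new Map();
let loads = 0;

// Instantiate a module only on first use; later calls hit the cache.
function loadModule(name) {
  if (!instantiated.has(name)) {
    loads++;
    const mod = new WebAssembly.Module(moduleSources[name]);
    instantiated.set(name, new WebAssembly.Instance(mod).exports);
  }
  return instantiated.get(name);
}

console.log(loadModule('math').add(20, 22)); // 42
loadModule('math');                          // cached: no second compile
console.log(loads);                          // 1
```

Combined with dynamic `import()` for the JavaScript glue code, this keeps the initial payload down to whatever the first screen actually needs.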

Okay, excellent. Thank you for the wonderful questions; that was very, very insightful. Dear friends, we're out of time here, so please give a warm round of sunshine emojis to our wonderful guest, Majid. And please also join him in the room later to ask questions directly. Thank you so much for joining us, Majid.

Check out more articles and videos

We constantly think of articles and videos that might spark Git people's interest, skill us up, or help build a stellar career.

A Guide to React Rendering Behavior
React Advanced Conference 2022
25 min
Top Content
React is a library for "rendering" UI from components, but many users find themselves confused about how React rendering actually works. What do terms like "rendering", "reconciliation", "Fibers", and "committing" actually mean? When do renders happen? How does Context affect rendering, and how do libraries like Redux cause updates? In this talk, we'll clear up the confusion and provide a solid foundation for understanding when, why, and how React renders. We'll look at: - What "rendering" actually is - How React queues renders and the standard rendering behavior - How keys and component types are used in rendering - Techniques for optimizing render performance - How context usage affects rendering behavior - How external libraries tie into React rendering
Don't Solve Problems, Eliminate Them
React Advanced Conference 2021
39 min
Top Content
Humans are natural problem solvers and we're good enough at it that we've survived over the centuries and become the dominant species of the planet. Because we're so good at it, we sometimes become problem seekers too–looking for problems we can solve. Those who most successfully accomplish their goals are the problem eliminators. Let's talk about the distinction between solving and eliminating problems with examples from inside and outside the coding world.
Speeding Up Your React App With Less JavaScript
React Summit 2023
32 min
Top Content
Too much JavaScript is getting you down? New frameworks promising no JavaScript look interesting, but you have an existing React application to maintain. What if Qwik React is your answer for faster application startup and better user experience? Qwik React allows you to easily turn your React application into a collection of islands, which can be SSRed and delayed hydrated, and in some instances, hydration skipped altogether. And all of this in an incremental way without a rewrite.
Utilising Rust from Vue with WebAssembly
Vue.js London Live 2021
8 min
Top Content
Rust is a new language for writing high-performance code that can be compiled to WebAssembly and run within the browser. In this talk you will be taken through how you can integrate Rust within a Vue application in a way that's painless and easy, with examples of how to interact with Rust from JavaScript, and some of the gotchas to be aware of.
React Concurrency, Explained
React Summit 2023
23 min
Top Content
React 18! Concurrent features! You might’ve already tried the new APIs like useTransition, or you might’ve just heard of them. But do you know how React 18 achieves the performance wins it brings with itself? In this talk, let’s peek under the hood of React 18’s performance features: - How React 18 lowers the time your page stays frozen (aka TBT) - What exactly happens in the main thread when you run useTransition() - What’s the catch with the improvements (there’s no free cake!), and why Vue.js and Preact straight refused to ship anything similar
The Future of Performance Tooling
JSNation 2022
21 min
Top Content
Our understanding of performance and user experience has heavily evolved over the years. Web developer tooling needs to similarly evolve to make sure it is user-centric, actionable, and contextual where modern experiences are concerned. In this talk, Addy will walk you through how Chrome and others have been thinking about this problem and what updates they've been making to performance tools to lower the friction for building great experiences on the web.

Workshops on related topic

React Performance Debugging Masterclass
React Summit 2023
170 min
Top Content
Featured Workshop (Free)
Ivan Akulov
Ivan’s first attempts at performance debugging were chaotic. He would see a slow interaction, try a random optimization, see that it didn't help, and keep trying other optimizations until he found the right one (or gave up).
Back then, Ivan didn’t know how to use performance devtools well. He would do a recording in Chrome DevTools or React Profiler, poke around it, try clicking random things, and then close it in frustration a few minutes later. Now, Ivan knows exactly where and what to look for. And in this workshop, Ivan will teach you that too.
Here’s how this is going to work. We’ll take a slow app → debug it (using tools like Chrome DevTools, React Profiler, and why-did-you-render) → pinpoint the bottleneck → and then repeat, several times more. We won’t talk about the solutions (in 90% of the cases, it’s just the ol’ regular useMemo() or memo()). But we’ll talk about everything that comes before – and learn how to analyze any React performance problem, step by step.
(Note: This workshop is best suited for engineers who are already familiar with how useMemo() and memo() work – but want to get better at using the performance tools around React. Also, we’ll be covering interaction performance, not load speed, so you won’t hear a word about Lighthouse 🤐)
React, TypeScript, and TDD
React Advanced Conference 2021
174 min
Top Content
Featured Workshop (Free)
Paul Everitt
ReactJS is wildly popular and thus wildly supported. TypeScript is increasingly popular, and thus increasingly supported.

The two together? Not as much. Given that they both change quickly, it's hard to find accurate learning materials.

React+TypeScript, with JetBrains IDEs? That three-part combination is the topic of this series. We'll show a little about a lot. Meaning, the key steps to getting productive, in the IDE, for React projects using TypeScript. Along the way we'll show test-driven development and emphasize tips-and-tricks in the IDE.
Web3 Workshop - Building Your First Dapp
React Advanced Conference 2021
145 min
Top Content
Featured Workshop (Free)
Nader Dabit
In this workshop, you'll learn how to build your first full stack dapp on the Ethereum blockchain, reading and writing data to the network, and connecting a front end application to the contract you've deployed. By the end of the workshop, you'll understand how to set up a full stack development environment, run a local node, and interact with any smart contract using React, HardHat, and Ethers.js.
Remix Fundamentals
React Summit 2022
136 min
Top Content
Featured Workshop (Free)
Kent C. Dodds
Building modern web applications is riddled with complexity. And that's only if you bother to deal with the problems!
Tired of wiring up onSubmit to backend APIs and making sure your client-side cache stays up-to-date? Wouldn't it be cool to be able to use the global nature of CSS to your benefit, rather than find tools or conventions to avoid or work around it? And how would you like nested layouts with intelligent and performance-optimized data management that just works™?
Remix solves some of these problems, and completely eliminates the rest. You don't even have to think about server cache management or global CSS namespace clashes. It's not that Remix has APIs to avoid these problems; they simply don't exist when you're using Remix. Oh, and you don't need that huge, complex GraphQL client when you're using Remix. They've got you covered. Ready to build faster apps faster?
At the end of this workshop, you'll know how to: create Remix routes, style Remix applications, load data in Remix loaders, and mutate data with forms and actions.
Vue3: Modern Frontend App Development
Vue.js London Live 2021
169 min
Top Content
Featured Workshop (Free)
Mikhail Kuznetcov
Vue 3 was released in mid-2020. Besides many improvements and optimizations, the main feature Vue 3 brings is the Composition API – a new way to write and reuse reactive code. Let's learn more about how to use the Composition API efficiently.

Besides core Vue 3 features, we'll walk through examples of how to use popular libraries with Vue 3.

Table of contents:
- Introduction to Vue3
- Composition API
- Core libraries
- Vue3 ecosystem

Prerequisites:
IDE of choice (IntelliJ or VS Code) installed
Node.js + npm
Developing Dynamic Blogs with SvelteKit & Storyblok: A Hands-on Workshop
JSNation 2023
174 min
Top Content
Featured Workshop (Free)
Alba Silvente Fuentes
Roberto Butti
This SvelteKit workshop explores the integration of 3rd party services, such as Storyblok, in a SvelteKit project. Participants will learn how to create a SvelteKit project, leverage Svelte components, and connect to external APIs. The workshop covers important concepts including SSR, CSR, static site generation, and deploying the application using adapters. By the end of the workshop, attendees will have a solid understanding of building SvelteKit applications with API integrations and be prepared for deployment.