Leveraging the Event Loop for Blazing-Fast Applications!


Some weeks ago I made my own implementation of signals, capable of updating the DOM 300k times in circa 600ms. Leveraging the microtask queue, I was able to bring that number down to 6ms, effectively improving performance by 100x. I want to share how the microtask queue works and how you can leverage the event loop to improve performance.

Michael Di Prisco
35 min
20 Oct, 2023

Video Summary and Transcription

This talk covers the event loop, the microtask queue, and a live demo. JavaScript is single-threaded but can perform tasks that seemingly only a multithreaded environment could. The event loop coordinates the call stack, the microtask queue, and the macrotask queue, which allow JavaScript to run non-blocking operations. Leveraging the microtask queue can lead to significant performance improvements in applications, such as React ones. However, it is important to use it correctly to avoid issues like infinite loops.


1. Introduction to Event Loop and Microtask Queue

Short description:

I'm Michael Di Prisco, known as Cadienvan, a senior developer at Jointly and an open source contributor. Today, we will cover three main topics: the event loop, the microtask queue, and a live demo. JavaScript is both single-threaded and not; we'll explore this further. The ECMAScript specification does not allow for multithreading.

I hope you have been enjoying London so far. It's raining, but it's a lovely city. So, as Jessica said, I'm Michael Di Prisco, known on the web as Cadienvan. I'm a senior developer at Jointly, I'm a junior father at home, and I'm also an open source contributor. But I'm not the main part of this talk, so first of all, sorry for my English; I'm Italian, so yeah, whatever.

And let's go to our table of contents. So what are we going to talk about today? We have three main topics to cover. Of course, we will not dive deep into all of them, because we don't have 90 minutes. First, what is the event loop? We will talk about it briefly. Then we will talk about the microtask queue; that will be our main topic, so we'll dive deep into it. And then we will go to the live demo. As we all know, live demos always go well during events. So I promise you'll learn something. Yeah. Someone had to make some bad jokes, and I decided that was me. And please raise your hand if you know what the microtask queue is. Okay. Yeah. Let's hope there are a few more hands later.

Okay. So let's start with a couple of premises, one step back before starting our real journey. Let's try to answer two important questions today. First: is JavaScript single-threaded? Yes and no. But yes. You know that many of us work with React, okay? So we work in a browser. Some of us also work in Node. And you know you can leverage multiple threads there. But the ECMAScript specification, let's say the list of rules around the language, does not allow for multithreading.

2. JavaScript Runtime and Engine

Short description:

So in fact, JavaScript is single-threaded. Yet it can do things that only a multithreaded environment could allow it to do. JavaScript runs in a box composed of two parts: the engine (e.g., Chrome V8) and the runtime (set of APIs on top of the engine). The browser and Node.js share the same parser and executor, but they have different APIs. For example, Node.js has the file system API, while the browser provides web APIs like the DOM.

So in fact, JavaScript is single-threaded. Yet it can do things that only a multithreaded environment should allow it to do. So first of all, let's answer this question: what is a JavaScript runtime? Let's say JavaScript runs in some sort of box, container, call it what we want, and it's mainly composed of two parts: the engine and the runtime. The engine is, for example, Chrome's V8 engine. It's the part that parses and executes our code, makes sure everything is in order, and launches all of our code. And then we have the runtime. The runtime is the set of APIs on top of what the engine can do. So let's take the most famous duo in the JavaScript ecosystem, which are, of course, the browser and Node, and we will specifically talk about the Chrome browser. Why? Because Chrome and Node share the same V8 engine. So we have something in common: the parser and executor of the code is the same. But, of course, Node and the browser are, well, different beasts, okay? In Node you have, for example, the file system API: you can call the file system, ask for files, directories, et cetera, which is something you can't do with the exact same API in the browser, because the browser provides its own set of web APIs. Think about the DOM: you can't interact with the DOM in Node, because it's not part of the engine. Or console.log, for example. Did you know there are two different implementations of console.log? It's not part of the engine; it's provided by the runtimes. Okay. So now we've settled those two questions: we said that JavaScript is single-threaded, yet Node isn't. Chrome isn't.

3. Understanding the Event Loop

Short description:

So, let's take a look at what the event loop effectively is, so we can better answer both those questions. We have three main parts: a call stack, a microtask queue, and a macrotask queue. The call stack is our current execution, while the two queues are used to free up the main thread. The event loop provides order among these three components. APIs like process.nextTick and queueMicrotask interact with the microtask queue, while setTimeout and setImmediate go through the macrotask queue. JavaScript guarantees run-to-completion operations and leverages queues to free up the thread.

So, let's take a look at what the event loop effectively is, so we can better answer both those questions. So, as you can see, this is a really awesome representation of the event loop by Lydia Hallie. And as you can see, we have three main parts: a call stack, a microtask queue, and a macrotask queue. Of course there are many other moving parts, but for the sake of simplicity this is enough. And we have this thing called the event loop right there that just provides some order among these three things.

So the call stack is our current execution. Okay? What our main thread is doing. And then we have the microtask queue and the macrotask queue, which are effectively used to free up our main thread from, you know, some hard computation or some non-immediately-executable functions. So this is how we can have a sort of multithreaded execution even in a single-threaded runtime. Okay? In a single-threaded execution. So let's look at this GIF going. As you can see, as soon as the call stack empties, things from the microtask queue are brought into the call stack, and only then do we go to the macrotask queue. So as you can see, there are some simple APIs we can use to interact with the microtask queue, which are process.nextTick, promise callbacks, async functions, and queueMicrotask. And then we have the macrotask queue, with setTimeout, setInterval, and setImmediate. And yeah, you read that correctly: process.nextTick is something that is done in the current tick, while setImmediate is effectively done later in time, because yeah, we are pretty good at naming things in the JavaScript ecosystem.
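That ordering can be seen in a few lines. This is a minimal sketch, runnable in both Node and the browser (process.nextTick is Node-only, so it is left out here):

```javascript
// Ordering demo: synchronous code runs first, then the microtask queue is
// drained (promise callbacks, queueMicrotask), then macrotasks (setTimeout).
const order = [];

setTimeout(() => order.push('macrotask: setTimeout'), 0);
Promise.resolve().then(() => order.push('microtask: promise'));
queueMicrotask(() => order.push('microtask: queueMicrotask'));
order.push('sync');

setTimeout(() => {
  // By the time this later macrotask runs, everything above has executed.
  console.log(order);
  // → ['sync', 'microtask: promise', 'microtask: queueMicrotask', 'macrotask: setTimeout']
}, 0);
```

Note that both microtasks run before the first setTimeout callback, even though the setTimeout was registered first.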

So let's talk about three main concepts of the event loop. And you see they are like a traffic light there. The first thing we have to mention is run-to-completion operations. So what do we mean by run-to-completion? Having a single call stack and a single thread of operations, JavaScript allows us to be sure that our execution goes from start to end before another one starts. So let's make it clear: there is no concurrency in JavaScript itself. We can reach some sort of concurrency in some ways. But, of course, this is one of the main concepts that makes the event loop work. Just one thing at a time can be done in our call stack. The second thing worth mentioning is leveraging queues to free up the thread. As we said before, we have the microtask queue and the macrotask queue, which are used to hold all the non-immediately-executable operations.

4. Blocking the Main Thread and I.O. Operations

Short description:

And so this frees up our thread and keeps our call stack going without blocking. The timeout isn't guaranteed. What do we mean by that? Take, for example, a setTimeout of 1,000 milliseconds: all these other things happen before the call stack gets our operation in place and executes it. Those three things are what make the JavaScript ecosystem non-blocking even while leveraging a single thread. JavaScript is effectively single-threaded; Node, Chrome, etc., allow us to free up our main thread. Let's start with the worst case scenario. We have two buttons: one is an I/O-blocking one, and the other one is a non-I/O-blocking one. As you can see, I can continuously click the non-I/O-blocking button and nothing happens, but as soon as I click the second one, everything breaks.

And so this frees up our thread, keeps our call stack going, and keeps our thread from blocking. The third thing is that the timeout isn't guaranteed. This is red because it's a red flag I usually see when developing applications myself. What do we mean by "the timeout isn't guaranteed"? Let's take, for example, a setTimeout of 1,000 milliseconds. OK? So we wait for 1,000 milliseconds and then we execute our operation. If we bring in some timestamps, OK, we will see that 1,000 milliseconds is the least amount of time it will take for our operation to be executed. Because, as we saw before, we have all these other things going on before the call stack gets our operation in place and executes it. So that's what we mean by "the timeout isn't guaranteed". And those three things are what make the JavaScript ecosystem non-blocking even while leveraging a single thread. Of course, when we are working in Node and in the browser, some multi-threaded operations can happen. But still, our main thread is only one, and only it can execute our call stack. OK? We have, you know, worker threads in Node; we have web workers for the web, et cetera. Still, JavaScript is effectively single-threaded, so the main operations go through the main call stack. So as we said, Node, Chrome, et cetera, allow us to free up our main thread. So I thought it would be interesting to have a look at how to block the main thread effectively. OK? So let's start with the worst case scenario. So here we have... Is this a llama or an alpaca? You know what the difference is? No? Well, one is not in the picture. So we have two buttons. Yeah, it's a llama, by the way. We have two buttons. One is an I/O-blocking one, and the other one is a non-I/O-blocking one. I have this image moving. Now I will make it start. And as you can see, I can continuously click the non-I/O-blocking button and nothing happens, but as soon as I click the second one, everything breaks.
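The "at least, never exactly" behaviour can be shown with a few lines. This is a sketch: the 1,500 ms busy-loop is my own choice, added to make the delay visible:

```javascript
// setTimeout(fn, 1000) guarantees *at least* 1000 ms, never exactly 1000 ms:
// the callback can only run once the call stack is empty again.
const start = Date.now();
let elapsed;

setTimeout(() => {
  elapsed = Date.now() - start;
  console.log(`fired after ${elapsed} ms`); // noticeably more than 1000 here
}, 1000);

// Block the main thread synchronously: the timer is due at 1000 ms but
// cannot fire until this loop releases the call stack at ~1500 ms.
const busyUntil = Date.now() + 1500;
while (Date.now() < busyUntil) { /* burning CPU on purpose */ }
```

The timer was due at the 1,000 ms mark, but it fires only after the busy-loop ends, around the 1,500 ms mark.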

5. I.O. Blocking Button and Unsafe Loop

Short description:

Let's look at our simple implementation of those buttons. We have two event listeners with a click event, and we call the unsafe and the safe loop. The unsafe loop function implements an infinite loop by continuously asking our call stack to bring something on the table. This can lead to filling up the call stack and causing the browser to become unresponsive.

Okay. So how can it happen? And how can we avoid it? Let's look at our simple implementation of those buttons. As you can see, we have two event listeners on the click event, and we call the unsafe and the safe loop. So let's start with our I/O-blocking button. As you can see, we are calling the unsafeLoop function. So what does the unsafeLoop function do? Well, effectively, nothing. But we are implementing an infinite loop. Why is that? Because even if there are no operations between the braces, we are still continuously asking our call stack to bring something on the table. So we are effectively going to occupy our call stack until Chrome, for example, is kind enough to tell us: hey, please stop. But in the past, your computer would just burn. So this is the first case.
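In code, the setup looks roughly like this. The element ids, the document stub, and the iteration cap are my own additions so the sketch can run and terminate outside a browser; the real demo's loop has no cap and truly never exits:

```javascript
// Minimal stand-in for `document` so this sketch also runs outside a
// browser; in the real demo these are actual DOM buttons.
const listeners = {};
const document = {
  getElementById: (id) => ({
    addEventListener: (event, handler) => { listeners[id] = handler; },
  }),
};

// The talk's unsafe loop is literally `while (true) {}`: it never yields
// back to the event loop, so clicks, timers, and repaints all starve until
// the browser offers to kill the tab.
function unsafeLoop(maxIterations = Infinity) {
  let i = 0;
  while (true) {
    i++;
    if (i >= maxIterations) break; // demo-only escape hatch
  }
  return i;
}

document.getElementById('io-blocking').addEventListener('click', () => unsafeLoop());
```

With the default `Infinity` cap, clicking the button really does freeze the page, exactly as in the talk.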

6. Macrotask Queue, Microtask Queue, and Live Demo

Short description:

The microtask queue acts as a last step before the next tick begins. We can still ask our call stack to be filled up with things to do before re-rendering the page. Today, we will build a live demo with four steps: project scaffolding, a JS class implementation, benchmarking, and implementing the microtask queue. Let's see some code.

Then we go to the non-I/O-blocking button, which does almost exactly the same thing, but calls the setTimeout API. And remember, setTimeout goes through the macrotask queue, and this is an important difference we will cover later. The macrotask queue effectively acts in a different way from the microtask one. Let's think of the event loop as a circle, okay, with a dot in it doing a 360. Okay? And whenever it reaches the end, it starts another tick. In the browser, that tick can end with the repaint of the page. And when we are leveraging some macrotask API like setTimeout, we are effectively asking our dot to complete its tick; then, when the new one starts, we look whether there is something in the macrotask queue to bring into the call stack. That's why we can effectively create a non-infinite loop: even if this execution keeps going, our main thread will still be free, because we are asking our tick to end every time before calling our safe loop function again.
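The non-blocking variant can be sketched like this. The step counter and the stop flag are mine, added so the sketch can halt on its own; the real demo just keeps spinning harmlessly:

```javascript
let steps = 0;
let stopped = false;

// Same endless recursion, but each step is scheduled as a *macrotask*:
// between two steps the tick can end, so the browser is free to repaint
// and handle clicks. The main thread is never held hostage.
function safeLoop() {
  if (stopped) return;
  steps++; // ...a small unit of work per tick...
  setTimeout(safeLoop, 0); // re-queue ourselves for a future tick
}

safeLoop();

// Demo only: let it spin for ~50 ms, then stop it.
setTimeout(() => {
  stopped = true;
  console.log(`ran ${steps} steps without ever blocking the thread`);
}, 50);
```

The key design choice is that each iteration yields: the recursion happens across ticks, not within one call stack.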

So then we can go back to our initial question: what is the microtask queue? Well, this is probably one of the most misunderstood parts of the event loop, because it acts in a specific part of it. Let's take the example from before: we have this circle with this dot going around 360 degrees, and whenever it ends, it starts a new tick. The microtask queue effectively acts as a last step before the next tick begins. So we are in the current tick, but we can still ask our call stack to be filled up with things we have to do in the last part before, for example in Chrome, re-rendering the page, okay?

So far so good, but what are we building today? I promised you a live demo, and we have four steps to victory. The first one is the project scaffolding, a simple HTML page. The second is a basic signal implementation leveraging JS classes. I know this is not the best way to do it, but for the sake of simplicity I decided to go for classes, because we can still see a lot of improvement there. The third part is a bare-bones benchmark, so we will track the time needed for, I don't know, 100,000 updates, 1 million updates, whatever. I have an M1 Max, so we can go up to 10 billion. And the fourth part will be, of course, implementing the microtask queue. So let's see some code. I prepared some simple scaffolding, so the first step is done. We have this simple Chrome page. Can you all see it? Okay. Is it good? Fine. Good. So we have our simple paragraph, and we will do the best anti-pattern ever: we will put the script inside of the HTML. You know, like the good old days when we did that.

7. Creating a Signal Class

Short description:

Let's start by creating a new class called a signal. It will have an HTML element and a value attached to it. The constructor will take an element and an initial value. The render function will update the HTML element with the new value. Finally, we attach the signal to the HTML element and create a new instance of the signal class.

So let's start by creating a new class, which is a signal. Yeah, let's be clear: classes, as you all know, are syntactic sugar, but there are many backgrounds here; not all of us started with React, with JavaScript, et cetera. So this is the easiest way to go. We will go simple.

So our signal implementation will have just an HTML element and a value attached to it, plus the possibility to change this value by updating the object. I know I could use private properties, like with the hash prefix, et cetera, but I don't want to do that; I want to keep it simple. So we have some, let's call them, internal values.

We will have our constructor, which will take an element and an initial value, and we will save them. Yeah, thank you, Copilot. Okay. So we have our simple constructor. Of course it's doing nothing yet. And we have our render function. Our render function will simply take our element and set some innerHTML on it. Okay. You remember when it was easy to just assign some innerHTML to the page? And now you have to keep a flag, you know, set it to true, are you sure you're doing that? Okay. So we are calling the this.render function and we are done.

Now we have, of course, to attach our signal to the HTML element, the paragraph. So let's call it londonParagraph. Okay. And, of course, we create a new variable called myLondonSignal, if I guess correctly. Oh yeah, thank you again, Copilot. So we are calling new Signal; we are calling our constructor with document.getElementById for the London paragraph and our initial value. Okay.
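At this point the class looks roughly like this. It is a sketch of the demo code, not a transcript of it: the element stub lets it run outside a browser, and the id and initial value are my guesses at what was typed on stage:

```javascript
class Signal {
  constructor(element, initialValue) {
    // "Internal" values, kept public for simplicity (no #private fields).
    this._element = element;
    this._value = initialValue;
    this.render();
  }

  render() {
    // Good-old innerHTML assignment, as in the demo.
    this._element.innerHTML = this._value;
  }
}

// In the browser this would be:
//   new Signal(document.getElementById('londonParagraph'), 'London is awesome');
const paragraph = { innerHTML: '' }; // stand-in DOM node for this sketch
const myLondonSignal = new Signal(paragraph, 'London is awesome');
```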

8. Signal Implementation and Benchmarking

Short description:

So let's see if it works. Of course it's not working, because we are not calling the render function. So now it works. London is awesome. We have our signal implementation with a value getter and setter. If I try to change the value of this signal, it works. Now let's do some benchmarking. We save our start time and update the value 100,000 times, which takes 70 milliseconds.

So let's see if it works. Of course it's not working because we are not calling the render function. How stupid am I? So, yeah, so let's try to... Okay. So now it works. London is awesome. Woo hoo.

Okay, so we have our signal implementation, and of course we have to bring some getter and setter magic here. So we have a get value, which we can leverage later, that will return our internal value. And of course a set value that, given a new value, will effectively update our internal value and call our render function. Okay? So far so good. If I try to... how did I call it? myLondonSignal. If I try to change the value of this signal, it works. So it's a pretty simple signal implementation, and now we can do some benchmarking around it.

Please, Copilot, help me, because I never remember the Date.now API. Okay. So we save our start time. For the sake of simplicity, I'm going for Date; I'm not going for some micro-time measurement, et cetera. So let's go for, yeah, 100,000 updates. Did you know you can write integers like 100_000, with numeric separators, in the newer versions of the ECMAScript specification? Yeah. It's really awesome. Okay. Let's try to update this value 100,000 times, and let's see how long it takes to update those values. Okay. So as you can see, it's taking 70 milliseconds.
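Putting the getter/setter and the benchmark together, the naive version looks roughly like this (the element stub and the update strings are my own; timings will of course vary by machine):

```javascript
class Signal {
  constructor(element, initialValue) {
    this._element = element;
    this._value = initialValue;
    this.render();
  }
  get value() { return this._value; }
  set value(newValue) {
    this._value = newValue;
    this.render(); // naive: one render per update, the slow path
  }
  render() { this._element.innerHTML = this._value; }
}

const el = { innerHTML: '' }; // stand-in for the real paragraph element
const signal = new Signal(el, 'London is awesome');

// 100_000 uses the numeric-separator syntax mentioned above (ES2021).
const start = Date.now();
for (let i = 0; i < 100_000; i++) {
  signal.value = `Update ${i}`;
}
console.log(`${Date.now() - start} ms for 100k updates`);
```

Every single assignment triggers a full render here, which is exactly the cost the microtask queue will let us amortize.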

9. Leveraging the Queue Micro-Task API

Short description:

Of course, we are on a pretty powerful machine, so let's bring up the game a bit and go for 1 million. Yeah, we're getting close to a second: 600 milliseconds. Okay, so now we are ready to leverage our queueMicrotask API. First of all, we are going to talk about a queue: we are effectively putting this rendering operation in a queue. In our constructor, we will initialize an isInQueue value to false. If this is already in queue, return; but if not, window.queueMicrotask is just as easy as that. So there we have it. This is something to be aware of, because the render function is not the best place for this kind of thing; we have a better place, which is the set value.

Of course, we are on a pretty powerful machine. So let's bring up the game a bit. Let's go for 1 million. Yeah, we're getting close to a second. Okay, 600 milliseconds. We can update it a couple of times. Yeah? Okay, so now we are ready to leverage our queueMicrotask API, and how hard can that be? Well, it's pretty simple, in fact. So first of all, we are going to talk about a queue. Okay, we are effectively putting this rendering operation in a queue. So let's say isInQueue, just a simple internal value we can leverage later. In our constructor, of course, we will initialize it to false. Our render function is not being touched for now, and we are going to do the magic right here. So: if this is already in queue, return. But if not, we set isInQueue to true and call window.queueMicrotask; it's just as easy as that. And we can pass a function here that renders and then sets isInQueue back to false. So whenever we call render again before the microtask runs, nothing gets queued twice. Okay. So there we have it. If I didn't do anything wrong here... yeah, okay, I did something wrong. It's not a problem; I moved it to the wrong place. This is something to be aware of, because the render function is not the best place for this kind of thing; we have a better place, which is the set value. And of course, we are going to go back to our initial implementation. Okay, this.value. And here we are going to say: if not this.isInQueue, window.queueMicrotask. Let's write it again, just to be sure we don't get anything wrong here: this.isInQueue equals true, this.value.
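Cleaned up, the batched version described above looks roughly like this. The render counter is my own addition, included only to make the batching visible; the isInQueue flag follows the talk:

```javascript
class Signal {
  constructor(element, initialValue) {
    this._element = element;
    this._value = initialValue;
    this._isInQueue = false;
    this._renders = 0; // demo-only counter, to show batching at work
    this.render();
  }
  get value() { return this._value; }
  set value(newValue) {
    this._value = newValue;
    if (this._isInQueue) return;   // a render is already scheduled
    this._isInQueue = true;
    queueMicrotask(() => {
      this.render();               // runs once, with the *latest* value
      this._isInQueue = false;     // accept a new flush afterwards
    });
  }
  render() {
    this._renders++;
    this._element.innerHTML = this._value;
  }
}

const el = { innerHTML: '' }; // stand-in DOM node
const s = new Signal(el, 'start');
for (let i = 0; i < 1_000_000; i++) s.value = `Update ${i}`;
// One million synchronous updates, but only ONE extra render: it happens
// in the microtask that runs right after this synchronous block ends.
```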

10. Using the Microtask Queue and Performance Benefits

Short description:

And this.render. So let's remove those from here. And okay. Yeah. Let's refresh. Okay. As you can see, we had some performance benefit, not as much as we imagined, but of course that's down to the simplicity of this demo. You can do this in many parts of your applications, and if you do it properly, not as I'm doing right now with one minute left to complete this demo, you can effectively reach a 100x performance improvement on your signal implementation.

And this.render. So let's remove those from here. And okay. Yeah. Let's refresh. Okay. As you can see, we had some performance benefit, not as much as we imagined, but of course that's down to the simplicity of this demo. And of course you can do this in many parts of your applications. And if you do it properly, not as I'm doing right now, with one minute left to complete this demo, you can effectively reach a 100x performance improvement on your signal implementation.

So if what I explained until now was good enough, that was a really unexpected result. If not, I'm sorry. You can have a look at Jake Archibald's "In the Loop" talk, which is really awesome and goes even deeper into the microtask queue and how its inner workings interact with our event loop. And now that we've seen, like, a 3x or 4x improvement in our performance, hey, it's really awesome; it's like a quarter of the time we needed before. I'll ask you again: please raise your hand if you know what the microtask queue is now. Yeah, at least one more than before.

Okay. So, why don't we always use the microtask queue, you may ask. We saw that this implementation is easy; it's a simple API. In some other demos of other implementations, I literally went from 600 milliseconds to 6 or 7 milliseconds, and I'll have a QR code later. So, why don't we always use the microtask queue? Well, as we like to say in Italy, all that glitters is not gold. As we saw before, the microtask queue acts before the next tick begins. So what does this mean? That if we go to our previous implementation of the safe loop and we just change it to a Promise.resolve().then(), we are effectively creating a new infinite loop. Why is that? Because Promise.resolve() puts the safe loop function into the microtask queue. Our dot goes to the end of our event loop and says: okay, microtask queue, do you have something for me? Yeah, sure, take this safe loop. It's a Promise.resolve().then(), so bring it into the call stack. Let's do this again.
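A sketch of that trap, capped at 1,000 steps so it terminates here; without the cap it spins forever inside a single tick:

```javascript
let iterations = 0;

// Swapping setTimeout for Promise.resolve().then() re-queues the function
// as a *microtask*. The microtask queue is drained completely before the
// tick can end, so each run schedules the next and the tick never finishes:
// no repaint, no clicks, an infinite loop in disguise.
function notSoSafeLoop() {
  iterations++;
  if (iterations >= 1000) return; // demo-only cap
  Promise.resolve().then(notSoSafeLoop);
}

notSoSafeLoop();

setTimeout(() => {
  // This macrotask only runs after the WHOLE microtask chain has drained.
  console.log(`${iterations} microtask steps ran before the next tick`);
}, 0);
```

All 1,000 steps complete before the setTimeout callback ever gets a chance to run, which is exactly why the uncapped version freezes the page.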

QnA

Microtask Queue and Use Cases

Short description:

Do we have something? Yeah. You can leverage the microtask queue for last steps before refreshing or repainting your page. Use cases include batching operations and sorting arrays. Check out the prototype-based signal implementation on my GitHub for a hundred times performance improvement. Now, let's move on to the questions.

Do we have something? Yeah. And we are effectively creating a new infinite loop. So, please be aware: you can leverage this to do some last steps before refreshing or repainting your page, but you can still run into infinite loops there. As I like to say, the event loop is the only infinite loop you want to have in your app. So, please remember what we said about the microtask queue today.

And, yeah, I know my 20 minutes have passed, but I'm going for the questions, so I guess it's fine. You may ask about use cases, because, of course, you don't need to update your paragraph one million times on your page. Well, everything that can be batched together as an operation can literally be put into a microtask to be executed just once. Let's take, for example, an array of names that you have to sort whenever a new one comes in, say from a CSV import in your page. You bring in the CSV, take all the names, and every time a new name comes in, you rearrange the array. With this implementation, you can simply say: did I already ask for my array to be re-sorted? Okay, so just do it once in the microtask queue. And maybe in the microtask I can put a useState somewhere. Okay, it's a React conference, so let's put things into a state. You put things into the state, and then the operation is done just once. So, if you want to see a better implementation, and a more wow one, because it's literally a 100x change, you can look at a simple repository I created, which is a super simple signal, of course. It's similar to this one, but it uses a prototype-based approach. It's on my GitHub; you can have a look at it. It's simple enough, just one page of JavaScript, like 40 lines of code if I'm not wrong.
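That CSV use case can be sketched in a few lines. The names, identifiers, and flush shape here are my own illustration of the idea described above:

```javascript
const names = [];
let sortScheduled = false;

function addName(name) {
  names.push(name);
  if (sortScheduled) return; // one sort per flush is enough
  sortScheduled = true;
  queueMicrotask(() => {
    names.sort();            // runs once, after all synchronous pushes
    sortScheduled = false;
    console.log('sorted once:', names);
  });
}

// Simulated CSV import: three insertions, but only ONE sort.
addName('Charlie');
addName('Alice');
addName('Bob');
```

Same pattern as the signal: N synchronous updates collapse into a single deferred operation.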

And here we are for the questions. Thank you very much, Michael. Thank you. Yeah.

Using the Microtask Queue and Its Potential Issues

Short description:

Should the microtask queue always be used outside the render function? Not always. It's important to find a compromise between pre-rendering and post-rendering tasks. Many people use the microtask queue without understanding its role in the event loop. Promises heavily rely on the microtask queue, and using it incorrectly can lead to infinite loops. It may seem counterintuitive, but promises can execute immediately, not necessarily later.

That was excellent. Thanks so much for taking us through that. I think the thing that got me thinking the most was, of course, the use cases. I think we should go straight to some of the questions that came in around using the microtask queue. Let's quickly highlight this one: should we always use the microtask queue outside the render function?

Should we always use the microtask queue outside the render function? No, not always. That's what I was trying to say in my last slide. So please be aware of the benefits it can provide. But of course remember we also have the macrotask queue, which we can leverage when we don't need to do things before the rendering. So just find a good compromise between what I need before rendering and what can wait until after the rendering phase.

Nice. And on that point, why do you think the microtask queue is so often potentially underutilized? I won't say unutilized, because promises heavily rely on the microtask queue. But many people use the microtask queue without effectively knowing in which part of the event loop it acts. That's the issue I was talking about before. Why don't we always use microtasks? Because they are effectively another way to create an infinite loop, which is not something you get with the macrotask queue. So you should be aware: the promises implementation is totally based on the microtask queue. If you don't know how the microtask queue works, it's easy enough to get into an infinite loop you can't understand, because you're thinking: yeah, okay, it's a promise, it's later in time. Yes, but not later in the event loop if you resolve it immediately.

Makes sense, and it feels quite counterintuitive. Yeah. Yeah, because it's not effectively later. It can be, of course, if the promise is, I don't know, an AJAX call, et cetera, but it can still resolve immediately. There are many libraries which rely on promises but execute immediately.

Handling Promises and Micro-tasks in React

Short description:

You can find yourself in an infinite loop if you don't know where this acts. There are some small differences in handling promises and micro-tasks between browsers. Batching operations have been the most useful use case for improving performance in React.

You know, just for the sake of simplicity, you can have, I don't know, an HTTP call and a mock for your local implementation which doesn't perform the HTTP call but resolves a promise immediately. Okay. So, let's think about mock servers. They usually do that with the mock APIs. And so, yeah, you can find yourself in an infinite loop if you don't know where this acts. And that quite easily leads on to our next question, which is highlighted on the screen behind us.

Are there any differences in handling promises and microtasks between browsers? Yeah, I'm not an expert in browser engines, okay. But I know there is a bug in Safari. Oh. Yeah, where in some cases promises are delayed into the macrotask queue. And there is also a safety mechanism in Chrome where, if its parser can detect the possibility of an infinite loop, it moves the promises into the macrotask queue. Okay. So, yeah, there are some small differences, but of course those are just some specific edge cases that you should rarely, if ever, see. And realistically, you'll pick those up as you develop in a browser. Yeah. Yeah.

I really like this next one as well, actually. I'm just gonna bring this one up. It would be nice to understand the variety of contexts within the React world where we can utilize this to improve performance. So it's not exactly a question, but I think it's a bit of props. Yeah. Excuse the horrible pun. So what sort of use cases have you found the most useful? Well, as I said before, it's the batching operations. You know, I've been working with React for the last couple of years at least, and I found myself struggling to find the best way to improve performance. And when I discovered queueMicrotask, I found so many ways I could avoid relying on the useState callback mechanism. You know, when you have a state, and then you have an effect on that state that calls another setState, et cetera. Using queueMicrotask, you can leverage something that is built into the browser, okay? It's the event loop. It's how JavaScript works. So you don't have to rely on some magic from the React side, you know, with Suspense, Fiber, et cetera. You don't have to go to that magic world where only Dan Abramov can understand things. And you can stay on the ground and say: okay, I know how it works.

Event Loop and Error Handling in JavaScript

Short description:

I don't care how React handles it. I know I can do this batching operation, this sorting operation, you know, these set of things that I need before rendering without asking React to re-render my component four, five, six times. The event loop handles error handling and propagation in JavaScript applications similarly to how try-catch mechanisms work. Using microtasks can be abused and cause performance issues, but if used correctly, they can be beneficial. The main use case for microtasks in React is to prevent unnecessary re-rendering of the page.

I don't care how React handles it. I know I can do this batching operation, this sorting operation, you know, this set of things that I need before rendering, without asking React to re-render my component four, five, six times. I was going to say, it's quite common. I wonder whether it depends on where your experience lies. It depends if you end up working with React or against React, or working with JavaScript or against JavaScript.

Can you rephrase it again? Oh, you're talking about using the inbuilt method within the browser. Yep. Just relying on the browser, because it doesn't change. By the way, if anyone wants to add any questions, if you've got the Slido app, because I know a lot of people have downloaded Slido for various town halls or events that you've been using, if you use the code, I think it's 2010, you'll be able to jump into the session, choose the track, and add your question to the list here. So if any are kind of coming up that you think, I actually think I want to ask that for my work right now, that's a good way to kind of enter the queue.

So we've got up here next: how does the event loop handle error handling and propagation in JavaScript applications? Well, effectively, there is no difference from how you normally handle errors with a try-catch mechanism. Because as we said before, we are staying on the ground, on the event loop's normal way of working. So if you have a promise, you can still use .catch() for error handling; there is nothing special about error handling with queueMicrotask. You can still use a try-catch, you can use async/await. You can also, even if it's not recommended, have an async function called inside the queueMicrotask callback. And as I said before, I would not recommend it, because some browsers like Chrome can detect it and move it into the macrotask queue to prevent the main thread from being blocked by a too-heavy async call.
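The point above is that nothing about microtasks changes JavaScript's usual error-handling tools. A small sketch of both mechanisms, using only standard APIs:

```javascript
// Error handling inside microtasks works the same as anywhere else.
const handled = [];

// 1) A try/catch inside the queueMicrotask callback catches
//    synchronous throws, exactly as in any other function.
queueMicrotask(() => {
  try {
    throw new Error('boom');
  } catch (err) {
    handled.push(err.message); // caught; nothing escapes
  }
});

// 2) Promise rejections are handled with .catch(), as usual —
//    the .catch callback itself runs as a microtask.
Promise.reject(new Error('bang')).catch((err) => {
  handled.push(err.message);
});

// Caveat: an error that escapes a queueMicrotask callback becomes an
// uncaught error, just like a throw inside a setTimeout callback would.
```

Both handlers run before the next macrotask, so by the time a `setTimeout(..., 0)` callback fires, `handled` contains both messages.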

I think you've almost inadvertently answered our next question that we've highlighted here. The suggestion is that we shouldn't, in general, use microtasks, as they can be abused and cause performance issues on clients' computers. Yeah, that's true, but it's also true that you shouldn't have five useEffects for six setStates that re-render the page five times. So, you have to be aware at least of how you are leveraging the event loop, and know that there is the microtask queue, which of course can be abused. But if you know what you are doing, I think you are safe. So I think that's the point of my talk: if you know what you are doing, I think you are safe. Nice, I think we might have time for just one or two more questions. The next two here, we've got: can you apply this to React context, and will it work in React functions? So I think they're asking about a similar bit of context there. As I said, the best use case I find is to avoid relying on all these effect and state callbacks. Okay. So this is the most common use case I see for the microtask queue. You can effectively prevent the page from re-rendering uselessly when it would otherwise re-render three, four, five times. Yeah. So that's the main use case I see at a glance for React. I wonder whether or not we should...

Browser Queues and VS Code

Short description:

The browser has many queues, each with its own implementation of how it interacts with the event loop. In a simple event loop environment, there are usually just two queues. The Chrome UI interaction queue is an example where it prioritizes rerendering the page before processing the microtask queue. We will reconvene after a break at 11.30. If anyone has burning questions or if there are any remaining questions of interest, feel free to raise your hand. The IP entity being used is VS Code, Visual Studio Code, developed by Microsoft.

Have you got a preference on the next question you'd want to take here? Because I feel like I've been driving you through a couple of fundamentals.

Yeah. The first one is nice and worth mentioning. So it is... It's also worth mentioning that the browser has many queues. Yeah. That's right. Yeah. Of course, we have some concept of prioritization. That's what I was saying before when I showed the Lydia Hallie gif. Because there are so many moving parts, and the browser has its own implementation of how it interacts with the event loop. That is, of course, different from Node, from Bun, from Deno, from Safari's runtime, et cetera.
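Whatever extra queues a given runtime adds, the basic ordering guarantee is the same everywhere: synchronous code first, then the entire microtask queue, then the next macrotask. A minimal demo of that ordering, which behaves identically in Node, Bun, Deno, and browsers:

```javascript
// Demonstrates the "two simple queues" ordering:
// microtasks (queueMicrotask, promise callbacks) always drain
// before the next macrotask (setTimeout).
const order = [];

setTimeout(() => order.push('macrotask'), 0);       // macrotask queue
queueMicrotask(() => order.push('microtask'));      // microtask queue
Promise.resolve().then(() => order.push('promise')); // also a microtask
order.push('sync');                                  // runs immediately

// Once everything has run:
// order === ['sync', 'microtask', 'promise', 'macrotask']
```

Note that the two microtasks run in the order they were enqueued; they share a single queue, which is drained completely before the timer callback gets a chance.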

Yeah, it's worth mentioning that there are usually just those two queues in a simple event loop environment. But, of course, there can be many. As an example, there's the Chrome UI interaction one where, of course, it detects those infinite loops we were talking about before and gives the microtask queue a lower priority, so it re-renders the page before going back into the microtask queue. Yeah. I think we are getting to the time where there will be a break for everyone, and we'll be reconvening at 11.30, but I just wonder whether anyone had any burning questions in the room, or if any of the last few remaining questions are really calling out to you. If we got... Feel free to raise your hand if you've got anything that... Oh, front row. Front row, what's the name of the IDE you're using? I like your tokens. It's VS Code. It's Visual Studio Code. VS Code. VS Code. Yes. Yeah. Microsoft's... And you were making great use of Copilot there. Yeah.

Microtask Queue and Rendering

Short description:

I love that. The main concept of the microtask is that it acts slightly before the rendering. When you have 1000 updates to do, you are effectively re-rendering the page many times. The microtask queue acts just once, which significantly reduces the time difference. Thanks, Michael, for the talk.

I love that. Yeah. Because you know, I have something like 20 tries of this talk and yeah. So that's why the Copilot knows how to help me in this situation.

So there is one question I'd like to answer and it's the first one. So about the fact that the render happened after I finished and timed my loop. Yeah. That's the main concept of the microtask. So it acts slightly before the rendering. So that's why.

So when you have 1,000 updates to do, for our example, for our signals implementation, you are effectively re-rendering the page 100,000 times, a million times. And of course, on an M1 Max, you don't see that much of a difference. But I can guarantee you, when I first did this trial, this example, on my own laptop, which is an M1, so not a bad machine, the time difference for that exact same implementation, so with the classes, which is of course not the best one you can see, I literally went from two seconds to 30 milliseconds. So yeah, it's correct that the microtask queue acts just once. That's why we put the isInQueue boolean there, just to say, okay, just do it once. That's the main difference. Brilliant. Well, thanks very much, Michael, really appreciate it, and thanks for the talk.
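The `isInQueue` trick described above can be sketched as a tiny signal that notifies its subscribers at most once per microtask, no matter how many writes happen synchronously. This is an illustrative sketch, not the speaker's actual implementation; `createSignal` and `subscribe` are assumed names.

```javascript
// A minimal signal with the isInQueue guard: N synchronous writes,
// one subscriber notification, delivered just before rendering.
function createSignal(initialValue) {
  const subscribers = new Set();
  let value = initialValue;
  let isInQueue = false; // the guard boolean from the talk

  return {
    get: () => value,
    set(next) {
      value = next;
      if (!isInQueue) {
        isInQueue = true; // enqueue the flush only once
        queueMicrotask(() => {
          isInQueue = false;
          // Subscribers see only the final value of this burst.
          subscribers.forEach((fn) => fn(value));
        });
      }
    },
    subscribe: (fn) => subscribers.add(fn),
  };
}

const count = createSignal(0);
let notifications = 0;
count.subscribe(() => notifications++);

for (let i = 0; i < 1000; i++) count.set(i); // 1,000 writes...
// ...but the subscriber runs once, with the final value 999.
```

Without the guard, every `set` would enqueue its own flush and the subscriber (and, in a browser, the DOM update) would run 1,000 times; with it, the entire burst collapses into one notification.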
