Building a Web-App: The Easy Path and the Performant Path. Why Are They Not the Same?


We use frameworks to make building our applications easier. Yet as the application scales, its performance suffers. There is no one thing, but rather a death by thousand cuts. Developers are under pressure, and they often choose the easy and quick path to deliver a feature rather than the performant path. The performant path is usually more work. So let's look at these two paths and imagine a world where the performant path is the quick and easy path.

31 min
01 Jun, 2023



Video Summary and Transcription

Miško Hevery introduces himself and discusses the impact of JavaScript on performance. The concepts of reconciliation, hydration, and resumability are explored, along with the importance of clean code and compiler optimization. The talk includes demos of application components and showcases the power of code extraction. The Qwik framework is highlighted for its ability to optimize code loading and prioritize interactions. The service worker is used to selectively download components for improved performance. SEO and debugging in Qwik are also discussed, along with comparisons to other frameworks.


1. Introduction to Miško Hevery and Performance

Short description:

Let's get started with a joke about how functions break up. I'm Miško Hevery, the creator of AngularJS and Qwik. We also have Partytown and Mitosis. Let's talk about performance and how JavaScript can impact it.

So, with that, let's get started. Thanks, guys. Thank you. I think we also need to give applause to our emcees. I could not do their job. So I'm going to start with a joke, because I love dad jokes since I'm a dad. So how do functions break up? They stop calling each other. And it's actually a relevant joke, because we're going to show you how functions break up in this presentation.

So hi, I'm Miško Hevery. You might know me because I've done this thing called AngularJS, and now I'm working on this thing called Qwik at Builder.io, which hopefully you've heard of. It's a headless visual CMS system. Imagine Wix, but not hosted; instead, you npm install it inside of your application. You drag it in and then you get visual editing. And because it's your application, you can also register your own components with it and have your marketing people go wild. They don't have to bug you, the engineer, about changing anything on their landing pages.

Now, we do other things too. We do Qwik. But we also have this thing called Partytown, which moves third-party code into web workers. And we do Mitosis, which allows you to write your code once, and we generate canonical code for React, Angular, Vue, Svelte, and anything else that you can possibly imagine. But let's talk about performance. So this is a typical, randomly selected set of websites from the web. And notice, they're all kind of green, maybe some yellow. It really isn't looking that good. Why is that? If you build a simple hello-world app and you push it somewhere, the performance is great, but once you put a real application, you know, real traffic behind it, the performance doesn't go so well. And there's a lot of different reasons for it, but one thing I'm really going to try to convince you of is that it is JavaScript. And basically, too much of it.

2. JavaScript and Hydration

Short description:

This is a chart from the HTTP Archive showing the increasing amount of JavaScript being sent to browsers over time. Users expect complex applications, which require JavaScript. The more JavaScript shipped, the lower the Lighthouse score. Hydration is a workaround that creates a problem. Previously, applications would boot by sending empty HTML and loading JavaScript. To eliminate the white screen, server-side prerendering was introduced, but it lacks interactivity until the JavaScript is downloaded and executed.

And if you look at this chart from the HTTP Archive, this is the amount of JavaScript that we have been sending to our browsers over time. And as you can see, it's just going up, up, up. And I'm going to make a bet that in the future, there's going to be even more JavaScript. And it totally makes sense, because our users expect complicated, rich applications, and you cannot deliver complex applications without JavaScript. And so we need JavaScript.

You know, there isn't a world where we stop shipping JavaScript. Or is there? So this is another interesting graph from the HTTP Archive. I've selected a few frameworks here; which ones is not important. What I want to show you is that the median Lighthouse score that a website gets and the amount of JavaScript being shipped are essentially the inverse of each other, right? The more JavaScript you ship, the lower your Lighthouse score. And the less JavaScript you ship, the better your Lighthouse score. That shouldn't be surprising. This should be self-evident, right? That the less JavaScript you ship, the better the thing will be. The problem is that the way our applications work is we have this thing called hydration.

And hydration is this interesting workaround we have created, and it is creating this problem. So let me explain. Back in the day, before we had meta-frameworks like Next.js, the way applications would boot is we would send HTML. The HTML would be empty. And in the HTML there would be a script tag that would load JavaScript. The JavaScript would execute your application, the application would render, and the render would produce the website, and now you can interact with it. But we said, you know what? We really don't like the fact that there is this white screen for several seconds. We really want to get rid of that. So we said: we know the solution to that. We're just gonna do server-side prerendering. So we now send a bigger HTML, notice the HTML got bigger, and now the page isn't white. It's the actual application that you have. But guess what? You cannot click on it. It appears faster, which is great, but you can't have any interactivity on that page yet. So at this point we download the JavaScript and execute the application.
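The two boot strategies described above can be sketched with two hypothetical HTML payloads. This is only an illustration (the markup and file names are made up): in both cases the full bundle still ships, server-side prerendering only moves the first paint earlier.

```typescript
// Client-side rendering: the server sends an empty shell; content appears
// only after /bundle.js downloads, executes, and renders.
const csrHtml =
  `<html><body><div id="app"></div>` +
  `<script src="/bundle.js"></script></body></html>`;

// Server-side prerendering: the markup the app would render is already in
// the HTML, so paint is fast -- but it is inert until hydration finishes.
const ssrHtml =
  `<html><body><div id="app"><button>Buy</button></div>` +
  `<script src="/bundle.js"></script></body></html>`;

// Crude check: is anything visible before the JavaScript runs?
const visibleBeforeJs = (html: string): boolean => html.includes("<button>");
```

Note that `visibleBeforeJs` is false for the CSR payload and true for the SSR payload, which is exactly the trade the talk describes: pixels earlier, interactivity no earlier.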

3. Reconciliation, Hydration, and Resumability

Short description:

We're no longer calling it render; we're now calling it reconciliation or hydration. Interactivity arrives a little later because the whole thing got shifted. Hydration works by visiting all the children to find the click listeners. People are exploring partial hydration, small islands, or React Server Components. Resumability allows immediate interaction with HTML containing listener information. The JavaScript is smaller because duplicates are removed. The application shows up faster and is quicker to interact with. In the resumable world, listeners are the key.

We're no longer calling it render; we're now calling it reconciliation or hydration, but really it's the same exact thing, we're just reusing the DOM nodes. And then we essentially end up with the same exact page, and only now can you click on it. So this is actually slower, because notice the whole thing got shifted. It appears faster, but interactivity actually arrives a little later.

And the reason why it's actually slower is because we're sending the same information twice: once as HTML and then again as JavaScript. If you have a string that says, let's say, hello world, that hello world is going to appear once in the HTML and once in the JavaScript. And the way hydration works is you start with the root component over here, and then you go and visit all the other children, and what you're really looking for are these red boxes. These red boxes represent the events. You need to know where the click listeners are, right? And this is a huge amount of code to download and execute. So people are exploring different things. Maybe we should do partial hydration: instead of having one big tree, maybe we can have a bunch of small islands, or delay it, etc. And that certainly improves the situation; it gets better. Or maybe we should do something like React Server Components, where we say some of the roots end up on the server, but the children continue to be in the client world. But at the end of the day, what you want is to get hold of these listeners. The listeners are the key to making your page interactive.
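The tree walk described above can be modeled in a few lines. This is a toy sketch, not any framework's real code: the point is that even with a single listener, hydration must touch every node in the tree to find it, so the work grows with application size.

```typescript
// Toy component tree; node and prop names are illustrative.
interface VNode {
  name: string;
  onClick?: () => void; // the "red box": a listener hydration must find
  children: VNode[];
}

// Hydration's search: start at the root, visit everything.
function collectListeners(
  node: VNode,
  found: VNode[] = [],
  visited = { count: 0 },
): { found: VNode[]; visited: number } {
  visited.count++;
  if (node.onClick) found.push(node);
  for (const child of node.children) collectListeners(child, found, visited);
  return { found, visited: visited.count };
}

const tree: VNode = {
  name: "App",
  children: [
    { name: "Menu", children: [] },
    { name: "Cart", onClick: () => {}, children: [] },
    { name: "Footer", children: [{ name: "Links", children: [] }] },
  ],
};

// One listener exists, but all five nodes had to be visited to find it.
const result = collectListeners(tree);
```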

So I would like to show you an alternative world, a world which I call resumability. You start with HTML, and the HTML contains the page, just as before. But there's a huge difference here: this page contains information about where the listeners are. And as a result, you can immediately click on it and interact with it. In other words, the moment a button shows up, it is ready to be clicked on and interacted with. But you don't have JavaScript yet, so you have to download that. And in this particular case, notice the JavaScript is much, much smaller. Why is that? Well, what's missing over here? We removed the duplicates. We looked at the page and said: actually, that hello world that you printed, that's static. It will never have to be re-rendered again on the client, so why are we sending it across? And then, because that code never reaches the client, we don't have to execute the application and we don't have to reconcile it. So your application not only shows up faster, it is fundamentally quicker to interact with. And this is the bit we call resumability. The thing is, in a resumable world, instead of starting at the root component and then finding all the listeners, you flip it all around and you start at the listeners. And it's the listeners that matter, because if there is no listener, then that component is inert.
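The "listener information in the HTML" idea above can be sketched concretely. The attribute format below is illustrative, loosely modeled on Qwik's `on:click="chunk.js#symbol"` convention: the server serializes which chunk and which exported symbol implement each listener, so the client never needs to walk the tree to discover them.

```typescript
// A server-rendered button carrying its own listener location (illustrative).
const serializedButton =
  `<button on:click="./chunk-abc.js#greet">Greet</button>`;

// Parse the attribute into the two pieces a loader would need:
// which chunk to fetch and which exported symbol inside it to invoke.
function parseListenerUrl(attr: string): { chunk: string; symbol: string } {
  const [chunk, symbol] = attr.split("#");
  return { chunk, symbol };
}

const ref = parseListenerUrl("./chunk-abc.js#greet");
```

Because the reference is just a string in the markup, the page is interactive (clicks can be captured and resolved) before any application JavaScript has been downloaded.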

4. Optimization, Clean Code, and Compiler Magic

Short description:

By starting with listeners rather than from the root, your application breaks up into smaller chunks. Clean code is simple, while fast code is complicated. We want a system where we can write clean code and let the computer optimize it. Two new concepts, signals and code extraction, can give constant startup performance and lazy loading out of the box. The compiler can optimize the code if we provide the right tools. Signals and code extraction are the key parts. Let's see an example.

You can't really do anything with it. So in this particular example, if I click on this component, I see that only this particular component needs to be re-rendered, so the rest of the page doesn't even have to be downloaded. The key over here is that by starting with the listeners rather than from the root, your application automatically breaks up into smaller chunks. So even if you have the most complicated application in the world, if you click the buy button, the only thing you have to do is talk to the shopping cart and re-render the shopping cart. The fact that you have a complicated menu or a complicated component or a complicated way of commenting on the product doesn't matter. It is irrelevant to what the user is trying to achieve.

And I'm sure you've heard the saying that premature optimization is the root of all evil. But why? Why is this saying so common? The reason is that clean code is simple code, whereas fast code is complicated. What you're trading when you make something fast is simplicity for complexity. And of course, we as humans write code primarily for other humans, right? So what we want is a system where I can write clean code, but the computer can go and optimize everything for me. I want the computer to do the hard bit, and we already have that: we have compilers and linters and all kinds of magical things, and they do all kinds of optimizations so that we can keep looking at pretty clean code and don't have to bother our brains with the optimizations.

So how can we have an application like that? The way most people build apps is you start with building an app, then some of us actually get to the next phase and realize that our application is slow. And even fewer of us actually do something about it, right? Because optimization is hard. Slowness is not a single bug; it's something that creeps up on you over time. And so the question is, what can we change? Well, what if I told you that you can learn two new concepts? One concept is the signal. I think Ryan talked about it earlier today; he's known as the signals CEO. The second one is code extraction, and code extraction is denoted here as a dollar sign with parentheses. What if these two things would allow the compiler to give you startup performance that's constant? Meaning it doesn't matter how big your application gets, the startup cost is always O(1). There is always the same amount of JavaScript before you can interact. What if you could get lazy loading out of the box without any sort of effort? And what if you could get lazy execution without any sort of effort at all? That is an optimization that's not premature, in the sense that the human doesn't have to do it. The code remains clean, yet the compiler can do it. But the compiler can only do it if we give the compiler the right sort of tools. And so these tools are important, and that's why signals and code extraction are the key parts. When you give this to the compiler, the compiler can give you the best possible developer experience, and the computer can go and optimize it and do all of this magic of lazy loading, making sure that only the minimum amount of code gets loaded. So that's the theory. Let me show you something in practice. Okay. Here I have an example for you. Actually, let's look at the code first.
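The dollar-sign idea can be modeled as follows. This is a toy, not the real compiler: wrapping a closure in `$()` replaces it with a cheap lazy reference (Qwik calls these QRLs). Here the "chunks" are just entries in a local registry; in the real system the compiler emits separate modules with top-level exports.

```typescript
// A lazy reference to extracted code (names are illustrative).
type Qrl<T> = { id: string; resolve: () => T };

// Stand-in for the build output: chunk id -> extracted closure.
const chunkRegistry = new Map<string, () => unknown>();
let nextId = 0;

// Toy `$`: registers the closure and hands back only a reference to it.
function $<T>(fn: () => T): Qrl<T> {
  const id = `chunk-${nextId++}`;
  chunkRegistry.set(id, fn); // the real compiler emits a module instead
  return { id, resolve: () => (chunkRegistry.get(id) as () => T)() };
}

// The application holds only the cheap reference at startup...
const greet = $(() => "Hello JS Nation");

// ...and the closure's body runs only when the reference is resolved,
// e.g. when the user actually clicks.
const message = greet.resolve();
```

This is why startup cost stays constant: no matter how many `$()` sites the app has, booting only creates references; bodies load and run on demand.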

5. Application Demo and Component Rendering

Short description:

Here is a simple application demo that looks like React. Components announce when they're rendered. Demos include hello, counter, clock, and RPC.

So here is a simple application, a simple demo that I have built for this purpose. Notice it looks very much like React; that's not an accident, it's intentional. Also notice that every once in a while there is a dollar sign present, and that's basically the thing that allows us to do lazy loading, code extraction, etc. And I've made sure that every component announces when it's rendered; it says 'rendering app' or whatever you have here. And then I have hello, counter, clock, and RPC, the different demos that I'm going to go through today.

6. Hello Component and Code Extraction

Short description:

Let's go through the hello component and see how the system works. When the component is rendered, it logs a message and displays a button. Clicking the button triggers the download of the necessary JavaScript code. The system optimizes the download by only sending the required code, in this case, a console log. The service worker pre-populates the cache, ensuring fast and reliable performance even on slow networks. Code extraction allows the system to execute the necessary code by importing it at the top level. Buttons can close over state and other variables.

So let's go through hello first. The hello component says: first of all, I'm going to tell you that I'm being rendered, so I do a console.log, and I'm going to have a button, and the button has a console.log for when you click on it that says 'Hello JS Nation'. So let's run the first thing. Let's go to our application and refresh it, and notice that there is no JavaScript being downloaded. I'm looking at the JavaScript: nothing being downloaded. So if I go and click on hello, notice it is at that point that the JavaScript shows up and 'Hello JS Nation' shows up. As a developer, I have declared my intent. My intent was: I want a component with this listener, and the framework was able to do the magic. Now, notice what got downloaded. What got downloaded was literally just the console.log of 'Hello JS Nation'. Nothing else. The system, with the help of these dollar signs, was able to look at the whole problem and say: ah, you just need the console.log, and you don't need anything else to achieve what you want. So why are we sending the whole application across? That's unnecessary. I'm just going to send the thing that you actually need.

Now, at this point, a lot of people will say: hey, I know what's going to happen. You're going to be on a slow network, you're going to click on something, and you're going to have to wait forever. So let me show you another important bit. Notice in the size column, I know it's hard to see, it says service worker. This was a cache hit. So the way it actually works is that when you navigate to a page, the service worker wakes up and immediately starts downloading all the necessary code, and pre-populates the cache, so that when you go and click, the virtual machine, V8, will go and fetch the data out of the cache. It's going to be a cache hit, and therefore you will not have any kind of delay, even on slow and unreliable networks. If you load the page before you enter the tunnel, the page will continue working when you are in the tunnel.
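The cache-warming step described above can be sketched as a pure function. This is an assumption-laden sketch, not Qwik's real service worker: the manifest shape (`symbol -> chunk file`) and names are made up, but the idea is the same — at navigation time, compute every chunk the current page could need and prefetch it, so later clicks are cache hits even offline.

```typescript
// Hypothetical build manifest: which chunk file contains which symbol.
interface Manifest {
  mapping: Record<string, string>;
}

// Given the symbols serialized into the current page's HTML, return the
// unique set of chunk URLs a service worker should warm the cache with.
function chunksToPrefetch(manifest: Manifest, symbolsOnPage: string[]): string[] {
  const chunks = symbolsOnPage
    .map((s) => manifest.mapping[s])
    .filter((c): c is string => c !== undefined);
  return [...new Set(chunks)]; // de-duplicate: several symbols share a chunk
}

const manifest: Manifest = {
  mapping: { greet: "q-abc.js", increment: "q-def.js", display: "q-def.js" },
};

// Three symbols on the page, but only two distinct files to fetch.
const urls = chunksToPrefetch(manifest, ["greet", "increment", "display"]);
```

In a real service worker, each URL in this list would then be fetched and stored via the Cache API before the user's first click.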

Okay, so how does this magic work? Well, the system can realize that, hey, in order to handle this button click, I have to execute this code. But how do I get hold of it? In order for JavaScript to execute some code, there has to be a top-level export; somebody somewhere had to write basically this piece of code right here, right? And this is what code extraction does. This dollar sign basically tells the system: I want you to extract this and put it into a top-level export so that I can import it. And because I can import it now, it can be executed by itself, without the rest of the system being available. But you're going to say: well, yeah, but look, this console.log doesn't close over anything. In reality, our buttons close over state and other things.

7. Counter Example and Component Hierarchy

Short description:

Let me show you a slightly more complicated example: the counter. Normally a counter is a single component that does the buttons, the rendering, and the state. Here, I'm breaking things up to show you the power of optimization that Qwik can do. The counter is being rendered. I create a state, called a signal, and give it to an incrementer, which is a button that knows how to increment the value. I also give the value to the display wrapper, which knows how to render it. Finally, we have the display that shows the count and the incrementer, which gets hold of the count and increments it.

So let me show you a slightly more complicated example, which is the counter. I'm going to switch over to a different tab where I'm running in dev mode, because I want to show you things at a finer grain and I don't want to necessarily show you the service worker. So it's the same exact demo, just running in dev mode.

So notice what downloaded. When I hit plus one, I downloaded some code that did the incrementing, I downloaded the framework, and finally I downloaded some build artifact that's not really important. But this is the important bit, right? I've downloaded a piece of code that incremented the counter. So let's look at the implementation. Now, the counter is intentionally pretty complicated, because I want to show off important things. Normally, when you see a counter, you see a single component that does the buttons, the rendering, and the state. Intentionally, here I'm going to break things up, because I want to have as many components involved as possible, to show you the power of optimization that Qwik can do.

So here is your counter. First of all, I'm going to tell you that the counter is being rendered. I'm going to create a state, which in Qwik is called a signal, and I'm going to take the signal and give it to an incrementer, which is essentially just a button that knows how to increment the thing, and I'm going to give the value to the display wrapper, which knows how to render it. The reason I'm breaking it up is I want to show you a hierarchy of components, so I'm making it intentionally complicated. The display wrapper doesn't really do anything; it simply takes the value and passes it on to the display. What I want to show here is a typical example of prop drilling: a component that by itself doesn't do anything useful, it just passes the value down. And finally, we have the display that shows the count, and we have our incrementer, which is just a button that gets hold of the count and does count.value++. So I know it's overly simplified, but fundamentally that's what a web application is. You have a place where you keep your state, you have a place where you show the state, you have a place where you mutate the state, and you probably have a bunch of unrelated components that just do prop drilling in order to get the UI that you want. Now, when this happens in the normal world and you go and modify the count, you'd expect that all these console logs would rerun, rerender, and reprint. Let me show you what actually happens here. So let's hit plus one. Notice it worked. It did plus one. What we downloaded was just this listener, right? Just this piece of code right here. Just this.

8. Framework, Signals, and DOM Connection

Short description:

The framework only sends the necessary code for mutation to the client, while the rest is unnecessary noise. Signals allow direct connection to the DOM, so the server learns the component relationships and updates the DOM accordingly. Irrelevant code is not sent to the client.

Nothing else. Then the framework showed up. And of course this build file is not important. Notice what didn't show up: none of the components showed up. Right? The fact that the state was in one component and we prop-drilled it down doesn't appear anywhere; there is nothing about it in the console log, and we didn't even bother sending that source code. Really, the only thing we sent was this piece of code, the thing that did the mutation. Everything else is unnecessary. Why? Because signals allow you to connect directly to the DOM. When this piece of code executed on the server, the server learned about the relationship of the components, and it learned that this signal is connected to this piece of DOM over here. So when I increment this value, I really just have to update the DOM over here, and everything else is irrelevant. It's just noise. Sending it to the client would be problematic, and so it doesn't get sent.
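The "signals connect directly to the DOM" claim above can be made concrete with a minimal signal sketch. The API names here are illustrative, not Qwik's: a signal holds a value and notifies subscribers directly. Because the (simulated) DOM text node subscribes to the signal, updating the count touches only that node — no component between the state and the display ever re-runs.

```typescript
// Minimal signal: value + direct subscribers, no component re-render.
function createSignal<T>(initial: T) {
  let value = initial;
  const subscribers = new Set<(v: T) => void>();
  return {
    get value(): T { return value; },
    set value(v: T) { value = v; subscribers.forEach((fn) => fn(v)); },
    subscribe(fn: (v: T) => void): void { subscribers.add(fn); },
  };
}

// Stand-in for the DOM text node the server linked to this signal.
const textNode = { data: "0" };
const count = createSignal(0);
count.subscribe((v) => { textNode.data = String(v); });

// The extracted click handler: the only code the client needed to download.
const onPlusOne = () => { count.value++; };
onPlusOne();
```

Note what is absent: no Counter, DisplayWrapper, or Display function exists on the client at all; the mutation reaches the text node through the subscription alone.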

9. Clock, useVisibleTask$, and the Qwikloader

Short description:

What if I have a clock that only shows up when you scroll to it? The clock runs with JavaScript. There's a useVisibleTask$ hook that runs when the component becomes visible. If the component is invisible, there's no need to download or execute anything. Another interesting thing is the ability to do expensive work on the server while the client handles the return value. This showcases a unified execution model. Is a button immediately clickable? There is a small Qwikloader that sets up a global listener, making the button clickable without delay.

Okay, let me show you the next thing. What if I have a clock? A clock that only shows up when you scroll to it. Notice the clock is running. Let me show this again for you: I have no JavaScript; I scroll the clock into position; when the clock shows up, the JavaScript shows up, and it starts running.

What exactly does the clock do? Well, there is a useVisibleTask$, which you can think of as useEffect. useEffect runs on hydration, but there is no hydration, so when exactly are we supposed to run something like that? Well, the answer is: you run it when the component becomes visible. This piece of code naturally does what you would expect, but it has an added interesting behaviour, which is: hey, if the component is invisible, why even bother downloading or executing anything at all?

Let me show you one more interesting thing. We have a button here that says 'Do something', and I'm going to flip over to the console log. This 'Do something' says: I'm going to do work, expensive work, and return some value. Let me show this to you in here. There's an RPC component that says render, a click button that says: hey, I'm going to do the work, I'm going to create some date, and I'm going to create a function that says do work, and I'm going to pretend this is the expensive work, and, just for fun, I'm going to return a function that actually has a value. A typical thing you do: you return closures and pass things around, right? Obviously, all this happens in the browser because, well, it's in the browser. But what if I can do crazy magic like this and say server, and all of a sudden, if I refresh and I click the button, notice the click was on the client, the return value is on the client, but the expensive work showed up on the server. That's pretty magical, when you can do that. One of the things we talk about in Qwik is the idea of a unified execution model, and I can show you more of that later. And where are my slides? My slides disappeared. Here they are. Anyway, that's my talk. I hope you enjoyed it and have seen the point of it, and if you have any questions, I'm happy to answer them.
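The unified execution idea can be modeled with a toy wrapper. This is a single-process simulation, not the real mechanism (which is an RPC over the network): a function wrapped in a hypothetical `server()` marker runs in the "server" context while the code around it stays in the "client" context, and only the return value crosses back.

```typescript
// Which side is notionally executing; purely for the simulation.
let currentContext: "client" | "server" = "client";
const executionLog: string[] = [];

// Hypothetical marker: route the wrapped function to the server context.
function server<A extends unknown[], R>(fn: (...args: A) => R) {
  return (...args: A): R => {
    const prev = currentContext;
    currentContext = "server"; // pretend we hopped over the network
    executionLog.push("server:work");
    const result = fn(...args);
    currentContext = prev;     // back on the client, carrying the result
    return result;
  };
}

// The expensive work is declared inline but tagged to run server-side.
const doWork = server((n: number) => n * 2);

executionLog.push("client:click");
const value = doWork(21);
executionLog.push("client:got-result");
```

The log shows the shape the talk demos: click on the client, work on the server, result back on the client — all written as one piece of straight-line code.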

Here's a philosophical one. Is a button immediately clickable? It needs to download a small piece of JavaScript first, doesn't it? Yeah, that's a good question. So there is something called the Qwikloader. The Qwikloader is very small, about one kilobyte, and it executes in about one millisecond on desktop and about 10 milliseconds on mobile. All it does is set up a global listener. That script tag is actually included in the HTML so that it's immediate. And at that point, nothing else happens. So when you click on a button, the event propagates up and the top-level global listener catches it, looks at the button, and sees that the button has an attribute that says something like on:click= followed by a URL, and that's where it goes and fetches the code. And we know that that URL will actually cause a cache hit.
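The global-listener mechanism in the answer above can be sketched as follows. This is a simulation with made-up names: real code would fetch the chunk by URL (a cache hit, per the answer), whereas here the "chunks" are a local registry, and event bubbling is modeled with parent pointers.

```typescript
type Handler = () => string;

// Stand-in for already-cached chunks: file name -> exported symbols.
const loadedChunks: Record<string, Record<string, Handler>> = {
  "q-abc.js": { greet: () => "Hello JS Nation" },
};

// Minimal element model: attributes plus a parent for bubbling.
interface FakeElement {
  attrs: Record<string, string>;
  parent?: FakeElement;
}

// The one listener the (simulated) loader installs at the document root.
function globalClickListener(target: FakeElement): string | undefined {
  // Walk up, like event bubbling, to find an element with a listener attr.
  for (let el: FakeElement | undefined = target; el; el = el.parent) {
    const url = el.attrs["on:click"];
    if (url) {
      const [chunk, symbol] = url.split("#");
      return loadedChunks[chunk]?.[symbol]?.(); // cache hit in the real thing
    }
  }
  return undefined;
}

const button: FakeElement = { attrs: { "on:click": "q-abc.js#greet" } };
const span: FakeElement = { attrs: {}, parent: button };
const result = globalClickListener(span); // the click landed on a child
```

One tiny delegated listener is enough for the whole page, which is why nothing per-component needs to load before a button becomes clickable.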

10. JavaScript Loading and Interactions

Short description:

So data is immediately available, making it clickable. Heavy animations are loaded as you scroll, improving performance. Interactions are prioritized, reducing delays. Qwik outperforms hydration applications. The service worker doesn't pre-cache everything.

So the data is going to be immediately available, and therefore we can immediately execute it. Thank you. I mean, obviously, a button is immediately clickable. An HTML button can be clicked; it just won't do anything until the JavaScript loads. Well, the point is there is no state where you have a button that you can click and the click gets lost. At all points, the moment the button is visible, it's clickable, and you are guaranteed that the click will be processed.

The next question is: is there a way to fix the JavaScript loading issue with heavy animations that start immediately on page load? So the clock example was an example of that, right? Because it said: when it's visible, I want you to eagerly start executing this code. So when it's visible, you can eagerly start executing animations or anything of that sort. That's certainly possible. We have a couple of websites that are built with Qwik, and the nice thing about them is they have lots of heavy animations, and as you scroll, because it's a long website, more and more animations start loading, streaming into the application as you scroll towards the bottom. So an animation that is not currently visible is not loaded. But when it becomes visible, it automatically loads and starts executing.

That neatly answers the next question. How does this work if the page has lots of interaction? Will every little piece be downloaded on click, resulting in many tiny files? That's also a very good question. So the more complicated your application becomes, the more interactions you have, the better Qwik will actually perform. The way this is solved is that when you run an application, you can observe real-world user behavior, and you can see that these buttons are clicked more often than those other ones. Then you can go to the bundler and give it a list of symbols, which are basically these lazy-loaded functions, and say: hey, I want to make sure that all of these symbols are together in a single chunk. The service worker then knows that it can download these in a specific order. It downloads the high-priority chunk first, so that the buttons most likely to be clicked become available first, and then it downloads all the other parts, with the least-priority item at the bottom, even though everything is eagerly downloaded. The result is that if you click on a button you're likely to click on, you will have almost no delay whatsoever. And, just to be clear, in all cases the delay to interactivity will always be better than in an equivalent hydration application. You cannot lose here: hydration will always have more stuff to download and more stuff to execute than an equivalent resumable UI.
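The usage-driven ordering described in that answer can be sketched as a small sorting step. The data and symbol names are made up: observed click counts per symbol drive the download order, so the service worker fetches the hottest code first.

```typescript
// Hypothetical real-user measurement: how often each symbol's listener fired.
interface SymbolStats {
  symbol: string;
  clicks: number;
}

// Produce the order in which chunks should be warmed: hottest symbols first.
function prioritizedDownloadOrder(stats: SymbolStats[]): string[] {
  return [...stats]
    .sort((a, b) => b.clicks - a.clicks) // most-clicked first
    .map((s) => s.symbol);
}

const observed: SymbolStats[] = [
  { symbol: "addToCart", clicks: 9000 },
  { symbol: "openMenu", clicks: 400 },
  { symbol: "postComment", clicks: 25 },
];

const order = prioritizedDownloadOrder(observed);
```

Everything still downloads eagerly; only the order changes, which is what makes the likely clicks feel instant.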

Maybe the Dutch audience gets a bit grumpier after dinner, but there are some quite gnarly questions here, just so you don't blame me. I'm merely a medium. If the service worker pre-caches everything to avoid stalling, then how is that better than loading all the JavaScript together? Another good question. Right, so the service worker doesn't actually pre-cache everything. When you run an application, you learn that a whole bunch of things never actually execute on the client.

11. Service Worker and Component Downloads

Short description:

The service worker only downloads the components that the user can interact with, significantly reducing the amount of code. Non-interactive components are not downloaded by the service worker.

So, as I've shown you in the application, there was nothing the user could have done to force a download of many of these components. You can observe that, and that information gets fed into the service worker, so the service worker will only download things that it has seen could possibly be downloaded during development. In the worst possible situation, the service worker would have to download everything, which is what you're already doing with hydration. In the vast majority of cases, the service worker will end up downloading significantly less, like an order of magnitude less code. Most of the components you have are actually not interactive; they're just there for layout. And because they're not interactive, there is no way that a click listener or any kind of interactivity can cause them to re-render, and as a result the service worker doesn't download them.

12. Error Code in Qwik and SEO

Short description:

What is error code six in Qwik? Is Qwik web-components friendly? Qwik can resume because the server delivers HTML, which is also what makes SEO possible.

Here's a question that means nothing to me, but it's there. Okay, go. What is error code six in Qwik? I've had it a couple of times already. I have no idea. But it's good, because it means somebody is using it. Perfect. I love it. I will look it up. So, I mean, obviously I never get errors in any of my code, but if you do get error six, even this guy doesn't know what it means. So it's your fault. Bad luck.

Is Qwik web components friendly? So Qwik has what we call Qwikify, and with Qwikify you can, for example, take existing React components and run them in Qwik. I forgot to mention that on stage. So you can actually run existing React components inside of Qwik. In the same way, you could make a Qwikify for web components. We already have adapters for, you know, Vue, Solid, and other popular frameworks. So we could also, in theory, have web components. The hard thing about web components is that they don't really have a good server-side rendering story. And as a result, Qwik kind of lives off the fact that the HTML delivered from the server already has everything present. Because web components don't have a good way of pre-rendering, in the context of Qwik they're not really the best fit. You can still do it, in the sense that you can eagerly hydrate the web components on the client on initial render. But then again, we're just going back into the hydration world.
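The essence of Qwikify (the real adapter is `qwikify$` from `@builder.io/qwik-react`; the function below is a hypothetical stand-in, not that API) is wrapping a foreign component so its framework runtime loads only on demand:

```javascript
// Sketch (not the real Qwik API): wrap a component so the heavy framework
// and component code load lazily, on first use, instead of during hydration.
function islandify(loadComponent) {
  let cached = null;
  return async function hydrate(props) {
    // The dynamic import would happen here in a real adapter;
    // the loader runs only the first time the island is used.
    cached = cached || (await loadComponent());
    return cached(props);
  };
}

// Usage: the "React component" here is a stand-in render function.
const QGreeting = islandify(async () => (props) => `Hello, ${props.name}!`);
QGreeting({ name: 'Qwik' }).then(console.log); // logs "Hello, Qwik!"
```

The design point is the deferred `loadComponent()` call: nothing about the wrapped component executes until someone actually needs it.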

On a related, or I suspect related, subject: given that everything's popping in only as and when needed, how does that play with the wonderful world of SEO? I refer, of course, to search engine optimists. Yeah, no. SEO is actually a perfect thing for Qwik, because if you disable JavaScript on a Qwik site, you're essentially seeing what SEO sees. I invite you to browse around the site with JavaScript disabled, and you will see that SEO actually sees pretty much everything. The reason why Qwik can resume is that the server delivered HTML. The fact that the server delivers the HTML is the same exact thing that makes SEO possible.

13. SEO, Debugging, and Framework Comparison

Short description:

SEO is crucial for Qwik's resumability. Qwik supports both SSR and SSG by storing HTML in a CDN. Debugging in Qwik works as expected with source maps and breakpoints. Qwik is a fundamental rethink of web frameworks, focusing on resumability. Other frameworks like Marko and Wiz also embrace this idea, with Wiz powering Google Search and Google Photos. These frameworks prioritize interactivity and lazy execution of code.

SEO is really something that Qwik lives by, because without it, resumability really isn't possible. Just to point out, you can store the HTML in a CDN, so Qwik can not only do SSR, it can also do SSG. Both of these scenarios are perfectly possible.

Okay, this is an interesting question that hadn't occurred to me. So thank you, Ben, whoever you are. And this is probably the last question, but I'm sure that you can grab Misko around afterwards and have a... I love questions. You can come talk to me afterwards. I have stickers. Yes, questions... So we really have loads of great questions. Great. But this is an interesting one. What about debugging? If JavaScript is downloaded only on demand, what about breakpoints and debugger features? How do those work? It works just like you would expect. We have source maps. You obviously can't place a breakpoint until the JavaScript is downloaded. Usually the way to solve this is: you click on the button first, then you place the breakpoint, and then you refresh the browser, and now you have the proper breakpoint. You might have to place breakpoints on the server side, because, well, it's a server-and-client execution model. So you need to be familiar with how debugging works in both Node.js and the browser. But at the end of the day, you have source maps, it looks like regular code, you place breakpoints, and for the most part it's just like a regular thing.

I'm notoriously skeptical about the latest, greatest frameworks, but actually this sounds so exciting, I'm going to go and look at this. I think, I mean, I know I'm partial because it's, you know, my baby and you always love your babies, but I think Qwik is a fundamental rethink of what a web framework is. You know, every framework out there really does hydration. Well, I should say every but a few. There are a couple of them. eBay has this thing called Marko. Marko also has resumability, and internally Google has a project called Wiz. Wiz is the framework that internally powers Google Search and Google Photos, and one thing you can say about Search is that it's fast; you never have to wait for Google Search to be slow in terms of interactivity. And before you say, well, Google Search is not complicated: have you tried putting in a movie and you get a carousel, or putting in an equation and you get a chart, or a calculator that you get out of it? So there's actually quite a lot of interactivity that Google Search can do, and it's powered by Wiz. And while Wiz is not exactly the same thing as Qwik, it has the same exact idea: let's not start with hydration; instead, let's just leave some markers in the HTML so that when you interact, it's only then that the code gets lazily executed. Brilliant.
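The "markers in the HTML" idea can be sketched minimally. This is inspired by how Qwik and Wiz describe themselves, not their real internals: the server leaves an attribute naming a handler, and a single global listener lazily resolves and runs it on first interaction, with no hydration pass. The registry and attribute format below are made up for illustration.

```javascript
// Sketch: one global dispatcher resolves handler names (which the server
// embedded as HTML attributes, e.g. on:click="cart.add") to lazily
// loaded code. Nothing runs until the user actually interacts.
const handlerRegistry = {
  // In a real framework this would be a dynamic import of a code-split chunk.
  'cart.add': async () => (event) => `added:${event.id}`,
};

async function dispatch(attrValue, event) {
  const load = handlerRegistry[attrValue];
  if (!load) return null;       // non-interactive element: no code ever loads
  const handler = await load(); // code loads only at interaction time
  return handler(event);
}
```

A real implementation would attach `dispatch` once, at the document root, and read the attribute off `event.target`; the point is that startup cost is constant regardless of how many components the page contains.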

Check out more articles and videos

We constantly curate articles and videos that might spark your interest, skill you up, or help you build a stellar career.

React Advanced Conference 2022
25 min
A Guide to React Rendering Behavior
Top Content
React is a library for "rendering" UI from components, but many users find themselves confused about how React rendering actually works. What do terms like "rendering", "reconciliation", "Fibers", and "committing" actually mean? When do renders happen? How does Context affect rendering, and how do libraries like Redux cause updates? In this talk, we'll clear up the confusion and provide a solid foundation for understanding when, why, and how React renders. We'll look at:
- What "rendering" actually is
- How React queues renders and the standard rendering behavior
- How keys and component types are used in rendering
- Techniques for optimizing render performance
- How context usage affects rendering behavior
- How external libraries tie into React rendering
React Advanced Conference 2023
33 min
React Compiler - Understanding Idiomatic React (React Forget)
React provides a contract to developers- uphold certain rules, and React can efficiently and correctly update the UI. In this talk we'll explore these rules in depth, understanding the reasoning behind them and how they unlock new directions such as automatic memoization. 
React Summit 2023
32 min
Speeding Up Your React App With Less JavaScript
Too much JavaScript is getting you down? New frameworks promising no JavaScript look interesting, but you have an existing React application to maintain. What if Qwik React is your answer for faster applications startup and better user experience? Qwik React allows you to easily turn your React application into a collection of islands, which can be SSRed and delayed hydrated, and in some instances, hydration skipped altogether. And all of this in an incremental way without a rewrite.
React Summit 2023
23 min
React Concurrency, Explained
React 18! Concurrent features! You might’ve already tried the new APIs like useTransition, or you might’ve just heard of them. But do you know how React 18 achieves the performance wins it brings with itself? In this talk, let’s peek under the hood of React 18’s performance features: - How React 18 lowers the time your page stays frozen (aka TBT) - What exactly happens in the main thread when you run useTransition() - What’s the catch with the improvements (there’s no free cake!), and why Vue.js and Preact straight refused to ship anything similar
JSNation 2022
21 min
The Future of Performance Tooling
Top Content
Our understanding of performance & user-experience has heavily evolved over the years. Web Developer Tooling needs to similarly evolve to make sure it is user-centric, actionable and contextual where modern experiences are concerned. In this talk, Addy will walk you through Chrome and others have been thinking about this problem and what updates they've been making to performance tools to lower the friction for building great experiences on the web.
GraphQL Galaxy 2021
32 min
From GraphQL Zero to GraphQL Hero with RedwoodJS
Top Content
We all love GraphQL, but it can be daunting to get a server up and running and keep your code organized, maintainable, and testable over the long term. No more! Come watch as I go from an empty directory to a fully fledged GraphQL API in minutes flat. Plus, see how easy it is to use and create directives to clean up your code even more. You're gonna love GraphQL even more once you make things Redwood Easy!

Workshops on related topic

React Summit 2023
170 min
React Performance Debugging Masterclass
Featured WorkshopFree
Ivan’s first attempts at performance debugging were chaotic. He would see a slow interaction, try a random optimization, see that it didn't help, and keep trying other optimizations until he found the right one (or gave up).
Back then, Ivan didn’t know how to use performance devtools well. He would do a recording in Chrome DevTools or React Profiler, poke around it, try clicking random things, and then close it in frustration a few minutes later. Now, Ivan knows exactly where and what to look for. And in this workshop, Ivan will teach you that too.
Here’s how this is going to work. We’ll take a slow app → debug it (using tools like Chrome DevTools, React Profiler, and why-did-you-render) → pinpoint the bottleneck → and then repeat, several times more. We won’t talk about the solutions (in 90% of the cases, it’s just the ol’ regular useMemo() or memo()). But we’ll talk about everything that comes before – and learn how to analyze any React performance problem, step by step.
(Note: This workshop is best suited for engineers who are already familiar with how useMemo() and memo() work – but want to get better at using the performance tools around React. Also, we’ll be covering interaction performance, not load speed, so you won’t hear a word about Lighthouse 🤐)
React Summit 2023
145 min
React at Scale with Nx
Featured WorkshopFree
We're going to be using Nx and some of its plugins to accelerate the development of this app.
Some of the things you'll learn:
- Generating a pristine Nx workspace
- Generating frontend React apps and backend APIs inside your workspace, with pre-configured proxies
- Creating shared libs for re-using code
- Generating new routed components with all the routes pre-configured by Nx and ready to go
- How to organize code in a monorepo
- Easily moving libs around your folder structure
- Creating Storybook stories and e2e Cypress tests for your components
Table of contents:
- Lab 1 - Generate an empty workspace
- Lab 2 - Generate a React app
- Lab 3 - Executors
- Lab 3.1 - Migrations
- Lab 4 - Generate a component lib
- Lab 5 - Generate a utility lib
- Lab 6 - Generate a route lib
- Lab 7 - Add an Express API
- Lab 8 - Displaying a full game in the routed game-detail component
- Lab 9 - Generate a type lib that the API and frontend can share
- Lab 10 - Generate Storybook stories for the shared ui component
- Lab 11 - E2E test the shared component
JSNation 2023
170 min
Building WebApps That Light Up the Internet with QwikCity
Featured WorkshopFree
Building instant-on web applications at scale has been elusive. Real-world sites need tracking, analytics, and complex user interfaces and interactions. We always start with the best intentions but end up with a less-than-ideal site.
QwikCity is a new meta-framework that allows you to build large-scale applications with constant startup performance. We will look at how to build a QwikCity application and what makes it unique. The workshop will show you how to set up a QwikCity project and how routing works with layouts. The demo application will fetch data and present it to the user in an editable form. And finally, how one can use authentication. All of the basic parts of any large-scale application.
Along the way, we will also look at what makes Qwik unique, and how resumability enables constant startup performance no matter the application complexity.
React Day Berlin 2022
53 min
Next.js 13: Data Fetching Strategies
Top Content
- Introduction
- Prerequisites for the workshop
- Fetching strategies: fundamentals
- Fetching strategies – hands-on: fetch API, cache (static VS dynamic), revalidate, suspense (parallel data fetching)
- Test your build and serve it on Vercel
- Future: Server components VS Client components
- Workshop easter egg (unrelated to the topic, calling out accessibility)
- Wrapping up
React Advanced Conference 2023
148 min
React Performance Debugging
Ivan’s first attempts at performance debugging were chaotic. He would see a slow interaction, try a random optimization, see that it didn't help, and keep trying other optimizations until he found the right one (or gave up).
Back then, Ivan didn’t know how to use performance devtools well. He would do a recording in Chrome DevTools or React Profiler, poke around it, try clicking random things, and then close it in frustration a few minutes later. Now, Ivan knows exactly where and what to look for. And in this workshop, Ivan will teach you that too.
Here’s how this is going to work. We’ll take a slow app → debug it (using tools like Chrome DevTools, React Profiler, and why-did-you-render) → pinpoint the bottleneck → and then repeat, several times more. We won’t talk about the solutions (in 90% of the cases, it’s just the ol’ regular useMemo() or memo()). But we’ll talk about everything that comes before – and learn how to analyze any React performance problem, step by step.
(Note: This workshop is best suited for engineers who are already familiar with how useMemo() and memo() work – but want to get better at using the performance tools around React. Also, we’ll be covering interaction performance, not load speed, so you won’t hear a word about Lighthouse 🤐)
GraphQL Galaxy 2021
164 min
Hard GraphQL Problems at Shopify
At Shopify scale, we solve some pretty hard problems. In this workshop, five different speakers will outline some of the challenges we’ve faced, and how we’ve overcome them.

Table of contents:
1 - The infamous "N+1" problem: Jonathan Baker - Let's talk about what it is, why it is a problem, and how Shopify handles it at scale across several GraphQL APIs.
2 - Contextualizing GraphQL APIs: Alex Ackerman - How and why we decided to use directives. I’ll share what directives are, which directives are available out of the box, and how to create custom directives.
3 - Faster GraphQL queries for mobile clients: Theo Ben Hassen - As your mobile app grows, so will your GraphQL queries. In this talk, I will go over diverse strategies to make your queries faster and more effective.
4 - Building tomorrow’s product today: Greg MacWilliam - How Shopify adopts future features in today’s code.
5 - Managing large APIs effectively: Rebecca Friedman - We have thousands of developers at Shopify. Let’s take a look at how we’re ensuring the quality and consistency of our GraphQL APIs with so many contributors.