Don’t Try This at Home: Synchronous I/O in Node.js


Node.js is famously a JavaScript runtime that encourages using asynchronous operations wherever possible – but what happens when you really, really need to do synchronous I/O? Anna gives an overview of the – surprisingly many – different ways to achieve this, and what we can learn from them about how the language and Node.js internals work.

32 min
24 Jun, 2021

Video Summary and Transcription

This Talk explores synchronous IO in Node.js and the advantages of asynchronous IO. It discusses exceptions to synchronous IO and approaches to achieving synchronous IO in Node.js, including using WASI and native add-ons. The Talk also covers mixing workers with atomics for synchronous IO and embedding Node.js to create a synchronous worker. Additionally, it touches on TypeScript migration, optimizations in Node.js, and experiences and advice on contributing to Node.js.


1. Introduction to Synchronous IO in Node.js

Short description:

Hi, everyone. I'm Anna and I'll be talking about synchronous IO in Node.js. I was previously on the Node.js Technical Steering Committee and now I'm part of the MongoDB DevTools team.

Hi, everyone. So I'm Anna and I'm going to be talking a bit about synchronous I/O in Node.js. Before we get started on this, who am I? So I'm Anna, pronouns are she/her. I was previously on the Node.js Technical Steering Committee, so I was getting paid full-time to work on Node.js core. Then in September, I joined the MongoDB DevTools team. And my handle, if you have any questions or want to reach out in some other way, is addaleax on Twitter and GitHub, at least. And I'm also the mom of these two little cuties.

2. Synchronous vs Asynchronous IO in Node.js

Short description:

The synchronous and asynchronous ways of loading files in Node.js give the same result, but the synchronous way has performance issues. Initially, I expected the synchronous version to be slightly faster for a single call, but I discovered a bug in the async version that affected performance. However, the async version is now faster. The advantage of asynchronous IO is that multiple things can happen at the same time, allowing for concurrent operations.

But, yeah, so let's try to remember – the subtitle of this talk is "don't try this at home". Why don't we want to do synchronous I/O in Node.js? I'm pretty sure you've heard that you shouldn't, but why? So here's a side-by-side. On the left-hand side you have the classical synchronous way of doing I/O, of loading a file from disk. On the right-hand side, you have an asynchronous way, and in this case, you could also say it's more modern because it's asynchronous. In the end, these two give the same result. And, well, why don't we want people to do what's on the left? The reason is performance. And before I get into details on that – I tried to benchmark a single fs.readFile call for this talk. What I would have expected to happen is that the synchronous version is slightly faster than the async version if you just do a single call, because the synchronous version only has to go and read the file and return the result, whereas the async version actually has to schedule doing that and then wait for the result to come back. So, I would have expected the synchronous version to be a bit faster. In the beginning that was true, but it was so much faster that I actually discovered a bug in fs.promises.readFile that affected performance. And after fixing that, the async version is faster for some reason. If you want, feel free to dig into this and tell me what's going on. But, yeah, generally, that is what I would have expected. Anyway, the big advantage of asynchronous I/O is that multiple things can happen at the same time. That's the general idea of Node.js. You can do something, do something else, and wait for these two things to happen before you go on. And other things can happen while those operations are ongoing. With synchronous I/O, everything happens one after another. And while your process does this file loading, nothing else can happen. No JavaScript, no other I/O, nothing.

3. Exceptions to Synchronous IO in Node.js

Short description:

There are exceptions to synchronous IO in Node.js. Loading code required and ESM import do synchronous file system IO, but it's not necessary. Writing asynchronous code is encouraged even if not strictly necessary. The third case is when synchronous code is needed, such as when an API or user interface requires it. I'm currently working on rewriting the Mongo CLI utility as a Node.js application called Mongosh. JavaScript applications are easier to maintain and we can embed Mongosh in Electron apps and web pages.

But yeah, so there are exceptions to when it is okay to do this. You should not, but there are some cases. So, loading code – require and, as far as I know, also ESM import – does synchronous file system I/O. For ESM, that's not something that's technically necessary, and I hope we can get away from that at some point, because ESM loading is asynchronous anyway and you wouldn't necessarily notice if the file system access was happening asynchronously.

Second case: you absolutely know that what you're doing is the right thing to do. For example, you're writing a CLI application where there's a very limited set of things going on at a time and you know that nothing else is happening at the same time. I would still encourage you to write asynchronous code, simply because you should follow that best practice even if it's not strictly necessary.

And the third case is you need synchronous code for some reason. And that might be because some API or some user facing interface exposes your code as synchronous and you have no other choice. And I'm going to be talking about that last case here.

What do I actually do at my job these days? So if you've ever worked with MongoDB, this might seem a little familiar. There's this mongo CLI utility where you can pass something that looks like a URL and say, hey, connect to this server, connect to this database. And then you get a shell where you can run commands such as, in this case, db.test.find. It doesn't really matter for this talk that this is MongoDB. This might as well be the MySQL CLI and the command could be SELECT * FROM test – basically the same thing.

So the project that I'm currently working on most is actually rewriting this. It's something called mongosh, depending on how you like to pronounce it. It pretty much does the same thing. You pass it some URL-like thing, and you get a shell and run the same kind of commands there. Why are we doing this? So mongo – "the old shell" is basically what I've gotten used to calling it – is a SpiderMonkey-based C++ application, so it is a C++ application that uses the JavaScript engine from Firefox to run the JavaScript for the shell. What we are writing, this mongosh thing, is a Node.js application. So first of all, JavaScript applications are a bit nicer to maintain. It's just a more high-level language than C++, as much as I like C++. Node.js already has a great REPL implementation that we can build on top of, so we don't have to write all of this again. And we can even embed this in Electron apps and maybe web pages at some point.

4. Approaches to Synchronous IO in Node.js

Short description:

We're building on top of the Node.js driver and need to make the method do something synchronously. The easy way is using synchronous methods, but they don't cover network I/O. I believe the existence of synchronous file operations is a design flaw. To achieve synchronous I/O in Node, you can write C, Rust, or C++ code and compile it to Wasm.

We're actually doing that first thing. So there's this GUI for MongoDB, which is called Compass, which is also maintained by our team, and we embed the shell in that Electron app as a React component, basically, which is pretty cool. But yeah, so this old shell, the way it works is you type some command like db.test.find, and it synchronously does I/O, because there is no event loop, no nothing, no Node.js involved. There's no reason to do anything asynchronously.

But we are building on top of the Node.js driver, and the driver just does network I/O. You don't have synchronous network I/O in Node. It's just not there. And that was tricky, because people shouldn't have to know about async/await in order to be able to use our shell. People have written scripts for the old shell that, ideally, we want to keep working as much as possible.

So the question becomes: how do we make this method do something synchronously? That's what inspired me to give this talk. What are the different approaches that we could take here? So, first of all, there's the easy way of doing synchronous I/O in Node, which is the synchronous methods. They are just there, in the API. You have fs.readFileSync, which just does a synchronous file operation. That doesn't really solve our use case here, obviously, because it doesn't cover network I/O, and that's what we're mostly concerned about here. So, that is kind of a non-starter for us. And also, if you ask me – and this is just my personal opinion – about the fact that readFileSync and similar operations are even possible: there's no good reason why file system I/O in libuv, the underlying library that supports Node, is implemented the way it is. There's no good reason why it shouldn't work just like accessing the network or stdio streams or anything else. I think that's a design flaw, and we shouldn't be able to have these things in the first place. Obviously, they're not going away, because millions of people are using them. But yeah.

So, then... and this is something that, if you think about doing synchronous I/O in Node, is probably not going to pop into your head at first. What you can do is write C code, or Rust or C++, and compile it to Wasm. So, basically anything that Clang supports would work here.

5. Using WASI for Synchronous IO in Node.js

Short description:

You can use WASI, the WebAssembly System Interface, supported by Node.js, to run code synchronously. However, it is still experimental and not very useful for writing JavaScript due to the need to serialize data and use array buffers. It does support network IO.

And then you can use WASI, which is the WebAssembly System Interface, which Node.js supports, and use that to run that code. And what that looks like in practice: on the left-hand side, there's a C file. It doesn't really matter what it does, but there are a lot of calls that start with f, which means file I/O. On the right-hand side, you have the corresponding JavaScript. You can create these WASI objects. In Node.js, they are experimental, and you need to pass a flag (--experimental-wasi-unstable-preview1) in order to get them to work, but they are there and they are supported. And using these steps, you can actually run the code on the left, and it will work completely synchronously. I think this is a very cool thing to have, but it's also still very experimental, and it's currently not very useful for writing JavaScript, because if you want to use this from JavaScript, you would have to act like a WebAssembly application, which means serializing everything that you want to pass to a call into an array buffer and reading it back out afterwards. It's just not ergonomic at all. This is not something we want to do. But as far as I know, this would support network I/O.
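The JavaScript side of this looks roughly like the following sketch, based on Node's experimental `wasi` module (here `example.wasm` is a hypothetical compiled version of the C file; depending on the Node version you may need the `--experimental-wasi-unstable-preview1` flag, and newer versions also want a `version` option in the constructor):

```js
const { readFile } = require('fs/promises');
const { WASI } = require('wasi');

(async () => {
  const wasi = new WASI({ preopens: { '/sandbox': '.' } });
  const wasm = await WebAssembly.compile(await readFile('./example.wasm'));
  const instance = await WebAssembly.instantiate(wasm, {
    wasi_snapshot_preview1: wasi.wasiImport,
  });
  // Runs the module's main() -- any file I/O it does happens synchronously.
  wasi.start(instance);
})();
```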

6. Foreign brute force for synchronous IO in Node.js

Short description:

You could write a native add-on that you load from Node.js to perform the IO. However, reimplementing the whole Node.js networking stack just for synchronous IO would be too much work, and it's not something that libuv supports.

Then there's the brute-force way, which is very straightforward: you write a C++ or Rust add-on, and it doesn't really require more than this. This would also be a working example, except there's some boilerplate missing, obviously. But you could do this. You could write a native add-on that you load from Node.js, and that performs the I/O for you. But this is also something we don't want: reimplementing the whole Node.js networking stack just so that we can have synchronous I/O would be far too much work. It's not something that libuv supports, so we would have to come up with some clever ways of doing it differently. It would require rewriting so much code. We're not doing that.

7. Mixing Workers with Atomics

Short description:

And now, let's get to the exciting part: mixing workers with Atomics. We create a message channel and a shared array buffer. The main thread starts a worker and waits on the shared array buffer. The worker receives data from the main thread and runs an async function using node-fetch to make HTTP requests. After getting the response, it posts it back to the main thread and calls Atomics.notify.

And now, let's get to like the ways that I am more excited about personally. So this is my favorite, probably, because it's kind of production ready at this point and we might actually be able to use it in the future.

So what do we do? We mix workers with Atomics – and the reason this slide has the nuclear sign and not the atom emoji is that everybody would have just thought of React.

So how does this work? Let's look at an example. Again, this is pretty much runnable code. The left-hand side, the main thread side, is missing some imports, but it's basically working. What the main thread does when it starts up is create a message channel and a shared array buffer. It needs a shared array buffer – we'll get to that in a second. And it starts a worker, to which it passes one side of the communication channel and that shared array buffer. And then, it waits on that shared array buffer using Atomics.wait. In the worker, when it starts up, it gets the data that it was sent from the main thread, and it runs an async function. And that is actually just doing things that are probably super familiar to you already. Most of you have seen node-fetch at some point. It's a very nice API for making HTTP requests, and it's also very good for these examples because it's very straightforward to do I/O with it – it's like three lines of code here. So what happens is: we load it, we await the fetch, and we await the whole response body coming back to us. This is obviously missing error handling, but you know, you don't really do that on slides like these where there's limited space. And then, after we got that response, we post it back to the main thread and call Atomics.notify. The memory is shared between the main thread and the worker thread, so this actually wakes up that Atomics.wait call in the main thread, which was blocking. Nothing else progressed in the main thread – it was still waiting at that line for somebody to call Atomics.notify on another thread.

8. Using Workers for Synchronous IO in Node.js

Short description:

And then on the main thread, we look at what we got, use the receiveMessageOnPort API, and print out the response. This idea allows synchronous operations with some advantages. The main thread is blocked, spawns a worker thread, and waits for the response before progressing. Node.js offers the full API and NPM packages in the worker, but there are downsides. Atomics.wait is not allowed on main threads in browsers, and manipulating objects inside the worker is not easily possible. However, it's production-ready and can be used in worker threads.

And then on the main thread, we look at what we got, using this receiveMessageOnPort API, which is Node.js-specific. You can emulate it on the web, but it's a bit more convenient this way. And then we print out the response.

This general idea, I think this is pretty cool. It allows you to do things synchronously if you really need to. And it has some advantages. So, again, this is, like, how it looks schematically. The main thread is blocked and does not progress. It does not return to its own event loop. It just spawns a worker thread, lets that loop run, and then waits for that response to come back before it progresses in any way.

And the big advantage is that you can use the full Node.js API and NPM packages in the worker. The small downsides are: Atomics.wait is not allowed on main threads in browsers, because if it were, that would lock up the page. Atomics.wait is a blocking call that does not allow anything to progress on the main thread, so it would block rendering, for example, indefinitely, which is not something that should be allowed for a web page. And it still doesn't fully give us what we need, because it doesn't allow manipulating objects inside the worker. So, if you think about the fetch example: if we had wanted to, say, add an event listener to the response object that we saw there, we could not have done that easily, because there's no way to access these objects inside the worker. There would have to be some kind of RPC protocol that takes care of that. So, generally, not very ergonomic in that way. But it's very cool, and very production-ready – there was nothing experimental in what I showed. And you could use this, for example, inside a worker thread; Atomics.wait does work inside worker threads in the browser, so you could kind of do things like this there. But, yeah. Anyway. None of these things really worked for us.

9. Embedding Node.js and Synchronous Worker

Short description:

I went to my evil scientist lab and thought about a solution for synchronous IO in Node.js. The idea is to embed Node.js into itself, starting a new instance on the same thread. This eliminates the need for separate threads and reduces complexity. I came up with a project called synchronous worker, which achieves the desired result.

So, what I did was go to my evil scientist lab. I thought: I know Node.js very well, I'm very familiar with its internals, I should be able to come up with a solution for this, right? And, yeah – remember when I made workers? I didn't make them all by myself, obviously; other people were involved. But the statement doesn't feel entirely inaccurate. And, well, yeah. Anyway.

So, back then I obviously gave talks about that, too. And one of the slides from back then said: the idea behind workers is to embed Node.js into itself, to start a new Node.js instance just like the main thread, except on a different operating system thread. And it turns out, if you think about it a bit more, you don't even need a separate thread for this. You can do it on the same thread. This is something that I have thought about in the past for various reasons. For example, testing frameworks like tap might want to run pieces of code inside somewhat isolated environments – I've had conversations about that. I also thought about what we could do about execSync and similar functions in Node.js's child_process module. The way these are implemented in Node, they have entirely separate implementations from the async methods. There's no good reason for that. And I think with this, we could even reduce complexity inside of Node.js quite a bit, if it ever ended up in Node.js.

So, the idea is: instead of having separate threads, the main thread event loop still runs. It gives us its callback, and inside the callback, or during startup code, we start a new event loop – a new Node.js instance with its own event loop – on the same thread. And until we're done with that, nothing else on the main thread progresses. And so – it's a pandemic, I was a bit bored during the holidays – this is a project that I came up with. And the idea is, this is all you need to actually achieve what we want. You create a synchronous worker. That's what I call it, because it's kind of like a worker in that it starts a new Node.js instance, but there's also no multi-threading involved. So, it's synchronous.

10. Using Workers for Synchronous IO

Short description:

You can create a require function inside the worker that loads node-fetch. This is a runnable example. There are a couple of downsides, such as that it only works in Node.js and is currently implemented as a native add-on. However, it provides full event loop control and access to JavaScript objects. For mongosh, we use Babel to transpile synchronous-looking code to async code. Thank you for listening.

So, you can create a require function inside that worker, and use it to load node-fetch. Then you have a fetch function that only runs inside the worker. And then you can do cool things like worker.runLoopUntilPromiseResolved and pass it a promise that was created inside this worker. You can do that twice to get the full response text, and you can print that out. And this is also, again, a runnable example.
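In code, the API described here looks roughly like this (a sketch based on the talk; it assumes the `synchronous-worker` npm package and node-fetch are installed, so it is not runnable as-is):

```js
const { SynchronousWorker } = require('synchronous-worker');

const worker = new SynchronousWorker();
const req = worker.createRequire(__filename);
const fetch = req('node-fetch');

// Spin the inner Node.js instance's event loop until the promise settles --
// synchronously, from the outer instance's point of view.
const response = worker.runLoopUntilPromiseResolved(fetch('https://example.org'));
const text = worker.runLoopUntilPromiseResolved(response.text());
console.log(text);
```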

There are a couple of downsides. This is Node.js-only – it's currently implemented as a native add-on, so browsers don't support this, and I think if they ever wanted to support something like it, it would be years until they actually got to it. It's also Node.js 15.5 and above only, because there were some bug fixes that we needed to get into Node.js first. Feel free, and feel encouraged, to try things out with it, but it's probably really easy to make the process crash using it. Still, you do get the full Node.js API; like in workers, you get full event loop control, and you can access the JavaScript objects just like any other JavaScript objects. I think that's pretty cool.

If you were wondering – none of these things actually work for mongosh, because they all have drawbacks, as you saw. What we actually currently do is, if we get input like this, where somebody tries to use the result of an asynchronous call synchronously, we use Babel to transpile it to async code, and that works well enough for us. It also has some drawbacks – it works on a best-effort basis, we insert await in places where we think it should be applied, and some language features are not supported – but overall, this works well enough for us currently. Thank you for listening. I'll upload the slides soon, and if you have any questions or want to reach out at some point in the future, you can ping me on Twitter. Thank you.

And that's it. Hello, hello, how are you doing? I'm good. I'm good. We're so happy to have you join us. Thank you so much. What did you think of the poll results? Honestly, I have to think about it for a bit, because I have to go through my head to figure out what the yes and no exactly stand for. It sounds like at least the majority of people are using TypeScript and they are very happy with that. Yeah, I noticed that most people are just basically happy with what they're working with. But also I was just curious, because I know people have very strong opinions about this sometimes, and I wanted to get a feeling of what people really think. Of course, of course.

11. TypeScript Migration and Optimizations in Node.js

Short description:

We use TypeScript at work and plan to convert legacy JavaScript code to TypeScript. The migration is not a top priority now, but we aim to do it on a file-per-file or project-per-project basis. We follow the concept of single-gear development, migrating as we touch each file. Regarding optimizations for fs.readFile in Node.js, I improved performance by changing the implementation to read the entire file at once. Being one of the top contributors to Node.js has been a special experience.

What's your answer to this? What do you think? Yeah, well, we use TypeScript at work and we're very happy with that. Converting some of our legacy vanilla JavaScript code to TypeScript is kind of under long-term backlog. We want to do it at some point. Yeah.

How are you finding that migration, or what's the road plan for that? We don't have one yet, I think. I mean, it's not at the top of the priority list for us right now, but I hope it's going to be some way where we can just migrate things on a file-per-file or project-per-project basis somehow. Yeah, I know what you mean. It's going to take somebody who's really spending a lot of time doing that. At the company I work for, Buffer, we're right now working with this theory – I think coined by one of our engineers, but I'm not sure; his name is Mike Sanderman – and he's talking about single-gear development. It's basically just: as you go along, the time you touch a file, that's when you do the migration to the thing you want to migrate to. Yeah, I'm not sure how easy that would be with TypeScript, but yeah. So...

Interesting, interesting. Let's take a look at some of the questions for you. Somebody asked: which optimizations did you make to fs.readFile in Node.js while preparing the talk? Right. So, I'm going to share the link in the Q&A channel on Discord, but basically, for some reason, the fs.promises.readFile implementation in Node read files in small chunks of 16 kilobytes. There was no real reason why it would do that, especially because it did get the file size first. And when you know you're going to read the entire file, you might as well allocate one big buffer that is large enough to hold it, instead of reading small chunks over and over again. And just changing that improves performance a lot. That's so cool. Can you talk... in your bio you mentioned that you've been one of the top contributors to Node.js in the last four years. Can you talk a little bit about how that experience has been for you? It's a very special experience.

12. Experiences and Advice on Contributing to Node.js

Short description:

I don't think it's one that many of us get to have, honestly. I'm really glad. I appreciate that a lot. It's just very different working on code that affects so many people and that is so visible in the community. I'm also not that sad that I'm not actively working on Node that much anymore. Do you have any advice for folks that are interested in starting to contribute? Start with what would I want to change about Node if I could. Or what is something that I know I could help with? And then focus on that instead of just looking for an easy contribution. What was your first change in Node? My first change was providing a test case for a bug that I found. It's not very exciting on its own, but it helped me work on my own project. I like the philosophy of making changes that are useful to you and probably useful to a lot of other people.

I don't think it's one that many of us get to have, honestly. I'm really glad. I appreciate that a lot. Yeah, it's just very different working on code that affects so many people and that is so visible in the community.

I'm also not that sad that I'm not actively working on Node that much anymore. It also contains a lot of long discussions with lots of people and lots of different opinions. It's okay. It is good to do something else for a change, honestly. Four years is a really long time.

Do you have any advice for folks that are interested in starting to contribute? I mean, maybe ideally don't start with "I want to contribute", but start with "what would I want to change about Node if I could?" Or "what is something that I know I could help with?" And then focus on that, instead of just looking for, hey, what would an easy contribution be? Because there's always some tiny style change that you can do. But if you look for something that you actually want to do, you're going to get a bigger sense of accomplishment, and it's something that is more specific to what you personally want to see changed. That's how I'd approach it, I guess. Yeah.

What was your first change in Node? Do you remember it? Yeah, I think I do. So, technically my first change was providing just a test case for some bug that I found – not that exciting. But my first real change to Node was adding type checking to setTimeout and the other timer functions. It's not very exciting on its own, but it helped me work on my own project that I was doing at the time. So, that's really cool. I like that. I like that philosophy, you know? Make the changes that you need, or that are useful to you. They're probably useful to a lot of other people. Right. Yeah. Yeah. Exactly.

13. Improving Node.js with Supported Official API

Short description:

That's how open source works. You do things that are good for you and hope that somebody else also finds them helpful. A lot around tracing and async operations in Node. If we got everything inside Node onto one format, we could provide a supported official API that gives you all the things that currently keep the event loop alive. It would be really nice to have something inside of Node that is a fully supported public API. Somebody sufficiently motivated could definitely do that.

That's how open source works. You do things that are good for you and hope that somebody else also finds them helpful. Did you have anything in your personal backlog that you wish you had worked on, or that you wish someone would work on right now?

Yeah, there's a couple of things that I wish I would have done, or that I wish somebody would do. A lot around tracing and async operations in Node. I know that there are some very good, very talented and engaged people working on that. But there are some areas of the code where, if we got everything inside Node onto one format, we could do some really cool things – like, for example, provide a supported official API that gives you all the things that currently keep the event loop alive. There are some internal ones, and they return some internal objects, and it's very tricky. And there are people who have built npm modules on top of that. But it would be really nice to have something inside of Node that actually is a fully supported public API. And I think somebody sufficiently motivated could definitely do that. So if you're watching this live stream, get the motivation – there's something that you can do to improve this environment for lots of folks.


Level: intermediate
Node Congress 2023Node Congress 2023
63 min
0 to Auth in an Hour Using NodeJS SDK
WorkshopFree
Passwordless authentication may seem complex, but it is simple to add it to any app using the right tool.
We will enhance a full-stack JS application (Node.JS backend + React frontend) to authenticate users with OAuth (social login) and One Time Passwords (email), including:- User authentication - Managing user interactions, returning session / refresh JWTs- Session management and validation - Storing the session for subsequent client requests, validating / refreshing sessions
At the end of the workshop, we will also touch on another approach to code authentication using frontend Descope Flows (drag-and-drop workflows), while keeping only session validation in the backend. With this, we will also show how easy it is to enable biometrics and other passwordless authentication methods.
Table of contents- A quick intro to core authentication concepts- Coding- Why passwordless matters
Prerequisites- IDE for your choice- Node 18 or higher
JSNation 2023JSNation 2023
104 min
Build and Deploy a Backend With Fastify & Platformatic
WorkshopFree
Platformatic allows you to rapidly develop GraphQL and REST APIs with minimal effort. The best part is that it also allows you to unleash the full potential of Node.js and Fastify whenever you need to. You can fully customise a Platformatic application by writing your own additional features and plugins. In the workshop, we’ll cover both our Open Source modules and our Cloud offering:- Platformatic OSS (open-source software) — Tools and libraries for rapidly building robust applications with Node.js (https://oss.platformatic.dev/).- Platformatic Cloud (currently in beta) — Our hosting platform that includes features such as preview apps, built-in metrics and integration with your Git flow (https://platformatic.dev/). 
In this workshop you'll learn how to develop APIs with Fastify and deploy them to the Platformatic Cloud.
JSNation Live 2021JSNation Live 2021
156 min
Building a Hyper Fast Web Server with Deno
WorkshopFree
Deno 1.9 introduced a new web server API that takes advantage of Hyper, a fast and correct HTTP implementation for Rust. Using this API instead of the std/http implementation increases performance and provides support for HTTP2. In this workshop, learn how to create a web server utilizing Hyper under the hood and boost the performance for your web apps.
React Summit 2022React Summit 2022
164 min
GraphQL - From Zero to Hero in 3 hours
Workshop
How to build a fullstack GraphQL application (Postgres + NestJs + React) in the shortest time possible.
All beginnings are hard. Even harder than choosing the technology is often developing a suitable architecture. Especially when it comes to GraphQL.
In this workshop, you will get a variety of best practices that you would normally have to work through over a number of projects - all in just three hours.
If you've always wanted to participate in a hackathon to get something up and running in the shortest amount of time - then take an active part in this workshop, and participate in the thought processes of the trainer.
TestJS Summit 2023TestJS Summit 2023
78 min
Mastering Node.js Test Runner
Workshop
Node.js test runner is modern, fast, and doesn't require additional libraries, but understanding and using it well can be tricky. You will learn how to use Node.js test runner to its full potential. We'll show you how it compares to other tools, how to set it up, and how to run your tests effectively. During the workshop, we'll do exercises to help you get comfortable with filtering, using native assertions, running tests in parallel, using CLI, and more. We'll also talk about working with TypeScript, making custom reports, and code coverage.