Towards a Standard Library for JavaScript Runtimes


You can check the slides for James' talk here.

34 min
17 Feb, 2022

Video Summary and Transcription

There is a need for a standard library of APIs for JavaScript runtimes, as there are currently multiple ways to perform fundamental tasks like base64 encoding. JavaScript runtimes have historically lacked a standard library, causing friction and difficulty for developers. The idea of a small core has both benefits and drawbacks, with some runtimes abusing it to limit innovation. There is a misalignment between Node and web browsers in terms of functionality and API standards. The proposal is to involve browser developers in conversations about API standardization and to create a common standard library for JavaScript runtimes.


1. Introduction to Node.js API Standardization

Short description:

Hello, Node Congress. I am James Snell — jasnell on GitHub and Twitter. I've been contributing to Node for almost seven years. We've had ongoing debates about Node's API standardization and its alignment with web platform standards. Other runtimes like Deno and Cloudflare Workers have their own APIs. We need a standard library of APIs for JavaScript runtimes. Let's kick it off with a discussion on base64 encoding in Node.

Hello, Node Congress. I am James Snell — jasnell on GitHub and Twitter — and I am with Cloudflare. I'm also on the Node.js Technical Steering Committee. I've been contributing to Node for almost seven years now. I started back in 2015, right when Node.js and io.js were having their issues and getting back together. So I've been around for a while.

And in that time, some themes have developed. We have all the Node APIs that are out there, and we've added a bunch of new features to Node. We added HTTP/2 support and new URL parsing. Recently, we just added support for web streams, the ReadableStream and WritableStream APIs. I think two years ago, we implemented the Web Crypto API. So there's a lot that's happened. And over that time, we've had this ongoing debate and ongoing conversation about just how standardized Node's APIs should be, or how much of the web platform standards Node should be paying attention to.

Well, since that time, other runtimes have come up. We've had Deno, which is a fabulous, fantastic platform, but it has its own set of APIs. We have Cloudflare Workers, one of the runtimes I'm working on right now, and it has its own set of APIs and things that it does. And there are other environments — look at Fastly, or look at some of the IoT devices — there's a bunch of places where JavaScript is being used now. So at some point it's good to step back and think: what kind of APIs should we be implementing? Well, after seven years of doing this, I'm here with a bit of a modest proposal. We need a standard library of APIs for JavaScript runtimes. I'm going to talk a little bit about why I'm thinking that and what I'd like to see. So, let's kick it off.

Here's a puzzle. Base64 encoding is something that is very common to many applications. We see it everywhere. In Node, we've always had Buffer.from('hello').toString('base64'), and we can take a base64-encoded string and get our buffer back out of it.
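To make that concrete, here is a rough sketch (not from the slides) of the Buffer round-trip being described:

```js
// Base64 round-trip with Node's Buffer API.
const encoded = Buffer.from('hello').toString('base64');  // 'aGVsbG8='
const decoded = Buffer.from(encoded, 'base64');           // the bytes of 'hello'
console.log(decoded.toString('utf8'));                    // 'hello'
```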

2. Base64 Encoding in JavaScript

Short description:

What is the right way to base64 encode data from JavaScript? The answer is all of them: Node's Buffer.from API, Deno's API, or something from npm. However, having so many different ways to do something fundamental adds friction and difficulty for developers. The base64-js module is widely used, even though Node has a built-in option. None of these are the right way, because there's always a different way of doing it.

But what is the right way to base64 encode data from JavaScript? People who have been developing for the browser the longest are probably familiar with the btoa function. Is it the right way? I mean, it's the only standard we have for base64 encoding, but honestly, it kind of sucks. It doesn't handle everything you need, and it just kind of works okay. But we really can't change it because of backwards compatibility and all that kind of stuff.
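For reference, a quick sketch of what btoa can and can't do — it only accepts "binary strings" (one byte per character), so anything outside Latin-1 throws:

```js
// Works for plain ASCII / Latin-1:
btoa('hello');               // 'aGVsbG8='
atob('aGVsbG8=');            // 'hello'

// Throws for anything outside Latin-1, so arbitrary text or binary data
// needs a manual detour through TextEncoder / Uint8Array first.
try {
  btoa('héllo ☃');
} catch (err) {
  console.log(err.name);     // 'InvalidCharacterError' in browsers
}
```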

So, what is the right way to do base64 encoding in JavaScript? Is it Node's Buffer.from API? Is it Deno's API, where you import an encode function from their standard library? Or is the answer something out on npm, where you have to go npm install some module once you find it — once you find the right one, and hopefully one that is well maintained, has at least one active contributor who will keep it up to date, keep it moving, and ensure it keeps working with different Node versions or different Deno versions or other runtimes?
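Roughly, the three options look like this (the Deno and npm import paths below are illustrative — exact module names and exports have shifted over time):

```js
// Node:
const b64 = Buffer.from('hello').toString('base64');

// Deno standard library (path/export names have varied across std versions):
// import { encode } from "https://deno.land/std/encoding/base64.ts";
// const b64 = encode(new TextEncoder().encode('hello'));

// npm (for example the base64-js package mentioned below):
// import { fromByteArray } from 'base64-js';
// const b64 = fromByteArray(new TextEncoder().encode('hello'));
```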

And once you have the basic mechanism, which API here is the correct one? Unfortunately, the answer is all of them. All of them are the correct way, but having so many different ways to do something so fundamental just adds friction and difficulty for developers who are writing code that runs on the web. Amazingly, this fourth option, the base64-js module, has over 30 million downloads per week from npm and over 1,300 dependents. A lot of those are Node applications, despite the fact that Node has this built in. But a large number of those dependents are also browser applications, which have nothing built in to rely on except btoa — and btoa itself is lacking quite a bit of functionality. So the answer is: all of these are the right way, but none of them are the right way, because there's always a different way of doing it.

3. Challenges with Encoding and Binary Operations

Short description:

What if we wanted to have Base64 URL encoding? Why is there no single API for hex encoding in JavaScript? Is there a standard API for comparing subsets of binary ranges? How do different platforms handle accumulating stream data? JavaScript runtimes have historically been anti-standard library.

What if we wanted to have Base64 URL encoding, which is a variant of Base64? Well, it turns out that even going from Base64 to Base64 URL, there are differences in the APIs. Not all the options work; sometimes we have to change a few things. Fortunately, both Deno and Node have this built in as part of their core libraries, so at least those will work consistently with each other. But you still start accumulating a lot of these options.
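As a sketch of those differences (the Deno import path is again illustrative):

```js
// Node: base64url is a first-class Buffer encoding (Node 15.7+):
Buffer.from('hello?>>').toString('base64url');   // 'aGVsbG8_Pj4'

// Deno standard library:
// import { encode } from "https://deno.land/std/encoding/base64url.ts";

// Browser: nothing built in — btoa() plus manual fix-ups:
btoa('hello?>>')
  .replace(/\+/g, '-')
  .replace(/\//g, '_')
  .replace(/=+$/, '');                            // 'aGVsbG8_Pj4'
```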

So, it comes down to a very important question. Why does something so common and so widely used in JavaScript have so many different ways of doing it? And why should it matter which JavaScript runtime I'm using? Why should I, as a developer writing JavaScript, need to know: okay, is this code going to run in Node or is it going to run in Deno? Which one of these base64 encoding libraries and APIs do I need to use? Oh wait, now my company's priorities have shifted, they're investing in edge computing, now it's going to run on Cloudflare Workers. Which one of these APIs will work there? Or how do I get it to work there? Or do I need to change all of my code that does Base64 encoding to do something brand new? That's ridiculous, right? Something so common shouldn't require such a difficult, time-consuming conversation just to Base64 encode a little bit of text.

What about hex encoding? What are the APIs for that? Node does it one way, and, again, it's consistent with the way it does Base64. Deno does it its own way, which is consistent with its APIs. There's nothing you can use in the browser unless you find some random module on npm, and, again, hope that it's being maintained. But how often do we come across hex encoding? It's everywhere. We see this all over the place. So, again, why is it so difficult to find a single API for doing hex encoding in JavaScript? It doesn't make any sense.
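A rough comparison (Deno import path illustrative):

```js
// Node:
Buffer.from('hi').toString('hex');             // '6869'
Buffer.from('6869', 'hex').toString('utf8');   // 'hi'

// Deno standard library:
// import { encode } from "https://deno.land/std/encoding/hex.ts";

// Browser: no built-in — hand-roll it or go find a package:
[...new TextEncoder().encode('hi')]
  .map((b) => b.toString(16).padStart(2, '0'))
  .join('');                                    // '6869'
```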

Alright, what if we want to look at other functionality, like comparing subsets of two different binary ranges — say, if you want to see whether one binary sequence is embedded in another binary sequence? What's the API for doing this? Is there a standard API for doing this? Well, unfortunately, typed arrays in JavaScript don't do this. They can't compare subsets of different binary ranges. All you can do is look at an individual item in a typed array and compare it to another individual item in another typed array. That's all you can do. And unfortunately, Node and Deno have two completely different ways of doing this, and it just doesn't make any sense.
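A sketch of the gap being described (the Deno import is illustrative — its bytes helpers have been renamed over time):

```js
const haystack = Buffer.from('the quick brown fox');
const needle = Buffer.from('brown');

// Node: Buffer can search for a whole byte sequence.
haystack.includes(needle);                // true
haystack.indexOf(needle);                 // 10

// Plain typed arrays can only check one element at a time:
new Uint8Array([1, 2, 3]).includes(2);    // true — a single byte, not a subsequence

// Deno has its own helpers in the standard library:
// import { indexOfNeedle } from "https://deno.land/std/bytes/mod.ts";
```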

What if we're accumulating stream data into a simple string? We want to stream some data in and build a string from it. How do we do that on the different platforms? Again, different APIs. It works differently (there's a quick sketch of this below). It doesn't need to be that way. JavaScript runtimes have historically been very anti-standard library. This thought really stems from an idea we'll explore a little bit: the small core idea. The underlying principle — the theory behind it — is to just let the ecosystem, all of you developers, do whatever you want to do. Pick whatever dependency works for you.
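For instance, accumulating a stream into a string looks roughly like this in each model today (a sketch, assuming the Node stream yields Buffer chunks):

```js
// Node streams: collect the Buffer chunks, decode once at the end.
async function nodeStreamToString(readable) {
  const chunks = [];
  for await (const chunk of readable) chunks.push(chunk);
  return Buffer.concat(chunks).toString('utf8');
}

// Web streams: the common trick is to hand the stream to Response
// (or pipe it through TextDecoderStream) instead of looping by hand.
async function webStreamToString(stream) {
  return await new Response(stream).text();
}
```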

4. Challenges with Small Core Approach

Short description:

Make whatever choice fits best for your situation. Keep the runtimes small and unopinionated. Unfortunately, this approach makes development harder. You have to find and select implementations, ensure maintenance and security, and deal with compatibility issues. The lack of a standard library for JavaScript has caused friction and headaches for developers.

Make whatever choice fits best for your particular situation. And keep the runtimes small, keep them unopinionated; just implement the bare minimum on top of what the language provides. Unfortunately, that doesn't make life any easier for the developer. It makes things quite a bit harder. You have to find the implementations of these things. You have to select which one of several you're going to use. Make sure that it's being maintained. Make sure it doesn't introduce any security risks — it is open source software, and there are examples of modules out on npm being hijacked or having security vulnerabilities injected into them. There's a risk of it not being kept up to date with a particular version of Node or Deno or Workers or whatever, and something breaks. Fairly recently there were some modules in server-side rendering libraries that were built on an old model of native add-on APIs for Node. Well, that old model is no longer supported, and once you get past Node 16, they don't work as well, or at all. When you start picking up these APIs, how do you know they're going to continue to work as the runtimes evolve? So this idea of small core — we're going to let the developers decide — is basically just a different way of saying we're going to make this the developers' problem. We're not going to think about it; we're going to force you to think about it. So this idea of having no standard library for JavaScript has actually caused quite a bit of friction. Now granted, it has allowed a lot of evolution, faster cycles, a lot of experimentation, and really fast growth. But it fundamentally ends up adding headaches for most developers.

5. The Idea of Small-Core

Short description:

The idea of small-core is to provide minimal APIs and rely on external sources for additional functionality. This approach makes runtimes like Node smaller and easier to maintain, but it shifts the responsibility of finding missing functionality to developers.

Alright, so this idea of small-core. What this basically means is that the APIs your runtime provides are as minimal as possible. Essentially the idea is: only do what JavaScript does, plus a little bit more. Then whatever is missing, you go out to npm or some other registry, some website, some other GitHub repo, wherever you can find it, and get that additional missing functionality somewhere else. This allows runtimes like Node to be smaller and gives us fewer things to maintain over time. But again, it just kicks the can down the road and says it's not our problem, it's your problem, if you need some bit of functionality that's not there.

6. Abuse of Small Core in Runtimes

Short description:

The idea of small core has been abused and misinterpreted to limit innovation in JavaScript runtimes. Whenever a standard API would come out, there was always resistance to adding these new things into the runtime.

The idea of small core has been abused and misinterpreted to limit innovation in JavaScript runtimes. Whenever a standard API would come out — with Node, for instance — there was always this argument: nah, that can be implemented in the ecosystem, we don't need it here. An example of that would be the Web Crypto API. Node has a crypto API — it's had it forever — and then the W3C developed the Web Crypto API, which does a lot of the exact same things, but in a completely different way than Node's API. Developers started using it and started asking, hey Node, can you implement Web Crypto? And the response was no, small core, small core; if you want that, go off and implement it yourself — here are native add-ons, here are ways of doing it, but we're not going to do that. And it ended up limiting the compatibility between Node, browser runtimes, and other runtimes. We are still fighting interoperability issues between these things today, now that we have Web Crypto in Node, because of that resistance — that constant uphill battle to get these things into Node and evolve the platform. And then you see something like Deno come along and say, you know what, we're going all in on web standards; we're doing web streams, we're doing Web Crypto right from the start. Developers look at that and say, well, hey, they did it — Node, why can't you do this? And again, it goes back to small core. There were years and years and years of resistance to adding these new things into the runtime.

7. Misalignment Between Node and Web Browsers

Short description:

Node and web browsers have a lot of overlap in functionality, including working with streaming data, binary data, and crypto. However, these platforms do things in completely different ways, causing friction for developers. The conflict between Node and web browsers has been ongoing, with each side insisting on doing things their own way. This lack of alignment has led to the creation of Node-specific APIs that are widely used across different platforms.

There's another argument, and this has really been there from day one of my involvement with Node: Node is not a web browser, it shouldn't act like one, it shouldn't have these APIs that exist in the browser. They do different things — and honestly, they're a lot closer than it may appear.

Yeah, Node doesn't have a rendering process, it's not going to go out and parse HTML and render it in a window. There is a lot of stuff that Node does that web browsers don't, and there's a lot of stuff that web browsers do that Node doesn't. So, you know, yes, the statement is factually true, but it ignores a very fundamental concept that Node and web browsers do a lot of the exact same things. They both are used to implement applications on the web. They both need to work with streaming data. They both need to work with binary data. They both need to do crypto. There is a huge amount of overlap in functionality between these environments, and I'm not just talking just Node, you know, this includes Deno, this includes environments like Cloudflare Workers. There's a huge amount of overlap between these platforms. So why are these platforms doing things in completely different ways, you know, even when they're on the areas of overlap? It doesn't make any sense.

That argument has been used throughout Node's history to basically say it's okay for Node to do everything its own way, even if developers are doing those exact same things in browsers. To be fair, web browsers have not really shown that they care much about server and edge developers either. Web browsers will basically get together, decide on an API, and go out and say: this is the standard for this, and all of you developers go use it. And when those of us who are working on Node and on the edge say, wait, that doesn't really work great, or we already have an API for this — why didn't you just use that? — the response typically has been: we don't care what you're doing, this is what browsers are going to do. It's gotten a little better recently, but historically there's been this conflict back and forth between these environments for quite some time. And it really comes down to this idea: yeah, it's all JavaScript, but we're going to do it a different way. Yes, we're both runtimes running on a server somewhere, but we're going to do it the Deno way, or we're going to do it the Node way. It really doesn't make any sense, and it just causes friction for developers. One example of this: the readable-stream module on npm has 105 million downloads per week, with over 3,100 dependents. It works in Node, it works in Deno, it works on Cloudflare Workers and in every major browser. But it's a Node-specific API — and it's used everywhere, in many different applications.

8. Challenges with API Standardization

Short description:

Even if you've never touched Node, you probably use an application that is using the readable-stream module. Every runtime's choice to do its own thing has very real costs and causes very real pain for developers. "We shouldn't pick a winner" is a concept within Node that says we — the Node core developers — shouldn't decide which of the modules out on npm people should use. It's a fundamental problem, and not one we should take lightly. We need to be opinionated. Being opinionated is good. Being opinionated about which APIs are best is something the runtimes themselves need to do. We should absolutely pick a winner when it comes to APIs; if we don't, developers suffer.

Even if you've never touched Node, you probably use an application that is using the readable-stream module. Every runtime's choice to do its own thing has very real costs and causes very real pain for developers. So take the readable-stream module: when we decide to change an API in Node, that module has to change, and that trickles down to everyone else who's using it, even if they are not using Node.

With the readable-stream module, the decisions we make about which other dependencies are required in Node trickle down too. readable-stream uses the Buffer API, which means you can't just pull the readable-stream module into Deno or any other environment by itself — you also have to pull Buffer in. readable-stream is also based on Node's idea of EventEmitter, which means you have to pull EventEmitter into Deno and Workers and browsers as well. So this choice Node made to have this API, and the fact that so many people are using it even if they've never touched Node, drags in a huge amount of additional work, additional pain, and additional effort for developers pretty much everywhere — whether you're using Node or not.

So, there's another idea here: that we shouldn't pick a winner. Backing up, what this means is that Node is a runtime, there's an ecosystem of developers creating stuff on that runtime, and they publish those things to npm. This idea that we shouldn't pick a winner is basically a concept within Node that says we — the Node core developers — shouldn't decide which of these modules out on npm people should use. If there are two different implementations of a web framework, for instance, we shouldn't have an opinion about which one is better, and we shouldn't advantage one over the other. Basically, let the market decide, let the developers decide; all we're doing is providing a platform upon which these things will run.

The idea is fine in concept, except that it just ends up adding friction for developers. And the friction I'm talking about is deciding which one of these you actually should use, which one is going to be maintained long term, which one is actually going to be the safest investment for your application. And how are you protected against making the wrong choice? How do you know you're not going to end up having to rewrite your application, or that things aren't going to stop working, or that you're not going to be stuck on an older version of Node that might have security vulnerabilities because updating would require half your application to be rewritten? It's a fundamental problem, and it's not one that we should take lightly. Alright, so I'm going to say it right off: I'm sorry, this idea is wrong. It's something that we messed up on with Node. It's a philosophy I had bought into, but as time has gone on, I've come to realize that this entire idea is wrong. We need to be opinionated. Being opinionated is good. Being opinionated about which APIs are best is something the runtimes themselves need to do. When there is a common need for something, runtimes should favor a single, consistent, common API while encouraging competition in the implementation of it. We should absolutely pick a winner when it comes to APIs; if we don't, developers suffer. A case study of this is the Node Buffer API.

9. Buffer API Standardization

Short description:

Buffer is a Node-specific API that predates typed arrays. It was the only API for working with raw binary data in JavaScript. Today, Buffer extends Uint8Array, but there are differences in functionality. The Buffer API should be standardized so that decisions are not made solely by Node contributors.

Buffer is a Node-specific API; it predates typed arrays. When Buffer was added, it was the only API for working with raw binary data in JavaScript. Today Buffer extends Uint8Array — kind of. Buffer's slice works differently. Buffer has includes, for searching sub-ranges, while Uint8Array does not. toString works differently between Buffer and Uint8Array. There are probably more applications using Buffer in the world than there are using Uint8Array. The Buffer polyfill alone on npm has 44 million downloads a week, so this is a significant API. But it's entirely driven by Node's decisions. The need to work with binary data is not limited to Node, yet all of the decisions on what happens with that API are made only by Node, impacting millions of developers — including those of you who aren't using Node at all. And that seems wrong. That definitely feels wrong to me. The Buffer API should be standardized. Decisions that impact the API should not be made by Node contributors alone, even though Buffer originated as a Node API.
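The differences being called out look roughly like this:

```js
const buf = Buffer.from('hello');
const u8 = new Uint8Array([104, 101, 108, 108, 111]);

// toString: Buffer decodes the bytes, Uint8Array just joins the numbers.
buf.toString();                     // 'hello'
u8.toString();                      // '104,101,108,108,111'

// slice: Buffer#slice returns a view over the same memory,
// TypedArray#slice returns a copy.
buf.slice(0, 1)[0] = 72;            // mutates buf → 'Hello'
u8.slice(0, 1)[0] = 72;             // u8 is untouched

// includes: Buffer can search for a sub-range, Uint8Array only for one element.
buf.includes(Buffer.from('ell'));   // true
u8.includes(101);                   // true — a single byte only
```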

10. Proposal for API Standardization

Short description:

JavaScript runtimes should not introduce new runtime specific APIs unless absolutely necessary. There should be a conversation between different runtimes to determine if new functionality is needed and how to implement it consistently. Browsers are already doing this through organizations like the W3C, and it's time for Node and other runtimes to catch up and involve browser developers in these conversations.

All right, so I have a modest proposal: JavaScript runtimes should not introduce new runtime-specific APIs unless absolutely necessary. What this means is that any time there is some bit of new functionality that needs to be added, there should be a conversation between Node and Deno and all of these different runtimes: okay, is this something we should do? Is this something developers need? How do we do this in a way that is consistent across all these runtimes? Browsers are basically already doing this through venues like the W3C and the WHATWG. It's well past time for Node, Deno, and others to catch up — and time for browser developers to care more about server and edge developers and participate in these conversations as well.

11. Runtime-Specific APIs and Node vs Web Streams

Short description:

Runtime-specific APIs that have to be polyfilled in other environments are a mistake. Node streams and web streams are both complex APIs. Node streams are widely used and complicated, but still work well. Web streams have a similar level of complexity and perform the same tasks. The decisions made by Node and the WHATWG impact developers worldwide.

It doesn't matter which runtime we're talking about: runtime-specific APIs that have to be polyfilled in other environments are a mistake, right? If developers need that functionality in multiple places, then there should be a standard API. And it should be common.

Node streams versus web streams — here's a great example of this. Node streams predate the WHATWG streams standard. They've been around for years. There are three different versions of Node streams in one implementation. It's one of the oldest, most complicated APIs in Node, and it's also one of the most broadly used across the entire ecosystem. Node streams are an overly complicated mess of code and APIs that are horrible to use, but they still work really, really well.

Web streams looked at the complexity of Node streams and said, hold my beer. It looks like it might be an easier API, but under the covers it is actually a very, very complicated API. Web streams aren't any less complicated than Node streams; they have about the same level of complexity, and they do the same things. An argument could be made that web streams never needed to exist — you could have just done it the way Node did it. But we are where we are today: we have two different streams API models in use in JavaScript. Node streams are faster than web streams, but they both do exactly the same thing. Node controls all the decisions on what changes are made to Node streams, and, like I said, those decisions impact millions of developers around the world, whether they use Node or not. The WHATWG, on the other hand, controls all decisions on changes to web streams — and the WHATWG is not just a single organization, it's an open process anyone can participate in. But the WHATWG explicitly prioritizes the needs of web browsers over all other platforms. That's caused issues.
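For a feel of the two models being compared, here's a tiny sketch (not from the talk) of creating the same trivial source in each:

```js
// Node stream:
import { Readable } from 'node:stream';
const nodeStream = Readable.from(['hello', ' ', 'world']);

// Web stream:
const webStream = new ReadableStream({
  start(controller) {
    for (const chunk of ['hello', ' ', 'world']) controller.enqueue(chunk);
    controller.close();
  },
});
```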

12. Cooperation and Standard Streams API

Short description:

The standard streams API for JavaScript should be web streams, regardless of the popularity of node streams. This is the direction we should be heading.

There needs to be more cooperation. So let's take a step back. What's the Deno streams API? It's web streams. What is the Cloudflare Workers streams API? It's web streams. What is the browser streams API? It's web streams. What should the standard streams API for JavaScript be? No matter how much kicking and screaming Node developers may do about it, it should be web streams. Okay, and async iterators — that's a different story, a different conversation to have. But it doesn't matter that Node streams are so popular. The standard, the one that everyone should be using, is web streams. And that's the direction we should be heading.
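On the async-iterator aside, a rough sketch of consuming each kind of stream ('data.txt' and someReadableStream are placeholders):

```js
// Node streams have been async-iterable for a while:
import { createReadStream } from 'node:fs';
for await (const chunk of createReadStream('data.txt')) {
  console.log(chunk.length);
}

// Web ReadableStreams are async-iterable in some runtimes (Node, Deno),
// but browser support has lagged, so the portable path is a reader:
const reader = someReadableStream.getReader();
while (true) {
  const { value, done } = await reader.read();
  if (done) break;
  console.log(value.length);
}
```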

13. The Deno Standard Library

Short description:

They call it a standard library. It'd be better if it wasn't specific to Deno. Adding standard APIs does not mean changing or removing existing ones. Node should follow Deno's example of having a standard library of common APIs that are not necessarily built into the runtime. All JavaScript runtimes should share a common standard library of APIs that work consistently across multiple platforms.

The Deno standard library — I just have to make a point. They call it a standard library, but it's really just a set of Deno-specific APIs; it's a standard library for Deno itself. Brilliant idea. It'd be better if it wasn't specific to Deno — and if you look through the implementation, there are a number of places where it's very specific to Deno. It's something we can do better, right?

It's something that we can continue to iterate on. One key question here: what about backwards compatibility? Adding standard APIs does not mean changing or removing existing ones. A good example of this is url.parse() and new URL() in Node. Both of these still exist in Node. new URL() is the one you should be using, but url.parse() is not going anywhere. It's still there — it just has a bunch of bugs and some security concerns, and there are lots of reasons you shouldn't use it. But if you do use it, it'll continue to work. Adding these new things does not mean getting rid of the old stuff.
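Side by side, the legacy and the standard URL APIs (a sketch):

```js
import { parse } from 'node:url';

// Legacy, Node-only, known quirks — still works, just not recommended:
const legacy = parse('https://example.com/path?x=1');
legacy.hostname;             // 'example.com'

// WHATWG URL — standard, and available in Node, Deno, Workers, and browsers:
const url = new URL('https://example.com/path?x=1');
url.hostname;                // 'example.com'
url.searchParams.get('x');   // '1'
```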

But Node should follow Deno's example of having a standard library of common APIs that are not necessarily built into the runtime. These might still be things you install through npm, but they would be blessed by Node, and they would implement common APIs that also work in Deno, that also work in edge environments like Workers — that work in any of these runtimes. They'd be based on standard APIs that work consistently across multiple platforms. They should not be specific to Node in any way, and they should not be specific to Deno in any way. All JavaScript runtimes should share a common standard library of APIs that work consistently — APIs that aren't necessarily part of the language, but still provide common functionality on top. What functionality are we talking about? There's quite a bit here, and most of it definitely does not belong in the language itself. But these are things we can work on together to drive into the platform. So what do we need? This is the crux of the modest proposal I'm talking about: a new collaborative open-source effort, preferably driven by contributors to Node, Deno, and other JavaScript runtimes like Cloudflare Workers, to develop a common library of APIs.

QnA

Standard Library and Node Core Contributors

Short description:

Take what Deno has started with their standard library and expand it to a cross-platform, multi-runtime environment. The underlying implementations can be different, but the APIs should be clear, consistent, and common across all environments. The poll results show that a third of respondents are unsure about the APIs being used. Now, let's move to the questions. Are any Node core members part of the W3C, WHATWG, or TC39? Node itself cannot join these organizations, but individual Node core contributors are active in TC39 and the W3C.

So take basically what Deno has started with their standard library, which exists as a separate repo and separately installable pieces. Let's start there, perhaps, and expand it out to something that is truly cross-platform, truly multi-runtime — something that works for Node, works for Deno, works for Workers, and will work for any environment that chooses to support the standard library APIs. The underlying implementations can be different; nobody says these have to be implemented the exact same way. But the APIs themselves should be clear, consistent, and common across all of these different environments.

Anyway, I'm out of time. So, that's my modest proposal. I'm looking forward to conversations around this and seeing where we can go. And I hope you all enjoy the rest of the conference. Bye.

So, let's see the results of the poll. James asked: in your JavaScript applications, do you use polyfills for Node.js or Deno APIs? 41% responded no, 32% responded I don't know, and 25% responded yes. What do you think about this poll, James? Is this what you expected? It's pretty much what I expected — that third who said "I don't know." It's really interesting that a third of the folks just aren't sure which APIs are actually being used, whether there's stuff from Node in there or not, and where those things are coming from. It's very interesting. Nice.

So now, let's move to the questions. People can still ask questions in the Node Talk Q&A in the Discord channel. We have one from Com Frarial: are any Node core members part of the W3C, WHATWG, or TC39? Yeah, absolutely. The one thing I can describe is that Node itself is part of the OpenJS Foundation, which is a certain type of nonprofit organization in the United States. Because of the type of organization it is, Node itself cannot join TC39, and it can't join the W3C as an organization. But individuals can — individuals who work for employers like IBM or Cloudflare, where those companies are involved with these groups, can participate. So we have individual Node core contributors who are very active in TC39 and the W3C. And with the WHATWG, you don't have to be a member of a company or anything — anybody can participate there.

Relevance of Browser APIs and Node Development

Short description:

Browsers expose a large number of standardized APIs, many of which are relevant to server and edge developers. APIs like URL, CompressionStream, DecompressionStream, streams, crypto, WebSocket, File, and Blob are common to multiple platforms. While some web platform APIs may not be relevant, there is significant overlap. Front-end and Node.js development require different mindsets. The speaker is working on implementing the QUIC protocol for Node, which is expected to land halfway through the year.

We've been active in a lot of those conversations. Thanks. Thank you for your answer.

We have another question. Browsers expose a very large number of standardized APIs for users. How many of those APIs are actually relevant to server and edge developers running Node.js, Deno, or edge compute environments like Cloudflare Workers? Quite a few of them — anybody who's been paying attention to Node, Deno, and Workers for a while has seen it. We've seen a lot of APIs like URL, CompressionStream and DecompressionStream — Deno just rolled out DecompressionStream, I think in 1.19 today. There are all of the streams APIs, all of the crypto APIs, the WebSocket API, File and Blob — there's a huge number of these things that are relevant and that represent functionality common to all these platforms. So yes, there are a lot of web platform APIs that are just never going to be relevant — I don't think we're ever going to see media rendering APIs there — but where there is overlap, there are definitely a lot of APIs we can look at.

Thank you. And we have a comment here from JsCars saying you're certainly right — most of the time I need two mindsets: one when I'm working on front-end development and another when I'm working with Node.js. So, thank you very much for your talk. Now, since you work on Node core, can you tell us what the most interesting topic or feature you're working on right now is? For Node, I'm working on trying to get the QUIC protocol landed, so we're going to be working on that quite a bit more over the next couple of months, and hopefully we'll get that landed about halfway through the year. Wow, very exciting. I hope it happens soon. Thank you very much, James. I hope you enjoyed this conference, and I hope the attendees learned a lot too.
