Towards a Standard Library for JavaScript Runtimes


You can check the slides for James' talk here.



Transcription


Hello Node Congress. I am James Snell, or jasnell on GitHub and Twitter, and I am with Cloudflare. I'm also on the Node Technical Steering Committee. I've been contributing to Node for, oh wow, almost seven years now. I started back in 2015, right when Node.js and io.js were having their issues and got back together. So I've been around for a while. And in that time, some themes have developed. We have all the Node APIs that are out there, and we've added a bunch of new features to Node. We added HTTP/2 support and new URL parsing. Recently, we added support for web streams, the ReadableStream and WritableStream APIs. I think two years ago, we implemented the WebCrypto API. So a lot has happened there. And over that time, we've had this ongoing debate, this ongoing conversation, about just how standardized Node's APIs should be, or how much of the web platform standards Node should be paying attention to. Since that time, other runtimes have come up. We have Deno, which is an absolutely fantastic platform, but it has its own set of APIs. We have Cloudflare Workers, one of the runtimes I'm working on right now, and it has its own set of APIs and things that it does. And there are other environments: if you look at Fastly, or at some of the IoT devices, there are a bunch of places where JavaScript is being used now. So at some point it's good to step back and think: what kind of APIs should we be implementing? Well, I'm here after seven years of doing this with a bit of a modest proposal. We need a standard library of APIs for JavaScript runtimes. I'm going to talk a little bit about why I'm thinking that and what I'd like to see. So let's kick it off. Here's a puzzle. Base64 encoding is something that is very common to many applications. We see it everywhere. In Node, we've always had Buffer.from('hello').toString('base64').
We can take a base64-encoded string and get a buffer back out of it. But what is the right way to base64-encode data from JavaScript? People who have been developing for the browser for the longest time are probably familiar with the btoa function. Is it the right way? It's the only standard we have for base64 encoding. But honestly, it kind of sucks. It doesn't handle everything that you need; it just kind of works okay. And we really can't change it because of backwards compatibility and all that kind of stuff. So what is the right way to do base64 encoding in JavaScript? Is it Node's API, Buffer.from? Is it Deno's API, where you import an encode function from their standard library? Or is the answer something out on npm, where you have to go out and npm install some module, once you find the right one, and hopefully find one that is well maintained, with at least one active contributor who will keep it up to date and ensure that it keeps working with different Node versions or Deno versions or other runtimes? And once you have the basic mechanism, which API here is the correct one? Unfortunately, the answer is all of them. All of them are the correct way. But having so many different ways to do something that is so fundamental is a problem. It adds friction and difficulty for developers who are writing code that runs on the web. Amazingly, the fourth option, the base64-js module, has over 30 million downloads per week from npm and over 1,300 dependents. A lot of those are Node applications, despite the fact that Node has this built in. But a large number of those dependents are browser applications, where all they have built in is btoa. And btoa itself is lacking quite a bit of functionality. So the answer is: all of these are the right way, but none of them are, because there's always a different way of doing it.
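To make the fragmentation concrete, here is a sketch of the same operation in the different environments. Only the Buffer and btoa versions are runnable in Node; the Deno import shown in the comment is illustrative and its path may differ between std versions.

```javascript
// Base64-encoding the string "hello" -- three ways, depending on runtime.

// Node.js: the Buffer API, built in since the beginning.
const viaBuffer = Buffer.from('hello').toString('base64');

// Browsers (and Node 16+, which added the global): btoa(), which only
// accepts "binary strings" and throws on characters outside Latin-1.
const viaBtoa = btoa('hello');

// Deno: an import from its standard library (illustrative path):
//   import { encode } from "https://deno.land/std/encoding/base64.ts";
//   const viaDeno = encode(new TextEncoder().encode("hello"));

console.log(viaBuffer, viaBtoa); // both "aGVsbG8="
```

Three spellings, one operation; that is the friction the talk is describing.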
What if we wanted base64url encoding, which is a variant of base64? Well, it turns out that even going from base64 to base64url, there are differences in the APIs. Not all the options work; sometimes we have to change a few things. Fortunately, both Deno and Node have this built into their core libraries, and at least those work consistently with each other. But you still end up with a lot of options. So it comes down to a very important question: why does something so common and so widely used in JavaScript have so many different ways of doing it? And why should it matter which JavaScript runtime I'm using? Why should I, as a developer writing JavaScript, need to know whether this code is going to run in Node or in Deno, and which of these base64 encoding APIs I need to use? Oh wait, now my company's priorities have shifted. They're investing in edge computing, so now it's going over to Cloudflare Workers. Which of these APIs will work there? How do I get it to work there? Do I need to change all of my code that does base64 encoding to do something brand new? It's ridiculous that something so common has to be such a difficult, time-consuming conversation just to base64 a little bit of text. What about hex encoding? What are the APIs for that? Node does it one way, consistent with the way it does base64. Deno does it its own way, consistent with its APIs. There's nothing you can use in the browser unless you find some random module on npm and, again, hope that it's being maintained. But how often do we come across hex encoding? It's everywhere. We see it all over the place. So again, why is it so difficult to find a single API for doing hex encoding in JavaScript? It doesn't make any sense. What if we look at other functionality, like comparing subsets of two different binary ranges? Say you want to see whether one binary sequence is embedded in another binary sequence.
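In Node, both variants hang off the same Buffer API. The sketch below shows base64url and hex side by side (base64url support assumes Node 15.7 or later); the Deno equivalents in the comments are illustrative std-library imports, not runnable here.

```javascript
const data = Buffer.from('hello world');

// base64url: like base64 but URL-safe ('-' and '_' instead of '+' and '/',
// and no '=' padding). Built into Buffer since Node 15.7.
const b64url = data.toString('base64url');

// hex: same API shape, different encoding name.
const hex = data.toString('hex');

// Deno equivalents (illustrative, from its standard library):
//   import { encode } from "https://deno.land/std/encoding/base64url.ts";
//   import { encode as encodeHex } from "https://deno.land/std/encoding/hex.ts";

console.log(b64url); // "aGVsbG8gd29ybGQ"
console.log(hex);    // "68656c6c6f20776f726c64"
```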
What's the API for doing this? Is there a standard one? Unfortunately, typed arrays in JavaScript don't do this. They don't support comparing subranges. All you can do is look at an individual item in one typed array and compare it to an individual item in another typed array. That's all you can do. And unfortunately, Node and Deno have two completely different ways of doing this. It just doesn't make any sense. What if we're accumulating stream data into a simple string? We want to stream some data in and build a string from it. How do we do that on the different platforms? Again, different APIs. It works differently, and it doesn't need to. JavaScript runtimes have historically been very anti-standard-library. This thought really stems from an idea we'll explore in a little bit: the small core idea. The underlying principle, the theory behind it, is: just let the ecosystem, all of you developers, do whatever you want to do. Pick whatever dependency works for you. Make whatever choice fits best for your particular situation. And keep the runtimes small. Keep them unopinionated. Just implement the bare minimum on top of what the language provides. Unfortunately, that doesn't make life any easier for the developer. It makes things quite a bit harder. You have to find the implementations of these things. You have to select which of several you're going to use, make sure it's being maintained, and make sure it doesn't introduce any security risks. It is open source software; there are examples of modules out on npm being hijacked and having security vulnerabilities injected into them. There's the risk of a module not being kept up to date with a particular version of Node or Deno or Workers or whatever, and then something breaks. Fairly recently there were some modules in server-side rendering libraries that were built on an old model of native add-on APIs for Node.
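The gap is easy to demonstrate: Uint8Array offers no subsequence search, so portable code ends up hand-rolling the loop, while Node's Buffer has it built in. A minimal sketch (the helper function is my own, not from any library):

```javascript
// Naive subsequence search over plain typed arrays -- the kind of helper
// every cross-runtime codebase ends up writing for itself.
function includesSubsequence(haystack, needle) {
  outer: for (let i = 0; i + needle.length <= haystack.length; i++) {
    for (let j = 0; j < needle.length; j++) {
      if (haystack[i + j] !== needle[j]) continue outer;
    }
    return true; // every byte of the needle matched at offset i
  }
  return false;
}

const hay = new Uint8Array([10, 20, 30, 40, 50]);
const sub = new Uint8Array([30, 40]);

console.log(includesSubsequence(hay, sub)); // true

// Node's Buffer has this built in; typed arrays in other runtimes do not.
console.log(Buffer.from(hay).includes(Buffer.from(sub))); // true
```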
Well, that old model is no longer supported. From Node 16 on, they don't work as well, or at all. When you start picking up these APIs, how do you know they're going to continue to work as the runtimes evolve? This idea of small core, "we're going to let the developers decide," is basically just a different way of saying: we're going to make this the developer's problem. We're not going to think about it; we're going to force you to think about it. This idea of no standard library for JavaScript has actually caused quite a bit of friction. Now, granted, it has allowed a lot of evolution, faster cycles, a lot of experimentation, and really fast growth. But it fundamentally ends up adding headaches for most developers. Small core, again, basically means that the APIs your runtime provides are as minimal as possible. Essentially, the idea is: only do what JavaScript does, plus a little bit more. Then for whatever is missing, you go out to npm or some other registry, some website, some GitHub repo, wherever you can find it, and find that missing functionality somewhere else. This allows runtimes like Node to be smaller. It gives us fewer things to maintain over time. But again, it just kicks the can down the road and says: it's not our problem, it's your problem if you need some bit of functionality that's not there. The idea of small core has been abused and misinterpreted to limit innovation in JavaScript runtimes. Whenever a standard API would come out, with Node for instance, there was always this argument: nah, that can be implemented in the ecosystem; we don't need it here. An example of that is the WebCrypto API. Node has a crypto API; it's had it forever. Then the W3C developed the WebCrypto API, which does a lot of the exact same things, but in a completely different way than Node's API.
Developers started using it and started asking: hey, Node, can you implement WebCrypto? The response was: no, small core. If you want that, you go off and implement it yourself. Here are native add-ons, here are ways of doing it, but we're not going to do that. It ended up limiting compatibility between Node and browser runtimes and other runtimes. We are still fighting interoperability issues between these things today, now that we have WebCrypto in Node, but it was that resistance, that constant uphill battle, trying to get these things into Node to evolve the platform. Then you see something like Deno come along and say: you know what? We're going all in on web standards. We're doing web streams. We're doing WebCrypto right from the start. Developers look at that and say: well, hey, they did it. Node, why can't you? Again, it goes back to small core. There were years and years of resistance to adding these new things into the runtime. There's another argument, really from day one of my involvement with Node, and I've been hearing it ever since: Node is not a web browser. It shouldn't act like one. It shouldn't have these APIs that exist in the browser. They do different things. Honestly, they're a lot closer than it may appear. Yes, Node doesn't have a rendering process. It's not going to parse HTML and render it in a window. There is a lot of stuff Node does that web browsers don't, and a lot of stuff web browsers do that Node doesn't. The statement is factually true, but it ignores a very fundamental point: Node and web browsers do a lot of the exact same things. They are both used to implement applications on the web. They both need to work with streaming data. They both need to work with binary data. They both need to do crypto. There is a huge amount of overlap in functionality between these environments.
And I'm not just talking about Node. This includes Deno. This includes environments like Cloudflare Workers. There is a huge amount of overlap between these platforms. So why are they doing things in completely different ways, even in the areas of overlap? It doesn't make any sense. The argument has been used throughout Node's history to basically say it's okay for Node to do everything its own way, even if developers are doing those exact same things in browsers. To be fair, web browsers have not really shown that they care much about server and edge developers either. Browsers will basically get together, decide on an API, and go out and say: this is the standard, all of you developers go off and use that. When those of us working on Node and the edge say, wait, that doesn't really work great, or we already have an API for this, why didn't you just use that, the response has typically been: we don't care what you're doing; this is what browsers are going to do. It's gotten a little better recently, but historically there has been this conflict back and forth between these environments for quite some time. It really comes down to: yes, we're all JavaScript, but we're going to do it a different way. Yes, we're both runtimes running on a server somewhere, but we're going to do it the Deno way or the Node way. It really doesn't make any sense; it just causes friction for developers. One example of this: the readable-stream module on npm has 105 million downloads per week and over 3,100 dependents. It works in Node, it works in Deno, it works on Cloudflare Workers and in every major browser. It's a Node-specific API, but it's used everywhere, in many different applications. Even if you've never touched Node, you've probably used an application that is using the readable-stream module.
Every runtime's choice to do its own thing has a very real cost and causes very real pain for developers. Take the readable-stream module: when we decide to change an API in Node, that module has to change, and that trickles down to everyone else who's using it, even if they are not using Node. The decisions we make about what other dependencies are required in Node trickle down too. readable-stream uses the Buffer API, which means you can't just pull readable-stream into Deno or any other environment; you also have to pull in Buffer. readable-stream is also based on Node's idea of EventEmitter, which means you have to pull EventEmitter into Deno and Workers and browsers. So this choice Node made to have this API, and the fact that so many people are using it even if they've never touched Node, drags a huge amount of additional work and pain and effort onto developers pretty much everywhere, whether they're using Node or not. So there's another idea here: that we shouldn't pick a winner. Backing up, what this means is that Node is a runtime, and there's an ecosystem of developers creating stuff on that runtime and publishing it to npm. "We shouldn't pick a winner" is a concept within Node that says we, the Node core developers, shouldn't decide which of these modules on npm people should use. If there are two different implementations of a web framework, for instance, we shouldn't have an opinion about which one is better, or advantage one over the other. Basically, just let the market decide, let the developers decide; all we're doing is providing a platform on which these things run. The idea is fine in concept, except that it just ends up adding friction for developers.
And the friction I'm talking about is deciding which of these you actually should use, which one is going to be maintained long term, which one is the safest investment for your application. And how are you protected against making the wrong choice? How do you know you're not going to end up having to rewrite your application, or that things won't stop working, or that you won't be stuck on an older version of Node with security vulnerabilities because upgrading would require half your application to be rewritten? It's a fundamental problem, and not one we should take lightly. All right, so I'm going to say it right off: I'm sorry, this idea is wrong. It's something we messed up on in Node. It's a philosophy I had bought into, but as time has gone on, I've come to realize the entire idea is wrong. We need to be opinionated. Being opinionated is good, and being opinionated about which APIs are best is something the runtimes themselves need to do. When there is a common need for something, runtimes should favor a single consistent common API while encouraging competition in the implementation of it. We should absolutely pick a winner when it comes to APIs; if we don't, developers suffer. A case study of this is the Node Buffer API. Buffer is a Node-specific API that predates typed arrays. When Buffer was added, it was the only API for working with raw binary data in JavaScript. Today, Buffer extends Uint8Array, kind of. Buffer's slice works differently. Buffer has includes, which searches subranges, while Uint8Array does not. toString works differently between Buffer and Uint8Array. And there are probably more applications using Buffer in the world than there are using Uint8Array. The Buffer polyfill alone on npm has 44 million downloads a week. So this is a significant API, but it's entirely driven by Node's decisions.
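Those differences are observable in a few lines. This sketch (runnable in Node) shows slice aliasing memory on Buffer but copying on Uint8Array, includes matching subsequences only on Buffer, and toString decoding bytes on Buffer while Uint8Array just joins numbers:

```javascript
const buf = Buffer.from([1, 2, 3]);
const u8 = Uint8Array.from([1, 2, 3]);

// slice: Buffer.prototype.slice returns a *view* over the same memory
// (which is why Node now recommends subarray instead), while
// Uint8Array.prototype.slice returns a copy.
buf.slice(0, 1)[0] = 99;
u8.slice(0, 1)[0] = 99;
console.log(buf[0], u8[0]); // 99 1 -- only the Buffer was mutated

// includes: Buffer searches for whole subsequences; Uint8Array.includes
// can only match a single element.
console.log(buf.includes(Buffer.from([2, 3]))); // true

// toString: Buffer decodes the bytes as text (utf8 by default);
// Uint8Array inherits Array-style "join the numbers" behavior.
console.log(Buffer.from('hi').toString());           // "hi"
console.log(Uint8Array.from([104, 105]).toString()); // "104,105"
```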
The need to work with binary data is not limited to Node, right? But all the decisions about that API are made only by Node, impacting millions of developers, including those of you who aren't using Node at all. That seems wrong. It definitely feels wrong to me. The Buffer API should be standardized. Decisions that impact the API should not be made by Node contributors alone, even though Buffer originated as a Node API. All right. So I have a modest proposal: JavaScript runtimes should not introduce new runtime-specific APIs unless absolutely necessary. What this means is that any time some bit of new functionality needs to be added, there should be a conversation between Node and Deno and all these different runtimes: okay, is this something we should do? Is this something developers need? How do we do this in a way that's consistent across all these runtimes? Browsers are basically already doing this through venues like the W3C and the WHATWG. It's well past time for Node, Deno, and others to catch up, and time for browser developers to care more about server and edge developers and participate in these conversations as well. It doesn't matter which runtime we're talking about: runtime-specific APIs that have to be polyfilled in other environments are a mistake. If developers need this functionality in multiple places, then there should be a standard API, and it should be common. Node streams versus web streams is a great example of this. Node streams predate the WHATWG streams standard; they've been around for years. There are three different versions of Node streams in one implementation. It's one of the oldest, most complicated APIs in Node, and also one of the most broadly used across the entire ecosystem.
Node streams are an overly complicated mess of code and APIs that are horrible to use, but they still work really, really well. Web streams looked at the complexity of Node streams and said, hold my beer. It looks like it might be an easier API, but under the covers it is actually very, very complicated. Web streams aren't any less complicated than Node streams. They have about the same level of complexity, and they do the same things. An argument could be made that web streams never needed to exist; you could have just done it the way Node did. But we are where we are today: we have two different streams API models in use in JavaScript. Node streams are faster than web streams, but they both do exactly the same thing. Node controls all the decisions about what changes are made to Node streams, and again, like I said, those decisions impact millions of developers around the world, whether they use Node or not. The WHATWG, on the other hand, controls all decisions about changes to web streams. And the WHATWG is not just a single organization; it's an open process anyone can participate in. But the WHATWG explicitly prioritizes the needs of web browsers over all other platforms, and that has caused issues. There needs to be more cooperation. So let's take a step back. What's the Deno streams API? Web streams. What is the Cloudflare Workers streams API? Web streams. What is the browser streams API? Web streams. So what should the standard streams API for JavaScript be? No matter how much kicking and screaming Node developers may do about it, it should be web streams. And async iterators, but that's a different conversation to have. It doesn't matter that Node streams are so popular. The standard, the one everyone should be using, is web streams, and that's the direction we should be heading. Now, the Deno standard library.
I just have to make a point: they call it a standard library, but it's really a set of Deno-specific APIs. They call it a standard library for Deno itself. Brilliant idea. It would be better if it weren't specific to Deno; if you look through the implementation, there are a number of places where it's very specific to Deno. It's something we can do better, something we can continue to iterate on. But one key question here: what about backwards compatibility? Adding standard APIs does not mean changing or removing existing ones. A good example of this is url.parse and new URL in Node. Both still exist in Node. new URL is the one you should be using, but url.parse is not going anywhere. It's still there. It just has a bunch of bugs and some security concerns, and there are lots of reasons you shouldn't use it. But if you do use it, it will continue to work. So adding these new things does not mean getting rid of the old stuff. Node should follow Deno's example of having a standard library of common APIs that are not necessarily built into the runtime. These might still be things you install through npm, but they are blessed by Node, and they implement common APIs that also work in Deno, that also work in edge environments like Workers, that work in any of these runtimes. They're based on standard APIs that work consistently across multiple platforms. They should not be specific to Node in any way, and they should not be specific to Deno in any way. All JavaScript runtimes should share a common standard library of APIs that work consistently, that aren't necessarily part of the language, but still provide common functionality on top. What functionality are we talking about? There's quite a bit here, and most of it definitely does not belong in the language itself. But these are things we can work on together to drive in the platform. All right.
So, what do we need? This is the crux of the modest proposal I'm talking about: a new collaborative open source effort, preferably driven by contributors to Node, Deno, and other JavaScript runtimes like Cloudflare Workers, to develop a common library of APIs. Take what Deno has started with their standard library, which exists as a separate repo in separately installable pieces; perhaps start there. Expand that into something truly cross-platform, truly multi-runtime, that works for Node, for Deno, for Workers, and for any environment that chooses to support those standard library APIs. The underlying implementations can be different; nobody says these have to be implemented the exact same way. But the APIs themselves should be clear, consistent, and common across all of these environments. Anyway, I'm out of time. That's my modest proposal. I'm looking forward to conversations around this and seeing where we can go. I hope you all enjoy the rest of the conference.

So, let's see the results of the poll. James asked: does your JavaScript application use polyfills for Node.js or Deno APIs? 41% responded no, 32% responded I don't know, and 25% responded yes. What do you think about this poll, James? Is this what you expected?

It's pretty much what I expected. That 32%, the "I don't know," is really interesting: about a third of the folks just aren't sure which APIs are actually being used, whether there's stuff from Node in there or not, and where those things are coming from. Very interesting.

Nice. So now let's move to the questions. People can still ask questions in the Node Talk Q&A in the Discord channel. We have one from Con Frarial: are any Node core members part of the W3C, WHATWG, or TC39?

Yeah, absolutely.
So, the one thing I've got to describe is that Node itself is part of the OpenJS Foundation, which is a certain type of nonprofit organization in the United States. Because of the type of organization it is, Node itself cannot join TC39, and it can't join the W3C as an organization. But individuals can. Individuals who work for employers like IBM or Cloudflare, where those companies might be involved with these groups, can participate. So we have individual Node core contributors who are very active in TC39 and the W3C. And with the WHATWG, you don't have to be a member company or anything; anybody can participate. We've been active in a lot of those conversations.

Thanks. Thank you for your answer. We have another question: browsers expose a very large number of standardized APIs for users. How many of those APIs are actually relevant to server and edge developers running Node.js, Deno, or edge compute environments like Cloudflare Workers?

Quite a few of them. Anybody who's been paying attention to Node for a while, and to Deno and Workers, has seen a lot of APIs like URL, CompressionStream, and DecompressionStream; Deno just rolled out DecompressionStream, I think in 1.19, today. There are all the streams APIs, all the crypto APIs, the WebSocket API, File and Blob. There's a huge number of these things that are relevant, that represent functionality common to all these platforms. Yes, there are a lot of web platform APIs that are just never going to be relevant; I don't think we're ever going to see media rendering APIs there. But where there is overlap, there are definitely a lot of APIs we can look at.

Thank you. We have a comment here from JS Cars saying that he's totally right: "Most of the time I need two mindsets, one when I'm working on front-end development and another when I'm working with Node.js." So, thank you very much for your talk.
Now that you work on Node core, can you tell us the most interesting topic or feature you're working on right now?

For Node, we're working on trying to get the QUIC protocol landed. We're going to be working on that quite a bit more over the next couple of months, and hopefully we'll get it landed by about halfway through the year.

Wow, very exciting. I hope it happens soon. Thank you very much, James. I hope you enjoy the conference, and I hope the attendees learned a lot too. See you.

Thanks for having me. Bye.
34 min
17 Feb, 2022
