This workshop will walk you through writing a module in TypeScript that can be consumed by users of Deno, Node, and the browser. I will explain how to set up formatting, linting, and testing in Deno, and then how to publish your module to deno.land/x and npm. We'll start out with a quick introduction to what Deno is.
Writing Universal Modules for Deno, Node and the Browser
Transcription
Okay, let me just introduce myself real quick. I'm Luca. I work at Deno Land Inc. as a software engineer. I work on the open source Deno CLI project. So that's what you download when you go to deno.land and follow the install steps. But we also have a cloud compute offering, an edge compute offering similar to Cloudflare Workers, called Deno Deploy. I also work on that. And I also do most of the web standards work for Deno. Deno is very web-first. We like using browser APIs where we can. And because of that, I get to work with all the awesome folks from the specification bodies like Ecma and the WHATWG that specify things like the JavaScript language itself in TC39, which I'm a part of, but also things like the Fetch spec or the Streams spec or the HTML spec, even though that's less relevant to Deno. I'll give a quick introduction of what Deno is, just so we're all on the same page. I'm sure most of you will know already, but just a quick introduction here. Deno is a modern runtime for JavaScript and TypeScript. It's secure by default: there are no permissions to do anything by default. So you can't read anything from the file system, you can't have network access, you don't have environment variable access, you can't run subprocesses, anything like that, unless you explicitly say "I want to allow this". That's like in the browser, where a website can't send you notifications without you opting in. And Deno can run TypeScript out of the box. Deno is a single executable. That makes it super easy to install and deploy. There's no need to install any crazy dependencies or make sure you have the right versions of things. It's just a single binary and everything is included. And it includes a bunch of built-in utilities that we're going to use today during the little thing that we're going to be building here: a linter and a formatter. So that's like ESLint and Prettier, but written in Rust, super fast.
And we also have a test framework, which is very, very simple, but allows us to do very advanced testing. And, like I said, it will cover all your testing needs, which is pretty cool. We have a set of standard modules, deno.land/std, that encapsulates some common functionality. It's guaranteed to work in Deno and we maintain it. Essentially, if you look at the top 100 things on npm, they're either built into the runtime directly or they're in the standard library. So there's a whole bunch of stuff you don't need to trust a whole different bunch of people for; you can just trust us, which is kind of cool. And because we're a very new, modern runtime, we get to follow the web standards for essentially all of our tooling and built-ins. We have the fetch global by default, we support import maps, we use ECMAScript modules, we have web workers, we use streams, the web streams that we're actually going to be using quite a bit today. We use promises for everything, so no callbacks. Yeah, it's pretty nice to work with. Do we have any questions on this? Doesn't look like it. Cool, so let's quickly run over what we're actually going to do today and then we can get right into it. You can follow along, and if you have any questions, or if I'm going too fast, please just stop me, send a message on Discord, and I'll slow down; I'm happy to explain things as we go if I'm not clear enough. So we're going to write a little utility library that does some stream combinators (I'll explain what those are in a second) that's going to work in Deno and in Node and in the browser, all from a single code base. And you're going to be able to use all the Deno tooling, so the testing framework and the linting and the formatting and everything, but you're still going to be able to publish your library to npm and use it in the browser as well. So the library itself, I'll get to that in a moment.
And then on the next slide, let's cover the things that we're actually going to do. So we're going to write the library, we're going to write some unit tests using the test framework, and we're going to run formatting and linting, just to show you how that works. We're going to set up CI on GitHub Actions; that's an important part for any project, making sure that your code doesn't break every time you push. We're going to publish the library to deno.land/x, and we're going to test the code in Node and publish it to npm. And depending on how long each of these takes, we might skip over some of them; we'll see. Okay, so the library itself is going to just have a single stream combinator. And a stream combinator is essentially a way of transforming streams. There are two real types of streams: readable streams and writable streams. One is the data source, one is the data sink. The source is things like the body of a fetch response. So when you get data streamed down to you, that's a data source. And the data sink is things like writing to a file. You're sinking the data into some system, for example, a file. And you can combine these. So you could, for example, read from a readable stream: we make a fetch request, we get the response body, that's a readable stream. And then we pipe that into a writable. So we're essentially going from a readable to a writable. That's the direction you always have data go, from a source to a sink. And what you can do is actually put things in between. You can put a compression stream in between, for example. What that will do is compress data as it flows through the stream. So whenever a chunk arrives from example.com, it's going to compress that with gzip, and it's going to write that to a file with pipeTo. So we're piping something through a transformer, with pipeThrough. And this compression stream, that's a stream combinator.
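To make the source, transformer, and sink flow concrete, here is a small self-contained sketch. It uses a hand-written uppercasing TransformStream as a stand-in for the CompressionStream mentioned above, so it runs anywhere the web streams API is available (Deno, modern Node, browsers); the names `source`, `upperCase`, and `sink` are just illustrative.

```typescript
// A readable stream (the source) that emits two string chunks.
const source = new ReadableStream<string>({
  start(controller) {
    controller.enqueue("hello");
    controller.enqueue("world");
    controller.close();
  },
});

// A stream combinator: transforms each chunk as it flows through.
// (A stand-in for something like CompressionStream.)
const upperCase = new TransformStream<string, string>({
  transform(chunk, controller) {
    controller.enqueue(chunk.toUpperCase());
  },
});

// A writable stream (the sink) that collects the transformed chunks.
const results: string[] = [];
const sink = new WritableStream<string>({
  write(chunk) {
    results.push(chunk);
  },
});

// source -> transformer -> sink, always in that direction.
await source.pipeThrough(upperCase).pipeTo(sink);
console.log(results); // ["HELLO", "WORLD"]
```

The same shape works with `fetch(...).body` as the source and a file writer as the sink.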
And the web actually has a standard for this. It's called TransformStream. A transformer consumes data, transforms it, and then emits new data, the transformed data. And the transform stream actually has a readable and a writable end. The writable end is the thing you give the data that you want to transform, and on the readable end you can read the data that was transformed. So the library that we're going to build is going to have a JSON parse stream, which is going to be very simple. Each chunk is just a string, and you parse that as a JSON message and emit the result on the readable end of the stream. This may sound kind of confusing; you'll see in a moment it's not confusing, it's pretty simple. If we have time, we'll do this line stream as well, but I don't think we're going to get to it. One thing I want to point out is that the code itself is not going to be super duper complicated or anything, and this library is probably not going to be very useful. We're just writing this library so you have an example of how you can approach something like this: all the steps you need, all the scaffolding you need, and all the scaffolding you don't need as well. And then you can write your own libraries that do other things, but use the same sort of setup. So let's actually get started coding. In the requirements of the workshop, I had asked you all to download Deno and Node and set up your LSP, or your IDE, sorry. I set up VS Code because that's the editor that I usually use. So what we'll start out with is I'll show you how exactly to do that, what setup steps you need. Then I'll show you what project structure you need to create; you'll see there's actually not much to it. And then we'll actually write this JSON parse stream implementation. It's going to be a couple of lines. Okay, so this is my editor. I have the Deno extension installed. You can see here the Deno extension is installed.
That's very important. You need to make sure you have the Deno extension installed if you're using VS Code. And I also have Deno installed, version 1.19. That's the most recent version. And then all I need to do to create a Deno project in VS Code is open the command palette. I can do that with F1, or I can press Command Shift P. And then I run this "Deno: Initialize Workspace Configuration" command. What that does is ask me some questions and then tell VS Code that I'm going to use Deno in this workspace. That just makes sure all the linting and everything works. So I'll answer yes on the question about linting and I'll answer no on unstable APIs. We're just going to use stable APIs today. Cool. And it just created this .vscode directory with a settings.json file that has the settings in here for enabling Deno, enabling the linting, and disabling the unstable stuff. Okay. Any questions so far? Can I show that again? Yes, sure. So you want to press Command Shift P or Fn F1, depending on what exactly your bindings are. Then it opens this command palette, and then you type in "deno" and you'll find "Deno: Initialize Workspace Configuration" as one of the options here. You just click on that and then you can answer yes and then no. Cool. Then let's get started with writing actual code. So all we need to do to write code in Deno is to just create a file to actually put the code into. You don't need a package.json file. You don't need any other manifest file. You're just writing code right away. So let's start with something simple to show that Deno is working. Let's just log out Hello World. A question: is this the only way to start a new project, or is there also a CLI command? There's no CLI command, because if you're using just the CLI, you don't actually need to do this. These settings are only for VS Code. Deno will run without this settings.json file.
This is only if you're using VS Code and you want the auto-completion to work correctly; you need to set this up for that, but you don't need it to actually run the code. Okay. So you run deno run mod.ts. Boom. And it outputs Hello World. Does anyone have any questions? You're getting "Deno project initialization failed"? That's unfortunate. Is Deno installed? If you open the terminal and run deno --version, do you get the Deno version? Answering questions again. One moment here. Okay. You might need to restart VS Code real quick here. Are you all on Windows? Okay. What you can do is open your terminal and type in which deno on Mac, and you should get a file path back. Let me know if you don't get that. Okay. And that's on Windows. Okay. Let's see here. How can we do this on Windows? Can you try it in PowerShell instead of in CMD? So type in pwsh and then type in which deno again. Cool. If you have the file path, you just press Command comma or Control comma to open the settings page, type in deno.path, and then you can enter the path here for the CLI. Bjorn, if you can't get a path: when you install Deno, it'll also print out where it installed itself. If you still have that, then that may work too. Okay, cool. Then let's continue. So assuming you have that working now, you can type in, as I said, deno run and it'll run something. Let's replace this with some actual code here. So we were just printing out hello world before. Let's create a new class, which is going to be our JSON parse stream. These transform streams, they're always classes. How do you get to the settings page? You press Command comma on macOS, or Control comma elsewhere. Maybe you click up here, where is it? Somewhere up here. Okay. I'll give you a moment here.
Okay, so what you want to do is create a new class, which is going to be this JSONParseStream class, which we're creating here. And it's going to be a transform stream, so this needs to extend TransformStream. And TransformStream is a built-in: it's just built into Deno, it's built into browsers. So you want to extend that. Because we're using TypeScript, we're going to add some type arguments as well. The input type is going to be string; it's going to consume strings that it will parse. It will output some JavaScript value. We're not sure what, because it's parsing random JSON, and it could be anything. So we're just going to say unknown. We don't know what it is. And this will already work. It won't actually do anything, but it'll compile. And then we want to set up a constructor. The constructor is called every time you create a new one of these JSON parse streams. You create the constructor, and then you call super. And super is essentially the constructor for the transform stream. So this takes all the options that a TransformStream does. Most importantly, this transform function; we'll set that up now. So when a user creates a new JSONParseStream, we're going to create a transform stream internally with this transform function. And the transform function takes a chunk and a controller as arguments. The chunk is a string. For each string that gets piped through the transform stream, this transform function gets called once. The first argument is the chunk, and the second argument is the controller. And with the controller, you can do things like enqueue an item onto the readable end. So when you write something to a transform stream, it will call this transform function. You then do some processing on the chunk, and then you enqueue the result using this enqueue method.
So if we wanted to do nothing, for example, we could just enqueue the incoming chunk right back onto the controller. And then this does nothing. It takes the chunk and puts the chunk right back onto the controller, so it can be read. But we don't actually want to do that. We want to parse it. So we're going to do const object = JSON.parse(chunk). JavaScript has a built-in JSON parser. We're going to give it that input string, this chunk, it's going to parse it, and the result value is this object. And then we're going to enqueue the object onto the controller. Okay, so let's run that. And you'll see it does nothing. We're just creating this class here; we're not actually using it anywhere yet. But this is essentially going to be the entire library code. So let's reiterate what we've actually done here. We've created this JSONParseStream. This has a transform function, which is called for every chunk that gets written to the transform stream. It then parses that chunk using JSON.parse, and it enqueues the result onto the controller, so it can be read. So that's the first step. Okay, cool. Next we want to write some tests for it. Deno has a built-in test runner, deno test. It has a very simple interface, but it allows for very advanced functionality. You can do subtests and before hooks and after hooks and all that kind of stuff with a very, very simple interface. And it's all built into Deno, so you don't need to install anything else to do testing. So let's write a test for this JSONParseStream. The convention is that you put your tests into a file which has the same name as the file that you have your code in, but ending in _test. And it actually needs to end in either _test or .test for deno test to pick it up automatically. So make sure all your test files end in _test or .test.
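Putting the last few paragraphs together, the whole library, plus a quick write/read round trip of the kind we'll build in the tests, looks roughly like this:

```typescript
// The class described above: a TransformStream that JSON-parses each
// incoming string chunk and enqueues the parsed value.
class JSONParseStream extends TransformStream<string, unknown> {
  constructor() {
    super({
      transform(chunk, controller) {
        // Parse the incoming string and enqueue the result onto the
        // readable end of the stream.
        controller.enqueue(JSON.parse(chunk));
      },
    });
  }
}

// Quick demo: write one JSON chunk, read one parsed value.
const stream = new JSONParseStream();
const writer = stream.writable.getWriter();
// Intentionally not awaited: write() only resolves once the chunk
// has been read from the readable end.
writer.write("[1,2,3]");
const reader = stream.readable.getReader();
const { value } = await reader.read();
console.log(value); // [ 1, 2, 3 ]
```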
Okay, so you create this test file, which you register all your tests in, and then you run deno test. There are a few ways to register tests. One way is, as the first argument, to specify the name. That's what we're going to do. This test is going to be called "JSONParseStream test". And then the second argument is the function that actually runs the test. This is going to be an async function; I'll show you why in a moment. But yeah, this creates a test. And we can actually run this test already if we just run deno test. You'll see it runs the test. It doesn't test anything yet, because it's empty. And I can show you that it's actually running: if I log something here and run it again, you'll see it logged that out. Cool. Let's start by importing this JSONParseStream. We have it inside of this mod.ts file; we want to get it into mod_test.ts. So we need to put in an import statement: we import JSONParseStream from "./mod.ts". And then you have your JSONParseStream here. And let's actually create a new JSONParseStream now. So const stream = new JSONParseStream(). What this will do is call the constructor, call the super function with this transform function that we made earlier, and assign that to the stream variable here. We can see that that's working by just logging this out. So we'll log out the stream here, and then you can see in the terminal, it's a new JSONParseStream with a readable and a writable. The readable is a ReadableStream and the writable is a WritableStream. So let's actually write something to the stream. If we remember, each chunk is going to be JSON decoded. So let's write a JSON encoded string to the stream. What we're going to do is get a writer for the stream, so stream.writable.getWriter(). And then we're going to call writer.write().
And here's the chunk that we're going to give it. We're just going to give it an array with three values in it. So this is valid JSON, right? It's an array with three numbers as values. And then the next thing we're going to do is get a reader. We wrote something to the writable end of the stream; we now want to read something from the readable end. So we call stream.readable.getReader(). And then we have this reader object here that we can call read on. So reader.read(). That actually returns a promise, so we're going to need to await it. And then we have a result, and we can log that out as well here, console.log the result. So what this is doing now is creating a stream, getting a writer, writing one chunk to the writable end of that stream, then reading one chunk from the readable end of that stream, and then it just logs out the result. Let's do that. And you can see it works. We're writing in the string, and it parsed it out as the value here. The value is this array with one, two, three in it. So if I want to just get the array itself here, I can do that. And I could do things like .length. TypeScript doesn't like this, so what we're going to do here is say this is a number array. Okay, let's get the length of the array. There we go, that should work. So it has three items in it. The first item is a one, and the second item is a two. Wait, no, index two is the third item; we start counting at zero. Okay, so that's nice and all. But now we're just logging this out. As a test, we want to make sure that this is actually the case. So we'll import an assertion from the standard library. We'll do that here: deno.land/std. std is the standard library; I'll use the latest version. And then there's this testing module which has an asserts module.
And that has all these assert methods. We're just going to use assertEquals. We want to make sure that two values are the same. So we're going to compare the value with what we expect it to be. And we expect it to be one, two, three, as an array. So if we run this again, it passes. If we change this so we expect another value here, for example, this is going to fail the test. See, the test says "values are not equal, JSONParseStream test failed". And you'll see there's this extra value in the expected that wasn't actually present in the actual test result. Okay, run this again. And you can see we have a test now that checks that our JSONParseStream works as expected, which is very nice. We can also run this right from VS Code. Above the Deno.test call, there's a "run test" button. If I click that, it'll run just that test. So if you have a bunch of tests and you want to run just one of them to make sure it's working, you press this "run test" button and it'll run just that one. And you can even debug the test right away. Let's say we want to put a breakpoint here on this JSON parse function. I can press debug and we'll hit the breakpoint, and you can see the chunk is "[1,2,3]". And if I step over this, the object is the array 1, 2, 3. Cool, so debugging also works. Are there any questions, or did anything go wrong? Question on Zoom, let me check here. Should we await the write operation? No, and the reason why is slightly confusing. I'll show you what happens if you do. It's not going to work. It's actually going to exit with this weird error: a promise resolution is still pending, but the event loop has already resolved. It's kind of a weird error. And the reason this happens is that the promise returned from write doesn't complete until you've actually read the value from the readable end. So this promise doesn't resolve until this promise resolves here.
The problem is, if I put in an await here, it's just going to get stuck, because nothing is ever going to read from the readable end, because we're waiting for the write. But the write never completes because we need to read for the write to complete. So that doesn't work; we get into this deadlock, where everything gets locked up. And Deno will print out this error if that happens. So you don't want to await the write, because the write is going to be complete by the time the read is complete. For the read to succeed, you need to have already written something to the writer. Does that make sense? Cool. Okay, awesome. So now we have some tests written for our project, and we have the source code written. Let's do formatting. Formatting is where you run some tool, like Prettier, over your source code to bring it all into one consistent style, so you don't have to deal with adding semicolons in the right places or stuff like that. The formatter will ensure that it's all correct. And Deno has a built-in formatter for JavaScript and TypeScript and JSON and even Markdown. It's very opinionated, to ensure that there's a consistent style across the entire ecosystem. So, for example, you can't turn off semicolons. It always uses semicolons, because we think it's important that everyone uses the same style. That way there are fewer arguments about what style a new project should use. And if everyone uses the same style, everyone will feel right at home in any project. That said, the style is very similar to Prettier, so it's going to look very familiar. If you've ever formatted anything with Prettier, you'll see this is a two-space indent just like Prettier. But it's much faster; it's written in Rust. So we're orders of magnitude faster than Prettier. And you can really feel that, especially on large projects.
Prettier will sometimes take a few seconds to complete formatting, whereas Deno will complete within a couple of hundred milliseconds. And we also have a built-in linter. There's sometimes this confusion that linters and formatters are the same thing. They're not. Linters are for catching logic errors and enforcing style choices like camel case, stuff like that. But they're not there for ensuring that everything ends in a semicolon, or that there are no trailing commas or things like that. That's all handled by the formatter. So deno lint really does not check formatting; it only checks for logic errors. Things like comparing to the NaN value, which in JavaScript is invalid; I'll demo that in a second. You're not allowed to do that. It doesn't work. So let's do that real quick here. Let's run deno fmt. And you can see it checked four files. It didn't actually format anything, because everything's already formatted correctly. But if we remove some semicolons here and mess up the styling a little, then run deno fmt again, it all snaps right back into place where it's meant to be. And this also works with the readme. So I can add some not very well formatted stuff here, run deno fmt, and it formats it all correctly again. And there's also a deno fmt --check, which you can use in CI to check that all the formatting is correct. It won't actually fix anything; it'll just print out an error message that it's not correct. And then you can fail your CI if somebody committed something with invalid formatting. Cool, and then the other one is deno lint. You just run that the same way, deno lint, and it checked these two files, the two TypeScript files here. It didn't find anything, so it didn't print anything out. But I could add something that it would complain about here. So if I compare to NaN: as I said earlier, NaN is not a number. You can't compare with not-a-number; such a comparison always returns false. That's just one of the oddities of JavaScript.
And you can actually already see VS Code is telling me that this is bad: don't do this, I should use the isNaN function. And if I run deno lint, you'll see here, "use isNaN function to compare with NaN". So it gave me a lint error. And it even gave me this link here where I can go to a website to get some more information on the error and how to fix it. Let's remove that, run deno lint again: cool. Okay, let's actually check this into my repository here. Initial commit. And upload that. Are there any questions? Okay, so next step, let's set up a CI pipeline on GitHub Actions. I don't know if any of you have done this before. It's pretty simple. You just create this .github folder, and inside of there you create a workflows folder, and inside of there you create a YAML file. And with this you can declare your workflow. So let's give this a name first. This is going to be our test job. We also need to give it the operating system we want it to run on. We'll run this on Linux, so ubuntu-latest. And then we need to give it some steps to execute. We'll start by checking out the repository; that's actions/checkout. Then we need to install Deno. There's a denoland/setup-deno action for that. We can give it some arguments: deno-version, and here we want to install the latest 1.x Deno version. Then we can run the formatter, so deno fmt, and we'll run it with the --check flag because we don't actually want it to format; we want it to error out if formatting fails. Then we want to run deno lint as well. And for good measure, we'll run deno test as well. So that's our CI script. We check out the repository, we install Deno, we run the formatter, we run the linter, we run the tests, and then we're done. Let's commit that. Are there any questions about this?
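Assembled into a file, the workflow just described might look roughly like this (say, .github/workflows/ci.yml; the exact action versions are assumptions, not from the talk):

```yaml
name: ci

on: [push, pull_request]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      # Check out the repository.
      - uses: actions/checkout@v2
      # Install the latest 1.x Deno release.
      - uses: denoland/setup-deno@v1
        with:
          deno-version: v1.x
      # Fail the build on formatting, lint, or test errors.
      - run: deno fmt --check
      - run: deno lint
      - run: deno test
```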
This is the repository, by the way, here. Here's the commit that adds the CI. Let's wait for it to start here. Cool, it completed. Took four seconds. Deno's written in Rust; it's super fast. Checkout took one second, setup took one second, and then formatting completed, linting completed, testing completed, and we're done. So cool, that works. And if I now change something in here, I can make CI break. Let's remove the semicolon here and commit that. You will see that CI will fail. I'm not signed in, so I can't read all the logs here, so you'll have to trust me on this. Let's give it a couple of seconds. Does anybody have any questions on this? Is it compatible with GitLab CI? Yes, you can also use GitLab. You won't have the setup-deno action step here, but I think GitLab uses Docker images as base images, and there's this repository, denoland/deno_docker, which has Docker images for Deno. You can just use one of those as your base image for your GitLab CI, and then still run deno test and deno fmt. So it's not going to look exactly the same, but it should still work. That's Julian here. That's that. Cool. And let's check that that failed. There we go. Yeah, deno fmt failed. Cool. So that's because of that formatting error. Let's get back to the slides. So we're nearly all the way through now. We're going to skip this part because we don't have as much time. But deno.land/x is a Deno-first module registry where you can publish your source code for Deno. It's sort of like npm, but for Deno. But it's not a blessed registry, so it's not the only registry that Deno supports. You can import your code from anywhere; you can even host it on your own domain. This is just one that we wrote for Deno, which works really well and hooks right into your existing GitHub workflow.
It uses GitHub webhooks to publish. It's really cool. If you want to publish something, you can go to deno.land/x and press this button here, and it'll give you all the information about how to do things. And if you have any questions, hop onto our Discord, which is linked right here, and we can help out. Okay, then we're going to skip that. And then let's get onto the interesting part, which is also making this work in Node. So there are some specialties about Node. Node is not as modern as Deno, and as such, it doesn't have TransformStream, ReadableStream, or WritableStream as globals. Instead, these need to be imported from stream/web. So if I open the Node REPL here and try to get ReadableStream, it'll say ReadableStream isn't defined, whereas if I do that in Deno, you'll see ReadableStream is defined. And in Node, the way I get it is by doing a require for it: const { ReadableStream } = require("stream/web"). And then if I log it out, I get it. So in Node, we need to do this special casing, which is kind of annoying. In the browser, we don't need to do that; the browser just supports ReadableStream, WritableStream, and TransformStream out of the box. The other specialty is that we wrote in TypeScript here, but Node doesn't know how to run TypeScript files, and nor does the browser. So we'll need to emit plain JavaScript files for these. And we also want to publish our package to npm, because that's where most of the Node packages live. So what we're going to use to do that is dnt. dnt is a project by the Deno folks, so by us; we build it and maintain it. It stands for Deno-to-Node transform. And what it does is it takes your TypeScript code that's written for Deno and transpiles it to CommonJS and ESM for distribution on npm. So it'll take all the URL imports and turn them into something that npm understands.
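A tiny sketch of that difference: grab the global if it exists (Deno, browsers, Node 18 and later), and otherwise fall back to the stream/web module (Node 16/17). The dynamic import through a string variable is just to keep the sketch runtime-agnostic; treat the `node:` specifier as an assumption about the runtime.

```typescript
// ReadableStream is a global in Deno and browsers. In Node it lives in
// the "stream/web" module (it only became a global in Node 18), so fall
// back to importing it from there when the global is missing.
const specifier = "node:stream/web";
const RS = (globalThis as Record<string, unknown>).ReadableStream ??
  (await import(specifier)).ReadableStream;
console.log(typeof RS); // "function"
```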
And it'll take all of your ESM imports and exports and turn them into CommonJS imports and exports that Node understands. And it'll automatically replace all the globals that are not available in Node. Streams, for example, are not available as globals in Node. It'll replace them with versions that are imported from either Node itself or from some ponyfills or polyfills. And it'll actually also transpile all of your tests and run them in Node. That way, if you have a nice, big test suite, you can make sure that your code doesn't just run in Deno, but that the transpiled code also works in Node. And that gives you the best of both worlds: you can develop your module with all of the built-in Deno tooling, so the linter and the formatter and the testing framework and the VS Code integration, all out of the box, while still making the module available to your end users that are writing code for Node. Yeah. So dnt, that's this repo over here, denoland/dnt. And it has this very, very large readme with examples of everything it can do. But we don't actually need a bunch of this. We're just going to take the little example at the top here, which is to create a build script. I'll copy this and paste it, and then I'll run you through all of the different steps here and show you how to do it. And I'll actually paste this into the Zoom chat as well; it'd be great if somebody could repost it to Discord so you can view it there. Yeah, so what we're going to do is write some more code. We're going to set up a little build script that will transpile the code for Node, run the tests in Node, and then we'll publish our code to npm. So the first thing we're going to do is create a new file. Let's call it scripts/build.ts. That's the little build script. We import dnt, and the npm directory is where we're going to generate the code to; there's actually currently no npm directory.
So we'll empty that if it already exists. And then we're going to call this build function from dnt, which takes a few arguments. The first argument is the entry point of the module we actually want to transpile, in this case mod.ts; that's this one here. We then want to specify the directory to output to, so that's going to be the npm directory that we emptied earlier. We'll get to the shims part in a moment here. And then this is our package.json. If you want to publish something to npm, you need to have a package.json, so you can specify all the options you want here. We're not going to fully set this up here. License MIT, sure. Let's call this, say, stream-utils. And let's add a description as well: some stream utilities for Deno and Node. We don't have a license file, so we'll remove this step, but we'll copy over the readme file into the output directory as well. OK, so the shims are all the things that we want dnt to automatically shim out for us, things like the Deno global. That obviously doesn't exist in Node, but we can have dnt inject some code that will make it exist in Node. Essentially, we're polyfilling Deno in Node, and that lets you write just for Deno but still have your code run in Node. We don't actually use any of the Deno globals except in the testing, so we're just going to set this to "dev", which means we only want it for things like the tests; we don't want it in the actual production build. And then we want to add some custom shims as well, so we're going to use this custom keyword. We want to shim out the things from stream/web that I showed earlier: TransformStream, ReadableStream, and WritableStream. What this is going to do is, whenever the transpile step encounters any of these globals, it will rewrite them to use the exports from the stream/web module in Node.
And then one more thing: by default, dnt expects that you use the latest Node LTS, so the latest stable build. But we're actually using the stream/web module, which is only properly supported in Node 17. So we're going to have to specify that we want to use the newer version of Node, and we do that by specifying a devDependency. Did I spell that right? Yes. We're going to set @types/node to version 17, so we target Node version 17 here. And then what we can do is deno run scripts/build.ts, and you'll see that this runs. Then it's going to ask me some questions. It's going to ask, do you want to allow environment access to HOME, for example? If I say yes, it's going to ask, do you want to allow environment access to something else? This is what I said earlier: by default, Deno does not allow you to access any environment or network or disk things. You can only do that if you either explicitly opt in using these prompts or if you specify a flag, like --allow-env for environment access. For this script we're going to specify -A. -A means allow everything, because we trust this code, so we're just going to allow it to do whatever it wants. And then it's going to do this transform here. It created the npm directory; it's building; it's type checking; it's emitting some code; and now it ran the tests. So you can see that the tests that we previously ran in Deno are now running in Node as well, and they also pass in Node. And if you actually look at the outputted code here, this is the npm directory that was created. It has a package.json file with all these things in it: here's the version, the name, the description, the license, the devDependencies we specified earlier. And here's the actual outputted code. You can see it looks very similar to the TypeScript code we had earlier, except that all the types are stripped out.
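Putting the steps above together, the build script could look roughly like this (a sketch, not the exact script from the session: the package name is a placeholder, and the shims shape follows dnt's documented options):

```typescript
// scripts/build.ts — sketch of the dnt build described above
import { build, emptyDir } from "https://deno.land/x/dnt/mod.ts";

await emptyDir("./npm"); // clear any previous build output

await build({
  entryPoints: ["./mod.ts"], // the module entry point
  outDir: "./npm",           // where the npm package is generated
  shims: {
    deno: "dev", // shim the Deno global only for tests, not production code
    custom: [
      {
        // rewrite the web stream globals to imports from Node's stream/web
        module: "stream/web",
        globalNames: ["ReadableStream", "WritableStream", "TransformStream"],
      },
    ],
  },
  package: {
    name: "stream-utils",           // placeholder; pick an unused npm name
    version: Deno.args[0],          // version passed on the command line
    description: "Some stream utilities for Deno and Node.",
    license: "MIT",
    devDependencies: {
      "@types/node": "^17",         // stream/web needs a recent Node
    },
  },
  postBuild() {
    // copy the readme into the output directory
    Deno.copyFileSync("README.md", "npm/README.md");
  },
});
```

Running `deno run -A scripts/build.ts 0.1.0` would then transpile the module, run the tests under Node, and leave a publishable package in `npm/`.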
And the TransformStream that we previously got from the global, we're now getting from this shims module, and the shims module just imports it from stream/web. Cool, so that works. Let's add that to a .gitignore, because we don't want to check that in; that's the build output. .gitignore: npm/. Cool, so we have that emitted now. The last step is to actually publish it to npm. So we're going to cd into that folder and then do npm publish. And then I need to type in a one-time passcode. One moment, let me grab that. It's all this newfangled security. While I'm doing that, feel free to already ask some questions; I'm sure there are questions. Oh, OK, this package name seems to already be taken. Let's change the package name to something else. That, for example. Let's run that build again. OK, cd npm and then npm publish again, and I need another one-time passcode. Do I need to sign up for private packages? OK, well, I guess we're changing the name again. You can really tell I don't use Node much anymore; I'm not aware of all these newfangled payment plans in npm. OK, let's try that one more time, and we need another one-time passcode. I hope that showing this four times in a row doesn't immediately compromise my account. There we go, it's published. So now if we go to npm, we'll see it. There we go, it's published; there's the readme, published just a few seconds ago. And if you want to try this out locally, you can put this in your package.json now and try out this JSONParseStream. Are there any questions? Let's check Discord here. Could this be automated with CI/CD? Yes, definitely. So one thing you could do, let's actually just do that real quick here, is say deno run -A scripts/build.ts. We have to put in a version number; let's set that to 1. We can run that all the time. And then we can run npm publish inside of the npm directory, but only if...
I don't remember what the exact thing is. I think it's github.ref or something; you'd have to look this up in your CI documentation. But you can say: only if it's on the main branch, or only if this is a git tag, publish automatically. So yeah, you could definitely do this in CI as well; you'll just have to figure out the exact commands you want to use. As far as I understand, not all features can be polyfilled in Node? Yes, correct. There are some things which are very difficult to polyfill in Node. Some things like fetch, for example, can be polyfilled with undici or node-fetch, and dnt actually supports these right out of the box. So for example, if you want to polyfill, let's find it, timers or prompts or Blob or crypto or DOMException or fetch or WeakRef, all of those are really easy to polyfill. And the Deno namespace is also pretty much entirely polyfilled. I think there's a file here which shows how far along we are with that. Let me find it real quick. Maybe not; I don't know. But yeah, there are some things which can't be polyfilled, but essentially you can polyfill pretty much anything. It's just not going to be as nice, right? Because you need to have all these polyfills. If you're writing for Deno only, that's obviously the nicest, because that's the least amount of dependencies you're adding, and everything just works out of the box. But if you want this to work for Node users as well, then yeah, you'll need to shim things out. And maybe there's a certain function that you can't use because it's not available in Node; that's possible. Then you just need to avoid it or use something else. But there's a bunch of documentation here, and if you have any specific questions, feel free to join our Discord. It's discord.gg/deno, and there's a help channel in there where you can always ask questions. OK, are there any other questions? We're pretty much out of time.
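A rough sketch of what that CI publish step could look like (the script path and version argument follow the walkthrough; the tag condition shown in the comment is the GitHub Actions spelling, so adapt it for your CI system):

```shell
# Sketch of a CI publish job. Assumes Deno and npm are installed,
# and that scripts/build.ts takes the version as its first argument.
deno run -A scripts/build.ts 1.0.0

# Publish the generated npm/ directory. You would typically guard
# this step, e.g. in GitHub Actions with:
#   if: startsWith(github.ref, 'refs/tags/')
cd npm
npm publish
```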
So I could show some other things. But: is there an easy way to add git hooks to, say, run the formatter on commit? I think so. I'm personally not a fan of git hooks, but I'm sure there are ways. Let's see, searching for git hooks Deno... OK, here's this thing. Let's take a look at that. I'm sure there is something; maybe ask this on our Discord and you'll get some better answers. I'm not sure; I've never done it. But yeah, it's definitely possible; how exactly to do it, I'm not quite sure. OK, I'll show you real quick how to use this in the browser, because that's part of the title of the talk, or the webinar here. In the browser, you can't use the .ts file; you'll need to use a .js file. But that's fine. We can use the deno bundle subcommand. What deno bundle does is take all of your TypeScript code written for Deno and turn it into a single JavaScript file. So we'll bundle this mod.ts file. Go back to the main directory: deno bundle mod.ts mod.js produces a single mod.js file. This just stripped out all of the types, and this you can now use in the browser. So if I have a test HTML file here with a script tag of type module, and I import JSONParseStream from mod.js and log that out, and let me serve that and open the console, you can see that works. If I create a new one, new JSONParseStream(), it'll create a new JSONParseStream. OK, I think that is all I have for today. Oh yes, you can also import from CDNs; let me show that real quick as well. You can also use JavaScript files in Deno, by the way, not just TypeScript files, if you like JavaScript more. If you want to import React, for example, one of these CDNs is esm.sh: https://esm.sh/react, for example. This will import React, and you can have all the React stuff if you want to use React. You can import a bunch of stuff from npm using some CDN.
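The test page the walkthrough describes can be as small as this (a sketch: the file name is arbitrary, JSONParseStream is the export name used in the session, and mod.js is assumed to sit next to the page):

```html
<!-- test.html — load the bundled module in the browser -->
<script type="module">
  // mod.js is the output of `deno bundle mod.ts mod.js`
  import { JSONParseStream } from "./mod.js";

  console.log(JSONParseStream);       // logs the class itself
  console.log(new JSONParseStream()); // logs a fresh instance
</script>
```

Serve the directory with any static file server and open the browser console to see the output.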
So there's esm.sh; there's skypack.dev, which is another one; and there's jspm.dev, another CDN. Usually you can import most things through CDNs if you want to. You can also specify versions in there if you want, like a specific version of React, for example.
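A pinned import through esm.sh looks something like this (the version number is just an example; this form works in Deno and in browser module scripts, not in plain Node):

```typescript
// Pull a specific React version from the esm.sh CDN.
import React from "https://esm.sh/react@18.2.0";

console.log(React.version);
```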