This workshop will walk you through writing a module in TypeScript that can be consumed by users of Deno, Node, and the browser. I will explain how to set up formatting, linting and testing in Deno, and then how to publish your module to deno.land/x and npm. We'll start out with a quick introduction to what Deno is.
Writing Universal Modules for Deno, Node and the Browser
AI Generated Video Summary
Luca, a software engineer at Deno Land Inc., works on the open source Deno CLI project and the cloud compute offering called Deno Deploy. They contribute to web standards work for Deno. The workshop covers writing a utility library that works in Deno, Node, and the browser, utilizing Deno's tooling and enabling publishing to npm. It also includes topics like setting up Deno in VS Code and PowerShell, creating a JSON parse stream, testing, formatting, setting up CI pipelines, making code work in Node, transpiling, and publishing to npm. The workshop explores automation with CI/CD and Git hooks, using deno bundle for browser compatibility, and importing from CDNs.
1. Introduction to Luca and Deno
I'm Luca, a software engineer at Deno Land Inc. I work on the open source Deno CLI project and the cloud compute offering called Deno Deploy. I also contribute to web standards work for Deno, collaborating with specification bodies like ECMA and the WHATWG.
2. Introduction to Deno and Workshop Agenda
Deno can run TypeScript out of the box. Deno is a single executable. That makes it super easy to install and deploy. There's no need to install any crazy dependencies or make sure you have the right versions of things. It's just a single binary and everything is included. It includes a bunch of built-in utilities that we're gonna use today during the little thing that we're gonna build here: a linter and a formatter. That's like ESLint and Prettier, but written in Rust, super fast. We also have a test framework, which is very, very simple, but allows us to do very advanced testing, and will essentially cover all your testing needs, which is pretty cool. We have a set of standard modules, deno.land/std, that encapsulates some common functionality. It's guaranteed to work in Deno, and we maintain that. And, yeah, essentially, if you look at the top 100 things on npm, they're either built into the runtime directly, or they're in the standard library. So, yeah, there's a whole bunch of stuff you don't need to trust a whole different bunch of people for, but you can just trust us, which is kind of cool. And because we're a very new, modern runtime, we get to follow the web standards for essentially all of our tooling and built-ins. We have the fetch global by default. We support import maps. We use ECMAScript modules. We have web workers. We use streams, like the web streams that we're going to actually be using quite a bit today. We use promises for everything, so no callbacks. Yeah, it's pretty nice to work with.
Do we have any questions on this? It doesn't look like it. Cool. So let's quickly run over what we're actually going to do today, and then we can get right into it. And you guys can follow along. And if you have any questions, or I'm going too fast, please just stop me. Just send me a message on Discord, and I'll slow down. And I'm happy to explain things as we go if I'm not clear enough. So we're going to write a little utility library that does some stream combinators. I'll explain what those are in a second. That's going to work in Deno, and in Node, and in the browser, all from a single code base. And you're going to be able to use all the Deno tooling, so the testing framework, and the linting, and the formatting, and everything, but you're still going to be able to publish your library to npm and use it in the browser as well. So the library itself, I'll get to that in a moment. And then on the next slide, let's cover the things that we're actually going to do. So we're going to write the library, we're going to write some unit tests using the test framework. We're going to run formatting and linting, just to show you how that works. We're going to set up CI on GitHub Actions. That's an important part for any project, right, making sure that your code doesn't break every time you push.
3. Stream Combinators and JSON Parse Stream
We're going to publish the library to deno.land/x, test the code in Node, and publish it to npm. The library will have a single stream combinator, which transforms streams. It will include a JSON parse stream that converts chunks into JSON messages. The code itself won't be complicated, and the library may not be very useful. We'll start by setting up Deno and Node, and then create the project structure. Finally, we'll write the JSON parse stream implementation.
We're going to publish the library to deno.land/x, and we're going to test the code in Node and publish it to npm. And depending on how long each of these is going to take, we might skip over some of them. We'll see.
Okay. So the library itself is going to just have a single stream combinator. A stream combinator is essentially a way of transforming streams. So there's two real types of streams, readable streams and writable streams. One is the data source, one is the data sink. The source is things like the body from a fetch request. When you get a data stream down to you, that's a data source. And the data sink is things like when you want to write to a file, so you're sinking the data into some system, for example, a file. And you can combine these. So you could, for example, read from a readable stream. Here we make a fetch request. We get the response body. That's a readable stream. And then we pipe that into a writable. So we're essentially going from a readable to a writable. That's the direction you always have data go, from a source to a sink. And what you can actually do is put things in between that. So you can put a compression stream in between, for example, and what that will do is compress data as it flows through the stream. So whenever a chunk arrives from example.com, it's going to compress that with gzip, and it's going to write that to a file with this pipeTo. So we're piping something through the transformer. And this compression stream, that's the stream combinator. And the web actually has a standard for this; it's called a transform stream. A transformer consumes data, it then transforms it and emits the transformed data. And the transform stream actually has a readable and a writable end. The writable end is the one you give data that you want to transform, and from the readable end you can read the data that was transformed. So the library that we're going to build is going to have a JSON parse stream, which is going to be very simple. Each chunk is just a string. You parse that as a JSON message and emit that as a result on the readable end of the stream.
And this may sound kind of confusing, but you'll see in a moment that it's pretty simple. And if we have time we'll do this line stream as well, but I don't think we're going to get to it.
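The piping described above can be sketched with only the standard web streams globals. This is a self-contained toy, not the workshop's code: the compression stream is swapped for an upper-casing transform, and the file sink for an in-memory collector.

```typescript
// A minimal stream-combinator sketch: a ReadableStream source piped
// through a TransformStream (the combinator) into a WritableStream sink.

// Source: a readable stream that emits two string chunks.
const source = new ReadableStream<string>({
  start(controller) {
    controller.enqueue("hello");
    controller.enqueue("world");
    controller.close();
  },
});

// Combinator: a transform stream that upper-cases each chunk.
const upperCase = new TransformStream<string, string>({
  transform(chunk, controller) {
    controller.enqueue(chunk.toUpperCase());
  },
});

// Sink: a writable stream that collects the transformed chunks.
const collected: string[] = [];
const sink = new WritableStream<string>({
  write(chunk) {
    collected.push(chunk);
  },
});

// Pipe source -> transform -> sink, the same shape as
// resp.body.pipeThrough(new CompressionStream("gzip")).pipeTo(file).
await source.pipeThrough(upperCase).pipeTo(sink);

console.log(collected); // ["HELLO", "WORLD"]
```

The same `pipeThrough` call is where our JSON parse stream will eventually slot in.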
One thing I want to point out is that the code itself is not going to be super duper complicated or anything. And this library is probably not going to be very useful. We're just writing this library so you have an example of how you can approach something like this, and all the steps you need, all the scaffolding you need, and all the scaffolding you don't need as well. And then you can write your own libraries that do other things, using the same sort of setup.
So let's actually get started coding. In the requirements of the workshop, I had asked you all to download Deno and Node and set up your LSP or your IDE. I set up VS Code because that's the editor that I usually use. To start out, I'll just show you exactly how to do that, what setup steps you need. And then I'll show you what project structure you need to create. And you'll see that there's actually not much to it. And then we'll actually write this JSON parse stream implementation. It's just going to be a couple of lines.
4. Setting Up Deno in VS Code
In VS Code, ensure you have the Deno extension installed and Deno version 1.19. Open the command palette, run the Deno: Initialize Workspace Configuration command, and answer yes for linting and no for unstable APIs. This creates a .vscode directory with a settings.json file. To write code in Deno, create a file and start coding; there's no need for a package.json or other manifest files. Run deno run mod.js to output 'Hello World'. If you encounter 'Deno project initialization failed', check that Deno is installed and try restarting VS Code. On macOS, open the terminal and type 'which deno'.
Okay, so this is my editor. I have the Deno extension installed. You can see here, the Deno extension is installed. That's very important. You need to make sure you have the Deno extension installed if you're using VS code. And I also have Deno installed, version 1.19. That's the most recent version.
And then all I need to do to create a Deno project in VS Code is open the command palette. I can do that with F1 or I can press command, what is it, command shift P. And then I run this Deno: Initialize Workspace Configuration command. And what that does is it'll ask me some questions and then it'll tell VS Code that I'm going to use Deno in this workspace. And that just makes sure all the linting and everything works. So, I'll answer yes on the question about linting, and I'll answer no on the question about unstable APIs. We're just going to use stable APIs today. Cool, and I just created this .vscode directory with a settings.json file that just has the settings in here for enabling Deno and enabling the linting, disabling the unstable stuff.
Okay. Any questions so far? Can I show that again? Yes, sure. So you want to press Command Shift P or Fn F1, depending on what exactly your bindings are. Then it opens this command palette, and then you type in Deno and you'll find Deno: Initialize Workspace Configuration as one of the options here, and you just click on that, and then you can answer yes and then no. Cool. Then let's get started with writing actual code. So all we need to do to write code in Deno is to just create a file to actually put the code into. You don't need a package.json file, you don't need any other manifest file, you're just writing code right away. So let's start with just something simple to show that Deno's working. Let's just log out hello world.
A question is, is this the only way to start a new project, or is there also a CLI command? There's no CLI command, because if you're using just the CLI you don't actually need to do this. These settings are only for VS Code. Deno will run without this settings.json file. It's only if you're using VS Code and want the autocompletion to work correctly that you need to set this up, but you don't need it to actually run the code. Okay, so you run deno run and then mod.js. Boom, and it outputs Hello World. Does anyone have any questions? You're getting Deno project initialization failed? That's unfortunate. Is Deno installed? If you open the terminal and run deno --version, do you get the Deno version? Answering questions again. One moment here. Okay. You might need to restart VS Code real quick here. Are you all on Windows? Okay. What you can do is you can open your terminal and type in which deno. On Mac. Okay. You can open up your terminal and type in which deno. You should get a file path back. Let me know if you don't get that. No path. Okay. And that's on Windows. Okay. Okay.
5. Setting Up Deno in PowerShell
Let's see how to do this on Windows. Try PowerShell instead of CMD: type 'pwsh' and run 'which deno' again. If you have the file path, press command comma or control comma to open the settings page and enter the path for deno.path. If you can't get a path, installing Deno will print out the installation location. Now you can run 'deno run' and replace the code with a new class called JSONParseStream. This class extends TransformStream and has a constructor that calls the super constructor, passing a transform function that takes a chunk and a controller as arguments.
Let's see here. How can we do this on Windows? Can you try in PowerShell instead of in CMD? So type in pwsh and then type in which deno again. Cool. If you have the file path, you just press control comma to open the settings page. Type in deno.path, and then you can enter the path there for the CLI. Bjorn, if you can't get a path: when you install Deno, it also prints out where it installed it. If you still have that, then that may work, too.
Okay, cool. Then let's continue. So, assuming you have that working, now you can type in, as I said, deno run, and it'll run something. Let's replace this with some actual code here. We were just printing out Hello World before. Let's create a new class, which is going to be our JSONParseStream. So these transform streams, they're always classes. And how do you get to the Settings page? You press Control comma, or on macOS you press Command comma. Or you click up here, where is it, somewhere up here.
Okay. I'll give you a moment here. Okay. So what you want to do is create a new class, which is going to be this JSONParseStream class, which we're creating here. And this is going to be a transform stream. So this needs to extend TransformStream, and TransformStream, that's a built-in. The transform stream is just built into Deno, and it's built into browsers. Yeah. So you want to extend that. Because we're using TypeScript, we're going to add some type arguments as well. The input type is going to be string, so it's going to consume strings that it will parse. It will output some value. We're not sure what, because it's parsing JSON, random JSON, and it could be anything. So we're just going to say unknown. We don't know what it is. And this will still work. It won't actually do anything, but it'll compile. And then we want to set up a constructor. The constructor, this is called every time you create a new one of these JSON parse streams. You create the constructor, and then you call super. And super is essentially the constructor for the transform stream. So this takes all the options that a transform stream does. A transform stream takes all these different options, most importantly this transform function; we'll get to that in a moment, and we'll set that up now. So when a user creates a new JSONParseStream, we're going to create a transform stream internally with this transform function. And the transform function takes a chunk and a controller as arguments. The chunk, that's a string. For each string that gets piped to the transform stream, this transform function gets called once. The first argument is the chunk, and the second argument is the controller. And with the controller, you can do things like queue an item onto the readable end.
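Put together, the class described above can be sketched like this. A minimal version, assuming only the built-in TransformStream global, with the transform body being the JSON.parse-plus-enqueue step the workshop builds toward:

```typescript
// JSONParseStream: a TransformStream that consumes strings and emits
// the parsed JSON value for each chunk.
class JSONParseStream extends TransformStream<string, unknown> {
  constructor() {
    // Pass a transformer object to the TransformStream constructor;
    // its transform function is called once per chunk written to the
    // writable end.
    super({
      transform(chunk, controller) {
        // Parse the incoming string and queue the result onto the
        // readable end.
        controller.enqueue(JSON.parse(chunk));
      },
    });
  }
}

// Constructing one gives us the paired readable and writable ends.
const stream = new JSONParseStream();
console.log(stream.readable instanceof ReadableStream); // true
console.log(stream.writable instanceof WritableStream); // true
```

The type arguments (string in, unknown out) are what make TypeScript check the chunk as a string inside transform.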
6. JSON Parse Stream and Testing
When writing to the transform stream, we call the transform function to process the chunk and enqueue the result. We use JSON.parse to parse the input chunk into an object and then enqueue it. This is the entire library code. Now, let's write tests for the JSON parse stream. Deno.test is a built-in test runner with a simple interface. We create a test file with the same name as the code file, ending in _test.ts or .test.ts. We register tests in the test file by calling Deno.test. We import the JSON parse stream into the test file and create a test function. We can run the test and see the output. Now, let's import the JSON parse stream into the mod_test.ts file.
7. Creating a JSON Parse Stream
Let's create a new JSON parse stream. We write a JSON-encoded string to the stream, read from the readable end, and log the result. It works, and the value is an array with one, two, three. We can get the length of the array and assert it using the testing module's asserts methods.
And let's actually create a new JSON parse stream. Okay, now, so const stream is equal to new JSON parse stream. So what this will do is it will call this constructor, call the super function with this transform function that we made earlier. And it will assign that to the stream variable here. We can see that that's working by just logging this out. So we'll log out the stream here. And then you can see in the terminal, it's a new JSON parse stream with the readable and a writable. And the readable is a readable stream and the writable is a writable stream.
So let's actually write something to the stream. If we remember, each chunk is going to be JSON decoded. So let's write a JSON-encoded string to the stream. What we're going to do is get a writer for the stream: stream.writable.getWriter. And then we're gonna call writer.write, and here's the chunk that we're going to give it. So we're just going to give it an array with three values in it. This is valid JSON, right? It's an array with three numbers as values. And then the next thing we're going to do is get a reader. So we wrote something to the writable end of the stream; we now want to read something from the readable end. So we call stream.readable.getReader. And then we have this reader object here that we can read on: reader.read. And that actually returns a promise, so we're going to need to await this. And then we have a result, and we can log that out as well here, console.log the result. So what this is doing now is it's creating a stream, getting a writer, writing one chunk to the writable end of that stream. It's then going to read one chunk from the readable end of that stream, and then it just logs out the result. Let's do that. And you can see, it works. We're writing in the string and it parsed it out as, here, the value. So the value is this array with one, two, three in it. So if I want to just get the array itself here, I can do that. And I could do things like .length, and TypeScript doesn't like this. So what we're going to do here is say this is a number array. Okay, let's get the length of the array. There we go, that should work. So it has three items in it. The first item is a one, and the second item is a two. Oh, wait, no, that's the third item. That's the second item. We start counting at zero. Okay, so that's nice and all, but now we're just logging this out as a test.
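The write-then-read round trip described above, as one self-contained sketch (the class body is repeated so the example runs on its own):

```typescript
// Round trip: write one JSON-encoded chunk to the writable end and
// read the parsed value back from the readable end.
class JSONParseStream extends TransformStream<string, unknown> {
  constructor() {
    super({
      transform(chunk, controller) {
        controller.enqueue(JSON.parse(chunk));
      },
    });
  }
}

const stream = new JSONParseStream();

// Write a chunk to the writable end. Deliberately not awaited: the
// write promise only settles once the readable end is read.
const writer = stream.writable.getWriter();
writer.write("[1,2,3]");

// Read the parsed chunk from the readable end.
const reader = stream.readable.getReader();
const result = await reader.read();

// Tell TypeScript the unknown value is a number array so we can use
// array operations like .length on it.
const value = result.value as number[];
console.log(value); // [ 1, 2, 3 ]
console.log(value.length); // 3
```

The cast to number[] is the same trick as in the walkthrough: the stream's output type is unknown, so we narrow it at the call site.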
We want to make sure that this is actually the case. So we'll import an assertion from the standard library. We'll do that here: deno.land/std. std is the standard library. I will use the latest version, and then there's this testing module which has an asserts module, and that has all these assert methods.
8. Testing and Formatting
We compare the value with what we expect it to be and run the test to ensure it passes. Debugging is also possible, and we can run tests directly from VS Code. We cannot await the write operation because it will get stuck in a lock. Now that we have tests written and source code completed, let's move on to formatting. Deno has a built-in formatter that ensures a consistent style across the ecosystem. It is similar to Prettier but faster, written in Rust.
And we're just going to use assertEquals. We want to make sure that two values are the same. So we're going to compare the value with what we expect it to be, and we expect it to be one, two, three as an array. So if we run this again, it passes. If we change this, so we expect another value here, for example, this is going to fail the test. See, the test says values are not equal, JSON parse stream failed, and you'll see there's this extra value in the expected that wasn't actually present in the actual test result.
Okay, run this again. And you can see, we have a test now that checks that our JSON parse stream works as expected, which is very nice. We can also run this right from VS Code. So above the Deno.test call there's a run test button; if I click that, it'll run just that test. So if you have a bunch of tests and you want to just run one of them to make sure it's working, press this run test button and it'll run just that test. And you can even debug the test right away. So let's say we want to put a breakpoint here on this JSON parse function. I can press debug, and we'll hit the breakpoint. And you can see the chunk is one, two, three. And if I step over this, the object is the array one, two, three. Cool. So debugging also works. Are there any questions, or did anything go wrong? Question on Zoom, let me check here. Should we await the write operation? No, and the reason why is slightly confusing. I'll show you what happens if you do. It's not going to work. It's actually going to exit with this weird error: promise resolution is still pending, but the event loop has already resolved. It's kind of a weird error. And the reason this happens is because the promise returned from write doesn't complete until you've actually read the value from the readable end. So this promise doesn't resolve until this promise resolves here. The problem is, if I put an await here, it's just going to get stuck, because nothing is ever going to read from the readable end; we're waiting for the write, but the write never completes, because we need to read for the write to complete. So that doesn't work. We get into this lock, where everything gets locked up, and Deno will print out this error if that happens. So you don't want to await the write, because the write is going to be complete by the time the read is complete; for the read to succeed, you need to have already written something to the writable end. Does that make sense? Cool. Okay.
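A small sketch of the ordering that avoids the deadlock described above: keep the write promise around, read first, and only then await the write.

```typescript
// Why you must not `await writer.write(...)` before reading: the
// write promise only resolves once the chunk has been consumed from
// the readable end, so awaiting it first deadlocks.
const stream = new TransformStream<string, string>({
  transform(chunk, controller) {
    controller.enqueue(chunk.toUpperCase());
  },
});

const writer = stream.writable.getWriter();
const reader = stream.readable.getReader();

// Keep the write promise instead of awaiting it immediately.
const writePromise = writer.write("hi");

// Reading from the readable end is what lets the write complete.
const { value } = await reader.read();

// Now the write promise can resolve; awaiting it here is safe.
await writePromise;

console.log(value); // "HI"
```

Swapping the last two awaits (awaiting writePromise before the read) is exactly the hang the workshop demonstrates.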
9. Deno Lint and Deno Format
10. Setting Up CI Pipeline for GitHub Actions
Let's set up a CI pipeline for GitHub Actions. Create a YAML file in the .github/workflows folder to declare the workflow. We'll check out the repository, install Deno, and run the formatter, linter, and tests. After committing, the CI process will run and complete. If there are any questions, feel free to ask. GitLab CI can also be used, with Docker images from the denoland/deno_docker repository.
Okay, let's actually check this into my repository here. Init, initial commit. And upload that. Are there any questions?
Okay, so next step, let's set up a CI pipeline for GitHub Actions. I don't know if any of you have done this before. It's pretty simple. You just create this .github folder and inside of there, you create a workflows folder and inside of there, you create a YAML file. So what I was saying is, you create this .github/workflows folder and you put a .yml file in that, a YAML file, and with this you can declare your workflow. So we're going to run a few steps. Let's give this a name first. So this is going to be our test job. We also need to give it the operating system we want it to run on. We'll run this on Linux, so that's ubuntu-latest. And then we need to give it some steps to execute. We'll start by checking out the repository. That's actions/checkout. This will check out the repository. Then we need to install Deno, so we'll do that. There's a denoland/setup-deno action. We can give that some arguments, so the Deno version here. We want to install the latest 1.x Deno version. Then we can run the formatter. So let's run deno fmt, and we'll run it with the --check flag, because we don't actually want it to format; we want it to error out if formatting fails. Then we want to run deno lint as well. And for good measure, we'll run deno test as well. So that's our CI script. We check out the repository, we then install Deno, we run the formatter, we run the linter, we run testing, and then we're done. Let's commit that. Are there any questions about this? This is the repository, by the way, here. With that as the CI, let's wait for it to start here. Cool. It completed. Took four seconds. Deno's written in Rust, it's super fast. Checkout took one second. Setup took one second. And then formatting completed, linting completed, testing completed. And then we're done. So, cool. That works. And if I now change something in here, I could make CI break. Let's remove the semicolon here and commit that. You will see that CI will fail.
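The workflow described above can be sketched as a single file. The file name and the action version pins are illustrative assumptions, not from the workshop; check them against the GitHub Actions docs:

```yaml
# .github/workflows/ci.yml -- sketch of the CI workflow described above
name: ci
on: [push, pull_request]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      # Check out the repository.
      - uses: actions/checkout@v4

      # Install the latest 1.x Deno release.
      - uses: denoland/setup-deno@v1
        with:
          deno-version: v1.x

      # Fail the job if formatting, linting, or tests fail.
      - run: deno fmt --check
      - run: deno lint
      - run: deno test
```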
I'm not signed in so I can't view all of the logs here, but you'll have to trust me on this. Let's give it a couple seconds. Does anybody have any questions on this? Is it compatible with GitLab CI? Yes, you can also use GitLab. You won't have the actions, or you won't have the Setup Deno step here, but GitLab uses Docker images, right? As base images. What you can do is there's this repository, denoland/deno_docker, which has Docker images for Deno.
11. Using deno.land/x and Making It Work in Node
And you can just use this as your base image for your GitLab CI and then still run deno test and deno fmt. So it's not going to look exactly the same, but it should still work. That was Julian's question. Cool. And let's check that that failed. There we go. Yeah, deno fmt failed. Cool. So that's because of that formatting error. Let's get back to the slides.
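A rough GitLab CI equivalent under the same assumptions (the file name, job name, and image tag are illustrative; only the denoland/deno image itself comes from the workshop):

```yaml
# .gitlab-ci.yml -- sketch of the GitLab equivalent using the Deno Docker image
test:
  image: denoland/deno:latest
  script:
    - deno fmt --check
    - deno lint
    - deno test
```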
So we're nearly all the way through now. We're going to skip this part because we don't have as much time. But deno.land/x is a Deno-first module registry where you can publish your source code for Deno. It's sort of like npm, but for Deno. But it's not a blessed registry, so that means it's not the only registry you can use. You can import your code from anywhere. You can even host it on your own domain. This is just one that we wrote for Deno, which works really well and hooks right into your existing GitHub workflow if you want it to. It uses GitHub webhooks to publish; it's really cool. If you want to publish something, you can go to deno.land/x and press this button here, and it'll give you all the information about how to do things. Yeah, and if you have any questions, hop onto our Discord, which is linked right here. We can help out. Okay, then we're gonna skip that. And then let's get onto the interesting part, which is also making this work in Node.
12. Transpiling and Testing for Node Compatibility
You can develop your modules with Deno's built-in tooling and ensure compatibility with Node by transpiling and running tests in Node. This allows you to have the best of both worlds: utilizing Deno's tooling while making the modules available to Node users.
It'll replace them with the versions that are imported from either Node itself or from some ponyfills or polyfills. And it'll actually also transpile all of your tests and run them in Node. That way you can make sure, if you have a nice big test suite, that your code doesn't just run in Deno but that the transpiled code also works in Node. And that gives you the best of both worlds. You can develop your whole module with all of the built-in Deno tooling, so the linter and the formatter and the testing framework and the VS Code integration, all out of the box, while still making the module available to your end users that are writing code for Node.
13. Building and Publishing to npm with DNT
We're going to create a build script that transpiles the code for Node, runs tests, and publishes the code to npm. We'll specify the entry point, output directory, and package.json, and add a description. Shims are used to polyfill Deno in Node, allowing code to run in both environments. We'll shim the Deno global and add custom shims for TransformStream, ReadableStream, and WritableStream. We'll also specify the newer version of Node to use.
Yeah, so dnt, that's this repo here, denoland/dnt. And it has this very, very large readme with examples of everything it can do. But we don't actually need a bunch of this. We're just going to take the little example at the top here, which is to create a build script. And I'll copy this and paste that. And then I'll run you through all of the different steps here. And I'll actually paste this into the Zoom chat as well. It'd be great if somebody could repost it to Discord, so you can view that.
Yeah, so what we're going to do is write some more code. We're going to set up a little build script that transpiles the code for Node. We'll run the tests in Node and then we'll publish our code to npm. So the first thing we're going to do is create a new file. That's scripts/build.ts, and that's our little build script. We import from dnt, and we empty the npm directory. There's actually currently no npm directory, but the npm directory is where we're going to generate this code to, so we'll empty that if it already exists. And then we're going to call this build function from dnt, and that takes a few arguments. The first argument is the entry point of the module we actually want to transpile; in this case, mod.ts. That's this one here. We then want to specify the directory to output to, so that's going to be the ./npm directory that we emptied earlier. We'll get to the shims part in a moment. And then this is our package.json. If you want to publish something to npm you need to have a package.json, so you can specify all the options you want here. We're not going to set this all up here. License MIT, sure. Let's call this, what are we going to call this, stream utils. And let's add a description as well: some stream utilities for Deno and Node. And we don't have a license file so we'll remove this step. We'll copy over the readme file into the output directory as well.
Okay, so what the shims option does: shims are all the things that we want dnt to automatically shim out for us. So things like the Deno global. That obviously doesn't exist in Node, but we can have dnt inject some code that will make it exist in Node. So essentially we're polyfilling Deno in Node, and that lets you write just for Deno but still have your code run in Node. We don't actually use the Deno global except for in the testing. So we're just going to set this to dev, which means we only want this for things like the tests; we don't want it for the actual production build. And then we want to add some custom shims as well. So we're gonna use this custom keyword, and we're going to shim out the things from stream/web that I showed earlier. So we want to shim out TransformStream, ReadableStream and WritableStream. And what this is going to do is, whenever the transpile step encounters any of these globals, it will rewrite them to use the exports from the stream/web module in Node. And then one more thing: by default, dnt expects that you use the latest Node LTS build, so the latest stable build. But we're actually using the stream/web thing, and that's only supported in Node 17. So we're going to have to specify that we actually want to use the newer version of Node. And we do that by just specifying a dev dependency. Did I spell that right? Yes.
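The steps above end up as one small build script. This is a sketch, not the workshop's exact file: the dnt import URL is unpinned, the package name and versions are illustrative, and the custom-shim shape follows the dnt readme as best I recall it, so check it against denoland/dnt. It only runs under Deno.

```typescript
// scripts/build.ts -- sketch of a dnt build script
// (run with: deno run -A scripts/build.ts)
import { build, emptyDir } from "https://deno.land/x/dnt/mod.ts";

// Clear out any previous build output.
await emptyDir("./npm");

await build({
  // Entry point of the module to transpile.
  entryPoints: ["./mod.ts"],
  // Directory to emit the Node package into.
  outDir: "./npm",
  shims: {
    // Shim the Deno global only for tests ("dev"), not for the
    // production build.
    deno: "dev",
    // Rewrite the web streams globals to imports from Node's
    // stream/web module.
    custom: [{
      module: "stream/web",
      globalNames: ["TransformStream", "ReadableStream", "WritableStream"],
    }],
  },
  // This becomes the generated package.json.
  package: {
    name: "stream-utils", // illustrative name
    version: "0.1.0",
    description: "Some stream utilities for Deno and Node.",
    license: "MIT",
    devDependencies: {
      // stream/web needs a recent Node, so pull in the Node 17 types.
      "@types/node": "^17.0.0",
    },
  },
});

// Copy the readme into the output directory.
await Deno.copyFile("README.md", "npm/README.md");
```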
14. Setting Up Node and Publishing to npm
We set up the Node 17 types, allowed environment access, transpiled the code, ran tests in both Deno and Node, and published the code to npm. The outputted code is similar to the TypeScript code, with types stripped out and the transform stream imported from a shims module. We added the build output to .gitignore, changed the package name due to conflicts, and successfully published it to npm. Feel free to ask questions.
And we're going to set @types/node to version 17, so we want to target Node version 17 here. And then what we can do is deno run scripts/build.ts. You'll see that it runs, and then it's going to ask me some questions. It's going to ask: do you want to allow environment access to HOME? If I say yes, it's going to ask: do you want to allow environment access to something else. This is what I said earlier: by default, Deno does not allow you to access any environment, network or disk things. You can only do that if you either specifically opt in using these prompts, or if you specify a flag like --allow-env for environment access. We're just going to specify -A. -A means allow everything, because we trust this code, so we're just going to allow it to do whatever it wants. And then it does the transform here: it created the npm directory, it's building, it's type checking, it's emitting some code, and now it ran the tests. So you can see that the tests we previously ran in Deno are now running in Node as well, and they also pass in Node. And if you actually look at the outputted code, this is the npm directory that was created. It has a package.json file in it with all these things in it: here's the version, the name, the description, the license, the dev dependencies we specified earlier. And here's the actual outputted code. You can see it looks very similar to the TypeScript code we had earlier, except that all the types are stripped out, and the TransformStream that we previously got from the global we're now getting from this shims module, and the shims module just imports it from stream/web. Cool, so that works. Let's add that to a .gitignore, because we don't want to check in build output: .gitignore, npm/. Cool, so we have that emitted now. Last step is to actually publish it to npm.
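To see what the shimmed output relies on, here's a small sketch of using TransformStream imported from Node's built-in stream/web module, which is the same kind of import the generated shims module performs. The upper-casing transform is just an illustration, not code from the workshop's module:

```typescript
// Node-side equivalent of the shimmed output: TransformStream comes from
// the built-in stream/web module instead of being a global.
import { TransformStream } from "node:stream/web";

// A trivial transform that upper-cases each string chunk.
const upper = new TransformStream<string, string>({
  transform(chunk, controller) {
    controller.enqueue(chunk.toUpperCase());
  },
});

// Write one chunk, close the writable side, then read the result.
const writer = upper.writable.getWriter();
writer.write("deno");
writer.close();

const reader = upper.readable.getReader();
const { value } = await reader.read();
console.log(value); // "DENO"
```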
So we're going to cd into that folder and then do npm publish. And then I need to type in a one-time passcode. One moment, let me grab that. It's all this newfangled security. While I'm doing that, feel free to already ask some questions; I'm sure there are questions. Oh, okay, this package name seems to already be taken. Let's change the package name to something else. That, for example. Let's run that build again. Okay, cd npm, and then npm publish again. And I need another one-time passcode. Do I need to sign up for private packages? Okay, well, I guess we're changing the name again. You can really tell I don't use Node much anymore; I'm not aware of all these newfangled restrictions in npm. Okay, let's try that one more time. And we need another one-time passcode. I hope showing this four times in a row doesn't compromise my account. There we go, and it's published, okay. So now if we go to npm, we'll see it. There we go, it's published. There's the readme. Published just a few seconds ago. And if you want to try this out locally, you can put this in your package.json now and try out this transform stream, or the JSON parse stream. Are there any questions? Let's check Discord here.
Automating with CI/CD and Git Hooks
Could this be automated with CI/CD? Yes, definitely. You could run the 'deno run' build script and 'npm publish' commands in a CI pipeline. Some features can be polyfilled in Node, but not all. dnt supports polyfilling many features, but it may not be as nice as writing Deno-only code. If you have specific questions, join our Discord community. Adding a Git hook that runs the linter and formatter on commit is possible, but the exact process is not covered here; you can ask for help on our Discord. There's also a way to do this in the browser, which will be shown later.
Could this be automated with CI/CD? Yes, definitely. So one thing you could do... let's actually just do that real quick here. We can say deno run -A scripts/build.ts, and we have to put in a version number, so let's set that to 1. We can run that all the time, and then we can run npm publish inside of the npm directory, but only if... I don't remember what the exact thing is, I think it's github.ref. I don't know, you'd have to look this up in the CI documentation, but you can check if it's on the main branch, or if it's a tag. If this is a git tag, you can publish it automatically. But yeah, you could definitely do this in CI as well. You'll just have to figure out the exact commands you want to use.
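As a sketch of the CI setup he describes, a GitHub Actions workflow that builds with dnt and publishes when a git tag is pushed might look roughly like this. The action versions, the `NPM_TOKEN` secret name, and the tag-derived version are assumptions to verify against the GitHub Actions and setup-deno documentation:

```yaml
# .github/workflows/publish.yml — build with dnt and publish on git tags.
name: publish
on:
  push:
    tags: ["*"]
jobs:
  publish:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: denoland/setup-deno@v1
      - uses: actions/setup-node@v3
        with:
          node-version: 17
          registry-url: "https://registry.npmjs.org"
      # Derive the package version from the tag name (e.g. refs/tags/1.0.0).
      - run: deno run -A scripts/build.ts ${GITHUB_REF#refs/tags/}
      - run: npm publish
        working-directory: npm
        env:
          NODE_AUTH_TOKEN: ${{ secrets.NPM_TOKEN }}
```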
As far as I understand, not all features can be polyfilled in Node. Yes, correct. There are some things which are very difficult to polyfill in Node. Some things, like fetch for example, can be polyfilled with ... or node-fetch. And dnt actually supports these right out of the box. So for example, if you want to polyfill... let's find it... timers, or prompts, or Blob, or crypto, or DOMException, or fetch, or WeakRef. All of those are really easy to polyfill. And the Deno namespace is also pretty much entirely polyfilled. I think there's a file which shows how far along we are with that. Let me find it real quick. Maybe not, I don't know. But yeah, there are some things which can't be polyfilled, but essentially you can polyfill pretty much anything. It's just not going to be as nice, right? Because you need to have all these polyfills. If you're writing completely for Deno only, that's obviously the nicest; that's the least amount of dependencies you're adding, and everything just works out of the box. But if you want this to work for Node users as well, then yeah, you'll need to shim things out. And maybe there's a certain function you can't use because it's not available in Node. That's possible; then you just need to avoid it or use something else. But there's a bunch of documentation here, and if you have any specific questions, feel free to join our Discord. It's discord.gg/deno, and there's a help channel on there. You can always ask questions.
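The built-in polyfills he lists correspond to toggles on dnt's `shims` object. Roughly, as a configuration fragment (the option names are taken from the dnt README; double-check them against the current docs):

```ts
// Sketch of dnt's built-in shim toggles (fragment of the build() options).
shims: {
  deno: true,         // the Deno namespace itself
  timers: true,       // setTimeout/setInterval with web semantics
  prompts: true,      // alert, confirm, prompt
  blob: true,         // Blob
  crypto: true,       // the Web Crypto API
  domException: true, // DOMException
  undici: true,       // fetch, Request, Response, Headers, etc.
  weakRef: true,      // WeakRef
}
```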
Are there any other questions? We're pretty much out of time, so I could show some other things, but... is there an easy way to add a Git hook that runs the linter and formatter on commit? I think so. I'm personally not a fan of Git hooks, but I'm sure there are ways. Let's see: git hooks, Deno. Okay, here's this thing. Oh, these are just... here, let's try to look at that. I'm sure there is. Maybe ask this on our Discord and then you'll get some better answers. I'm not sure, I've never done it. But yeah, it's definitely possible; how exactly to do it, I'm not quite sure. I'll show you real quick how to do this in the browser, because that's part of the title of the talk, or the webinar here. Let's create a...
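For the Git-hook question, one minimal approach (an assumption, not something shown in the workshop) is a plain pre-commit hook file that runs Deno's formatter check and linter and blocks the commit if either fails:

```shell
#!/bin/sh
# .git/hooks/pre-commit — must be executable (chmod +x).
# Block the commit if formatting or lint checks fail.
deno fmt --check || exit 1
deno lint || exit 1
```

Tools like a hooks manager can install this for a whole team, but the hand-written file above is enough for a single repository.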
Using deno bundle and Importing from CDNs