Node.js is great - easy to develop, performant, easy to scale. But there are tasks that are less suited for it - heavy computations or data processing. Join me and learn how you can incorporate Rust as well as WebAssembly into Node and JavaScript and take your performance to the next level!
🚀 Supercharge your NodeJS with Rust
From:

JSNation 2022
Transcription
Hi, my name is Dmitry Kudryavtsev. I'm a senior software engineer, and I'm very passionate about two things: JavaScript and Rust. So today I want to talk with you about how you can supercharge your JavaScript and Node.js experience using the Rust programming language. So, let's dive in.

We all know that Node.js is great. It has a great ecosystem of packages; I think it's the biggest package ecosystem among all programming languages. There's a very nice development experience: you can pretty much write and run even a few lines of code, and TypeScript is making the experience even greater, so you get type checking. It's not a statically typed language, but you still get type checking, which is nice. But Node.js is also slow sometimes, especially if you're doing CPU-heavy tasks, let's say hash generation or maybe image processing. I've seen people come up with very creative solutions to these problems, like outsourcing a CPU-bound task through a message queue to another process that can work on it, and some people spin up Lambda functions that you can call on demand if you have some heavy computation. But there is also another solution, and that solution is writing native modules in either C, C++ or Rust.

Now you probably ask: why Rust, and what even is Rust? Why not just use C or C++? So let's try to answer that question first. As you probably know, C and C++ are pretty old. They already show their age, even though they're still being developed; I think C++ is up to its 20-something revision now, I'm not sure, I don't follow it that closely. Nevertheless, they are a bit messy and they lack modern tooling: there is no decent dependency manager, and they have relatively poor standard libraries. Most of the time, when you need things like advanced containers or iterators, you have to turn to libraries like Boost. Many things from Boost eventually made their way into the C++ specification, but you still need external libraries for a lot of things. And the biggest downside, in my opinion, is that they're not memory safe. You've probably seen the message we all dread, segmentation fault, because somewhere you forgot to stop your for loop and iterated too far over your array, or you accessed memory that no longer belongs to the application. This makes development in C and C++ very hard.

Rust, on the other hand, is a strongly typed and compiled language, like C and C++. It has a rich standard library, so you get smart pointers, iterators... everything is built into the language itself, and you don't need any third-party packages to get support for such things. There's also modern tooling: you have Cargo, which is an npm equivalent, so you can run tasks and install packages with Cargo. And the biggest pro, in my opinion, is that Rust is memory safe. The way they achieve memory safety is very interesting; I won't go deep into it here, but you can read about it if you're interested. The notion with Rust is that if it compiles, it will run: there will be no memory errors, because the memory is actually checked at compile time. That's a big, big pro in my opinion compared to C or C++. You can still write unsafe Rust, it is possible, but by default all the Rust you write is safe Rust and checked at compilation time.

Well, that's great and all, but there is one problem: we all write JavaScript. So how can we integrate between the two? Enter Neon. Neon is a library and toolchain for embedding Rust in Node.js. It's an open source project, and it's a very cool project.
I suggest you check it out. Let's take a look at a Fibonacci function that we write in Rust and export into the JavaScript world. Below is the code; don't worry, I have links to my GitHub later in the presentation where you can find executable examples, but for now let's focus on this example and break it into a few buckets to make it easier to analyze.

On lines 1 through 4 we have the use statements, the equivalent of require in Node.js, so we bring in some stuff from the Neon library. Lines 6 to 12 are the actual Fibonacci function. Nothing fancy here: it's a recursive function that searches for the required Fibonacci number. Now, lines 14 through 18 are what I call a glue layer between the JavaScript world and the Rust world. Since the two languages are different and architectured in different ways, we need a way to convert JavaScript types into Rust types. So you always have to write some glue layer in Neon to convert your JavaScript calls into Rust calls, and this is what we are doing in those five lines: we convert the JavaScript call into a Rust call, we actually call our Fibonacci function, and we return the result back to JavaScript. And as with every executable, we need to have a main function; in the case of Neon, the main function serves as the export statement, so you can export functions from Rust back to the JavaScript world. In this case we export the Fibonacci API function as fibonacci_rs, so in JavaScript we can access it as fibonacciRs.

In order to build this, there is another tool that the Neon team maintains, called cargo-cp-artifact, which copies the artifact that cargo build produces. What it does behind the scenes is generate a dynamic library, the .dll or .so equivalent depending on whether you're on Windows or Unix, but with all the wrappings for Neon and the Node.js APIs. Node.js supports a foreign function interface; JavaScript itself does not, but Node.js does, and this is why we can write native libraries for Node.js. To consume the native library, we require it like a regular Node.js module: on line 1 we require the index.node file that we generated previously, and then we call it like a regular function. The cool thing is that this will also generate TypeScript definitions for you, so you can write native Rust code and have TypeScript definitions in your JavaScript code. The integration process is pretty easy in my opinion, and it's cool. I suggest you check out Neon; it's a great library, and it allows us to embed Rust in JavaScript code.

Now, if you are familiar with WebAssembly, you probably ask: okay, but why not WebAssembly? Before I answer this question, let's understand what WebAssembly is. WebAssembly is a portable binary format, with a corresponding text format. You can think of it as assembly, but it's a bit different: it's virtual, and it's actually executed by a virtual machine. So the compiled binary is not native to your hardware, it's native to the VM, and it's important to remember this. It's supported in all major browsers, so across the stack you can use it in Firefox, Edge and Safari. And the cool thing about WebAssembly is that it can actually be a compilation target for other languages, Rust among them, so you can take code written in Rust and compile it to WebAssembly, which is pretty cool.
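As a reference, here is a minimal sketch of what the Neon example described above could look like with the current Neon API. The exact identifiers (fibonacci_api, the exported name fibonacciRs) and the line layout are assumptions, so the line numbers mentioned above won't match one-to-one with this sketch:

```rust
use neon::prelude::*;

// Plain Rust: recursively compute the n-th Fibonacci number.
fn fibonacci(n: u32) -> u64 {
    match n {
        0 => 0,
        1 => 1,
        _ => fibonacci(n - 1) + fibonacci(n - 2),
    }
}

// Glue layer: convert the JavaScript argument into a Rust number,
// call the Rust function, and convert the result back to a JS number.
fn fibonacci_api(mut cx: FunctionContext) -> JsResult<JsNumber> {
    let n = cx.argument::<JsNumber>(0)?.value(&mut cx) as u32;
    let result = fibonacci(n);
    Ok(cx.number(result as f64))
}

// The #[neon::main] function acts as the module's export statement.
#[neon::main]
fn main(mut cx: ModuleContext) -> NeonResult<()> {
    cx.export_function("fibonacciRs", fibonacci_api)?;
    Ok(())
}
```

After a build with cargo-cp-artifact, the resulting index.node file is required like any other Node.js module, for example `const { fibonacciRs } = require('./index.node')`, which is the usage the talk describes.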
But if you don't want to do that, there is also a special language called AssemblyScript, which is a limited, type-wise, version of TypeScript that you can write, and it will compile to WebAssembly. So let's take a look at an example of how we can write the same Fibonacci function, but this time it will compile to WebAssembly. It's a lot shorter: aside from the function itself, we need only two things, the use statement and the macro that we add on line 3. And since WebAssembly is typed, you have types. Then there is wasm-bindgen, which is the package that converts the Rust code to WebAssembly and does all the magic needed to make this code work as WebAssembly. Keep in mind that if you have complex structures, like a custom struct with custom functionality, you need to write custom wrapping for it; wasm-bindgen only knows how to convert primitive types, like integers, floats and strings, to WebAssembly. But if you have a simple function like this Fibonacci, the conversion is straightforward, because WebAssembly supports integers and floats out of the box, and so wasm-bindgen is able to convert this code to WebAssembly.

Now, I've shown you two ways to write native modules for JavaScript, and you're probably wondering which one is faster. So, hey, benchmarks. I did some performance testing on four different Fibonacci numbers. What you're seeing here is the mean running time that I got from hyperfine, which is a benchmarking tool. The numbers you see in green are the performance improvement compared to JavaScript, and you can see an anomaly here: for the shortest Fibonacci number, the performance improvement is pretty much negligible. This is a good reminder to everyone who is working on performance and wants to improve something: always, always have performance tests, because if you only need to calculate low Fibonacci numbers, for example, the performance improvement you get from native modules is not worth the hassle. But if you go up in the Fibonacci numbers, you can see again a tremendous boost in performance, which is around 60% if you go with Rust and around 45% if you go with WebAssembly. That is a lot; it's twice as fast compared to JavaScript. We can also draw a second conclusion, that Rust is around 45% faster than Wasm. These are my results, and you always need to run your own benchmarks to make sure that the performance you're gaining from whatever you're benchmarking is worth the hassle, because as you can see, for the lowest Fibonacci number the performance gain is not that spectacular compared to the larger ones.

So we've seen two ways to write native modules: we can either write native modules directly, or we can write Rust that compiles to WebAssembly, and you probably have a question: which one should I use, and when? Let's look at a few categories that I've outlined to help you make a better decision on when to use native modules and when to use WebAssembly. If we're talking about performance, I think it's pretty clear that Rust native modules will be faster. Native will always be faster than anything executed in a VM, because you don't have the conversion layer, which is the VM itself; just look at C or C++ compared to Java.
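For reference, here is roughly what the wasm-bindgen version described above could look like. This is a sketch based on the description in the talk; the exact names and the Cargo setup (a cdylib crate with wasm-bindgen as a dependency) are assumptions:

```rust
use wasm_bindgen::prelude::*;

// The #[wasm_bindgen] attribute generates all the glue needed to call this
// function from JavaScript; primitive numeric types need no manual wrapping.
// (64-bit integers cross the boundary as JavaScript BigInt values.)
#[wasm_bindgen]
pub fn fibonacci(n: u32) -> u64 {
    match n {
        0 => 0,
        1 => 1,
        _ => fibonacci(n - 1) + fibonacci(n - 2),
    }
}
```

A typical way to build something like this for Node.js would be wasm-pack with its nodejs target, though the talk itself doesn't go into the build command.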
Obviously you can write non-performant code in any programming language, but if we're talking about a language where you know how to write performant code, you will always achieve more performance if you bypass the virtual machine. It's also worth noting that WebAssembly is very fast; I was surprised to learn that. Even though it's executed in a VM, it's a pretty cool tool to have at our disposal, and it is supported in all major browsers as well as Node.js.

Let's talk about reusability. When I talk about reusability, I mean taking the same code and running it on different platforms. In that case it's a bit more complicated, because native libraries can be reused in other languages through something called the foreign function interface, FFI. So, for example, you can have your business logic written in Rust and use it in Node.js, in Java, and in your iOS application that is written in Swift. WebAssembly, on the other hand, is only portable across different WebAssembly VMs. Your WebAssembly code will run in Edge, in Firefox, in Safari, in Node.js, but you won't be able to execute it on Android, for example, unless you have a virtual machine there that can execute WebAssembly.

If we're talking about ergonomics, I'm pretty sure it's clear that WebAssembly has way better ergonomics. The code is much shorter, and the wasm-bindgen example I showed you requires pretty much nothing to turn a native Rust function into WebAssembly, since it is able to convert basic types like integers and floats. Neon, on the other hand, needs a glue layer to convert your Rust to JavaScript and your JavaScript to Rust, so the ergonomics suffer a bit in the native example compared to the WebAssembly one.

If we talk about the standard library, which covers everything like the file system and networking, and you want to access the operating system, then your only way to do this is Rust native modules, because WebAssembly cannot access the standard library: it can't access the file system, it can't access networking, unless you're willing to wait for WASI, the WebAssembly System Interface, which is still in development, but they are working on making all of this accessible to WebAssembly as well.

And portability: "it works on my machine," said every developer ever. You always need to remember that native code is machine dependent. If you compile it on macOS, you will not be able to run it on Windows. So if you're developing on macOS and your production environment is Linux containers, you need to compile it for those Linux containers; you can use Docker for that. WebAssembly, on the other hand: if I compile my code to WebAssembly and give you the executable, you will be able to run it on any machine or browser that supports WebAssembly, which is not true for native modules. If I compile a native module on my machine, there's a good chance you won't be able to run it if your machine is different; for example, if you have an Intel-based Mac and I have an ARM-based Mac, it needs to be recompiled, and especially if you have a Windows or Linux machine it needs to be recompiled.

And if we talk about the entry barrier for new developers, I think WebAssembly is easier. If you want to introduce performant code into your Node.js ecosystem, I suggest you go with WebAssembly, because native modules can be written only in C, C++ or Rust, and those are pretty complex languages to grasp for beginners, for junior people.
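To make the reusability point about FFI a bit more concrete, here is a small illustrative sketch, not from the talk, of how the same logic could be exposed over the C ABI from a Rust cdylib, which is what lets other runtimes such as Java, Swift or Python load and call it:

```rust
// Exported with an unmangled name and the C calling convention, so any
// language with C FFI support (JNI/JNA, Swift, Python's ctypes, ...) can
// load the compiled dynamic library and call it. The name is illustrative.
#[no_mangle]
pub extern "C" fn fibonacci_ffi(n: u32) -> u64 {
    match n {
        0 => 0,
        1 => 1,
        _ => fibonacci_ffi(n - 1) + fibonacci_ffi(n - 2),
    }
}
```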
I'm not saying it's impossible, but the investment will be pretty big if you just want to write one or two functions with native performance. Wasm, on the other hand, while it can be compiled from C, C++ or Rust, also has partial support for other languages like Python and Go, and moreover you have the specialized language called AssemblyScript that is dedicated to being compiled to Wasm. So you can always start with AssemblyScript to get your WebAssembly version, and if you then need to squeeze every possible inch of performance, you can always introduce Rust.

If we're talking about Node.js versus the browser: even though we are mostly Node.js people here, JavaScript is still king in the browser, and sometimes we need to run things in the browser. You also need to remember that native modules cannot be used in browsers, because JavaScript does not have support for a foreign function interface, so native modules can be used only in Node.js, not in the browser. WebAssembly, on the other hand, can be used everywhere you have a WebAssembly virtual machine available, so that's all the Node.js versions plus all the browsers except Internet Explorer. I'm sorry if you need to work with Internet Explorer, but other than Internet Explorer, WebAssembly support spans the entire stack, including mobile browsers. So if you have code that you want to share between the back end and the front end, your pretty much only solution is to go with WebAssembly, and if you want to squeeze performance only out of Node.js code, you can look into Rust native modules.

And here we have the summary. I will leave it to you to read, and I just want to outline two main points. In my opinion, native modules are meant to extend Node.js with performant code: if you want to bring performance improvements into your Node.js ecosystem, native modules are the way to do that. WebAssembly, in my opinion, is a way to replace non-performant pieces of JavaScript code. So if you have a non-performant function, I suggest you try to rewrite it in WebAssembly, whether through Rust that is compiled to WebAssembly or by using AssemblyScript. And if you want to add performant code to your Node.js, if you want to extend your Node.js ecosystem, then you can look at native modules, with Rust for example.

Thank you very much. You can find me on Twitter and LinkedIn, and you can scan the QR code to get to my blog, where you can find links to the articles that this talk is based on. On my GitHub you can find links to the repositories for both the Rust and the WebAssembly versions, which you can clone, run, play with and experiment with. Thank you for listening. I hope you learned something new: JavaScript can be performant, not by itself, but with the help of Rust. I hope you'll find a use for your new knowledge. Thank you.