JavaScript Iteration Protocols

How many ways do you know to do iteration with JavaScript and Node.js? while loops, for loops, for...of, .map(), .forEach(), streams, iterators, and more! Yes, there are a lot of ways! But did you know that JavaScript has iteration protocols to standardise synchronous and even asynchronous iteration?

In this workshop we will learn about these protocols and discover how to build iterators and iterable objects, both synchronous and asynchronous. We will learn about some common use cases for these protocols, explore generators and async generators (great tools for iteration) and finally discuss some hot tips, common pitfalls, and some (more or less successful) wild ideas!

27 min
01 Jun, 2023

Video Summary and Transcription

We are working on troubleshooting a production issue in a startup. The CTO identified a problem with loading large files into memory and suggested reading the file line by line. We learn about iterators and generators in JavaScript, which allow us to process data one item at a time. Generators can be used to combine async and generator functions for file processing. The speaker also discusses using a for loop instead of map, filter, and reduce. The talk concludes with the speaker mentioning poly-filling the implementation using core-js and offering free workshops on iteration protocols and Node.js streams.

1. Introduction to the Problem

Short description:

We just started to work in a new startup. The CTO reached out to us for help with troubleshooting a production issue. We need to analyze log files, count occurrences of a specific error, and group them by customer.

So, thank you for the introduction. I know we are just after lunch, but I hope we can forget about that for a second and start a new adventure together. So this is basically the story. We just started to work in a new company, it's a startup, and this is our first day in this startup. And the first thing that happens is that the CTO reaches out to us asking, can you help me with troubleshooting a problem that I have? It turns out that it's an actual production issue. And the CTO is looking at some logs and is trying to figure out which customers are impacted by these logs. And the way that we can help the CTO is by analyzing some log files, where we want to count how many errors with this strange code, FCKD, I don't know what that means, are in that particular file, and we want to count how many of them are happening for each and every customer. So if we look at the log file, it looks more or less like this. We are interested in these three lines here, in this particular section, just because that particular error shows up. And then we want to take all these lines and count how many occurrences there are per every different customer. So this should be pretty easy, so let's work together on an implementation for this.

2. Reading and Processing Log Files

Short description:

We import the read file function from node.js to read the log file and store the information in the raw data variable. We then split the file into individual lines and create JSON objects for each line. After filtering for the specific error, we use reduce to group and count occurrences by customer. We provide the script to the CTO, who is initially impressed. However, the next day, the CTO identifies a problem with loading large files into memory. Instead, we should read the file line by line, filtering and aggregating the results as we go.

So we need to read log files. So the first thing that we do is we import the readFile function from Node.js, then we use that function to actually read the file, and this is going to load all of that file's content into this variable called rawData. Then we want to split on every line because eventually we want to process every single line individually.

Now we take this array of lines, we map it so that we can use JSON.parse to actually create JSON objects for every single line. And at this point we can filter and just say, okay, just keep the lines where we have that message.error equal to the specific error we are looking for. And finally, we can just do a reduce to group things, actually to count things by specific customer. And we just log the result of that reduce. And we should see something like this. Basically, we see an object where the keys are customer IDs and the values are the number of times that that particular error was happening for every single customer.
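As a rough sketch, the naive script could look like this. The log data is inlined as a string here so the example is self-contained (in the talk it comes from readFile), and the exact log-line shape and the FCKD error code are illustrative assumptions:

```javascript
// In the talk the raw data comes from reading the whole log file into a
// string; here it is inlined so the sketch runs on its own. The log line
// shape and the "FCKD" error code are assumptions for illustration.
const rawData = [
  '{"customer":"c1","msg":{"error":"FCKD"}}',
  '{"customer":"c2","msg":{"error":"OK"}}',
  '{"customer":"c1","msg":{"error":"FCKD"}}',
].join('\n');

const counts = rawData
  .split('\n')                                   // one entry per line
  .map((line) => JSON.parse(line))               // string -> object
  .filter((entry) => entry.msg.error === 'FCKD') // keep only the error we want
  .reduce((acc, entry) => {                      // group and count by customer
    acc[entry.customer] = (acc[entry.customer] ?? 0) + 1;
    return acc;
  }, {});

console.log(counts); // { c1: 2 }
```

This works, but note that the whole file has to fit in memory as one string, which is exactly the problem the CTO runs into next.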

So we give this script to the CTO, the CTO is really happy and is telling us, great job, great first day, looking forward to seeing you tomorrow. So experience plus plus. All right. The next day, the CTO is telling us, well, you know that script you gave me yesterday? Actually, now there is a problem. I am seeing this error called RangeError: Invalid string length. Now I'm going to show you the code again. Can anyone tell me in which line we have a problem? Come on, don't be shy. Okay, somebody says line number three. Good guess. Why is this a problem? The problem is there because we are basically loading the entire content of the file in memory. We are just saying take all that data and store it into a variable as a string. And if the file starts to be relatively big, we are starting to talk about the order of gigabytes, you can start to see that particular problem. Or if it's over two gigabytes, you will see another problem, which is very similar. So the problem conceptually here is that we are preloading all the information in memory, which for big files is not something that is going to work. So what is an ideal approach instead? The best approach would be, okay, why not parse this file line by line? At the end of the day, we just care about reading one line at a time. We don't need to preload everything in memory. So an ideal implementation would look like this. We just load one line, we parse it, we filter it, and we start to aggregate the result. We get the next line and in this case, when we try to filter, we realize we don't care about this line, so it just gets disregarded. We move on to the next line; again we parse, we filter, and we aggregate. And again, just to show you another example, we parse, we filter, we aggregate, and in this case we find again the first customer, so we can just increment the counter to 2.

3. Introduction to Iterators in JavaScript

Short description:

In this part, we learn about iterators in JavaScript, which allow us to process data one item at a time from a collection. Iterators are a lazy abstraction that can handle large or endless data sets. JavaScript has had iterators since ECMAScript 2015, and we can use them with arrays and other objects. Examples demonstrate how to use the for-of loop to iterate over an array and print each item, as well as how to iterate over a string and retrieve each character.

So in this case you can see that the current state is something that keeps evolving with every single line, and at the end when there are no more lines left to read, the current state is going to be our final result. So we can implement all of this with iterators. So today we're going to learn a little bit more about that.

Now before getting into this, let me introduce myself. My name is Luciano. I am an AWS Serverless Hero and a Microsoft MVP. I work as a senior architect for a company called fourTheorem. If you think my name is familiar, maybe you've bumped into this book called Node.js Design Patterns. If you haven't, please check it out. I'm really curious to hear your opinion. And you can connect with me, I'm very friendly, we can keep chatting later online. So these are some of my contacts, but it should be easy to find me online.

Now very quickly about fourTheorem. We are a consulting company specialized in AWS and serverless, so most of our work is basically helping other companies to implement or optimize serverless and AWS solutions. If all of that sounds interesting, again, I'm very open to chat, so definitely ping me later.

Now, why do we care about iterators? Iterators are a lazy abstraction that allows us to represent the concept of repetition in programming languages. And the cool thing about that is that it basically allows us to process things one item at a time, from a specific collection. The even cooler thing is that we can process large or even endless data sets. Imagine we have a sensor that is reading data, maybe, I don't know, temperature, and it's constantly sending us data. We will always receive more and more data. How do we process that, if not in a lazy way, just getting chunks of data one at a time?

So there is good news: the concept of iterators already exists in JavaScript, and that has been true since ECMAScript 2015. It has already been a few years that we have access to iterators in JavaScript. So let's refresh a little bit of the syntax that you might already be familiar with, and let's try to learn more, just starting from some examples. Let's look at this code. We have an array, and we do a for...of on this array, and we print every single item using this loop. Now, the output is pretty obvious. We just see foo, bar, and baz. The cool thing here is that we are using this particular syntax, for...of, that easily gives us access to a repetition of all the values in this array. But you might be wondering, can we do this only with arrays? Can we do it with other kinds of objects? Let's try it with strings. What happens here is that you get, for every iteration, one character in the original string.
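The two examples from this part look roughly like this:

```javascript
// for...of over an array: one item per iteration
const values = ['foo', 'bar', 'baz'];
for (const value of values) {
  console.log(value); // foo, bar, baz
}

// for...of also works on strings: one character per iteration
for (const char of 'hello') {
  console.log(char); // h, e, l, l, o
}
```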

4. Iterators and Generators in JavaScript

Short description:

The for-of syntax works with every iterable object, including user-defined iterables. The spread syntax allows passing arguments to a function call from an iterable object. Many APIs, such as Promise.all and Map, support iterables.

So, somehow this for...of syntax is working also for strings. Now, what if we try this with plain JavaScript objects? In this case, we actually get an error, which is a bit surprising. But the cool thing about this error is that it's telling us that this object is not iterable. So you can see that there is already a concept of what it means to be iterable. So we can try to find out by looking at the MDN documentation. Sorry, this is very bright, I'm realizing now.

The cool thing about this page is that it's telling us for...of works with every iterable object, but also, if we look down below, we can see that it also works with user-defined iterables. So that's already giving us a way that we can create our own custom iterables if we want to. Now, another quick piece of syntax that you might have seen, this is the spread syntax. You can use it, for instance, to pass arguments to a function call, taking them one by one from an iterable object. And again, if we look at the documentation, sorry again for waking you up, what we can see is that it's telling us this works on iterable objects. So any object that is iterable is going to support this particular syntax. If we keep looking, there are actually many APIs that are said to support iterables. You might have seen Promise.all, you might have seen Promise.allSettled, or even Map and Set, which were used in some of the talks before.
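A few quick examples of the spread syntax and iterable-accepting APIs mentioned here:

```javascript
// Spread works on any iterable: arrays, strings, Maps, Sets, …
console.log(Math.max(...[3, 1, 2])); // 3

// Spreading a string yields its characters
console.log([...'abc']); // [ 'a', 'b', 'c' ]

// Many built-in APIs accept any iterable, not just arrays:
// the Set constructor consumes the string one character at a time
const unique = new Set('banana');
console.log([...unique]); // [ 'b', 'a', 'n' ]
```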

5. Generators and Iterators in JavaScript

Short description:

Let's explore generators in JavaScript. A generator function allows pausing and resuming the execution, and a generator object can be iterated by calling next(). We can also use for-of to iterate over generator objects. Iterators and iterables are different concepts, with iterators acting as a cursor on a collection and iterables being the idea of a collection that can be iterated over. JavaScript has iteration protocols that define iterable and iterator objects. An iterator object must have a next method that returns an object with the keys done and value.

So, let's go into trying to find out more about what iterable actually means. But in order to do that, I think it's very useful to learn a bit more about generators, which I think is still a concept that is not so well known in the JavaScript landscape.

If you've never seen generators, there is this concept of generator functions and generator objects. A generator function looks like a regular function, it's just a little bit special: you have to put an asterisk there. And what we gain if we put that asterisk is that now we can use this keyword called yield.

And what yield does is kind of a special kind of return. It allows you to return a value, but rather than stopping the execution of the particular function, it's going to remember exactly where we stopped the last time. So it's more like a pause than actually stopping the function execution. And we can resume that execution from the point where it was left on the previous iteration.

But how do we use a generator function? Well, we can invoke it as a regular function, and that is going to return to us a generator object. And on a generator object, we can keep calling .next(), and every time we receive back an object that contains the keys done and value, which are basically telling us whether there is more data, yes or no, and what the current value is.

Now let me show you a maybe more interesting example where, sorry, where we have multiple yield points. This is a fruit generator, so at every step we are returning a new fruit, or rather we are yielding a new fruit. And when we keep calling .next(), you can see that we get a different item every single time, so you can imagine the execution is pausing when we yield. And when we call .next() again, it's gonna continue from the line where we paused the last time.

The cool thing is that we can also use for...of, and if we use for...of on generator objects, we just get straight away access to the values. We don't have to think about that wrapper object anymore, and the loop is gonna stop automatically when we reach done equals true.
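A sketch of the fruit generator from this part (the specific fruit values are assumptions):

```javascript
// A generator function: the asterisk enables the yield keyword.
// Each yield pauses execution; the next call to .next() resumes
// from the line where we paused.
function* fruitGenerator() {
  yield '🍑';
  yield '🍉';
  yield '🍋';
}

// Calling it returns a generator object
const fruits = fruitGenerator();
console.log(fruits.next()); // { value: '🍑', done: false }
console.log(fruits.next()); // { value: '🍉', done: false }
console.log(fruits.next()); // { value: '🍋', done: false }
console.log(fruits.next()); // { value: undefined, done: true }

// for...of unwraps { done, value } for us and stops at done: true
for (const fruit of fruitGenerator()) {
  console.log(fruit);
}
```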

Now we can finally talk about iterators and iterables. Now these are two different concepts, very similar, very often mixed up together, so let's try to figure out what are the differences between the two. I like to think about iterator objects as a cursor on a collection of data. So you can imagine this yellow arrow there is an iterator object. The only thing we can do on it is just say, give me the current value or move to the next value.

While an iterable object is a slightly more generic concept, it's more the idea that you have a collection and this collection is somehow iterable, and the way you can iterate over it is to get an iterator from that collection. So again, you can imagine iterator as the cursor, iterable as the idea of a collection that you can get a cursor from.

Now if we want to understand what JavaScript considers as iterable and iterator, we need to look at those iteration protocols. The first protocol is the iterator protocol, which tells us: I'm going to consider an object an iterator if it complies with this definition, and the definition says the object needs to have a method called next, and this method, every time you call it, needs to return an object with the keys done and value.

Now just to show you a very quick example, here we have a factory function that allows us to create an iterator that represents a countdown. So if we call this function, for instance with 3, it's going to create an iterator that will give us the values 3, 2, 1, 0, right? So we want to implement this iterator, so we need to comply with the definition of the iterator protocol. So we need to have an object that has a next method, which returns objects with the keys done and value. So this is basically a factory function creating an iterator for us. And once we do that, we can instantiate it; countdown is an iterator.
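A sketch of that factory function, complying with the iterator protocol (the exact names are assumptions):

```javascript
// A factory function that returns an iterator object:
// it has a next() method returning { done, value }.
function createCountdown(from) {
  let nextVal = from;
  return {
    next() {
      if (nextVal < 0) {
        return { done: true, value: undefined };
      }
      return { done: false, value: nextVal-- };
    },
  };
}

const countdown = createCountdown(3);
console.log(countdown.next()); // { done: false, value: 3 }
console.log(countdown.next()); // { done: false, value: 2 }
console.log(countdown.next()); // { done: false, value: 1 }
console.log(countdown.next()); // { done: false, value: 0 }
console.log(countdown.next()); // { done: true, value: undefined }
```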

6. Iterators, Iterables, and Async Equivalents

Short description:

So we can keep calling next() to get all the values until done is true. Generators implement the iterator protocol, making the code more readable. JavaScript objects can be made iterable by implementing Symbol.iterator. Iterable objects can be used with the for-of syntax. The iterator and iterable protocols are not mutually exclusive. Generator functions make objects both iterators and iterables. Iterators and iterables also have async equivalents.

So we can keep calling .next() to get all the values until done is true. Now, one cool thing that is not very well known, and this is the reason why I introduced generators, is that if you know how to write generators, all the code you see here is pretty much equivalent to this code here. Because generators behind the scenes implement the iterator protocol for us. So if you are familiar with generators, it's just much nicer to read and write this kind of code.
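Because generator objects implement the iterator protocol themselves, the same countdown can be sketched much more compactly (names are assumptions):

```javascript
// The generator object returned by this function implements
// next() and the { done, value } wrapper for us.
function* createCountdown(from) {
  for (let value = from; value >= 0; value--) {
    yield value;
  }
}

const countdown = createCountdown(3);
console.log(countdown.next()); // { value: 3, done: false }
console.log(countdown.next()); // { value: 2, done: false }
// …and so on, exactly like the hand-written iterator
```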

Now, let's look at the iterable protocol. How does JavaScript say that an object is iterable? It's actually pretty simple. You just need to implement a special function called Symbol.iterator. And that function needs to return an iterator object. So it's basically marking itself as: if you need an iterator for me, just call Symbol.iterator. And we can easily implement that countdown example as before to make it an iterable. So this time when we create a countdown, it's going to be an iterable object. So it needs to have Symbol.iterator, which returns an iterator. And this is exactly the same code as before, because before we created an iterator. So you can see that we are just adding an extra layer of wrapping.

Now, the cool thing is that when an object is iterable, you can use the for...of syntax. So this is generally much nicer as an interface to give to your users, because they are more in control. They can either call .next() manually on the iterator, or they can use for...of. But if you know generators, as we should know at this point, you can basically take all this code and rewrite it this way. How many of you realize that this is exactly the same code as before for the iterator as well? And I know that this can be a little bit confusing and also surprising. But the cool trick is that basically the two protocols are not mutually exclusive. Here, we can implement an iterable iterator, so an object that is both an iterable and an iterator. The first part here is implementing the iterator protocol, while the second part is implementing the iterable protocol. By just saying return this, because the object itself is an iterator, it's saying: if you need an iterator from me, I am already an iterator. And this is exactly what's happening behind the scenes when you use generators, and that's why, if you use a generator function, your objects are going to be both iterators and iterables.
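A sketch of both shapes (the factory names are assumptions): first the countdown as a plain iterable, then as an "iterable iterator" that returns itself, which is what generator objects do behind the scenes:

```javascript
// An iterable: Symbol.iterator returns a fresh iterator
function createCountdown(from) {
  return {
    [Symbol.iterator]() {
      let nextVal = from;
      return {
        next() {
          if (nextVal < 0) return { done: true, value: undefined };
          return { done: false, value: nextVal-- };
        },
      };
    },
  };
}

// Because it is iterable, for...of just works
for (const value of createCountdown(3)) {
  console.log(value); // 3, 2, 1, 0
}

// An iterable iterator: implements both protocols at once
function createCountdownBoth(from) {
  let nextVal = from;
  return {
    next() { // iterator protocol
      if (nextVal < 0) return { done: true, value: undefined };
      return { done: false, value: nextVal-- };
    },
    [Symbol.iterator]() { // iterable protocol: "I am my own iterator"
      return this;
    },
  };
}
```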

Now, everything I just told you could be a little bit weird if you start to think, wait, you showed me an example that was involving reading from a file. That's an asynchronous operation. You just showed me protocols that don't take promises or async into account, so what do we do in that case? Well, it turns out that iterators and iterables also have an async equivalent, and I'm going to quickly show you the difference in their protocols. So for the async iterator protocol, it's pretty much the same story, except that rather than returning an object with done and value, we return a promise that eventually is going to resolve to an object with the keys done and value. And the easiest way to do that is just to make the next function async, if you just want a shortcut. Very similar, the async iterable protocol, it's pretty much the same story, except this time we have to use Symbol.asyncIterator, and we have to return an async iterator of course.
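Spelled out by hand, the two async protocols could look like this (a sketch; the names and the delay are assumptions):

```javascript
// Async iterable protocol: Symbol.asyncIterator returns an async iterator.
// Async iterator protocol: next() returns a Promise of { done, value };
// making next() an async function is the easy shortcut.
function createAsyncCountdown(from, delay = 100) {
  return {
    [Symbol.asyncIterator]() {
      let nextVal = from;
      return {
        async next() {
          if (nextVal < 0) {
            return { done: true, value: undefined };
          }
          // simulate asynchronous work before producing the value
          await new Promise((resolve) => setTimeout(resolve, delay));
          return { done: false, value: nextVal-- };
        },
      };
    },
  };
}

(async () => {
  for await (const value of createAsyncCountdown(3)) {
    console.log(value); // 3, 2, 1, 0
  }
})();
```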

7. Using Generators and Iterators for File Processing

Short description:

We can use generators to combine async and generator functions, giving us both the await and yield superpowers. This allows us to implement the async-iterator and async-iterable. By using the for-await-of loop, we can await promises before moving to the next iteration, making it useful for processing paginated data sets. To consume a file in chunks, we can use readable streams in Node.js. However, chunks may not match a line, so we create a function to yield every line from the original AsyncIterable. By importing the necessary utilities and initializing the readable stream, we can iterate over each line, parse it as JSON, filter, and accumulate the data per customer. The resulting state is logged, and the solution impresses the CTO, even with large files. Great job, day two is successfully completed.

Not surprising again, we can use generators, and the cool thing is that nothing is stopping you from making a function both async and a generator, and you get both superpowers there. You get both the await superpower and the yield superpower. And when you do that, surprise, it's implementing for you both the async iterator and the async iterable protocols.

Once we do that, we can actually use this for await syntax, and in this case I'm creating a countdown where I am waiting one second before the promise is resolved. So basically you can see there in the animation that it takes a little bit of time before we see the next item, and this particular for await of loop is taking care of awaiting the promise for us before going to the next iteration. So this is something that can be really cool for instance if you are processing a paginated data set and you need to do a request to get the next page.
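A minimal sketch of that async countdown as an async generator (the one-second delay matches the talk's example; the helper name is an assumption):

```javascript
// An async generator: both async (can await) and a generator (can yield).
// The resulting object implements both async iteration protocols for us.
async function* createAsyncCountdown(from, delay = 1000) {
  for (let value = from; value >= 0; value--) {
    // simulate slow asynchronous work, e.g. fetching the next page
    await new Promise((resolve) => setTimeout(resolve, delay));
    yield value;
  }
}

(async () => {
  // for await...of awaits each promise before moving to the next iteration
  for await (const value of createAsyncCountdown(3, 100)) {
    console.log(value); // 3, 2, 1, 0, one delay apart
  }
})();
```

The same shape works for a paginated API: await the request for the next page, yield its items, repeat until there are no more pages.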

Now that we know all of this, very quickly let's try to rewrite our original example to take advantage of iterators. Now the first thing that we need to do is think about what is the best way to consume a file in chunks rather than reading everything in one go. Now with Node.js there are different ways, but the one that I prefer the most is streams, and don't worry too much if you don't know streams. I'm going to try to simplify it a little bit. The idea is that we can import this function, createReadStream, which gives us a readable stream that we can use to create an instance that allows us to consume chunks of information from a file.

Now, since in Node.js readable streams are async iterable, we can just use for await...of, and we are going to read one piece of the file at a time, which here we are calling chunk. But there is a bit of a problem here. A chunk is an arbitrary amount of data. By default I think it is around 4 kilobytes. It's something you can configure, but that's not necessarily going to match a line. Maybe we have a super big line which is more than 4 kilobytes, maybe we have multiple lines and at some point a line gets broken in the middle. So this is not really convenient.

So we can create a function here. I'm not going to go too much into detail in the implementation, but this function takes an AsyncIterable of any kind and it gives you back a new AsyncIterable which yields every single line from that original AsyncIterable. So we are wrapping our iterable into a new one that is a little bit more specialized. At this point we have all the ingredients we need to fulfill our task. We can import our createReadStream, we can import our byLine utility, we can initialize the readable stream from the log file. At this point we initialize our state; at the beginning we don't know of any customer, so it's just going to be an empty dictionary. And now what we can do is this: for await (const line of byLine(readable)). This might be a little bit scary, but what it actually means is: take this readable, wrap it with the byLine utility, and at that point we have an AsyncIterable object and we want to get every single value in a loop. So at this point we are basically doing iterations, and for every iteration we are processing one line from the original source file. All the code inside of here is exactly the same as before, so we are just parsing as JSON and filtering and then basically accumulating the data per customer. So once all of that iteration is done, we can just log the resulting state there. Now we give this file to the CTO, the CTO is impressed, he's trying files that are multiple gigabytes and it works seamlessly, so everyone is happy, good job, we completed day two very successfully.
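A self-contained sketch of that pipeline. In the talk the source is createReadStream(logPath), which in Node.js is async iterable; here the chunks are simulated in memory so the example runs on its own, and the byLine name, the log shape, and the error code are assumptions:

```javascript
// byLine: wraps any async iterable of text chunks and yields whole lines.
// Chunk boundaries rarely match line boundaries, so the trailing partial
// line is buffered until more data arrives.
async function* byLine(chunks) {
  let buffered = '';
  for await (const chunk of chunks) {
    buffered += chunk;
    const lines = buffered.split('\n');
    buffered = lines.pop(); // the last piece may be an incomplete line
    yield* lines;
  }
  if (buffered.length > 0) {
    yield buffered; // flush whatever is left at the end
  }
}

// In the talk the source is createReadStream(logPath); here the chunks
// are simulated, with a line deliberately split across two chunks.
async function* fakeChunks() {
  yield '{"customer":"c1","msg":{"error":"FC';
  yield 'KD"}}\n{"customer":"c1","msg":{"error":"FCKD"}}\n';
  yield '{"customer":"c2","msg":{"error":"OK"}}\n';
}

(async () => {
  const state = {}; // running count per customer
  for await (const line of byLine(fakeChunks())) {
    const entry = JSON.parse(line);                            // parse
    if (entry.msg.error === 'FCKD') {                          // filter
      state[entry.customer] = (state[entry.customer] ?? 0) + 1; // aggregate
    }
  }
  console.log(state); // { c1: 2 }
})();
```

Because only one chunk and one partial line are ever held in memory, this approach works for files of arbitrary size.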

8. Using for loop instead of map, filter, and reduce

Short description:

In the original implementation, we used map, filter, and reduce, while in this implementation we just did a for loop. There are two proposals under the TC39 to add utility functions in iterators and async iterators, but until they are merged, we have to do other things.

Now you might have a question though. In the original implementation, we used map, filter, and reduce, while in this implementation we just did a for loop. Which, I don't know, maybe you would like more to use map, filter, and reduce, most people will find that a little bit more expressive, so why didn't I do that? Well, the problem is that it's not 100% possible yet to do that. There are actually two proposals under the TC39 to add this kind of utility functions in iterators and async iterators. So until those two proposals get merged, we have to do other things.
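Until the iterator-helper proposals land everywhere, one of those "other things" is to hand-roll lazy map and filter as generator functions over any sync iterable; a minimal sketch:

```javascript
// Hand-rolled, lazy equivalents of the proposed iterator helpers.
// Each one wraps any synchronous iterable and yields items on demand.
function* map(iterable, fn) {
  for (const item of iterable) {
    yield fn(item);
  }
}

function* filter(iterable, predicate) {
  for (const item of iterable) {
    if (predicate(item)) {
      yield item;
    }
  }
}

// Nothing is computed until the result is consumed, one item at a time
const odds = filter([1, 2, 3, 4, 5], (n) => n % 2 === 1);
const scaled = map(odds, (n) => n * 10);
console.log([...scaled]); // [ 10, 30, 50 ]
```

The same trick works for the async side by making these async generator functions and using for await...of inside them.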

9. Poly-filling Implementation and Free Workshops

Short description:

We can poly-fill this implementation using core-js. I have an implementation using core-js and map, filter, and reduce. I have two free workshops on iteration protocols and Node.js streams. The protagonist of the story gained more experience and will level up. Good use cases for generators are processing paginated data sets and implementing a queue consumer. I used to generate this presentation.

And one of the things that we can do is actually poly-fill this implementation using core-js. And if you're curious, I actually have an implementation of this particular solution using core-js and using map, filter, and reduce. Now, you can find that implementation in the code link down there.

And I also have two bonus things for you, I have two free workshops. One on iteration protocols, which goes a lot more into detail. You can go on GitHub and just do the exercises yourself, totally free. And the other one is about Node.js streams, if you are more curious about this particular thing. I see a lot of people taking pictures, so I'm just gonna go to the last slide, where you have a QR code, you can grab all the slides, and then take all the links from there. Thank you very much.

Well, first, of course, we all want to know what happened to the protagonist of your little story. Did they get a raise after doing all this? They got more experience, so eventually they will level up and decide how to spend their points. Alright. Perfect. I'd love some more questions. There are not so many questions and I don't believe that there are no questions. We can talk about Final Fantasy as well. So one of the questions is, I assume the final solution will lead, I'm gonna just approve it. It's all live. It's all live. What are good use cases for generators? So there are some use cases that are actually used in production which I think are really cool. One of them, as I mentioned in my talk, is when you have to process paginated data sets, maybe you are fetching information from an API and you want to read multiple pages from that API. Using generators, or the concept of async iterables in general, is a good, elegant solution and will give you very clean code on the consumer side. Another use case is when you want to implement a queue consumer, so when you want to poll tasks to do from a queue; again you can implement it in the same way. So I think these are good use cases.

Yes, this is a question I had as well, so I'm just going to ask it even if it's not the top question. How did you generate this presentation? How did you make these slides? So I used, yeah, I think it's like reveal.js but a little bit more involved, yeah. Right. So it's JavaScript, HTML, CSS. Thank you. I wanted you to know that too.


Q&A on Final Fantasy and ReadStream

Short description:

Thank you for the questions! My favorite Final Fantasy game is VIII, and I'm also a fan of the character. As for controlling the chunk size of ReadStream, I'm not sure if it's related to this track, but volunteers can assist with that.

So thank you for whomever asked that question. Another question. Let me see. I don't know if I want to ask this question. Oh, I was hoping for that one. Well, in that case, please. What is your favorite Final Fantasy? I am a bit undecided between Final Fantasy VII and VIII. I'll go with VIII just because everyone else would say VII, just to be different. You're not the only one. That's good. You have community here. It's so good. So, what's my favorite Final Fantasy character? How can you control the chunk size of ReadStream? Wait, I think this might be for the other track. I'm just reading from my screen, so volunteers, please help me out.

Check out more articles and videos

We constantly think of articles and videos that might spark Git people interest / skill us up or help building a stellar career

Remix Conf Europe 2022Remix Conf Europe 2022
23 min
Scaling Up with Remix and Micro Frontends
Top Content
Do you have a large product built by many teams? Are you struggling to release often? Did your frontend turn into a massive unmaintainable monolith? If, like me, you’ve answered yes to any of those questions, this talk is for you! I’ll show you exactly how you can build a micro frontend architecture with Remix to solve those challenges.
Remix Conf Europe 2022Remix Conf Europe 2022
37 min
Full Stack Components
Top Content
Remix is a web framework that gives you the simple mental model of a Multi-Page App (MPA) but the power and capabilities of a Single-Page App (SPA). One of the big challenges of SPAs is network management resulting in a great deal of indirection and buggy code. This is especially noticeable in application state which Remix completely eliminates, but it's also an issue in individual components that communicate with a single-purpose backend endpoint (like a combobox search for example).
In this talk, Kent will demonstrate how Remix enables you to build complex UI components that are connected to a backend in the simplest and most powerful way you've ever seen. Leaving you time to chill with your family or whatever else you do for fun.
JSNation Live 2021JSNation Live 2021
29 min
Making JavaScript on WebAssembly Fast
Top Content
JavaScript in the browser runs many times faster than it did two decades ago. And that happened because the browser vendors spent that time working on intensive performance optimizations in their JavaScript engines.Because of this optimization work, JavaScript is now running in many places besides the browser. But there are still some environments where the JS engines can’t apply those optimizations in the right way to make things fast.We’re working to solve this, beginning a whole new wave of JavaScript optimization work. We’re improving JavaScript performance for entirely different environments, where different rules apply. And this is possible because of WebAssembly. In this talk, I'll explain how this all works and what's coming next.
React Summit 2023React Summit 2023
24 min
Debugging JS
As developers, we spend much of our time debugging apps - often code we didn't even write. Sadly, few developers have ever been taught how to approach debugging - it's something most of us learn through painful experience.  The good news is you _can_ learn how to debug effectively, and there's several key techniques and tools you can use for debugging JS and React apps.

Workshops on related topic

React Day Berlin 2022React Day Berlin 2022
86 min
Using CodeMirror to Build a JavaScript Editor with Linting and AutoComplete
Top Content
Using a library might seem easy at first glance, but how do you choose the right library? How do you upgrade an existing one? And how do you wade through the documentation to find what you want?
In this workshop, we'll discuss all these finer points while going through a general example of building a code editor using CodeMirror in React, all while sharing some of the nuances our team learned about using this library and some problems we encountered.
TestJS Summit - January, 2021
173 min
Testing Web Applications Using Cypress
This workshop will teach you the basics of writing useful end-to-end tests using Cypress Test Runner.
We will cover writing tests, covering every application feature, structuring tests, intercepting network requests, and setting up the backend data.
Anyone who knows JavaScript programming language and has NPM installed would be able to follow along.
Node Congress 2023
63 min
0 to Auth in an Hour Using NodeJS SDK
Passwordless authentication may seem complex, but it is simple to add to any app using the right tool.
We will enhance a full-stack JS application (Node.js backend + React frontend) to authenticate users with OAuth (social login) and One-Time Passwords (email), including:
- User authentication - managing user interactions, returning session / refresh JWTs
- Session management and validation - storing the session for subsequent client requests, validating / refreshing sessions
At the end of the workshop, we will also touch on another approach to code authentication using frontend Descope Flows (drag-and-drop workflows), while keeping only session validation in the backend. With this, we will also show how easy it is to enable biometrics and other passwordless authentication methods.
Table of contents
- A quick intro to core authentication concepts
- Coding
- Why passwordless matters
Prerequisites
- IDE of your choice
- Node 18 or higher
React Summit US 2023
96 min
Build a powerful DataGrid in a few hours with AG Grid
Does your React app need to efficiently display lots (and lots) of data in a grid? Do your users want to be able to search, sort, filter, and edit data? AG Grid is the best JavaScript grid in the world and is packed with features, highly performant, and extensible. In this workshop, you’ll learn how to get started with AG Grid, how we can enable sorting and filtering of data in the grid, cell rendering, and more. You will walk away from this free 3-hour workshop equipped with the knowledge for implementing AG Grid into your React application.
We all know that rolling our own grid solution is not easy, and let's be honest, is not something that we should be working on. We are focused on building a product and driving forward innovation. In this workshop, you'll see just how easy it is to get started with AG Grid.
Prerequisites: Basic React and JavaScript
Workshop level: Beginner
Node Congress 2023
49 min
JavaScript-based full-text search with Orama everywhere
In this workshop, we will see how to adopt Orama, a powerful full-text search engine written entirely in JavaScript, to make search available wherever JavaScript runs. We will learn when, how, and why deploying it on a serverless function could be a great idea, and when it would be better to keep it directly in the browser. Forget APIs, complex configurations, etc.: Orama will make it easy to integrate search into projects of any scale.
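As a rough, Orama-independent illustration of what a full-text engine does under the hood, here is a toy inverted index in plain JavaScript. The names (`buildIndex`, `search`) are ours, not Orama's API, and this omits everything that makes a real engine useful (stemming, ranking, typo tolerance):

```javascript
// Toy inverted index: maps each token to the set of document ids containing it.
// Illustrates only the core idea behind full-text engines like Orama.
function buildIndex(docs) {
  const index = new Map();
  docs.forEach((doc, id) => {
    for (const token of doc.toLowerCase().split(/\W+/).filter(Boolean)) {
      if (!index.has(token)) index.set(token, new Set());
      index.get(token).add(id);
    }
  });
  return index;
}

function search(index, term) {
  return [...(index.get(term.toLowerCase()) ?? [])];
}

const docs = [
  'Orama is a full-text search engine',
  'JavaScript runs everywhere',
  'Full-text search in JavaScript',
];
const index = buildIndex(docs);
console.log(search(index, 'search'));     // [0, 2]
console.log(search(index, 'JavaScript')); // [1, 2]
```

Because the whole structure is plain JavaScript data, it can live wherever JavaScript runs - browser or serverless function - which is the portability point the workshop makes about Orama.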
Node Congress 2022
128 min
Back to the basics
“You’ll never believe where objects come from in JavaScript.”
“These 10 languages are worse than JavaScript in asynchronous programming.”
Let’s explore some aspects of JavaScript that you might take for granted in the clickbaitest workshop.
To attend this workshop you only need to be able to write and run NodeJS code on your computer. Both junior and senior developers are welcome.
Objects are from Mars, functions are from Venus
Let’s deep-dive into the ins and outs of objects and then zoom out to see modules from a different perspective. How many ways are there to create objects? Are they all that useful? When should you consider using them?
If you’re now thinking “who cares?“, then this workshop is probably for you.
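A taste of the "how many ways" question above - a quick sketch of a few of JavaScript's object-creation mechanisms (object literal, `Object.create`, constructor function, class, and factory function):

```javascript
// Several of JavaScript's object-creation mechanisms, side by side.
const literal = { kind: 'literal' };

// Object.create: the new object inherits from the given prototype.
const fromProto = Object.create({ greet() { return 'hi'; } });

// Constructor function (pre-ES2015 style).
function Legacy() { this.kind = 'constructor'; }
const constructed = new Legacy();

// Class syntax (sugar over the prototype mechanism).
class Modern { constructor() { this.kind = 'class'; } }
const instance = new Modern();

// Factory function: just returns a fresh object, no `new` involved.
const factory = () => ({ kind: 'factory' });
const made = factory();

console.log(literal.kind, constructed.kind, instance.kind, made.kind);
console.log(fromProto.greet()); // 'hi' - inherited via the prototype chain
```

Each of these produces a perfectly ordinary object; which one is "useful" depends on whether you need shared behavior (prototypes, classes) or just data (literals, factories) - exactly the trade-off the workshop digs into.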
Asynchronous JavaScript: the good? parts
Let’s have an honest conversation.
I mean… why, oh why, do we need to bear with all this BS? My guess is that it depends on perspective too. Let’s first assume a hard truth about it: it could be worse… then maybe we can start seeing the not-so-bad-even-great features of JavaScript regarding non-blocking programs.