Building Reliable Backends with Durable Execution

Sylwia Vargas
21 min
04 Apr, 2024


Video Summary and Transcription

This talk explores the paradigm of message queues for reliable backend execution. It highlights the benefits of message queues, such as guaranteed delivery and offloading of long-running processes. The drawbacks of using queues are discussed, including the complexity of managing infrastructure and applications. The solution of using a reliability layer called Inngest is presented, which allows for non-blocking background tasks and provides a dashboard for monitoring and managing jobs. The talk also emphasizes the importance of reliability in building software systems and introduces the expanding scope and functionality of Inngest.

1. Introduction to Message Queues

Short description:

Hello, everyone. Welcome to my talk about reliability, backend, and execution. I will discuss the paradigm that makes life easier. We are now living in constant 90s nostalgia. The 90s brought us many great things, but there is one thing we could say goodbye to: queues. Message queues are a form of asynchronous service-to-service communication. They allow for guaranteed delivery and offloading of long-running processes.

Hello, everyone. Welcome to my talk where, for the next 20 minutes, I will talk about reliability, backend, and execution. Just a quick introduction: my name is Sylwia Vargas. I'm from Poland, I really love pierogi, and previously I worked at StackBlitz. Now I'm a developer relations lead at Inngest.

This talk is about the paradigm that makes life easier. But before we talk about the good, let's talk about the bad. We are now living in constant 90s nostalgia, and of course, this is no surprise. The 90s brought us a lot of great things that are still with us. However, there is one thing we could possibly say goodbye to: queues.

So let's look at what message queues are. A message queue is a form of asynchronous service-to-service communication used in serverless and microservices architectures. Messages are stored on the queue until they are processed and deleted, and each message is processed only once, by a single consumer. But here I need to interject, because in actuality multiple workers can consume messages from a queue; in order to preserve the ordering of tasks, they will need to execute serially. But back to the definition. Message queues can be used to decouple heavyweight processing, to buffer or batch work, and to smooth spiky workloads. So you can think about it this way: once you add something to the queue, it will reach its destination, one by one. The delivery is guaranteed, and what's happening in the queue does not impact other parts of the infrastructure. And queues can be really massive.

So let's recap. With queues, you get guaranteed delivery because you know that once something is added to the queue, it will leave it only once it's processed. And queues allow developers to offload long-running processes to the background so that your application does not choke. You would use queues for data-intensive processes or when integrating with external systems.
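
To make this concrete, here is a minimal sketch of that producer/consumer pattern using BullMQ with a local Redis instance; the queue name, job payload, and handler are illustrative assumptions rather than anything from the talk. The web process only enqueues the job, and a separate worker picks it up in the background.

```ts
import { Queue, Worker } from "bullmq";

const connection = { host: "localhost", port: 6379 };

// Producer: enqueue a long-running job instead of blocking the request handler.
const videoQueue = new Queue("video-processing", { connection });

export async function onUploadCompleted(videoId: string) {
  // The job is persisted in Redis, so it will eventually be delivered to a
  // worker even if this web process crashes right after the call returns.
  await videoQueue.add("transcode", { videoId });
}

// Consumer: a separate worker process takes jobs off the queue one by one.
new Worker(
  "video-processing",
  async (job) => {
    // Heavy, data-intensive work happens here, off the request path.
    console.log(`Transcoding video ${job.data.videoId}`);
  },
  { connection }
);
```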

2. Drawbacks of Using Queues

Short description:

And another benefit of queues is horizontal scalability. However, there are drawbacks to using queues. Building additional infrastructure and managing complex applications can be a lot of work. In times of limited budgets and resources, it's worth considering if managing queues is the right choice. Instead, durable execution allows us to define workflow logic in our application code and ensures reliable execution.

And another benefit is horizontal scalability, because multiple messages can be processed in parallel. As workloads increase, your applications can handle high throughput while remaining reliable.
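
For scale-out, the same assumed BullMQ setup from above can simply run more consumers; the concurrency value and worker code here are illustrative, not from the talk.

```ts
import { Worker } from "bullmq";

const connection = { host: "localhost", port: 6379 };

// Each instance of this process is one consumer. Run several instances
// (or raise `concurrency`) to process more messages in parallel as load grows.
new Worker(
  "video-processing",
  async (job) => {
    console.log(`Worker ${process.pid} handling job ${job.id}`);
  },
  { connection, concurrency: 5 } // up to 5 jobs in flight per worker process
);
```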

However, there is a but. So let's look at this Reddit comment. Queues are great in data-intensive processes, as I said, that don't need to run on the main thread, because they execute asynchronously. The tasks are processed in the background and the application is still responsive. However, there are some drawbacks to queues, which this Reddit user delicately mentions in this quote: once you take something from the queue, the rest is on you, and the queuing service does not care anymore. So what does that even mean? Let's look at it. Queues are great when your application is simple. When it grows in complexity, or if it's distributed, you all of a sudden need to worry about a whole wealth of additional infrastructure that you need to build.

And it's going to be you who needs to build it. So, for example, you will need to build concurrency control, because you want to be able to control how many steps are executed at one time. Or, for example, debouncing, because we all know how costly it is when functions execute multiple times. Or state persistence and management, because now that you have a distributed or complex application, you have to share state across different functions and queues. Then there's also error handling, because what if, just hypothetically, one service provider has an outage? You will need to include retries for failures, and also timeouts. And in that case, you also need recovery tooling to understand and process the errors and failed events.
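
To give a taste of what "the rest is on you" means in practice, here is a generic sketch (not code from the talk) of the kind of plumbing you end up hand-rolling around a queue consumer: retries with exponential backoff plus a per-attempt timeout. Dead-lettering, debouncing, and cross-queue state sharing would all be additional code on top of this.

```ts
// Hand-rolled retries with exponential backoff and a per-attempt timeout.
// None of this is business logic, yet you own all of it once the message
// has left the queue.
async function withRetries<T>(
  task: () => Promise<T>,
  { attempts = 3, baseDelayMs = 500, timeoutMs = 10_000 } = {}
): Promise<T> {
  for (let attempt = 1; attempt <= attempts; attempt++) {
    try {
      return await Promise.race([
        task(),
        new Promise<never>((_, reject) =>
          setTimeout(() => reject(new Error("timeout")), timeoutMs)
        ),
      ]);
    } catch (err) {
      if (attempt === attempts) throw err; // out of retries: surface the failure
      await new Promise((resolve) =>
        setTimeout(resolve, baseDelayMs * 2 ** (attempt - 1))
      );
    }
  }
  throw new Error("unreachable");
}
```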

So this already sounds like a lot of work, and it's not even an exhaustive list. But you don't have to take my word for it. In times like these, when engineering budgets and headcounts are slashed, we as individual developers and engineers need to do more with less. So it is really worth asking at this point: do you really want to be in the business of managing and operating your own queues? Well, Matthew Druker, the CEO of SoundCloud, doesn't think we should. So if this is now common knowledge, why are people still using queues? Well, we are used to something that feels familiar and cozy, even if it's not the coziest solution. You can make everything work with just enough effort.

Fortunately, there is a better solution that builds on the concept of message queues. Instead of separating our infrastructure, such as queues, from our code, what if we could define our workflow logic purely in our application code and ensure it executes reliably? This is what durable execution gives us. Durable execution is, as the name says, durable: it guarantees that our code will run to completion, even if there are failures along the way.
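
As a rough illustration of what durable execution can look like in application code, here is a sketch using Inngest's TypeScript SDK; the event name, function ID, and domain helpers are made up for this example. Each step.run() call is checkpointed: completed steps are not re-executed when a later step fails and the run is retried, so the workflow resumes where it left off.

```ts
import { Inngest } from "inngest";

const inngest = new Inngest({ id: "my-app" });

// Placeholder domain functions -- stand-ins for your own business logic.
async function extractMetadata(fileId: string) {
  return { fileId, pages: 3 };
}
async function saveResults(metadata: { fileId: string; pages: number }) {}
async function notifyUser(userId: string) {}

// The workflow is plain application code. Each step is retried independently
// on failure, and its result is persisted between attempts.
export const processUpload = inngest.createFunction(
  { id: "process-upload" },
  { event: "app/upload.completed" },
  async ({ event, step }) => {
    const metadata = await step.run("extract-metadata", () =>
      extractMetadata(event.data.fileId)
    );

    await step.run("save-results", () => saveResults(metadata));
    await step.run("notify-user", () => notifyUser(event.data.userId));
  }
);
```

The design choice this illustrates is the one the talk is driving at: retries, timeouts, and state persistence live in the reliability layer rather than in hand-written queue plumbing, while the workflow itself stays readable application code.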
