Delightful Integration Tests With Testcontainers


Dockerized services are an excellent tool for creating repeatable, isolated environments ideal for integration tests. In this session, we'll look at the Testcontainers libraries, which provide a flexible and intuitive API for programmatically controlling the lifecycle of your service dependencies in Docker containers. Running databases, Kafka, Elasticsearch, and even cloud technologies straight from your test code ensures environment config is always up to date and consistent during local development and in CI pipelines.

You’ll learn everything necessary to start adding powerful integration tests to your codebase without the headache of managing external service dependencies manually!

21 min
03 Nov, 2022

Video Summary and Transcription

Testing is crucial for development and production, with integration tests becoming more popular. Testcontainers is a library that integrates with Docker to create reliable test environments. It is flexible and can be used with various frameworks and test libraries. The IDE walkthrough covers configuring the container and connecting it to the application. Testcontainers can also be used for complex operations and allows running tests with real dependencies.


1. Introduction to Integration Testing

Short description:

Testing is super important for development and production. Automated tests are crucial for releasing software. Integration tests have become more popular as applications rely on interactions with third-party systems. They provide a reliable test suite that catches real-world issues.

Hi. You're watching TestJS Summit, and this is Delightful Integration Tests with Testcontainers. Testing is very important. More projects should test their applications better, and I hope after this quick session you'll know how to write integration tests that you actually like, using the Testcontainers libraries.

My name is Alex Shoive and I work in Developer Relations at Atomic Jar, a startup originally created by the Testcontainers Java maintainers; now we have more people from different language ecosystems helping us work on Testcontainers. If you have any questions, you can find me online. I'd be happy to chat about anything Testcontainers, testing related, or just software engineering in general. So drop me a line.

Testing is super important because it lies on the critical path from development to production. If we don't have a good automated test suite, we cannot release things well. We need automated tests because we want to make sure that whenever we have something we potentially want to release, we can go through our pipeline without bottlenecking on any manual process. This is helpful during the normal development loop, but it's also super helpful when there are security or supply-chain security issues: you update third-party packages and then need to release quickly because those updates may contain vulnerability fixes. If you don't have a good test suite that you trust, that becomes a manual process and you are as good as exposed. But if you do, you can run your automated tests and release immediately, because you have confidence in your tests. This is very important, and lately the way we see which types of tests we want to run has shifted.

In the past, we had the testing pyramid: we ran a ton of unit tests that covered all possible scenarios, we had very good test coverage, and we still missed issues. So recently, independent teams have been sharing how they're rethinking the testing pyramid and putting more and more emphasis on integration tests. And it makes a lot of sense. Our applications have become smaller. We are mostly writing (and we are talking about backend applications here) microservices that talk to other APIs or to various technologies like databases, message brokers, or cloud technologies, and the application behavior is very much encoded in the interactions with those third-party systems rather than in the business logic within the particular application and how it transforms the data. So it does make sense to have fewer implementation-detail tests and to use integration tests, which run your application with its immediate environment, with all the components necessary for it to run properly as it would in production, but in your testing setup. That can be the bulk of our test suite, the tests we trust and rely on. We can still have end-to-end tests that run in an environment similar to production, where all the systems are spun up at the same time and we check the actual workflows as if it were a production environment with production data, or similar data, but in a much larger environment. So for a test suite that you run everywhere, on your machine, on your colleague's machine, in your CI, integration tests hit the sweet spot between simplicity of setup and how many issues with real-world technologies they can catch. That's why they are getting more and more popular.

2. Introduction to Test Containers

Short description:

Testcontainers is a library that integrates with Docker to create ephemeral environments for running third-party service dependencies. It allows you to test your application with real dependencies, making your tests more reliable. Testcontainers uses Docker as the environment to spin up containers. However, Docker on its own is sometimes inflexible for integration tests. This is where Testcontainers comes in, providing programmatic access to create and manage containers for testing.

This brings us to Testcontainers. Testcontainers is a family of libraries in different languages, including the testcontainers-node implementation that works for JavaScript and TypeScript. They integrate with Docker to create ephemeral environments where you can run the third-party service dependencies that your application requires. You can run your databases, you can run your Kafka, you can run your Elasticsearch, you can run your LocalStack if you work with AWS technologies.

You can run them in Docker containers, your application has full control over their lifecycle, and your tests have full control over their configuration. So you can test your application with real dependencies and know that it works as expected.
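
A minimal sketch of that idea with the testcontainers-node library (the image tag and port are illustrative; the full walkthrough comes later in the talk):

```ts
// Assumes the `testcontainers` npm package and a reachable Docker environment.
import { GenericContainer } from "testcontainers";

const redis = await new GenericContainer("redis:7-alpine")
  .withExposedPorts(6379)
  .start();

// The test asks the started container where the dependency actually lives,
// instead of relying on a hard-coded host and port.
const host = redis.getHost();
const port = redis.getMappedPort(6379);

// ...point your client at redis://host:port and run assertions...

await redis.stop();
```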

Testcontainers was recently named in the ThoughtWorks Technology Radar and put into the Adopt category, which effectively means you should have a strong reason, you should really know what you're doing, if you don't want to use Testcontainers. Testcontainers allows you to create a reliable environment through the programmatic creation of lightweight containers for your dependencies. It makes your tests more reliable, and it tries to nudge you into doing the right things with your integration tests. That's why more and more projects are using Testcontainers in various setups and environments.

Testcontainers uses Docker as the environment where it spins up the containers your application needs. This is great because Docker is almost universally available, it runs on all popular operating systems, and developers understand how Docker works and how to use it from the outside. So it's a great runtime to leverage for running your application's dependencies. However, the stock Docker user experience is sometimes not flexible enough for integration tests.

Docker is great because almost all the software in the world can be run in it. There are registries from which you can pull all the technologies your software requires. It provides process isolation. It provides the ability to configure both the container and the application within the container. It gives you CPU and memory limits. All those good things, but it is a little inflexible for tests specifically, because during tests we want to put our application into the specific scenarios where something might go wrong. What happens when the application works with a database and the data schema is incorrect? What happens if my application hits a long latency on the way to Kafka? Or what happens when my Redis key numbers are close to the integer range limit and about to overflow? All these different scenarios break the setup in some way. That is the nature of tests: they put your application under stress and check whether it behaves correctly. With Docker alone, once you break the environment, it's very hard to recreate it. And this is where Testcontainers comes in.

Testcontainers gives you programmatic access to create, manage the lifecycle of, and clean up the containers you want to run. It gives you an API to configure the container, for example which ports you want to expose from the container if you're working with it over the network.

3. Introduction to Test Containers Continued

Short description:

Testcontainers is a popular approach to integration testing that provides a flexible setup. It integrates with various frameworks and test libraries, including Jest. Testcontainers takes care of container cleanup, ensuring a repeatable environment. The library comes with modules for running popular technologies in containers, allowing developers to focus on business logic. It is not limited to Node.js and can be used in any programming language ecosystem.

Or which files you want to copy, or whether you want to programmatically follow the container's logs, and so on. You can configure everything from your application tests, from your IDE, and you can do this any number of times. Tests bring their own environment into play. It also integrates with various frameworks and test libraries.
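
A hedged sketch of those configuration hooks in testcontainers-node (method names follow recent releases and may differ in the version shown in the talk; the seed script path is hypothetical):

```ts
import { GenericContainer, Wait } from "testcontainers";

const mongo = await new GenericContainer("mongo:6")
  .withExposedPorts(27017)
  // Copy an init script into the image's init directory before startup
  // (method name from recent testcontainers-node releases; older versions differ).
  .withCopyFilesToContainer([
    { source: "./test/seed.js", target: "/docker-entrypoint-initdb.d/seed.js" },
  ])
  // Wait until the database reports readiness, not merely until the container exists.
  .withWaitStrategy(Wait.forLogMessage(/Waiting for connections/))
  .start();

// Follow the container logs programmatically from the test code.
(await mongo.logs()).on("data", (line) => process.stdout.write(line));
```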

For example, there is a Jest module for Testcontainers which simplifies working with Jest and Testcontainers, where you can declaratively specify which containers you want. testcontainers-node, like the other Testcontainers implementations, is an open source project. Christian, the current main maintainer, is a true hero of the testcontainers-node implementation. There is an npm package, which is how you get Testcontainers into your application. Under the hood it uses dockerode to talk to the Docker environment, so your Docker environment doesn't need to be any particular Docker implementation. It of course runs with Docker Desktop, but it can also run with any other compatible Docker implementation. For example, if you're running Minikube, the lightweight Kubernetes cluster which exposes the Docker API, you can use that to run your Testcontainers-based tests. Or if you're using a remote Docker host, Testcontainers can talk to that. And internally at Atomic Jar, we're building a cloud solution where you can get an on-demand VM and run your Testcontainers tests against that. So it's a very flexible setup, and it works really well.
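
For the declarative Jest integration, one community option is the @trendyol/jest-testcontainers preset. The configuration shape below follows its documentation and is an assumption, so check the package README for the exact keys:

```ts
// jest-testcontainers-config.js (plain CommonJS; keys and shape are an assumption)
// Declares which containers the preset should start before the test suite runs.
module.exports = {
  redis: {
    image: "redis",
    tag: "7-alpine",
    ports: [6379],
  },
};

// jest.config.js
// The preset starts the declared containers before the suite and exposes their
// host and mapped ports to the tests via globals.
module.exports = {
  preset: "@trendyol/jest-testcontainers",
};
```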

One thing that is very important here is that Testcontainers takes care of cleaning up the containers. We know that for reliable integration tests you need a repeatable environment, and for that you always want to clean up after the run. That means if your tests pass, we clean up the containers and remove them. If your tests fail, we clean up the containers and remove them. If you run against a remote Docker environment and your machine crashes, or the Internet connection blows up, we still clean up the containers on the remote Docker host. That means you will never be in a situation where your test connects to a Kafka instance you started two weeks ago that is lingering for some reason on your beefy CI machine, because the issues that arise from that are really hard to reproduce and incredibly hard to debug and fix. The Testcontainers libraries try to nudge you in the right direction: to enable parallelization of tests nicely, to use the correct API, and to do the cleanup at all times. In general, it's a very popular approach.

Besides being a good library by itself, Testcontainers comes with an ecosystem of modules in which popular technologies have small libraries that specify and encode how to run that particular technology from your code. So you don't have to figure out how to run Cassandra in a Docker container or Kafka in a Docker container; you can just use the API and say, give me a Kafka container, give me a MongoDB container, and you will get an instance of it immediately. That is great because it allows you to concentrate on the actual business logic of your tests without spending time figuring out the infrastructure, because that is managed by Testcontainers. And it's not just a Node project: good integration tests are required in the ecosystem of any programming language. So you can have a similar approach in your Java applications, in your .NET applications, in your Go applications; there's Testcontainers for Python, there's Testcontainers for Rust. It's a very popular engineering approach. Now I would like to show you a little bit how it feels to write these tests and which building blocks of the API you need to know to be productive with Testcontainers.
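
As a sketch of what a module gives you, here is PostgreSQL as an example. Current testcontainers-node releases ship modules as scoped packages such as @testcontainers/postgresql; the packaging at the time of the talk differed, so treat the import path as an assumption:

```ts
import { PostgreSqlContainer } from "@testcontainers/postgresql";
import { Client } from "pg";

// The module encodes how to run the technology; the test only asks for a container.
const postgres = await new PostgreSqlContainer("postgres:15-alpine").start();

// Connection details come straight from the started container.
const client = new Client({ connectionString: postgres.getConnectionUri() });
await client.connect();
await client.query("SELECT 1");

await client.end();
await postgres.stop();
```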

4. Exploring IDE and Basic Setup

Short description:

Let's look into the IDE. We declare the dependency as normal to get the npm package. The basic building block is GenericContainer, which represents a container managed by Testcontainers. We configure the container by specifying the Docker image name, exposing ports, and copying files. The container can run anywhere, and we use the GenericContainer instance to provide the information our application needs to connect. After the tests, Testcontainers cleans up the container. The test itself puts values into Redis and retrieves them.

Let's look into the IDE. I have a very simple project here. It doesn't even have an actual application; I just want to give you a taste of the API and talk through what's important from the Testcontainers point of view.

All right. We can declare the dependency as normal to get the npm package. Here in our test file we can require testcontainers, and the basic building block is GenericContainer. GenericContainer is an abstraction that represents a container we can manage via Testcontainers. We need to give it the Docker image name, which in the current case is Redis, since we're going to run a Redis container, and then we do the configuration. We can expose ports. We can specify which files to copy into the container; for example, this is a good way to initialize your database schema by sending it into the container. And then, of course, we have the methods to use and configure the lifecycle of the containers, of the infrastructure we need for the test.

Here in this Jest test, we specify that we want to create the container before all the tests, so it will be shared between all of them. Currently we don't have many, but we create the container, and here is the interesting bit: the container can run anywhere. It can run on a remote Docker host, on your local host, or in a VM somewhere. To make sure our application knows how to talk to the technology, the database, or in this case Redis in that container, we don't hard-code any configuration; we use the GenericContainer instance to tell us where it's running so we can configure our application properly. So here we tell the Redis client: you're going to talk to Redis at that host, which we get from the container, and the exposed port 6379 will be mapped to a high random port on the host side, so we call getMappedPort for that value, and then our Redis client is ready to connect. After all the tests we clean up everything, but we don't have to, at least not the container, because once our tests have finished, Testcontainers will clean it up by itself. It is still good practice though: if you spin up thousands of containers during a test suite, you may want to take care of the lifecycle yourself and stop them so they don't all use resources at the same time. The test itself is super simple: we just put values into Redis and get them back. And if I run it, you can see that the containers are there.
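
Pieced together, the walkthrough above looks roughly like this sketch, assuming Jest plus the `testcontainers` and `redis` npm packages; the image tag is illustrative:

```ts
import { GenericContainer, StartedTestContainer } from "testcontainers";
import { createClient } from "redis";

let container: StartedTestContainer;
let client: ReturnType<typeof createClient>;

// Create the container once, shared by all tests in this file.
// The generous timeout allows for the image pull on a cold cache.
beforeAll(async () => {
  container = await new GenericContainer("redis:7-alpine")
    .withExposedPorts(6379)
    .start();

  // No hard-coded configuration: ask the started container where it lives.
  client = createClient({
    url: `redis://${container.getHost()}:${container.getMappedPort(6379)}`,
  });
  await client.connect();
}, 60_000);

afterAll(async () => {
  await client.quit();
  // Testcontainers would clean this up anyway, but stopping early frees resources.
  await container.stop();
});

test("puts and gets a value in Redis", async () => {
  await client.set("greeting", "hello");
  expect(await client.get("greeting")).toBe("hello");
});
```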

5. Running Tests and Importing Modules

Short description:

When running tests, the containers spin up and run in Testcontainers Cloud. An example of an Express app using MongoDB as its storage is shown. The tests use SuperTest for high-level functional testing, expecting requests to succeed and data to come back. Instead of building the containers ourselves, we import modules for common technologies like MySQL, MongoDB, Kafka, Elasticsearch, and Postgres. Contributions of modules for other technologies are welcome.

If I do docker stats, there's currently only the Ryuk container, which Testcontainers uses internally for cleanup purposes. It's a very simple setup here. I will run the tests again, and you can see the containers spinning up for a second and then running as well.

I don't run Docker locally here; I use Testcontainers Cloud, which is connected to a cluster nearby, and that's where my Docker containers are running. If you want a larger example, I have a different one. This is an actual Express app that uses MongoDB as its storage. The test is very simple; let's look at it. We just use SuperTest to send requests, actual high-level functional tests for our application, and then we expect those requests to succeed and the data to come back. So this is a very high-level functional test.
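
The SuperTest part of such a test might look like this sketch; `app` and the /todos route are hypothetical stand-ins for the Express app from the demo:

```ts
import request from "supertest";
import { app } from "../src/app"; // hypothetical export of the Express app under test

test("creates and reads data through the real HTTP API", async () => {
  // Send real requests to the application, which talks to the real MongoDB container.
  await request(app)
    .post("/todos")
    .send({ title: "write integration tests" })
    .expect(201);

  const response = await request(app).get("/todos").expect(200);
  expect(response.body).toHaveLength(1);
});
```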

The interesting part here is that we don't build the containers ourselves. We don't say how to configure MongoDB; we just import the module. If you look at the repository, you can see there are modules for fairly common technologies: MySQL, Mongo, Kafka, Elasticsearch, Postgres. Each one is not such a large implementation, so if you're interested in another technology you want to run, maybe after figuring it out for your environment, consider contributing a module back. That would be great. So here we use the Mongo module: we create the container, run it, and wait for it.
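
A sketch of pulling in the Mongo module rather than configuring GenericContainer by hand; the scoped package name and getConnectionString method match current testcontainers-node releases and are an assumption for the version used in the talk:

```ts
import { MongoDBContainer, StartedMongoDBContainer } from "@testcontainers/mongodb";

let mongo: StartedMongoDBContainer;

beforeAll(async () => {
  // The module knows how to run and wait for MongoDB; we only pick the image tag.
  mongo = await new MongoDBContainer("mongo:6").start();
  // Hand the connection string to the app, e.g. through an environment variable
  // (the variable name is hypothetical).
  process.env.MONGO_URL = mongo.getConnectionString();
}, 120_000);

afterAll(async () => {
  await mongo.stop();
});
```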

6. Using Test Containers for Complex Operations

Short description:

Testcontainers provides a single way to handle long-running, complex operations, such as starting containers and waiting for them to be ready. It's a flexible and easy-to-integrate approach that allows you to run tests with real dependencies. You can build complex topologies, wait for the database to start, create images on the fly, and perform various operations with Docker. Check out the source on GitHub and join the Slack community for further discussions.

That's testcontainers-node; it isn't magic, just an npm package. It gives us a single way to handle all the long-running, complex operations, like starting containers and waiting for the technology inside the container to start. Then we configure our MongoDB client to use the connection string from our MongoDB container. You don't even need to know exactly what the connection string looks like or what its format is; you can just run it. And this is very, very nice.
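
For the client side of that, here is a sketch with the official MongoDB driver. The directConnection flag is an assumption that applies when the module runs Mongo as a single-node replica set:

```ts
import { MongoDBContainer } from "@testcontainers/mongodb";
import { MongoClient } from "mongodb";

const mongo = await new MongoDBContainer("mongo:6").start();

// No need to know the connection string format; the started container provides it.
// directConnection is an assumption for a single-node replica set setup.
const client = new MongoClient(mongo.getConnectionString(), { directConnection: true });
await client.connect();
await client.db("demo").collection("todos").insertOne({ title: "hello" });

await client.close();
await mongo.stop();
```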

After that we of course disconnect, and then we can run the tests, and here we run more tests. If I run npm test here, you can see in the background that if the Docker image is not in the cache of my Docker daemon, it will be automatically pulled from Docker Hub. Private registries are supported as well, with authentication; anything that you can pull with Docker is supported. Then you can just run it, and after everything is said and done, the containers are cleaned up and removed together with volumes and all of that. So it's a very flexible approach and very easy to integrate into your applications. You can also package your application in a Docker container, but I prefer running it normally on my machine from my IDE, because then I can set breakpoints and run it at will.

Just one more slide. You can do complex things: you can build complex topologies with networks, wait for the database in the container to start, create images on the fly, pull the logs, copy files back and forth, and run commands. Anything that works with Docker works with Testcontainers, and I think this is a really great approach. You are welcome to check out the source on GitHub, and if you have any questions or want to talk to the community, please join the Slack at slack.testcontainers.org to talk to like-minded individuals. That's it. Thank you very much for watching, and if you have any questions, I'd be happy to answer them.
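
A closing sketch of that flexibility; method names follow recent testcontainers-node releases, and the Dockerfile path plus the app's environment variable are hypothetical:

```ts
import { GenericContainer, Network, Wait } from "testcontainers";

// Build an image on the fly from a local Dockerfile.
const appImage = await GenericContainer.fromDockerfile("./").build();

// Put the containers on a shared network so they reach each other by alias.
const network = await new Network().start();

const db = await new GenericContainer("postgres:15-alpine")
  .withEnvironment({ POSTGRES_PASSWORD: "test" })
  .withNetwork(network)
  .withNetworkAliases("db")
  // Postgres logs this line twice (once during init), hence the second argument.
  .withWaitStrategy(Wait.forLogMessage("ready to accept connections", 2))
  .start();

const app = await appImage
  .withEnvironment({ DATABASE_HOST: "db" }) // hypothetical variable read by the app
  .withNetwork(network)
  .withExposedPorts(3000)
  .start();

// Run a command inside a running container and inspect the result.
const { exitCode } = await db.exec(["pg_isready"]);
if (exitCode !== 0) throw new Error("database is not ready");

// Follow logs, copy files back and forth, then tear everything down.
await app.stop();
await db.stop();
await network.stop();
```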

Check out more articles and videos

We constantly think of articles and videos that might spark your interest, skill you up, or help you build a stellar career.

Remix Conf Europe 2022
23 min
Scaling Up with Remix and Micro Frontends
Top Content
Do you have a large product built by many teams? Are you struggling to release often? Did your frontend turn into a massive unmaintainable monolith? If, like me, you’ve answered yes to any of those questions, this talk is for you! I’ll show you exactly how you can build a micro frontend architecture with Remix to solve those challenges.
Remix Conf Europe 2022
37 min
Full Stack Components
Top Content
Remix is a web framework that gives you the simple mental model of a Multi-Page App (MPA) but the power and capabilities of a Single-Page App (SPA). One of the big challenges of SPAs is network management resulting in a great deal of indirection and buggy code. This is especially noticeable in application state which Remix completely eliminates, but it's also an issue in individual components that communicate with a single-purpose backend endpoint (like a combobox search for example).
In this talk, Kent will demonstrate how Remix enables you to build complex UI components that are connected to a backend in the simplest and most powerful way you've ever seen. Leaving you time to chill with your family or whatever else you do for fun.
JSNation Live 2021
29 min
Making JavaScript on WebAssembly Fast
Top Content
JavaScript in the browser runs many times faster than it did two decades ago. And that happened because the browser vendors spent that time working on intensive performance optimizations in their JavaScript engines. Because of this optimization work, JavaScript is now running in many places besides the browser. But there are still some environments where the JS engines can’t apply those optimizations in the right way to make things fast. We’re working to solve this, beginning a whole new wave of JavaScript optimization work. We’re improving JavaScript performance for entirely different environments, where different rules apply. And this is possible because of WebAssembly. In this talk, I'll explain how this all works and what's coming next.
React Summit 2023
24 min
Debugging JS
As developers, we spend much of our time debugging apps - often code we didn't even write. Sadly, few developers have ever been taught how to approach debugging - it's something most of us learn through painful experience.  The good news is you _can_ learn how to debug effectively, and there's several key techniques and tools you can use for debugging JS and React apps.

Workshops on related topic

React Day Berlin 2022
86 min
Using CodeMirror to Build a JavaScript Editor with Linting and AutoComplete
Top Content
WorkshopFree
Using a library might seem easy at first glance, but how do you choose the right library? How do you upgrade an existing one? And how do you wade through the documentation to find what you want?
In this workshop, we’ll discuss all these finer points while going through a general example of building a code editor using CodeMirror in React. All while sharing some of the nuances our team learned about using this library and some problems we encountered.
TestJS Summit - January, 2021
173 min
Testing Web Applications Using Cypress
WorkshopFree
This workshop will teach you the basics of writing useful end-to-end tests using Cypress Test Runner.
We will cover writing tests, covering every application feature, structuring tests, intercepting network requests, and setting up the backend data.
Anyone who knows JavaScript programming language and has NPM installed would be able to follow along.
Node Congress 2023
63 min
0 to Auth in an Hour Using NodeJS SDK
WorkshopFree
Passwordless authentication may seem complex, but it is simple to add it to any app using the right tool.
We will enhance a full-stack JS application (Node.JS backend + React frontend) to authenticate users with OAuth (social login) and One Time Passwords (email), including:
- User authentication: managing user interactions, returning session / refresh JWTs
- Session management and validation: storing the session for subsequent client requests, validating / refreshing sessions
At the end of the workshop, we will also touch on another approach to code authentication using frontend Descope Flows (drag-and-drop workflows), while keeping only session validation in the backend. With this, we will also show how easy it is to enable biometrics and other passwordless authentication methods.
Table of contents:
- A quick intro to core authentication concepts
- Coding
- Why passwordless matters
Prerequisites:
- IDE of your choice
- Node 18 or higher
React Summit US 2023
96 min
Build a powerful DataGrid in few hours with Ag Grid
WorkshopFree
Does your React app need to efficiently display lots (and lots) of data in a grid? Do your users want to be able to search, sort, filter, and edit data? AG Grid is the best JavaScript grid in the world and is packed with features, highly performant, and extensible. In this workshop, you’ll learn how to get started with AG Grid, how we can enable sorting and filtering of data in the grid, cell rendering, and more. You will walk away from this free 3-hour workshop equipped with the knowledge for implementing AG Grid into your React application.
We all know that rolling our own grid solution is not easy, and let's be honest, is not something that we should be working on. We are focused on building a product and driving forward innovation. In this workshop, you'll see just how easy it is to get started with AG Grid.
Prerequisites: Basic React and JavaScript
Workshop level: Beginner
Node Congress 2023
49 min
JavaScript-based full-text search with Orama everywhere
Workshop
In this workshop, we will see how to adopt Orama, a powerful full-text search engine written entirely in JavaScript, to make search available wherever JavaScript runs. We will learn when, how, and why deploying it on a serverless function could be a great idea, and when it would be better to keep it directly on the browser. Forget APIs, complex configurations, etc: Orama will make it easy to integrate search on projects of any scale.
TestJS Summit 2021
85 min
Automated accessibility testing with jest-axe and Lighthouse CI
Workshop
Do your automated tests include a11y checks? This workshop will cover how to get started with jest-axe to detect code-based accessibility violations, and Lighthouse CI to validate the accessibility of fully rendered pages. No amount of automated tests can replace manual accessibility testing, but these checks will make sure that your manual testers aren't doing more work than they need to.