1. Introduction to Integration Testing
Testing lies on the critical path from development to production, and automated tests are what make releasing software possible. Integration tests have grown in popularity as applications increasingly rely on interactions with third-party systems, and they provide a reliable test suite that catches real-world issues.
Hi. You're watching TestJS Summit, and this is Delightful Integration Tests with Testcontainers. Testing is very important, more projects should test their applications better, and I hope that after this quick session you'll know how to write integration tests that you actually like, using the Testcontainers libraries.
My name is Alex Shoive and I work as a Developer Relations person at AtomicJar, a startup created originally by the Testcontainers Java maintainers; now we have more people from different language ecosystems helping us work on Testcontainers. If you have any questions, you can find me online, and I'd be happy to chat about anything: Testcontainers, testing, or just software engineering in general. I think that would be very, very cool. So drop me a line.
Testing is super, super important because it lies on the critical path from development to production. If we don't have a good automated test suite, we cannot release things well. We need automated tests because whenever we have something we potentially want to release, we want to go through our pipeline without bottlenecking on any manual process. This is helpful during the normal development loop, but it's also super helpful when there are security issues, or supply chain security issues, where you update third-party packages and then need to release, because those updates could be vulnerability fixes. If you don't have a good test suite that you trust, that release is a manual process, and until it's done you are as good as exposed. But if you do, you can run your automated tests and release immediately, because you have confidence in your tests. This is very, very important. And lately, the way we see which types of tests we want to run has shifted.
In the past, we had the testing pyramid: we ran a ton of unit tests, they covered all possible scenarios, we had very good test coverage, and then we still missed issues. So recently, independent teams have been describing how they're rethinking the testing pyramid and putting more and more emphasis on integration tests. And it makes a lot of sense. Our applications have become smaller. We are mostly writing, and we are talking about backend applications here, microservices that talk to other APIs or to various technologies like databases, message brokers, or cloud services, and the application behavior is very much encoded in the interactions with those third-party systems, rather than in the business logic within the particular application and how it transforms the data. So it does make sense to have fewer implementation-detail tests, and to use integration tests, which run your application with its immediate environment, with all the components your application needs to run properly, as it would run in production, but in your testing setup. That can be the bulk of our test suite, the tests we trust and rely on. We can still have end-to-end tests that run in an environment similar to production, where all the systems are spun up at the same time and we check the actual workflows against production data or similar data, but in a much larger environment. For a test suite that you run everywhere, on your machine, on your colleague's machine, in your CI, integration tests hit the sweet spot between the simplicity of the setup and the number of real-world issues they can catch. That's why they are getting more and more popular.
2. Introduction to Testcontainers
Testcontainers is a library that integrates with Docker to create ephemeral environments for running third-party service dependencies. It allows you to test your application against real dependencies, making your tests more reliable. Testcontainers uses Docker as the environment to spin up containers; however, the stock Docker workflow is sometimes too inflexible for integration tests. This is where Testcontainers comes in, providing programmatic access to create and manage containers for testing.
You can run your dependencies in Docker containers, and your tests have full control over both their lifecycle and their configuration. So you can test your application with real dependencies and know that it works as expected.
Testcontainers has recently been named in the ThoughtWorks Technology Radar. It was put into the Adopt ring, which effectively means you should have a strong reason, you should really know what you're doing, if you don't want to use Testcontainers. Testcontainers allows you to create a reliable environment through the programmatic creation of lightweight containers for your dependencies. It makes your tests more reliable, and it tries to nudge you into doing the right things with your integration tests. That's why more and more projects are using Testcontainers, in various setups and environments.
Testcontainers uses Docker as the environment where it spins up the containers your application needs. This is great because Docker is almost universally available, it runs on all popular operating systems, and developers understand how Docker works, or at least how to use it from the outside. So it's a great option for a runtime to run those dependencies for your application. However, the stock Docker user experience is sometimes not flexible enough for integration tests.
Docker is great because almost all the software in the world can be run in Docker. There are registries from which you can pull all the technologies your heart desires. It provides process isolation, the ability to configure both the container and the application within the container, CPU and memory limits, all those good things. But it is a little bit inflexible for tests specifically, because during tests we want to put our application into specific scenarios where something might go wrong. What happens when the application works with a database and the data schema is incorrect? What happens if my application sees high latency before it reaches Kafka? What happens when my Redis key counts get close to the integer range and are about to overflow? All these different scenarios break the setup in some way, and that is the notion of tests, that is what tests should do: they put your application under stress and then check whether it behaves correctly. But with plain Docker, once you break the environment, it's very, very hard to recreate it. This is where Testcontainers comes in.
Testcontainers gives you programmatic access to create, manage the lifecycle of, and clean up the containers you want to run. It gives you an API to configure the container itself, like which ports you want to expose if you're working with it over the network.
3. Introduction to Testcontainers (Continued)
Testcontainers is a popular approach to integration testing that provides a flexible setup. It integrates with various frameworks and test libraries, including Jest. Testcontainers takes care of container cleanup, ensuring a repeatable environment. The library comes with modules for running popular technologies in containers, allowing developers to focus on business logic. And it is not limited to Node.js: the same approach exists in many programming language ecosystems.
The API also lets you configure which files you want to copy into the container, or programmatically follow the logs of the container, and so on. You can configure everything from your application tests, from your IDE, and you can do this any number of times: tests bring their own environment into play. Testcontainers also integrates with various frameworks and test libraries.
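As a sketch of what that configuration API looks like in testcontainers-node (the image, environment variables, and file paths here are illustrative assumptions, not taken from the talk, and running it requires a Docker-compatible environment):

```typescript
import { GenericContainer } from "testcontainers";

async function startPostgres() {
  // Start a throwaway Postgres container; image tag and password are assumptions.
  const container = await new GenericContainer("postgres:15-alpine")
    .withEnvironment({ POSTGRES_PASSWORD: "test" })
    .withExposedPorts(5432)
    // Copy a schema file into the image's init directory (hypothetical paths).
    .withCopyFilesToContainer([
      { source: "./schema.sql", target: "/docker-entrypoint-initdb.d/schema.sql" },
    ])
    .start();

  // Programmatically follow the container logs.
  (await container.logs()).on("data", (line) => process.stdout.write(line));

  return container;
}
```

Every option is just a chained method call, so a broken environment is recreated simply by running the test again.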
For example, there is a Jest Testcontainers module, which simplifies using Jest together with Testcontainers and lets you declaratively specify which containers you want. testcontainers-node, like the other Testcontainers implementations, is an open source project; Christian, the current main maintainer, is a true hero of the testcontainers-node implementation. There is an npm package, which is how you get Testcontainers into your application, and it uses dockerode to talk to the Docker environment, so your Docker environment doesn't need to be any particular Docker implementation. It of course runs with Docker Desktop, but it can also run with any other compatible Docker implementation. For example, if you're running Minikube, the lightweight Kubernetes cluster which exposes a Docker API, you can use that to run your Testcontainers-based tests. Or if you're using a remote Docker host, Testcontainers can talk to that. And internally at AtomicJar, we're building a cloud solution where you can get an on-demand VM and run your Testcontainers tests against that. So it's a very, very flexible setup, and it works really well.
One thing that is very important here is that Testcontainers takes care of cleaning up the containers. We know that for reliable integration tests you need a repeatable environment, and for that you always want to clean up after the run. That means if your tests pass, we clean up the containers and remove them. If your tests fail, we clean up the containers and remove them. If you run against a remote Docker environment and your machine crashes, like the Internet blows up, we will still clean up the containers on the remote Docker host. That means you will never be in a situation where your test connects to a Kafka instance that you started two weeks ago and that is lingering for some reason on your beefy CI machine, because the issues that arise from that are really, really hard to reproduce and incredibly hard to debug and fix. So the Testcontainers libraries try to nudge you in the right direction: to enable parallelization of tests, to use the correct APIs, and to do the cleanup at all times. In general, it's a very, very popular approach.
Besides being a good library by itself, Testcontainers comes with an ecosystem of modules, where popular technologies have little libraries that encode how to run that particular technology from your code. You don't have to figure out how to run Cassandra or Kafka in a Docker container; you just use the API and say, give me a Kafka container, give me a MongoDB container, and you immediately get an instance of it. That's great because it allows you to concentrate on the actual business logic of your tests without spending time figuring out the infrastructure, because that is managed by Testcontainers. And it's not just a Node project: good integration tests are required in every programming language ecosystem. So you can have the same approach in your Java applications, your .NET applications, your Go applications; there's Testcontainers for Python, there's Testcontainers for Rust. It's a very, very popular engineering approach. Now I would like to show you a little bit how it feels to write these tests, and what the building blocks of the API are that you need to know to be productive with Testcontainers.
4. Exploring IDE and Basic Setup
Let's look into the IDE. We declare the dependency as normal to get the npm package. The basic building block is GenericContainer, which represents a container managed by Testcontainers. We configure the container by specifying the Docker image name, exposing ports, and copying files. The container can be running anywhere, and we use the GenericContainer instance to provide the information our application needs to connect. After the tests, Testcontainers cleans up the container. The test itself puts values into Redis and retrieves them.
Let's look into the IDE. Right. So I have a very, very simple project here. It doesn't even have an actual application; I just want to give you a taste of the API and talk through what's important from the Testcontainers point of view.
All right. So we can just declare the dependency as normal to get the npm package. And here in our test file, we can require testcontainers, and the basic building block is GenericContainer. GenericContainer is an abstraction that represents a container we can manage via Testcontainers. We need to give it the Docker image name, which in this case is Redis, since we're going to run a Redis container, and then we do the configuration. We can expose ports. We can configure which files to copy into the container; for example, this is a good way to instantiate your database schema by sending it into the container. And then, of course, we have methods to use and configure the lifecycle of our containers, of the infrastructure we need for the test.
Here in this Jest test, we specified that we want to create the container before all the tests, so this container will be shared between all the tests. Currently we don't have many, but we create the container, and then comes the interesting bit: the container can be running anywhere. It can be on a remote Docker host, or on your local host, or in a VM somewhere. To make sure our application knows how to talk to the technology in that container, to the database, or in this case Redis, we don't hard-code any configuration; we use the GenericContainer instance to tell us where it's running, so we can configure our application properly. Here we just say: the Redis client is going to talk to Redis at the host, which we get from the container, and at the exposed port 6379, which will be mapped to a random high port on the host side and which we get with getMappedPort. Then our Redis client is ready to connect to Redis. In afterAll, we clean up everything, but we don't strictly have to clean up the container, because when our tests finish, Testcontainers will clean it up by itself. It's still a good practice, though: if you want to spin up thousands of containers during a test suite, maybe you should take care of the lifecycle yourself and stop them, so they don't use all the resources at the same time. The test itself is super simple: we just put values into Redis and get the values from Redis. And you can see that if I run my tests, the containers are there.
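A minimal sketch of the test just described, assuming Jest and the node-redis client (the image tag and key names are illustrative, and a Docker-compatible environment is required):

```typescript
import { GenericContainer, StartedTestContainer } from "testcontainers";
import { createClient } from "redis";

let container: StartedTestContainer;
let client: ReturnType<typeof createClient>;

beforeAll(async () => {
  // Start Redis before all tests; the container is shared across the suite.
  container = await new GenericContainer("redis:7-alpine")
    .withExposedPorts(6379)
    .start();

  // No hard-coded configuration: ask the container where it is reachable.
  client = createClient({
    url: `redis://${container.getHost()}:${container.getMappedPort(6379)}`,
  });
  await client.connect();
});

afterAll(async () => {
  await client.quit();
  // Optional: Testcontainers would clean up the container by itself anyway.
  await container.stop();
});

it("stores and retrieves a value", async () => {
  await client.set("key", "value");
  expect(await client.get("key")).toBe("value");
});
```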
5. Running Tests and Importing Modules
When running the tests, the containers spin up and run in Testcontainers Cloud. An example of an Express app using MongoDB as the storage is provided. The tests use SuperTest for high-level functional testing, expecting requests to succeed and data to come back. Instead of building the containers ourselves, we import modules for common technologies like MySQL, MongoDB, Kafka, Elasticsearch, and Postgres. Contributions of modules for other technologies are welcome.
So if I do docker stats, there's currently only the Ryuk container, which is the container Testcontainers uses internally for cleanup purposes. It's a very simple setup here. So I will run the tests again, and you can see that the containers will be spinning up for a second and then running as well.
I don't run my Docker locally here; I use Testcontainers Cloud, which is connected to a cluster nearby, and that's where my Docker containers are running. And if you want a larger example, I have a different one. This is an actual Express app that uses MongoDB as the storage. The test is very simple; let's look at it. We just use SuperTest to send a request, an actual high-level functional test for our application, and then we expect those requests to succeed and we expect the data back. So this is a very high-level functional test.
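A hedged sketch of what such a SuperTest-based functional test might look like, assuming the Express app is exported from an `app` module and exposes a hypothetical `/items` endpoint (neither name is from the talk):

```typescript
import request from "supertest";
import { app } from "./app"; // hypothetical Express app under test

it("creates and lists items", async () => {
  // High-level functional test: exercise the HTTP API end to end.
  await request(app)
    .post("/items")
    .send({ name: "test-item" })
    .expect(201);

  const res = await request(app).get("/items").expect(200);
  expect(res.body).toEqual(
    expect.arrayContaining([expect.objectContaining({ name: "test-item" })])
  );
});
```

The test knows nothing about MongoDB; the database behind the app is whatever the Testcontainers setup started.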
And the interesting part here is that we don't build the containers ourselves. We don't say how to configure MongoDB; we just import the module. If you look at the repository, you can see there are modules for fairly common technologies: MySQL, Mongo, Kafka, Elasticsearch, Postgres. And a module is not such a large implementation, so if there is another technology you want to run, after figuring it out for your environment, consider contributing a module back; that would be greatly appreciated. So here we use Mongo, and you can see how it works: we want a MongoDB container, we run it, and we wait for it to start.
6. Using Test Containers for Complex Operations
Testcontainers supports long-running, complex operations, such as starting containers and waiting for them to be ready, through promise-based APIs. It's a flexible and easy-to-integrate approach that allows you to run tests with real dependencies. You can build complex topologies, wait for the database to start, create images on the fly, and perform various operations with Docker. Check out the source on GitHub and join the Slack community for further discussions.
testcontainers-node is no magic, it's a normal Node.js package, and there is async/await support, of course, for all long-running, complex operations like starting containers and waiting for the technology in the container to start. Then we configure our MongoDB client to use the connection string from our module; you don't even need to know what exactly the connection string is, how it looks, or what its format is. You can just run it. And this is very, very great.
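A sketch of the module-based setup described here, assuming the `@testcontainers/mongodb` module and the official `mongodb` driver (the image tag is an assumption, and a Docker-compatible environment is required):

```typescript
import { MongoDBContainer, StartedMongoDBContainer } from "@testcontainers/mongodb";
import { MongoClient } from "mongodb";

let container: StartedMongoDBContainer;
let mongoClient: MongoClient;

beforeAll(async () => {
  // The module encodes how to start and wait for MongoDB; no manual config.
  container = await new MongoDBContainer("mongo:6").start();

  // The module also knows the connection string format, so we don't have to.
  // directConnection is needed because the module runs a single-node replica set.
  mongoClient = new MongoClient(container.getConnectionString(), {
    directConnection: true,
  });
  await mongoClient.connect();
});

afterAll(async () => {
  await mongoClient.close();
  await container.stop();
});
```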
So after that, we of course disconnect, and then we can run the tests. Here we run more tests. If I run npm test here, you can see in the background that if the Docker image is not in the cache of my Docker daemon, the image will be automatically pulled from Docker Hub. We of course support private registries as well, and authentication; anything you can pull with Docker is supported. And after everything is said and done, the containers are cleaned up and removed, together with volumes and all of that. So it's a very, very flexible approach, and it's very easy to integrate in your applications. You can also package your application in a Docker container, but I would prefer running it normally on my machine, from my IDE, because then I can set breakpoints and run it at will.
So, just one more slide: you can do complex things. You can build complex topologies with networks. You can wait for the database in the container to start. You can create images on the fly. You can pull the logs, copy files back and forth, and run commands. Anything that works with Docker works with Testcontainers, and I think this is a really, really great approach. You are welcome to check out the source on GitHub, and if you have any questions, or if you want to talk to the community, please join the Slack at slack.testcontainers.org to talk to like-minded individuals. This is it. Thank you very much for watching, and if you have any questions, I'd be happy to answer them.