Spring Boot Integration Tests With AWS Services Using LocalStack

Last Updated: June 16, 2022 | Published: April 21, 2020

If your Spring Boot application integrates AWS services like S3, SNS, or SQS, you might wonder how to write effective integration tests. Should I mock the entire interaction with AWS? Should I duplicate the AWS infrastructure for testing? Can I somehow start a local AWS clone for testing purposes? With this blog post, you'll learn how to write integration tests for your Spring Boot application integrating AWS services with the help of Testcontainers and LocalStack. The demo application integrates Amazon SQS and S3 to process order events, for which we'll write an integration test.

UPDATE: This blog post was recently updated to use Java 17, Spring Boot 2.5.5, LocalStack 0.13.0, and Spring Cloud AWS 2.3.2 (at its new home awspring). There's still a section explaining the integration test setup for Spring Cloud AWS < 2.3.0.

Spring Boot Maven Project Setup

The sample application is a basic Spring Boot Web project using Java 17 and Spring Cloud AWS.

We're using the spring-cloud-starter-aws-messaging to conveniently connect to an SQS queue and register an annotation-driven message listener. Furthermore, this messaging starter has a transitive dependency on spring-cloud-starter-aws that provides the core support for AWS:
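
A minimal sketch of the corresponding pom.xml entry, assuming the awspring coordinates described in the note below (the version is managed by the Spring Cloud AWS BOM further down):

```xml
<dependency>
  <groupId>io.awspring.cloud</groupId>
  <artifactId>spring-cloud-starter-aws-messaging</artifactId>
  <!-- version managed by the spring-cloud-aws-dependencies BOM -->
</dependency>
```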

On the 17th of April 2020, the Spring Team announced that the Spring Cloud AWS project is no longer part of the Spring Cloud release train. Since then, the project has found a new home at awspring. One of the core maintainers, Maciej Walkowiak, wrote about this transition and its implications in one of his blog posts.

For our integration test setup, we need the following dependencies:
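
A sketch of these test-scoped dependencies; the versions are left out, assuming they are managed by the Spring Boot parent and the BOMs shown below:

```xml
<dependency>
  <groupId>org.springframework.boot</groupId>
  <artifactId>spring-boot-starter-test</artifactId>
  <scope>test</scope>
</dependency>
<dependency>
  <groupId>org.testcontainers</groupId>
  <artifactId>junit-jupiter</artifactId>
  <scope>test</scope>
</dependency>
<dependency>
  <groupId>org.testcontainers</groupId>
  <artifactId>localstack</artifactId>
  <scope>test</scope>
</dependency>
<dependency>
  <groupId>org.awaitility</groupId>
  <artifactId>awaitility</artifactId>
  <scope>test</scope>
</dependency>
```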

To manage and align the versions of the Testcontainers and Spring Cloud AWS dependencies, we're using their BOMs:
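
A possible dependencyManagement section; the Spring Cloud AWS version matches the one mentioned in the update note, while the Testcontainers version used here is only an assumption:

```xml
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>io.awspring.cloud</groupId>
      <artifactId>spring-cloud-aws-dependencies</artifactId>
      <version>2.3.2</version>
      <type>pom</type>
      <scope>import</scope>
    </dependency>
    <dependency>
      <groupId>org.testcontainers</groupId>
      <artifactId>testcontainers-bom</artifactId>
      <!-- pick a recent Testcontainers release; 1.16.x is an assumption -->
      <version>1.16.2</version>
      <type>pom</type>
      <scope>import</scope>
    </dependency>
  </dependencies>
</dependencyManagement>
```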

Test Dependency Walkthrough

Let's have a look at why we need these dependencies and what they are doing.

As our demo application integrates two AWS services (S3 and SQS), we have to provide this infrastructure for the integration tests. We can either duplicate the AWS services for our tests or provide a mock infrastructure.

Option one is using real AWS services for our tests and hence making sure the application can work with them. It has the downside of additional AWS costs, and as soon as two developers execute the integration tests at the same time, they might run into conflicts.

With option two, we have the drawback of using a mock infrastructure and a more artificial environment (e.g., almost no latency, not using the real AWS services, …). On the other hand, it's cheap and can run on multiple machines in parallel. For this tutorial, we're choosing option two and making use of LocalStack:

  • Easy-to-use test/mocking framework for developing AWS applications
  • It spins up an environment with the same functionality and APIs as AWS
  • Available as a Docker container and supports all core AWS services in its community edition

To manage the lifecycle of the LocalStack Docker container, we're using Testcontainers:

  • Java library to manage throwaway Docker containers during test execution
  • Integrates with JUnit Jupiter to start and stop containers as part of the test lifecycle

As we're about to test an asynchronous operation, we include Awaitility:

  • Java Utility library to test asynchronous code
  • Express expectations using a DSL

Finally, the Spring Boot Starter Test serves multiple purposes:

  • Swiss-army knife for testing Spring applications
  • Including basic testing infrastructure: test engine, assertion libraries, mocking library, etc.

Spring Boot Application Walkthrough

First, let's have a look at how we configure the connection to AWS for our application.

The Java AWS SDK offers multiple ways (e.g., system properties, env variables, etc.) to configure the AWS credentials. As we are using the Spring Cloud Starter for AWS, we can additionally configure the access by specifying both the access and secret key inside our application.yml:
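
A sketch of such an application.yml; the region value and the keys inside the event-processing section are illustrative, while the cloud.aws.* keys are standard Spring Cloud AWS properties:

```yaml
cloud:
  aws:
    credentials:
      access-key: foo            # placeholder values, overridden in the tests
      secret-key: bar
    region:
      static: eu-central-1       # example region
    stack:
      auto: false                # no automatic stack detection, see below

# custom section with the logical resource names (key names are illustrative)
event-processing:
  order-event-queue-name: order-event-queue
  order-event-bucket-name: order-event-bucket
```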

Apart from this, we're also specifying the AWS region and both SQS and S3 logical resource names using a custom property section (event-processing). If we're using the Parameter Store of the AWS Systems Manager (SSM), we can also define such Spring Boot configuration values within AWS.

As the application is not running inside an AWS stack (e.g., on EC2), automatic stack detection is disabled by setting cloud.aws.stack.auto to false.

The actual processing logic of this demo application happens inside the SQS listener. We subscribe to the SQS queue using an annotation-driven listener:
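
A sketch of such a listener; the class name, the injected AmazonS3 client usage, and the property placeholders are illustrative, while the processMessage method and the OrderEvent payload match the description below:

```java
import com.amazonaws.services.s3.AmazonS3;
import io.awspring.cloud.messaging.listener.SqsMessageDeletionPolicy;
import io.awspring.cloud.messaging.listener.annotation.SqsListener;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Component;

@Component
public class OrderListener {

  private static final Logger LOG = LoggerFactory.getLogger(OrderListener.class);

  private final AmazonS3 amazonS3;
  private final String bucketName;

  public OrderListener(AmazonS3 amazonS3,
                       @Value("${event-processing.order-event-bucket-name}") String bucketName) {
    this.amazonS3 = amazonS3;
    this.bucketName = bucketName;
  }

  @SqsListener(value = "${event-processing.order-event-queue-name}",
               deletionPolicy = SqsMessageDeletionPolicy.ON_SUCCESS)
  public void processMessage(OrderEvent orderEvent) {
    LOG.info("Incoming order event: {}", orderEvent);
    // store a string representation of the event under the order id
    // (a real implementation might store the raw JSON payload instead)
    amazonS3.putObject(bucketName, orderEvent.getId(), orderEvent.toString());
  }
}
```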

The business logic of our message listener is dead simple. We log each incoming OrderEvent and upload it to S3.

The OrderEvent is a POJO containing information about an order:
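
A possible shape of this POJO; apart from an id and a LocalDateTime (which matters for the Jackson setup below), the fields are assumptions:

```java
import java.time.LocalDateTime;

public class OrderEvent {

  private String id;
  private String product;
  private LocalDateTime createdAt;

  public OrderEvent() {
    // required by Jackson for deserialization
  }

  public OrderEvent(String id, String product, LocalDateTime createdAt) {
    this.id = id;
    this.product = product;
    this.createdAt = createdAt;
  }

  public String getId() {
    return id;
  }

  // remaining getters, setters, and toString() omitted for brevity
}
```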

While the raw SQS payload is a String, the AWS messaging dependency allows us to deserialize it to a Java object. We are already doing this, as the processMessage method takes an OrderEvent as a parameter. Behind the scenes, this conversion is done using the MappingJackson2MessageConverter.

By default, this message converter instantiates its own Jackson ObjectMapper. As the OrderEvent uses a Java 8 LocalDateTime, we need the Jackson JavaTimeModule registered for our ObjectMapper.

We can provide our own MappingJackson2MessageConverter and set the ObjectMapper to override this default behavior. We're using the auto-configured ObjectMapper from Spring Boot for this purpose as it contains all required Jackson modules out-of-the-box:
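
A sketch of such a configuration class (the class name is illustrative); the template bean reuses the converter so that outgoing payloads are serialized with the same ObjectMapper:

```java
import com.amazonaws.services.sqs.AmazonSQSAsync;
import com.fasterxml.jackson.databind.ObjectMapper;
import io.awspring.cloud.messaging.core.QueueMessagingTemplate;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.messaging.converter.MappingJackson2MessageConverter;

@Configuration
public class MessagingConfig {

  // use Spring Boot's auto-configured ObjectMapper (JavaTimeModule included)
  // instead of the converter's own default ObjectMapper
  @Bean
  public MappingJackson2MessageConverter mappingJackson2MessageConverter(ObjectMapper objectMapper) {
    MappingJackson2MessageConverter converter = new MappingJackson2MessageConverter();
    converter.setSerializedPayloadClass(String.class);
    converter.setObjectMapper(objectMapper);
    return converter;
  }

  // optional: used during test execution to send messages to SQS
  @Bean
  public QueueMessagingTemplate queueMessagingTemplate(AmazonSQSAsync amazonSQSAsync,
                                                       MappingJackson2MessageConverter converter) {
    QueueMessagingTemplate queueMessagingTemplate = new QueueMessagingTemplate(amazonSQSAsync);
    queueMessagingTemplate.setMessageConverter(converter);
    return queueMessagingTemplate;
  }
}
```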

The second bean inside this configuration is optional, but we'll use the QueueMessagingTemplate to send a message to SQS during test execution.

Next, let's see how we can write integration tests for our Spring application using these AWS components with Testcontainers and LocalStack.

Integration Test Setup with Testcontainers and LocalStack

Now it's time to write an integration test for this asynchronous process with LocalStack and Testcontainers.

During the test execution, we need access to a local SQS queue and S3 bucket. For this purpose, we use the LocalStack module of Testcontainers and activate both AWS services, as shown in the test class sketch below.

With the help of @Testcontainers, we activate the JUnit Jupiter Testcontainers extension that will manage the lifecycle of the LocalStack Docker container:
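
A sketch of the resulting test class skeleton; the class name is illustrative, and the image tag follows the LocalStack version mentioned in the update note:

```java
import org.springframework.boot.test.context.SpringBootTest;
import org.testcontainers.containers.localstack.LocalStackContainer;
import org.testcontainers.containers.localstack.LocalStackContainer.Service;
import org.testcontainers.junit.jupiter.Container;
import org.testcontainers.junit.jupiter.Testcontainers;
import org.testcontainers.utility.DockerImageName;

@Testcontainers
@SpringBootTest
class OrderEventProcessingIT {

  // static, so one container is shared across all test methods of this class
  @Container
  static LocalStackContainer localStack =
      new LocalStackContainer(DockerImageName.parse("localstack/localstack:0.13.0"))
          .withServices(Service.SQS, Service.S3);

  // resource creation, property overrides, and the actual test follow below
}
```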

As our application expects an SQS queue to subscribe to and an S3 bucket to write data to, we need to create both resources. Testcontainers lets us perform additional setup tasks inside the Docker container using the execInContainer method. We use this mechanism to create the infrastructure with awslocal (LocalStack's thin wrapper around the AWS CLI):
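
A sketch of this setup step, placed inside the test class above and using the illustrative queue and bucket names from the configuration section:

```java
// inside the test class shown above
@BeforeAll
static void beforeAll() throws IOException, InterruptedException {
  // create the SQS queue and the S3 bucket the application expects
  localStack.execInContainer("awslocal", "sqs", "create-queue", "--queue-name", "order-event-queue");
  localStack.execInContainer("awslocal", "s3", "mb", "s3://order-event-bucket");
}
```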

As part of the JUnit Jupiter @BeforeAll lifecycle method, we prepare the LocalStack environment before any test is executed. While this is one possible solution to initialize our Docker container with Testcontainers, there are multiple other Testcontainers initialization strategies.

What's left is to override the endpoint for our AWS Java SDK clients. We don't want to reach out to the real AWS services during test execution but rather connect to our locally running LocalStack infrastructure.

LocalStack makes all activated AWS services available at a single edge port (4566). However, this edge port is mapped to an ephemeral port that changes with each test execution. Hence, we have to override this endpoint dynamically and can't hardcode http://localhost:4566:
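
A sketch of such a dynamic override with Spring's @DynamicPropertySource, assuming the Spring Cloud AWS 2.3 endpoint properties explained below and the illustrative resource names from above:

```java
// inside the test class shown above
@DynamicPropertySource
static void overrideProperties(DynamicPropertyRegistry registry) {
  // point the SQS and S3 integrations to the LocalStack edge port (mapped to a random host port)
  registry.add("cloud.aws.sqs.endpoint", () -> localStack.getEndpointOverride(Service.SQS).toString());
  registry.add("cloud.aws.s3.endpoint", () -> localStack.getEndpointOverride(Service.S3).toString());
  registry.add("cloud.aws.credentials.access-key", localStack::getAccessKey);
  registry.add("cloud.aws.credentials.secret-key", localStack::getSecretKey);
  registry.add("cloud.aws.region.static", localStack::getRegion);
  // logical resource names created in the @BeforeAll above
  registry.add("event-processing.order-event-queue-name", () -> "order-event-queue");
  registry.add("event-processing.order-event-bucket-name", () -> "order-event-bucket");
}
```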

Starting with Spring Cloud AWS 2.3.0, we can use a configuration property to set the endpoint URL for each AWS service integration. Furthermore, we override the logical names of our SQS queue and S3 bucket alongside the credentials for LocalStack. With the default configuration, LocalStack accepts every access and secret key combination and doesn't enforce IAM permissions.

Verifying the Spring Boot Event Processing

Finally, we can now write the actual test for our order event processing.

First, we put a message into the local AWS SQS queue using the QueueMessagingTemplate and then expect to find an object in the S3 bucket with the given order id:
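
A sketch of this test method, assuming the illustrative names from above and that the S3 client throws an AmazonS3Exception as long as the object doesn't exist yet:

```java
// inside the test class shown above; uses static imports for
// org.awaitility.Awaitility.given, java.util.concurrent.TimeUnit.SECONDS,
// and org.junit.jupiter.api.Assertions.assertNotNull
@Autowired
private QueueMessagingTemplate queueMessagingTemplate;

@Autowired
private AmazonS3 amazonS3;

@Test
void shouldStoreIncomingOrderEventInS3() {
  OrderEvent orderEvent = new OrderEvent("42", "MacBook Pro", LocalDateTime.now());

  queueMessagingTemplate.convertAndSend("order-event-queue", orderEvent);

  // the listener works asynchronously, so poll S3 for up to five seconds
  given()
    .ignoreException(AmazonS3Exception.class)
    .await()
    .atMost(5, SECONDS)
    .untilAsserted(() -> assertNotNull(amazonS3.getObjectMetadata("order-event-bucket", "42")));
}
```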

As this message processing happens asynchronously, we can't expect the object to be present in S3 right after putting the message in the queue. Therefore, we'll wait up to five seconds for the object to show up in S3 and otherwise fail the test. The given() part comes from Awaitility.

The .ignoreException() part is necessary, as the S3 client throws an exception whenever it can't find the requested object, which will be the case during the first milliseconds of polling.

Spring Cloud AWS < 2.3.0 Integration Test Setup

For projects that use a Spring Cloud AWS version before 2.3.0 or manually create the AWS Java SDK clients, the integration test setup looks slightly different.

We can provide the bean definition for our tests using a @TestConfiguration class and define the endpoint configuration when constructing the AWS Java SDK client:
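
A sketch of such an AwsTestConfig class, assuming the AWS SDK v1 client builders and the LocalStack container from the setup above:

```java
// static inner class of the integration test, so it is picked up automatically
@TestConfiguration
static class AwsTestConfig {

  @Bean
  public AmazonSQSAsync amazonSQSAsync() {
    return AmazonSQSAsyncClientBuilder.standard()
      .withCredentials(new AWSStaticCredentialsProvider(
        new BasicAWSCredentials(localStack.getAccessKey(), localStack.getSecretKey())))
      .withEndpointConfiguration(new AwsClientBuilder.EndpointConfiguration(
        localStack.getEndpointOverride(Service.SQS).toString(), localStack.getRegion()))
      .build();
  }

  @Bean
  public AmazonS3 amazonS3() {
    return AmazonS3ClientBuilder.standard()
      .withCredentials(new AWSStaticCredentialsProvider(
        new BasicAWSCredentials(localStack.getAccessKey(), localStack.getSecretKey())))
      .withEndpointConfiguration(new AwsClientBuilder.EndpointConfiguration(
        localStack.getEndpointOverride(Service.S3).toString(), localStack.getRegion()))
      // path-style access is usually required when talking to LocalStack's S3
      .withPathStyleAccessEnabled(true)
      .build();
  }
}
```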

Both clients will now point to the local Amazon S3 and Amazon SQS instances. As our @TestConfiguration class is a static inner class of our integration test, Spring automatically detects it. There's no need to explicitly use @Import(AwsTestConfig.class) unless we outsource it to a dedicated file.

Spring Boot, AWS, and LocalStack Summary

With this testing recipe, we are now able to write integration tests for our Spring Boot applications with ease. Thanks to LocalStack, we can launch a local AWS cloud in a matter of seconds. With the help of Testcontainers, we can seamlessly integrate LocalStack into the lifecycle of our test and run our local AWS cloud within a Docker container.

Furthermore, this setup also works for any other AWS service that LocalStack supports, e.g., DynamoDB or SNS. You can find the LocalStack AWS feature coverage and the feature parity for all services here. For more information about LocalStack and Testcontainers, take a look at the Java Testing Toolbox eBook.

The demo application is available on GitHub.

For more hands-on examples for building real-world applications with Spring Boot and AWS from top to bottom, make sure to take a look at the Stratospheric project. As part of this book, you'll learn everything you need to know to get your Spring Boot application into production with AWS. This includes AWS infrastructure setup with the AWS CDK, feature development with Spring Boot and Spring Cloud AWS, and best practices for operating the application in production.

Have fun writing integration tests for your Spring Boot application using AWS with LocalStack and Testcontainers,

Phil

Comments

  • Hi! Thanks for this! It’s been helpful to try and get something setup. My only issue is that when the bean AmazonSQSAsync is created in the TestConfiguration it doesn’t override the bean created in the actual application (which has to be marked as primary to override the default Spring one). This means it always tries to connect to the actual bean. Is there a way to make sure the test bean overrides the actual bean for testing?

    • Hey Dexter,

      yes, just recently I found a better way of doing it and also solving your request. You can put the following inside your application.properties file in src/test/resources

      spring.main.allow-bean-definition-overriding=true

      This will override the bean and you won’t have to use @Primary at all

  • Great article, thanks.

    Just one question, in my @SpringBootTest then @SqsListener is unable to connect to the queue. Any idea how to resolve this?

    SimpleMessageListenerContainer : An Exception occurred while polling queue 'my-queue'. The failing operation will be retried in 10000 milliseconds
    com.amazonaws.SdkClientException: Unable to execute HTTP request: Connect to localhost:56734 [localhost/127.0.0.1, localhost/0:0:0:0:0:0:0:1] failed: Connection refused (Connection refused)

    • Hi David,

      thanks for your feedback.

      I’m happy to help – could you please either create a GitHub issue or a Stack Overflow question and send me the link as a reply to this comment? Please also add your existing test setup and further information about your project (e.g. dependency version, etc.).

      The functionality of my blog’s comment section is quite limited when it comes to pasting large code examples and interacting seamlessly.

      Kind regards,
      Philip

  • {"email":"Email address invalid","url":"Website address invalid","required":"Required field missing"}
    >