If your Spring application uses AWS components like S3, SNS, or SQS, you might wonder how to write effective integration tests. Should I mock the entire interaction with AWS? Should I duplicate the AWS infrastructure for testing? Is there maybe a local AWS clone available? With this blog post, you'll learn how to write integration tests for your Spring Boot application integrating AWS services with the help of Testcontainers and LocalStack. The demo application uses both SQS and S3 to process order events, for which we'll write an integration test.
UPDATE: With more recent versions of LocalStack, Testcontainers, and Spring Cloud AWS, the required integration test setup is more streamlined. More about this in an upcoming blog post.
Spring Boot Application Setup
The application is a basic Spring Boot project using Java 11 and Spring Cloud. We're using the `spring-cloud-starter-aws-messaging` to have almost no setup for connecting to an SQS queue. Furthermore, this messaging starter has a transitive dependency on `spring-cloud-starter-aws` that provides the core AWS support:
```xml
<?xml version="1.0" encoding="UTF-8"?>
<project>
  <modelVersion>4.0.0</modelVersion>
  <parent>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-parent</artifactId>
    <version>2.3.9.RELEASE</version>
    <relativePath/> <!-- lookup parent from repository -->
  </parent>
  <groupId>de.rieckpil.blog</groupId>
  <artifactId>spring-boot-aws-integration-tests</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <name>spring-boot-aws-integration-tests</name>
  <description>Writing integration tests while using AWS S3</description>

  <properties>
    <java.version>11</java.version>
    <spring-cloud.version>Hoxton.SR9</spring-cloud.version>
    <testcontainers.version>1.15.2</testcontainers.version>
    <awaitility.version>4.0.2</awaitility.version>
  </properties>

  <dependencies>
    <dependency>
      <groupId>org.springframework.boot</groupId>
      <artifactId>spring-boot-starter-web</artifactId>
    </dependency>
    <dependency>
      <groupId>org.springframework.cloud</groupId>
      <artifactId>spring-cloud-starter-aws-messaging</artifactId>
    </dependency>
    <!-- Test Dependencies -->
  </dependencies>

  <dependencyManagement>
    <dependencies>
      <dependency>
        <groupId>org.springframework.cloud</groupId>
        <artifactId>spring-cloud-dependencies</artifactId>
        <version>${spring-cloud.version}</version>
        <type>pom</type>
        <scope>import</scope>
      </dependency>
    </dependencies>
  </dependencyManagement>

  <build>
    <plugins>
      <plugin>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-maven-plugin</artifactId>
      </plugin>
    </plugins>
  </build>
</project>
```
For writing integration tests we need the following test dependencies:
```xml
<dependency>
  <groupId>org.testcontainers</groupId>
  <artifactId>localstack</artifactId>
  <version>${testcontainers.version}</version>
  <scope>test</scope>
</dependency>
<dependency>
  <groupId>org.testcontainers</groupId>
  <artifactId>junit-jupiter</artifactId>
  <version>${testcontainers.version}</version>
  <scope>test</scope>
</dependency>
<dependency>
  <groupId>org.awaitility</groupId>
  <artifactId>awaitility</artifactId>
  <version>${awaitility.version}</version>
  <scope>test</scope>
</dependency>
<dependency>
  <groupId>org.springframework.boot</groupId>
  <artifactId>spring-boot-starter-test</artifactId>
  <scope>test</scope>
  <exclusions>
    <exclusion>
      <groupId>org.junit.vintage</groupId>
      <artifactId>junit-vintage-engine</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```
Let's have a look at why we need these dependencies and what they are doing.
As our demo application makes use of AWS services (S3 and SQS), we have to provide this infrastructure for the integration tests. We can either use the real AWS services for our tests or provide a mock infrastructure. Option one means testing against real AWS services and hence making sure the application can actually work with them. It has the downside of additional AWS costs, and if two developers execute the integration tests at the same time, they might run into conflicts.

With option two, we have the drawback of a mock infrastructure and a more artificial environment (e.g. almost no latency, not using the real AWS services, etc.). On the other hand, it's cheap and can run on multiple machines in parallel. For this tutorial, we're choosing option two and making use of LocalStack:
- Easy-to-use test/mocking framework for developing AWS applications
- It spins up an environment with the same functionality and APIs as AWS
- Available as Docker containers and supports all core AWS services in the community edition
To manage the lifecycle of the LocalStack Docker container, we're using Testcontainers:
- De-facto standard when it comes to spinning up containers for JUnit tests (introduction article)
- Comes with its own LocalStack module
Next, as we want to verify the main functionality of the application (processing of SQS messages), we have to test asynchronous code. Fortunately, there is a great Java library available for this: Awaitility:
- Java Utility library to test asynchronous code
- Express expectations using a DSL
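Conceptually, Awaitility repeatedly polls a condition until it holds or a timeout elapses. The following stdlib-only sketch is not the actual Awaitility implementation, but illustrates what `await().atMost(...)` does for us behind its DSL:

```java
import java.time.Duration;
import java.time.Instant;
import java.util.function.BooleanSupplier;

public class AwaitSketch {

  // Poll the condition until it returns true or the timeout elapses
  static boolean awaitAtMost(Duration timeout, BooleanSupplier condition)
      throws InterruptedException {
    Instant deadline = Instant.now().plus(timeout);
    while (Instant.now().isBefore(deadline)) {
      if (condition.getAsBoolean()) {
        return true; // condition fulfilled within the timeout
      }
      Thread.sleep(50); // poll interval between checks
    }
    return false; // timed out, a test would fail at this point
  }

  public static void main(String[] args) throws InterruptedException {
    long start = System.currentTimeMillis();
    // Condition that becomes true after roughly 200 ms, simulating
    // an asynchronously processed message
    boolean fulfilled = awaitAtMost(Duration.ofSeconds(2),
        () -> System.currentTimeMillis() - start > 200);
    System.out.println(fulfilled); // true
  }
}
```

Awaitility adds configurable poll intervals, exception handling, and assertion support on top of this basic loop, which is why we use the library instead of hand-rolling it.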
Finally, the Spring Boot Starter Test serves multiple purposes:
- Swiss-army knife for testing Spring applications
- Includes basic testing infrastructure: test engine, assertion libraries, mocking library, etc.
- Exclude the `junit-vintage-engine` to only use JUnit 5
Spring Boot Application Walkthrough
First, let's have a look at how the connection to AWS is configured. The Java AWS SDK offers multiple ways (e.g. system properties, env variables, etc.) to configure the AWS credentials. As we are using the Spring Cloud Starter for AWS, we can additionally configure them the Spring Boot-way by specifying both the access and secret key inside our `application.yml`:
```yaml
cloud:
  aws:
    region:
      static: eu-central-1
    stack:
      auto: false
    credentials:
      secretKey: ABC
      accessKey: XYZ

sqs:
  orderEventQueue: order-event-queue

s3:
  orderEventBucket: order-event-bucket
```
Apart from this, we're also specifying the AWS region and both SQS and S3 logical resource names. If we're using the Parameter Store of the AWS Systems Manager (SSM), we can also define such configuration values in AWS. As the application is not running inside an AWS stack (e.g. EC2), automatic stack detection is disabled with `cloud.aws.stack.auto`.
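If no credentials are configured the Spring Boot-way, the SDK falls back to its default provider chain, which checks several sources in a fixed order. A simplified stdlib-only sketch of the first two lookups; the property and variable names follow the AWS SDK v1 convention, but the real chain has more sources (profile files, instance metadata, etc.):

```java
import java.util.Optional;

public class CredentialsLookupSketch {

  // Simplified resolution order: Java system property first,
  // then environment variable (the real SDK chain checks more sources)
  static Optional<String> resolveAccessKey() {
    String fromProperty = System.getProperty("aws.accessKeyId");
    if (fromProperty != null) {
      return Optional.of(fromProperty);
    }
    return Optional.ofNullable(System.getenv("AWS_ACCESS_KEY_ID"));
  }

  public static void main(String[] args) {
    // Simulate the system property source taking precedence
    System.setProperty("aws.accessKeyId", "XYZ");
    System.out.println(resolveAccessKey().orElse("no credentials found")); // XYZ
  }
}
```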
The actual processing logic of this demo application happens inside the SQS listener. As we are using the messaging starter for AWS, we can easily subscribe to an SQS queue using `@SqsListener`. The logic for each incoming message is dead-simple: we log each incoming `OrderEvent` and upload it to S3.
```java
@Component
public class SimpleMessageListener {

  private final AmazonS3 amazonS3;
  private final ObjectMapper objectMapper;
  private final String orderEventBucket;

  public SimpleMessageListener(@Value("${s3.orderEventBucket}") String orderEventBucket,
                               AmazonS3 amazonS3, ObjectMapper objectMapper) {
    this.amazonS3 = amazonS3;
    this.objectMapper = objectMapper;
    this.orderEventBucket = orderEventBucket;
  }

  @SqsListener(value = "${sqs.orderEventQueue}")
  public void processMessage(@Payload OrderEvent orderEvent) throws JsonProcessingException {
    System.out.println("Incoming order: " + orderEvent);
    amazonS3.putObject(orderEventBucket, orderEvent.getId(),
      objectMapper.writeValueAsString(orderEvent));
    System.out.println("Successfully uploaded order to S3");
  }
}
```
The `OrderEvent` is a POJO containing information about the order:
```java
public class OrderEvent {

  private String id;
  private String product;
  private String message;
  private boolean expressDelivery;

  @JsonFormat(shape = Shape.STRING, pattern = "yyyy-MM-dd HH:mm:ss")
  private LocalDateTime orderedAt;

  // constructor, getters & setters
}
```
While the raw SQS payload is a `String`, the AWS messaging dependency allows us to deserialize it to a Java object. We are already doing this, as the `processMessage` method takes an `OrderEvent` as a parameter. Behind the scenes, this conversion is done using the `MappingJackson2MessageConverter`.
By default, this message converter instantiates its own Jackson `ObjectMapper`. As the `OrderEvent` uses a Java 8 `LocalDateTime`, we need the Java Time Module registered inside the `ObjectMapper`.
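The `@JsonFormat` pattern on `orderedAt` maps directly to a `java.time` formatter pattern; with the Java Time Module registered, Jackson uses it to render and parse the timestamp. A quick stdlib-only check of that pattern (using a made-up timestamp for illustration), showing it round-trips cleanly:

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

public class OrderDatePatternCheck {

  public static void main(String[] args) {
    // Same pattern as the @JsonFormat annotation on OrderEvent.orderedAt
    DateTimeFormatter formatter = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss");

    LocalDateTime orderedAt = LocalDateTime.of(2021, 3, 14, 9, 26, 53);
    String serialized = formatter.format(orderedAt);
    System.out.println(serialized); // 2021-03-14 09:26:53

    // Parsing the string back yields the original value, mirroring what
    // Jackson does on the consumer side when deserializing the payload
    LocalDateTime parsed = LocalDateTime.parse(serialized, formatter);
    System.out.println(parsed.equals(orderedAt)); // true
  }
}
```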
To override this default behavior, we can provide our own `MappingJackson2MessageConverter` and set the `ObjectMapper`. We're using the auto-configured `ObjectMapper` from Spring Boot for this, as it contains all required Jackson modules out-of-the-box:
```java
@Configuration
public class MessagingConfig {

  @Bean
  public MappingJackson2MessageConverter mappingJackson2MessageConverter(@Autowired ObjectMapper objectMapper) {
    MappingJackson2MessageConverter jackson2MessageConverter = new MappingJackson2MessageConverter();
    jackson2MessageConverter.setObjectMapper(objectMapper);
    return jackson2MessageConverter;
  }

  @Bean
  public QueueMessagingTemplate queueMessagingTemplate(@Autowired AmazonSQSAsync amazonSQS) {
    return new QueueMessagingTemplate(amazonSQS);
  }
}
```
The second bean inside this configuration is optional, but we'll use the `QueueMessagingTemplate` to send a message to SQS during test execution.
Next, let's see how we can write integration tests for our Spring application using these AWS components with Testcontainers and LocalStack.
Integration Test Setup with Testcontainers and LocalStack
Now we can focus on testing this application logic with an integration test. During the test execution, we need access to SQS and S3. For this, we'll use the LocalStack module of Testcontainers together with its JUnit Jupiter extension to manage the lifecycle of the LocalStack Docker container:
```java
@Testcontainers
@SpringBootTest
@Import(AwsTestConfig.class)
public class SimpleMessageListenerIT {

  private static final String QUEUE_NAME = "order-event-test-queue";
  private static final String BUCKET_NAME = "order-event-test-bucket";

  @Container
  static LocalStackContainer localStack =
    new LocalStackContainer(DockerImageName.parse("localstack/localstack:0.10.0"))
      .withServices(S3, SQS)
      .withEnv("DEFAULT_REGION", "eu-central-1");

  @BeforeAll
  static void beforeAll() throws IOException, InterruptedException {
    localStack.execInContainer("awslocal", "sqs", "create-queue", "--queue-name", QUEUE_NAME);
    localStack.execInContainer("awslocal", "s3", "mb", "s3://" + BUCKET_NAME);
  }

  // ... actual test
}
```
As our application expects an SQS queue to subscribe to and an S3 bucket to write data to, we need to create both resources. We can perform additional setup tasks inside the Docker container using the `execInContainer` method from Testcontainers. We use this mechanism to create the infrastructure with `awslocal` (LocalStack's thin AWS CLI wrapper).
As part of the JUnit Jupiter lifecycle (`@BeforeAll`), we prepare the LocalStack environment before any test is executed.
As the queue and bucket names differ from the ones we used for the production profile, we can specify the new names inside `src/test/resources/application.yml`:
```yaml
cloud:
  aws:
    region:
      static: eu-central-1
    stack:
      auto: false

sqs:
  orderEventQueue: order-event-test-queue

s3:
  orderEventBucket: order-event-test-bucket
```
Verify the AWS SQS Message Processing of The Spring Application
The remaining setup for our test is to configure the AWS clients properly. Without manually specifying the required clients as Spring beans, the auto-configuration mechanism would try to connect to the real AWS cloud during application startup.

As we don't want to connect to the actual AWS services and rather use our mocked LocalStack environment, we can provide the beans ourselves in a `@TestConfiguration` class:
```java
public class SimpleMessageListenerIT {

  // ... container setup

  @TestConfiguration
  static class AwsTestConfig {

    @Bean
    public AmazonS3 amazonS3() {
      return AmazonS3ClientBuilder.standard()
        .withCredentials(localStack.getDefaultCredentialsProvider())
        .withEndpointConfiguration(localStack.getEndpointConfiguration(S3))
        .build();
    }

    @Bean
    public AmazonSQSAsync amazonSQS() {
      return AmazonSQSAsyncClientBuilder.standard()
        .withCredentials(localStack.getDefaultCredentialsProvider())
        .withEndpointConfiguration(localStack.getEndpointConfiguration(SQS))
        .build();
    }
  }

  // ... tests
}
```
Both clients will now point to the local AWS S3 and AWS SQS instances. As our `@TestConfiguration` class is an inner static class of our integration test, it will be detected automatically. There's no need to explicitly use `@Import(AwsTestConfig.class)` unless we outsource it to a dedicated file.
Finally, we can now write the test for our order event processing.
First, we put a message into the local AWS SQS queue using the `QueueMessagingTemplate` and then expect to find an object in the S3 bucket with the given order id:
```java
public class SimpleMessageListenerIT {

  // ... setup like described

  @Autowired
  private AmazonS3 amazonS3;

  @Autowired
  private QueueMessagingTemplate queueMessagingTemplate;

  @Test
  public void testMessageShouldBeUploadedToS3OnceConsumed() {
    String orderId = UUID.randomUUID().toString();
    OrderEvent orderEvent = new OrderEvent(orderId, "MacBook", "42", LocalDateTime.now(), false);

    queueMessagingTemplate.convertAndSend(QUEUE_NAME, orderEvent);

    given()
      .ignoreException(AmazonS3Exception.class)
      .await()
      .atMost(5, SECONDS)
      .untilAsserted(() -> assertNotNull(amazonS3.getObject(BUCKET_NAME, orderId)));
  }
}
```
The `given()` part comes from Awaitility. As this message processing happens asynchronously, we can't expect the object to be present in S3 right after we put the message in the queue. Therefore, we'll wait up to five seconds to find the object in S3 and otherwise fail the test.
The `.ignoreException()` part is necessary, as the S3 client throws this exception whenever it can't find the requested object, which might be the case in the first milliseconds of trying to find it.
Summary
With the setup above, you are now able to write integration tests for your Spring application that include AWS components, using LocalStack and Testcontainers. Furthermore, this setup also works for any other AWS component that LocalStack supports, e.g. DynamoDB or SNS.
The demo application is available on GitHub.
If you’re interested in learning more about building applications with Spring Boot and AWS from top to bottom, make sure to take a look at the Stratospheric project. With this book, you'll learn all you need to know to get your Spring Boot application into production with AWS and how to effectively integrate multiple AWS services. (PS: I'm co-authoring this book)
Have fun writing integration tests for your Spring application using AWS with LocalStack and Testcontainers,
Phil
Comments

Hi! Thanks for this! It's been helpful to try and get something set up. My only issue is that when the bean `AmazonSQSAsync` is created in the `@TestConfiguration`, it doesn't override the bean created in the actual application (which has to be marked as primary to override the default Spring one). This means it always tries to connect to the actual bean. Is there a way to make sure the test bean overrides the actual bean for testing?

Hey Dexter,
yes, just recently I found a better way of doing it that also solves your request. You can put the following inside your `application.properties` file inside `src/test/resources`:

`spring.main.allow-bean-definition-overriding=true`

This will override the bean and you won't have to use `@Primary` at all.

How does the test configuration get the localStack reference?
For this demo, it’s static access: https://github.com/rieckpil/blog-tutorials/blob/master/spring-boot-aws-integration-tests/src/test/java/de/rieckpil/blog/AwsTestConfig.java
This does not scale well. You can instead introduce either an `AbstractIntegrationTest` class that takes care of the setup, or use an `ApplicationContextInitializer` to bootstrap the Docker containers and then also populate the required beans.