If you are familiar with Spring Boot, you might wonder if you can use this knowledge to write a serverless application with AWS Lambda. While plain Java is enough for simple use cases, it might be helpful to use Spring Framework features (e.g. the WebClient, data access, etc.). With Spring Cloud Function you can achieve this and use the AWS adapter to deploy it as an AWS Lambda. In combination with the Serverless Framework, you get a running function in under five minutes.
Using five different use cases, I'll demonstrate how to write and deploy AWS Lambda functions with Spring Cloud Function.
Maven project setup
To make use of Spring Boot for writing our Lambda functions, we can include the Spring Boot Maven parent for our application. It's not mandatory to include Spring Boot, as Spring Cloud Function also works without it.
As I'll demonstrate different examples of using Spring Cloud Function with AWS, our project contains multiple AWS, Spring Boot and Spring Cloud dependencies:
```xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <parent>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-parent</artifactId>
    <version>2.3.0.RELEASE</version>
    <relativePath/> <!-- lookup parent from repository -->
  </parent>
  <groupId>de.rieckpil.blog</groupId>
  <artifactId>spring-cloud-function-aws</artifactId>
  <version>1.0.0</version>
  <name>spring-cloud-function-aws</name>
  <description>Demo Spring Boot project as AWS Lambda</description>

  <properties>
    <java.version>11</java.version>
    <aws-lambda-java-core.version>1.2.1</aws-lambda-java-core.version>
    <spring-cloud-function.version>3.0.7.RELEASE</spring-cloud-function.version>
    <wrapper.version>1.0.17.RELEASE</wrapper.version>
    <aws-lambda-java-events.version>2.2.7</aws-lambda-java-events.version>
    <aws-java-sdk-s3.version>1.11.792</aws-java-sdk-s3.version>
  </properties>

  <dependencies>
    <dependency>
      <groupId>org.springframework.boot</groupId>
      <artifactId>spring-boot-starter</artifactId>
    </dependency>
    <dependency>
      <groupId>org.springframework.boot</groupId>
      <artifactId>spring-boot-starter-json</artifactId>
    </dependency>
    <dependency>
      <groupId>com.fasterxml.jackson.dataformat</groupId>
      <artifactId>jackson-dataformat-xml</artifactId>
    </dependency>
    <dependency>
      <groupId>org.springframework.cloud</groupId>
      <artifactId>spring-cloud-function-adapter-aws</artifactId>
      <version>${spring-cloud-function.version}</version>
    </dependency>
    <dependency>
      <groupId>com.amazonaws</groupId>
      <artifactId>aws-java-sdk-s3</artifactId>
      <version>${aws-java-sdk-s3.version}</version>
    </dependency>
    <dependency>
      <groupId>com.amazonaws</groupId>
      <artifactId>aws-lambda-java-core</artifactId>
      <version>${aws-lambda-java-core.version}</version>
    </dependency>
    <dependency>
      <groupId>com.amazonaws</groupId>
      <artifactId>aws-lambda-java-events</artifactId>
      <version>${aws-lambda-java-events.version}</version>
    </dependency>
  </dependencies>

  <!-- build section described below -->
</project>
```
Besides including all required dependencies, we have to configure the way the `.jar` is built. Once we want to deploy the application to AWS Lambda, we can't rely on the default way the Spring Boot Maven plugin bundles our project. Instead, we have to create a shaded `.jar` (you can find a good explanation of shading dependencies here and here). For this we'll make use of the Maven Shade Plugin and configure the `build` section of our `pom.xml` as follows:
```xml
<build>
  <finalName>${project.artifactId}</finalName>
  <plugins>
    <plugin>
      <groupId>org.springframework.boot</groupId>
      <artifactId>spring-boot-maven-plugin</artifactId>
      <dependencies>
        <dependency>
          <groupId>org.springframework.boot.experimental</groupId>
          <artifactId>spring-boot-thin-layout</artifactId>
          <version>${wrapper.version}</version>
        </dependency>
      </dependencies>
    </plugin>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <configuration>
        <createDependencyReducedPom>false</createDependencyReducedPom>
        <shadedArtifactAttached>true</shadedArtifactAttached>
        <shadedClassifierName>shaded</shadedClassifierName>
      </configuration>
    </plugin>
  </plugins>
</build>
```
Furthermore, we use the Spring Boot Thin Launcher to optimize the size of the `.jar` file. You can read more about this experimental feature on GitHub. For this example, the thin layout reduces the size of the artifact by almost 50% (39 MB to 20 MB).
When building the application with `mvn package`, Maven now outputs two `.jar` files. For deploying the application to AWS, only the shaded build artifact is relevant. In our case that's `spring-cloud-function-aws-1.0.0-shaded.jar`.
Introduction to Spring Cloud Function
Before we write our first AWS Lambda with Spring Cloud Function, I quickly want to summarize what Spring Cloud Function is all about.
The documentation states the following goals about this Spring Cloud project:
- Promote the implementation of business logic via functions.
- Decouple the development lifecycle of business logic from any specific runtime target so that the same code can run as a web endpoint, a stream processor, or a task.
- Support a uniform programming model across serverless providers, as well as the ability to run standalone (locally or in a PaaS).
- Enable Spring Boot features (auto-configuration, dependency injection, metrics) on serverless providers.
It abstracts away all of the transport details and infrastructure, allowing the developer to keep all the familiar tools and processes, and focus firmly on business logic.
In a nutshell, you write a plain Java 8 `Function`, `Supplier`, or `Consumer`, make use of well-known Spring Boot features, select an adapter for your cloud provider (e.g. AWS, Azure, etc.), and get a running function without caring about provisioning hardware (aka serverless). That's it.
What's left is to pick the right AWS Request Handler provided by the Spring Cloud Function AWS adapter. There are multiple to choose from and I'll cover most of them throughout the different examples:
- `FunctionInvoker`
- `SpringBootRequestHandler`
- `SpringBootStreamHandler`
- `SpringBootKinesisEventHandler`
- `SpringBootApiGatewayRequestHandler`
All of them use either the `RequestHandler` or the `RequestStreamHandler` interface of the underlying AWS Lambda Java dependency.
FYI: In a previous blog post I developed an AWS Lambda function while implementing one of these interfaces without Spring Cloud Function.
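For reference, here is a minimal sketch of that plain approach, built directly on the `RequestHandler` interface from `aws-lambda-java-core` (the class name is my own illustration, not part of this project):

```java
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;

// A bare-bones AWS Lambda handler without Spring Cloud Function:
// the business logic lives directly inside handleRequest.
public class PlainUppercaseHandler implements RequestHandler<String, String> {

  @Override
  public String handleRequest(String input, Context context) {
    context.getLogger().log("Processing input: " + input);
    return input.toUpperCase();
  }
}
```

With Spring Cloud Function, the adapter's handlers take over this role, so our own code only contains the function itself.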
A short note about the deployment
Throughout the following examples, I'm using the Serverless Framework to deploy the functions to AWS Lambda. The basic setup looks like the following:
```yaml
service: spring-cloud-function-aws

provider:
  name: aws
  runtime: java11
  region: eu-central-1
  profile: serverless-admin
  memorySize: 1024
  timeout: 10
  iamRoleStatements:
    - Effect: 'Allow'
      Action:
        - 's3:*'
      Resource:
        - 'arn:aws:s3:::${self:custom.orderEventBucket}/*'
        - !Join ['', ['arn:aws:s3:::', !Ref ServerlessDeploymentBucket, '/*']]

custom:
  orderEventBucket: order-event-bucket-rieckpil

package:
  artifact: target/spring-cloud-function-aws-1.0.0-shaded.jar
```
Besides the standard region and runtime configuration for AWS Lambda, I'm adding an IAM role statement so the function can access S3 later on. It's important to use at least 1024 MB of memory and a timeout of 10 seconds, as the first invocation (cold start) of the AWS Lambda takes more time.
With the `package` attribute, we point to the shaded `.jar` file which will be uploaded to AWS for each deployment.
You'll find the specific function configuration within each section. For more information on using the Serverless Framework, take a look at this blog post.
AWS Lambda to uppercase a String
Let's start with a simple use case, the Hello World of writing functions: uppercasing a `String`. Within our Spring Boot application, we can define a `Function<String, String>` and make it available to the Spring context using `@Bean`:
```java
@Configuration
public class FunctionConfiguration {

  private static Logger logger = LoggerFactory.getLogger(Application.class);

  @Bean
  public Function<String, String> uppercase() {
    return value -> {
      logger.info("Processing uppercase for String '{}'", value);
      return value.toUpperCase();
    };
  }

  // ... more functions to come
}
```
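Since the function is an ordinary Spring bean, you can also exercise it locally before deploying it anywhere. The following is just a sketch: it assumes `spring-boot-starter-test` is added as a test dependency (it's not part of the `pom.xml` above), and the test class name is my own:

```java
import static org.junit.jupiter.api.Assertions.assertEquals;

import java.util.function.Function;

import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;

@SpringBootTest
class UppercaseFunctionTest {

  @Autowired
  private Function<String, String> uppercase; // resolved by its generic type and bean name

  @Test
  void shouldUppercaseIncomingValue() {
    assertEquals("HELLO WORLD", uppercase.apply("hello world"));
  }
}
```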
For the handler, I'm choosing `FunctionInvoker`, which is one of the most generic ones and implements `RequestStreamHandler` under the hood. The handler takes care of deserializing the payload, invoking the correct function, and serializing the result.

As we can have multiple functions within the Spring context, we have to tell Spring Cloud Function which function we want to trigger. For the `FunctionInvoker`, we can achieve this with the environment variable `SPRING_CLOUD_FUNCTION_DEFINITION`:
```yaml
functions:
  uppercase:
    handler: org.springframework.cloud.function.adapter.aws.FunctionInvoker
    environment:
      SPRING_CLOUD_FUNCTION_DEFINITION: uppercase
```
Once we build the project and deploy it to AWS (using `serverless deploy`), we can invoke it with either Serverless or the AWS Console:
```sh
sls invoke -l -f uppercase -d 'hello world from console!'

"HELLO WORLD FROM CONSOLE!"
```
AWS Lambda to generate random UUIDs
Next, let's add an example of using the `Supplier` interface. This is relevant for use cases where it's not necessary to consume a value but only to return one:
```java
@Bean
public Supplier<String> randomString() {
  return () -> UUID.randomUUID().toString();
}
```
For this example, we'll use the `SpringBootRequestHandler`, which we can extend to define the input and output types of our processing:
```java
public class EmptyInputHandler extends SpringBootRequestHandler<Void, String> {
}
```
The input type is `Void` in this case, as we don't care about any incoming value and just produce a random `String`.
We can now use our `EmptyInputHandler` as the handler and have to specify which function we want to invoke, in this case using the `FUNCTION_NAME` environment variable:
```yaml
functions:
  # ...
  randomString:
    handler: de.rieckpil.blog.EmptyInputHandler
    environment:
      FUNCTION_NAME: randomString
```
Invoking the AWS Lambda without any input returns a random UUID:
```sh
sls invoke -l -f randomString

"2835b2de-741e-44b1-b17d-0fee28ee1571"
```
AWS Lambda to process S3 events
For our next example, I want to make use of the `Consumer` interface to only consume values without returning anything. As a more realistic use case, we'll trigger the function whenever a file is uploaded to an S3 bucket:
```java
@Bean
public Consumer<S3Event> processS3Event() {
  return s3Event -> {
    String bucket = s3Event.getRecords().get(0).getS3().getBucket().getName();
    String key = s3Event.getRecords().get(0).getS3().getObject().getKey();

    logger.info("Something was uploaded to S3: " + bucket + "/" + key);

    // ... further processing of the S3Event
  };
}
```
The handler for this processing looks like the following:
```java
public class S3EventHandler extends SpringBootRequestHandler<S3Event, Void> {
}
```
With Serverless we can now configure the event on which the Lambda is triggered:
```yaml
functions:
  # ...
  s3EventProcessor:
    handler: de.rieckpil.blog.S3EventHandler
    events:
      - s3:
          bucket: ${self:custom.orderEventBucket}
          event: s3:ObjectCreated:*
    environment:
      FUNCTION_NAME: processS3Event
```
… and create the following output on each upload:
```
INFO 7 --- [main] de.rieckpil.blog.Application : Something was uploaded to S3: order-event-bucket-rieckpil/crud-applications-with-spring-boot-course-logo.png
```
AWS Lambda behind an API Gateway Part I
As AWS Lambda functions can also sit behind an API Gateway and be triggered by HTTP calls, let's add an example for this.
Let's say we want a REST API endpoint `/persons` to process and store `Person` entities somewhere:
```java
public class Person {

  private String id;
  private String name;
  private LocalDate dayOfBirth;

  // getters and setters omitted
}
```
Within the processing, we get access to the deserialized Java object and can perform any operation, e.g. store it in a database:
```java
@Bean
public Function<Message<Person>, Message<Person>> processPerson() {
  return value -> {
    Person person = value.getPayload();
    logger.info("Processing incoming person '{}'", person);

    // ... storing the Person in a database
    person.setId(UUID.randomUUID().toString());

    logger.info("Successfully stored person in database with id '{}'", person.getId());

    Map<String, Object> resultHeader = new HashMap<>();
    resultHeader.put("statuscode", HttpStatus.CREATED.value());
    resultHeader.put("X-Custom-Header", "Hello World from Spring Cloud Function AWS Adapter");

    return new GenericMessage<>(person, resultHeader);
  };
}
```
Here I'm wrapping the input and output types in the `Message` interface from `org.springframework.messaging`. This is not required, but as we'll use the `SpringBootApiGatewayRequestHandler`, we can use this wrapper to add metadata (e.g. headers) to the response object.
The handler will take care of creating the correct response type (including the `statuscode` field, etc.) for the API Gateway:
```yaml
functions:
  # ...
  createPerson:
    handler: org.springframework.cloud.function.adapter.aws.SpringBootApiGatewayRequestHandler
    events:
      - http:
          path: persons
          method: post
          cors: true
    environment:
      FUNCTION_NAME: processPerson
```
Once you deploy the function with the Serverless framework, you'll receive an HTTP endpoint to trigger your Lambda:
```sh
curl -X POST -H "Content-Type: application/json" \
  -d '{"name":"duke", "dayOfBirth":"2020-01-01"}' \
  https://jcpgqxtzmd.execute-api.eu-central-1.amazonaws.com/dev/persons

{"id":"aad301e0-4af7-4278-af04-23f7c098b278","name":"duke","dayOfBirth":"2020-01-01"}
```
AWS Lambda behind an API Gateway Part II
In this last example, I want to show you how to have more control over the deserialization of the incoming payload. For this, we'll use the `SpringBootStreamHandler` and parse the incoming XML payload of our AWS Lambda (behind an API Gateway) ourselves.
Like the example above, we'll accept entities via a REST API and process them:
```java
@Bean
public Function<APIGatewayProxyRequestEvent, APIGatewayProxyResponseEvent> processXmlOrder() {
  return value -> {
    try {
      ObjectMapper objectMapper = new XmlMapper();
      Order order = objectMapper.readValue(value.getBody(), Order.class);

      logger.info("Successfully deserialized XML order '{}'", order);

      // ... processing the Order
      order.setProcessed(true);

      APIGatewayProxyResponseEvent responseEvent = new APIGatewayProxyResponseEvent();
      responseEvent.setStatusCode(201);
      responseEvent.setHeaders(Map.of("Content-Type", "application/xml"));
      responseEvent.setBody(objectMapper.writeValueAsString(order));

      return responseEvent;
    } catch (IOException e) {
      e.printStackTrace();
      return new APIGatewayProxyResponseEvent().withStatusCode(500);
    }
  };
}
```
The input and output types come from the `aws-lambda-java-events` dependency and represent the payload of the API Gateway. As we are not defining the actual payload type, we can parse the body of the `APIGatewayProxyRequestEvent` (containing the XML as a string) ourselves.
You can even go further, request the raw `InputStream` for your function, and handle everything yourself.
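The article doesn't show that variant, but as a rough sketch (here without Spring Cloud Function, using the plain `RequestStreamHandler` interface from `aws-lambda-java-core`; the class name is mine), raw stream handling could look like this:

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;

import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestStreamHandler;

// Low-level handler: reads the raw request bytes and writes the raw response bytes.
// Every bit of (de)serialization is our own responsibility here.
public class RawStreamHandler implements RequestStreamHandler {

  @Override
  public void handleRequest(InputStream input, OutputStream output, Context context) throws IOException {
    byte[] payload = input.readAllBytes();
    context.getLogger().log("Received " + payload.length + " bytes");

    // ... parse the payload (e.g. XML) and build the response by hand
    output.write("{\"processed\": true}".getBytes(StandardCharsets.UTF_8));
  }
}
```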
The deployment configuration is similar to the other Lambda behind the API Gateway:
```yaml
functions:
  # ...
  createXmlOrder:
    handler: org.springframework.cloud.function.adapter.aws.SpringBootStreamHandler
    events:
      - http:
          path: orders
          method: post
          cors: true
    environment:
      FUNCTION_NAME: processXmlOrder
```
Spring Cloud Function for AWS Lambda conclusion
After reading this blog post, you shouldn't jump into rewriting your whole application with this approach. Use it wisely where an AWS Lambda function makes sense for your use case. Whether you use Spring Cloud Function or implement the Java AWS Lambda interfaces (`RequestHandler`/`RequestStreamHandler`) directly also depends on your requirements.
There is definitely an overhead when using Spring Cloud Function (bigger `.jar` files, slower cold starts as we have to start the Spring context), but being able to stay in the well-known Spring ecosystem may also result in more productivity.
While Spring Cloud Function provides a helpful abstraction, the biggest pain point for me in the beginning was finding the correct handler for my function. The documentation on them is limited, and inspecting the source code of each handler in the IDE helped me quickly understand their differences.
You can find the source code for the examples above on GitHub.
Have fun writing AWS Lambda functions with Spring Cloud Function,
Philip
Hey Philip,
Very helpful article.
However, while trying out a Supplier function with AWS Lambda, I found it does not work. Below are the details:
My function is:
```java
@Bean
Supplier getAllUsers() {
  return () -> new User();
}
```
I have also created a custom handler as per your article:
```java
public class SupplierRequestHandler extends SpringBootRequestHandler {
}
```
When I run it locally, it works, but while running via AWS Lambda, I get this problem:
```
Could Not Convert Input: org.springframework.messaging.converter.MessageConversionException
org.springframework.messaging.converter.MessageConversionException: Could Not Convert Input
```
Hey Saikat,
thanks for reaching out.
Are you deploying your AWS Lambda behind an API Gateway and trying to access it over an HTTP endpoint, or do you invoke it directly via Serverless?
PS: I’ve created a course on this topic that might help you get more insights and maybe automatically solve the issue. You can watch it here: https://rieckpil.de/courses/going-serverless-with-java/
Kind regards,
Philip
Philip,
Thanks for the time spent writing this article. I just started digging deep into AWS Lambda and this is what I needed. Keep up the good work!
Thank you, Nikola. I'm glad I could help you 🙂
Many thanks for your article. How can I inject a JpaRepository into the processXmlOrder() method, so I can send data to my database?
Hi Denilson,
you can inject any Spring bean into the class where you define the `@Bean` function for AWS Lambda. If your project contains Spring Data JPA, simply add `@Autowired` to your repository field or use constructor injection.
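To illustrate this reply, here is a minimal sketch of constructor injection into the configuration class. The class name, the `OrderRepository`, and the `Long` id type are my own assumptions for illustration, not part of the example project; the repository refers to the article's `Order` entity:

```java
import java.util.function.Function;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.jpa.repository.JpaRepository;

import com.amazonaws.services.lambda.runtime.events.APIGatewayProxyRequestEvent;
import com.amazonaws.services.lambda.runtime.events.APIGatewayProxyResponseEvent;

// Hypothetical Spring Data JPA repository for the Order entity from the article
interface OrderRepository extends JpaRepository<Order, Long> {
}

@Configuration
public class OrderFunctionConfiguration {

  private final OrderRepository orderRepository;

  // constructor injection: Spring provides the repository when creating this configuration class
  public OrderFunctionConfiguration(OrderRepository orderRepository) {
    this.orderRepository = orderRepository;
  }

  @Bean
  public Function<APIGatewayProxyRequestEvent, APIGatewayProxyResponseEvent> processXmlOrder() {
    return value -> {
      // ... deserialize the XML body into an Order as shown in the article, then:
      // orderRepository.save(order);
      return new APIGatewayProxyResponseEvent().withStatusCode(201);
    };
  }
}
```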