5 September 2021

Mission accomplished

Micronaut to the cloud


Today’s most popular Java frameworks are designed to put developer experience first, albeit at the cost of a little startup time and memory overhead. Developers would agree that for most applications this is a price worth paying, except in the world of serverless and Function-as-a-Service (FaaS), where - quite literally - it isn’t.

Traditional frameworks running on the JVM are generally considered too slow to start and too memory hungry to be serious candidates for FaaS architectures. This is because FaaS providers - the likes of AWS, Apache OpenWhisk and Microsoft Azure - charge based on memory usage and execution time (including boot time).

However, the recent release of Quarkus put the spotlight back on the viability of serverless functions written in Java. GraalVM and Ahead-of-Time (AOT) compilation mean that application boot time and memory need no longer come at the expense of the framework niceties that developers have grown fond of. They can co-exist.

But Quarkus was not the first to notice the potential of AOT to deliver the best of both worlds. More than a year before Quarkus, the team behind Grails gave us a sneak preview of Micronaut, a JVM-based framework promising (among other things) to launch Java into the serverless space. Six months later came the GA release:

Over a year in the making, Micronaut 1.0 represents a major leap forward in our mission to enable developers to create efficient, low-memory-footprint microservices and serverless apps for the JVM.

Fast forward to today, and the (imminent) 1.1 release promises some additions that make FaaS with Micronaut even more tempting, including:

  • Support for AWS API Gateway and GraalVM custom runtimes.
  • A simplified command to turn your JAR into a native image for ultra-fast startups.
  • Even faster cold start times and performance optimizations.
  • New test templates to make unit testing easier.

We’ve decided not to wait for the GA release of 1.1 but instead see if we can launch our micronaut into the cloud, so to speak, using the new support for AWS Lambda.

So, if you care to join us, buckle up and hold onto your seats because this is a mission to the cloud and it’s going to be fast!

Quick guide: Micronaut on AWS Lambda

In this short guide we'll create a “greeting” function that runs as a native image on AWS Lambda using a custom runtime. Here you will:

  • Generate a project with the Micronaut CLI.
  • Code a simple function.
  • Unit test the function using Micronaut’s test framework.
  • Build an AWS native image.
  • Deploy to AWS Lambda with the AWS CLI.

Setup

Install:

  • Micronaut
  • Docker
  • AWS CLI

Micronaut comes with a convenient CLI to create applications and functions.

Create a project called micronaut-greeter with everything required to deploy to AWS by including the aws-api-gateway-graal feature:

mn create-app micronaut-greeter --features aws-api-gateway-graal

Open the project in your favorite IDE and we're ready to begin.

Code

Let's create a controller and inject a service that does the greeting:

import io.micronaut.http.annotation.Controller;
import io.micronaut.http.annotation.Get;

import javax.inject.Inject;

@Controller
public class RocketController {

    // The injected field is left non-private so Micronaut can
    // inject it without resorting to reflection.
    @Inject
    RadioService radio;

    @Get("/greet")
    public String greet() {
        return radio.greet();
    }

}

Our injected service looks like this:

import javax.inject.Singleton;

@Singleton
public class RadioService {

    public String greet() {
        return "Hello from Micronaut";
    }

}

That's all the code we need.

Test

Micronaut eliminates the artificial separation imposed by traditional frameworks between functional and unit tests, because it’s inexpensive to start and stop the entire application between tests. We will create a test for our REST API by sending it an HTTP request, just as a real user might through a web browser. For this we need an HTTP client, and Micronaut’s declarative client support makes it very simple to create one dedicated to our application:

import io.micronaut.http.annotation.Get;
import io.micronaut.http.client.annotation.Client;

@Client("/")
public interface RadioClient {

    @Get("/greet")
    String greet();

}

All the necessary HTTP client code is generated behind the scenes.
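To get a feel for what that saves us, here is a rough hand-written equivalent using Micronaut's low-level HTTP client. This is only an illustrative sketch - the class name, the hard-coded URL and the blocking style are assumptions for the example, not what the framework actually generates:

import io.micronaut.http.HttpRequest;
import io.micronaut.http.client.RxHttpClient;

import java.net.URL;

// Hypothetical hand-rolled counterpart of the declarative RadioClient above.
public class ManualRadioClient {

    public String greet() throws Exception {
        // Point a low-level client at the running server (assumed here to be on
        // localhost:8080; the declarative client resolves the target from @Client("/")).
        try (RxHttpClient client = RxHttpClient.create(new URL("http://localhost:8080"))) {
            // Issue the GET request and block for the response body.
            return client.toBlocking().retrieve(HttpRequest.GET("/greet"));
        }
    }

}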

Prior to version 1.1, explicitly starting and stopping Micronaut was necessary and the client needed to be fetched from the application context:

import io.micronaut.context.ApplicationContext;
import io.micronaut.runtime.server.EmbeddedServer;
import org.junit.jupiter.api.Test;

import static org.junit.jupiter.api.Assertions.assertEquals;

public class RocketTest {

    @Test
    public void test_greeter_response_is_as_expected() {
        // Boot the whole application on an embedded server for this test
        EmbeddedServer server = ApplicationContext.run(EmbeddedServer.class);

        RadioClient client = server.getApplicationContext().getBean(RadioClient.class);
        assertEquals("Hello from Micronaut", client.greet());

        server.stop();
    }

}

However, the new @MicronautTest annotation allows us to inject our HTTP client into our unit test and we don't have to worry about starting or stopping the server:

import io.micronaut.test.annotation.MicronautTest;
import org.junit.jupiter.api.Test;

import javax.inject.Inject;

import static org.junit.jupiter.api.Assertions.assertEquals;

@MicronautTest
public class RocketTest {

    @Inject
    RadioClient client;

    @Test
    public void test_greeter_response_is_as_expected() {
        String response = client.greet();
        assertEquals("Hello from Micronaut", response);
    }

}

The previous way of unit testing is still supported, but the new version makes test code more succinct and readable.

Build

FaaS providers load functions on demand. If we want the fastest boot time possible, we need to compile our application into a native image. Our target FaaS provider is AWS Lambda, so we will create a native image targeted specifically at a custom (as opposed to Java) Lambda runtime.

At the start of the guide we created our project with the aws-api-gateway-graal feature, and this generated everything we need to build our native image with Docker.

Execute the following to build the image:

docker build . -t micronaut-greeter

This builds our project using Gradle (the default build tool for Micronaut), downloads an Amazon Linux-compatible distribution, and builds a native image of our application using GraalVM.

To deploy to AWS Lambda as a custom runtime, we must provide our newly generated native image along with a bootstrap file responsible for executing it. Since the resulting function.zip is generated within the container, we need to copy it to our local file system before we can deploy it. Make a build directory:

mkdir build

Then execute:

docker run --rm --entrypoint cat micronaut-greeter /home/application/function.zip > build/function.zip

Launch time! Destination AWS Lambda awaits.

Deploy

Use the AWS CLI to create a Lambda function.

aws lambda create-function \
 --function-name micronaut-greeter \
 --zip-file fileb://build/function.zip \
 --handler function.handler \
 --runtime provided \
 --role arn:aws:iam::123456789012:role/lambda-role

Run

Ordinarily, AWS Lambda functions are invoked over HTTP through Amazon API Gateway, which acts as the HTTP endpoint and routes requests to Lambda. However, that setup is outside the scope of this guide. For testing purposes, the AWS CLI provides an invoke command. It requires the following payload:

{
  "resource": "/{proxy+}",
  "path": "/greet",
  "httpMethod": "GET",
  "pathParameters": {
    "proxy": "greet"
  },
  "requestContext": {
    "identity": {}
  }
}

Here is the full command to invoke our Lambda function:

aws lambda invoke --function-name micronaut-greeter --payload '{"resource": "/{proxy+}","path": "/greet","httpMethod": "GET","pathParameters":{"proxy":"greet"},"requestContext":{"identity":{}}}' build/response.txt

Invoking Lambda through the AWS Management Console is also possible.

The cold start execution time breaches the 100ms billing threshold, so it is charged at 200ms - but only just, and it’s still very quick!

When warmed up, the execution time is minimal.

The numbers vary per execution of course, but it’s interesting to note the cold start duration of 107.68ms versus the warm start of 3.64ms. The cold start time in particular is the one to pay attention to - and it’s still exceedingly fast.

Conclusion

Micronaut takes inspiration from Spring and Grails, so those already familiar with these popular frameworks will feel right at home. But its super-fast startup and low memory consumption open up new possibilities.

We’ve seen in the guide that the boundary between unit and functional testing disappears - testing REST endpoints in Micronaut is a breeze.

With support for native image compilation we’ve seen Micronaut boot up from a cold start on AWS Lambda in impressively quick times, making it a viable means to develop Java applications destined for FaaS. Will Micronaut be the vehicle that launches Java into the FaaS space?

We’re interested to find out!
