The faster Lambda runtime - NodeJS or LLRT? We benchmarked.

AWS Lambda NodeJS vs LLRT

LLRT?

No clue what it means or what it does? That's how I felt when I saw it for the first time. But you don't have to feel the same way: LLRT stands for Low Latency Runtime, AWS's experimental, lightweight JavaScript runtime for Lambda.

I’ve tested, benchmarked and compiled all the details which will help you make the best use of AWS LLRT.

Here is what you need to know about LLRT

LLRT v0.1.0 beta was released in October 2023.

Warning

For now, LLRT is an experimental package. It is subject to change and intended only for testing purposes.

LLRT compatibility with NodeJS

Internally, LLRT uses QuickJS, a JavaScript engine that weighs in at under 1 MB, compared to the NodeJS binary at roughly 160 MB.

You can download the QuickJS binary from its website to try it out.

NodeJS vs QuickJS on scale

Setting up the full NodeJS runtime on every cold start is time consuming for AWS Lambda, which leads to a higher cold start / init time.

As of March 2024, built-in node libraries like buffer, stream, fetch and fs/promises are fully supported by AWS LLRT, while some others like fs, http, tls and child_process are partially supported at the moment.

This is what the AWS team had to say:

The long term goal for LLRT is to become WinterCG compliant. Not every API from Node.js will be supported.

This means that LLRT will not implement every NodeJS API, only the major ones. You can find out whether LLRT supports the Node libraries you use in the LLRT GitHub docs.

How does this impact you?

Many of your favourite NodeJS libraries may not work with LLRT, so check if a library is compatible with LLRT APIs for AWS Lambda.

You can check the compatibility matrix on the LLRT repo.
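
To make that concrete, here is a minimal, hypothetical handler sketch that sticks to modules LLRT lists as fully supported (fetch and fs/promises) and avoids the partially supported ones. The URL and config file name are placeholders, not something from the LLRT docs:

// llrt-friendly handler sketch: only uses fully supported APIs.
import { readFile } from "fs/promises";
import type { APIGatewayProxyHandlerV2 } from "aws-lambda";

export const handler: APIGatewayProxyHandlerV2 = async () => {
  // fetch is available globally in LLRT, so no http/https module is needed.
  const upstream = await fetch("https://example.com/health"); // placeholder URL
  // fs/promises is fully supported; config.json is a hypothetical file bundled with the handler.
  const config = await readFile("config.json", "utf-8");
  return {
    statusCode: 200,
    headers: { "Content-Type": "text/plain" },
    body: `upstream status: ${upstream.status}, config bytes: ${config.length}`,
  };
};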

Tip

Fascinated to see the benchmarks, but didn’t get to use LLRT yourself? Try a quick LLRT setup using this guide.

AWS SDK v3 Optimization

The AWS team has optimized and bundled many AWS SDK v3 clients into the LLRT runtime itself, so you don't need to include them in your JavaScript bundle.

Supported AWS SDKs

Important

When bundling your code, make sure to exclude these dependencies (a bundling sketch follows the list below):

Bundled AWS SDK packages
@aws-sdk/client-dynamodb
@aws-sdk/lib-dynamodb
@aws-sdk/client-kms
@aws-sdk/client-lambda
@aws-sdk/client-s3
@aws-sdk/client-secrets-manager
@aws-sdk/client-ses
@aws-sdk/client-sns
@aws-sdk/client-sqs
@aws-sdk/client-sts
@aws-sdk/client-ssm
@aws-sdk/client-cloudwatch-logs
@aws-sdk/client-cloudwatch-events
@aws-sdk/client-eventbridge
@aws-sdk/client-sfn
@aws-sdk/client-xray
@aws-sdk/client-cognito-identity
@aws-sdk/util-dynamodb
@aws-sdk/credential-providers
@smithy
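
If you bundle your handler yourself rather than letting a CDK construct do it, the exclusion can look roughly like this esbuild build script. This is just a sketch, and the entry/output paths are placeholders:

// build.ts - sketch of bundling a handler for LLRT while excluding
// the AWS SDK clients that the runtime already ships.
// Run with: npx tsx build.ts
import { build } from "esbuild";

await build({
  entryPoints: ["lambda/index.ts"], // placeholder path
  outfile: "dist/index.mjs",        // placeholder path
  bundle: true,
  minify: true,
  format: "esm",
  platform: "node",
  // Keep the pre-bundled SDK packages out of your artifact.
  external: ["@aws-sdk/*", "@smithy/*"],
});

The LlrtFunction CDK construct used later in this post takes care of bundling for you; either way, double-check that none of these packages end up in the final artifact.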

QuickJS vs Hermes vs V8 Benchmark

There is no simple way to test the performance of JavaScript engines like V8 (with / without JIT), QuickJS, Hermes and many others, so we need to put them through complex tests with a high number of iterations.

Benchmark Score (The Higher, The Better)

These are the famous benchmarks that have been used to test the most used JavaScript engines:

| Engine | QuickJS | Hermes | V8 --jitless | V8 (JIT) |
| --- | --- | --- | --- | --- |
| Executable size | 620K | 27M | 28M | 28M |
| Richards | 777 | 818 | 1036 | 29745 |
| DeltaBlue | 761 | 651 | 1143 | 65173 |
| Crypto | 1061 | 1090 | 884 | 34215 |
| RayTrace | 915 | 937 | 2989 | 69781 |
| EarleyBoyer | 1417 | 1728 | 4583 | 48254 |
| RegExp | 251 | 335 | 2142 | 7637 |
| Splay | 1641 | 1602 | 4303 | 26150 |
| NavierStokes | 1856 | 1522 | 1377 | 36766 |
| Total score (w/o RegExp) | 1138 | 1127 | 1886 | 41576 |
| Total score | 942 | 968 | 1916 | 33640 |

Thanks to the QuickJS developer for providing the benchmark data.

How to Read This Graph

This is a stacked bar chart:

A Benchmark on the Most Popular JavaScript Runtimes

You can see the long green bars dominating the blue, red and yellow ones. This shows that in each benchmark test, V8 (JIT) consistently beats QuickJS, Hermes and V8 (no JIT) combined.

What This Graph Doesn’t Tell You

So deciding which engine works best for your use case based on this graph alone isn't a great idea.

For example, if you only need to perform simple tasks like making API requests, reading files, parsing JSON and doing basic calculations, you don't need the V8 engine.

NodeJS vs QuickJS Benchmark

Earlier, we saw that V8 outperforms every other JavaScript engine. Now we need to know how QuickJS compares with NodeJS.

I ran tests with the microbench.js file that ships with the QuickJS sources.

QuickJS vs NodeJS

NodeJS (V8) outperforms QuickJS in all the tests, taking far less time.

Note

If your code does heavy JavaScript computation, NodeJS might be a better choice than a QuickJS-based runtime like LLRT.

The benchmark exercised basic JavaScript operations such as property access, array reads and writes, string building and function calls.
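
If you want a feel for what such a micro-benchmark looks like, here is an illustrative sketch (not the actual microbench.js) that times a few of those operations:

// bench-sketch.ts - illustrative micro-benchmark, not the original microbench.js.
// Run with: npx tsx bench-sketch.ts
function time(label: string, iterations: number, fn: () => void): void {
  const start = performance.now();
  for (let i = 0; i < iterations; i++) fn();
  console.log(`${label}: ${(performance.now() - start).toFixed(2)} ms for ${iterations} iterations`);
}

const obj: Record<string, number> = { a: 1, b: 2 };
time("property read/write", 1_000_000, () => {
  obj.a = obj.b + 1;
});

time("array push/pop", 1_000_000, () => {
  const arr: number[] = [];
  arr.push(1, 2, 3);
  arr.pop();
});

time("string building", 100_000, () => {
  let s = "";
  for (let i = 0; i < 10; i++) s += i;
});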

CDK Code for Testing LLRT vs NodeJS

Create a CDK project with npx cdk init app --language typescript.

Tip

Don’t know how to use AWS CDK? Follow this step-by-step article to create a TypeScript CDK project

Once you’ve created a CDK project, update the stack code with the LLRT and NodeJS functions:

import { NodejsFunction, OutputFormat } from "aws-cdk-lib/aws-lambda-nodejs";
import { CfnOutput, Stack, StackProps } from "aws-cdk-lib";
import { PolicyStatement } from "aws-cdk-lib/aws-iam";
import { LlrtFunction } from "cdk-lambda-llrt";
import { Construct } from "constructs";
import {
  Architecture,
  FunctionUrlAuthType,
  Runtime,
} from "aws-cdk-lib/aws-lambda";

export class LambdaLlrtStack extends Stack {
  constructor(scope: Construct, id: string, props?: StackProps) {
    super(scope, id, props);

    // LLRT function, created with the cdk-lambda-llrt community construct.
    const llrtFn = new LlrtFunction(this, "llrt-demo-fn", {
      entry: "lambda/index.ts",
      architecture: Architecture.ARM_64,
    });

    // Standard NodeJS 20 function, built from the exact same handler code.
    const nodejsFn = new NodejsFunction(this, "nodejs-demo-fn", {
      entry: "lambda/index.ts",
      architecture: Architecture.ARM_64,
      runtime: Runtime.NODEJS_20_X,
      bundling: {
        format: OutputFormat.ESM,
        minify: true,
      },
    });

    // Both functions only need permission to list S3 buckets.
    const s3Policy = new PolicyStatement({
      actions: ["s3:ListAllMyBuckets"],
      resources: ["*"],
    });

    llrtFn.addToRolePolicy(s3Policy);
    nodejsFn.addToRolePolicy(s3Policy);

    // Public (unauthenticated) function URLs make both endpoints easy to benchmark.
    const llrtFnUrl = llrtFn.addFunctionUrl({
      authType: FunctionUrlAuthType.NONE,
    });

    const nodejsFnUrl = nodejsFn.addFunctionUrl({
      authType: FunctionUrlAuthType.NONE,
    });

    new CfnOutput(this, "llrt-url", { value: llrtFnUrl.url });
    new CfnOutput(this, "nodejs-url", { value: nodejsFnUrl.url });
  }
}

What does the CDK code do?

The CDK code does the following:

- Creates an LLRT Lambda function (via the cdk-lambda-llrt LlrtFunction construct) and a NodeJS 20 function, both built from the same lambda/index.ts handler on ARM64.
- Grants both functions permission to call s3:ListAllMyBuckets.
- Adds a public function URL to each function.
- Outputs both function URLs so you can call them directly.

Lambda Function Code to List S3 buckets

Update your lambda/index.ts code to list the S3 buckets:

import { APIGatewayProxyHandlerV2 } from "aws-lambda";
import { ListBucketsCommand, S3Client } from "@aws-sdk/client-s3";

// Create the S3 client once, outside the handler, so it is reused across warm invocations.
const s3 = new S3Client({});

export const handler: APIGatewayProxyHandlerV2 = async (event, context) => {
  // List all buckets and return their names as a comma-separated plain-text response.
  const list = await s3.send(new ListBucketsCommand({}));
  // console.log(list.Buckets);
  return {
    body: list.Buckets?.map((buck) => buck.Name).join(", "),
    headers: { "Content-Type": "text/plain" },
  };
};

Deploy the changes with cdk deploy.

You’ll have both the LLRT and NodeJS function URLs in the output:


 ✅  LambdaLlrtStack

✨  Deployment time: 50.24s

Outputs:
LambdaLlrtStack.llrturl = https://zuw7ghxpubmi7ug4hcpwd6hmja0sbwnv.lambda-url.us-east-1.on.aws/
LambdaLlrtStack.nodejsurl = https://i2wubwr5rpf7btbwxymtbyf4l40ewlpp.lambda-url.us-east-1.on.aws/
Stack ARN:
arn:aws:cloudformation:us-east-1:205979422636:stack/LambdaLlrtStack/fcd29450-d3be-11ee-9451-0ea773d37f77

✨  Total time: 53.41s

You can simply visit the function URL to test the response.
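
For a quick check from the terminal instead of the browser, a small script that times a few requests against each URL works too. This is just a sketch, and the URLs are placeholders for the ones in your own deploy output:

// check-urls.ts - time a few requests against each function URL.
// Run with: npx tsx check-urls.ts (replace the placeholder URLs first).
const urls: Record<string, string> = {
  llrt: "https://<llrt-url>.lambda-url.us-east-1.on.aws/",
  nodejs: "https://<nodejs-url>.lambda-url.us-east-1.on.aws/",
};

for (const [name, url] of Object.entries(urls)) {
  for (let i = 1; i <= 3; i++) {
    const start = performance.now();
    const res = await fetch(url);
    await res.text(); // make sure the body is fully read
    console.log(`${name} request ${i}: HTTP ${res.status} in ${(performance.now() - start).toFixed(0)} ms`);
  }
}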

LLRT vs NodeJS Lambda Startup

To see the logs in your terminal, run cdk watch; it will automatically stream the logs from both Lambda functions.

LLRT vs NodeJS init duration log

As you can see in the screenshot above, LLRT (1) has an init duration / startup time of 47.65 ms, while NodeJS (2) has 770.63 ms.

So here is the table for a Lambda cold start:

| Runtime | Init Duration (ms) | Billed Duration (ms) |
| --- | --- | --- |
| LLRT | 47.65 | 130 |
| NodeJS | 770.63 | 1194 |

Who wins the race of cold start time - Node or LLRT?

These are the cold start / init durations:

Node vs LLRT - Who takes more time

LLRT wins! NodeJS takes a lot more time to cold start the same JavaScript code.

| LLRT Time (ms) | Node v20 Time (ms) | How much faster is LLRT? |
| --- | --- | --- |
| 52.38 | 684.11 | 13x |
| 41.81 | 715.21 | 17x |
| 48.39 | 707.27 | 15x |

HTTP Request Latency Benchmark

To test the end-to-end performance, I ran some tests with autocannon.
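
For reference, a programmatic autocannon run with the knobs varied in the tables below (connections and duration) looks roughly like this; the function URL is a placeholder:

// load-test.ts - sketch of an autocannon load test against one function URL.
// Run with: npx tsx load-test.ts
import autocannon from "autocannon";

const result = await autocannon({
  url: "https://<function-url>.lambda-url.us-east-1.on.aws/", // placeholder
  connections: 10, // held constant in the first table below
  duration: 10,    // seconds; held constant in the second table
});

console.log(`average latency: ${result.latency.average} ms`);
console.log(`average req/sec: ${result.requests.average}`);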

How does the API Response time vary with duration?

Keeping connections constant at 10:

| Duration (s) | LLRT (ms) | NodeJS (ms) |
| --- | --- | --- |
| 5 | 3046 | 3707 |
| 10 | 2004 | 2708 |
| 15 | 317 | 784 |
| 20 | 304 | 586 |

AWS Lambda LLRT vs NodeJS Latency duration based

How does the API Response time vary with number of connections?

Keeping the duration constant at 10 s:

| Connections | LLRT (ms) | NodeJS (ms) |
| --- | --- | --- |
| 1 | 7859 | 9855 |
| 10 | 7549 | 9488 |
| 50 | 6412 | 8465 |
| 100 | 2754 | 5456 |

AWS Lambda LLRT vs NodeJS Latency connections based

We can clearly see who the winner is across both connection counts and test durations: it's LLRT, again.

The source code for the CDK project can be found in the LearnAWS LLRT GitHub repo.

You can view AWS Lambda LLRT vs NodeJS Latency Graphs on Sheets as well.

