LLRT?
No clue what it means, or what it does. I felt this way when I saw it for the first time. But you don’t have to feel the same way I did.
I’ve tested, benchmarked and compiled all the details which will help you make the best use of AWS LLRT.
Here is what you need to know about LLRT
LLRT v0.1.0 beta was released in October 2023.
- LLRT stands for Low Latency Runtime
- Lightweight JavaScript runtime written in Rust
- Possible replacement for NodeJS runtimes
- 10x faster startup on average
- 2x cost reduction on AWS Lambda
Warning
For now, LLRT is an experimental package. It is subject to change and intended only for testing purposes.
LLRT compatibility with NodeJS
Internally, LLRT uses QuickJS, a JavaScript engine whose binary is under 1 MB, compared to the NodeJS binary at roughly 160 MB.
You can download the QuickJS binary from its website to try it out.
Spinning up the full NodeJS runtime for every new execution environment is expensive on AWS Lambda, which leads to higher cold start / init times.
As of March 2024, built-in Node libraries like buffer, stream, fetch and fs/promises are fully supported by AWS LLRT, while others like fs, http, tls and child_process are only partially supported.
This is what the AWS team had to say:
The long term goal for LLRT is to become WinterCG compliant. Not every API from Node.js will be supported.
This means that LLRT will not have implementations for all NodeJS APIs, but only the major Node APIs. You can find out if LLRT supports the node libraries you use from LLRT GitHub docs.
How does this impact you?
Many of your favourite NodeJS libraries may not work with LLRT, so check if a library is compatible with LLRT APIs for AWS Lambda.
You can check the compatibility matrix on the LLRT repo.
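Beyond reading the matrix, you can probe at runtime whether a given built-in module loads in the runtime you are on. This is a minimal sketch using a dynamic import check; the module names are just examples, and the compatibility matrix remains the authoritative list:

```typescript
// Probe whether a built-in module can be loaded in the current runtime.
// Module names below are examples; consult the LLRT compatibility matrix
// for the authoritative support list.
async function hasModule(name: string): Promise<boolean> {
  try {
    await import(name);
    return true;
  } catch {
    return false;
  }
}

// Under NodeJS this logs true; under LLRT the result depends on support.
hasModule("fs/promises").then((ok) => console.log("fs/promises:", ok));
```

The same check works unchanged in both runtimes, so you can run it once per candidate module before committing to LLRT.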
Tip
Fascinated to see the benchmarks, but didn’t get to use LLRT yourself? Try a quick LLRT setup using this guide.
AWS SDK v3 Optimization
The AWS team has optimized and bundled many AWS SDK clients into the LLRT runtime itself, so you don't need to bundle them with your JavaScript code.
Supported AWS SDKs
Important
When bundling your code, make sure to exclude these dependencies:
| Bundled AWS SDK packages |
| --- |
| @aws-sdk/client-dynamodb |
| @aws-sdk/lib-dynamodb |
| @aws-sdk/client-kms |
| @aws-sdk/client-lambda |
| @aws-sdk/client-s3 |
| @aws-sdk/client-secrets-manager |
| @aws-sdk/client-ses |
| @aws-sdk/client-sns |
| @aws-sdk/client-sqs |
| @aws-sdk/client-sts |
| @aws-sdk/client-ssm |
| @aws-sdk/client-cloudwatch-logs |
| @aws-sdk/client-cloudwatch-events |
| @aws-sdk/client-eventbridge |
| @aws-sdk/client-sfn |
| @aws-sdk/client-xray |
| @aws-sdk/client-cognito-identity |
| @aws-sdk/util-dynamodb |
| @aws-sdk/credential-providers |
| @smithy |
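Since these clients already ship inside LLRT, your bundler should treat them as externals rather than re-bundle them. This is a sketch of how such an exclusion list might look for an esbuild-style setup; the list shown is an illustrative subset (see the table above for the full set), and the commented build call is not a complete configuration:

```typescript
// Packages already bundled inside LLRT -- exclude them from your own bundle.
// (Illustrative subset; the table above lists the full set.)
const llrtExternals: string[] = [
  "@aws-sdk/client-s3",
  "@aws-sdk/client-dynamodb",
  "@aws-sdk/lib-dynamodb",
  "@smithy/*",
];

// With esbuild you would pass this as the `external` option, roughly:
// await esbuild.build({ entryPoints: ["lambda/index.ts"], bundle: true,
//                       external: llrtExternals, /* ... */ });

console.log(llrtExternals.length); // → 4
```

Constructs like `LlrtFunction` (used later in this article) handle this exclusion for you, so the manual list only matters if you bundle yourself.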
QuickJS vs Hermes vs V8 Benchmark
There is no simple way to test the performance of JavaScript engines like V8 (with / without JIT), QuickJS, Hermes and many others, so we need to put them through complex tests with high iteration counts.
Benchmark Score (The Higher, The Better)
These are the well-known benchmarks that have been used to test the most popular JavaScript engines:
- RegExp: This benchmark is generated by loading 50 of the most popular pages on the web and logging all regexp operations performed.
- Crypto: This benchmark exercises RSA encryption.
- Splay: This benchmark effectively measures how fast the JavaScript engine is at allocating nodes and reclaiming the memory used for old nodes.
- RayTrace: This benchmark measures how fast the JavaScript engine creates light paths based on shapes like spheres and planes.
- DeltaBlue: This benchmark tests the JavaScript engine with a fast algorithm for satisfying dynamically changing constraint hierarchies.
| Engine | QuickJS | Hermes | V8 (jitless) | V8 (JIT) |
| --- | --- | --- | --- | --- |
| Executable size | 620K | 27M | 28M | 28M |
| Richards | 777 | 818 | 1036 | 29745 |
| DeltaBlue | 761 | 651 | 1143 | 65173 |
| Crypto | 1061 | 1090 | 884 | 34215 |
| RayTrace | 915 | 937 | 2989 | 69781 |
| EarleyBoyer | 1417 | 1728 | 4583 | 48254 |
| RegExp | 251 | 335 | 2142 | 7637 |
| Splay | 1641 | 1602 | 4303 | 26150 |
| NavierStokes | 1856 | 1522 | 1377 | 36766 |
| Total score (w/o RegExp) | 1138 | 1127 | 1886 | 41576 |
| Total score | 942 | 968 | 1916 | 33640 |
Thanks to the QuickJS developer for providing the benchmark data.
How to Read This Graph
This is a stacked bar chart:
- Y axis has the algorithms
- X axis denotes the benchmark score of the respective engines
You can see the long green bars dominating the blue, red and yellow ones. In each benchmark, V8 (JIT) consistently wins: its score exceeds the scores of QuickJS, Hermes and V8 (no JIT) combined.
What This Graph Doesn’t Tell You
- NodeJS (V8) weighs about 160 MB, so it is not a very portable JavaScript runtime; V8 is mostly used in Chromium-based browsers
- QuickJS can run even on microcontrollers like the ESP32 thanks to its ~1 MB size, and it now also powers AWS Lambda via LLRT
- Hermes is a portable JavaScript engine used in React Native apps to improve performance and reduce bundle size
So basing your decision on the graph alone is not a great idea; pick the engine that fits your use case.
For example, if you only need to perform simple tasks like making API requests, reading files, parsing JSON and doing basic calculations, you don't need the V8 engine.
NodeJS vs QuickJS Benchmark
Earlier, we saw that V8 outperforms every JavaScript engine. Now we need to know how QuickJS compares with NodeJS.
I ran tests with the microbench.js file. It tests basic JavaScript API operations like:
- Array methods
- String methods
- Date.now()
- Empty for loops, and so on

NodeJS (V8) outperformed QuickJS in all the tests, taking far less time.

Note

If your code does heavy pure-JavaScript computation, NodeJS might be a better choice than a QuickJS-based runtime like LLRT.
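The exact contents of microbench.js are not reproduced here, but the shape of such a microbenchmark is simple: time a tight loop over an operation with Date.now(). A minimal sketch (the iteration counts and operations are illustrative, not the actual benchmark):

```typescript
// Minimal microbenchmark sketch: time N iterations of an operation.
// (Not the actual microbench.js; iteration counts and ops are illustrative.)
function bench(name: string, iterations: number, op: () => void): number {
  const start = Date.now();
  for (let i = 0; i < iterations; i++) op();
  const elapsed = Date.now() - start;
  console.log(`${name}: ${elapsed} ms for ${iterations} iterations`);
  return elapsed;
}

const arr = [1, 2, 3, 4, 5];
bench("Array.map", 100_000, () => arr.map((x) => x * 2));
bench("String.split", 100_000, () => "a,b,c".split(","));
```

Running the same script under both runtimes gives a rough apples-to-apples comparison, though Date.now()'s millisecond resolution makes it coarse for very fast operations.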
CDK Code for Testing LLRT vs NodeJS
Create a CDK project with npx cdk init app --language typescript.
Tip
Don’t know how to use AWS CDK? Follow this step-by-step article to create a TypeScript CDK project
Once you’ve created a CDK project, update the stack code with the LLRT and NodeJS functions:
```typescript
import { NodejsFunction, OutputFormat } from "aws-cdk-lib/aws-lambda-nodejs";
import { CfnOutput, Stack, StackProps } from "aws-cdk-lib";
import { PolicyStatement } from "aws-cdk-lib/aws-iam";
import { LlrtFunction } from "cdk-lambda-llrt";
import { Construct } from "constructs";
import {
  Architecture,
  FunctionUrlAuthType,
  Runtime,
} from "aws-cdk-lib/aws-lambda";

export class LambdaLlrtStack extends Stack {
  constructor(scope: Construct, id: string, props?: StackProps) {
    super(scope, id, props);

    const llrtFn = new LlrtFunction(this, "llrt-demo-fn", {
      entry: "lambda/index.ts",
      architecture: Architecture.ARM_64,
    });

    const nodejsFn = new NodejsFunction(this, "nodejs-demo-fn", {
      entry: "lambda/index.ts",
      architecture: Architecture.ARM_64,
      runtime: Runtime.NODEJS_20_X,
      bundling: {
        format: OutputFormat.ESM,
        minify: true,
      },
    });

    const s3Policy = new PolicyStatement({
      actions: ["s3:ListAllMyBuckets"],
      resources: ["*"],
    });
    llrtFn.addToRolePolicy(s3Policy);
    nodejsFn.addToRolePolicy(s3Policy);

    const llrtFnUrl = llrtFn.addFunctionUrl({
      authType: FunctionUrlAuthType.NONE,
    });
    const nodejsFnUrl = nodejsFn.addFunctionUrl({
      authType: FunctionUrlAuthType.NONE,
    });

    new CfnOutput(this, "llrt-url", { value: llrtFnUrl.url });
    new CfnOutput(this, "nodejs-url", { value: nodejsFnUrl.url });
  }
}
```
What does the CDK code do?
The CDK code does the following:
- Creates an LLRT Lambda function with `arm64` architecture
- Creates a NodeJS Lambda function with the same code
- Creates an S3 policy with the `ListAllMyBuckets` action
- Attaches the S3 policy to both Lambda functions
- Enables function URLs with public access
- Logs the LLRT and NodeJS function URLs
Lambda Function Code to List S3 buckets
Update your `lambda/index.ts` code to list the S3 buckets:
```typescript
import { APIGatewayProxyHandlerV2 } from "aws-lambda";
import { ListBucketsCommand, S3Client } from "@aws-sdk/client-s3";

const s3 = new S3Client({});

export const handler: APIGatewayProxyHandlerV2 = async (event, context) => {
  const list = await s3.send(new ListBucketsCommand({}));
  // console.log(list.Buckets);
  return {
    body: list.Buckets?.map((bucket) => bucket.Name).join(", "),
    headers: { "Content-Type": "text/plain" },
  };
};
```
Deploy the changes with `cdk deploy`.
You’ll have both LLRT and NodeJS function URL in the output:
```
✅ LambdaLlrtStack

✨ Deployment time: 50.24s

Outputs:
LambdaLlrtStack.llrturl = https://zuw7ghxpubmi7ug4hcpwd6hmja0sbwnv.lambda-url.us-east-1.on.aws/
LambdaLlrtStack.nodejsurl = https://i2wubwr5rpf7btbwxymtbyf4l40ewlpp.lambda-url.us-east-1.on.aws/
Stack ARN:
arn:aws:cloudformation:us-east-1:205979422636:stack/LambdaLlrtStack/fcd29450-d3be-11ee-9451-0ea773d37f77

✨ Total time: 53.41s
```
You can simply visit the function URL to test the response.
LLRT vs NodeJS Lambda Startup
To see the logs in your terminal, run `cdk watch` and it will automatically show you the logs from both Lambda functions.
As you can see in the screenshot above, LLRT (1) has an init duration / startup time of 47.65 ms, while NodeJS (2) takes 770.63 ms.
So here is the table for a Lambda getting cold start:
| Runtime | Init Duration (ms) | Billed Duration (ms) |
| --- | --- | --- |
| LLRT | 47.65 | 130 |
| NodeJS | 770.63 | 1194 |
Who wins the race of cold start time - Node or LLRT?
These are the cold start / init durations: LLRT wins! NodeJS takes a lot more time to cold start the same JavaScript code.
| LLRT Time (ms) | Node v20 Time (ms) | How much faster is LLRT than NodeJS? |
| --- | --- | --- |
| 52.38 | 684.11 | 13x |
| 41.81 | 715.21 | 17x |
| 48.39 | 707.27 | 15x |
HTTP Request Latency Benchmark
To test the end-to-end performance, I ran some tests with autocannon.
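For reference, the runs below were shaped roughly like this. The options object is a sketch of the knobs each table varies; the URL is a placeholder for the function URL from the CDK outputs, and the commented call assumes the autocannon npm package:

```typescript
// Options for an autocannon load test (sketch; the URL is a placeholder
// for the deployed function URL from the CDK outputs).
const loadTestOptions = {
  url: "https://example.lambda-url.us-east-1.on.aws/",
  connections: 10, // held constant while varying duration (first table)
  duration: 10, // seconds; held constant while varying connections (second table)
};

// Then, with the autocannon package installed:
//   const result = await autocannon(loadTestOptions);
// and inspect the latency stats in the result.

console.log(loadTestOptions);
```

Each table below fixes one of the two knobs and sweeps the other.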
How does the API Response time vary with duration?
Keeping connections constant at 10:

| Duration (s) | LLRT (ms) | NodeJS (ms) |
| --- | --- | --- |
| 5 | 3046 | 3707 |
| 10 | 2004 | 2708 |
| 15 | 317 | 784 |
| 20 | 304 | 586 |
How does the API Response time vary with number of connections?
Keeping duration constant at 10s:

| Connections | LLRT (ms) | NodeJS (ms) |
| --- | --- | --- |
| 1 | 7859 | 9855 |
| 10 | 7549 | 9488 |
| 50 | 6412 | 8465 |
| 100 | 2754 | 5456 |
Across both the connection and duration sweeps, the winner is clear: it’s LLRT, again.
The source code for the CDK project can be found in the LearnAWS LLRT GitHub repo.
You can view AWS Lambda LLRT vs NodeJS Latency Graphs on Sheets as well.