Serverless Bun vs Node: Benchmarking on AWS Lambda

Mitchell Kossoris
8 min read · Sep 11, 2023

Since the inception of server-side JavaScript environments, NodeJS has reigned supreme as the go-to runtime. Node, along with its package manager, NPM, is now used extensively for everything from small hobbyist side projects to enterprise-grade, high-traffic systems. Most developers haven’t dared challenge Node, given its default status in the industry and its comprehensive open-source ecosystem. Deno was the first major contender, but it never quite caught on, likely due to its weak interoperability with Node, open-source NPM packages, and CommonJS.

A few days ago, on September 8, 2023, a new challenger, Bun, officially announced a stable release of its runtime and accompanying tooling, including an NPM-compatible package manager, a bundler, built-in APIs, and more. Bun promises a whole host of benefits, mostly around performance and developer experience, while boasting strong interoperability with the existing ecosystem. In other words, Bun aims to be a drop-in replacement for Node that improves both the experience of developing JS applications and their performance.

As someone who works primarily on serverless function-based applications, my immediate question was: this all sounds great, but can I actually use it? I wondered especially because AWS has optimized Lambda for NodeJS, so how would Bun fare with no official support? To find out, I devised a set of benchmarks.

The Tests

Generic benchmarks have limitations when comparing runtimes and languages, and they cannot substitute for testing your specific use cases. That said, I tried to focus on a few key areas I thought would be of interest to developers running JavaScript in a serverless function environment. I came up with the following three benchmark tests, covering theoretical, real-world, and serverless-specific aspects:

General processing performance

Bun claims it can process logic at 3–4x the speed of NodeJS. While many applications may be more I/O-bound (database queries, network calls, file system reads/writes, etc), this type of general improvement, if accurate, helps every system, and especially those with CPU-bound and memory-bound workloads. As such, I wanted to see if Bun’s pure performance claims would translate to a serverless environment.

Test: Generate and then sort (with built-in Array.sort) 100K random numbers 10 times back-to-back.
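
The article doesn't include the handler source, so here's a minimal sketch of what this test might look like (the structure and names are my own, not the author's exact code):

```typescript
import type { APIGatewayProxyEvent, APIGatewayProxyResult } from "aws-lambda";

// Generate and sort 100K random numbers, 10 times back-to-back.
export const handler = async (
  _event: APIGatewayProxyEvent
): Promise<APIGatewayProxyResult> => {
  for (let i = 0; i < 10; i++) {
    const numbers = Array.from({ length: 100_000 }, () => Math.random());
    numbers.sort((a, b) => a - b); // built-in Array.sort with a numeric comparator
  }
  return { statusCode: 200, body: "done" };
};
```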

CRUD API

Testing CPU-heavy workloads is interesting, but it likely inflates the real-world benefits of Bun: in reality, Lambda-like environments are very frequently used for much simpler applications that are more I/O-bound (network, file system, etc). A very common pattern is to combine API Gateway, Lambda, and DynamoDB into simple CRUD APIs with minimal computation. I wanted to test these more realistic scenarios to see whether Bun still provides a benefit in the absence of heavy CPU-bound logic.

Test: Implement a CRUD Update function that validates input, retrieves an object from DynamoDB, shallow merges the requested modifications into the existing object, and puts it back in DynamoDB.
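
As a rough illustration, a handler for this test might look like the following sketch, using AWS SDK v3's DynamoDB Document Client. The table name, key, and validation details are my own assumptions, not the author's exact code:

```typescript
import { DynamoDBClient } from "@aws-sdk/client-dynamodb";
import {
  DynamoDBDocumentClient,
  GetCommand,
  PutCommand,
} from "@aws-sdk/lib-dynamodb";
import type { APIGatewayProxyEvent, APIGatewayProxyResult } from "aws-lambda";

const ddb = DynamoDBDocumentClient.from(new DynamoDBClient({}));
const TABLE_NAME = "Items"; // hypothetical table, keyed on "id"

export const handler = async (
  event: APIGatewayProxyEvent
): Promise<APIGatewayProxyResult> => {
  // Validate input.
  const id = event.pathParameters?.id;
  let updates: Record<string, unknown>;
  try {
    updates = JSON.parse(event.body ?? "");
  } catch {
    return { statusCode: 400, body: "Invalid JSON body" };
  }
  if (!id || typeof updates !== "object" || updates === null) {
    return { statusCode: 400, body: "Invalid input" };
  }

  // Retrieve the existing object from DynamoDB.
  const { Item } = await ddb.send(
    new GetCommand({ TableName: TABLE_NAME, Key: { id } })
  );
  if (!Item) {
    return { statusCode: 404, body: "Not found" };
  }

  // Shallow merge the requested modifications and put the result back.
  const merged = { ...Item, ...updates, id };
  await ddb.send(new PutCommand({ TableName: TABLE_NAME, Item: merged }));

  return { statusCode: 200, body: JSON.stringify(merged) };
};
```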

Cold start times

Serverless functions tend to suffer from what’s known as the “cold-start” issue. AWS Lambda functions run in a containerized environment, with each container typically lasting 10–30 minutes. When a request arrives and no warm container is available, Lambda spins up a new one, and that spin-up time is the cold start we’re referring to. Officially supported runtimes like NodeJS are typically optimized to spin up quickly, so the major question is: how will Bun, an unoptimized runtime, compare?

Test: Hello world function with intentionally induced cold starts.
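
The handler itself is trivial; a sketch is below. The article doesn't say how the cold starts were induced; redeploying the function or updating its configuration between invocations is one common way to force a fresh container:

```typescript
import type { APIGatewayProxyEvent, APIGatewayProxyResult } from "aws-lambda";

// Hello-world handler, used purely to measure container spin-up overhead.
export const handler = async (
  _event: APIGatewayProxyEvent
): Promise<APIGatewayProxyResult> => {
  return { statusCode: 200, body: "Hello, world!" };
};
```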

Environment Configuration

The following configuration was used for these tests:

  • REST API Gateway in front of each Lambda function (response times measured at API Gateway to capture the full end-to-end scope)
  • 1024MB memory for Lambda Functions
  • x86_64 architecture on Amazon Linux 2 for Bun
  • Base runtime configuration on x86_64 architecture for Node.js 18.x
  • Provisioned concurrency of 5 (except for the cold-start test)

General Processing Test Results

This test was run 1,000 times for each runtime, with response time data gathered server-side and collected in CloudWatch and X-Ray.
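
For reference, driving the load for a test like this can be as simple as the following sketch (the endpoint URL is a placeholder; since timings are gathered server-side, the client only needs to generate requests):

```typescript
// Fire N sequential requests at the API Gateway endpoint.
const ENDPOINT =
  "https://example.execute-api.us-east-1.amazonaws.com/prod/compute"; // placeholder

async function run(iterations: number): Promise<void> {
  for (let i = 0; i < iterations; i++) {
    const res = await fetch(ENDPOINT);
    if (!res.ok) console.error(`Request ${i} failed with status ${res.status}`);
  }
}

run(1000).catch(console.error);
```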

API Gateway response time stats for Node vs Bun compute test, from CloudWatch.

Node

  • Median Response Time: 3736ms
  • Min Response Time: 3391ms
  • Max Response Time: 4580ms
  • p95 Response Time: 3989ms
  • p99 Response Time: 4404ms

Response time distribution for Node runtime compute test, from X-Ray.

Bun

  • Median Response Time: 1836ms (-50.9%)
  • Min Response Time: 1564ms (-53.9%)
  • Max Response Time: 3571ms (-22.0%)
  • p95 Response Time: 2027ms (-49.2%)
  • p99 Response Time: 2117ms (-51.9%)

Response time distribution for Bun runtime compute test, from X-Ray.

CRUD API Test Results

This test was run 1,000 times for each runtime, with response time data gathered server-side and collected in CloudWatch and X-Ray.

API Gateway response time stats for Node vs Bun simple CRUD API test, from CloudWatch.

Node

  • Median Response Time: 24ms
  • Min Response Time: 17ms
  • Max Response Time: 135ms
  • p95 Response Time: 29ms
  • p99 Response Time: 51ms

Response time distribution for Node runtime simple CRUD API test, from X-Ray.

Bun

  • Median Response Time: 25ms (+4.2%)
  • Min Response Time: 16ms (-5.9%)
  • Max Response Time: 157ms (+16.3%)
  • p95 Response Time: 33ms (+13.8%)
  • p99 Response Time: 44ms (-13.7%)

Response time distribution for Bun runtime simple CRUD API test, from X-Ray.

Cold Start Times

This test was run only 10 times for each runtime, as there is no great way to automate forced cold starts in a load test. Note that the results here reflect the full response time, not just the initialization times.

Lambda execution time stats for Node vs Bun cold start test, from CloudWatch.

Node

  • Median Response Time: 302ms
  • Min Response Time: 252ms
  • Max Response Time: 361ms
  • p95 Response Time: 312ms
  • p99 Response Time: 361ms

Response time distribution for Node runtime cold start test, from X-Ray.

Bun

  • Median Response Time: 775ms (+156%)
  • Min Response Time: 717ms (+184%)
  • Max Response Time: 1382ms (+282%)
  • p95 Response Time: 1174ms (+276%)
  • p99 Response Time: 1382ms (+282%)

Response time distribution for Bun runtime cold start test, from X-Ray.

Interpreting the Results

To my surprise, despite Node being specifically optimized for Lambda and the official Bun Lambda Layer being under-tested (and therefore not optimized), Bun came out ahead in measurable and significant ways for CPU-bound tasks. It was also neck and neck for simpler, I/O-bound tasks, meaning that, at a bare minimum, you’re unlikely to see a performance downgrade from using Bun. If AWS Lambda were to add Bun as an official runtime, I think we’d see Bun perform at least slightly better than Node, if not significantly better. And if you’re running your logic on frameworks and libraries like Express, Nest, Apollo, or NextJS, I wouldn’t be surprised if you saw consistently faster performance (this is something I’d like to test next).

Is it ready for production? I wouldn’t jump to that conclusion just yet: I ran into some rough patches along the way with the Bun Lambda Layer, and it isn’t particularly battle-tested at this point. There are also still some NodeJS APIs that Bun doesn’t yet implement, and Bun itself is very new, meaning we don’t know all of its quirks yet. However, these are solvable problems, and these tests show that, with some effort, Bun has real potential as a legitimate runtime for production serverless application development.

Combating cold-start issues

Obviously, the cold start problem is something to be aware of, but I wouldn’t call it a deal breaker, as there are two popular solutions that make it nearly a non-issue:

  1. AWS’s official solution of provisioned concurrency, which keeps a configured number of containers warm at all times for a cost. Depending on your application’s load, this can be very reasonable or more expensive than it’s worth.
  2. A custom warming solution that pings your Lambda function every so often to ensure a number of containers are always available (see the sketch after this list). This is a good fit for lower-throughput applications, as it’s cheap and keeps a container ready whenever you need it.
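
As a hedged illustration of the second approach (mine, not the article's): schedule an EventBridge rule to invoke the function every few minutes with a marker payload, and have the handler return early when it sees that marker. Note that a single ping keeps only one container warm; keeping N containers warm requires N concurrent pings.

```typescript
import type { APIGatewayProxyEvent, APIGatewayProxyResult } from "aws-lambda";

// An EventBridge schedule invokes the function with { "warmer": true };
// real traffic arrives as normal API Gateway events.
export const handler = async (
  event: APIGatewayProxyEvent & { warmer?: boolean }
): Promise<APIGatewayProxyResult> => {
  if (event.warmer) {
    // Warming ping: exit immediately so the invocation stays cheap.
    return { statusCode: 200, body: "warmed" };
  }
  // ... normal request handling ...
  return { statusCode: 200, body: "ok" };
};
```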

Performance Benefits

In common application use cases, simply swapping out NodeJS for Bun may make a slight improvement to latency, giving your applications a snappier feel. The more your application does in the compute itself, the more benefit you’ll see, so it will really depend on your specific use cases. That said, paired with a cold-start strategy, it seems like Bun will at least be on par with Node while offering all of its other development experience benefits.

Cost Benefits

Reducing the execution time of Lambda functions has a direct, positive impact on the cost of running your application, as Lambda bills per millisecond of use. If your application does a lot of CPU-bound work, you could see massive cost savings by switching to Bun. If you’re just running simple functions that call databases or other APIs and don’t do much with the data in the Lambda function itself, you probably won’t see a huge difference, and you’ll also have to pay for whatever cold start strategy you choose.
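
To make that concrete, here's a back-of-the-envelope calculation from the compute test medians, assuming the common x86 Lambda price of $0.0000166667 per GB-second and the 1024MB configuration used in these tests (pricing varies by region):

```typescript
// Cost per 1M invocations = duration (s) × memory (GB) × price per GB-second × 1M
const PRICE_PER_GB_SECOND = 0.0000166667; // assumed x86 price; varies by region
const MEMORY_GB = 1; // 1024MB

const costPerMillion = (durationMs: number) =>
  (durationMs / 1000) * MEMORY_GB * PRICE_PER_GB_SECOND * 1_000_000;

console.log(costPerMillion(3736).toFixed(2)); // Node median: ~$62.27
console.log(costPerMillion(1836).toFixed(2)); // Bun median:  ~$30.60
```

That's roughly a 50% compute-cost reduction for this CPU-bound workload; for the CRUD test, the medians were nearly identical, so the savings there would be negligible.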

One other caveat to be aware of: with the built-in Node runtime, you are not billed for initialization (cold start) time. Because Bun runs as a custom runtime, you are charged for container initialization, so on top of combating cold starts for performance reasons, you would likely want to reduce them for cost reasons as well.

Final Thoughts

I was highly skeptical about Bun in general and especially in terms of potential for using it in a serverless application workflow. I was pleasantly surprised to have that skepticism quashed by these results. While I won’t be personally using Bun for Lambda development just yet due to its newness, I will be keeping a close eye on it as its ecosystem matures and as tooling for Bun on Lambda is hardened and optimized further.

As a side note, this article did not discuss the development experience of Bun, but the tl;dr is that it was a refreshing experience, with extremely fast package installation and build times and effectively native support for testing Lambda functions locally, something that is otherwise more painful and requires additional tooling such as AWS SAM. With more maturity, I think Bun has a real chance of overtaking Node as the industry default runtime for JavaScript and TypeScript development, and I’ll personally be rooting for it. Also, AWS Lambda, if you happen to read this article, please consider supporting Bun as a native runtime.
