Why I Rewrote My Node.js API in Bash
How choosing the “wrong” tool gave me 2x performance and 90% less memory usage
The Problem
I needed to build a proxy API for Ryanair’s availability endpoints. Simple enough: take incoming requests, forward them to Ryanair, handle rate limiting, return the response.
My brain immediately went to the default: Node.js + Serverless Framework. It’s what I knew. It’s what everyone uses for APIs. It’s the “right” tool.
The Default Choice
// What I built first
const axios = require('axios');

const ryanairUrl = 'https://api.ryanair.com/availability';

exports.handler = async (event) => {
  const response = await axios.get(ryanairUrl, {
    params: event.queryStringParameters
  });
  return {
    statusCode: 200,
    body: JSON.stringify(response.data)
  };
};
Standard stuff. Works fine. Ships fast. Problem solved, right?
The Realization
After running this in production for a while, I started noticing the actual work being done:
- HTTP request to Ryanair
- JSON parsing of the response
- Rate limiting logic
- HTTP response back to client
That’s it. No complex business logic. No database calls. No framework features.
Just curl + jq + some basic logic.
The “Wrong” Tool
#!/bin/bash
# What I rewrote it as

ryanair_proxy() {
  local params="$1"

  # Make the request
  local response
  response=$(curl -s "https://api.ryanair.com/availability" \
    -G -d "$params" \
    --max-time 10)

  # Parse and filter the response
  echo "$response" | jq '{
    flights: .trips[0].dates[0].flights | map({
      price: .regularFare.fares[0].amount,
      departure: .time[0],
      arrival: .time[1]
    })
  }'
}
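The problem statement also called for rate limiting, which the function above leaves out. The article never specifies an algorithm, so here is a minimal fixed-window sketch, with the limit and window length as assumed values:

```shell
#!/bin/bash
# Hypothetical fixed-window rate limiter; the article says only
# "rate limiting logic", so the algorithm and numbers are assumptions.
RATE_LIMIT=5        # max requests per window (assumed value)
WINDOW_SECONDS=1    # window length in seconds (assumed value)
window_start=0
count=0

allow_request() {
  local now
  now=$(date +%s)
  if (( now - window_start >= WINDOW_SECONDS )); then
    window_start=$now   # start a new window
    count=0
  fi
  if (( count < RATE_LIMIT )); then
    count=$((count + 1))
    return 0            # allowed: go ahead and call ryanair_proxy
  fi
  return 1              # throttled: caller should respond with HTTP 429
}
```

A caller would gate the proxy with `allow_request || { echo '{"error":"rate limited"}'; exit 1; }` before invoking `ryanair_proxy`.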
The Results
Memory Usage:
- Node.js: ~128MB baseline + dependencies
- Bash: ~15MB total
Cold Start Time:
- Node.js: ~800ms (JS engine + module loading)
- Bash: ~200ms (process spawn + tool loading)
Execution Time:
- Node.js: ~300ms (HTTP + JSON parsing + response)
- Bash: ~150ms (curl + jq + output)
Package Size:
- Node.js: ~50MB (runtime + node_modules)
- Bash: ~5MB (static binaries)
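The spawn component of the cold-start gap is easy to sanity-check on your own machine. This is a rough local illustration, not a Lambda benchmark (real cold starts also include sandbox provisioning and code download), and it assumes GNU `date` for nanosecond resolution:

```shell
#!/bin/bash
# Rough local timing of interpreter spawn cost (illustrative only;
# assumes GNU date, which supports %N nanoseconds).
spawn_ms() {
  local start end
  start=$(date +%s%N)
  "$@" >/dev/null 2>&1
  end=$(date +%s%N)
  echo $(( (end - start) / 1000000 ))
}

bash_ms=$(spawn_ms bash -c ':')
echo "bash spawn: ${bash_ms}ms"

# Node's startup includes V8 initialization and module resolution,
# so it is typically much slower (only run if node is installed):
if command -v node >/dev/null 2>&1; then
  echo "node spawn: $(spawn_ms node -e '')ms"
fi
```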
Why This Happened
Node.js overhead for this use case:
- V8 engine initialization
- Module resolution and loading
- JSON parsing in JavaScript
- HTTP client abstraction layers
- Garbage collection during execution
Bash directness:
- Direct process spawning and system calls
- JSON processing in a compiled tool (jq)
- Negligible runtime initialization
- No garbage collector pausing execution
The Broader Pattern
This isn’t about Node.js being bad. It’s about tool appropriateness.
When Node.js makes sense:
- Complex business logic
- Database interactions
- Real-time features
- Rich ecosystem needs
When bash makes sense:
- HTTP proxying
- Data transformation
- File processing
- System integration
The Right Tool Fallacy
We choose tools based on familiarity rather than fitness.
The fallacy: “I know Node.js, therefore Node.js is the right tool.”
The reality: The right tool is the one that solves the problem with the least complexity.
Making Better Choices
Ask these questions:
- What is the actual work being done?
- What’s the simplest tool that can do this work?
- Am I adding complexity for complexity’s sake?
For my Ryanair proxy:
- HTTP request + JSON parsing + response
- curl + jq + basic shell logic
- Yes - Node.js was complexity for familiarity’s sake
The Infrastructure Angle
This is why I built lambda-shell-runtime.
The problem: AWS Lambda assumes you want a “real” programming language.
The reality: Sometimes you just need curl + jq to run at scale.
The solution: Make bash a first-class Lambda runtime option.
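For context, AWS already exposes the hook that makes this possible: a custom runtime is just a `bootstrap` executable that polls Lambda's documented runtime HTTP API. Below is a hypothetical sketch of what such a bash bootstrap looks like; the `handle_event` function and its echo behavior are illustrative, and lambda-shell-runtime's actual interface may differ:

```shell
#!/bin/bash
# Hypothetical bash Lambda bootstrap using AWS's documented
# custom-runtime HTTP API. handle_event here is a stand-in.
set -u

handle_event() {
  # A real handler would run curl + jq against the upstream API;
  # this one just wraps the incoming event.
  printf '{"echo": %s}' "$1"
}

# The event loop only makes sense inside Lambda, where the runtime
# API endpoint is provided via AWS_LAMBDA_RUNTIME_API.
if [ -n "${AWS_LAMBDA_RUNTIME_API:-}" ]; then
  API="http://${AWS_LAMBDA_RUNTIME_API}/2018-06-01/runtime"
  while true; do
    headers=$(mktemp)
    # Block until the next invocation arrives.
    event=$(curl -s -D "$headers" "$API/invocation/next")
    request_id=$(awk -F': ' 'tolower($1)=="lambda-runtime-aws-request-id" {print $2}' "$headers" | tr -d '\r')
    # Post the handler's output back as the invocation response.
    handle_event "$event" | curl -s -X POST --data-binary @- \
      "$API/invocation/$request_id/response" > /dev/null
    rm -f "$headers"
  done
fi
```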
Conclusion
The right tool isn’t always the familiar tool.
Sometimes the “wrong” choice - bash for an API, curl instead of an HTTP library, jq instead of JSON parsing code - is actually the right choice.
The infrastructure should get out of your way and let you solve problems with appropriate complexity.
Not every problem needs a framework. Some problems just need curl.
This is part of the cloudless philosophy - infrastructure that gets out of your way.