From Bash Script to Production API: The Complete Guide
You have a bash script. You need it exposed as a production API. What do you do?
Most people don’t really know what AWS Lambda is. They think “serverless” means “no servers” or “magic scaling.”
Lambda is simpler: run code when an HTTP request comes in, return an HTTP response. That’s it.
Here’s how to turn any bash script into a serverless API:
Problem: Function → API
You have: A bash function that works locally.
Your first API is embarrassingly simple:
#!/bin/bash
data=$(curl -sS "https://httpbin.org/ip")
body=$(printf '%s' "$data" | base64 -w0)
printf '{"statusCode":200,"body":"%s"}' "$body"
That’s it. Three lines of actual logic. This script:
- Calls an external API
- Base64 encodes the response so it embeds safely in the JSON envelope
- Returns proper JSON with a status code
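You can sanity-check the envelope locally before deploying. A sketch, assuming GNU coreutils (`base64 -w0`) and `jq` on your PATH, with a canned payload standing in for the curl call:

```shell
# Stand-in for the curl output
data='{"origin":"1.2.3.4"}'

# Build the same Lambda-style envelope the script emits
body=$(printf '%s' "$data" | base64 -w0)
response=$(printf '{"statusCode":200,"body":"%s"}' "$body")

# Decode it back the way a client would
printf '%s' "$response" | jq -r '.body' | base64 -d
# → {"origin":"1.2.3.4"}
```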
Deploy it with the cloudless foundation:
# Bootstrap infrastructure
curl -sL https://raw.githubusercontent.com/ql4b/cloudless-infra/main/bootstrap | bash
curl -sL https://raw.githubusercontent.com/ql4b/cloudless-app/main/bootstrap | bash
# Configure and deploy
vim .env
./tf apply && cd app && npm run deploy
Mark the function as private to require an API key:
# app/sls/functions.yml
events:
  - http:
      private: true
      path: /ip
      method: get
      cors: true
Redeploy and test:
npm run deploy
API_KEY=$(cd .. && tf output -json api | jq -r '.api_keys.staging.value')
curl -sS --header "X-Api-Key: $API_KEY" https://YOUR_API.amazonaws.com/staging/ip
Result: Production API with staging/prod environments, API keys, usage plans, and monitoring. From bash script to global API in 5 minutes.
Works for: Simple data fetching, basic transformations, standard Unix tools.
Problem: Missing Tools
Need: Your script needs jq for JSON processing. Lambda doesn’t have it.
#!/bin/bash
data=$(curl -sS "https://httpbin.org/ip")
ip=$(echo "$data" | jq -r '.origin')
location=$(curl -sS "http://ip-api.com/json/$ip" | jq -r '.city')
jq -cn --arg ip "$ip" --arg location "$location" '{statusCode: 200, body: ({ip: $ip, location: $location} | tojson)}'
But jq isn’t available in Lambda. Solution: add a layer from lambda-shell-layers.
# infra/main.tf
module "jq_layer" {
  source  = "git@github.com:ql4b/lambda-shell-layers.git//jq"
  context = module.label.context
}
This creates the layer and stores its ARN in SSM for serverless reference:
# serverless.yml
functions:
  api:
    layers:
      - ${ssm:/${env:NAMESPACE}/${env:NAME}/layers/jq}
    environment:
      PATH: "/opt/bin:${env:PATH}"
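Layers unpack under /opt, which is why PATH gets /opt/bin prepended. Inside the handler you can fail fast if the layer didn’t attach — a sketch, not part of the original script:

```shell
# Return a 500 instead of a cryptic crash if the layer binary is missing
if ! command -v jq >/dev/null 2>&1; then
  printf '{"statusCode":500,"body":"jq not found on PATH"}'
  exit 0
fi
```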
Available layers: jq, qrencode, htmlq, imagemagick, pandoc, sqlite, yq
Build your own:
git clone https://github.com/ql4b/lambda-shell-layers.git
cd lambda-shell-layers/jq && ./build.sh
Result: Same bash script. Same deployment. Just added the tool you needed.
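A fringe benefit of having jq in the handler: it can also build the response, which sidesteps the quoting pitfalls of splicing variables into a JSON string by hand. A sketch, with a value chosen to break naive interpolation:

```shell
# A value that breaks hand-rolled '{"city":"'$city'"}' interpolation
city='San José "Centro"'

# jq escapes it correctly; tojson serializes the inner object into
# the string that the Lambda proxy format expects in .body
jq -cn --arg city "$city" \
  '{statusCode: 200, body: ({city: $city} | tojson)}'
```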
Works for: 1-2 additional tools. JSON processing, HTTP clients, simple databases.
Problem: Multiple Tools
Need: Your script now generates QR codes, resizes images, processes data.
#!/bin/bash
data=$(curl -sS "https://httpbin.org/ip" | jq -r '.origin')
# Generate QR code
echo "$data" | qrencode -o /tmp/qr.png
# Resize to thumbnail
convert /tmp/qr.png -resize 100x100 /tmp/thumb.png
# Base64 encode for response
image_data=$(base64 -w0 /tmp/thumb.png)
echo '{"statusCode":200,"isBase64Encoded":true,"body":"'"$image_data"'","headers":{"Content-Type":"image/png"}}'
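For API Gateway to deliver that PNG as bytes rather than mangled text, two things are needed: the response sets `isBase64Encoded: true`, and the gateway is told which media types are binary. With Serverless Framework that is a provider-level setting — a sketch to adapt to your serverless.yml:

```yaml
# serverless.yml (assumed addition)
provider:
  apiGateway:
    binaryMediaTypes:
      - 'image/png'
```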
Stack the layers you need:
module "jq_layer" {
  source  = "git@github.com:ql4b/lambda-shell-layers.git//jq"
  context = module.label.context
}

module "qrencode_layer" {
  source  = "git@github.com:ql4b/lambda-shell-layers.git//qrencode"
  context = module.label.context
}

module "imagemagick_layer" {
  source  = "git@github.com:ql4b/lambda-shell-layers.git//imagemagick"
  context = module.label.context
}
functions:
  api:
    layers:
      - ${ssm:/${env:NAMESPACE}/${env:NAME}/layers/jq}
      - ${ssm:/${env:NAMESPACE}/${env:NAME}/layers/qrencode}
      - ${ssm:/${env:NAMESPACE}/${env:NAME}/layers/imagemagick}
Works for: Multiple specialized tools that exist as layers. Image processing, document generation.
Problem: Complex Requirements
Need: Custom tools, conflicting dependencies, or everything in one container.
# Dockerfile
FROM ghcr.io/ql4b/lambda-shell-runtime:full
RUN apt-get update && apt-get install -y pandoc texlive-latex-base
COPY src/ .
Your script can now do anything:
#!/bin/bash
# Convert markdown to PDF via LaTeX
echo "$1" | jq -r '.body' | pandoc -f markdown -o /tmp/output.pdf
pdf_data=$(base64 -w0 /tmp/output.pdf)
echo '{"statusCode":200,"isBase64Encoded":true,"body":"'"$pdf_data"'","headers":{"Content-Type":"application/pdf"}}'
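As with the image endpoint, splicing the base64 payload into a string is fragile, and API Gateway only decodes it when `isBase64Encoded` is set. Since this image ships jq anyway, it can assemble the envelope — a sketch with a stand-in payload in place of the real pandoc output:

```shell
# Stand-in for $(base64 -w0 /tmp/output.pdf); decodes to "%PDF-1.4\n"
pdf_data='JVBERi0xLjQK'

jq -cn --arg body "$pdf_data" '{
  statusCode: 200,
  isBase64Encoded: true,
  body: $body,
  headers: {"Content-Type": "application/pdf"}
}'
```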
Add runtime infrastructure:
module "runtime" {
  source     = "git@github.com:ql4b/terraform-aws-lambda-runtime.git"
  attributes = ["custom"]
  context    = module.label.context
}
# serverless.yml
functions:
  api:
    image:
      uri: ${ssm:/${env:NAMESPACE}/${env:NAME}/lambda-runtime-custom/image}
Build and deploy:
REPO_URL=$(tf output -json runtime | jq -r '.repository_url')
docker build -t $REPO_URL:latest .
docker push $REPO_URL:latest
npm run deploy
Result: Unlimited flexibility. Any tool, any dependency, any complexity.
Tradeoff: Larger images, longer cold starts. But complete control.
The Pattern
Each solution maintains the same infrastructure:
- Same API Gateway with usage plans and API keys
- Same deployment: npm run deploy
- Same bash scripts: logic doesn’t change
- Same debugging: npm run logs:tail
You’re not rewriting. You’re not learning frameworks. You’re adding tools when you need them.
Examples in Production
- Basic APIs: router web interface to REST API
- Tool stacking: QR generators with image processing
- Custom runtimes: PDF generation with LaTeX
The Philosophy
Progressive enhancement for infrastructure. Start simple. Add complexity only when value demands it. Never throw away what works.
The cloud should feel like your laptop. Your tools should work the same locally and in production.
That’s cloudless: bash scripts become APIs, complexity grows naturally, infrastructure gets out of your way.