by skunxicat

Shell Scripts as Lambda Functions

I needed a simple endpoint: fetch data from GitHub’s API, reshape it with jq, return JSON.

The standard approach costs $4.12 per million requests. A shell script costs $0.53.

Here’s what I learned building lambda-shell-endpoint.

The Standard Approach

// Node.js + API Gateway
exports.handler = async (event) => {
  const res = await fetch('https://api.github.com/repos/...');
  const data = await res.json();
  return {
    statusCode: 200,
    body: JSON.stringify(data)
  };
};

Monthly cost (1M requests):

  • Lambda: $0.62
  • API Gateway: $3.50
  • Total: $4.12

Package: 2-5MB
Cold start: 150-300ms

The Shell Approach

#!/bin/bash
set -euo pipefail

run() {
    curl -sS "https://api.github.com/repos/${REPO}/traffic/views" \
        -H "Authorization: token ${GITHUB_TOKEN}" \
    | jq '{
        total_views: .count,
        unique_visitors: .uniques,
        daily: .views | map({date, views: .count})
    }'
}

Monthly cost (1M requests):

  • Lambda: $0.53
  • Function URL: $0.00
  • Total: $0.53

Package: ~50KB
Cold start: ~80ms

No API Gateway needed. Lambda Function URLs are free.

Note: By default, Function URLs are public. For internal APIs, use IAM authentication:

# Terraform configuration
authorization_type = "AWS_IAM"

Clients then need AWS credentials to invoke the function.
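With IAM auth enabled, requests must be signed with SigV4. Recent curl (7.75+) can sign natively; a sketch, using a hypothetical Function URL and the standard AWS credential environment variables:

```shell
#!/bin/bash
set -euo pipefail

# Hypothetical values -- substitute your own Function URL and region.
FUNCTION_URL="${FUNCTION_URL:-https://abc123.lambda-url.us-east-1.on.aws/}"
AWS_REGION="${AWS_REGION:-us-east-1}"

invoke() {
    # curl signs the request with SigV4 itself; credentials come from
    # AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY in the environment.
    curl -sS "$FUNCTION_URL" \
        --user "${AWS_ACCESS_KEY_ID}:${AWS_SECRET_ACCESS_KEY}" \
        --aws-sigv4 "aws:amz:${AWS_REGION}:lambda"
}
```

Older curl builds lack --aws-sigv4; there, a signing proxy or the awscurl tool does the same job.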

How It Works

The architecture is simple:

Lambda Function URL
        ↓
Go Bootstrap (2.3MB, raw TCP)
        ↓
Shell Handler (your code, ~1KB)
        ↓
jq Layer (JSON processing)

The Go bootstrap handles Lambda Runtime API communication. Your shell script handles business logic.
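The contract between the two layers is small: you define a run function in handler.sh, the bootstrap sources that file and calls run once per invocation, and whatever run writes to stdout becomes the response body. A minimal handler sketch under that contract:

```shell
#!/bin/bash
# app/src/handler.sh -- minimal handler sketch.
set -euo pipefail

run() {
    # Whatever this prints on stdout becomes the response body.
    printf '{"message":"hello from shell"}\n'
}
```

That file is the entire deployable unit of business logic.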

When This Makes Sense

Good for:

  • API aggregation
  • Data transformation
  • Observability endpoints
  • Simple JSON responses

Not good for:

  • Heavy computation
  • Persistent state
  • Complex authentication
  • WebSocket connections

Example: Multi-API Aggregation

run() {
    local api1 api2 api3
    
    api1=$(curl -sS --max-time 5 "https://api1.example.com/status")
    api2=$(curl -sS --max-time 5 "https://api2.example.com/metrics")
    api3=$(curl -sS --max-time 5 "https://api3.example.com/health")
    
    jq -n \
        --argjson a1 "$api1" \
        --argjson a2 "$api2" \
        --argjson a3 "$api3" \
        '{services: {api1: $a1, api2: $a2, api3: $a3}}'
}

Three API calls, one JSON response. Total handler size: ~500 bytes.
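The three curl calls above run sequentially, so their latencies add up. Background jobs and wait make them concurrent; a sketch using stub functions in place of the (hypothetical) APIs:

```shell
#!/bin/bash
set -euo pipefail

# Stubs standing in for the real curl calls.
fetch_api1() { sleep 0.1; echo '{"status":"ok"}'; }
fetch_api2() { sleep 0.1; echo '{"cpu":12}'; }
fetch_api3() { sleep 0.1; echo '{"healthy":true}'; }

run() {
    local dir
    dir=$(mktemp -d)
    # Launch all three fetches in parallel, then wait for the slowest one.
    fetch_api1 > "$dir/a1" &
    fetch_api2 > "$dir/a2" &
    fetch_api3 > "$dir/a3" &
    wait
    printf '{"services":{"api1":%s,"api2":%s,"api3":%s}}\n' \
        "$(cat "$dir/a1")" "$(cat "$dir/a2")" "$(cat "$dir/a3")"
    rm -rf "$dir"
}
```

Total latency becomes the slowest call rather than the sum, which matters when Lambda bills by duration.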

Getting Started

git clone https://github.com/ql4b/lambda-shell-endpoint
cd lambda-shell-endpoint

# Build and test locally
make build
make test

# Deploy
make deploy

# Invoke
make invoke

The Makefile handles everything: building the Go bootstrap, packaging layers, local Docker testing, Terraform deployment.

Local Testing

Test without AWS:

make test

This builds a Docker image with Lambda RIE, starts it, runs integration tests, and cleans up.
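RIE exposes the standard local invoke endpoint, so you can also hit the running container by hand. A sketch, assuming the Makefile maps the RIE's conventional port 9000:

```shell
#!/bin/bash
set -euo pipefail

# Invoke a locally running Lambda RIE container with an empty event.
invoke_local() {
    curl -sS -X POST \
        "http://localhost:9000/2015-03-31/functions/function/invocations" \
        -d '{}'
}
```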

Or test the handler directly:

cd app/src
source handler.sh
run | jq

Cost at Scale

10 Million Requests/Month

Approach                   Cost
Shell + Function URL       $5.30
Node.js + API Gateway      $41.20
Python + API Gateway       $42.10

100 Million Requests/Month

Approach                   Cost
Shell + Function URL       $53.00
Node.js + API Gateway      $412.00
Python + API Gateway       $421.00

The gap widens at scale.
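Both columns are just the per-million rates multiplied out; awk makes the linearity explicit:

```shell
#!/bin/bash
set -euo pipefail

# Per-million rates from the 1M-request breakdowns above.
shell_rate=0.53
gateway_rate=4.12

# Scale both rates linearly to 1M, 10M, and 100M requests.
for millions in 1 10 100; do
    awk -v m="$millions" -v s="$shell_rate" -v g="$gateway_rate" \
        'BEGIN { printf "%4dM requests: shell $%.2f vs gateway $%.2f\n", m, m*s, m*g }'
done
```

Neither approach has a fixed-cost component, so the ratio between them stays constant while the absolute gap grows.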

Why It’s Fast

This builds on previous research:

  • Raw TCP bootstrap (50% faster than HTTP clients)
  • Container image packaging (faster for larger runtimes)
  • arm64 architecture (20% cheaper)
  • No framework overhead

Result: 80ms cold starts, minimal cost.

Production Examples

The repository includes working handlers:

GitHub Traffic:

run() {
    curl -sS "https://api.github.com/repos/${REPO}/traffic/views" \
        -H "Authorization: token ${GITHUB_TOKEN}" \
    | jq '{total: .count, unique: .uniques}'
}

Stripe Revenue:

run() {
    curl -sS "https://api.stripe.com/v1/charges?limit=100" \
        -u "${STRIPE_API_KEY}:" \
    | jq '{revenue: ([.data[] | select(.status == "succeeded") | .amount] | add / 100)}'
}

Copy, customize, deploy.

The Tradeoff

You’re trading flexibility for simplicity and cost:

You lose:

  • Framework features
  • Type safety
  • Complex error handling
  • IDE support

You gain:

  • ~87% cost reduction ($4.12 → $0.53, roughly 8x cheaper)
  • 3x faster cold starts
  • 40x smaller packages
  • Simpler deployment

For simple JSON endpoints, this tradeoff often makes sense.

Try It

Repository: github.com/ql4b/lambda-shell-endpoint

The complete project includes:

  • Go bootstrap with raw TCP
  • jq layer for JSON processing
  • Docker-based local testing
  • Terraform infrastructure
  • Production-ready examples
  • Comprehensive documentation

Conclusion

Shell scripts can be production infrastructure. For simple JSON endpoints that aggregate or transform data, they’re often the right tool.

Not because they’re clever. Because they’re sufficient.

$0.53 vs $4.12 per million requests isn’t a trick. It’s what happens when you skip API Gateway and use Function URLs. The shell script is just the delivery mechanism.

Sometimes the simplest solution is the cheapest solution.


Part of the cloudless ecosystem.