by QL4B Team

Your Shell Script, Running in the Cloud

The foundational mechanics of turning local code into remote execution


Here’s what just happened: you wrote a shell function on your laptop, packaged it into a zip file, uploaded it to AWS, and now it’s running in the cloud. No containers, no runtimes, no frameworks. Just your code, executing remotely.

This is the foundational pattern that makes cloudless computing possible. Let’s break down exactly how it works.

The Mechanics: Package, Upload, Execute

Every cloud function follows the same pattern:

  1. Package - Bundle your code into a deployable artifact
  2. Upload - Send that artifact to the cloud provider
  3. Execute - The cloud runs your code when triggered

But here’s what’s less well known: you don’t have to adapt your code to the cloud. You can shape the cloud environment to fit the code you already have.

The Code: Just Shell Functions

This is your entire cloud function:

api_handler() {
    local event="$1"
    local name=$(whoami)  # This runs in the cloud
    
    echo '{
        "statusCode": 200,
        "body": "Hello from '"$name"' in the cloud!"
    }'
}

That $(whoami) command? It executes in AWS Lambda. The local variable? It’s allocated in cloud memory. The echo output? It becomes the HTTP response.

Your shell script doesn’t know it’s distributed. It just runs, wherever it happens to be.
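One caveat worth noting: the handler above splices $name into the JSON by closing and reopening quotes, which breaks if the command output ever contains a double quote. A minimal sketch of a safer approach (the json_escape and json_response helpers are ours, not part of the module):

```shell
# Hypothetical helpers - not part of the module - for building JSON safely
json_escape() {
    local s="$1"
    s="${s//\\/\\\\}"   # escape backslashes first
    s="${s//\"/\\\"}"   # then escape double quotes
    printf '%s' "$s"
}

json_response() {
    printf '{"statusCode": 200, "body": "%s"}' "$(json_escape "$1")"
}

json_response "Hello from $(whoami) in the cloud!"
```

Same output shape as before for well-behaved strings, but a stray quote in the payload no longer produces invalid JSON.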

The Runtime: What Makes This Possible

Behind the scenes, three pieces do the work:

1. The Bootstrap

#!/bin/bash
set -euo pipefail
while true; do
  HEADERS="$(mktemp)"
  # The invocation/next response carries the request id in a header
  EVENT_DATA=$(curl -sS -LD "$HEADERS" "http://${AWS_LAMBDA_RUNTIME_API}/2018-06-01/runtime/invocation/next")
  REQUEST_ID=$(grep -Fi Lambda-Runtime-Aws-Request-Id "$HEADERS" | tr -d '[:space:]' | cut -d: -f2)
  RESPONSE=$(bash "${HANDLER}" "$EVENT_DATA")
  curl -sS -X POST "http://${AWS_LAMBDA_RUNTIME_API}/2018-06-01/runtime/invocation/$REQUEST_ID/response" -d "$RESPONSE"
done

This loop runs in the cloud, fetching events and executing your handler. Your function never sees the Lambda Runtime API - it just gets called with data.
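The loop shown here only covers the success path. The Runtime API also exposes an /error endpoint for reporting failures; here is a hedged sketch of the payload it expects (the endpoint path comes from AWS’s custom-runtime docs; the helper name is hypothetical):

```shell
# Hypothetical helper: build the error payload the Runtime API expects
make_error_payload() {
    local message="$1" type="$2"
    printf '{"errorMessage": "%s", "errorType": "%s"}' "$message" "$type"
}

# In the bootstrap loop, a failed handler call would be reported like this (sketch):
#   curl -sS -X POST \
#     "http://${AWS_LAMBDA_RUNTIME_API}/2018-06-01/runtime/invocation/$REQUEST_ID/error" \
#     -d "$(make_error_payload 'handler exited non-zero' 'HandlerError')"

make_error_payload 'handler exited non-zero' 'HandlerError'
# prints {"errorMessage": "handler exited non-zero", "errorType": "HandlerError"}
```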

2. The Package

zip -r function.zip bootstrap handler.sh

Your code becomes a zip file. That’s it. No build steps, no compilation, no dependency resolution.

3. The Upload

aws lambda update-function-code --function-name $FUNCTION_NAME --zip-file fileb://function.zip

AWS receives your zip and extracts it to /var/task/ inside a container. The zip format preserves Unix permission bits, so the bootstrap must be marked executable (chmod +x bootstrap) before packaging. When the first request arrives, Lambda runs ./bootstrap in that directory. Your code is now live, waiting in a loop for events to process.

The Workflow: Local to Cloud in Minutes

1. Write Your Function

# handler.sh
api_handler() {
    local event="$1"
    local timestamp=$(date)
    local hostname=$(hostname)
    
    echo '{
        "message": "Running on '"$hostname"' at '"$timestamp"'",
        "event": '"$event"'
    }'
}

2. Package and Deploy

# This happens automatically via Terraform
zip -r function.zip bootstrap handler.sh
aws lambda update-function-code --function-name $FUNCTION_NAME --zip-file fileb://function.zip

3. Execute in the Cloud

# Your local command
make invoke

# Triggers this in AWS Lambda:
# api_handler '{"test": "data"}'
# Returns: {"message": "Running on ip-10-0-1-23 at Wed Jan 15 14:30:22 UTC 2025", "event": {"test": "data"}}

The hostname in the response? That’s the actual AWS Lambda container. Your shell script is running on Amazon’s infrastructure, but it doesn’t know or care.

The Breakthrough: Location Independence

This is the foundational insight: your code doesn’t need to know where it runs.

# This function works identically on your laptop and in AWS Lambda
process_data() {
    local input="$1"
    echo "$input" | jq '.items[] | select(.active == true)' | wc -l
}

The jq command runs wherever the function runs. The pipe works the same way. The variable scoping is identical. Your shell script is location-agnostic.

This breaks the traditional model where cloud functions are fundamentally different from local scripts. Here, they’re the same thing - just executed in different places.

No Runtime Lock-in

Because it’s just shell code, you can:

  • Run it locally for testing: source handler.sh && api_handler '{"test": "data"}'
  • Deploy it to AWS Lambda via our module
  • Run it in any container with bash installed
  • Execute it on any Unix system

The code is portable. The execution environment adapts to the code, not vice versa.
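That portability is easy to demonstrate: the bootstrap’s fetch-and-call loop can be mimicked locally by reading events from stdin instead of the Runtime API. A minimal sketch (the inline api_handler stands in for sourcing handler.sh):

```shell
# Stand-in for handler.sh (normally you would `source handler.sh` instead)
api_handler() {
    printf '{"echoed": %s}\n' "$1"
}

# Local event loop: one JSON event per line on stdin, same call shape as the bootstrap
printf '%s\n' '{"n": 1}' '{"n": 2}' | while IFS= read -r event; do
    api_handler "$event"
done
# prints {"echoed": {"n": 1}}
#        {"echoed": {"n": 2}}
```

Swap stdin for the Runtime API and this is, structurally, the bootstrap.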

What This Enables

Once you understand this pattern, everything changes:

Your Terminal Commands Become APIs

# Local command
aws s3 ls s3://my-bucket | grep '\.csv$' | wc -l

# Same command, as a cloud function
count_csv_files() {
    aws s3 ls s3://my-bucket | grep '\.csv$' | wc -l
}
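The same counting logic can be exercised without AWS credentials by pointing it at a local directory instead of a bucket (a sketch; the mktemp fixture is only for illustration, and the anchored '\.csv$' pattern avoids matching names that merely contain "csv"):

```shell
# Local stand-in for the S3 listing; '\.csv$' anchors the match to the extension
count_csv_files_local() {
    ls "$1" | grep -c '\.csv$'
}

# Fixture: a temp directory with two .csv files and one .txt
dir=$(mktemp -d)
touch "$dir/a.csv" "$dir/b.csv" "$dir/notes.txt"
count_csv_files_local "$dir"   # prints 2
```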

Data Processing Pipelines

# This runs in the cloud, processing files at scale
process_logs() {
    local s3_path="$1"
    aws s3 cp "$s3_path" - | \
        grep "ERROR" | \
        awk '{print $1, $4}' | \
        sort | uniq -c
}
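Because the body of process_logs is an ordinary pipeline, it can be dry-run locally by feeding it sample lines instead of an S3 download (the log format below is hypothetical):

```shell
# Same pipeline as process_logs, fed from stdin instead of `aws s3 cp`
summarize_errors() {
    grep "ERROR" | awk '{print $1, $4}' | sort | uniq -c
}

# Hypothetical log lines: date, time, service, level, detail
printf '%s\n' \
    '2025-01-15 10:00:01 web ERROR timeout' \
    '2025-01-15 10:00:05 web INFO ok' \
    '2025-01-15 10:01:12 web ERROR timeout' | summarize_errors
# prints "2 2025-01-15 ERROR" (uniq -c left-pads the count)
```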

Infrastructure Automation

# Triggered by CloudWatch events
scale_cluster() {
    local cpu_usage="$1"
    if [ "$cpu_usage" -gt 80 ]; then
        aws ecs update-service --desired-count 5 --service my-service
    fi
}
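The threshold check in scale_cluster is plain shell, so it can be unit-tested by swapping the AWS call for an echo. One caveat: [ -gt ] only compares integers, so a metric like 80.5 needs its fractional part stripped first. A hypothetical dry-run variant:

```shell
# Dry-run variant: same threshold logic, prints the decision instead of calling AWS
scale_cluster_dry() {
    local cpu_usage="${1%%.*}"   # strip any fractional part: "80.5" -> "80"
    if [ "$cpu_usage" -gt 80 ]; then
        echo "scale-up"
    else
        echo "no-op"
    fi
}

scale_cluster_dry 85    # prints scale-up
scale_cluster_dry 80.5  # prints no-op ("80.5" truncates to 80)
```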

The power isn’t in the framework - it’s in the fact that any shell command can become a cloud service.

The Bigger Picture: Code-First Infrastructure

This is more than a deployment tool - it’s a new way of thinking about cloud computing.

Instead of:

  1. Design your infrastructure
  2. Choose your runtime
  3. Adapt your code to fit

You:

  1. Write your code
  2. Infrastructure adapts automatically
  3. Deploy anywhere

Your shell script doesn’t care if it’s running on your laptop, in a Lambda function, or in a Kubernetes pod. The execution environment becomes an implementation detail.

This is the foundation of cloudless computing: infrastructure that adapts to code, not code that adapts to infrastructure.

Try It: From Local Script to Cloud Function

# 1. Write a function (this works locally)
echo 'api_handler() { echo "{\"hostname\": \"$(hostname)\", \"user\": \"$(whoami)\"}"; }' > handler.sh

# 2. Test locally
source handler.sh && api_handler '{}'
# Output: {"hostname": "your-laptop", "user": "your-username"}

# 3. Deploy to cloud (using our module)
tf apply
export FUNCTION_NAME=$(tf output -raw lambda.function_name)
make deploy

# 4. Test in cloud
make invoke
# Output: {"hostname": "ip-10-0-1-23", "user": "sbx_user1051"}

Same function. Same code. Different execution environment. The shell script doesn’t know the difference.

The Foundation

This pattern - package, upload, execute - is the foundation of serverless computing. But most platforms hide it behind layers of abstraction.

By exposing the raw mechanics and making them simple, we enable a new kind of development where:

  • Local scripts become cloud services
  • Infrastructure becomes invisible
  • Deployment becomes trivial
  • Code becomes portable

This is just the beginning. Once you understand that any code can run anywhere, the possibilities are endless.


The terraform-aws-lambda-function module demonstrates these foundational mechanics. It’s available on GitHub as part of the cloudless ecosystem - where infrastructure adapts to code, not the other way around.