☁️ AWS

AWS Lambda: Serverless Functions, Triggers, and Limits

Lambda is the core of AWS serverless. Here's what the exam tests — execution model, cold starts, and integration patterns.

Examifyr·2026·5 min read

Lambda fundamentals

Lambda runs code in response to events, without managing servers. You pay only for execution time, billed per millisecond.

# Key characteristics:
# - Event-driven: triggered by AWS services or HTTP
# - Stateless: no persistent state between invocations
# - Auto-scaling: runs many instances in parallel
# - Max timeout: 15 minutes
# - Max memory: 10,240 MB
# - Max deployment package: 250 MB (unzipped)
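The 15-minute timeout matters in practice: long-running handlers should check how much time is left and stop cleanly. A minimal sketch below uses the real context method `get_remaining_time_in_millis`; the `FakeContext` class and the batch-processing logic are illustrative stand-ins for local testing, not AWS APIs.

```python
def handler(event, context):
    """Process items until the invocation's time budget is nearly spent."""
    processed = []
    for item in event.get("items", []):
        # Stop early if fewer than ~10 seconds of the budget remain
        if context.get_remaining_time_in_millis() < 10_000:
            break
        processed.append(item.upper())
    return {"processed": processed}


class FakeContext:
    """Local stand-in for the Lambda context object (illustrative only)."""
    def __init__(self, remaining_ms):
        self._remaining_ms = remaining_ms

    def get_remaining_time_in_millis(self):
        return self._remaining_ms
```

With plenty of time left, `handler({"items": ["a", "b"]}, FakeContext(60_000))` processes everything; with only 5 seconds remaining it returns early with an empty list.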

# Execution environment:
# - AWS manages underlying compute
# - You provide code + runtime (Node.js, Python, Java, etc.)
# - Execution role grants Lambda access to other AWS services

# Basic Lambda function (Python):
def handler(event, context):
    print("Event:", event)
    return {
        "statusCode": 200,
        "body": "Hello from Lambda!"
    }
Note: Lambda scales automatically — each invocation can run in its own execution environment. This means no shared state between invocations.

Triggers and event sources

Lambda can be triggered by dozens of AWS services.

# Common triggers:
# API Gateway    → HTTP REST/WebSocket API
# S3             → Object created/deleted events
# DynamoDB       → Stream records (inserts/updates/deletes)
# SQS            → Message queue processing
# SNS            → Pub/sub notifications
# EventBridge    → Scheduled events (cron), AWS events
# Kinesis        → Real-time stream processing
# CloudWatch     → Logs and alarms
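For the S3 trigger above, the event Lambda receives follows the S3 notification format: a `Records` list, each record carrying bucket and object details. A minimal handler sketch that extracts them (the event shape is standard; the handler itself is illustrative):

```python
def handler(event, context):
    """Extract (bucket, key) pairs from an S3 object-created event."""
    objects = []
    for record in event.get("Records", []):
        s3 = record["s3"]
        objects.append((s3["bucket"]["name"], s3["object"]["key"]))
    return objects


# A trimmed-down sample S3 notification event for local testing:
sample_event = {
    "Records": [
        {"s3": {"bucket": {"name": "my-bucket"},
                "object": {"key": "photos/cat.jpg"}}}
    ]
}
```

Calling `handler(sample_event, None)` yields `[("my-bucket", "photos/cat.jpg")]`.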

# Synchronous vs Asynchronous invocation:
# Synchronous (API Gateway, SDK direct call):
#   - Caller waits for response
#   - Errors returned to caller

# Asynchronous (S3, SNS, EventBridge):
#   - Lambda queues the event
#   - Caller doesn't wait
#   - Retried twice on error; events that still fail can go to a
#     Dead Letter Queue (DLQ) or failure destination, if configured
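The async retry-then-DLQ behaviour can be sketched as a toy simulation in plain Python (this is a local model of the behaviour, not an AWS API; `invoke_async` and the list-backed `dlq` are illustrative):

```python
def invoke_async(handler, event, dlq, max_retries=2):
    """Toy model: retry a failed async invocation twice, then DLQ the event."""
    attempts = 0
    while True:
        try:
            return handler(event)       # success: caller never sees retries
        except Exception:
            attempts += 1
            if attempts > max_retries:
                dlq.append(event)       # exhausted retries: event lands in DLQ
                return None


def always_fails(event):
    raise RuntimeError("downstream unavailable")
```

Running `invoke_async(always_fails, {"id": 1}, dlq=[])` attempts the handler three times in total (initial call plus two retries), then parks the event in the DLQ and returns `None`.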

Cold starts and performance

A cold start happens when Lambda creates a new execution environment. It adds latency to the first invocation.

# Cold start sequence:
# 1. Download code package
# 2. Start execution environment (container)
# 3. Initialise runtime (Node/Python/etc.)
# 4. Run handler function

# Warm invocation:
# - Execution environment already running
# - Only runs handler function (fast)
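You can observe the cold/warm distinction directly with a module-level flag: code outside the handler runs once per execution environment, so the flag flips after the first invocation. A small illustrative sketch:

```python
# Module-level init code runs once per execution environment (cold start).
_cold = True

def handler(event, context):
    """Report whether this invocation hit a cold or warm environment."""
    global _cold
    was_cold = _cold
    _cold = False           # subsequent calls in this environment are warm
    return {"cold_start": was_cold}
```

The first call returns `{"cold_start": True}`; every later call in the same environment returns `{"cold_start": False}`. In real deployments each new concurrent environment starts cold, so several invocations can report a cold start.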

# Mitigating cold starts:
# - Use Provisioned Concurrency (keep instances warm)
# - Keep deployment packages small
# - Use lighter runtimes (Node/Python vs Java)
# - Move initialisation outside handler

# Move DB connections outside handler:
import boto3
db = boto3.resource('dynamodb')     # runs once on cold start
table = db.Table('my-table')        # reused on warm invocations

def handler(event, context):
    result = table.get_item(Key={'id': '1'})  # reuses the warm connection
    return result.get('Item')
Note: Initialisation code outside the handler function runs on cold start and is reused across warm invocations — a key performance optimisation.

Exam tip

The most common Lambda exam question: "What is a cold start?" — it's the overhead of provisioning a new execution environment for the first invocation. Provisioned Concurrency eliminates cold starts by keeping environments warm, at extra cost.

