umma.dev

AWS: Lambda and Serverless

Creating an AWS Lambda Function

Create and Run Lambda Functions in AWS Console using Python

  • Navigate to Lambda via AWS Management Console
  • Click on Create function
  • Configurations
    • Name: <unique-function-name>
    • Runtime: Python 3.11
  • Change default execution role
    • Go to the permissions tab
    • Navigate to the section where you can edit the execution role
    • Modify the default execution role to Use an existing role
      • An execution role grants the Lambda function permissions to access AWS services and resources
      • E.g. if you want your Lambda function to make API calls to access an S3 bucket, the execution role would need to include policies allowing it to read and write to the specific bucket (a sample policy is sketched after this list)
    • Choose or create a new role
    • Click Create function
  • Click Deploy with the “Hello world” Python example
  • Then click on Test at the top of the page
  • In the Configure test event dialog, give the test event a name (you can use the example JSON given)
    • Click on run
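  • As an illustration of the execution-role example above, a minimal policy like the following (the bucket name is a placeholder) would allow the function to read and write objects in one bucket:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject"],
      "Resource": "arn:aws:s3:::<your-bucket-name>/*"
    }
  ]
}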

General Configuration

  • Navigate to the Configuration tab
  • Ephemeral storage: temporary storage space to which a Lambda function can read and write data
    • By default AWS Lambda allocates 512 MB for a function’s /tmp directory
    • The size of the /tmp directory can be configured to be between 512 MB and 10,240 MB (in 1 MB increments)
  • SnapStart: a performance optimisation feature to improve startup times for applications
  • Memory: refers to the amount of memory available to the Lambda function at runtime
    • Memory, and the CPU power allocated to your function, can be increased or decreased using the Memory (MB) setting
    • To configure the memory for your function, you can set a value between 128 MB and 10240 MB
    • Lambda allocates CPU power in proportion to the amount of memory configured
    • Increasing the memory also increases the CPU power, thus increasing the overall computational power available

Configuring Memory of Your Function

  • Assume we are executing this particular Lambda function:
import json

def lambda_handler(event, context):
    upper_limit = int(event.get('upper_limit', 1000000))
    primes = find_prime(upper_limit)
    return {
        'statusCode': 200,
        'body': json.dumps({'message': f'Found {len(primes)} primes up to {upper_limit}.'})
    }

def find_prime(n):
    sieve = [True] * (n + 1)
    sieve[0:2] = [False, False]
    for current in range(2, int(n**0.5) + 1):
        if sieve[current]:
            sieve[current*current::current] = [False] * len(range(current*current, n+1, current))
    return [i for i, v in enumerate(sieve) if v]
  • Save changes by clicking on Deploy
  • Test the function by clicking the Test button
  • On the function configuration page, within the General configuration pane, click on Edit
  • In the Memory (MB) field, increase the memory from 128 MB to 512 MB
  • Click on Save
  • Head over to the Test tab again and click on the Test button (a sample test event is shown after this list)
    • You should notice the function executes faster than the first time we ran it, as indicated by the shorter duration
  • Timeout: refers to the maximum amount of time that your Lambda function can run before it is stopped by the Lambda service
    • You can set the timeout value between 1 and 900 seconds (15 mins)
    • The default timeout value in Lambda console is 3 seconds
    • This timeout value acts as a safety buffer that prevents functions that never exit from running indefinitely
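  • A sample test event for the prime-counting function above (the upper_limit value is only illustrative; larger values make the memory difference easier to see):
{
  "upper_limit": 5000000
}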

Set the Timeout for an AWS Lambda Function

  • Copy and paste the following code into your Lambda function (this code simulates a function performing actual processing by pausing the execution for 5 seconds - since the default timeout is 3 seconds, the function will return a timeout error)
import json
import time

def lambda_handler(event, context):
    # Simulate processing by pausing execution for 5 seconds
    time.sleep(5)
    return {
        'statusCode': 200,
        'body': json.dumps('Hello from Lambda!')
    }
  • Save the changes by clicking on Deploy, then click Test (you should see a timeout error, since the function runs longer than the 3-second default)
  • In the General configuration pane, click on Edit
  • For the timeout, change the value from 3 seconds to 10 seconds, click on Save and then go to the Code tab to execute the function again

Creating a Node.js Function in AWS Lambda

Create an AWS Lambda Function

  • Create a function with the following config
    • Select blueprint
    • Blueprint Name: <blueprint-name>
    • Function Name: <lambda-function-name>
    • Execution role: select an existing role or create a new one
  • Click on Create Function

Invoke the Lambda Function using the Test Button

  • Once the function is created, you should be redirected to the dashboard

  • Click on Test, give the test a name, and leave the event JSON as default

  • Review the response

  • Event: the input that your Lambda function processes

  • Modify the test data and use a new name for the event

    • Replace the test data with the following:
    {
      "key1": "Boracay",
      "key2": "Palawan",
      "key3": "Manila"
    }
    • Click Save

Modify the Function

  • Navigate to the Code tab or Code source section
  • Modify line 8 to return event.key3 or something similar
  • Click on Deploy
  • Click on Test again
    • The function should now return Manila

Aliases and Versions in AWS Lambda

Creating and Testing an AWS Lambda Function

  • Create a new Lambda function with the following config

    • Select Use a blueprint
    • Blueprint Name: <lambda-blueprint-name>
    • Function Name: <lambda-function-name>
    • Execution role: select an existing role or create a new one
    • Click Create function
    • Click on Test and configure a test event
      • Name: <test-name>
      • Paste the following into the Event JSON field
    {
      "key1": "Version Boracay",
      "key2": "Version Palawan",
      "key3": "Version Manila"
    }
    • Click on Save and then Test
      • The response should come back with Version Boracay

Create a New Version

  • Scroll down the Code source section
    • Locate line 8 and change it to: return event.key2
    • Click on Deploy to apply changes
  • Navigate to the versions tab
    • Click on Publish new version
    • Add an optional description and click Publish
    • In function overview, click on the Test tab and click the Test button
      • Take note of the details in the green box

Create an Alias

  • An alias refers to a particular version of a Lambda function and can be modified to point to different versions

  • Users interact with the function through the alias's Amazon Resource Name (ARN); an example ARN is shown after this list

  • When a new version is deployed, you can adjust the alias to direct traffic to this version, or balance the load between the existing and new versions

  • Navigate to the Configuration tab of the Lambda function’s dashboard and click Create alias

    • Name: <alias-name>
    • Version: 1
    • Click Save
    • Click on Test and observe the function returns the same version as the previous test
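  • For illustration, an alias ARN follows this pattern (the region, account ID and names here are placeholders):
arn:aws:lambda:eu-west-2:123456789012:function:<lambda-function-name>:Dev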

Modify the Function and Create a New Version

  • Go back to the Code source section
  • Click on the Code tab and change the return statement to return event.key3
  • Click Deploy to apply changes
  • Publish the new version
    • Go to the Versions tab
    • Add a description (e.g. version 2) and click Publish
  • Update the alias
    • Scroll up and click on version 2 from the dropdown menu (next to the Copy ARN button)
    • Select Alias: Dev and then click Edit
    • Change the version it points to from 1 to 2
    • Click Save to update the alias
    • Click on Test to observe the version returned

Implement Weighted Traffic Routing

  • Go to the Configuration tab and click Edit
  • Click on the Weighted alias Dropdown
  • Set the traffic weight between the two versions
    • Click Save
    • Click Test to test the weighted routing

Creating an AWS Lambda Function to Return an HTML Page

  • With Lambda, you can execute your code in response to events such as HTTP requests, changes to data in S3, or updates in DynamoDB
  • Function URL: a way to expose a Lambda function to HTTP(S) requests without needing an API Gateway

Create an AWS Lambda Function

  • Go to AWS Lambda
  • Create a function with the following config
    • Choose Author from scratch
    • Function name: HTMLPageLambda
    • Select Node.js 20.x as the runtime
    • Execution role: select a role/create a new one
    • Advanced settings
      • Check the box for Enable function URL
      • Auth type: none
    • Click Create function

Creating the index.html file

  • In the code editor, look at the environment pane, where you have an index.mjs file
  • Right click on HTMLPageLambda folder and create a new file called index.html
  • Add the following code:
<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="UTF-8" />
    <meta name="viewport" content="width=device-width, initial-scale=1.0" />
    <title>Discover the Philippines</title>
    <style>
      body {
        font-family: Arial, sans-serif;
        margin: 20px;
        background-color: #f0f8ff;
        color: #333;
      }
      h1 {
        color: #0073bb;
      }
      p {
        line-height: 1.6;
      }
      .highlight {
        color: #e67e22;
        font-weight: bold;
      }
    </style>
  </head>
  <body>
    <h1>Welcome to the Philippines</h1>
    <p>
      The Philippines is an archipelago comprising more than 7,000 islands,
      known for its rich biodiversity, vibrant culture, and stunning landscapes.
    </p>
    <p>
      <span class="highlight">Manila</span> is the capital city, while
      <span class="highlight">Cebu</span> and
      <span class="highlight">Davao</span> are major urban centers.
    </p>
    <p>
      The country offers a wide range of tourist attractions, from pristine
      beaches to historical landmarks.
    </p>
  </body>
</html>
  • Click Deploy

Modifying the mjs Code

  • Go back to the code editor with the index.mjs file
  • Add the following code:
import * as fs from "node:fs";

// Read the HTML content from the file system
const html = fs.readFileSync("index.html", { encoding: "utf8" });

// Lambda function handler to return the HTML content
export const handler = async () => {
  const response = {
    statusCode: 200,
    headers: {
      "Content-Type": "text/html",
    },
    body: html,
  };
  return response;
};
  • It reads the file you just created and returns it as a response through the body
  • The Content-Type header ensures the browser interprets it as an HTML doc, and the Lambda function's handler returns the body as an HTTP response
  • Click on Deploy

Testing the Lambda Function

  • Copy the Function URL (on the right, under Function overview)
  • Paste the URL in the browser and you should be able to see the HTML page

Using Environment Variables in AWS Lambda

Create a Lambda Function

  • Choose Author from scratch
  • Function Name: <lambda-function-name>
  • Select Python 3.12 as the runtime
  • Execution role
    • Select an existing role or create a new one
  • Click Create function

Adding Environment Variables

  • Go to the Configuration tab of the Lambda function
  • Under Environment variables click Edit
  • Click on the Add environment variable button
  • Add the following new env variables
    • WELCOME_MESSAGE: Hello from Lambda!
    • ENVIRONMENT: Development
  • Click Save
  • Modify the Lambda Code to use the env variables
import os

def lambda_handler(event, context):
    # Retrieve environment variables
    welcome_message = os.getenv('WELCOME_MESSAGE', 'Default Welcome Message')
    environment = os.getenv('ENVIRONMENT', 'development')

    # Return a response that includes the environment variables
    return {
        'statusCode': 200,
        'body': {
            'message': welcome_message,
            'environment': environment
        }
    }
  • Click Deploy to save changes

Test the Lambda Function

  • Click on Test (next to Deploy)
  • Create a config test event
    • Event Name: Test
    • Template (optional): hello-world
      • Leave everything as default
    • Click Save
  • Now click on Test again
  • Go back to the Configuration tab and update the env variables
    • WELCOME_MESSAGE: Hello round 2
    • ENVIRONMENT: England
  • Click Save and retest

Creating an AWS Lambda Function that Interacts with an Amazon DynamoDB Table

Create the DynamoDB Table

  • Table Name: <dynamodb-table-name>
  • Partition key: CityID (string)
  • Sort Key: Date (string)
  • Table settings: select Default settings
  • Click Create table
  • Ensure the table status is Active
  • Add an item into the table (the equivalent boto3 call is sketched after this list)
    • CityID: 001 (string)
    • Date: 2025-03-30 (string)
    • CityName: London (string)
    • Population: 8982000 (number)
    • AverageTemperature: 14 (number)
    • Area: 100.1 (number)
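  • The same insert can also be done programmatically; a minimal boto3 sketch (the table name is a placeholder, and the caller needs dynamodb:PutItem permission):
import boto3
from decimal import Decimal

# Assumes your <dynamodb-table-name>; the resource API needs Decimal for non-integer numbers
dynamodb = boto3.resource('dynamodb')
table = dynamodb.Table('<dynamodb-table-name>')

# Insert the same sample item created in the console
table.put_item(Item={
    'CityID': '001',
    'Date': '2025-03-30',
    'CityName': 'London',
    'Population': 8982000,
    'AverageTemperature': 14,
    'Area': Decimal('100.1')
})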

Create the Lambda Function

  • Choose Author from scratch
  • Function name: CityDataHandler
  • Select Node.js 20.x as the runtime
  • Execution role: create one or choose an existing one
  • Click Create function
  • Paste the following into the code editor
import { DynamoDB } from "@aws-sdk/client-dynamodb";
import { DynamoDBDocument } from "@aws-sdk/lib-dynamodb";

const dynamo = DynamoDBDocument.from(new DynamoDB());

/**
 * This function handles HTTP requests to interact with a DynamoDB table.
 * It supports GET, POST, PUT, and DELETE methods.
 */
export const handler = async (event) => {
  console.log("Received event:", JSON.stringify(event, null, 2));

  let responseBody;
  let statusCode = 200; // Default status code for successful requests
  const headers = {
    "Content-Type": "application/json",
  };

  try {
    // Determine the HTTP method and perform the corresponding action
    switch (event.httpMethod) {
      case "GET":
        // For GET requests, scan the DynamoDB table and return the data
        const scanParams = { TableName: event.queryStringParameters.TableName };
        responseBody = await dynamo.scan(scanParams);
        break;

      case "POST":
        // For POST requests, add a new item to the DynamoDB table
        const postParams = JSON.parse(event.body);
        responseBody = await dynamo.put(postParams);
        break;

      case "PUT":
        // For PUT requests, update an existing item in the DynamoDB table
        const putParams = JSON.parse(event.body);
        responseBody = await dynamo.update(putParams);
        break;

      case "DELETE":
        // For DELETE requests, delete an item from the DynamoDB table
        const deleteParams = JSON.parse(event.body);
        responseBody = await dynamo.delete(deleteParams);
        break;

      default:
        // If an unsupported HTTP method is used, return an error
        throw new Error(`Unsupported method "${event.httpMethod}"`);
    }
  } catch (err) {
    // If an error occurs, return a 400 status code and the error message
    statusCode = 400;
    responseBody = { error: err.message };
  }

  // Convert the response body to a JSON string
  return {
    statusCode: statusCode.toString(), // Convert status code to string
    body: JSON.stringify(responseBody),
    headers,
  };
};
  • Click Deploy to save changes

Test the Lambda Function using Test Events

  • Go to the Test tab and create a new event
  • Event name: GET_Event_Test
  • In the event JSON:
{
  "httpMethod": "GET",
  "queryStringParameters": {
    "TableName": "PhilippinesCities"
  },
  "body": null
}
  • Click Save
  • Click on Test and ensure the test event is successful
  • Create a new test event called POST_Event_Name
  • Paste the following into the event JSON:
{
  "httpMethod": "POST",
  "queryStringParameters": null,
  "body": "{\"TableName\": \"PhilippinesCities\", \"Item\": {\"CityID\": \"004\", \"Date\": \"2024-08-20\", \"CityName\": \"Iloilo City\", \"Population\": 457626, \"AverageTemperature\": 29.0, \"Area\": 113.62}}"
}
  • Click Save and then click Test
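  • The handler also supports PUT and DELETE; for example, a DELETE test event in the same style (the key values assume the item added by the POST test) could look like:
{
  "httpMethod": "DELETE",
  "queryStringParameters": null,
  "body": "{\"TableName\": \"PhilippinesCities\", \"Key\": {\"CityID\": \"004\", \"Date\": \"2024-08-20\"}}"
}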

Using Layers in AWS Lambda Functions

  • A Layer in AWS Lambda is a way to manage code, libraries and dependencies separately from the main Lambda function code
  • Layers allow you to package libraries and other dependencies of the Lambda function
    • This makes it easier to manage and share code across multiple Lambda functions

Create a Simple Lambda Function

  • Choose Author from scratch
  • Function name: <lambda-function-name>
  • Runtime: Python 3.12
  • Execution role: select an existing role or create a new one
  • Click Create function
  • Replace the default code with:
import requests

def lambda_handler(event, context):
    response = requests.get('https://jsonplaceholder.typicode.com/todos/1')
    return response.json()
  • Click Deploy

Testing the Lambda Function without the Required Library

  • Once the function is deployed, click on Test
  • Click on Configure test event
    • Event name: Test
    • Template (optional): hello-world (leave the rest as default)
    • Click Save
  • Click on Test to execute the function

Creating and Adding a Layer with the Requests Library

  • Download or build a zip of the requests library (for a Python layer, the packages must sit inside a top-level python/ directory in the zip)
  • Click on Layers on the left hand side of the AWS Lambda console
  • Click Create Layer
    • Layer Name: <lambda-layer-name>
    • Upload the zip file containing the library
    • Choose Python 3.12 as the runtime
    • Click Create
  • Add the layer to your Lambda function
    • Go to your Lambda function
    • Scroll down to the Code tab and in the Layers section, click Add a layer
    • Choose Custom layers and select the layer you just created and its current version
    • Click Add
  • Test the Lambda function

Processing Amazon CloudWatch Logs with AWS Lambda

  • CloudWatch enables real-time monitoring and logging of applications and infrastructure in AWS

Create Sample CloudWatch Logs Data

  • Create an AWS Lambda function
    • Choose Author from scratch
    • Function name: <Lambda-function-name>
    • Runtime: Python 3.12
    • Execution role: select an existing role/create a new role
    • Go to the code tab and put the following code in
import boto3
import time
import random

logs_client = boto3.client('logs')

def lambda_handler(event, context):
    log_group_name = 'TestLogGroup'
    log_stream_name = 'TestLogStream'

    # Create log group if it doesn't exist
    try:
        logs_client.create_log_group(logGroupName=log_group_name)
    except logs_client.exceptions.ResourceAlreadyExistsException:
        pass  # Log group already exists

    # Create log stream if it doesn't exist
    try:
        logs_client.create_log_stream(logGroupName=log_group_name, logStreamName=log_stream_name)
    except logs_client.exceptions.ResourceAlreadyExistsException:
        pass  # Log stream already exists

    # Sample data for log entries
    http_methods = ['GET', 'POST', 'PUT', 'DELETE']
    request_urls = [
        '/home',
        '/api/user',
        '/login',
        '/products',
        '/checkout',
        '/cart',
        '/search?q=aws',
        '/api/order/123',
        '/api/product/567'
    ]
    user_agents = [
        'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.124 Safari/537.36',
        'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/14.0.3 Safari/605.1.15',
        'Mozilla/5.0 (iPhone; CPU iPhone OS 14_0 like Mac OS X) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/14.0 Mobile/15A5341f Safari/604.1',
        'Mozilla/5.0 (Linux; Android 10; SM-G973F) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/85.0.4183.81 Mobile Safari/537.36'
    ]
    status_codes = [200, 201, 400, 401, 403, 404, 500]

    # Create log events
    log_events = []
    for i in range(20):
        log_event = {
            'timestamp': int(time.time() * 1000),
            'message': (
                f"{random.choice(http_methods)} "
                f"{random.choice(request_urls)} "
                f"{random.choice(status_codes)} "
                f"{random.choice(user_agents)}"
            )
        }
        log_events.append(log_event)

    # Put log events
    logs_client.put_log_events(
        logGroupName=log_group_name,
        logStreamName=log_stream_name,
        logEvents=log_events
    )

    return {
        'statusCode': 200,
        'body': 'Successfully created log events.'
    }
  • Click Deploy and then Test
  • In the Configure test event dialog, create a test event
    • Name: <test-name>
  • Click on Test again
  • After running the Lambda function, search for CloudWatch and navigate to the log group section - locate the TestLogGroup

Create the Log Processing Lambda Function

  • Author from scratch
  • Function name: <lambda-function-name>
  • Select Python 3.12 as the runtime
  • Execution role: create a new one or an existing one
  • Click Create function
  • Paste the following code in:
import json
import gzip
import base64

def lambda_handler(event, context):
    # Decode and decompress CloudWatch Logs data
    decoded_data = base64.b64decode(event['awslogs']['data'])
    decompressed_data = gzip.decompress(decoded_data)
    log_data = json.loads(decompressed_data)

    # Iterate through log events and filter for HTTP 500 status codes
    for log_event in log_data['logEvents']:
        message = log_event['message']
        parts = message.split(' ')
        if len(parts) >= 3 and parts[2] == '500':
            print(f"Detected HTTP 500 request: {message}")

    return f"Successfully processed {len(log_data['logEvents'])} log events."
  • This function processes the logs from CloudWatch Logs by decoding and decompressing the log data, then iterating through each log entry to check for HTTP 500 status codes; if a 500 status code is found, the function logs it to the console
  • Click on Deploy (note that the default hello-world test event will fail here, because the handler expects an awslogs payload; a helper for building a valid test event is sketched below)
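  • Subscription filters deliver the event as {"awslogs": {"data": "<base64 gzip payload>"}}; a small helper like the one below (run it locally; the log lines are made up and only the fields the handler reads are included) can build a valid Event JSON for manual testing:
import base64
import gzip
import json

# Minimal CloudWatch Logs-style payload containing one HTTP 500 entry
payload = {
    "logEvents": [
        {"timestamp": 0, "message": "GET /home 200 Mozilla/5.0"},
        {"timestamp": 0, "message": "POST /checkout 500 Mozilla/5.0"}
    ]
}

# Compress and base64-encode the payload the way CloudWatch Logs does
data = base64.b64encode(gzip.compress(json.dumps(payload).encode("utf-8"))).decode("utf-8")

# Paste the printed JSON into the Event JSON field of a new test event
print(json.dumps({"awslogs": {"data": data}}))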

Create the CloudWatch Logs Subscription Filter

  • Go to CloudWatch and go to Log groups
  • Click on TestLogGroup
  • Navigate to the Subscription Filter tab
  • Click on Create
  • Select Create Lambda subscription filter
  • In the Create Lambda subscription filter page
    • Destination: <lambda-function-previously-created>
    • Configure log format and filters
      • Log format: other
      • Subscription filter pattern: 500
      • Subscription filter name: 500 error filter
      • Test pattern: TestLogStream
      • Click Start streaming

Testing the Log Processing

  • In AWS Lambda, trigger the Lambda by clicking Test, this will create log events
  • Return to functions within Lambda and navigate to the Lambda function
  • Click on the Monitor tab and then View CloudWatch logs

Connecting AWS Lambda to Amazon RDS

  • Amazon RDS provides managed relational database engines such as MySQL and PostgreSQL

Create an Amazon RDS Instance

  • Database creation method: standard create
  • Engine options
    • Type: MySQL
    • Version: leave as default
  • Template: free tier
  • Settings
    • DB Instance Identifier: <db-instance-name>
    • Master username: <username>
    • Credentials management: self-managed
    • Check the box for Auto generate password (for testing/this exercise only)
  • Instance configuration
    • DB Instance Size: db.t3.micro
  • Storage
    • Type: General Purpose SSD (gp2)
    • Allocated storage: 20
  • Connectivity
    • Compute resource: n/a (doesn’t connect to an EC2 compute resource)
    • VPC: leave as default
    • DB subnet group: create a new db subnet group
    • Public access: yes
    • VPC security group (firewall): select Create new
      • New VPC security group name: <vpc-security-group-name>
      • Availability zone: select your preferred zone
  • Leave the rest as default and click Create database

Create the Lambda Function

  • Choose Author from scratch
  • Function name: <lambda-function-name>
  • Runtime: Python 3.12
  • Execution role: create a new IAM role or select an existing one
  • Click Create function

Configure VPC and Security Groups

  • Ensure the Lambda function and RDS instance are both in the same VPC
  • Modify the Lambda VPC settings to include subnets and security groups (to communicate with RDS instance)
    • Click on Configuration tab and then VPC (on the left)
    • Click on Edit
    • Add the following config
      • VPC: keep as default
      • Subnets: choose same subnets where you set the RDS Instance
      • Security groups: choose the one created in the RDS Instance
    • Click Save
  • Ensure the RDS security group allows inbound connections on the database port

Upload the Lambda Function Code

  • Navigate to the Lambda Function Console and select the Code tab
  • Either upload a .zip file of the code or paste it in (a minimal example handler is sketched after this list)
  • Add env variables (go to the Configuration tab)
    • DB_HOST: endpoint for RDS instance
    • DB_USER: db username
    • DB_PASSWORD: db password
    • DB_NAME: MySQL
  • Click Save
  • In the Configuration tab, select General configuration (on the left)
    • Change the Timeout to 1 min and click Save
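  • A minimal sketch of a handler that uses the environment variables above (it assumes the pymysql package is bundled in the deployment .zip or added via a layer, since it is not in the Python runtime by default):
import os
import pymysql  # assumed to be packaged with the function or provided by a layer

def lambda_handler(event, context):
    # Connect to the RDS instance using the environment variables configured above
    connection = pymysql.connect(
        host=os.environ['DB_HOST'],
        user=os.environ['DB_USER'],
        password=os.environ['DB_PASSWORD'],
        database=os.environ['DB_NAME'],
        connect_timeout=10
    )
    try:
        with connection.cursor() as cursor:
            cursor.execute('SELECT VERSION()')
            version = cursor.fetchone()[0]
    finally:
        connection.close()

    return {
        'statusCode': 200,
        'body': f'Connected to MySQL version {version}'
    }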

Set Up Lambda Connect

  • Navigate back to the RDS Instance created earlier
  • Click on Actions (top right) and select Set up Lambda connection
  • Add the Lambda function created earlier
  • Uncheck Connect using RDS Proxy
  • Click Set up

Test the Lambda Function

  • Navigate back to the Code tab in AWS Lambda
  • Click on Test and then Configure test event
    • Event name: <test-event-name>
    • Template (optional): leave as default
    • Click on Save and then Test

Custom Error Handling with AWS Lambda

Create an AWS Lambda Function

  • Author from scratch
  • Function name: <lambda-function-name>
  • Runtime: Python 3.12
  • Execution role: create one or select an existing one
  • Click Create function

Write the Custom Error Handling Code

  • Scroll to the Code source section
  • Replace the default code:
import logging
import json

# Set up logging
logger = logging.getLogger()
logger.setLevel(logging.INFO)

# Custom exception class
class CustomError(Exception):
    def __init__(self, message):
        self.message = message
        super().__init__(self.message)

def lambda_handler(event, context):
    try:
        # Simulating an error scenario
        if 'trigger_error' in event and event['trigger_error']:
            raise CustomError("This is a custom error message for simulation purposes.")

        # Normal processing logic
        return {
            'statusCode': 200,
            'body': json.dumps('Function executed successfully!')
        }
    except CustomError as e:
        logger.error(f"Custom error occurred: {e.message}")
        return {
            'statusCode': 400,
            'body': json.dumps(f"Error: {e.message}")
        }
    except Exception as e:
        logger.error(f"An unexpected error occurred: {str(e)}")
        return {
            'statusCode': 500,
            'body': json.dumps(f"Internal Server Error: {str(e)}")
        }

Test the Lambda Function

  • Click on Test to create a new test event
  • Use the following sample event:
{
  "trigger_error": false
}
  • Click Save and then Test
    • The function should return the response: Function executed successfully!
  • Now simulate an error scenario:
{
  "trigger_error": true
}
  • Click Save and then Test
    • The function should return the response: Error: This is a custom error message for simulation purposes.

View Logs in Amazon CloudWatch

  • Select the Monitoring tab
  • Select View logs in CloudWatch
  • Find the latest log stream for your Lambda function
  • Review the logs to see the messages printed by your function

Invoking Lambda Functions Through Function URL

  • Function URLs allow you to call serverless functions directly via HTTPS endpoints, which can be configured with/without authentication

Creating your Lambda Function

  • Author from scratch
  • Use an existing role/create a new one
  • Click Create function

Enabling Function URL and Writing Function Code

  • Go to the Configuration tab and select Function URL on the left
  • Click Create function URL
  • Select None
  • Click Save
  • In the function code editor, replace the code with the following:
def lambda_handler(event, context):
    return {
        'statusCode': 200,
        'headers': {'Content-Type': 'application/json'},
        'body': '{"message": "Hello from Lambda Function URL!"}'
    }
  • Click Deploy

Invoke Lambda via URL

  • Scroll up to Function overview; on the right there is a heading called Function URL with a link
  • Copy the link and navigate to it from a new tab
  • The response should be: {"message": "Hello from Lambda Function URL!"}
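  • The same URL can also be called programmatically; a small sketch using only the Python standard library (replace the placeholder with your Function URL):
import json
import urllib.request

# Replace with the Function URL copied from the console
url = "https://<url-id>.lambda-url.<region>.on.aws/"

with urllib.request.urlopen(url) as response:
    print(json.loads(response.read()))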

Invoking Lambda Functions through Amazon SQS Queue

Create an Amazon SQS Queue

  • Click on Create queue
  • Choose Standard queue
  • Name of Queue: <queue-name>
  • Keep default settings and click Create Queue

Create an AWS Lambda Function

  • Choose Author from scratch
  • Function name: <lambda-function-name>
  • Runtime: Python 3.12
  • Execution role: create one/select existing one
  • Click Create function
  • Replace the code with the following (a sample SQS-style test event for it is shown after this list):
import json

def lambda_handler(event, context):
    # Print the event received from SQS
    print("Received event from SQS:")
    print(json.dumps(event, separators=(',', ':')))

    # Process each record in the event
    for record in event['Records']:
        body = record['body']
        print(f"Processing message: {body}")

    return {
        'statusCode': 200,
        'body': json.dumps('Messages processed successfully!')
    }
  • Click Deploy
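  • Before wiring up the queue, you can exercise the handler with an SQS-shaped test event; a trimmed-down example (real SQS records carry more attributes, but the handler only reads body):
{
  "Records": [
    {
      "messageId": "00000000-0000-0000-0000-000000000000",
      "eventSource": "aws:sqs",
      "body": "{\"order_id\": 5678, \"customer\": \"Jane Smith\", \"order_type\": \"express\"}"
    }
  ]
}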

Configure the SQS Queue as a Trigger for Lambda

  • Click + Add Trigger in the Function overview in Lambda

  • Configure the following trigger settings

    • Select a trigger: choose SQS from the dropdown menu
    • SQS Queue: choose the queue you created earlier
    • Activate trigger: check box
    • Batch size: set to 2
    • Batch window (optional): set to 3
    • Maximum concurrency (optional): leave empty
    • Filter criteria (optional): click Add and enter a JSON filter to process only messages that meet certain conditions
    {
      "body": {
        "order_type": ["express"]
      }
    }
    • Click Add
  • In the Lambda function's Configuration > Triggers, ensure the SQS trigger's state is Enabled

Test the Setup

  • Send messages to the SQS Queue

    • Go to SQS and click on the queue you created earlier
    • Select Send and receive messages
    • In the message body, enter a message with a specific order_type
    {
      "order_id": 5678,
      "customer": "Jane Smith",
      "order_type": "express"
    }
    • Click Send Message
    • Now send a message with a different order_type
    {
      "order_id": 91011,
      "customer": "Alice Johnson",
      "order_type": "standard"
    }
  • Verify Lambda execution

    • Go to Lambda and click Monitoring
    • Select View logs in CloudWatch and find the latest log stream
    • Review the logs matching the order_type filter
    • Messages with order_type values that don’t meet the filter criteria will not appear in the logs