In this tutorial, you create and configure a Lambda function that resizes images added to an Amazon Simple Storage Service (Amazon S3) bucket. When you add an image file to your bucket, Amazon S3 invokes your Lambda function. The function then creates a thumbnail version of the image and outputs it to a different Amazon S3 bucket.

To complete this tutorial, you carry out the following steps:
- Create source and destination Amazon S3 buckets and upload a sample image.
- Create a Lambda function that resizes an image and outputs a thumbnail to an Amazon S3 bucket.
- Configure a Lambda trigger that invokes your function when objects are uploaded to your source bucket.
- Test your function, first with a dummy event, and then by uploading an image to your source bucket.
By completing these steps, you’ll learn how to use Lambda to carry out a file processing task on objects added to an Amazon S3 bucket. You can complete this tutorial using the AWS Command Line Interface (AWS CLI) or the AWS Management Console.
If you're looking for a simpler example to learn how to configure an Amazon S3 trigger for Lambda, you can try Tutorial: Using an Amazon S3 trigger to invoke a Lambda function.
Prerequisites
If you want to use the AWS CLI to complete the tutorial, install the latest version by following the steps at Installing or updating the latest version of the AWS CLI.
For your Lambda function code, you can use Python or Node.js. Install the language support tools and a package manager for the language that you want to use.
The tutorial requires a command line terminal or shell to run commands. On Linux and macOS, use your preferred shell and package manager.
Note
On Windows, some Bash CLI commands that you commonly use with Lambda (such as `zip`) are not supported by the operating system's built-in terminals. To get a Windows-integrated version of Ubuntu and Bash, install the Windows Subsystem for Linux.
Create two Amazon S3 buckets

First create two Amazon S3 buckets. The first bucket is the source bucket you will upload your images to. The second bucket is used by Lambda to save the resized thumbnail when you invoke your function.
To create the Amazon S3 buckets (console)
1. Open the Amazon S3 console and select the General purpose buckets page.
2. Select the AWS Region closest to your geographical location. You can change your Region using the drop-down list at the top of the screen. Later in the tutorial, you must create your Lambda function in the same Region.
3. Choose Create bucket.
4. Under General configuration, do the following:
   - For Bucket type, ensure General purpose is selected.
   - For Bucket name, enter a globally unique name that meets the Amazon S3 Bucket naming rules. Bucket names can contain only lowercase letters, numbers, dots (.), and hyphens (-).
5. Leave all other options set to their default values and choose Create bucket.
6. Repeat steps 1 to 5 to create your destination bucket. For Bucket name, enter `amzn-s3-demo-source-bucket-resized`, where `amzn-s3-demo-source-bucket` is the name of the source bucket you just created.
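The character and length rules quoted above can be sketched as a quick local check before you create your buckets. This is a minimal illustration only: the regex below covers just the rules mentioned in this step (length and allowed characters), not the full Amazon S3 naming specification, and the helper names are invented for this example.

```javascript
// Check a candidate bucket name against the basic rules quoted above:
// 3-63 characters, only lowercase letters, numbers, dots, and hyphens,
// beginning and ending with a letter or number.
function isValidBucketName(name) {
  return /^[a-z0-9][a-z0-9.-]{1,61}[a-z0-9]$/.test(name);
}

// Derive the destination bucket name the way this tutorial does:
// append "-resized" to the source bucket name.
function destinationBucketName(sourceBucket) {
  return sourceBucket + "-resized";
}

console.log(isValidBucketName("amzn-s3-demo-source-bucket")); // true
console.log(destinationBucketName("amzn-s3-demo-source-bucket"));
```

Remember that a name passing this sketch can still be rejected if another AWS account already owns a bucket with that name, since bucket names must be globally unique.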
Upload a test image to your source bucket

Later in the tutorial, you’ll test your Lambda function by invoking it using the AWS CLI or the Lambda console. To confirm that your function is operating correctly, your source bucket needs to contain a test image. This image can be any JPG or PNG file you choose.
To upload a test image to your source bucket (console)
1. Open the Buckets page of the Amazon S3 console.
2. Select the source bucket you created in the previous step.
3. Choose Upload.
4. Choose Add files and use the file selector to choose the object you want to upload.
5. Choose Open, then choose Upload.
Create a permissions policy

The first step in creating your Lambda function is to create a permissions policy. This policy gives your function the permissions it needs to access other AWS resources. For this tutorial, the policy gives Lambda read and write permissions for Amazon S3 buckets and allows it to write to Amazon CloudWatch Logs.
To create the policy (console)
1. Open the Policies page of the AWS Identity and Access Management (IAM) console.
2. Choose Create policy.
3. Choose the JSON tab, and then paste the following custom policy into the JSON editor.

   ```json
   {
     "Version": "2012-10-17",
     "Statement": [
       {
         "Effect": "Allow",
         "Action": [
           "logs:PutLogEvents",
           "logs:CreateLogGroup",
           "logs:CreateLogStream"
         ],
         "Resource": "arn:aws:logs:*:*:*"
       },
       {
         "Effect": "Allow",
         "Action": [
           "s3:GetObject"
         ],
         "Resource": "arn:aws:s3:::*/*"
       },
       {
         "Effect": "Allow",
         "Action": [
           "s3:PutObject"
         ],
         "Resource": "arn:aws:s3:::*/*"
       }
     ]
   }
   ```

4. Choose Next.
5. Under Policy details, for Policy name, enter `LambdaS3Policy`.
6. Choose Create policy.
Create an execution role

An execution role is an IAM role that grants a Lambda function permission to access AWS services and resources. To give your function read and write access to an Amazon S3 bucket, you attach the permissions policy you created in the previous step.
To create an execution role and attach your permissions policy (console)
1. Open the Roles page of the IAM console.
2. Choose Create role.
3. For Trusted entity type, select AWS service, and for Use case, select Lambda.
4. Choose Next.
5. Add the permissions policy you created in the previous step by doing the following:
   - In the policy search box, enter `LambdaS3Policy`.
   - In the search results, select the check box for `LambdaS3Policy`.
   - Choose Next.
6. Under Role details, for Role name, enter `LambdaS3Role`.
7. Choose Create role.
Create the function deployment package

To create your function, you create a deployment package containing your function code and its dependencies. For this `CreateThumbnail` function, your function code uses a separate library for the image resizing. Follow the instructions for your chosen language to create a deployment package containing the required library.
To create the deployment package (Node.js)
1. Create a directory named `lambda-s3` for your function code and dependencies and navigate into it.

   ```
   mkdir lambda-s3
   cd lambda-s3
   ```

2. Create a new Node.js project with npm. To accept the default options provided in the interactive experience, press Enter.

   ```
   npm init
   ```

3. Save the following function code in a file named `index.mjs`. Make sure to replace `us-east-1` with the AWS Region in which you created your own source and destination buckets.

   ```javascript
   // dependencies
   import { S3Client, GetObjectCommand, PutObjectCommand } from '@aws-sdk/client-s3';
   import { Readable } from 'stream';
   import sharp from 'sharp';
   import util from 'util';

   // create S3 client
   const s3 = new S3Client({ region: 'us-east-1' });

   // define the handler function
   export const handler = async (event, context) => {
     // Read options from the event parameter and get the source bucket
     console.log("Reading options from event:\n", util.inspect(event, { depth: 5 }));
     const srcBucket = event.Records[0].s3.bucket.name;
     // Object key may have spaces or unicode non-ASCII characters
     const srcKey = decodeURIComponent(event.Records[0].s3.object.key.replace(/\+/g, " "));
     const dstBucket = srcBucket + "-resized";
     const dstKey = "resized-" + srcKey;

     // Infer the image type from the file suffix
     const typeMatch = srcKey.match(/\.([^.]*)$/);
     if (!typeMatch) {
       console.log("Could not determine the image type.");
       return;
     }

     // Check that the image type is supported
     const imageType = typeMatch[1].toLowerCase();
     if (imageType != "jpg" && imageType != "png") {
       console.log(`Unsupported image type: ${imageType}`);
       return;
     }

     // Get the image from the source bucket. GetObjectCommand returns a stream.
     try {
       const params = {
         Bucket: srcBucket,
         Key: srcKey
       };
       var response = await s3.send(new GetObjectCommand(params));
       var stream = response.Body;

       // Convert stream to buffer to pass to sharp resize function.
       if (stream instanceof Readable) {
         var content_buffer = Buffer.concat(await stream.toArray());
       } else {
         throw new Error('Unknown object stream type');
       }
     } catch (error) {
       console.log(error);
       return;
     }

     // set thumbnail width. Resize will set the height automatically to maintain aspect ratio.
     const width = 200;

     // Use the sharp module to resize the image and save in a buffer.
     try {
       var output_buffer = await sharp(content_buffer).resize(width).toBuffer();
     } catch (error) {
       console.log(error);
       return;
     }

     // Upload the thumbnail image to the destination bucket
     try {
       const destparams = {
         Bucket: dstBucket,
         Key: dstKey,
         Body: output_buffer,
         ContentType: "image"
       };

       const putResult = await s3.send(new PutObjectCommand(destparams));
     } catch (error) {
       console.log(error);
       return;
     }

     console.log('Successfully resized ' + srcBucket + '/' + srcKey +
       ' and uploaded to ' + dstBucket + '/' + dstKey);
   };
   ```

4. In your `lambda-s3` directory, install the sharp library using npm. Note that the latest version of sharp (0.33) isn't compatible with Lambda. Install version 0.32.6 to complete this tutorial.

   ```
   npm install sharp@0.32.6
   ```

   The `npm install` command creates a `node_modules` directory for your modules. After this step, your directory structure should look like the following.

   ```
   lambda-s3
   |- index.mjs
   |- node_modules
   |  |- base64js
   |  |- bl
   |  |- buffer
   ...
   |- package-lock.json
   |- package.json
   ```

5. Create a .zip deployment package containing your function code and its dependencies. On macOS and Linux, run the following command.

   ```
   zip -r function.zip .
   ```

   On Windows, use your preferred zip utility to create a .zip file. Ensure that your `index.mjs`, `package.json`, and `package-lock.json` files and your `node_modules` directory are all at the root of your .zip file.
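Two details of the handler are easy to exercise locally without deploying anything: the object-key decoding (Amazon S3 event notifications URL-encode keys and represent spaces as `+`) and the file-suffix check. A standalone sketch of just that logic, with helper names invented for this illustration:

```javascript
// Decode an object key the same way the handler does: S3 event
// notifications URL-encode the key and represent spaces as "+".
function decodeObjectKey(rawKey) {
  return decodeURIComponent(rawKey.replace(/\+/g, " "));
}

// Infer the image type from the file suffix, returning null when the
// key has no suffix or the type isn't one the function supports.
function inferImageType(key) {
  const typeMatch = key.match(/\.([^.]*)$/);
  if (!typeMatch) return null;
  const imageType = typeMatch[1].toLowerCase();
  return (imageType === "jpg" || imageType === "png") ? imageType : null;
}

console.log(decodeObjectKey("my+holiday+photo.jpg")); // "my holiday photo.jpg"
console.log(inferImageType("photo.PNG"));             // "png"
console.log(inferImageType("notes.txt"));             // null
```

Note the suffix check means a key with no extension, such as `README`, is skipped by the handler rather than causing an error.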
Create the Lambda function

You can create your Lambda function using either the AWS CLI or the Lambda console. Follow the instructions for your chosen language to create the function.
To create the function (console)
To create your Lambda function using the console, you first create a basic function containing some 'Hello world' code. You then replace this code with your own function code by uploading the .zip file you created in the previous step.
1. Open the Functions page of the Lambda console.
2. Make sure you're working in the same AWS Region you created your Amazon S3 bucket in. You can change your Region using the drop-down list at the top of the screen.
3. Choose Create function.
4. Choose Author from scratch.
5. Under Basic information, do the following:
   - For Function name, enter `CreateThumbnail`.
   - For Runtime, choose either Node.js 22.x or Python 3.12 according to the language you chose for your function.
   - For Architecture, choose x86_64.
6. In the Change default execution role tab, do the following:
   - Expand the tab, then choose Use an existing role.
   - Select the `LambdaS3Role` you created earlier.
7. Choose Create function.
To upload the function code (console)
1. In the Code source pane, choose Upload from.
2. Choose .zip file.
3. Choose Upload.
4. In the file selector, select your .zip file and choose Open.
5. Choose Save.
Configure Amazon S3 to invoke the function

For your Lambda function to run when you upload an image to your source bucket, you need to configure a trigger for your function. You can configure the Amazon S3 trigger using either the console or the AWS CLI.
Important
This procedure configures the Amazon S3 bucket to invoke your function every time that an object is created in the bucket. Be sure to configure this only on the source bucket. If your Lambda function creates objects in the same bucket that invokes it, your function can be invoked continuously in a loop.
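The tutorial avoids this loop by writing output to a separate bucket, but a handler can also guard against recursion explicitly before writing. This defensive check is not part of the tutorial's function code; it is a sketch of the idea, with an invented helper name:

```javascript
// Return true only when it is safe to write the output: the destination
// bucket must differ from the bucket that triggered the invocation.
// Otherwise, each thumbnail the function writes would trigger the
// function again, invoking it continuously in a loop.
function isSafeDestination(srcBucket, dstBucket) {
  return srcBucket !== dstBucket;
}

// The tutorial's naming convention always produces a different
// destination bucket, so the check passes.
const src = "amzn-s3-demo-source-bucket";
console.log(isSafeDestination(src, src + "-resized")); // true
console.log(isSafeDestination(src, src));              // false
```

A handler using this guard would log the problem and return early instead of calling PutObjectCommand when the check fails.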
To configure the Amazon S3 trigger (console)
1. Open the Functions page of the Lambda console and choose your function (CreateThumbnail).
2. Choose Add trigger.
3. Select S3.
4. Under Bucket, select your source bucket.
5. Under Event types, select All object create events.
6. Under Recursive invocation, select the check box to acknowledge that using the same Amazon S3 bucket for input and output is not recommended. You can learn more about recursive invocation patterns in Lambda by reading Recursive patterns that cause run-away Lambda functions in Serverless Land.
7. Choose Add.
When you create a trigger using the Lambda console, Lambda automatically creates a resource-based policy to give the service you select permission to invoke your function.
Test your Lambda function with a dummy event

Before you test your whole setup by adding an image file to your Amazon S3 source bucket, you test that your Lambda function is working correctly by invoking it with a dummy event. An event in Lambda is a JSON-formatted document that contains data for your function to process. When your function is invoked by Amazon S3, the event sent to your function contains information such as the bucket name, bucket ARN, and object key.
To test your Lambda function with a dummy event (console)
1. Open the Functions page of the Lambda console and choose your function (CreateThumbnail).
2. Choose the Test tab.
3. To create your test event, in the Test event pane, do the following:
   - Under Test event action, select Create new event.
   - For Event name, enter `myTestEvent`.
   - For Template, select S3 Put.
   - Replace the values for the following parameters with your own values:
     - For `awsRegion`, replace `us-east-1` with the AWS Region you created your Amazon S3 buckets in.
     - For `name`, replace `amzn-s3-demo-bucket` with the name of your own Amazon S3 source bucket.
     - For `key`, replace `test%2Fkey` with the filename of the test object you uploaded to your source bucket in the step Upload a test image to your source bucket.

     ```json
     {
       "Records": [
         {
           "eventVersion": "2.0",
           "eventSource": "aws:s3",
           "awsRegion": "us-east-1",
           "eventTime": "1970-01-01T00:00:00.000Z",
           "eventName": "ObjectCreated:Put",
           "userIdentity": {
             "principalId": "EXAMPLE"
           },
           "requestParameters": {
             "sourceIPAddress": "127.0.0.1"
           },
           "responseElements": {
             "x-amz-request-id": "EXAMPLE123456789",
             "x-amz-id-2": "EXAMPLE123/5678abcdefghijklambdaisawesome/mnopqrstuvwxyzABCDEFGH"
           },
           "s3": {
             "s3SchemaVersion": "1.0",
             "configurationId": "testConfigRule",
             "bucket": {
               "name": "amzn-s3-demo-bucket",
               "ownerIdentity": {
                 "principalId": "EXAMPLE"
               },
               "arn": "arn:aws:s3:::amzn-s3-demo-bucket"
             },
             "object": {
               "key": "test%2Fkey",
               "size": 1024,
               "eTag": "0123456789abcdef0123456789abcdef",
               "sequencer": "0A1B2C3D4E5F678901"
             }
           }
         }
       ]
     }
     ```

   - Choose Save.
4. In the Test event pane, choose Test.
5. To check that your function has created a resized version of your image and stored it in your target Amazon S3 bucket, do the following:
   - Open the Buckets page of the Amazon S3 console.
   - Choose your target bucket and confirm that your resized file is listed in the Objects pane.
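You can also inspect this test event locally to confirm which fields the handler actually reads. The sketch below uses a trimmed-down copy of the S3 Put template (only the fields the tutorial's function touches) and extracts the bucket name and decoded object key the same way the handler does:

```javascript
// A trimmed-down S3 Put event containing only the fields the
// tutorial's handler reads.
const event = {
  Records: [
    {
      awsRegion: "us-east-1",
      s3: {
        bucket: { name: "amzn-s3-demo-bucket" },
        object: { key: "test%2Fkey" }
      }
    }
  ]
};

// Extract the source bucket and decode the URL-encoded object key,
// matching the first lines of the handler.
const srcBucket = event.Records[0].s3.bucket.name;
const srcKey = decodeURIComponent(event.Records[0].s3.object.key.replace(/\+/g, " "));

console.log(srcBucket); // "amzn-s3-demo-bucket"
console.log(srcKey);    // "test/key"
```

Note how `test%2Fkey` decodes to `test/key`: the handler would then write the thumbnail as `resized-test/key`, which S3 displays as an object inside a `resized-test` folder.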
Test your function using the Amazon S3 trigger

Now that you’ve confirmed your Lambda function is operating correctly, you’re ready to test your complete setup by adding an image file to your Amazon S3 source bucket. When you add your image to the source bucket, your Lambda function should be automatically invoked. Your function creates a resized version of the file and stores it in your target bucket.
To test your Lambda function using the Amazon S3 trigger (console)
1. To upload an image to your Amazon S3 bucket, do the following:
   - Open the Buckets page of the Amazon S3 console and choose your source bucket.
   - Choose Upload.
   - Choose Add files and use the file selector to choose the image file you want to upload. Your image object can be any .jpg or .png file.
   - Choose Open, then choose Upload.
2. Verify that Lambda has saved a resized version of your image file in your target bucket by doing the following:
   - Navigate back to the Buckets page of the Amazon S3 console and choose your destination bucket.
   - In the Objects pane, you should now see two resized image files, one from each test of your Lambda function. To download your resized image, select the file, then choose Download.
Clean up your resources
You can now delete the resources that you created for this tutorial, unless you want to retain them. By deleting AWS resources that you're no longer using, you prevent unnecessary charges to your AWS account.
To delete the Lambda function
1. Open the Functions page of the Lambda console.
2. Select the function that you created.
3. Choose Actions, Delete.
4. Type `confirm` in the text input field and choose Delete.
To delete the policy that you created
1. Open the Policies page of the IAM console.
2. Select the policy that you created (LambdaS3Policy).
3. Choose Policy actions, Delete.
4. Choose Delete.
To delete the execution role
1. Open the Roles page of the IAM console.
2. Select the execution role that you created.
3. Choose Delete.
4. Enter the name of the role in the text input field and choose Delete.
To delete the S3 bucket
1. Open the Amazon S3 console.
2. Select the bucket you created.
3. Choose Delete.
4. Enter the name of the bucket in the text input field.
5. Choose Delete bucket.