Amazon S3 examples using SDK for JavaScript (v3) - AWS SDK Code Examples

More AWS SDK examples are available in the AWS Doc SDK Examples GitHub repository.

Amazon S3 examples using SDK for JavaScript (v3)

The following code examples show you how to perform actions and implement common scenarios by using the AWS SDK for JavaScript (v3) with Amazon S3.

Basics are code examples that show you how to perform the essential operations within a service.

Actions are code excerpts from larger programs and must be run in context. While actions show you how to call individual service functions, you can see actions in context in their related scenarios.

Scenarios are code examples that show you how to accomplish a specific task by calling multiple functions within the same service or combined with other AWS services.

Each example includes a link to the complete source code, where you can find instructions on how to set up and run the code in context.

Get started

The following code examples show how to get started using Amazon S3.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

import { paginateListBuckets, S3Client, S3ServiceException, } from "@aws-sdk/client-s3"; /** * List the S3 buckets in your configured AWS account. */ export const helloS3 = async () => { // When no region or credentials are provided, the SDK will use the // region and credentials from the local AWS config. const client = new S3Client({}); try { /** * @type { import("@aws-sdk/client-s3").Bucket[] } */ const buckets = []; for await (const page of paginateListBuckets({ client }, {})) { buckets.push(...page.Buckets); } console.log("Buckets: "); console.log(buckets.map((bucket) => bucket.Name).join("\n")); return buckets; } catch (caught) { // ListBuckets does not throw any modeled errors. Any error caught // here will be something generic like `AccessDenied`. if (caught instanceof S3ServiceException) { console.error(`${caught.name}: ${caught.message}`); } else { // Something besides S3 failed. throw caught; } } };
  • For API details, see ListBuckets in the AWS SDK for JavaScript API Reference.

Basics

The following code example shows how to:

  • Create a bucket and upload a file to it.

  • Download an object from a bucket.

  • Copy an object to a subfolder in a bucket.

  • List the objects in a bucket.

  • Delete the bucket objects and the bucket.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

First, import all of the necessary modules.

// Used to check if currently running file is this file. import { fileURLToPath } from "node:url"; import { readdirSync, readFileSync, writeFileSync } from "node:fs"; // Local helper utils. import { dirnameFromMetaUrl } from "@aws-doc-sdk-examples/lib/utils/util-fs.js"; import { Prompter } from "@aws-doc-sdk-examples/lib/prompter.js"; import { wrapText } from "@aws-doc-sdk-examples/lib/utils/util-string.js"; import { S3Client, CreateBucketCommand, PutObjectCommand, ListObjectsCommand, CopyObjectCommand, GetObjectCommand, DeleteObjectsCommand, DeleteBucketCommand, } from "@aws-sdk/client-s3";

The preceding imports reference some helper utilities. These utilities are local to the GitHub repository linked at the start of this section. For your reference, see the following implementations of those utilities.

export const dirnameFromMetaUrl = (metaUrl) => fileURLToPath(new URL(".", metaUrl)); import { select, input, confirm, checkbox, password } from "@inquirer/prompts"; export class Prompter { /** * @param {{ message: string, choices: { name: string, value: string }[]}} options */ select(options) { return select(options); } /** * @param {{ message: string }} options */ input(options) { return input(options); } /** * @param {{ message: string }} options */ password(options) { return password({ ...options, mask: true }); } /** * @param {string} prompt */ checkContinue = async (prompt = "") => { const prefix = prompt && `${prompt} `; const ok = await this.confirm({ message: `${prefix}Continue?`, }); if (!ok) throw new Error("Exiting..."); }; /** * @param {{ message: string }} options */ confirm(options) { return confirm(options); } /** * @param {{ message: string, choices: { name: string, value: string }[]}} options */ checkbox(options) { return checkbox(options); } } export const wrapText = (text, char = "=") => { const rule = char.repeat(80); return `${rule}\n ${text}\n${rule}\n`; };
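
The createBucket, uploadFilesToBucket, and other functions that follow refer to a prompter and an s3Client without declaring them. In the full example on GitHub they are created once at module scope; the following is a minimal sketch of that setup, with names assumed to match the excerpts.

// Sketch (assumed setup): shared objects used by the functions below.
const s3Client = new S3Client({}); // Uses the region and credentials from your local AWS config.
const prompter = new Prompter(); // Wraps @inquirer/prompts as shown above.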

Objects in S3 are stored in "buckets." Next, define a function that creates a new bucket.

export const createBucket = async () => { const bucketName = await prompter.input({ message: "Enter a bucket name. Bucket names must be globally unique:", }); const command = new CreateBucketCommand({ Bucket: bucketName }); await s3Client.send(command); console.log("Bucket created successfully.\n"); return bucketName; };

Buckets contain "objects." This function uploads the contents of a directory to your bucket as objects.

export const uploadFilesToBucket = async ({ bucketName, folderPath }) => { console.log(`Uploading files from ${folderPath}\n`); const keys = readdirSync(folderPath); const files = keys.map((key) => { const filePath = `${folderPath}/${key}`; const fileContent = readFileSync(filePath); return { Key: key, Body: fileContent, }; }); for (const file of files) { await s3Client.send( new PutObjectCommand({ Bucket: bucketName, Body: file.Body, Key: file.Key, }), ); console.log(`${file.Key} uploaded successfully.`); } };

After you upload the objects, check to confirm that they were uploaded correctly. You can use ListObjects for that. You'll be using the "Key" property, but there are other useful properties in the response as well.

export const listFilesInBucket = async ({ bucketName }) => { const command = new ListObjectsCommand({ Bucket: bucketName }); const { Contents } = await s3Client.send(command); const contentsList = Contents.map((c) => ` • ${c.Key}`).join("\n"); console.log("\nHere's a list of files in the bucket:"); console.log(`${contentsList}\n`); };
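
Beyond the Key shown above, each entry in the ListObjects response also carries fields such as Size, LastModified, and ETag. A short sketch, reusing the same s3Client, that prints a couple of them:

// Sketch: log additional properties from the ListObjects response.
export const describeFilesInBucket = async ({ bucketName }) => {
  const { Contents } = await s3Client.send(
    new ListObjectsCommand({ Bucket: bucketName }),
  );
  for (const { Key, Size, LastModified } of Contents ?? []) {
    console.log(`${Key} - ${Size} bytes, last modified ${LastModified}`);
  }
};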

Sometimes you might want to copy an object from one bucket to another. Use the CopyObject command for that.

export const copyFileFromBucket = async ({ destinationBucket }) => { const proceed = await prompter.confirm({ message: "Would you like to copy an object from another bucket?", }); if (!proceed) { return; } const copy = async () => { try { const sourceBucket = await prompter.input({ message: "Enter source bucket name:", }); const sourceKey = await prompter.input({ message: "Enter source key:", }); const destinationKey = await prompter.input({ message: "Enter destination key:", }); const command = new CopyObjectCommand({ Bucket: destinationBucket, CopySource: `${sourceBucket}/${sourceKey}`, Key: destinationKey, }); await s3Client.send(command); await copyFileFromBucket({ destinationBucket }); } catch (err) { console.error("Copy error."); console.error(err); const retryAnswer = await prompter.confirm({ message: "Try again?" }); if (retryAnswer) { await copy(); } } }; await copy(); };

There is no SDK method for getting multiple objects from a bucket in one call. Instead, you build a list of objects to download and iterate over them.

export const downloadFilesFromBucket = async ({ bucketName }) => { const { Contents } = await s3Client.send( new ListObjectsCommand({ Bucket: bucketName }), ); const path = await prompter.input({ message: "Enter destination path for files:", }); for (const content of Contents) { const obj = await s3Client.send( new GetObjectCommand({ Bucket: bucketName, Key: content.Key }), ); writeFileSync( `${path}/${content.Key}`, await obj.Body.transformToByteArray(), ); } console.log("Files downloaded successfully.\n"); };

It's time to clean up your resources. A bucket must be empty before it can be deleted, so these two functions empty the bucket and then delete it.

export const emptyBucket = async ({ bucketName }) => { const listObjectsCommand = new ListObjectsCommand({ Bucket: bucketName }); const { Contents } = await s3Client.send(listObjectsCommand); const keys = Contents.map((c) => c.Key); const deleteObjectsCommand = new DeleteObjectsCommand({ Bucket: bucketName, Delete: { Objects: keys.map((key) => ({ Key: key })) }, }); await s3Client.send(deleteObjectsCommand); console.log(`${bucketName} emptied successfully.\n`); }; export const deleteBucket = async ({ bucketName }) => { const command = new DeleteBucketCommand({ Bucket: bucketName }); await s3Client.send(command); console.log(`${bucketName} deleted successfully.\n`); };

The "main" function pulls everything together. If you run this file directly, the main function is invoked.

const main = async () => { const OBJECT_DIRECTORY = `${dirnameFromMetaUrl( import.meta.url, )}../../../../resources/sample_files/.sample_media`; try { console.log(wrapText("Welcome to the Amazon S3 getting started example.")); console.log("Let's create a bucket."); const bucketName = await createBucket(); await prompter.confirm({ message: continueMessage }); console.log(wrapText("File upload.")); console.log( "I have some default files ready to go. You can edit the source code to provide your own.", ); await uploadFilesToBucket({ bucketName, folderPath: OBJECT_DIRECTORY, }); await listFilesInBucket({ bucketName }); await prompter.confirm({ message: continueMessage }); console.log(wrapText("Copy files.")); await copyFileFromBucket({ destinationBucket: bucketName }); await listFilesInBucket({ bucketName }); await prompter.confirm({ message: continueMessage }); console.log(wrapText("Download files.")); await downloadFilesFromBucket({ bucketName }); console.log(wrapText("Clean up.")); await emptyBucket({ bucketName }); await deleteBucket({ bucketName }); } catch (err) { console.error(err); } };
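
The excerpt above only defines main. A sketch of the entry-point check described in the preceding paragraph, using the fileURLToPath import from the top of the file (the continueMessage text is an assumption, not taken from the repository):

// Sketch: invoke main() only when this file is executed directly.
const continueMessage = "Press Enter to continue."; // Assumed prompt text.

if (process.argv[1] === fileURLToPath(import.meta.url)) {
  main();
}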

Actions

The following code example shows how to use CopyObject.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

Copy the object.

import { S3Client, CopyObjectCommand, ObjectNotInActiveTierError, waitUntilObjectExists, } from "@aws-sdk/client-s3"; /** * Copy an S3 object from one bucket to another. * * @param {{ * sourceBucket: string, * sourceKey: string, * destinationBucket: string, * destinationKey: string }} config */ export const main = async ({ sourceBucket, sourceKey, destinationBucket, destinationKey, }) => { const client = new S3Client({}); try { await client.send( new CopyObjectCommand({ CopySource: `${sourceBucket}/${sourceKey}`, Bucket: destinationBucket, Key: destinationKey, }), ); await waitUntilObjectExists( { client }, { Bucket: destinationBucket, Key: destinationKey }, ); console.log( `Successfully copied ${sourceBucket}/${sourceKey} to ${destinationBucket}/${destinationKey}`, ); } catch (caught) { if (caught instanceof ObjectNotInActiveTierError) { console.error( `Could not copy ${sourceKey} from ${sourceBucket}. Object is not in the active tier.`, ); } else { throw caught; } } };
  • For API details, see CopyObject in the AWS SDK for JavaScript API Reference.

The following code example shows how to use CreateBucket.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

Create the bucket.

import { BucketAlreadyExists, BucketAlreadyOwnedByYou, CreateBucketCommand, S3Client, waitUntilBucketExists, } from "@aws-sdk/client-s3"; /** * Create an Amazon S3 bucket. * @param {{ bucketName: string }} config */ export const main = async ({ bucketName }) => { const client = new S3Client({}); try { const { Location } = await client.send( new CreateBucketCommand({ // The name of the bucket. Bucket names are unique and have several other constraints. // See https://docs.aws.amazon.com/AmazonS3/latest/userguide/bucketnamingrules.html Bucket: bucketName, }), ); await waitUntilBucketExists({ client }, { Bucket: bucketName }); console.log(`Bucket created with location ${Location}`); } catch (caught) { if (caught instanceof BucketAlreadyExists) { console.error( `The bucket "${bucketName}" already exists in another AWS account. Bucket names must be globally unique.`, ); } // WARNING: If you try to create a bucket in the North Virginia region, // and you already own a bucket in that region with the same name, this // error will not be thrown. Instead, the call will return successfully // and the ACL on that bucket will be reset. else if (caught instanceof BucketAlreadyOwnedByYou) { console.error( `The bucket "${bucketName}" already exists in this AWS account.`, ); } else { throw caught; } } };

The following code example shows how to use DeleteBucket.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

Delete the bucket.

import { DeleteBucketCommand, S3Client, S3ServiceException, } from "@aws-sdk/client-s3"; /** * Delete an Amazon S3 bucket. * @param {{ bucketName: string }} */ export const main = async ({ bucketName }) => { const client = new S3Client({}); const command = new DeleteBucketCommand({ Bucket: bucketName, }); try { await client.send(command); console.log("Bucket was deleted."); } catch (caught) { if ( caught instanceof S3ServiceException && caught.name === "NoSuchBucket" ) { console.error( `Error from S3 while deleting bucket. The bucket doesn't exist.`, ); } else if (caught instanceof S3ServiceException) { console.error( `Error from S3 while deleting the bucket. ${caught.name}: ${caught.message}`, ); } else { throw caught; } } };

The following code example shows how to use DeleteBucketPolicy.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

Delete the bucket policy.

import { DeleteBucketPolicyCommand, S3Client, S3ServiceException, } from "@aws-sdk/client-s3"; /** * Remove the policy from an Amazon S3 bucket. * @param {{ bucketName: string }} */ export const main = async ({ bucketName }) => { const client = new S3Client({}); try { await client.send( new DeleteBucketPolicyCommand({ Bucket: bucketName, }), ); console.log(`Bucket policy deleted from "${bucketName}".`); } catch (caught) { if ( caught instanceof S3ServiceException && caught.name === "NoSuchBucket" ) { console.error( `Error from S3 while deleting policy from ${bucketName}. The bucket doesn't exist.`, ); } else if (caught instanceof S3ServiceException) { console.error( `Error from S3 while deleting policy from ${bucketName}. ${caught.name}: ${caught.message}`, ); } else { throw caught; } } };

The following code example shows how to use DeleteBucketWebsite.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

Delete the website configuration from the bucket.

import { DeleteBucketWebsiteCommand, S3Client, S3ServiceException, } from "@aws-sdk/client-s3"; /** * Remove the website configuration for a bucket. * @param {{ bucketName: string }} */ export const main = async ({ bucketName }) => { const client = new S3Client({}); try { await client.send( new DeleteBucketWebsiteCommand({ Bucket: bucketName, }), ); // The response code will be successful for both removed configurations and // configurations that did not exist in the first place. console.log( `The bucket "${bucketName}" is no longer configured as a website, or it never was.`, ); } catch (caught) { if ( caught instanceof S3ServiceException && caught.name === "NoSuchBucket" ) { console.error( `Error from S3 while removing website configuration from ${bucketName}. The bucket doesn't exist.`, ); } else if (caught instanceof S3ServiceException) { console.error( `Error from S3 while removing website configuration from ${bucketName}. ${caught.name}: ${caught.message}`, ); } else { throw caught; } } };

The following code example shows how to use DeleteObject.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

Delete an object.

import { DeleteObjectCommand, S3Client, S3ServiceException, waitUntilObjectNotExists, } from "@aws-sdk/client-s3"; /** * Delete one object from an Amazon S3 bucket. * @param {{ bucketName: string, key: string }} */ export const main = async ({ bucketName, key }) => { const client = new S3Client({}); try { await client.send( new DeleteObjectCommand({ Bucket: bucketName, Key: key, }), ); await waitUntilObjectNotExists( { client }, { Bucket: bucketName, Key: key }, ); // A successful delete, or a delete for a non-existent object, both return // a 204 response code. console.log( `The object "${key}" from bucket "${bucketName}" was deleted, or it didn't exist.`, ); } catch (caught) { if ( caught instanceof S3ServiceException && caught.name === "NoSuchBucket" ) { console.error( `Error from S3 while deleting object from ${bucketName}. The bucket doesn't exist.`, ); } else if (caught instanceof S3ServiceException) { console.error( `Error from S3 while deleting object from ${bucketName}. ${caught.name}: ${caught.message}`, ); } else { throw caught; } } };
  • For API details, see DeleteObject in the AWS SDK for JavaScript API Reference.

The following code example shows how to use DeleteObjects.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

Delete multiple objects.

import { DeleteObjectsCommand, S3Client, S3ServiceException, waitUntilObjectNotExists, } from "@aws-sdk/client-s3"; /** * Delete multiple objects from an S3 bucket. * @param {{ bucketName: string, keys: string[] }} */ export const main = async ({ bucketName, keys }) => { const client = new S3Client({}); try { const { Deleted } = await client.send( new DeleteObjectsCommand({ Bucket: bucketName, Delete: { Objects: keys.map((k) => ({ Key: k })), }, }), ); for (const key of keys) { await waitUntilObjectNotExists( { client }, { Bucket: bucketName, Key: key }, ); } console.log( `Successfully deleted ${Deleted.length} objects from S3 bucket. Deleted objects:`, ); console.log(Deleted.map((d) => ` • ${d.Key}`).join("\n")); } catch (caught) { if ( caught instanceof S3ServiceException && caught.name === "NoSuchBucket" ) { console.error( `Error from S3 while deleting objects from ${bucketName}. The bucket doesn't exist.`, ); } else if (caught instanceof S3ServiceException) { console.error( `Error from S3 while deleting objects from ${bucketName}. ${caught.name}: ${caught.message}`, ); } else { throw caught; } } };
  • For API details, see DeleteObjects in the AWS SDK for JavaScript API Reference.

The following code example shows how to use GetBucketAcl.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

Get the ACL permissions.

import { GetBucketAclCommand, S3Client, S3ServiceException, } from "@aws-sdk/client-s3"; /** * Retrieves the Access Control List (ACL) for an S3 bucket. * @param {{ bucketName: string }} */ export const main = async ({ bucketName }) => { const client = new S3Client({}); try { const response = await client.send( new GetBucketAclCommand({ Bucket: bucketName, }), ); console.log(`ACL for bucket "${bucketName}":`); console.log(JSON.stringify(response, null, 2)); } catch (caught) { if ( caught instanceof S3ServiceException && caught.name === "NoSuchBucket" ) { console.error( `Error from S3 while getting ACL for ${bucketName}. The bucket doesn't exist.`, ); } else if (caught instanceof S3ServiceException) { console.error( `Error from S3 while getting ACL for ${bucketName}. ${caught.name}: ${caught.message}`, ); } else { throw caught; } } };

The following code example shows how to use GetBucketCors.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

Get the CORS policy for the bucket.

import { GetBucketCorsCommand, S3Client, S3ServiceException, } from "@aws-sdk/client-s3"; /** * Log the Cross-Origin Resource Sharing (CORS) configuration information * set for the bucket. * @param {{ bucketName: string }} */ export const main = async ({ bucketName }) => { const client = new S3Client({}); const command = new GetBucketCorsCommand({ Bucket: bucketName, }); try { const { CORSRules } = await client.send(command); console.log(JSON.stringify(CORSRules)); CORSRules.forEach((cr, i) => { console.log( `\nCORSRule ${i + 1}`, `\n${"-".repeat(10)}`, `\nAllowedHeaders: ${cr.AllowedHeaders}`, `\nAllowedMethods: ${cr.AllowedMethods}`, `\nAllowedOrigins: ${cr.AllowedOrigins}`, `\nExposeHeaders: ${cr.ExposeHeaders}`, `\nMaxAgeSeconds: ${cr.MaxAgeSeconds}`, ); }); } catch (caught) { if ( caught instanceof S3ServiceException && caught.name === "NoSuchBucket" ) { console.error( `Error from S3 while getting bucket CORS rules for ${bucketName}. The bucket doesn't exist.`, ); } else if (caught instanceof S3ServiceException) { console.error( `Error from S3 while getting bucket CORS rules for ${bucketName}. ${caught.name}: ${caught.message}`, ); } else { throw caught; } } };

The following code example shows how to use GetBucketPolicy.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

Get the bucket policy.

import { GetBucketPolicyCommand, S3Client, S3ServiceException, } from "@aws-sdk/client-s3"; /** * Logs the policy for a specified bucket. * @param {{ bucketName: string }} */ export const main = async ({ bucketName }) => { const client = new S3Client({}); try { const { Policy } = await client.send( new GetBucketPolicyCommand({ Bucket: bucketName, }), ); console.log(`Policy for "${bucketName}":\n${Policy}`); } catch (caught) { if ( caught instanceof S3ServiceException && caught.name === "NoSuchBucket" ) { console.error( `Error from S3 while getting policy from ${bucketName}. The bucket doesn't exist.`, ); } else if (caught instanceof S3ServiceException) { console.error( `Error from S3 while getting policy from ${bucketName}. ${caught.name}: ${caught.message}`, ); } else { throw caught; } } };

The following code example shows how to use GetBucketWebsite.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

Get the website configuration.

import { GetBucketWebsiteCommand, S3Client, S3ServiceException, } from "@aws-sdk/client-s3"; /** * Log the website configuration for a bucket. * @param {{ bucketName }} */ export const main = async ({ bucketName }) => { const client = new S3Client({}); try { const response = await client.send( new GetBucketWebsiteCommand({ Bucket: bucketName, }), ); console.log( `Your bucket is set up to host a website with the following configuration:\n${JSON.stringify(response, null, 2)}`, ); } catch (caught) { if ( caught instanceof S3ServiceException && caught.name === "NoSuchWebsiteConfiguration" ) { console.error( `Error from S3 while getting website configuration for ${bucketName}. The bucket isn't configured as a website.`, ); } else if (caught instanceof S3ServiceException) { console.error( `Error from S3 while getting website configuration for ${bucketName}. ${caught.name}: ${caught.message}`, ); } else { throw caught; } } };
  • For API details, see GetBucketWebsite in the AWS SDK for JavaScript API Reference.

The following code example shows how to use GetObject.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

Download the object.

import { GetObjectCommand, NoSuchKey, S3Client, S3ServiceException, } from "@aws-sdk/client-s3"; /** * Get a single object from a specified S3 bucket. * @param {{ bucketName: string, key: string }} */ export const main = async ({ bucketName, key }) => { const client = new S3Client({}); try { const response = await client.send( new GetObjectCommand({ Bucket: bucketName, Key: key, }), ); // The Body object also has 'transformToByteArray' and 'transformToWebStream' methods. const str = await response.Body.transformToString(); console.log(str); } catch (caught) { if (caught instanceof NoSuchKey) { console.error( `Error from S3 while getting object "${key}" from "${bucketName}". No such key exists.`, ); } else if (caught instanceof S3ServiceException) { console.error( `Error from S3 while getting object from ${bucketName}. ${caught.name}: ${caught.message}`, ); } else { throw caught; } } };
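
The response Body is a stream, so besides transformToString you can consume it in other ways. A minimal sketch, assuming Node.js, that writes the downloaded object to a local file with transformToByteArray:

// Sketch: save a downloaded object to a local file.
import { writeFile } from "node:fs/promises";

const saveObjectToFile = async (response, destinationPath) => {
  const bytes = await response.Body.transformToByteArray();
  await writeFile(destinationPath, bytes);
};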

The following code example shows how to use GetObjectLegalHold.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

import { GetObjectLegalHoldCommand, S3Client, S3ServiceException, } from "@aws-sdk/client-s3"; /** * Get an object's current legal hold status. * @param {{ bucketName: string, key: string }} */ export const main = async ({ bucketName, key }) => { const client = new S3Client({}); try { const response = await client.send( new GetObjectLegalHoldCommand({ Bucket: bucketName, Key: key, // Optionally, you can provide additional parameters // ExpectedBucketOwner: "<account ID that is expected to own the bucket>", // VersionId: "<the specific version id of the object to check>", }), ); console.log(`Legal Hold Status: ${response.LegalHold.Status}`); } catch (caught) { if ( caught instanceof S3ServiceException && caught.name === "NoSuchBucket" ) { console.error( `Error from S3 while getting legal hold status for ${key} in ${bucketName}. The bucket doesn't exist.`, ); } else if (caught instanceof S3ServiceException) { console.error( `Error from S3 while getting legal hold status for ${key} in ${bucketName} from ${bucketName}. ${caught.name}: ${caught.message}`, ); } else { throw caught; } } }; // Call function if run directly import { parseArgs } from "node:util"; import { isMain, validateArgs, } from "@aws-doc-sdk-examples/lib/utils/util-node.js"; const loadArgs = () => { const options = { bucketName: { type: "string", required: true, }, key: { type: "string", required: true, }, }; const results = parseArgs({ options }); const { errors } = validateArgs({ options }, results); return { errors, results }; }; if (isMain(import.meta.url)) { const { errors, results } = loadArgs(); if (!errors) { main(results.values); } else { console.error(errors.join("\n")); } }
  • For API details, see GetObjectLegalHold in the AWS SDK for JavaScript API Reference.

The following code example shows how to use GetObjectLockConfiguration.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

import { GetObjectLockConfigurationCommand, S3Client, S3ServiceException, } from "@aws-sdk/client-s3"; /** * Gets the Object Lock configuration for a bucket. * @param {{ bucketName: string }} */ export const main = async ({ bucketName }) => { const client = new S3Client({}); try { const { ObjectLockConfiguration } = await client.send( new GetObjectLockConfigurationCommand({ Bucket: bucketName, // Optionally, you can provide additional parameters // ExpectedBucketOwner: "<account ID that is expected to own the bucket>", }), ); console.log( `Object Lock Configuration:\n${JSON.stringify(ObjectLockConfiguration)}`, ); } catch (caught) { if ( caught instanceof S3ServiceException && caught.name === "NoSuchBucket" ) { console.error( `Error from S3 while getting object lock configuration for ${bucketName}. The bucket doesn't exist.`, ); } else if (caught instanceof S3ServiceException) { console.error( `Error from S3 while getting object lock configuration for ${bucketName}. ${caught.name}: ${caught.message}`, ); } else { throw caught; } } }; // Call function if run directly import { parseArgs } from "node:util"; import { isMain, validateArgs, } from "@aws-doc-sdk-examples/lib/utils/util-node.js"; const loadArgs = () => { const options = { bucketName: { type: "string", required: true, }, }; const results = parseArgs({ options }); const { errors } = validateArgs({ options }, results); return { errors, results }; }; if (isMain(import.meta.url)) { const { errors, results } = loadArgs(); if (!errors) { main(results.values); } else { console.error(errors.join("\n")); } }

The following code example shows how to use GetObjectRetention.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

import { GetObjectRetentionCommand, S3Client, S3ServiceException, } from "@aws-sdk/client-s3"; /** * Log the "RetainUntilDate" for an object in an S3 bucket. * @param {{ bucketName: string, key: string }} */ export const main = async ({ bucketName, key }) => { const client = new S3Client({}); try { const { Retention } = await client.send( new GetObjectRetentionCommand({ Bucket: bucketName, Key: key, }), ); console.log( `${key} in ${bucketName} will be retained until ${Retention.RetainUntilDate}`, ); } catch (caught) { if ( caught instanceof S3ServiceException && caught.name === "NoSuchObjectLockConfiguration" ) { console.warn( `The object "${key}" in the bucket "${bucketName}" does not have an ObjectLock configuration.`, ); } else if (caught instanceof S3ServiceException) { console.error( `Error from S3 while getting object retention settings for "${bucketName}". ${caught.name}: ${caught.message}`, ); } else { throw caught; } } }; // Call function if run directly import { parseArgs } from "node:util"; import { isMain, validateArgs, } from "@aws-doc-sdk-examples/lib/utils/util-node.js"; const loadArgs = () => { const options = { bucketName: { type: "string", required: true, }, key: { type: "string", required: true, }, }; const results = parseArgs({ options }); const { errors } = validateArgs({ options }, results); return { errors, results }; }; if (isMain(import.meta.url)) { const { errors, results } = loadArgs(); if (!errors) { main(results.values); } else { console.error(errors.join("\n")); } }
  • For API details, see GetObjectRetention in the AWS SDK for JavaScript API Reference.

The following code example shows how to use ListBuckets.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

List the buckets.

import { paginateListBuckets, S3Client, S3ServiceException, } from "@aws-sdk/client-s3"; /** * List the Amazon S3 buckets in your account. */ export const main = async () => { const client = new S3Client({}); /** @type {?import('@aws-sdk/client-s3').Owner} */ let Owner = null; /** @type {import('@aws-sdk/client-s3').Bucket[]} */ const Buckets = []; try { const paginator = paginateListBuckets({ client }, {}); for await (const page of paginator) { if (!Owner) { Owner = page.Owner; } Buckets.push(...page.Buckets); } console.log( `${Owner.DisplayName} owns ${Buckets.length} bucket${ Buckets.length === 1 ? "" : "s" }:`, ); console.log(`${Buckets.map((b) => ` • ${b.Name}`).join("\n")}`); } catch (caught) { if (caught instanceof S3ServiceException) { console.error( `Error from S3 while listing buckets. ${caught.name}: ${caught.message}`, ); } else { throw caught; } } };

The following code example shows how to use ListObjectsV2.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

List all of the objects in your bucket. If there is more than one object, IsTruncated and NextContinuationToken are used to iterate over the full list.

import { S3Client, S3ServiceException, // This command supersedes the ListObjectsCommand and is the recommended way to list objects. paginateListObjectsV2, } from "@aws-sdk/client-s3"; /** * Log all of the object keys in a bucket. * @param {{ bucketName: string, pageSize: string }} */ export const main = async ({ bucketName, pageSize }) => { const client = new S3Client({}); /** @type {string[][]} */ const objects = []; try { const paginator = paginateListObjectsV2( { client, /* Max items per page */ pageSize: Number.parseInt(pageSize) }, { Bucket: bucketName }, ); for await (const page of paginator) { objects.push(page.Contents.map((o) => o.Key)); } objects.forEach((objectList, pageNum) => { console.log( `Page ${pageNum + 1}\n------\n${objectList.map((o) => `• ${o}`).join("\n")}\n`, ); }); } catch (caught) { if ( caught instanceof S3ServiceException && caught.name === "NoSuchBucket" ) { console.error( `Error from S3 while listing objects for "${bucketName}". The bucket doesn't exist.`, ); } else if (caught instanceof S3ServiceException) { console.error( `Error from S3 while listing objects for "${bucketName}". ${caught.name}: ${caught.message}`, ); } else { throw caught; } } };
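
The example above relies on the paginator, which handles IsTruncated and NextContinuationToken for you. If you prefer to drive the pagination yourself, a sketch of the equivalent loop might look like this:

// Sketch: manual pagination over ListObjectsV2 using NextContinuationToken.
import { ListObjectsV2Command, S3Client } from "@aws-sdk/client-s3";

export const listAllKeys = async (bucketName) => {
  const client = new S3Client({});
  const keys = [];
  let continuationToken;
  let isTruncated = true;
  while (isTruncated) {
    const page = await client.send(
      new ListObjectsV2Command({
        Bucket: bucketName,
        ContinuationToken: continuationToken,
      }),
    );
    keys.push(...(page.Contents ?? []).map((o) => o.Key));
    isTruncated = page.IsTruncated;
    continuationToken = page.NextContinuationToken;
  }
  return keys;
};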

The following code example shows how to use PutBucketAcl.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

Put the bucket ACL.

import { PutBucketAclCommand, S3Client, S3ServiceException, } from "@aws-sdk/client-s3"; /** * Grant read access to a user using their canonical AWS account ID. * * Most Amazon S3 use cases don't require the use of access control lists (ACLs). * We recommend that you disable ACLs, except in unusual circumstances where * you need to control access for each object individually. Consider a policy instead. * For more information see https://docs.aws.amazon.com/AmazonS3/latest/userguide/bucket-policies.html. * @param {{ bucketName: string, granteeCanonicalUserId: string, ownerCanonicalUserId }} */ export const main = async ({ bucketName, granteeCanonicalUserId, ownerCanonicalUserId, }) => { const client = new S3Client({}); const command = new PutBucketAclCommand({ Bucket: bucketName, AccessControlPolicy: { Grants: [ { Grantee: { // The canonical ID of the user. This ID is an obfuscated form of your AWS account number. // It's unique to Amazon S3 and can't be found elsewhere. // For more information, see https://docs.aws.amazon.com/AmazonS3/latest/userguide/finding-canonical-user-id.html. ID: granteeCanonicalUserId, Type: "CanonicalUser", }, // One of FULL_CONTROL | READ | WRITE | READ_ACP | WRITE_ACP // https://docs.aws.amazon.com/AmazonS3/latest/API/API_Grant.html#AmazonS3-Type-Grant-Permission Permission: "READ", }, ], Owner: { ID: ownerCanonicalUserId, }, }, }); try { await client.send(command); console.log(`Granted READ access to ${bucketName}`); } catch (caught) { if ( caught instanceof S3ServiceException && caught.name === "NoSuchBucket" ) { console.error( `Error from S3 while setting ACL for bucket ${bucketName}. The bucket doesn't exist.`, ); } else if (caught instanceof S3ServiceException) { console.error( `Error from S3 while setting ACL for bucket ${bucketName}. ${caught.name}: ${caught.message}`, ); } else { throw caught; } } };

The following code example shows how to use PutBucketCors.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

Add a CORS rule.

import { PutBucketCorsCommand, S3Client, S3ServiceException, } from "@aws-sdk/client-s3"; /** * Allows cross-origin requests to an S3 bucket by setting the CORS configuration. * @param {{ bucketName: string }} */ export const main = async ({ bucketName }) => { const client = new S3Client({}); try { await client.send( new PutBucketCorsCommand({ Bucket: bucketName, CORSConfiguration: { CORSRules: [ { // Allow all headers to be sent to this bucket. AllowedHeaders: ["*"], // Allow only GET and PUT methods to be sent to this bucket. AllowedMethods: ["GET", "PUT"], // Allow only requests from the specified origin. AllowedOrigins: ["https://www.example.com"], // Allow the entity tag (ETag) header to be returned in the response. The ETag header // The entity tag represents a specific version of the object. The ETag reflects // changes only to the contents of an object, not its metadata. ExposeHeaders: ["ETag"], // How long the requesting browser should cache the preflight response. After // this time, the preflight request will have to be made again. MaxAgeSeconds: 3600, }, ], }, }), ); console.log(`Successfully set CORS rules for bucket: ${bucketName}`); } catch (caught) { if ( caught instanceof S3ServiceException && caught.name === "NoSuchBucket" ) { console.error( `Error from S3 while setting CORS rules for ${bucketName}. The bucket doesn't exist.`, ); } else if (caught instanceof S3ServiceException) { console.error( `Error from S3 while setting CORS rules for ${bucketName}. ${caught.name}: ${caught.message}`, ); } else { throw caught; } } };

The following code example shows how to use PutBucketPolicy.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

Add the policy.

import { PutBucketPolicyCommand, S3Client, S3ServiceException, } from "@aws-sdk/client-s3"; /** * Grant an IAM role GetObject access to all of the objects * in the provided bucket. * @param {{ bucketName: string, iamRoleArn: string }} */ export const main = async ({ bucketName, iamRoleArn }) => { const client = new S3Client({}); const command = new PutBucketPolicyCommand({ // This is a resource-based policy. For more information on resource-based policies, // see https://docs.aws.amazon.com/IAM/latest/UserGuide/access_policies.html#policies_resource-based. Policy: JSON.stringify({ Version: "2012-10-17", Statement: [ { Effect: "Allow", Principal: { AWS: iamRoleArn, }, Action: "s3:GetObject", Resource: `arn:aws:s3:::${bucketName}/*`, }, ], }), // Apply the preceding policy to this bucket. Bucket: bucketName, }); try { await client.send(command); console.log( `GetObject access to the bucket "${bucketName}" was granted to the provided IAM role.`, ); } catch (caught) { if ( caught instanceof S3ServiceException && caught.name === "MalformedPolicy" ) { console.error( `Error from S3 while setting the bucket policy for the bucket "${bucketName}". The policy was malformed.`, ); } else if (caught instanceof S3ServiceException) { console.error( `Error from S3 while setting the bucket policy for the bucket "${bucketName}". ${caught.name}: ${caught.message}`, ); } else { throw caught; } } };

The following code example shows how to use PutBucketWebsite.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

Set the website configuration.

import { PutBucketWebsiteCommand, S3Client, S3ServiceException, } from "@aws-sdk/client-s3"; /** * Configure an Amazon S3 bucket to serve a static website. * Website access must also be granted separately. For more information * on setting the permissions for website access, see * https://docs.aws.amazon.com/AmazonS3/latest/userguide/WebsiteAccessPermissionsReqd.html. * * @param {{ bucketName: string }} */ export const main = async ({ bucketName }) => { const client = new S3Client({}); const command = new PutBucketWebsiteCommand({ Bucket: bucketName, WebsiteConfiguration: { ErrorDocument: { // The object key name to use when a 4XX class error occurs. Key: "error.html", }, IndexDocument: { // A suffix that is appended to a request when the request is // for a directory. Suffix: "index.html", }, }, }); try { await client.send(command); console.log( `The bucket "${bucketName}" has been configured as a static website.`, ); } catch (caught) { if ( caught instanceof S3ServiceException && caught.name === "NoSuchBucket" ) { console.error( `Error from S3 while configuring the bucket "${bucketName}" as a static website. The bucket doesn't exist.`, ); } else if (caught instanceof S3ServiceException) { console.error( `Error from S3 while configuring the bucket "${bucketName}" as a static website. ${caught.name}: ${caught.message}`, ); } else { throw caught; } } };

The following code example shows how to use PutObject.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

Upload the object.

import { readFile } from "node:fs/promises"; import { PutObjectCommand, S3Client, S3ServiceException, } from "@aws-sdk/client-s3"; /** * Upload a file to an S3 bucket. * @param {{ bucketName: string, key: string, filePath: string }} */ export const main = async ({ bucketName, key, filePath }) => { const client = new S3Client({}); const command = new PutObjectCommand({ Bucket: bucketName, Key: key, Body: await readFile(filePath), }); try { const response = await client.send(command); console.log(response); } catch (caught) { if ( caught instanceof S3ServiceException && caught.name === "EntityTooLarge" ) { console.error( `Error from S3 while uploading object to ${bucketName}. \ The object was too large. To upload objects larger than 5GB, use the S3 console (160GB max) \ or the multipart upload API (5TB max).`, ); } else if (caught instanceof S3ServiceException) { console.error( `Error from S3 while uploading object to ${bucketName}. ${caught.name}: ${caught.message}`, ); } else { throw caught; } } };

The following code example shows how to use PutObjectLegalHold.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

import { PutObjectLegalHoldCommand, S3Client, S3ServiceException, } from "@aws-sdk/client-s3"; /** * Apply a legal hold configuration to the specified object. * @param {{ bucketName: string, objectKey: string, legalHoldStatus: "ON" | "OFF" }} */ export const main = async ({ bucketName, objectKey, legalHoldStatus }) => { if (!["OFF", "ON"].includes(legalHoldStatus.toUpperCase())) { throw new Error( "Invalid parameter. legalHoldStatus must be 'ON' or 'OFF'.", ); } const client = new S3Client({}); const command = new PutObjectLegalHoldCommand({ Bucket: bucketName, Key: objectKey, LegalHold: { // Set the status to 'ON' to place a legal hold on the object. // Set the status to 'OFF' to remove the legal hold. Status: legalHoldStatus, }, }); try { await client.send(command); console.log( `Legal hold status set to "${legalHoldStatus}" for "${objectKey}" in "${bucketName}"`, ); } catch (caught) { if ( caught instanceof S3ServiceException && caught.name === "NoSuchBucket" ) { console.error( `Error from S3 while modifying legal hold status for "${objectKey}" in "${bucketName}". The bucket doesn't exist.`, ); } else if (caught instanceof S3ServiceException) { console.error( `Error from S3 while modifying legal hold status for "${objectKey}" in "${bucketName}". ${caught.name}: ${caught.message}`, ); } else { throw caught; } } }; // Call function if run directly import { parseArgs } from "node:util"; import { isMain, validateArgs, } from "@aws-doc-sdk-examples/lib/utils/util-node.js"; const loadArgs = () => { const options = { bucketName: { type: "string", required: true, }, objectKey: { type: "string", required: true, }, legalHoldStatus: { type: "string", default: "ON", }, }; const results = parseArgs({ options }); const { errors } = validateArgs({ options }, results); return { errors, results }; }; if (isMain(import.meta.url)) { const { errors, results } = loadArgs(); if (!errors) { main(results.values); } else { console.error(errors.join("\n")); } }
  • For API details, see PutObjectLegalHold in the AWS SDK for JavaScript API Reference.

The following code example shows how to use PutObjectLockConfiguration.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

Set the object lock configuration of a bucket.

import { PutObjectLockConfigurationCommand, S3Client, S3ServiceException, } from "@aws-sdk/client-s3"; /** * Enable S3 Object Lock for an Amazon S3 bucket. * After you enable Object Lock on a bucket, you can't * disable Object Lock or suspend versioning for that bucket. * @param {{ bucketName: string, enabled: boolean }} */ export const main = async ({ bucketName }) => { const client = new S3Client({}); const command = new PutObjectLockConfigurationCommand({ Bucket: bucketName, // The Object Lock configuration that you want to apply to the specified bucket. ObjectLockConfiguration: { ObjectLockEnabled: "Enabled", }, }); try { await client.send(command); console.log(`Object Lock for "${bucketName}" enabled.`); } catch (caught) { if ( caught instanceof S3ServiceException && caught.name === "NoSuchBucket" ) { console.error( `Error from S3 while modifying the object lock configuration for the bucket "${bucketName}". The bucket doesn't exist.`, ); } else if (caught instanceof S3ServiceException) { console.error( `Error from S3 while modifying the object lock configuration for the bucket "${bucketName}". ${caught.name}: ${caught.message}`, ); } else { throw caught; } } }; // Call function if run directly import { parseArgs } from "node:util"; import { isMain, validateArgs, } from "@aws-doc-sdk-examples/lib/utils/util-node.js"; const loadArgs = () => { const options = { bucketName: { type: "string", required: true, }, }; const results = parseArgs({ options }); const { errors } = validateArgs({ options }, results); return { errors, results }; }; if (isMain(import.meta.url)) { const { errors, results } = loadArgs(); if (!errors) { main(results.values); } else { console.error(errors.join("\n")); } }

Set the default retention period of a bucket.

import { PutObjectLockConfigurationCommand, S3Client, S3ServiceException, } from "@aws-sdk/client-s3"; /** * Change the default retention settings for an object in an Amazon S3 bucket. * @param {{ bucketName: string, retentionDays: string }} */ export const main = async ({ bucketName, retentionDays }) => { const client = new S3Client({}); try { await client.send( new PutObjectLockConfigurationCommand({ Bucket: bucketName, // The Object Lock configuration that you want to apply to the specified bucket. ObjectLockConfiguration: { ObjectLockEnabled: "Enabled", Rule: { // The default Object Lock retention mode and period that you want to apply // to new objects placed in the specified bucket. Bucket settings require // both a mode and a period. The period can be either Days or Years but // you must select one. DefaultRetention: { // In governance mode, users can't overwrite or delete an object version // or alter its lock settings unless they have special permissions. With // governance mode, you protect objects against being deleted by most users, // but you can still grant some users permission to alter the retention settings // or delete the objects if necessary. Mode: "GOVERNANCE", Days: Number.parseInt(retentionDays), }, }, }, }), ); console.log( `Set default retention mode to "GOVERNANCE" with a retention period of ${retentionDays} day(s).`, ); } catch (caught) { if ( caught instanceof S3ServiceException && caught.name === "NoSuchBucket" ) { console.error( `Error from S3 while setting the default object retention for a bucket. The bucket doesn't exist.`, ); } else if (caught instanceof S3ServiceException) { console.error( `Error from S3 while setting the default object retention for a bucket. ${caught.name}: ${caught.message}`, ); } else { throw caught; } } }; // Call function if run directly import { parseArgs } from "node:util"; import { isMain, validateArgs, } from "@aws-doc-sdk-examples/lib/utils/util-node.js"; const loadArgs = () => { const options = { bucketName: { type: "string", required: true, }, retentionDays: { type: "string", required: true, }, }; const results = parseArgs({ options }); const { errors } = validateArgs({ options }, results); return { errors, results }; }; if (isMain(import.meta.url)) { const { errors, results } = loadArgs(); if (!errors) { main(results.values); } else { console.error(errors.join("\n")); } }

The following code example shows how to use PutObjectRetention.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

import { PutObjectRetentionCommand, S3Client, S3ServiceException, } from "@aws-sdk/client-s3"; /** * Place a 24-hour retention period on an object in an Amazon S3 bucket. * @param {{ bucketName: string, key: string }} */ export const main = async ({ bucketName, key }) => { const client = new S3Client({}); const command = new PutObjectRetentionCommand({ Bucket: bucketName, Key: key, BypassGovernanceRetention: false, Retention: { // In governance mode, users can't overwrite or delete an object version // or alter its lock settings unless they have special permissions. With // governance mode, you protect objects against being deleted by most users, // but you can still grant some users permission to alter the retention settings // or delete the objects if necessary. Mode: "GOVERNANCE", RetainUntilDate: new Date(new Date().getTime() + 24 * 60 * 60 * 1000), }, }); try { await client.send(command); console.log("Object Retention settings updated."); } catch (caught) { if ( caught instanceof S3ServiceException && caught.name === "NoSuchBucket" ) { console.error( `Error from S3 while modifying the governance mode and retention period on an object. The bucket doesn't exist.`, ); } else if (caught instanceof S3ServiceException) { console.error( `Error from S3 while modifying the governance mode and retention period on an object. ${caught.name}: ${caught.message}`, ); } else { throw caught; } } }; // Call function if run directly import { parseArgs } from "node:util"; import { isMain, validateArgs, } from "@aws-doc-sdk-examples/lib/utils/util-node.js"; const loadArgs = () => { const options = { bucketName: { type: "string", required: true, }, key: { type: "string", required: true, }, }; const results = parseArgs({ options }); const { errors } = validateArgs({ options }, results); return { errors, results }; }; if (isMain(import.meta.url)) { const { errors, results } = loadArgs(); if (!errors) { main(results.values); } else { console.error(errors.join("\n")); } }
  • For API details, see PutObjectRetention in the AWS SDK for JavaScript API Reference.

Scenarios

The following code example shows how to create a presigned URL for Amazon S3 and upload an object.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

Create a presigned URL to upload an object to a bucket.

import https from "node:https"; import { XMLParser } from "fast-xml-parser"; import { PutObjectCommand, S3Client } from "@aws-sdk/client-s3"; import { fromIni } from "@aws-sdk/credential-providers"; import { HttpRequest } from "@smithy/protocol-http"; import { getSignedUrl, S3RequestPresigner, } from "@aws-sdk/s3-request-presigner"; import { parseUrl } from "@smithy/url-parser"; import { formatUrl } from "@aws-sdk/util-format-url"; import { Hash } from "@smithy/hash-node"; const createPresignedUrlWithoutClient = async ({ region, bucket, key }) => { const url = parseUrl(`https://${bucket}.s3.${region}.amazonaws.com/${key}`); const presigner = new S3RequestPresigner({ credentials: fromIni(), region, sha256: Hash.bind(null, "sha256"), }); const signedUrlObject = await presigner.presign( new HttpRequest({ ...url, method: "PUT" }), ); return formatUrl(signedUrlObject); }; const createPresignedUrlWithClient = ({ region, bucket, key }) => { const client = new S3Client({ region }); const command = new PutObjectCommand({ Bucket: bucket, Key: key }); return getSignedUrl(client, command, { expiresIn: 3600 }); }; /** * Make a PUT request to the provided URL. * * @param {string} url * @param {string} data */ const put = (url, data) => { return new Promise((resolve, reject) => { const req = https.request( url, { method: "PUT", headers: { "Content-Length": new Blob([data]).size } }, (res) => { let responseBody = ""; res.on("data", (chunk) => { responseBody += chunk; }); res.on("end", () => { const parser = new XMLParser(); if (res.statusCode >= 200 && res.statusCode <= 299) { resolve(parser.parse(responseBody, true)); } else { reject(parser.parse(responseBody, true)); } }); }, ); req.on("error", (err) => { reject(err); }); req.write(data); req.end(); }); }; /** * Create two presigned urls for uploading an object to an S3 bucket. * The first presigned URL is created with credentials from the shared INI file * in the current environment. The second presigned URL is created using an * existing S3Client instance that has already been provided with credentials. * @param {{ bucketName: string, key: string, region: string }} */ export const main = async ({ bucketName, key, region }) => { try { const noClientUrl = await createPresignedUrlWithoutClient({ bucket: bucketName, key, region, }); const clientUrl = await createPresignedUrlWithClient({ bucket: bucketName, region, key, }); // After you get the presigned URL, you can provide your own file // data. Refer to put() above. console.log("Calling PUT using presigned URL without client"); await put(noClientUrl, "Hello World"); console.log("Calling PUT using presigned URL with client"); await put(clientUrl, "Hello World"); console.log("\nDone. Check your S3 console."); } catch (caught) { if (caught instanceof Error && caught.name === "CredentialsProviderError") { console.error( `There was an error getting your credentials. Are your local credentials configured?\n${caught.name}: ${caught.message}`, ); } else { throw caught; } } };

Create a presigned URL to download an object from a bucket.

import { GetObjectCommand, S3Client } from "@aws-sdk/client-s3"; import { fromIni } from "@aws-sdk/credential-providers"; import { HttpRequest } from "@smithy/protocol-http"; import { getSignedUrl, S3RequestPresigner, } from "@aws-sdk/s3-request-presigner"; import { parseUrl } from "@smithy/url-parser"; import { formatUrl } from "@aws-sdk/util-format-url"; import { Hash } from "@smithy/hash-node"; const createPresignedUrlWithoutClient = async ({ region, bucket, key }) => { const url = parseUrl(`https://${bucket}.s3.${region}.amazonaws.com/${key}`); const presigner = new S3RequestPresigner({ credentials: fromIni(), region, sha256: Hash.bind(null, "sha256"), }); const signedUrlObject = await presigner.presign(new HttpRequest(url)); return formatUrl(signedUrlObject); }; const createPresignedUrlWithClient = ({ region, bucket, key }) => { const client = new S3Client({ region }); const command = new GetObjectCommand({ Bucket: bucket, Key: key }); return getSignedUrl(client, command, { expiresIn: 3600 }); }; /** * Create two presigned urls for downloading an object from an S3 bucket. * The first presigned URL is created with credentials from the shared INI file * in the current environment. The second presigned URL is created using an * existing S3Client instance that has already been provided with credentials. * @param {{ bucketName: string, key: string, region: string }} */ export const main = async ({ bucketName, key, region }) => { try { const noClientUrl = await createPresignedUrlWithoutClient({ bucket: bucketName, region, key, }); const clientUrl = await createPresignedUrlWithClient({ bucket: bucketName, region, key, }); console.log("Presigned URL without client"); console.log(noClientUrl); console.log("\n"); console.log("Presigned URL with client"); console.log(clientUrl); } catch (caught) { if (caught instanceof Error && caught.name === "CredentialsProviderError") { console.error( `There was an error getting your credentials. Are your local credentials configured?\n${caught.name}: ${caught.message}`, ); } else { throw caught; } } };
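
Once you have a presigned GET URL, any HTTP client can download the object until the URL expires. A short sketch using the global fetch available in Node.js 18 and later:

// Sketch: download an object using a presigned GET URL (Node.js 18+).
const downloadWithPresignedUrl = async (presignedUrl) => {
  const response = await fetch(presignedUrl);
  if (!response.ok) {
    throw new Error(`Download failed: ${response.status} ${response.statusText}`);
  }
  return response.text(); // Use response.arrayBuffer() for binary objects.
};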

The following code example shows how to create a serverless application that lets users manage photos using labels.

SDK for JavaScript (v3)

Shows how to develop a photo asset management application that detects labels in images using Amazon Rekognition and stores them for later retrieval.

For complete source code and instructions on how to set up and run, see the full example on GitHub.

For a deep dive into the origin of this example, see the post on AWS Community.

Services used in this example
  • API Gateway

  • DynamoDB

  • Lambda

  • Amazon Rekognition

  • Amazon S3

  • Amazon SNS

The following code example shows how to list Amazon S3 objects in a webpage.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

The following code is the relevant React component that makes calls to the AWS SDK. A runnable version of the application containing this component can be found at the preceding GitHub link.

import { useEffect, useState } from "react"; import { ListObjectsCommand, type ListObjectsCommandOutput, S3Client, } from "@aws-sdk/client-s3"; import { fromCognitoIdentityPool } from "@aws-sdk/credential-providers"; import "./App.css"; function App() { const [objects, setObjects] = useState< Required<ListObjectsCommandOutput>["Contents"] >([]); useEffect(() => { const client = new S3Client({ region: "us-east-1", // Unless you have a public bucket, you'll need access to a private bucket. // One way to do this is to create an Amazon Cognito identity pool, attach a role to the pool, // and grant the role access to the 's3:GetObject' action. // // You'll also need to configure the CORS settings on the bucket to allow traffic from // this example site. Here's an example configuration that allows all origins. Don't // do this in production. //[ // { // "AllowedHeaders": ["*"], // "AllowedMethods": ["GET"], // "AllowedOrigins": ["*"], // "ExposeHeaders": [], // }, //] // credentials: fromCognitoIdentityPool({ clientConfig: { region: "us-east-1" }, identityPoolId: "<YOUR_IDENTITY_POOL_ID>", }), }); const command = new ListObjectsCommand({ Bucket: "bucket-name" }); client.send(command).then(({ Contents }) => setObjects(Contents || [])); }, []); return ( <div className="App"> {objects.map((o) => ( <div key={o.ETag}>{o.Key}</div> ))} </div> ); } export default App;
  • For API details, see ListObjects in the AWS SDK for JavaScript API Reference.

The following code example shows how to explore Amazon Textract output through an interactive application.

SDK for JavaScript (v3)

Shows how to use the AWS SDK for JavaScript to build a React application that uses Amazon Textract to extract data from a document image and display it in an interactive web page. This example runs in a web browser and requires an authenticated Amazon Cognito identity for its credentials. It uses Amazon Simple Storage Service (Amazon S3) for storage, and for notifications it polls an Amazon Simple Queue Service (Amazon SQS) queue that is subscribed to an Amazon Simple Notification Service (Amazon SNS) topic.

For complete source code and instructions on how to set up and run, see the full example on GitHub.

Services used in this example
  • Amazon Cognito Identity

  • Amazon S3

  • Amazon SNS

  • Amazon SQS

  • Amazon Textract

The following code example shows how to delete all of the objects in an Amazon S3 bucket.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

Delete all of the objects in the specified Amazon S3 bucket.

import { DeleteObjectsCommand, paginateListObjectsV2, S3Client, } from "@aws-sdk/client-s3"; /** * * @param {{ bucketName: string }} config */ export const main = async ({ bucketName }) => { const client = new S3Client({}); try { console.log(`Deleting all objects in bucket: ${bucketName}`); const paginator = paginateListObjectsV2( { client }, { Bucket: bucketName, }, ); const objectKeys = []; for await (const { Contents } of paginator) { objectKeys.push(...Contents.map((obj) => ({ Key: obj.Key }))); } const deleteCommand = new DeleteObjectsCommand({ Bucket: bucketName, Delete: { Objects: objectKeys }, }); await client.send(deleteCommand); console.log(`All objects deleted from bucket: ${bucketName}`); } catch (caught) { if (caught instanceof Error) { console.error( `Failed to empty ${bucketName}. ${caught.name}: ${caught.message}`, ); } } }; // Call function if run directly. import { fileURLToPath } from "node:url"; import { parseArgs } from "node:util"; if (process.argv[1] === fileURLToPath(import.meta.url)) { const options = { bucketName: { type: "string", }, }; const { values } = parseArgs({ options }); main(values); }
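
Note that a single DeleteObjects request accepts at most 1,000 keys. For buckets with more objects than that, one option is to delete each page returned by the paginator as you go instead of collecting every key first. A sketch of that variation, intended for the inside of the try block above:

// Sketch: delete objects one page (up to 1,000 keys) at a time.
for await (const { Contents } of paginateListObjectsV2(
  { client },
  { Bucket: bucketName },
)) {
  if (!Contents?.length) continue;
  await client.send(
    new DeleteObjectsCommand({
      Bucket: bucketName,
      Delete: { Objects: Contents.map(({ Key }) => ({ Key })) },
    }),
  );
}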

The following code example shows how to build an app that uses Amazon Rekognition to detect objects by category in images.

SDK for JavaScript (v3)

Shows how to use Amazon Rekognition with the AWS SDK for JavaScript to create an app that identifies objects by category in images located in an Amazon Simple Storage Service (Amazon S3) bucket. The app sends the admin an email notification with the results using Amazon Simple Email Service (Amazon SES). A minimal sketch of the central Rekognition call appears after the service list below.

Learn how to:

  • Create an unauthenticated user using Amazon Cognito.

  • Analyze images for objects using Amazon Rekognition.

  • Verify an email address for Amazon SES.

  • Send an email notification using Amazon SES.

For complete source code and instructions on how to set up and run, see the full example on GitHub.

Services used in this example
  • Amazon Rekognition

  • Amazon S3

  • Amazon SES
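
As referenced above, the following is a minimal, standalone sketch of the central call: detecting labels for an image stored in an S3 bucket with Amazon Rekognition. The bucket and key names are placeholders, and this is not an excerpt from the full application.

// Sketch: detect labels for an image stored in an S3 bucket.
import {
  RekognitionClient,
  DetectLabelsCommand,
} from "@aws-sdk/client-rekognition";

const detectLabels = async ({ bucketName, key }) => {
  const client = new RekognitionClient({});
  const { Labels } = await client.send(
    new DetectLabelsCommand({
      Image: { S3Object: { Bucket: bucketName, Name: key } },
      MaxLabels: 10,
    }),
  );
  for (const label of Labels ?? []) {
    console.log(`${label.Name}: ${label.Confidence?.toFixed(1)}%`);
  }
};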

The following code example shows how to work with S3 Object Lock features.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

index.js - Entry point for the workflow. This orchestrates all of the steps. Visit GitHub to see the implementation details for Scenario, ScenarioInput, ScenarioOutput, and ScenarioAction.

import * as Scenarios from "@aws-doc-sdk-examples/lib/scenario/index.js"; import { exitOnFalse, loadState, saveState, } from "@aws-doc-sdk-examples/lib/scenario/steps-common.js"; import { welcome, welcomeContinue } from "./welcome.steps.js"; import { confirmCreateBuckets, confirmPopulateBuckets, confirmSetLegalHoldFileEnabled, confirmSetLegalHoldFileRetention, confirmSetRetentionPeriodFileEnabled, confirmSetRetentionPeriodFileRetention, confirmUpdateLockPolicy, confirmUpdateRetention, createBuckets, createBucketsAction, getBucketPrefix, populateBuckets, populateBucketsAction, setLegalHoldFileEnabledAction, setLegalHoldFileRetentionAction, setRetentionPeriodFileEnabledAction, setRetentionPeriodFileRetentionAction, updateLockPolicy, updateLockPolicyAction, updateRetention, updateRetentionAction, } from "./setup.steps.js"; /** * @param {Scenarios} scenarios * @param {Record<string, any>} initialState */ export const getWorkflowStages = (scenarios, initialState = {}) => { const client = new S3Client({}); return { deploy: new scenarios.Scenario( "S3 Object Locking - Deploy", [ welcome(scenarios), welcomeContinue(scenarios), exitOnFalse(scenarios, "welcomeContinue"), getBucketPrefix(scenarios), createBuckets(scenarios), confirmCreateBuckets(scenarios), exitOnFalse(scenarios, "confirmCreateBuckets"), createBucketsAction(scenarios, client), updateRetention(scenarios), confirmUpdateRetention(scenarios), exitOnFalse(scenarios, "confirmUpdateRetention"), updateRetentionAction(scenarios, client), populateBuckets(scenarios), confirmPopulateBuckets(scenarios), exitOnFalse(scenarios, "confirmPopulateBuckets"), populateBucketsAction(scenarios, client), updateLockPolicy(scenarios), confirmUpdateLockPolicy(scenarios), exitOnFalse(scenarios, "confirmUpdateLockPolicy"), updateLockPolicyAction(scenarios, client), confirmSetLegalHoldFileEnabled(scenarios), setLegalHoldFileEnabledAction(scenarios, client), confirmSetRetentionPeriodFileEnabled(scenarios), setRetentionPeriodFileEnabledAction(scenarios, client), confirmSetLegalHoldFileRetention(scenarios), setLegalHoldFileRetentionAction(scenarios, client), confirmSetRetentionPeriodFileRetention(scenarios), setRetentionPeriodFileRetentionAction(scenarios, client), saveState, ], initialState, ), demo: new scenarios.Scenario( "S3 Object Locking - Demo", [loadState, replAction(scenarios, client)], initialState, ), clean: new scenarios.Scenario( "S3 Object Locking - Destroy", [ loadState, confirmCleanup(scenarios), exitOnFalse(scenarios, "confirmCleanup"), cleanupAction(scenarios, client), ], initialState, ), }; }; // Call function if run directly import { fileURLToPath } from "node:url"; import { S3Client } from "@aws-sdk/client-s3"; import { cleanupAction, confirmCleanup } from "./clean.steps.js"; import { replAction } from "./repl.steps.js"; if (process.argv[1] === fileURLToPath(import.meta.url)) { const objectLockingScenarios = getWorkflowStages(Scenarios); Scenarios.parseScenarioArgs(objectLockingScenarios, { name: "Amazon S3 object locking workflow", description: "Work with Amazon Simple Storage Service (Amazon S3) object locking features.", synopsis: "node index.js --scenario <deploy | demo | clean> [-h|--help] [-y|--yes] [-v|--verbose]", }); }

welcome.steps.js - Output welcome messages to the console.

/** * @typedef {import("@aws-doc-sdk-examples/lib/scenario/index.js")} Scenarios */ /** * @param {Scenarios} scenarios */ const welcome = (scenarios) => new scenarios.ScenarioOutput( "welcome", "Welcome to the Amazon Simple Storage Service (S3) Object Locking Workflow Scenario. For this workflow, we will use the AWS SDK for JavaScript to create several S3 buckets and files to demonstrate working with S3 locking features.", { header: true }, ); /** * @param {Scenarios} scenarios */ const welcomeContinue = (scenarios) => new scenarios.ScenarioInput( "welcomeContinue", "Press Enter when you are ready to start.", { type: "confirm" }, ); export { welcome, welcomeContinue };

setup.steps.js - Deploy the buckets, objects, and file settings.

import { BucketVersioningStatus, ChecksumAlgorithm, CreateBucketCommand, MFADeleteStatus, PutBucketVersioningCommand, PutObjectCommand, PutObjectLockConfigurationCommand, PutObjectLegalHoldCommand, PutObjectRetentionCommand, ObjectLockLegalHoldStatus, ObjectLockRetentionMode, GetBucketVersioningCommand, BucketAlreadyExists, BucketAlreadyOwnedByYou, S3ServiceException, waitUntilBucketExists, } from "@aws-sdk/client-s3"; import { retry } from "@aws-doc-sdk-examples/lib/utils/util-timers.js"; /** * @typedef {import("@aws-doc-sdk-examples/lib/scenario/index.js")} Scenarios */ /** * @typedef {import("@aws-sdk/client-s3").S3Client} S3Client */ /** * @param {Scenarios} scenarios */ const getBucketPrefix = (scenarios) => new scenarios.ScenarioInput( "bucketPrefix", "Provide a prefix that will be used for bucket creation.", { type: "input", default: "amzn-s3-demo-bucket" }, ); /** * @param {Scenarios} scenarios */ const createBuckets = (scenarios) => new scenarios.ScenarioOutput( "createBuckets", (state) => `The following buckets will be created: ${state.bucketPrefix}-no-lock with object lock False. ${state.bucketPrefix}-lock-enabled with object lock True. ${state.bucketPrefix}-retention-after-creation with object lock False.`, { preformatted: true }, ); /** * @param {Scenarios} scenarios */ const confirmCreateBuckets = (scenarios) => new scenarios.ScenarioInput("confirmCreateBuckets", "Create the buckets?", { type: "confirm", }); /** * @param {Scenarios} scenarios * @param {S3Client} client */ const createBucketsAction = (scenarios, client) => new scenarios.ScenarioAction("createBucketsAction", async (state) => { const noLockBucketName = `${state.bucketPrefix}-no-lock`; const lockEnabledBucketName = `${state.bucketPrefix}-lock-enabled`; const retentionBucketName = `${state.bucketPrefix}-retention-after-creation`; try { await client.send(new CreateBucketCommand({ Bucket: noLockBucketName })); await waitUntilBucketExists({ client }, { Bucket: noLockBucketName }); await client.send( new CreateBucketCommand({ Bucket: lockEnabledBucketName, ObjectLockEnabledForBucket: true, }), ); await waitUntilBucketExists( { client }, { Bucket: lockEnabledBucketName }, ); await client.send( new CreateBucketCommand({ Bucket: retentionBucketName }), ); await waitUntilBucketExists({ client }, { Bucket: retentionBucketName }); state.noLockBucketName = noLockBucketName; state.lockEnabledBucketName = lockEnabledBucketName; state.retentionBucketName = retentionBucketName; } catch (caught) { if ( caught instanceof BucketAlreadyExists || caught instanceof BucketAlreadyOwnedByYou ) { console.error(`${caught.name}: ${caught.message}`); state.earlyExit = true; } else { throw caught; } } }); /** * @param {Scenarios} scenarios */ const populateBuckets = (scenarios) => new scenarios.ScenarioOutput( "populateBuckets", (state) => `The following test files will be created: file0.txt in ${state.bucketPrefix}-no-lock. file1.txt in ${state.bucketPrefix}-no-lock. file0.txt in ${state.bucketPrefix}-lock-enabled. file1.txt in ${state.bucketPrefix}-lock-enabled. file0.txt in ${state.bucketPrefix}-retention-after-creation. 
file1.txt in ${state.bucketPrefix}-retention-after-creation.`, { preformatted: true }, ); /** * @param {Scenarios} scenarios */ const confirmPopulateBuckets = (scenarios) => new scenarios.ScenarioInput( "confirmPopulateBuckets", "Populate the buckets?", { type: "confirm" }, ); /** * @param {Scenarios} scenarios * @param {S3Client} client */ const populateBucketsAction = (scenarios, client) => new scenarios.ScenarioAction("populateBucketsAction", async (state) => { try { await client.send( new PutObjectCommand({ Bucket: state.noLockBucketName, Key: "file0.txt", Body: "Content", ChecksumAlgorithm: ChecksumAlgorithm.SHA256, }), ); await client.send( new PutObjectCommand({ Bucket: state.noLockBucketName, Key: "file1.txt", Body: "Content", ChecksumAlgorithm: ChecksumAlgorithm.SHA256, }), ); await client.send( new PutObjectCommand({ Bucket: state.lockEnabledBucketName, Key: "file0.txt", Body: "Content", ChecksumAlgorithm: ChecksumAlgorithm.SHA256, }), ); await client.send( new PutObjectCommand({ Bucket: state.lockEnabledBucketName, Key: "file1.txt", Body: "Content", ChecksumAlgorithm: ChecksumAlgorithm.SHA256, }), ); await client.send( new PutObjectCommand({ Bucket: state.retentionBucketName, Key: "file0.txt", Body: "Content", ChecksumAlgorithm: ChecksumAlgorithm.SHA256, }), ); await client.send( new PutObjectCommand({ Bucket: state.retentionBucketName, Key: "file1.txt", Body: "Content", ChecksumAlgorithm: ChecksumAlgorithm.SHA256, }), ); } catch (caught) { if (caught instanceof S3ServiceException) { console.error( `Error from S3 while uploading object. ${caught.name}: ${caught.message}`, ); } else { throw caught; } } }); /** * @param {Scenarios} scenarios */ const updateRetention = (scenarios) => new scenarios.ScenarioOutput( "updateRetention", (state) => `A bucket can be configured to use object locking with a default retention period. A default retention period will be configured for ${state.bucketPrefix}-retention-after-creation.`, { preformatted: true }, ); /** * @param {Scenarios} scenarios */ const confirmUpdateRetention = (scenarios) => new scenarios.ScenarioInput( "confirmUpdateRetention", "Configure default retention period?", { type: "confirm" }, ); /** * @param {Scenarios} scenarios * @param {S3Client} client */ const updateRetentionAction = (scenarios, client) => new scenarios.ScenarioAction("updateRetentionAction", async (state) => { await client.send( new PutBucketVersioningCommand({ Bucket: state.retentionBucketName, VersioningConfiguration: { MFADelete: MFADeleteStatus.Disabled, Status: BucketVersioningStatus.Enabled, }, }), ); const getBucketVersioning = new GetBucketVersioningCommand({ Bucket: state.retentionBucketName, }); await retry({ intervalInMs: 500, maxRetries: 10 }, async () => { const { Status } = await client.send(getBucketVersioning); if (Status !== "Enabled") { throw new Error("Bucket versioning is not enabled."); } }); await client.send( new PutObjectLockConfigurationCommand({ Bucket: state.retentionBucketName, ObjectLockConfiguration: { ObjectLockEnabled: "Enabled", Rule: { DefaultRetention: { Mode: "GOVERNANCE", Years: 1, }, }, }, }), ); }); /** * @param {Scenarios} scenarios */ const updateLockPolicy = (scenarios) => new scenarios.ScenarioOutput( "updateLockPolicy", (state) => `Object lock policies can also be added to existing buckets. 
An object lock policy will be added to ${state.bucketPrefix}-lock-enabled.`, { preformatted: true }, ); /** * @param {Scenarios} scenarios */ const confirmUpdateLockPolicy = (scenarios) => new scenarios.ScenarioInput( "confirmUpdateLockPolicy", "Add object lock policy?", { type: "confirm" }, ); /** * @param {Scenarios} scenarios * @param {S3Client} client */ const updateLockPolicyAction = (scenarios, client) => new scenarios.ScenarioAction("updateLockPolicyAction", async (state) => { await client.send( new PutObjectLockConfigurationCommand({ Bucket: state.lockEnabledBucketName, ObjectLockConfiguration: { ObjectLockEnabled: "Enabled", }, }), ); }); /** * @param {Scenarios} scenarios * @param {S3Client} client */ const confirmSetLegalHoldFileEnabled = (scenarios) => new scenarios.ScenarioInput( "confirmSetLegalHoldFileEnabled", (state) => `Would you like to add a legal hold to file0.txt in ${state.lockEnabledBucketName}?`, { type: "confirm", }, ); /** * @param {Scenarios} scenarios * @param {S3Client} client */ const setLegalHoldFileEnabledAction = (scenarios, client) => new scenarios.ScenarioAction( "setLegalHoldFileEnabledAction", async (state) => { await client.send( new PutObjectLegalHoldCommand({ Bucket: state.lockEnabledBucketName, Key: "file0.txt", LegalHold: { Status: ObjectLockLegalHoldStatus.ON, }, }), ); console.log( `Modified legal hold for file0.txt in ${state.lockEnabledBucketName}.`, ); }, { skipWhen: (state) => !state.confirmSetLegalHoldFileEnabled }, ); /** * @param {Scenarios} scenarios * @param {S3Client} client */ const confirmSetRetentionPeriodFileEnabled = (scenarios) => new scenarios.ScenarioInput( "confirmSetRetentionPeriodFileEnabled", (state) => `Would you like to add a 1 day Governance retention period to file1.txt in ${state.lockEnabledBucketName}? 
Reminder: Only a user with the s3:BypassGovernanceRetention permission will be able to delete this file or its bucket until the retention period has expired.`, { type: "confirm", }, ); /** * @param {Scenarios} scenarios * @param {S3Client} client */ const setRetentionPeriodFileEnabledAction = (scenarios, client) => new scenarios.ScenarioAction( "setRetentionPeriodFileEnabledAction", async (state) => { const retentionDate = new Date(); retentionDate.setDate(retentionDate.getDate() + 1); await client.send( new PutObjectRetentionCommand({ Bucket: state.lockEnabledBucketName, Key: "file1.txt", Retention: { Mode: ObjectLockRetentionMode.GOVERNANCE, RetainUntilDate: retentionDate, }, }), ); console.log( `Set retention for file1.txt in ${state.lockEnabledBucketName} until ${retentionDate.toISOString().split("T")[0]}.`, ); }, { skipWhen: (state) => !state.confirmSetRetentionPeriodFileEnabled }, ); /** * @param {Scenarios} scenarios * @param {S3Client} client */ const confirmSetLegalHoldFileRetention = (scenarios) => new scenarios.ScenarioInput( "confirmSetLegalHoldFileRetention", (state) => `Would you like to add a legal hold to file0.txt in ${state.retentionBucketName}?`, { type: "confirm", }, ); /** * @param {Scenarios} scenarios * @param {S3Client} client */ const setLegalHoldFileRetentionAction = (scenarios, client) => new scenarios.ScenarioAction( "setLegalHoldFileRetentionAction", async (state) => { await client.send( new PutObjectLegalHoldCommand({ Bucket: state.retentionBucketName, Key: "file0.txt", LegalHold: { Status: ObjectLockLegalHoldStatus.ON, }, }), ); console.log( `Modified legal hold for file0.txt in ${state.retentionBucketName}.`, ); }, { skipWhen: (state) => !state.confirmSetLegalHoldFileRetention }, ); /** * @param {Scenarios} scenarios */ const confirmSetRetentionPeriodFileRetention = (scenarios) => new scenarios.ScenarioInput( "confirmSetRetentionPeriodFileRetention", (state) => `Would you like to add a 1 day Governance retention period to file1.txt in ${state.retentionBucketName}? Reminder: Only a user with the s3:BypassGovernanceRetention permission will be able to delete this file or its bucket until the retention period has expired.`, { type: "confirm", }, ); /** * @param {Scenarios} scenarios * @param {S3Client} client */ const setRetentionPeriodFileRetentionAction = (scenarios, client) => new scenarios.ScenarioAction( "setRetentionPeriodFileRetentionAction", async (state) => { const retentionDate = new Date(); retentionDate.setDate(retentionDate.getDate() + 1); await client.send( new PutObjectRetentionCommand({ Bucket: state.retentionBucketName, Key: "file1.txt", Retention: { Mode: ObjectLockRetentionMode.GOVERNANCE, RetainUntilDate: retentionDate, }, BypassGovernanceRetention: true, }), ); console.log( `Set retention for file1.txt in ${state.retentionBucketName} until ${retentionDate.toISOString().split("T")[0]}.`, ); }, { skipWhen: (state) => !state.confirmSetRetentionPeriodFileRetention }, ); export { getBucketPrefix, createBuckets, confirmCreateBuckets, createBucketsAction, populateBuckets, confirmPopulateBuckets, populateBucketsAction, updateRetention, confirmUpdateRetention, updateRetentionAction, updateLockPolicy, confirmUpdateLockPolicy, updateLockPolicyAction, confirmSetLegalHoldFileEnabled, setLegalHoldFileEnabledAction, confirmSetRetentionPeriodFileEnabled, setRetentionPeriodFileEnabledAction, confirmSetLegalHoldFileRetention, setLegalHoldFileRetentionAction, confirmSetRetentionPeriodFileRetention, setRetentionPeriodFileRetentionAction, };

repl.steps.js - View and delete files in the buckets.

import { ChecksumAlgorithm, DeleteObjectCommand, GetObjectLegalHoldCommand, GetObjectLockConfigurationCommand, GetObjectRetentionCommand, ListObjectVersionsCommand, PutObjectCommand, } from "@aws-sdk/client-s3"; /** * @typedef {import("@aws-doc-sdk-examples/lib/scenario/index.js")} Scenarios */ /** * @typedef {import("@aws-sdk/client-s3").S3Client} S3Client */ const choices = { EXIT: 0, LIST_ALL_FILES: 1, DELETE_FILE: 2, DELETE_FILE_WITH_RETENTION: 3, OVERWRITE_FILE: 4, VIEW_RETENTION_SETTINGS: 5, VIEW_LEGAL_HOLD_SETTINGS: 6, }; /** * @param {Scenarios} scenarios */ const replInput = (scenarios) => new scenarios.ScenarioInput( "replChoice", "Explore the S3 locking features by selecting one of the following choices", { type: "select", choices: [ { name: "List all files in buckets", value: choices.LIST_ALL_FILES }, { name: "Attempt to delete a file.", value: choices.DELETE_FILE }, { name: "Attempt to delete a file with retention period bypass.", value: choices.DELETE_FILE_WITH_RETENTION, }, { name: "Attempt to overwrite a file.", value: choices.OVERWRITE_FILE }, { name: "View the object and bucket retention settings for a file.", value: choices.VIEW_RETENTION_SETTINGS, }, { name: "View the legal hold settings for a file.", value: choices.VIEW_LEGAL_HOLD_SETTINGS, }, { name: "Finish the workflow.", value: choices.EXIT }, ], }, ); /** * @param {S3Client} client * @param {string[]} buckets */ const getAllFiles = async (client, buckets) => { /** @type {{bucket: string, key: string, version: string}[]} */ const files = []; for (const bucket of buckets) { const objectsResponse = await client.send( new ListObjectVersionsCommand({ Bucket: bucket }), ); for (const version of objectsResponse.Versions || []) { const { Key, VersionId } = version; files.push({ bucket, key: Key, version: VersionId }); } } return files; }; /** * @param {Scenarios} scenarios * @param {S3Client} client */ const replAction = (scenarios, client) => new scenarios.ScenarioAction( "replAction", async (state) => { const files = await getAllFiles(client, [ state.noLockBucketName, state.lockEnabledBucketName, state.retentionBucketName, ]); const fileInput = new scenarios.ScenarioInput( "selectedFile", "Select a file:", { type: "select", choices: files.map((file, index) => ({ name: `${index + 1}: ${file.bucket}: ${file.key} (version: ${ file.version })`, value: index, })), }, ); const { replChoice } = state; switch (replChoice) { case choices.LIST_ALL_FILES: { const files = await getAllFiles(client, [ state.noLockBucketName, state.lockEnabledBucketName, state.retentionBucketName, ]); state.replOutput = files .map( (file) => `${file.bucket}: ${file.key} (version: ${file.version})`, ) .join("\n"); break; } case choices.DELETE_FILE: { /** @type {number} */ const fileToDelete = await fileInput.handle(state); const selectedFile = files[fileToDelete]; try { await client.send( new DeleteObjectCommand({ Bucket: selectedFile.bucket, Key: selectedFile.key, VersionId: selectedFile.version, }), ); state.replOutput = `Deleted ${selectedFile.key} in ${selectedFile.bucket}.`; } catch (err) { state.replOutput = `Unable to delete object ${selectedFile.key} in bucket ${selectedFile.bucket}: ${err.message}`; } break; } case choices.DELETE_FILE_WITH_RETENTION: { /** @type {number} */ const fileToDelete = await fileInput.handle(state); const selectedFile = files[fileToDelete]; try { await client.send( new DeleteObjectCommand({ Bucket: selectedFile.bucket, Key: selectedFile.key, VersionId: selectedFile.version, BypassGovernanceRetention: true, }), ); 
state.replOutput = `Deleted ${selectedFile.key} in ${selectedFile.bucket}.`; } catch (err) { state.replOutput = `Unable to delete object ${selectedFile.key} in bucket ${selectedFile.bucket}: ${err.message}`; } break; } case choices.OVERWRITE_FILE: { /** @type {number} */ const fileToOverwrite = await fileInput.handle(state); const selectedFile = files[fileToOverwrite]; try { await client.send( new PutObjectCommand({ Bucket: selectedFile.bucket, Key: selectedFile.key, Body: "New content", ChecksumAlgorithm: ChecksumAlgorithm.SHA256, }), ); state.replOutput = `Overwrote ${selectedFile.key} in ${selectedFile.bucket}.`; } catch (err) { state.replOutput = `Unable to overwrite object ${selectedFile.key} in bucket ${selectedFile.bucket}: ${err.message}`; } break; } case choices.VIEW_RETENTION_SETTINGS: { /** @type {number} */ const fileToView = await fileInput.handle(state); const selectedFile = files[fileToView]; try { const retention = await client.send( new GetObjectRetentionCommand({ Bucket: selectedFile.bucket, Key: selectedFile.key, VersionId: selectedFile.version, }), ); const bucketConfig = await client.send( new GetObjectLockConfigurationCommand({ Bucket: selectedFile.bucket, }), ); state.replOutput = `Object retention for ${selectedFile.key} in ${selectedFile.bucket}: ${retention.Retention?.Mode} until ${retention.Retention?.RetainUntilDate?.toISOString()}. Bucket object lock config for ${selectedFile.bucket} in ${selectedFile.bucket}: Enabled: ${bucketConfig.ObjectLockConfiguration?.ObjectLockEnabled} Rule: ${JSON.stringify(bucketConfig.ObjectLockConfiguration?.Rule?.DefaultRetention)}`; } catch (err) { state.replOutput = `Unable to fetch object lock retention: '${err.message}'`; } break; } case choices.VIEW_LEGAL_HOLD_SETTINGS: { /** @type {number} */ const fileToView = await fileInput.handle(state); const selectedFile = files[fileToView]; try { const legalHold = await client.send( new GetObjectLegalHoldCommand({ Bucket: selectedFile.bucket, Key: selectedFile.key, VersionId: selectedFile.version, }), ); state.replOutput = `Object legal hold for ${selectedFile.key} in ${selectedFile.bucket}: Status: ${legalHold.LegalHold?.Status}`; } catch (err) { state.replOutput = `Unable to fetch legal hold: '${err.message}'`; } break; } default: throw new Error(`Invalid replChoice: ${replChoice}`); } }, { whileConfig: { whileFn: ({ replChoice }) => replChoice !== choices.EXIT, input: replInput(scenarios), output: new scenarios.ScenarioOutput( "REPL output", (state) => state.replOutput, { preformatted: true }, ), }, }, ); export { replInput, replAction, choices };

clean.steps.js - Destroy all of the created resources.

import { DeleteObjectCommand, DeleteBucketCommand, ListObjectVersionsCommand, GetObjectLegalHoldCommand, GetObjectRetentionCommand, PutObjectLegalHoldCommand, } from "@aws-sdk/client-s3"; /** * @typedef {import("@aws-doc-sdk-examples/lib/scenario/index.js")} Scenarios */ /** * @typedef {import("@aws-sdk/client-s3").S3Client} S3Client */ /** * @param {Scenarios} scenarios */ const confirmCleanup = (scenarios) => new scenarios.ScenarioInput("confirmCleanup", "Clean up resources?", { type: "confirm", }); /** * @param {Scenarios} scenarios * @param {S3Client} client */ const cleanupAction = (scenarios, client) => new scenarios.ScenarioAction("cleanupAction", async (state) => { const { noLockBucketName, lockEnabledBucketName, retentionBucketName } = state; const buckets = [ noLockBucketName, lockEnabledBucketName, retentionBucketName, ]; for (const bucket of buckets) { /** @type {import("@aws-sdk/client-s3").ListObjectVersionsCommandOutput} */ let objectsResponse; try { objectsResponse = await client.send( new ListObjectVersionsCommand({ Bucket: bucket, }), ); } catch (e) { if (e instanceof Error && e.name === "NoSuchBucket") { console.log("Object's bucket has already been deleted."); continue; } throw e; } for (const version of objectsResponse.Versions || []) { const { Key, VersionId } = version; try { const legalHold = await client.send( new GetObjectLegalHoldCommand({ Bucket: bucket, Key, VersionId, }), ); if (legalHold.LegalHold?.Status === "ON") { await client.send( new PutObjectLegalHoldCommand({ Bucket: bucket, Key, VersionId, LegalHold: { Status: "OFF", }, }), ); } } catch (err) { console.log( `Unable to fetch legal hold for ${Key} in ${bucket}: '${err.message}'`, ); } try { const retention = await client.send( new GetObjectRetentionCommand({ Bucket: bucket, Key, VersionId, }), ); if (retention.Retention?.Mode === "GOVERNANCE") { await client.send( new DeleteObjectCommand({ Bucket: bucket, Key, VersionId, BypassGovernanceRetention: true, }), ); } } catch (err) { console.log( `Unable to fetch object lock retention for ${Key} in ${bucket}: '${err.message}'`, ); } await client.send( new DeleteObjectCommand({ Bucket: bucket, Key, VersionId, }), ); } await client.send(new DeleteBucketCommand({ Bucket: bucket })); console.log(`Delete for ${bucket} complete.`); } }); export { confirmCleanup, cleanupAction };

The following code example shows how to upload or download large files to and from Amazon S3.

For more information, see Uploading an object using multipart upload.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

Upload a large file.

import { S3Client } from "@aws-sdk/client-s3";
import { Upload } from "@aws-sdk/lib-storage";
import {
  ProgressBar,
  logger,
} from "@aws-doc-sdk-examples/lib/utils/util-log.js";

const twentyFiveMB = 25 * 1024 * 1024;

export const createString = (size = twentyFiveMB) => {
  return "x".repeat(size);
};

/**
 * Create a 25MB file and upload it in parts to the specified
 * Amazon S3 bucket.
 * @param {{ bucketName: string, key: string }}
 */
export const main = async ({ bucketName, key }) => {
  const str = createString();
  const buffer = Buffer.from(str, "utf8");
  const progressBar = new ProgressBar({
    description: `Uploading "${key}" to "${bucketName}"`,
    barLength: 30,
  });

  try {
    const upload = new Upload({
      client: new S3Client({}),
      params: {
        Bucket: bucketName,
        Key: key,
        Body: buffer,
      },
    });

    upload.on("httpUploadProgress", ({ loaded, total }) => {
      progressBar.update({ current: loaded, total });
    });

    await upload.done();
  } catch (caught) {
    if (caught instanceof Error && caught.name === "AbortError") {
      logger.error(`Multipart upload was aborted. ${caught.message}`);
    } else {
      throw caught;
    }
  }
};
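The Upload helper from @aws-sdk/lib-storage splits the buffer into parts and uploads them for you, and it also accepts optional settings such as partSize and queueSize for tuning the multipart transfer. The following is a minimal, hypothetical invocation of the main function exported above; the module path, bucket name, and key are placeholders rather than resources created by the example.

// Hypothetical usage of the upload example above.
// "./upload-large-file.js" and the bucket/key values are placeholders.
import { main } from "./upload-large-file.js";

await main({
  bucketName: "amzn-s3-demo-bucket",
  key: "large-object.txt",
});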

Download a large file.

import { GetObjectCommand, NoSuchKey, S3Client } from "@aws-sdk/client-s3";
import { createWriteStream, rmSync } from "node:fs";
// Needed to resolve the local file path passed to createWriteStream below.
import { fileURLToPath } from "node:url";

const s3Client = new S3Client({});
const oneMB = 1024 * 1024;

export const getObjectRange = ({ bucket, key, start, end }) => {
  const command = new GetObjectCommand({
    Bucket: bucket,
    Key: key,
    Range: `bytes=${start}-${end}`,
  });

  return s3Client.send(command);
};

/**
 * @param {string | undefined} contentRange
 */
export const getRangeAndLength = (contentRange) => {
  const [range, length] = contentRange.split("/");
  const [start, end] = range.split("-");
  return {
    start: Number.parseInt(start),
    end: Number.parseInt(end),
    length: Number.parseInt(length),
  };
};

export const isComplete = ({ end, length }) => end === length - 1;

const downloadInChunks = async ({ bucket, key }) => {
  const writeStream = createWriteStream(
    fileURLToPath(new URL(`./${key}`, import.meta.url)),
  ).on("error", (err) => console.error(err));

  let rangeAndLength = { start: -1, end: -1, length: -1 };

  while (!isComplete(rangeAndLength)) {
    const { end } = rangeAndLength;
    const nextRange = { start: end + 1, end: end + oneMB };

    const { ContentRange, Body } = await getObjectRange({
      bucket,
      key,
      ...nextRange,
    });
    console.log(`Downloaded bytes ${nextRange.start} to ${nextRange.end}`);
    writeStream.write(await Body.transformToByteArray());
    rangeAndLength = getRangeAndLength(ContentRange);
  }
};

/**
 * Download a large object from an Amazon S3 bucket.
 *
 * When downloading a large file, you might want to break it down into
 * smaller pieces. Amazon S3 accepts a Range header to specify the start
 * and end of the byte range to be downloaded.
 *
 * @param {{ bucketName: string, key: string }}
 */
export const main = async ({ bucketName, key }) => {
  try {
    await downloadInChunks({
      bucket: bucketName,
      key: key,
    });
  } catch (caught) {
    if (caught instanceof NoSuchKey) {
      console.error(`Failed to download object. No such key "${key}".`);
      rmSync(key);
    }
  }
};
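Each ranged GetObject response carries a ContentRange value of the form bytes <start>-<end>/<total>; getRangeAndLength extracts the end offset and total length so that isComplete can detect when the final chunk has been written. A minimal, hypothetical invocation of the exported main function looks like the following; the module path, bucket name, and key are placeholders.

// Hypothetical usage of the chunked download example above.
// "./download-large-file.js" and the bucket/key values are placeholders.
import { main } from "./download-large-file.js";

// Downloads the object in 1 MB ranges and writes it next to this module,
// as implemented by downloadInChunks above.
await main({
  bucketName: "amzn-s3-demo-bucket",
  key: "large-object.txt",
});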

Serverless examples

The following code example shows how to implement a Lambda function that receives an event triggered by uploading an object to an S3 bucket. The function retrieves the S3 bucket name and object key from the event parameter and calls the Amazon S3 API to retrieve and log the content type of the object.

SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the Serverless examples repository.

Consuming an S3 event with Lambda using JavaScript.

import { S3Client, HeadObjectCommand } from "@aws-sdk/client-s3";

const client = new S3Client();

export const handler = async (event, context) => {
  // Get the object from the event and show its content type
  const bucket = event.Records[0].s3.bucket.name;
  const key = decodeURIComponent(event.Records[0].s3.object.key.replace(/\+/g, ' '));
  try {
    const { ContentType } = await client.send(new HeadObjectCommand({
      Bucket: bucket,
      Key: key,
    }));
    console.log('CONTENT TYPE:', ContentType);
    return ContentType;
  } catch (err) {
    console.log(err);
    const message = `Error getting object ${key} from bucket ${bucket}. Make sure they exist and your bucket is in the same region as this function.`;
    console.log(message);
    throw new Error(message);
  }
};
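The handler only reads the bucket name and object key from the first record in the event, so it can be exercised locally with a minimal mock event. The snippet below is a hypothetical test harness; the module path, bucket name, and key are placeholders, and a real S3 event notification contains many more fields.

// Hypothetical local test of the handler above with a minimal mock S3 event.
// Only the fields the handler actually reads are included.
import { handler } from "./index.mjs"; // placeholder path for the handler module

const mockEvent = {
  Records: [
    {
      s3: {
        bucket: { name: "amzn-s3-demo-bucket" },
        object: { key: "sample-object.txt" },
      },
    },
  ],
};

// Logs and returns the object's content type if the object exists.
handler(mockEvent, {}).then((contentType) => console.log(contentType));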

Consuming an S3 event with Lambda using TypeScript.

// Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
// SPDX-License-Identifier: Apache-2.0
import { S3Event } from 'aws-lambda';
import { S3Client, HeadObjectCommand } from '@aws-sdk/client-s3';

const s3 = new S3Client({ region: process.env.AWS_REGION });

export const handler = async (event: S3Event): Promise<string | undefined> => {
  // Get the object from the event and show its content type
  const bucket = event.Records[0].s3.bucket.name;
  const key = decodeURIComponent(event.Records[0].s3.object.key.replace(/\+/g, ' '));
  const params = {
    Bucket: bucket,
    Key: key,
  };
  try {
    const { ContentType } = await s3.send(new HeadObjectCommand(params));
    console.log('CONTENT TYPE:', ContentType);
    return ContentType;
  } catch (err) {
    console.log(err);
    const message = `Error getting object ${key} from bucket ${bucket}. Make sure they exist and your bucket is in the same region as this function.`;
    console.log(message);
    throw new Error(message);
  }
};