This page is only for existing customers of the S3 Glacier service using Vaults and the original REST API from 2012.
If you're looking for archival storage solutions, we suggest using the S3 Glacier storage classes in Amazon S3: S3 Glacier Instant Retrieval, S3 Glacier Flexible Retrieval, and S3 Glacier Deep Archive. To learn more about these storage options, see S3 Glacier storage classes.
Use UploadArchive with an AWS SDK or CLI

The following code examples show how to use UploadArchive.
Action examples are code excerpts from larger programs and must be run in context. You can see this action in context in the following code example:
- .NET
- AWS SDK for .NET

Note
There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

/// <summary>
/// Upload an archive to an Amazon S3 Glacier vault.
/// </summary>
/// <param name="vaultName">The name of the Amazon S3 Glacier vault to upload
/// the archive to.</param>
/// <param name="archiveFilePath">The file path of the archive to upload to the vault.</param>
/// <returns>The ID of the uploaded archive, or an empty string if the upload fails.</returns>
public async Task<string> UploadArchiveWithArchiveManager(string vaultName, string archiveFilePath)
{
    try
    {
        var manager = new ArchiveTransferManager(_glacierService);

        // Upload an archive.
        var response = await manager.UploadAsync(vaultName, "upload archive test", archiveFilePath);
        return response.ArchiveId;
    }
    catch (AmazonGlacierException ex)
    {
        Console.WriteLine(ex.Message);
        return string.Empty;
    }
}
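The method above depends on a _glacierService client field defined elsewhere in the wrapper class on GitHub. The following is a minimal usage sketch, assuming a default AmazonGlacierClient and illustrative vault and file names that are not part of the original example:

using System;
using Amazon.Glacier;
using Amazon.Glacier.Transfer;

// Illustrative stand-in for the wrapper's _glacierService field.
var glacierService = new AmazonGlacierClient();
var manager = new ArchiveTransferManager(glacierService);

// "example-vault" and "archive.zip" are placeholder values.
var response = await manager.UploadAsync("example-vault", "upload archive test", "archive.zip");
Console.WriteLine($"Archive ID: {response.ArchiveId}");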
For API details, see UploadArchive in AWS SDK for .NET API Reference.
- CLI
- AWS CLI
The following command uploads an archive in the current folder named archive.zip to a vault named my-vault:

aws glacier upload-archive --account-id - --vault-name my-vault --body archive.zip
Output:
{ "archiveId": "kKB7ymWJVpPSwhGP6ycSOAekp9ZYe_--zM_mw6k76ZFGEIWQX-ybtRDvc2VkPSDtfKmQrj0IRQLSGsNuDp-AJVlu2ccmDSyDUmZwKbwbpAdGATGDiB3hHO0bjbGehXTcApVud_wyDw", "checksum": "969fb39823836d81f0cc028195fcdbcbbe76cdde932d4646fa7de5f21e18aa67", "location": "/0123456789012/vaults/my-vault/archives/kKB7ymWJVpPSwhGP6ycSOAekp9ZYe_--zM_mw6k76ZFGEIWQX-ybtRDvc2VkPSDtfKmQrj0IRQLSGsNuDp-AJVlu2ccmDSyDUmZwKbwbpAdGATGDiB3hHO0bjbGehXTcApVud_wyDw" }
Amazon Glacier requires an account ID argument when performing operations, but you can use a hyphen to specify the in-use account.
To retrieve an uploaded archive, initiate a retrieval job with the aws glacier initiate-job command.
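For example, a retrieval job for the archive uploaded above might be started as follows; the vault name and the account-id hyphen match the upload command, and <archive-id> stands for the archive ID returned in the output (this command is a sketch, not part of the original example):

aws glacier initiate-job --account-id - --vault-name my-vault --job-parameters '{"Type": "archive-retrieval", "ArchiveId": "<archive-id>"}'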
For API details, see UploadArchive in AWS CLI Command Reference.
- Java
- SDK for Java 2.x

Note
There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.glacier.GlacierClient;
import software.amazon.awssdk.services.glacier.model.UploadArchiveRequest;
import software.amazon.awssdk.services.glacier.model.UploadArchiveResponse;
import software.amazon.awssdk.services.glacier.model.GlacierException;
import java.io.File;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.io.FileInputStream;
import java.io.IOException;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

/**
 * Before running this Java V2 code example, set up your development
 * environment, including your credentials.
 *
 * For more information, see the following documentation topic:
 *
 * https://docs.aws.amazon.com/sdk-for-java/latest/developer-guide/get-started.html
 */
public class UploadArchive {
    static final int ONE_MB = 1024 * 1024;

    public static void main(String[] args) {
        final String usage = """
                Usage: <strPath> <vaultName>\s

                Where:
                   strPath - The path to the archive to upload (for example, C:\\AWS\\test.pdf).
                   vaultName - The name of the vault.
                """;

        if (args.length != 2) {
            System.out.println(usage);
            System.exit(1);
        }

        String strPath = args[0];
        String vaultName = args[1];
        File myFile = new File(strPath);
        Path path = Paths.get(strPath);

        GlacierClient glacier = GlacierClient.builder()
                .region(Region.US_EAST_1)
                .build();

        String archiveId = uploadContent(glacier, path, vaultName, myFile);
        System.out.println("The ID of the archived item is " + archiveId);
        glacier.close();
    }

    public static String uploadContent(GlacierClient glacier, Path path, String vaultName, File myFile) {
        // Get an SHA-256 tree hash value.
        String checkVal = computeSHA256(myFile);
        try {
            UploadArchiveRequest uploadRequest = UploadArchiveRequest.builder()
                    .vaultName(vaultName)
                    .checksum(checkVal)
                    .build();

            UploadArchiveResponse res = glacier.uploadArchive(uploadRequest, path);
            return res.archiveId();

        } catch (GlacierException e) {
            System.err.println(e.awsErrorDetails().errorMessage());
            System.exit(1);
        }
        return "";
    }

    private static String computeSHA256(File inputFile) {
        try {
            byte[] treeHash = computeSHA256TreeHash(inputFile);
            System.out.printf("SHA-256 tree hash = %s\n", toHex(treeHash));
            return toHex(treeHash);

        } catch (IOException ioe) {
            System.err.format("Exception when reading from file %s: %s", inputFile, ioe.getMessage());
            System.exit(-1);

        } catch (NoSuchAlgorithmException nsae) {
            System.err.format("Cannot locate MessageDigest algorithm for SHA-256: %s", nsae.getMessage());
            System.exit(-1);
        }
        return "";
    }

    public static byte[] computeSHA256TreeHash(File inputFile) throws IOException, NoSuchAlgorithmException {
        byte[][] chunkSHA256Hashes = getChunkSHA256Hashes(inputFile);
        return computeSHA256TreeHash(chunkSHA256Hashes);
    }

    /**
     * Computes an SHA256 checksum for each 1 MB chunk of the input file. This
     * includes the checksum for the last chunk, even if it's smaller than 1 MB.
     */
    public static byte[][] getChunkSHA256Hashes(File file) throws IOException, NoSuchAlgorithmException {
        MessageDigest md = MessageDigest.getInstance("SHA-256");
        long numChunks = file.length() / ONE_MB;
        if (file.length() % ONE_MB > 0) {
            numChunks++;
        }

        if (numChunks == 0) {
            return new byte[][] { md.digest() };
        }

        byte[][] chunkSHA256Hashes = new byte[(int) numChunks][];
        FileInputStream fileStream = null;
        try {
            fileStream = new FileInputStream(file);
            byte[] buff = new byte[ONE_MB];
            int bytesRead;
            int idx = 0;

            while ((bytesRead = fileStream.read(buff, 0, ONE_MB)) > 0) {
                md.reset();
                md.update(buff, 0, bytesRead);
                chunkSHA256Hashes[idx++] = md.digest();
            }
            return chunkSHA256Hashes;

        } finally {
            if (fileStream != null) {
                try {
                    fileStream.close();
                } catch (IOException ioe) {
                    System.err.printf("Exception while closing %s.\n %s", file.getName(), ioe.getMessage());
                }
            }
        }
    }

    /**
     * Computes the SHA-256 tree hash for the passed array of 1 MB chunk
     * checksums.
     */
    public static byte[] computeSHA256TreeHash(byte[][] chunkSHA256Hashes) throws NoSuchAlgorithmException {
        MessageDigest md = MessageDigest.getInstance("SHA-256");
        byte[][] prevLvlHashes = chunkSHA256Hashes;

        while (prevLvlHashes.length > 1) {
            int len = prevLvlHashes.length / 2;
            if (prevLvlHashes.length % 2 != 0) {
                len++;
            }

            byte[][] currLvlHashes = new byte[len][];
            int j = 0;
            for (int i = 0; i < prevLvlHashes.length; i = i + 2, j++) {
                // If there are at least two elements remaining.
                if (prevLvlHashes.length - i > 1) {
                    // Calculate a digest of the concatenated nodes.
                    md.reset();
                    md.update(prevLvlHashes[i]);
                    md.update(prevLvlHashes[i + 1]);
                    currLvlHashes[j] = md.digest();
                } else {
                    // Take care of the remaining odd chunk.
                    currLvlHashes[j] = prevLvlHashes[i];
                }
            }
            prevLvlHashes = currLvlHashes;
        }
        return prevLvlHashes[0];
    }

    /**
     * Returns the hexadecimal representation of the input byte array.
     */
    public static String toHex(byte[] data) {
        StringBuilder sb = new StringBuilder(data.length * 2);
        for (byte datum : data) {
            String hex = Integer.toHexString(datum & 0xFF);
            if (hex.length() == 1) {
                // Append leading zero.
                sb.append("0");
            }
            sb.append(hex);
        }
        return sb.toString().toLowerCase();
    }
}
For API details, see UploadArchive in AWS SDK for Java 2.x API Reference.
- JavaScript
- SDK for JavaScript (v3)

Note
There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

Create the client.

import { GlacierClient } from "@aws-sdk/client-glacier";
// Set the AWS Region.
const REGION = "REGION";
// Create the Amazon S3 Glacier service client object.
const glacierClient = new GlacierClient({ region: REGION });
export { glacierClient };
Upload the archive.
// Import the Glacier command and the client created above.
import { UploadArchiveCommand } from "@aws-sdk/client-glacier";
import { glacierClient } from "./libs/glacierClient.js";

// Set the parameters.
const vaultname = "VAULT_NAME"; // VAULT_NAME

// Create a 2.5 MB buffer to upload as the archive body.
const buffer = Buffer.alloc(2.5 * 1024 * 1024);
const params = { vaultName: vaultname, body: buffer };

const run = async () => {
  try {
    const data = await glacierClient.send(new UploadArchiveCommand(params));
    console.log("Archive ID", data.archiveId);
    return data; // For unit tests.
  } catch (err) {
    console.log("Error uploading archive!", err);
  }
};
run();
For more information, see AWS SDK for JavaScript Developer Guide.

For API details, see UploadArchive in AWS SDK for JavaScript API Reference.
- SDK for JavaScript (v2)
Note
There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

// Load the SDK for JavaScript.
var AWS = require("aws-sdk");
// Set the Region.
AWS.config.update({ region: "REGION" });

// Create a new service object and buffer.
var glacier = new AWS.Glacier({ apiVersion: "2012-06-01" });
var buffer = Buffer.alloc(2.5 * 1024 * 1024); // 2.5MB buffer

var params = { vaultName: "YOUR_VAULT_NAME", body: buffer };

// Call Glacier to upload the archive.
glacier.uploadArchive(params, function (err, data) {
  if (err) {
    console.log("Error uploading archive!", err);
  } else {
    console.log("Archive ID", data.archiveId);
  }
});
For more information, see AWS SDK for JavaScript Developer Guide.

For API details, see UploadArchive in AWS SDK for JavaScript API Reference.
- PowerShell
- Tools for PowerShell
Example 1: Uploads a single file to the specified vault, returning the archive ID and computed checksum.
Write-GLCArchive -VaultName myvault -FilePath c:\temp\blue.bin
Output:
FilePath         ArchiveId               Checksum
--------         ---------               --------
C:\temp\blue.bin o9O9jUUs...TTX-TpIhQJw  79f3e...f4395b
Example 2: Uploads the contents of a folder hierarchy to the specified vault in the user's account. For each file uploaded the cmdlet emits the filename, corresponding archive ID and the computed checksum of the archive.
Write-GLCArchive -VaultName myvault -FolderPath . -Recurse
Output:
FilePath                  ArchiveId               Checksum
--------                  ---------               --------
C:\temp\blue.bin          o9O9jUUs...TTX-TpIhQJw  79f3e...f4395b
C:\temp\green.bin         qXAfOdSG...czo729UHXrw  d50a1...9184b9
C:\temp\lum.bin           39aNifP3...q9nb8nZkFIg  28886...5c3e27
C:\temp\red.bin           vp7E6rU_...Ejk_HhjAxKA  e05f7...4e34f5
C:\temp\Folder1\file1.txt _eRINlip...5Sxy7dD2BaA  d0d2a...c8a3ba
C:\temp\Folder2\file2.iso -Ix3jlmu...iXiDh-XfOPA  7469e...3e86f1
For API details, see UploadArchive in AWS Tools for PowerShell Cmdlet Reference.
- Python
- SDK for Python (Boto3)

Note
There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

class GlacierWrapper:
    """Encapsulates Amazon S3 Glacier API operations."""

    def __init__(self, glacier_resource):
        """
        :param glacier_resource: A Boto3 Amazon S3 Glacier resource.
        """
        self.glacier_resource = glacier_resource

    @staticmethod
    def upload_archive(vault, archive_description, archive_file):
        """
        Uploads an archive to a vault.

        :param vault: The vault where the archive is put.
        :param archive_description: A description of the archive.
        :param archive_file: The archive file to put in the vault.
        :return: The uploaded archive.
        """
        try:
            archive = vault.upload_archive(
                archiveDescription=archive_description, body=archive_file
            )
            logger.info(
                "Uploaded %s with ID %s to vault %s.",
                archive_description,
                archive.id,
                vault.name,
            )
        except ClientError:
            logger.exception(
                "Couldn't upload %s to %s.", archive_description, vault.name
            )
            raise
        else:
            return archive
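The wrapper relies on a module-level logger and the botocore ClientError import from the complete example on GitHub. A minimal usage sketch, assuming an existing vault named my-vault and a local file archive.zip (both illustrative), might look like this:

import boto3

glacier_resource = boto3.resource("glacier")
wrapper = GlacierWrapper(glacier_resource)

# "-" refers to the account whose credentials are in use; "my-vault" is illustrative.
vault = glacier_resource.Vault("-", "my-vault")
with open("archive.zip", "rb") as archive_file:
    archive = wrapper.upload_archive(vault, "upload archive test", archive_file)
print(f"Uploaded archive with ID: {archive.id}")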
For API details, see UploadArchive in AWS SDK for Python (Boto3) API Reference.
For a complete list of AWS SDK developer guides and code examples, see Using S3 Glacier with an AWS SDK. This topic also includes information about getting started and details about previous SDK versions.