

# Using Amazon S3 in the AWS CLI
<a name="cli-services-s3"></a>



You can access the features of Amazon Simple Storage Service (Amazon S3) using the AWS Command Line Interface (AWS CLI). Amazon S3 is a highly scalable and durable object storage service. Amazon S3 is designed to provide virtually unlimited storage capacity, making it an ideal solution for a wide range of data storage and management needs.

Amazon S3 allows you to store and retrieve any amount of data, from small files to large datasets, in the form of objects. Each object is stored in a container called a bucket, which can be accessed and managed through the AWS Management Console or programmatically through the AWS SDKs, tools, and AWS CLI.

In addition to basic storage, Amazon S3 offers a range of features including lifecycle management, versioning, scalability, and security. These features integrate with other AWS services, enabling you to build cloud-based solutions that scale to your needs.

The AWS CLI provides two tiers of commands for accessing Amazon S3:
+ **s3** – Custom high-level commands made specifically for the AWS CLI that simplify performing common tasks, such as creating, manipulating, deleting, and syncing objects and buckets.
+ **s3api** – Exposes direct access to all Amazon S3 API operations, which enables you to carry out advanced operations.

**Topics**
+ [Using high-level (s3) commands in the AWS CLI](cli-services-s3-commands.md)
+ [Using API-Level (s3api) commands in the AWS CLI](cli-services-s3-apicommands.md)
+ [Scripting example for the Amazon S3 bucket lifecycle in the AWS CLI](cli-services-s3-lifecycle-example.md)

# Using high-level (s3) commands in the AWS CLI
<a name="cli-services-s3-commands"></a>

This topic describes some of the commands you can use to manage Amazon S3 buckets and objects using the [`aws s3`](https://docs.aws.amazon.com/cli/latest/reference/s3/index.html) commands in the AWS CLI. For commands not covered in this topic and additional command examples, see the [`aws s3` commands](https://docs.aws.amazon.com/cli/latest/reference/s3/index.html) in the *AWS CLI Reference*.

The high-level `aws s3` commands simplify managing Amazon S3 objects. These commands enable you to manage content both within Amazon S3 itself and between Amazon S3 and local directories.

**Topics**
+ [Prerequisites](#using-s3-commands-prereqs)
+ [Before you start](#using-s3-commands-before)
+ [Create a bucket](#using-s3-commands-managing-buckets-creating)
+ [List buckets and objects](#using-s3-commands-listing-buckets)
+ [Delete buckets](#using-s3-commands-delete-buckets)
+ [Delete objects](#using-s3-commands-delete-objects)
+ [Move objects](#using-s3-commands-managing-objects-move)
+ [Copy objects](#using-s3-commands-managing-objects-copy)
+ [Sync objects](#using-s3-commands-managing-objects-sync)
+ [Frequently used options for s3 commands](#using-s3-commands-managing-objects-param)
+ [Resources](#using-s3-commands-managing-buckets-references)

## Prerequisites
<a name="using-s3-commands-prereqs"></a>

To run the `s3` commands, you need to:
+ Install and configure the AWS CLI. For more information, see [Installing or updating to the latest version of the AWS CLI](getting-started-install.md) and [Authentication and access credentials for the AWS CLI](cli-chap-authentication.md).
+ Use a profile that has permissions that allow the AWS operations performed by the examples.
+ Understand these Amazon S3 terms:
  + **Bucket** – A top-level Amazon S3 folder.
  + **Prefix** – An Amazon S3 folder in a bucket.
  + **Object** – Any item that's hosted in an Amazon S3 bucket.

## Before you start
<a name="using-s3-commands-before"></a>

This section describes a few things to note before you use `aws s3` commands.

### Large object uploads
<a name="using-s3-commands-before-large"></a>

When you use `aws s3` commands to upload large objects to an Amazon S3 bucket, the AWS CLI automatically performs a multipart upload. You can't resume a failed upload when using these `aws s3` commands. 

If the multipart upload fails due to a timeout, or if you manually cancel it in the AWS CLI, the AWS CLI stops the upload and cleans up any files that were created. This process can take several minutes.

If the multipart upload or cleanup process is canceled by a kill command or system failure, the uploaded parts remain in the Amazon S3 bucket. To clean up the multipart upload, use the [s3api abort-multipart-upload](https://docs.aws.amazon.com/cli/latest/reference/s3api/abort-multipart-upload.html) command.
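If you need to clean up several leftover uploads at once, the following sketch shows one way to turn a `list-multipart-uploads` style response into `abort-multipart-upload` commands. The JSON here is a hand-made stand-in for real output from `aws s3api list-multipart-uploads --bucket amzn-s3-demo-bucket`; in practice you would pipe the real command's output instead.

```shell
# Hedged sketch: the JSON below stands in for real output from
#   aws s3api list-multipart-uploads --bucket amzn-s3-demo-bucket
# Each entry is turned into an abort-multipart-upload command.
cat <<'JSON' | python3 -c "
import json, sys
resp = json.load(sys.stdin)
for up in resp.get('Uploads', []):
    print('aws s3api abort-multipart-upload'
          ' --bucket amzn-s3-demo-bucket'
          ' --key ' + up['Key'] + ' --upload-id ' + up['UploadId'])
"
{"Uploads": [{"Key": "big-file.bin", "UploadId": "EXAMPLEUPLOADID"}]}
JSON
```

Review the generated commands before running them; aborting an upload permanently discards its uploaded parts.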

### File properties and tags in multipart copies
<a name="using-s3-commands-before-tags"></a>

When you use AWS CLI version 1 commands in the `aws s3` namespace to copy a file from one Amazon S3 bucket location to another, and that operation uses [multipart copy](https://docs.aws.amazon.com/AmazonS3/latest/userguide/CopyingObjctsMPUapi.html), no file properties from the source object are copied to the destination object.

By default, the AWS CLI version 2 commands in the `s3` namespace that perform multipart copies transfer all tags and the following set of properties from the source to the destination copy: `content-type`, `content-language`, `content-encoding`, `content-disposition`, `cache-control`, `expires`, and `metadata`.

This can result in additional AWS API calls to the Amazon S3 endpoint that would not have been made if you used AWS CLI version 1. These can include: `HeadObject`, `GetObjectTagging`, and `PutObjectTagging`.

If you need to change this default behavior in AWS CLI version 2 commands, use the `--copy-props` parameter to specify one of the following options:
+ **default** – The default value. Specifies that the copy includes all tags attached to the source object and the properties encompassed by the `--metadata-directive` parameter used for non-multipart copies: `content-type`, `content-language`, `content-encoding`, `content-disposition`, `cache-control`, `expires`, and `metadata`.
+ **metadata-directive** – Specifies that the copy includes only the properties that are encompassed by the `--metadata-directive` parameter used for non-multipart copies. It doesn't copy any tags.
+ **none** – Specifies that the copy includes none of the properties from the source object.

## Create a bucket
<a name="using-s3-commands-managing-buckets-creating"></a>

Use the [`s3 mb`](https://docs.aws.amazon.com/cli/latest/reference/s3/mb.html) command to make a bucket. Bucket names must be ***globally*** unique (unique across all of Amazon S3) and should be DNS compliant.

Bucket names can contain lowercase letters, numbers, hyphens, and periods. Bucket names can start and end only with a letter or number, and cannot contain a period next to a hyphen or another period. 
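As a rough illustration of these rules, the following sketch checks a candidate name locally before calling `s3 mb`. It is a hedged approximation, not a complete implementation of every S3 naming rule (for example, it does not reject IP-address-style names).

```shell
# Approximate local pre-check of a bucket name: lowercase letters,
# numbers, hyphens, and periods; starts and ends with a letter or
# number; no "..", ".-", or "-." sequences; 3-63 characters.
check_bucket_name() {
  name="$1"
  printf '%s' "$name" | grep -Eq '^[a-z0-9]([a-z0-9.-]*[a-z0-9])?$' || return 1
  case "$name" in
    *..*|*.-*|*-.*) return 1 ;;
  esac
  [ "${#name}" -ge 3 ] && [ "${#name}" -le 63 ]
}

check_bucket_name "amzn-s3-demo-bucket" && echo "name looks valid"
```

A failed check here saves a round trip to Amazon S3, but the service remains the authority on whether a name is accepted.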

**Syntax**

```
$ aws s3 mb <target> [--options]
```

### s3 mb examples
<a name="using-s3-commands-managing-buckets-creating-examples"></a>

The following example creates the `s3://amzn-s3-demo-bucket` bucket.

```
$ aws s3 mb s3://amzn-s3-demo-bucket
```

## List buckets and objects
<a name="using-s3-commands-listing-buckets"></a>

To list your buckets, folders, or objects, use the [`s3 ls`](https://docs.aws.amazon.com/cli/latest/reference/s3/ls.html) command. Using the command without a target or options lists all buckets.

**Syntax**

```
$ aws s3 ls <target> [--options]
```

For a few common options to use with this command, and examples, see [Frequently used options for s3 commands](#using-s3-commands-managing-objects-param). For a complete list of available options, see [`s3 ls`](https://docs.aws.amazon.com/cli/latest/reference/s3/ls.html) in the *AWS CLI Command Reference*.

### s3 ls examples
<a name="using-s3-commands-managing-objects-list-examples"></a>

The following example lists all of your Amazon S3 buckets.

```
$ aws s3 ls
2018-12-11 17:08:50 amzn-s3-demo-bucket1
2018-12-14 14:55:44 amzn-s3-demo-bucket2
```

The following command lists all objects and prefixes in a bucket. In this example output, the prefix `example/` has one file named `MyFile1.txt`.

```
$ aws s3 ls s3://amzn-s3-demo-bucket
                           PRE example/
2018-12-04 19:05:48          3 MyFile1.txt
```

You can filter the output to a specific prefix by including it in the command. The following command lists the objects in *bucket-name/example/* (that is, objects in *bucket-name* filtered by the prefix *example/*).

```
$ aws s3 ls s3://amzn-s3-demo-bucket/example/
2018-12-06 18:59:32          3 MyFile1.txt
```

To display only the buckets and objects in a specific AWS Region, use the `--region` option.

```
$ aws s3 ls --region us-east-2
2018-12-11 17:08:50 amzn-s3-demo-bucket1
```

If you have a large list of buckets and objects, you can paginate the results using the `--max-items` or `--page-size` options. The `--max-items` option limits how many total buckets and objects are returned in a call, and the `--page-size` option limits how many of those are listed on a page.

```
$ aws s3 ls --max-items 100 --page-size 10
```

For more information on pagination, see [How to use the --page-size parameter](cli-usage-pagination.md#cli-usage-pagination-pagesize) and [How to use the --max-items parameter](cli-usage-pagination.md#cli-usage-pagination-maxitems).

## Delete buckets
<a name="using-s3-commands-delete-buckets"></a>

To delete a bucket, use the [`s3 rb`](https://docs.aws.amazon.com/cli/latest/reference/s3/rb.html) command.

**Syntax**

```
$ aws s3 rb <target> [--options]
```

### s3 rb examples
<a name="using-s3-commands-removing-buckets-examples"></a>

The following example removes the `s3://amzn-s3-demo-bucket` bucket.

```
$ aws s3 rb s3://amzn-s3-demo-bucket
```

By default, the bucket must be empty for the operation to succeed. To remove a bucket that's not empty, include the `--force` option. If you're using a versioned bucket that contains previously deleted but retained objects, this command does *not* allow you to remove the bucket. You must first remove all of the content.

The following example deletes all objects and prefixes in the bucket, and then deletes the bucket.

```
$ aws s3 rb s3://amzn-s3-demo-bucket --force
```

## Delete objects
<a name="using-s3-commands-delete-objects"></a>

To delete objects in a bucket or your local directory, use the [`s3 rm`](https://docs.aws.amazon.com/cli/latest/reference/s3/rm.html) command.

**Syntax**

```
$ aws s3 rm <target> [--options]
```

For a few common options to use with this command, and examples, see [Frequently used options for s3 commands](#using-s3-commands-managing-objects-param). For a complete list of options, see [`s3 rm`](https://docs.aws.amazon.com/cli/latest/reference/s3/rm.html) in the *AWS CLI Command Reference*.

### s3 rm examples
<a name="using-s3-commands-delete-objects-examples"></a>

The following example deletes `filename.txt` from `s3://amzn-s3-demo-bucket/example`.

```
$ aws s3 rm s3://amzn-s3-demo-bucket/example/filename.txt
```

The following example deletes all objects from `s3://amzn-s3-demo-bucket/example` using the `--recursive` option.

```
$ aws s3 rm s3://amzn-s3-demo-bucket/example --recursive
```

## Move objects
<a name="using-s3-commands-managing-objects-move"></a>

Use the [`s3 mv`](https://docs.aws.amazon.com/cli/latest/reference/s3/mv.html) command to move objects from a bucket or a local directory. The `s3 mv` command copies the source object or file to the specified destination and then deletes the source object or file.

**Syntax**

```
$ aws s3 mv <source> <target> [--options]
```

For a few common options to use with this command, and examples, see [Frequently used options for s3 commands](#using-s3-commands-managing-objects-param). For a complete list of available options, see [`s3 mv`](https://docs.aws.amazon.com/cli/latest/reference/s3/mv.html) in the *AWS CLI Command Reference*.

**Warning**  
If you are using any type of access point ARNs or access point aliases in your Amazon S3 source or destination URIs, you must take extra care that your source and destination Amazon S3 URIs resolve to different underlying buckets. If the source and destination buckets are the same, the source file or object can be moved onto itself, which can result in accidental deletion of your source file or object. To verify that the source and destination buckets are not the same, use the `--validate-same-s3-paths` parameter, or set the environment variable `AWS_CLI_S3_MV_VALIDATE_SAME_S3_PATHS` to `true`.
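As a rough local illustration of this warning, the following sketch compares the bucket portion of two plain `s3://bucket/key` URIs. It works only for literal bucket names; access point ARNs and aliases can resolve to the same underlying bucket even when the URIs differ, which is exactly why the CLI's `--validate-same-s3-paths` check exists.

```shell
# Extract the bucket portion of a plain s3:// URI and compare two URIs.
bucket_of() {
  printf '%s\n' "$1" | sed -E 's|^s3://([^/]+).*|\1|'
}

same_bucket() {
  [ "$(bucket_of "$1")" = "$(bucket_of "$2")" ]
}

same_bucket "s3://amzn-s3-demo-bucket/a.txt" "s3://amzn-s3-demo-bucket/b.txt" \
  && echo "same bucket: double-check your mv paths"
```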

### s3 mv examples
<a name="using-s3-commands-managing-objects-move-examples"></a>

The following example moves all objects from `s3://amzn-s3-demo-bucket/example` to `s3://amzn-s3-demo-bucket/`.

```
$ aws s3 mv s3://amzn-s3-demo-bucket/example s3://amzn-s3-demo-bucket/
```

The following example moves a local file from your current working directory to the Amazon S3 bucket with the `s3 mv` command.

```
$ aws s3 mv filename.txt s3://amzn-s3-demo-bucket
```

The following example moves a file from your Amazon S3 bucket to your current working directory, where `./` specifies your current working directory.

```
$ aws s3 mv s3://amzn-s3-demo-bucket/filename.txt ./
```

## Copy objects
<a name="using-s3-commands-managing-objects-copy"></a>

Use the [`s3 cp`](https://docs.aws.amazon.com/cli/latest/reference/s3/cp.html) command to copy objects from a bucket or a local directory.

**Syntax**

```
$ aws s3 cp <source> <target> [--options]
```

You can use the dash parameter (`-`) to stream files from standard input (`stdin`) or to standard output (`stdout`).

**Warning**  
If you're using PowerShell, the shell might alter the encoding of a CRLF or add a CRLF to piped input or output, or redirected output.

The `s3 cp` command uses the following syntax to upload a file stream from `stdin` to a specified bucket.

**Syntax**

```
$ aws s3 cp - <target> [--options]
```

The `s3 cp` command uses the following syntax to download an Amazon S3 object as a stream to `stdout`.

**Syntax**

```
$ aws s3 cp <target> [--options] -
```

For a few common options to use with this command, and examples, see [Frequently used options for s3 commands](#using-s3-commands-managing-objects-param). For the complete list of options, see [`s3 cp`](https://docs.aws.amazon.com/cli/latest/reference/s3/cp.html) in the *AWS CLI Command Reference*.

### s3 cp examples
<a name="using-s3-commands-managing-objects-copy-examples"></a>

The following example copies all objects from `s3://amzn-s3-demo-bucket/example` to `s3://amzn-s3-demo-bucket/`.

```
$ aws s3 cp s3://amzn-s3-demo-bucket/example s3://amzn-s3-demo-bucket/
```

The following example copies a local file from your current working directory to the Amazon S3 bucket with the `s3 cp` command.

```
$ aws s3 cp filename.txt s3://amzn-s3-demo-bucket
```

The following example copies a file from your Amazon S3 bucket to your current working directory, where `./` specifies your current working directory.

```
$ aws s3 cp s3://amzn-s3-demo-bucket/filename.txt ./
```

The following example uses `echo` to stream the text "hello world" to the `s3://amzn-s3-demo-bucket/filename.txt` file.

```
$ echo "hello world" | aws s3 cp - s3://amzn-s3-demo-bucket/filename.txt
```

The following example streams the `s3://amzn-s3-demo-bucket/filename.txt` file to `stdout` and prints the contents to the console.

```
$ aws s3 cp s3://amzn-s3-demo-bucket/filename.txt -
hello world
```

The following example streams the `s3://amzn-s3-demo-bucket/pre` object to `stdout`, uses the `bzip2` command to compress the stream, and uploads the new compressed file named `key.bz2` to `s3://amzn-s3-demo-bucket`.

```
$ aws s3 cp s3://amzn-s3-demo-bucket/pre - | bzip2 --best | aws s3 cp - s3://amzn-s3-demo-bucket/key.bz2
```
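The compression stage of that pipeline can be exercised locally without touching Amazon S3. This sketch (assuming the `bzip2` tools are installed) pipes text through `bzip2` and back, confirming the stream survives the round trip.

```shell
# Round trip of the compression stage only, no S3 involved:
# compress the stream with bzip2, then decompress it back.
echo "hello world" | bzip2 --best | bunzip2
```

The same `aws s3 cp - <target>` and `aws s3 cp <target> -` forms shown earlier supply and consume streams exactly like this one.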

## Sync objects
<a name="using-s3-commands-managing-objects-sync"></a>

The [`s3 sync`](https://docs.aws.amazon.com/cli/latest/reference/s3/sync.html) command synchronizes the contents of a bucket and a directory, or the contents of two buckets. Typically, `s3 sync` copies missing or outdated files or objects between the source and target. However, you can also supply the `--delete` option to remove files or objects from the target that are not present in the source.

**Syntax**

```
$ aws s3 sync <source> <target> [--options]
```

For a few common options to use with this command, and examples, see [Frequently used options for s3 commands](#using-s3-commands-managing-objects-param). For a complete list of options, see [`s3 sync`](https://docs.aws.amazon.com/cli/latest/reference/s3/sync.html) in the *AWS CLI Command Reference*.

### s3 sync examples
<a name="using-s3-commands-managing-objects-sync-examples"></a>

The following example synchronizes the contents of an Amazon S3 prefix named *path* in the bucket named *amzn-s3-demo-bucket* with the current working directory. 

`s3 sync` updates any files that have a size or modified time that are different from files with the same name at the destination. The output displays specific operations performed during the sync. Notice that the operation recursively synchronizes the subdirectory `MySubdirectory` and its contents with `s3://amzn-s3-demo-bucket/path/MySubdirectory`.

```
$ aws s3 sync . s3://amzn-s3-demo-bucket/path
upload: MySubdirectory\MyFile3.txt to s3://amzn-s3-demo-bucket/path/MySubdirectory/MyFile3.txt
upload: MyFile2.txt to s3://amzn-s3-demo-bucket/path/MyFile2.txt
upload: MyFile1.txt to s3://amzn-s3-demo-bucket/path/MyFile1.txt
```

The following example, which extends the previous one, shows how to use the `--delete` option.

```
// Delete local file
$ rm ./MyFile1.txt

// Attempt sync without --delete option - nothing happens
$ aws s3 sync . s3://amzn-s3-demo-bucket/path

// Sync with deletion - object is deleted from bucket
$ aws s3 sync . s3://amzn-s3-demo-bucket/path --delete
delete: s3://amzn-s3-demo-bucket/path/MyFile1.txt

// Delete object from bucket
$ aws s3 rm s3://amzn-s3-demo-bucket/path/MySubdirectory/MyFile3.txt
delete: s3://amzn-s3-demo-bucket/path/MySubdirectory/MyFile3.txt

// Sync with deletion - local file is deleted
$ aws s3 sync s3://amzn-s3-demo-bucket/path . --delete
delete: MySubdirectory\MyFile3.txt

// Sync with Infrequent Access storage class
$ aws s3 sync . s3://amzn-s3-demo-bucket/path --storage-class STANDARD_IA
```

When using the `--delete` option, the `--exclude` and `--include` options can filter files or objects to delete during an `s3 sync` operation. In this case, the parameter string must specify files to exclude from, or include for, deletion in the context of the target directory or bucket. The following shows an example.

```
// Assume the local directory and s3://amzn-s3-demo-bucket/path are
// currently in sync and each contains 3 files:
// MyFile1.txt
// MyFile2.rtf
// MyFile88.txt

// Sync with delete, excluding files that match a pattern. MyFile88.txt is deleted, while remote MyFile1.txt is not.
$ aws s3 sync . s3://amzn-s3-demo-bucket/path --delete --exclude "path/MyFile?.txt"
delete: s3://amzn-s3-demo-bucket/path/MyFile88.txt

// Sync with delete, excluding MyFile2.rtf - local file is NOT deleted
$ aws s3 sync s3://amzn-s3-demo-bucket/path . --delete --exclude "./MyFile2.rtf"
download: s3://amzn-s3-demo-bucket/path/MyFile1.txt to MyFile1.txt

// Sync with delete, local copy of MyFile2.rtf is deleted
$ aws s3 sync s3://amzn-s3-demo-bucket/path . --delete
delete: MyFile2.rtf
```

## Frequently used options for s3 commands
<a name="using-s3-commands-managing-objects-param"></a>

The following options are frequently used for the commands described in this topic. For a complete list of options you can use on a command, see the specific command in the [AWS CLI version 2 reference guide](https://docs.aws.amazon.com/cli/latest/reference/index.html).

**acl**  
`s3 sync` and `s3 cp` can use the `--acl` option. This enables you to set the access permissions for files copied to Amazon S3. The `--acl` option accepts `private`, `public-read`, and `public-read-write` values. For more information, see [Canned ACL](https://docs.aws.amazon.com/AmazonS3/latest/userguide/acl-overview.html#canned-acl) in the *Amazon S3 User Guide*.  

```
$ aws s3 sync . s3://amzn-s3-demo-bucket/path --acl public-read
```

**exclude**  
When you use the `s3 cp`, `s3 mv`, `s3 sync`, or `s3 rm` command, you can filter the results by using the `--exclude` or `--include` option. The `--exclude` option sets rules to only exclude objects from the command, and the options apply in the order specified. This is shown in the following example.  

```
Local directory contains 3 files:
MyFile1.txt
MyFile2.rtf
MyFile88.txt

// Exclude all .txt files, resulting in only MyFile2.rtf being copied
$ aws s3 cp . s3://amzn-s3-demo-bucket/path --exclude "*.txt"

// Exclude all .txt files but include files matching "MyFile*.txt", resulting in MyFile1.txt, MyFile2.rtf, and MyFile88.txt being copied
$ aws s3 cp . s3://amzn-s3-demo-bucket/path --exclude "*.txt" --include "MyFile*.txt"

// Exclude all .txt files, include files matching "MyFile*.txt", then exclude files matching "MyFile?.txt", resulting in MyFile2.rtf and MyFile88.txt being copied
$ aws s3 cp . s3://amzn-s3-demo-bucket/path --exclude "*.txt" --include "MyFile*.txt" --exclude "MyFile?.txt"
```
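The ordering behavior matters: every file starts as included, each filter is applied in the order given, and the last matching rule wins. This hedged sketch simulates that evaluation locally, using plain shell pattern matching as a stand-in for the CLI's filter engine.

```shell
# Simulate s3 filter ordering: default verdict is "include"; each
# rule/pattern pair is applied in order and the LAST matching rule wins.
decide() {
  file="$1"; shift
  verdict="include"                  # everything is included by default
  while [ "$#" -gt 0 ]; do
    rule="$1" pattern="$2"; shift 2
    case "$file" in
      $pattern) verdict="$rule" ;;   # later matches override earlier ones
    esac
  done
  echo "$verdict"
}

# Walk files through: --exclude "*.txt" --include "MyFile*.txt" --exclude "MyFile?.txt"
decide MyFile1.txt  exclude '*.txt' include 'MyFile*.txt' exclude 'MyFile?.txt'   # exclude
decide MyFile88.txt exclude '*.txt' include 'MyFile*.txt' exclude 'MyFile?.txt'   # include
decide MyFile2.rtf  exclude '*.txt' include 'MyFile*.txt' exclude 'MyFile?.txt'   # include
```

Walking the three files through the last `s3 cp` example above reproduces why `MyFile1.txt` is not copied while `MyFile88.txt` and `MyFile2.rtf` are.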

**include**  
When you use the `s3 cp`, `s3 mv`, `s3 sync`, or `s3 rm` command, you can filter the results using the `--exclude` or `--include` option. The `--include` option sets rules to only include objects specified for the command, and the options apply in the order specified. This is shown in the following example.  

```
Local directory contains 3 files:
MyFile1.txt
MyFile2.rtf
MyFile88.txt

// Include all .txt files, resulting in MyFile1.txt and MyFile88.txt being copied
$ aws s3 cp . s3://amzn-s3-demo-bucket/path --include "*.txt"

// Include all .txt files but exclude all files with the "MyFile*.txt" format, resulting in no files being copied
$ aws s3 cp . s3://amzn-s3-demo-bucket/path --include "*.txt" --exclude "MyFile*.txt"

// Include all .txt files, exclude files matching "MyFile*.txt", then include files matching "MyFile?.txt", resulting in MyFile1.txt being copied
$ aws s3 cp . s3://amzn-s3-demo-bucket/path --include "*.txt" --exclude "MyFile*.txt" --include "MyFile?.txt"
```

**grant**  
The `s3 cp`, `s3 mv`, and `s3 sync` commands include a `--grants` option that you can use to grant permissions on the object to specified users or groups. Set the `--grants` option to a list of permissions using the following syntax. Replace `Permission`, `Grantee_Type`, and `Grantee_ID` with your own values.  
**Syntax**  

```
--grants Permission=Grantee_Type=Grantee_ID
         [Permission=Grantee_Type=Grantee_ID ...]
```
Each value contains the following elements:  
+ *Permission* – Specifies the granted permissions. Can be set to `read`, `readacl`, `writeacl`, or `full`.
+ *Grantee_Type* – Specifies how to identify the grantee. Can be set to `uri`, `emailaddress`, or `id`.
+ *Grantee_ID* – Specifies the grantee based on *Grantee_Type*.
  + `uri` – The group's URI. For more information, see [Who is a grantee?](https://docs.aws.amazon.com/AmazonS3/latest/userguide/ACLOverview.html#SpecifyingGrantee)
  + `emailaddress` – The account's email address.
  + `id` – The account's canonical ID.
For more information about Amazon S3 access control, see [Access control](https://docs.aws.amazon.com/AmazonS3/latest/userguide/UsingAuthAccess.html).  
The following example copies an object into a bucket. It grants `read` permissions on the object to everyone, and `full` permissions (`read`, `readacl`, and `writeacl`) to the account associated with `user@example.com`.   

```
$ aws s3 cp file.txt s3://amzn-s3-demo-bucket/ --grants read=uri=http://acs.amazonaws.com/groups/global/AllUsers full=emailaddress=user@example.com
```
You can also specify a nondefault storage class (`REDUCED_REDUNDANCY` or `STANDARD_IA`) for objects that you upload to Amazon S3. To do this, use the `--storage-class` option.  

```
$ aws s3 cp file.txt s3://amzn-s3-demo-bucket/ --storage-class REDUCED_REDUNDANCY
```

**no-overwrite**  
The `s3 cp`, `s3 mv`, and `s3 sync` commands include a `--no-overwrite` option that you can use to prevent overwriting objects that already exist at the destination.  
The following example copies an object from a bucket to the local directory only if it does not already exist in the local directory.  

```
$ aws s3 cp --no-overwrite s3://amzn-s3-demo-bucket/file.txt file.txt
```
The following example recursively copies files from a local directory to a bucket. It will only copy files that do not already exist in the bucket.  

```
$ aws s3 cp --recursive --no-overwrite /path/to/demo-files/ s3://amzn-s3-demo-bucket/demo-files/
```
The following example moves an object from a local directory to a bucket only if it does not already exist in the bucket destination location.  

```
$ aws s3 mv --no-overwrite file.txt s3://amzn-s3-demo-bucket/file.txt
```
The following example syncs files from a local directory to a bucket. It will only sync files that do not already exist in the destination bucket.  

```
$ aws s3 sync --no-overwrite /path/to/demo-files/ s3://amzn-s3-demo-bucket/demo-files/
```

**recursive**  
When you use this option, the command is performed on all files or objects under the specified directory or prefix. The following example deletes `s3://amzn-s3-demo-bucket/path` and all of its contents.  

```
$ aws s3 rm s3://amzn-s3-demo-bucket/path --recursive
```

## Resources
<a name="using-s3-commands-managing-buckets-references"></a>

**AWS CLI reference:**
+ [https://docs.aws.amazon.com/cli/latest/reference/s3/index.html](https://docs.aws.amazon.com/cli/latest/reference/s3/index.html)
+ [https://docs.aws.amazon.com/cli/latest/reference/s3/cp.html](https://docs.aws.amazon.com/cli/latest/reference/s3/cp.html)
+ [https://docs.aws.amazon.com/cli/latest/reference/s3/mb.html](https://docs.aws.amazon.com/cli/latest/reference/s3/mb.html)
+ [https://docs.aws.amazon.com/cli/latest/reference/s3/mv.html](https://docs.aws.amazon.com/cli/latest/reference/s3/mv.html)
+ [https://docs.aws.amazon.com/cli/latest/reference/s3/ls.html](https://docs.aws.amazon.com/cli/latest/reference/s3/ls.html)
+ [https://docs.aws.amazon.com/cli/latest/reference/s3/rb.html](https://docs.aws.amazon.com/cli/latest/reference/s3/rb.html)
+ [https://docs.aws.amazon.com/cli/latest/reference/s3/rm.html](https://docs.aws.amazon.com/cli/latest/reference/s3/rm.html)
+ [https://docs.aws.amazon.com/cli/latest/reference/s3/sync.html](https://docs.aws.amazon.com/cli/latest/reference/s3/sync.html)

**Service reference:**
+ [Working with Amazon S3 buckets](https://docs.aws.amazon.com//AmazonS3/latest/userguide/UsingBucket.html) in the *Amazon S3 User Guide*
+ [Working with Amazon S3 objects](https://docs.aws.amazon.com//AmazonS3/latest/userguide/UsingObjects.html) in the *Amazon S3 User Guide*
+ [Listing keys hierarchically using a prefix and delimiter](https://docs.aws.amazon.com//AmazonS3/latest/userguide/ListingKeysHierarchy.html) in the *Amazon S3 User Guide*
+ [Abort multipart uploads to an S3 bucket using the AWS SDK for .NET (low-level)](https://docs.aws.amazon.com//AmazonS3/latest/userguide/LLAbortMPUnet.html) in the *Amazon S3 User Guide*

# Using API-Level (s3api) commands in the AWS CLI
<a name="cli-services-s3-apicommands"></a>

The API-level commands (contained in the `s3api` command set) provide direct access to the Amazon Simple Storage Service (Amazon S3) APIs, and enable some operations that are not exposed in the high-level `s3` commands. These commands are the equivalent of those for other AWS services that provide API-level access to service functionality. For more information on the `s3` commands, see [Using high-level (s3) commands in the AWS CLI](cli-services-s3-commands.md).

This topic provides examples that demonstrate how to use the lower-level commands that map to the Amazon S3 APIs. In addition, you can find examples for each S3 API command in the `s3api` section of the [AWS CLI version 2 reference guide](https://docs.aws.amazon.com/cli/latest/reference/s3api/index.html).

**Topics**
+ [Prerequisites](#cli-services-s3-apicommands-prereqs)
+ [Apply a custom ACL](#cli-services-s3-apicommands-acls)
+ [Configure a logging policy](#cli-services-s3-apicommands-logpol)
+ [Resources](#cli-services-s3-apicommands-resources)

## Prerequisites
<a name="cli-services-s3-apicommands-prereqs"></a>

To run the `s3api` commands, you need to:
+ Install and configure the AWS CLI. For more information, see [Installing or updating to the latest version of the AWS CLI](getting-started-install.md) and [Authentication and access credentials for the AWS CLI](cli-chap-authentication.md).
+ The profile that you use must have permissions that allow the AWS operations performed by the examples.
+ Understand these Amazon S3 terms:
  + **Bucket** – A top-level Amazon S3 folder.
  + **Prefix** – An Amazon S3 folder in a bucket.
  + **Object** – Any item that's hosted in an Amazon S3 bucket.

## Apply a custom ACL
<a name="cli-services-s3-apicommands-acls"></a>

With high-level commands, you can use the `--acl` option to apply predefined access control lists (ACLs) to Amazon S3 objects, but you can't use that option to set bucket-wide ACLs. You can do this by using the [put-bucket-acl](https://docs.aws.amazon.com/cli/latest/reference/s3api/put-bucket-acl.html) API-level command.

The following example shows how to grant full control to two AWS users (*user1@example.com* and *user2@example.com*) and read permission to everyone. The identifier for "everyone" comes from a special URI that you pass as a parameter.

```
$ aws s3api put-bucket-acl --bucket amzn-s3-demo-bucket --grant-full-control 'emailaddress="user1@example.com",emailaddress="user2@example.com"' --grant-read 'uri="http://acs.amazonaws.com/groups/global/AllUsers"'
```

For details about how to construct the ACLs, see [PUT Bucket acl](https://docs.aws.amazon.com/AmazonS3/latest/API/RESTBucketPUTacl.html) in the *Amazon Simple Storage Service API Reference*. The `s3api` ACL commands in the CLI, such as `put-bucket-acl`, use the same [shorthand argument notation](https://docs.aws.amazon.com/cli/latest/userguide/cli-usage-shorthand.html).
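To confirm that the grants took effect, you can read the ACL back with the corresponding `get-bucket-acl` command. This requires AWS credentials with permission on the bucket, and the bucket name below is a placeholder:

```shell
# Retrieve the current ACL on the bucket to verify the grants
# (requires valid AWS credentials; amzn-s3-demo-bucket is a placeholder).
aws s3api get-bucket-acl --bucket amzn-s3-demo-bucket
```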

## Configure a logging policy
<a name="cli-services-s3-apicommands-logpol"></a>

The API command `put-bucket-logging` configures a bucket logging policy. 

In the following example, the AWS user *user@example.com* is granted full control over the log files, and all users have read access to them. Notice that the `put-bucket-acl` command is also required to grant the Amazon S3 log delivery system (specified by a URI) the permissions needed to read and write the logs to the bucket.

```
$ aws s3api put-bucket-acl --bucket amzn-s3-demo-bucket --grant-read-acp 'URI="http://acs.amazonaws.com/groups/s3/LogDelivery"' --grant-write 'URI="http://acs.amazonaws.com/groups/s3/LogDelivery"'
$ aws s3api put-bucket-logging --bucket amzn-s3-demo-bucket --bucket-logging-status file://logging.json
```

The `logging.json` file in the previous command has the following content.

```
{
  "LoggingEnabled": {
    "TargetBucket": "amzn-s3-demo-bucket",
    "TargetPrefix": "amzn-s3-demo-bucketLogs/",
    "TargetGrants": [
      {
        "Grantee": {
          "Type": "AmazonCustomerByEmail",
          "EmailAddress": "user@example.com"
        },
        "Permission": "FULL_CONTROL"
      },
      {
        "Grantee": {
          "Type": "Group",
          "URI": "http://acs.amazonaws.com/groups/global/AllUsers"
        },
        "Permission": "READ"
      }
    ]
  }
}
```
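Because a malformed policy file causes `put-bucket-logging` to fail, it can help to generate and validate the JSON locally first. The following sketch writes a minimal version of the policy (omitting the optional `TargetGrants`) and checks that it parses; `python3` is assumed to be available, and `jq` works equally well:

```shell
# Write a minimal logging policy (TargetGrants omitted for brevity) and
# sanity-check that it is valid JSON before calling put-bucket-logging.
cat > logging.json <<'EOF'
{
  "LoggingEnabled": {
    "TargetBucket": "amzn-s3-demo-bucket",
    "TargetPrefix": "amzn-s3-demo-bucketLogs/"
  }
}
EOF
python3 -m json.tool logging.json > /dev/null && echo "logging.json is valid JSON"
```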

## Resources
<a name="cli-services-s3-apicommands-resources"></a>

**AWS CLI reference:**
+ [https://docs.aws.amazon.com/cli/latest/reference/s3api/index.html](https://docs.aws.amazon.com/cli/latest/reference/s3api/index.html)
+ [https://docs.aws.amazon.com/cli/latest/reference/s3api/put-bucket-acl.html](https://docs.aws.amazon.com/cli/latest/reference/s3api/put-bucket-acl.html)
+ [https://docs.aws.amazon.com/cli/latest/reference/s3api/put-bucket-logging.html](https://docs.aws.amazon.com/cli/latest/reference/s3api/put-bucket-logging.html)

**Service reference:**
+ [Working with Amazon S3 buckets](https://docs.aws.amazon.com//AmazonS3/latest/userguide/UsingBucket.html) in the *Amazon S3 User Guide*
+ [Working with Amazon S3 objects](https://docs.aws.amazon.com//AmazonS3/latest/userguide/UsingObjects.html) in the *Amazon S3 User Guide*
+ [Listing keys hierarchically using a prefix and delimiter](https://docs.aws.amazon.com//AmazonS3/latest/userguide/ListingKeysHierarchy.html) in the *Amazon S3 User Guide*
+ [Abort multipart uploads to an S3 bucket using the AWS SDK for .NET (low-level)](https://docs.aws.amazon.com//AmazonS3/latest/userguide/LLAbortMPUnet.html) in the *Amazon S3 User Guide*

# Scripting example for the Amazon S3 bucket lifecycle in the AWS CLI
<a name="cli-services-s3-lifecycle-example"></a>

This topic provides a Bash scripting example for Amazon S3 bucket lifecycle operations using the AWS Command Line Interface (AWS CLI). The example uses the [https://docs.aws.amazon.com/cli/latest/reference/s3api/index.html](https://docs.aws.amazon.com/cli/latest/reference/s3api/index.html) set of commands. Shell scripts are programs designed to run in a command line interface.

**Topics**
+ [

## Before you start
](#cli-services-s3-lifecycle-example-before)
+ [

## About this example
](#cli-services-s3-lifecycle-example-about)
+ [

## Files
](#cli-services-s3-lifecycle-example-files)
+ [

## References
](#cli-services-s3-lifecycle-example-references)

## Before you start
<a name="cli-services-s3-lifecycle-example-before"></a>

Before you run any of the following examples, complete these prerequisites.
+ Install and configure the AWS CLI. For more information, see [Installing or updating to the latest version of the AWS CLI](getting-started-install.md) and [Authentication and access credentials for the AWS CLI](cli-chap-authentication.md).
+ The profile that you use must have permissions that allow the AWS operations performed by the examples.
+ As an AWS best practice, grant this code least privilege, or only the permissions required to perform a task. For more information, see [Grant Least Privilege](https://docs.aws.amazon.com/IAM/latest/UserGuide/best-practices.html#grant-least-privilege) in the *IAM User Guide*.
+ This code has not been tested in all AWS Regions. Some AWS services are available only in specific Regions. For more information, see [Service Endpoints and Quotas](https://docs.aws.amazon.com/general/latest/gr/aws-service-information.html) in the *AWS General Reference Guide*.
+ Running this code can result in charges to your AWS account. It is your responsibility to ensure that any resources created by this script are removed when you are done with them. 

The Amazon S3 service uses the following terms:
+ **Bucket** – A top-level Amazon S3 folder.
+ **Prefix** – An Amazon S3 folder in a bucket.
+ **Object** – Any item that's hosted in an Amazon S3 bucket.

## About this example
<a name="cli-services-s3-lifecycle-example-about"></a>

This example demonstrates how to interact with some of the basic Amazon S3 operations using a set of functions in shell script files. The functions are located in the shell script file named `bucket-operations.sh`. You can call these functions in another file. Each script file contains comments describing each of the functions.
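The sourcing pattern itself looks like the following minimal sketch. The file and function names here are stand-ins for illustration, not the actual names from the example repository:

```shell
# Minimal illustration of sourcing a function library from another script.
# The file and function names below are illustrative only.
cat > my-operations.sh <<'EOF'
greet_bucket() {
  echo "operating on bucket: $1"
}
EOF
source ./my-operations.sh          # load the functions into this shell
greet_bucket "amzn-s3-demo-bucket" # call a function defined in the sourced file
```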

To see the intermediate results of each step, run the script with the `-i` parameter. You can view the current status of the bucket or its contents using the Amazon S3 console. The script proceeds to the next step only when you press **Enter** at the prompt.

For the full example and downloadable script files, see [Amazon S3 Bucket Lifecycle Operations](https://github.com/awsdocs/aws-doc-sdk-examples/tree/main/aws-cli/bash-linux/s3/bucket-lifecycle-operations) in the *AWS Code Examples Repository* on *GitHub*.

## Files
<a name="cli-services-s3-lifecycle-example-files"></a>

The example contains the following files:

**bucket-operations.sh**  
This main script file can be sourced from another file. It includes functions that perform the following tasks:  
+ Creating a bucket and verifying that it exists
+ Copying a file from the local computer to a bucket
+ Copying a file from one bucket location to a different bucket location
+ Listing the contents of a bucket
+ Deleting a file from a bucket
+ Deleting a bucket
View the code for [`bucket-operations.sh`](https://github.com/awsdocs/aws-doc-sdk-examples/blob/main/aws-cli/bash-linux/s3/bucket-lifecycle-operations/bucket_operations.sh) on *GitHub*.

**test-bucket-operations.sh**  
The shell script file `test-bucket-operations.sh` demonstrates how to use the functions by sourcing `bucket-operations.sh` and calling each of them. After the calls complete, the test script removes all of the resources that it created.  
View the code for [`test-bucket-operations.sh`](https://github.com/awsdocs/aws-doc-sdk-examples/blob/main/aws-cli/bash-linux/s3/bucket-lifecycle-operations/test_bucket_operations.sh) on *GitHub*.

**awsdocs-general.sh**  
The script file `awsdocs-general.sh` holds general-purpose functions that are used across the advanced code examples for the AWS CLI.  
View the code for [`awsdocs-general.sh`](https://github.com/awsdocs/aws-doc-sdk-examples/blob/main/aws-cli/bash-linux/s3/bucket-lifecycle-operations/awsdocs_general.sh) on *GitHub*.

## References
<a name="cli-services-s3-lifecycle-example-references"></a>

**AWS CLI reference:**
+ [https://docs.aws.amazon.com/cli/latest/reference/s3api/index.html](https://docs.aws.amazon.com/cli/latest/reference/s3api/index.html)
+ [https://docs.aws.amazon.com/cli/latest/reference/s3api/create-bucket.html](https://docs.aws.amazon.com/cli/latest/reference/s3api/create-bucket.html)
+ [https://docs.aws.amazon.com/cli/latest/reference/s3api/copy-object.html](https://docs.aws.amazon.com/cli/latest/reference/s3api/copy-object.html)
+ [https://docs.aws.amazon.com/cli/latest/reference/s3api/delete-bucket.html](https://docs.aws.amazon.com/cli/latest/reference/s3api/delete-bucket.html)
+ [https://docs.aws.amazon.com/cli/latest/reference/s3api/delete-object.html](https://docs.aws.amazon.com/cli/latest/reference/s3api/delete-object.html)
+ [https://docs.aws.amazon.com/cli/latest/reference/s3api/head-bucket.html](https://docs.aws.amazon.com/cli/latest/reference/s3api/head-bucket.html)
+ [https://docs.aws.amazon.com/cli/latest/reference/s3api/list-objects.html](https://docs.aws.amazon.com/cli/latest/reference/s3api/list-objects.html)
+ [https://docs.aws.amazon.com/cli/latest/reference/s3api/put-object.html](https://docs.aws.amazon.com/cli/latest/reference/s3api/put-object.html)

**Other reference:**
+ [Working with Amazon S3 buckets](https://docs.aws.amazon.com//AmazonS3/latest/userguide/UsingBucket.html) in the *Amazon S3 User Guide*
+ [Working with Amazon S3 objects](https://docs.aws.amazon.com//AmazonS3/latest/userguide/UsingObjects.html) in the *Amazon S3 User Guide*
+ To view and contribute to AWS SDK and AWS CLI code examples, see the [AWS Code Examples Repository](https://github.com/awsdocs/aws-doc-sdk-examples/) on *GitHub*.