AWS S3 CLI Commands Cheat Sheet: All You Need to Know
AWS S3 is one of the most popular AWS services today. It is a highly available, durable and cost-effective object storage service in the AWS cloud.
As you know, you can create and manage your S3 buckets using various tools like the AWS console, CLI, CloudFormation etc. However, nothing beats the ease of the AWS CLI when it comes to managing your buckets.
Firing a CLI command is easy and quick.
In this post, we’ll learn a few of the most important S3 CLI commands that can help you while working with S3.
Suggested Read: 5 Ways to Create and Manage Resources on AWS
Before you can use the AWS CLI to manage your buckets, you need to install the CLI on your machine and configure it with your credentials (access key/secret key).
Here is a step-by-step tutorial on how to do it – How to Install and Configure AWS CLI in your System
Note: If you don’t use the CLI on a regular basis and just want to test a few commands, there is a quicker option from the AWS console itself. You can use AWS CloudShell, shown in the screenshot below. Click the terminal icon in the top menu of your AWS account and a ready-to-use terminal will open. This terminal already has the CLI installed and configured with your credentials.
Suggested Read: All You need to Know about AWS CloudShell – Your Browser Based CLI
Basic Overview of S3 and CLI with S3
Before getting started with CLI commands, we need to know a few basic terms of AWS S3.
- Bucket: A top-level container in which you store your objects
- Object: Any item (file) that is stored in an S3 bucket
- Prefix: A folder-like path within your bucket (S3 storage is flat; prefixes act as folders)
You use the aws s3 CLI command to create and manage your S3 buckets and objects.
This is how the basic syntax looks:
aws s3 <Command> [<Arg> ...]
You can pass multiple arguments like --region, --recursive, --profile etc. The full list of commands and operations is available in the AWS documentation.
You can use cp, mv and rm on a single object, or on all objects under a bucket or prefix, by using the --recursive option.
AWS S3 CLI Commands Cheat Sheet
- Create a Bucket
- List All the Buckets
- List the Content of a Bucket
- Copy Files to and from S3
- Find Out Number of Objects and Total Size of a Bucket
- Generate Pre-signed URL for an Object
- Move File To or From S3 Bucket
- Sync S3 Bucket with Another Bucket or Local Directory and Vice Versa
- Delete a File from a Bucket
- Delete All Files From a Bucket
- Delete a Bucket
1. Create a Bucket
You use the mb command to create a bucket; mb stands for make bucket.
Create a Bucket in Default Region
aws s3 mb s3://bucket-name
The above command creates a bucket in the default region configured in your CLI. If you want to create a bucket in a specific region, specify --region as shown below.
Create a Bucket In Specific Region
aws s3 mb s3://mybucket --region us-west-1
aws s3 mb s3://cloudkatha-cli-bucket
Note: Please note that S3 bucket names are globally unique. So if you create a bucket named ‘abc‘, nobody else can create a bucket with the same name, even in another account. Therefore, always choose a unique name specific to your business, like I added cloudkatha to my bucket name. That way, it’s unique to me.
Suggested Reading: This is why S3 bucket name is unique globally
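Since names must be globally unique, one common pattern is to derive the bucket name from a project name plus the account ID and region. A minimal sketch, assuming hypothetical values (the mb call is guarded so the sketch is harmless where the CLI or credentials are absent):

```shell
# sketch: build a likely-unique bucket name from project, account ID and region
# values below are hypothetical; in practice fetch the account ID with:
#   aws sts get-caller-identity --query Account --output text
project="cloudkatha"
account_id="123456789012"
region="us-west-1"
bucket="${project}-${account_id}-${region}"
echo "$bucket"
# skipped where the CLI is not installed; ignore failures without credentials
command -v aws >/dev/null && aws s3 mb "s3://${bucket}" --region "$region" || true
```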
2. List All the Buckets
The s3 ls command lists all the buckets in your AWS account, provided you have permission to do so.
aws s3 ls
As you can see in the screenshot below, it shows all the buckets in my account.
3. List the Content of a Bucket
Lists the content of a bucket, i.e. all the objects and prefixes inside the bucket.
aws s3 ls s3://bucket-name
aws s3 ls s3://demo-talk-with-anu
Note: Please note that by default, ls lists only the objects and prefixes at the top level of the bucket, not the objects inside subfolders. If you want to list them all, use the command below:
aws s3 ls s3://demo-talk-with-anu --recursive
As you noticed, we added the --recursive option to the previous command.
4. Copy Files to and from S3
We use the s3 cp command to copy one or more objects to and from an S3 bucket.
Copy a Single File from Local to S3
aws s3 cp localPath s3://bucket-name
For example –
aws s3 cp sample.txt s3://demo-talk-with-anu
Copy a Single File from S3 to Local
aws s3 cp s3://bucket-name/filename.txt ./
aws s3 cp s3://demo-talk-with-anu/sample.txt ./
Here sample.txt is copied to the current working directory.
Copy a Single File from S3 to S3
aws s3 cp s3://bucket-name/example s3://my-bucket/
aws s3 cp s3://demo-talk-with-anu/index.html s3://techtalk-with-preeti/
Copy Multiple Objects From One S3 Bucket to Another
Copies all objects under s3://bucket-name/example into another bucket. Note that copying everything under a bucket or prefix requires the --recursive option.
aws s3 cp s3://bucket-name/example s3://my-bucket/ --recursive
aws s3 cp s3://demo-talk-with-anu/website s3://techtalk-with-preeti/ --recursive
You can mix and match source and destination in these combinations to copy objects to or from an S3 bucket.
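The cp command also accepts the --exclude and --include filter options, which let you copy only a subset of objects. Filters are evaluated left to right, so the usual pattern is to exclude everything first and then re-include what you want. A sketch with hypothetical bucket names (the copy is guarded so the sketch runs even without the CLI):

```shell
# sketch: copy only .html objects under a prefix between two buckets
# filters apply in order: exclude all, then re-include *.html
src="s3://demo-talk-with-anu/website"
dst="s3://techtalk-with-preeti/"
command -v aws >/dev/null && \
  aws s3 cp "$src" "$dst" --recursive --exclude "*" --include "*.html" || true
echo "copied *.html from $src"
```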
5. Find Out Number of Objects and Total Size of a Bucket
Finding out the total size of a bucket is quite useful at times. You can use the s3 ls command with the --recursive, --summarize and --human-readable options as shown below.
aws s3 ls s3://bucket-name --recursive --summarize --human-readable
aws s3 ls s3://demo-talk-with-anu --recursive --summarize --human-readable
As you can see above, the total number of objects and the total size are returned in an easy-to-read format.
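If you need the raw byte total rather than the human-readable summary, you can sum the size column of the recursive listing yourself: each default ls line is date, time, size, key, so the size is field 3. A sketch using canned listing output in place of a real bucket:

```shell
# sketch: sum object sizes from `aws s3 ls --recursive` style output
# the variable below stands in for: aws s3 ls s3://demo-talk-with-anu --recursive
listing='2023-01-01 10:00:00 100 sample.txt
2023-01-01 10:00:01 250 website/index.html'
total=$(printf '%s\n' "$listing" | awk '{sum += $3} END {print sum}')
echo "total: $total bytes"
```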
6. Generate Pre-signed URL for an Object
You can use a presigned URL to grant temporary access to an S3 object. You can use the --expires-in option to specify when the presigned URL expires.
The default is 3600 seconds (1 hour) and the maximum is 604800 seconds (7 days).
aws s3 presign s3://bucket-name/filename.html
aws s3 presign s3://demo-talk-with-anu/index.html
Note: As you can see in the above screenshot, X-Amz-Expires=3600 is shown, as that’s the default value. You can provide your own value like:
aws s3 presign s3://demo-talk-with-anu/index.html --expires-in 604800
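Rather than hard-coding 604800, you can compute --expires-in from a number of days. A sketch (object name hypothetical; the presign call is guarded so it is skipped where the CLI or credentials are absent):

```shell
# sketch: compute a 7-day expiry in seconds (the documented maximum)
days=7
expires=$((days * 24 * 60 * 60))
echo "expires-in: $expires seconds"
command -v aws >/dev/null && \
  aws s3 presign "s3://demo-talk-with-anu/index.html" --expires-in "$expires" || true
```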
7. Move File To or From S3 Bucket
You use s3 mv to move an object or file. The s3 mv command moves a local file or S3 object to another location, locally or in S3.
aws s3 mv source destination
aws s3 mv sample.txt s3://mybucket/test2.txt
Note: As expected from a move, this command copies the object/file to the destination and then deletes it from the source.
8. Sync S3 Bucket with Another Bucket or Local Directory and Vice Versa
You can sync a local folder with S3, an S3 prefix with a local folder, or one S3 prefix with another.
The syntax snippet below shows all the source/destination combinations:
sync <LocalPath> <S3Uri> or <S3Uri> <LocalPath> or <S3Uri> <S3Uri>
aws s3 sync s3://mybucket .
Using the above command, all the content of mybucket is downloaded into the current directory.
Sync Two Buckets, mybucket1 and mybucket2
aws s3 sync s3://mybucket1 s3://mybucket2
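sync also accepts --delete, which removes destination objects that no longer exist at the source; pairing it with --dryrun first is a safe way to preview a mirror-style sync. A sketch with hypothetical names (the call is guarded so the sketch runs even without the CLI):

```shell
# sketch: preview a mirror-style sync; --delete removes destination objects
# that are missing from the source, --dryrun only prints what would happen
src="./website"
dst="s3://demo-talk-with-anu/website"
command -v aws >/dev/null && aws s3 sync "$src" "$dst" --delete --dryrun || true
echo "previewed sync $src -> $dst"
```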
9. Delete a File from a Bucket
You can delete a file from an S3 bucket using the s3 rm command. This is how the syntax looks:
aws s3 rm s3://bucket-name/filename
aws s3 rm s3://demo-talk-with-anu/index.html
10. Delete All Files From a Bucket
As we saw in the previous example, we used s3 rm to delete a single file. However, if you want to delete all the objects, including those in subfolders, you can use the --recursive option as usual.
aws s3 rm s3://mybucket --recursive
aws s3 rm s3://demo-talk-with-anu --recursive
11. Delete a Bucket
You can use the s3 rb command to delete a bucket. rb stands for remove bucket.
Delete an Empty Bucket:
aws s3 rb s3://bucket-name
If you try to delete an empty bucket, all goes well, but if the bucket contains objects, the above command will fail.
If you want to delete a bucket along with its objects, use the --force option. The --force option first deletes all the objects and prefixes and then deletes the bucket.
Delete a Bucket with Objects
aws s3 rb s3://bucket-name --force
aws s3 rb s3://demo-talk-with-anu --force
As you can see in the above screenshot, first three delete operations are fired and then the remove_bucket operation.
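If you prefer explicit steps over --force, the same result comes from emptying the bucket first and then removing it. A sketch (bucket name hypothetical; calls guarded so the sketch runs even without the CLI):

```shell
# sketch: two-step equivalent of `aws s3 rb --force`
bucket="s3://demo-talk-with-anu"
command -v aws >/dev/null && aws s3 rm "$bucket" --recursive || true  # empty the bucket
command -v aws >/dev/null && aws s3 rb "$bucket" || true              # then remove it
echo "removed $bucket"
```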
In this post, we learnt some of the most used AWS S3 CLI high-level commands to manage buckets and objects.
We also learnt that a few commands like cp, mv and rm can be used on one object or all objects under a bucket or prefix by using the --recursive option.
Apart from that, there are quite a few options that you can use like --region, --profile, --dryrun etc.
Hope the post was useful to you.
Feel free to check the official AWS CLI reference for further details.