HowTo: Mirror S3 Buckets
We want to copy the objects from one S3 bucket to another. We don’t want this to be a one-time event, though: as new objects are created in the first bucket, we want them copied to the second bucket as well. In this article we show how to use AWS Lambda and the AWS Command Line Interface to mirror one S3 bucket to another.
There are two major steps to this process: we will set up an AWS Lambda function to copy new S3 objects as they are created; and we will use the AWS Command Line Interface to copy the existing objects from the source bucket to the target bucket.
Copy New Objects
We’ll be creating an AWS Lambda function to copy new S3 objects from the source bucket to a destination bucket. We’ll need to get the code for the function, set up the IAM permissions, create the Lambda function, and then set up the S3 bucket to use the function.
Get the Code
We’ll be using aws-lambda-copy-s3-objects from Eleven41 Software as the code for the AWS Lambda function. While we could build the package ourselves, it is already built and available on the releases page. The latest version as of this writing is 0.2.0, so we’ll download aws-lambda-copy-s3-objects-0.2.0.zip. The prebuilt zip file is also mirrored here.
Create IAM Policy and Role
We’ll make an IAM policy for this.
- Name: S3Copy
- Description: Allow S3Copy lambda to copy objects
- Policy Document:
```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "Stmt1430872797000",
            "Effect": "Allow",
            "Action": [
                "s3:GetBucketTagging",
                "s3:GetObject",
                "s3:PutObject"
            ],
            "Resource": [
                "*"
            ]
        },
        {
            "Sid": "Stmt1430872844000",
            "Effect": "Allow",
            "Action": [
                "cloudwatch:*"
            ],
            "Resource": [
                "*"
            ]
        },
        {
            "Sid": "Stmt1430872852000",
            "Effect": "Allow",
            "Action": [
                "logs:*"
            ],
            "Resource": [
                "*"
            ]
        }
    ]
}
```
For more details on creating an IAM policy, see HowTo: Create an IAM Policy.
Once we have the policy created, we need to create an IAM role for our AWS Lambda function. We’ll name the role S3Copy and have it use the S3Copy IAM policy we just created. For more details on creating the IAM role, see HowTo: Create an IAM Role for AWS Lambda.
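If we prefer the command line, the policy and role can be set up with the AWS CLI instead of the console. This is a rough sketch under a few assumptions: the policy document above is saved locally as s3copy-policy.json, the trust policy file name lambda-trust.json is made up for the example, and the account ID in the policy ARN is a placeholder.

```bash
# Create the S3Copy policy from the policy document shown above
# (assumes it has been saved locally as s3copy-policy.json)
aws iam create-policy \
    --policy-name S3Copy \
    --description "Allow S3Copy lambda to copy objects" \
    --policy-document file://s3copy-policy.json

# Write the standard trust policy that lets the Lambda service assume the role
cat > lambda-trust.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "lambda.amazonaws.com" },
      "Action": "sts:AssumeRole"
    }
  ]
}
EOF

# Create the S3Copy role and attach the policy to it
# (replace 123456789012 with your own account ID)
aws iam create-role \
    --role-name S3Copy \
    --assume-role-policy-document file://lambda-trust.json
aws iam attach-role-policy \
    --role-name S3Copy \
    --policy-arn arn:aws:iam::123456789012:policy/S3Copy
```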
Create AWS Lambda Function
Now that we have the code we’re going to use for the function and our IAM permissions are set up, we’ll create the AWS Lambda function itself.
Open the AWS Console and choose Lambda
If we don’t have any AWS Lambda functions we’ll get this start screen. Click Get Started Now to create a function.
If we do have other AWS Lambda functions, we’ll get this list of them. Click Create a Lambda function to create a new function.
We’ll use a Blank Function blueprint.
We’ll configure the triggers later, so just click Next.
We’ll configure the function:
- Name: S3Copy
- Description: Copy S3 objects from one bucket to another
- Runtime: Node.js 4.3
It should look like this once that part is done:
Scroll down to the next section, Lambda function code. Click on the Code entry type dropdown:
Choose Upload a .ZIP file.
Click Upload and find the zip file we downloaded earlier.
And this section should look something like this when done:
Scroll down to Lambda function handler and role and we’ll set the values like so:
- Handler: index.handler
- Role: Choose an existing role
- Existing role: S3Copy
And the section should look like this when done:
Scroll down, make any desired changes in Advanced settings, then click Next.
Review the function details
If everything looks correct, scroll down and click Create function
And our AWS Lambda function, S3Copy, is now created.
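As an alternative to clicking through the console, the same function can be created with a single AWS CLI call. This is a sketch, not the console procedure above: the role ARN assumes the S3Copy role created earlier with a placeholder account ID, and the zip file is the one downloaded from the releases page.

```bash
# Create the S3Copy function from the downloaded zip file
# (replace 123456789012 with your own account ID)
aws lambda create-function \
    --function-name S3Copy \
    --description "Copy S3 objects from one bucket to another" \
    --runtime nodejs4.3 \
    --handler index.handler \
    --role arn:aws:iam::123456789012:role/S3Copy \
    --zip-file fileb://aws-lambda-copy-s3-objects-0.2.0.zip
```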
Set Up S3
Back at the AWS Console Dashboard, choose S3 this time.
We’ll create our bucket that will receive the copies
In this example we’ll be mirroring billing-example, so we’ll name the new bucket billing-example-copy.
If we desired, we could change the region as well, but for this example we’ll keep the default. Once we’re done with the bucket name and region, click Create.
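If we’d rather not use the console for this step, the destination bucket can also be created with the AWS CLI; a minimal sketch, keeping the default region as above:

```bash
# Create the destination bucket (add --region to place it somewhere other than the default)
aws s3 mb s3://billing-example-copy
```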
Right click the bucket we want to copy from; in this example we’re using billing-example. Then click on Properties in the context menu.
The way that aws-lambda-copy-s3-objects knows which buckets to copy to is by a tag set on the source bucket. To set this up, we’ll expand the Tags section.
And we’ll add a tag by clicking on Add more tags.
The key for the tag will be TargetBucket, and the value will be a space-separated list of the buckets we want to copy to. If the bucket we want to copy to is in a different region from the one we’re copying from, add @ and then the region name (e.g. us-west-2-example-bucket@us-west-2). If the source bucket is not in US Standard and the target bucket is in US Standard, use us-east-1 for the region (e.g. us-east-1-example-bucket@us-east-1).
For this example billing-example and billing-example-copy are in the same region, so we’ll set:
- Key: TargetBucket
- Value: billing-example-copy
Once the Key and Value are set, click Save
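The same tag can be applied with the AWS CLI’s s3api put-bucket-tagging call. One caveat worth noting: this call replaces the bucket’s entire tag set, so any existing tags on the source bucket would need to be included in the TagSet as well. A sketch:

```bash
# Tag the source bucket so the Lambda function knows where to copy new objects
# (note: put-bucket-tagging replaces all existing tags on the bucket)
aws s3api put-bucket-tagging \
    --bucket billing-example \
    --tagging 'TagSet=[{Key=TargetBucket,Value=billing-example-copy}]'
```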
Now that the Tags are done, we’ll go to the Events section to set the trigger for the AWS Lambda function.
We’ll set the name for the event to be S3Copy.
Next, click on Select event(s) to set the trigger events.
We’ll choose ObjectCreated (All) so the AWS Lambda function will fire whenever a new object is created.
We’ll leave the prefix and suffix alone, since we want all objects to fire the event.
For Send To, click on Lambda function.
Now we’ll select the Lambda function we just created.
Click the Save button.
And our event is now set up.
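For completeness, the same trigger can be wired up from the command line. This is a sketch with some assumptions: the region and account ID in the Lambda ARN are placeholders, and unlike the console flow, we have to grant S3 permission to invoke the function ourselves.

```bash
# Allow S3 to invoke the S3Copy function (the console adds this permission automatically)
aws lambda add-permission \
    --function-name S3Copy \
    --statement-id s3-invoke-S3Copy \
    --action lambda:InvokeFunction \
    --principal s3.amazonaws.com \
    --source-arn arn:aws:s3:::billing-example

# Fire the S3Copy function whenever a new object is created in the source bucket
# (replace the region and account ID in the ARN with your own)
aws s3api put-bucket-notification-configuration \
    --bucket billing-example \
    --notification-configuration '{
      "LambdaFunctionConfigurations": [
        {
          "Id": "S3Copy",
          "LambdaFunctionArn": "arn:aws:lambda:us-east-1:123456789012:function:S3Copy",
          "Events": ["s3:ObjectCreated:*"]
        }
      ]
    }'
```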
Copy Existing Objects
While the Lambda function will copy new objects as they are created, it will not copy the existing objects over. We can, however, easily handle this using the AWS Command Line Interface. See HowTo: Install AWS CLI - AWS Command Line Interface for how to install it.
```console
user@hostname ~ $ aws s3 sync s3://billing-example s3://billing-example-copy
```
If the buckets are in different regions, use --source-region to specify the region of the source bucket, and --region to specify the region of the destination bucket.
```console
user@hostname ~ $ aws s3 sync s3://us-west-2-example-bucket s3://us-east-1-example-bucket --source-region us-west-2 --region us-east-1
```
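If we want to preview what the sync would do before actually copying anything, the --dryrun flag lists the objects that would be transferred:

```console
user@hostname ~ $ aws s3 sync s3://billing-example s3://billing-example-copy --dryrun
```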