Cross-account S3 access
An organization may have separate AWS accounts. For example, you might separate Dev and Prod environments, or have different accounts for different departments.
Let's say there are two AWS accounts:
- Dev: you deploy SFTP Gateway to this environment
- Prod: developers don't have access, but they need to deploy files to an S3 bucket in Prod
The first step is to provision SFTP Gateway (via CloudFormation) in the Dev AWS account.
In CloudFormation, go to the Resources tab and click the link next to `S3WritableRole`.
Copy the Role ARN of the EC2 IAM role, which in my case was:
arn:aws:iam::<dev-account>:role/rob-dev-S3WritableRole-PFHJN29ABTXS
Note: be careful not to copy the Instance Profile ARN.
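If you prefer the command line, you can look up the same ARN with the AWS CLI. This is just an optional sketch; it assumes the CLI is configured with Dev account credentials and uses the example role name from above (yours will differ).

```bash
# Look up the Role ARN of the SFTP Gateway EC2 role in the Dev account.
# The role name below is the example from this walkthrough -- substitute your own.
aws iam get-role \
  --role-name rob-dev-S3WritableRole-PFHJN29ABTXS \
  --query 'Role.Arn' \
  --output text
```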
The second step is to open the Prod AWS console (I prefer to use a separate Chrome profile for this).
Create an S3 bucket -- in my case, I called it:
rob-sftpgw-prod-bucket
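If you'd rather script the bucket creation than click through the console, a sketch with the AWS CLI looks like this. It assumes the CLI is configured with Prod account credentials and that the bucket lives in us-east-1.

```bash
# Create the destination bucket in the Prod account.
# Bucket names are globally unique, so substitute your own.
aws s3api create-bucket \
  --bucket rob-sftpgw-prod-bucket \
  --region us-east-1

# For regions other than us-east-1, also pass:
#   --create-bucket-configuration LocationConstraint=<region>
```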
The third step (still within the Prod AWS account) is to add the following bucket policy to the `rob-sftpgw-prod-bucket` S3 bucket:
- Open the bucket details in the S3 console
- Click on the Permissions tab
- Click on Bucket Policy
- Paste in the following (replace the ARNs and Resource values with your own)
- Click Save
{ "Version": "2012-10-17", "Id": "CrossAccountAccess", "Statement": [ { "Effect": "Allow", "Principal": { "AWS": "arn:aws:iam::<Dev account>:role/rob-dev-S3WritableRole-PFHJN29ABTXS" }, "Action": [ "s3:ListBucket", "s3:GetBucketLocation" ], "Resource": "arn:aws:s3:::rob-sftpgw-prod-bucket" }, { "Effect": "Allow", "Principal": { "AWS": "arn:aws:iam::<Dev account>:role/rob-dev-S3WritableRole-PFHJN29ABTXS" }, "Action": [ "s3:GetObject", "s3:PutObject", "s3:DeleteObject" ], "Resource": "arn:aws:s3:::rob-sftpgw-prod-bucket/*" } ] }
This policy allows `ListBucket` on the bucket itself, and read/write access to the objects within the bucket. Also, the `Principal` is set to the EC2 IAM role within the Dev AWS account.
Using this approach, all the permissions are handled on the Prod AWS side -- specifically, within the S3 bucket policy.
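You could also attach the policy from the command line instead of the console. This is a sketch that assumes Prod account credentials and that the policy document above has been saved to a local file (the file name is just an example).

```bash
# Attach the cross-account bucket policy to the Prod bucket.
# cross-account-policy.json is a hypothetical local file containing the
# policy document shown above.
aws s3api put-bucket-policy \
  --bucket rob-sftpgw-prod-bucket \
  --policy file://cross-account-policy.json

# Optional: confirm the policy is in place.
aws s3api get-bucket-policy --bucket rob-sftpgw-prod-bucket
```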
The fourth step is performed on the SFTP Gateway EC2 instance.
SSH into the box, and `sudo su` to root.
Provision a user:
sudo addsftpuser robtest
It will ask you a series of questions. When it asks if you want to use a custom S3 bucket and path, you would normally type `Y` and then specify a custom S3 bucket and path. Instead, hit `<enter>` (or `N`) to skip; later, you will set the custom S3 bucket and path manually.
The reason is that the `addsftpuser` command performs a ListBuckets call to see if the bucket exists in that account (and if not, it tries to create it). This command will fail because:
- the bucket is in a different account
- the S3 bucket policy approach does not grant ListBuckets for all S3 buckets (nor should it)
- the command will error out when attempting to create the bucket, since it already exists
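Before editing any config files, it can be worth sanity-checking the cross-account access directly from the instance. This is an optional sketch; it assumes the AWS CLI is available on the SFTP Gateway instance (it picks up the instance role's credentials automatically) and uses the example bucket name from above.

```bash
# Run these on the SFTP Gateway EC2 instance in the Dev account.
# They exercise the permissions granted by the Prod bucket policy.

# ListBucket / GetBucketLocation on the bucket itself:
aws s3 ls s3://rob-sftpgw-prod-bucket/

# PutObject and DeleteObject on objects within the bucket:
echo "cross-account test" > /tmp/test.txt
aws s3 cp /tmp/test.txt s3://rob-sftpgw-prod-bucket/robtest/uploads/test.txt
aws s3 rm s3://rob-sftpgw-prod-bucket/robtest/uploads/test.txt
```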
Because you skipped the custom bucket prompt, you will have to manually edit the config file for that user:
vi /home/robtest/.sftpgateway/user.properties
And add the following line:
s3.uploadpath=s3://rob-sftpgw-prod-bucket/robtest/uploads
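If you'd rather not open an editor, appending the line from the shell works just as well. A small sketch, using the example path and bucket from this walkthrough:

```bash
# Append the custom upload path to the user's config file
# (equivalent to adding the line in vi).
echo 's3.uploadpath=s3://rob-sftpgw-prod-bucket/robtest/uploads' \
  | sudo tee -a /home/robtest/.sftpgateway/user.properties
```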
At this point, you should be able to SFTP to the `robtest` account, drop files into the `/uploads` directory, and these files will show up in the Prod account S3 bucket named `rob-sftpgw-prod-bucket`.
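As a quick end-to-end test, an SFTP session might look something like the following. The hostname and key file are placeholders; substitute the address of your SFTP Gateway instance and whatever credentials you set up for the user.

```bash
# Connect as the newly provisioned user (hostname and key are placeholders).
sftp -i ~/.ssh/robtest_key robtest@sftp-gateway.example.com

# Inside the SFTP session:
#   sftp> cd uploads
#   sftp> put report.csv
#   sftp> exit
#
# The file should then appear under
# s3://rob-sftpgw-prod-bucket/robtest/uploads/ in the Prod account.
```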
However, the users in the Prod AWS account will not be able to see these files, because by default the uploaded objects are owned by (and only accessible to) the Dev AWS account. So you need to perform one last step:
Edit the config file for the user again:
vi /home/robtest/.sftpgateway/user.properties
And append the following line:
acl.option=7
Option `7` represents a canned ACL called `bucket-owner-full-control`. Any object uploaded can be accessed by both the bucket owner (Prod) and the account that created the object (Dev). See this page for more details.
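To confirm the setting took effect, upload another file over SFTP and then inspect the object from the Prod account. This is a sketch; it assumes the AWS CLI is configured with Prod account credentials and uses a hypothetical object key.

```bash
# Run with Prod account credentials. The object key is a hypothetical
# example of a file uploaded after acl.option=7 was set.
aws s3api get-object-acl \
  --bucket rob-sftpgw-prod-bucket \
  --key robtest/uploads/report.csv

# With bucket-owner-full-control applied, the Prod account is granted
# FULL_CONTROL, so Prod users can now read the object:
aws s3 cp s3://rob-sftpgw-prod-bucket/robtest/uploads/report.csv .
```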