Tarek Cheikh
Founder & AWS Cloud Architect
Every security professional's worst nightmare: waking up to find your company's name trending on social media because someone discovered your customer database sitting in an open S3 bucket. It happened to Capital One (a $190 million settlement on top of an $80 million regulatory fine), Verizon (14 million customer records exposed through a contractor's open bucket), and countless others.
What makes S3 exposure particularly dangerous is that it takes no skill to exploit: an open bucket can be read by anyone on the internet, anonymously, with a single command. The most frustrating part? Most S3 exposures aren't intentional. They happen through a perfect storm of misconfigurations, misunderstandings, and the complexity of S3's multiple security layers.
S3 has FOUR different ways to control access, and they all interact:
- Bucket ACLs (the legacy grant system)
- Bucket policies (JSON resource policies)
- IAM policies (identity-based permissions)
- S3 Block Public Access (account- and bucket-level overrides)
Imagine trying to figure out if a door is locked when there are four different locks, each with its own key, and some locks can override others. That's S3 security.
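To see why the four-locks analogy fits, it helps to pull each layer for a single bucket and look at them side by side. Here is a minimal boto3 sketch (the bucket name is a placeholder; IAM policies, being account-wide rather than per-bucket, have to be audited separately through the IAM APIs):

```python
# Sketch: inspect three of S3's four access-control layers for one bucket.
# (IAM policies, the fourth layer, are account-wide and audited separately.)

def pab_fully_blocks(cfg):
    """True only if all four Block Public Access flags are enabled."""
    required = ("BlockPublicAcls", "IgnorePublicAcls",
                "BlockPublicPolicy", "RestrictPublicBuckets")
    return cfg is not None and all(cfg.get(k) for k in required)

def inspect_layers(s3, bucket):
    """Fetch ACL, bucket policy, and Block Public Access settings."""
    from botocore.exceptions import ClientError
    layers = {"acl": s3.get_bucket_acl(Bucket=bucket)["Grants"]}
    try:  # a bucket may have no policy at all
        layers["policy"] = s3.get_bucket_policy(Bucket=bucket)["Policy"]
    except ClientError:
        layers["policy"] = None
    try:  # Block Public Access may never have been configured
        layers["block_public_access"] = s3.get_public_access_block(
            Bucket=bucket)["PublicAccessBlockConfiguration"]
    except ClientError:
        layers["block_public_access"] = None
    return layers

if __name__ == "__main__":
    import boto3
    print(inspect_layers(boto3.client("s3"), "my-example-bucket"))
```

Note that Block Public Access can override a permissive ACL or policy, which is exactly why no single layer tells you whether the door is actually locked.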
Scenario 1: The Developer Shortcut
Developer needs to share files with a contractor.
--> Googles "make S3 bucket public"
--> Copies first Stack Overflow answer
--> Applies policy making entire bucket public
--> Forgets to revert after project ends
--> 6 months later: data breach
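The "first Stack Overflow answer" in this scenario is usually some variant of the policy below, and it is easy to flag programmatically. A sketch (the policy statement is illustrative, not taken from any specific answer):

```python
# The kind of policy the copied answer applies: every object in the
# bucket becomes readable by anyone on the internet.
PUBLIC_READ_POLICY = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": "*",
        "Action": "s3:GetObject",
        "Resource": "arn:aws:s3:::project-share-bucket/*",
    }],
}

def policy_allows_public_read(policy):
    """Flag Allow statements that grant S3 read actions to everyone."""
    for stmt in policy.get("Statement", []):
        if stmt.get("Effect") != "Allow":
            continue
        principal = stmt.get("Principal")
        is_everyone = principal == "*" or (
            isinstance(principal, dict) and principal.get("AWS") == "*")
        actions = stmt.get("Action", [])
        if isinstance(actions, str):
            actions = [actions]
        if is_everyone and any(a in ("s3:GetObject", "s3:*", "*") for a in actions):
            return True
    return False

print(policy_allows_public_read(PUBLIC_READ_POLICY))  # True
```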
Scenario 2: The Migration Mistake
Team migrates from on-premises to AWS.
--> Uses automated migration tools
--> Tool creates buckets with default settings
--> Default includes "public read" for compatibility
--> Nobody checks the security settings
--> Customer data exposed from day one
Scenario 3: The Website Hosting Trap
Marketing team wants to host website assets.
--> Enables static website hosting on bucket
--> Doesn't realize this requires public access
--> Puts other files in same bucket for "convenience"
--> Entire bucket contents become public
Attackers don't randomly guess bucket names. They use automated enumeration tools and wordlists built from predictable patterns like company-name-backup or projectname-data. Once a bucket is found, downloading your data takes seconds:
aws s3 sync s3://your-exposed-bucket ./stolen-data --no-sign-request
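The same two-step attack, enumerate then probe anonymously, fits in a few lines of Python. This is a sketch (the wordlist is illustrative, and the probe ignores cross-region redirects):

```python
import itertools

def candidate_bucket_names(company, projects):
    """Generate the predictable names attackers try first (illustrative wordlist)."""
    suffixes = ["backup", "data", "logs", "assets", "prod", "dev"]
    names = [f"{company}-{s}" for s in suffixes]
    names += [f"{company}-{p}-{s}"
              for p, s in itertools.product(projects, suffixes)]
    return names

def is_anonymously_listable(bucket):
    """Probe a bucket with no credentials at all, like --no-sign-request."""
    import boto3
    from botocore import UNSIGNED
    from botocore.client import Config
    from botocore.exceptions import ClientError
    anon = boto3.client("s3", config=Config(signature_version=UNSIGNED))
    try:
        anon.list_objects_v2(Bucket=bucket, MaxKeys=1)
        return True
    except ClientError:  # AccessDenied, NoSuchBucket, redirects, etc.
        return False

print(candidate_bucket_names("acme", ["payroll"])[:3])
# ['acme-backup', 'acme-data', 'acme-logs']
```

Note that the unsigned probe never touches your credentials: that is the point. If it succeeds, so can anyone else's.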
The S3 Exposure Hunter solves these problems by analyzing ALL security mechanisms simultaneously to determine actual exposure risk.
A bucket can be public through multiple paths:
Path 1: ACL Settings
Path 2: Bucket Policy
"Principal": "*" (anyone on internet)s3:GetObject without restrictionsPath 3: Static Website Hosting
Path 4: Public Access Block Disabled
The tool doesn't just check whether buckets are "public": it assigns each finding an actual risk level of Critical, High, Medium, or Low, based on which exposure paths are open and how the rest of the bucket is configured.
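One plausible way to rank findings, consistent with the sample report shown later in this article, looks like this. The heuristics here are my own assumptions, not the tool's exact rules:

```python
def classify_risk(public_acl, public_policy, website_hosting):
    """Heuristic risk ranking (assumed; the real tool's rules may differ)."""
    if public_acl and website_hosting:
        return "Critical"   # wide-open ACL plus a public web endpoint
    if public_acl or public_policy:
        return "High"       # directly readable by anyone on the internet
    if website_hosting:
        return "Medium"     # a website endpoint implies public intent
    return "Low"

print(classify_risk(True, False, True))    # Critical
print(classify_risk(False, True, False))   # High
print(classify_risk(False, False, True))   # Medium
```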
Before running the scanner, make sure you have Python 3 installed and AWS credentials configured for the account you want to audit:
# Get the S3 Exposure Hunter
git clone https://github.com/TocConsulting/aws-helper-scripts.git
cd aws-helper-scripts/check-public-s3
# Install dependencies
pip install -r requirements.txt
# Run your first S3 security scan
python check_public_s3_cli.py
The scanner provides real-time analysis feedback as it works:
Using AWS Account: 123456789012
Analyzing S3 buckets for security issues...
Found 15 S3 buckets. Analyzing security configurations...
Analyzing company-logs... Private
Analyzing public-website-assets... PUBLIC (Public ACL, Website hosting enabled)
Analyzing data-backup-bucket... PUBLIC (Public bucket policy)
Analyzing staging-files... Private
================================================================================
S3 SECURITY ASSESSMENT SUMMARY
================================================================================
Total S3 Buckets: 15
Public Buckets: 3
Encrypted Buckets: 12
Versioned Buckets: 8
Logged Buckets: 5
Critical Risk: 1
High Risk: 1
Medium Risk: 4
================================================================================
PUBLIC BUCKETS (IMMEDIATE ATTENTION REQUIRED)
================================================================================
Bucket Name Region Risk Size (MB) Reasons
-----------------------------------------------------------------------------------------------
public-website-assets us-east-1 Critical 2450.5 Public ACL, Website hosting enabled
data-backup-bucket us-west-2 High 125.2 Public bucket policy
company-logs eu-west-1 Medium 45.7 Website hosting enabled
The tool shows exactly why each bucket is public, with a clear indicator for each exposure path.
Find ALL public buckets right now:
python check_public_s3_cli.py --public-only
Generate compliance report for audit:
python check_public_s3_cli.py --export-csv s3_audit_report.csv
Check after fixing issues:
# Run scan again to verify fixes
python check_public_s3_cli.py
If you find a critical exposure, apply this immediately:
# EMERGENCY: Block all public access immediately
aws s3api put-public-access-block \
--bucket YOUR-EXPOSED-BUCKET \
--public-access-block-configuration \
"BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true"
This is the S3 equivalent of pulling the fire alarm — it blocks ALL public access immediately.
Step 1: Enable Public Access Block (Unless bucket needs public access)
Step 2: Review and Fix ACLs
Step 3: Audit Bucket Policies
"Principal": "*" with specific principalsStep 4: Enable Encryption
Step 5: Enable Logging
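The scriptable steps above can be sketched as a single boto3 remediation plan. Bucket names are placeholders, and Step 3 (auditing bucket policies) is deliberately left out because it needs human review:

```python
def hardening_plan(bucket, log_bucket):
    """Build the remediation steps as (api_method, kwargs) pairs."""
    return [
        # Step 1: block all public access
        ("put_public_access_block", {
            "Bucket": bucket,
            "PublicAccessBlockConfiguration": {
                "BlockPublicAcls": True, "IgnorePublicAcls": True,
                "BlockPublicPolicy": True, "RestrictPublicBuckets": True,
            },
        }),
        # Step 2: reset the ACL to private
        ("put_bucket_acl", {"Bucket": bucket, "ACL": "private"}),
        # Step 3 (policy audit) requires a human and is omitted here.
        # Step 4: default encryption (SSE-S3)
        ("put_bucket_encryption", {
            "Bucket": bucket,
            "ServerSideEncryptionConfiguration": {"Rules": [
                {"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}},
            ]},
        }),
        # Step 5: access logging into a separate bucket
        ("put_bucket_logging", {
            "Bucket": bucket,
            "BucketLoggingStatus": {"LoggingEnabled": {
                "TargetBucket": log_bucket, "TargetPrefix": f"{bucket}/",
            }},
        }),
    ]

def apply_plan(s3, plan):
    """Execute each step against a boto3 S3 client."""
    for method, kwargs in plan:
        getattr(s3, method)(**kwargs)

# Real run: apply_plan(boto3.client("s3"),
#                      hardening_plan("my-bucket", "my-log-bucket"))
```

Building the plan as data before applying it also gives you a dry-run mode for free: print the plan, review it, then execute.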
Manual scanning finds current issues, but buckets can become public at any time. The Lambda version provides automated, continuous monitoring.
Configuration Drift: Even with a perfect initial setup, settings drift over time as teams create new buckets, edit policies, and ship infrastructure changes. A bucket that is private today can be public tomorrow.
# Navigate to Lambda version
cd ../check-public-s3-lambda
# Configure settings in template.yaml:
# - AlertEmail: security@yourcompany.com
# - ScanSchedule: rate(1 hour) # For critical environments
# - CriticalBucketPatterns: ["*customer*", "*backup*", "*prod*"]
# Deploy monitoring system
sam build && sam deploy --guided
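The handler at the heart of that deployment might look roughly like this. It is a sketch, not the repo's actual code, and it assumes the SNS topic ARN is injected as an environment variable by template.yaml:

```python
import os

def format_alert(bucket, region, reasons):
    """Render the alert body sent in the notification email."""
    return (
        "CRITICAL: Public S3 Bucket Detected!\n"
        f"Bucket: {bucket}\n"
        f"Region: {region}\n"
        f"Exposure: {', '.join(reasons)}\n"
        "IMMEDIATE ACTION: apply a public access block and review CloudTrail."
    )

def handler(event, context):
    """Scheduled entry point: scan every bucket, alert on public ones."""
    import boto3
    from botocore.exceptions import ClientError
    s3 = boto3.client("s3")
    sns = boto3.client("sns")
    topic = os.environ["SNS_TOPIC_ARN"]  # assumed: set by template.yaml
    for bucket in s3.list_buckets()["Buckets"]:
        name = bucket["Name"]
        try:
            # IsPublic is AWS's own evaluation of the bucket policy
            public = s3.get_bucket_policy_status(
                Bucket=name)["PolicyStatus"]["IsPublic"]
        except ClientError:
            public = False  # no bucket policy; a full scan checks ACLs too
        if public:
            region = (s3.get_bucket_location(Bucket=name)
                      .get("LocationConstraint") or "us-east-1")
            sns.publish(TopicArn=topic,
                        Subject="CRITICAL: public S3 bucket",
                        Message=format_alert(name, region,
                                             ["Public bucket policy"]))
```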
What Gets Deployed: a Lambda function that performs the scan, a schedule that triggers it at the configured interval, and an SNS topic that emails alerts to the address you set.
Critical Alert (Immediate Action):
CRITICAL: Public S3 Bucket Detected!
Bucket: customer-database-backup
Region: us-east-1
Exposure: PUBLIC READ via ACL
Size: 2.3 GB
Contains: .sql, .csv files
IMMEDIATE ACTIONS REQUIRED:
1. Apply public access block NOW
2. Review CloudTrail for access logs
3. Check if data was accessed
4. Notify security team lead
One-click fix:
https://console.aws.amazon.com/s3/buckets/customer-database-backup?tab=permissions
Daily Summary Report:
S3 Security Report - Tuesday
Summary:
45 buckets secure
2 new buckets need review
1 bucket became public (fixed)
Trends:
- Public bucket rate: 2.1% (down from 4.3%)
- Encryption coverage: 89% (up from 85%)
- Buckets with logging: 67%
Action Items:
- Review 2 new buckets created by development team
- Enable encryption on 5 legacy buckets
- Schedule security training for new developers
Day 1:
Day 2–3:
Day 4–5:
Daily:
Weekly:
Monthly:
Need a public website? Solution: Use CloudFront with an S3 Origin Access Identity so the bucket itself stays private.
Need to share files temporarily? Solution: Use presigned URLs that expire on their own.
Need another AWS account to read the data? Solution: Use cross-account IAM roles.
Track metrics like public bucket rate, encryption coverage, and logging coverage over time to demonstrate S3 security improvement.
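Given per-bucket scan results (for example, rows exported with --export-csv), those percentages reduce to a few lines. The field names here are my assumptions, not the tool's exact CSV schema:

```python
def security_metrics(buckets):
    """Aggregate per-bucket findings into trend-ready percentages."""
    total = len(buckets)
    pct = lambda n: round(100.0 * n / total, 1) if total else 0.0
    return {
        "public_bucket_rate": pct(sum(b["public"] for b in buckets)),
        "encryption_coverage": pct(sum(b["encrypted"] for b in buckets)),
        "logging_coverage": pct(sum(b["logging"] for b in buckets)),
    }

# Example scan: 4 buckets, 1 public, 3 encrypted, 2 logged.
scan = [
    {"public": False, "encrypted": True,  "logging": True},
    {"public": True,  "encrypted": False, "logging": False},
    {"public": False, "encrypted": True,  "logging": False},
    {"public": False, "encrypted": True,  "logging": True},
]
print(security_metrics(scan))
# {'public_bucket_rate': 25.0, 'encryption_coverage': 75.0, 'logging_coverage': 50.0}
```

Run it on each scan's output and plot the three numbers over time; a falling public-bucket rate is the clearest evidence the program is working.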
In Episode 5, we'll explore load balancer security — another critical attack surface often overlooked. You'll learn how to build scanners that identify internet-facing load balancers, analyze their security rules, check SSL/TLS configurations, and prevent both data exposure and DDoS risks.
All tools from this series are production-ready and available at https://github.com/TocConsulting/aws-helper-scripts. Each tool is battle-tested in production environments and includes comprehensive documentation.
This article is just the start. Get the full picture with our free whitepaper - 8 chapters covering IAM, S3, VPC, monitoring, agentic AI security, compliance, and a prioritized action plan with 50+ CLI commands.