Setting up an S3 bucket for Drupal involves using Amazon S3 to store media files — images, videos, documents, and more — instead of keeping them on the server. This significantly improves scalability, performance, and reliability while reducing server storage costs.
In this guide, we’ll use the S3 File System (S3FS) module, the most robust and widely-adopted solution for Drupal–S3 integration.
Why Use S3 with Drupal?
| Benefit | Description |
|---|---|
| Scalability | Unlimited file storage without server constraints |
| Performance | Faster delivery via AWS’s global CDN network |
| Reliability | 99.999999999% (11 nines) data durability |
| Cost Efficiency | Pay only for what you use; reduce server storage costs |
| Backup | Built-in redundancy across multiple AWS data centers |
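To put "11 nines" in perspective, here is a back-of-the-envelope sketch of what that durability figure implies for expected annual object loss (an illustration only; real-world durability depends on more than this single number):

```python
# Back-of-the-envelope: expected objects lost per year at 11-nines durability.
durability = 0.99999999999          # 11 nines, as advertised for S3
annual_loss_rate = 1 - durability   # chance a given object is lost in a year

def expected_losses(num_objects: int) -> float:
    """Expected number of objects lost per year."""
    return num_objects * annual_loss_rate

# Even with ten million objects, the expected loss is a tiny fraction of one object.
print(f"{expected_losses(10_000_000):.4f}")  # 0.0001
```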
Step 1: Create an S3 Bucket
Navigate to the AWS Console
- Go to AWS Management Console → S3
- Click Create bucket
Bucket Configuration
- Bucket name: `your-bucket-name`
- Region: Choose the region closest to your users
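Bucket names must be globally unique and follow S3's naming rules: 3–63 characters, lowercase letters, digits, hyphens, and dots, starting and ending with a letter or digit. A quick sketch of a validator for the basic rules:

```python
import re

# Basic S3 bucket-name rules: 3-63 chars, lowercase letters/digits/hyphens/dots,
# must start and end with a letter or digit. (Not exhaustive: e.g., names that
# look like IP addresses are also disallowed by S3.)
BUCKET_NAME_RE = re.compile(r"^[a-z0-9][a-z0-9.-]{1,61}[a-z0-9]$")

def is_valid_bucket_name(name: str) -> bool:
    return bool(BUCKET_NAME_RE.match(name))

print(is_valid_bucket_name("your-bucket-name"))  # True
print(is_valid_bucket_name("My_Bucket"))         # False: uppercase and underscore
```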
Public Access Settings
- For public files (images, CSS, JS): Disable “Block all public access”
- For private files: Keep “Block all public access” enabled
Tip: If you plan to serve all files through signed URLs or CloudFront with Origin Access Identity (OAI), keep the bucket private.
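Under the hood, the console checkbox corresponds to the four flags of S3's PublicAccessBlock configuration. A small sketch of that settings payload (illustrative; you would apply it via the console, CLI, or SDK):

```python
def public_access_block(public_files: bool) -> dict:
    """Build an S3 PublicAccessBlock configuration.

    public_files=True  -> all four blocks off (a bucket policy may grant public read)
    public_files=False -> all four blocks on (the bucket stays private)
    """
    block = not public_files
    return {
        "BlockPublicAcls": block,
        "IgnorePublicAcls": block,
        "BlockPublicPolicy": block,
        "RestrictPublicBuckets": block,
    }

print(public_access_block(public_files=False))
```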
Set Bucket Policy for Public Files
Add this bucket policy to allow public read access:
```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadGetObject",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::your-bucket-name/*"
    }
  ]
}
```
Replace `your-bucket-name` with your actual bucket name throughout this guide.
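If you provision buckets from scripts, the policy above can be generated rather than hand-edited. A small sketch (the document it emits matches the policy shown above):

```python
import json

def public_read_policy(bucket: str) -> str:
    """Render the public-read bucket policy for a given bucket name."""
    policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "PublicReadGetObject",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": f"arn:aws:s3:::{bucket}/*",
        }],
    }
    return json.dumps(policy, indent=2)

print(public_read_policy("your-bucket-name"))
```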
Step 2: Create an IAM User and Set Permissions
Create an IAM User
- Go to AWS IAM → Users → Add user
- Select Programmatic access
- Generate an access key and secret key
- Save these credentials securely — you won’t be able to view the secret key again
Standard IAM Policy
Create a custom policy with the following permissions:
```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:ListBucketVersions",
        "s3:ListBucket",
        "s3:GetObject",
        "s3:PutObject",
        "s3:DeleteObject",
        "s3:GetObjectAcl",
        "s3:PutObjectAcl"
      ],
      "Resource": [
        "arn:aws:s3:::your-bucket-name",
        "arn:aws:s3:::your-bucket-name/*"
      ]
    }
  ]
}
```
Minimal Permissions (Security Best Practice)
For production environments, apply the principle of least privilege:
```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:ListBucket"
      ],
      "Resource": "arn:aws:s3:::your-bucket-name"
    },
    {
      "Effect": "Allow",
      "Action": [
        "s3:GetObject",
        "s3:PutObject",
        "s3:DeleteObject"
      ],
      "Resource": "arn:aws:s3:::your-bucket-name/*"
    }
  ]
}
```
Step 3: Install the S3FS Drupal Module
The S3 File System (S3FS) module integrates S3 with Drupal's file system through PHP stream wrappers, letting Drupal read and write files on S3 as if they were local.
```bash
# Install via Composer (recommended)
composer require drupal/s3fs

# Enable the module
drush en s3fs -y
```
What S3FS Provides
- Stream Wrapper Integration: Use S3 as the default file system or for specific streams
- Public/Private File Support: Handle both public and private file streams
- Cache Integration: Local metadata caching for improved performance
- Image Style Support: Generate and store image derivatives directly on S3
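Once the stream wrapper takes over, public file URLs point at the bucket instead of your web server. As an illustration, S3's virtual-hosted-style URLs follow this pattern (your actual URLs depend on region, custom domain, and CDN settings):

```python
def s3_object_url(bucket: str, region: str, key: str) -> str:
    """Virtual-hosted-style S3 URL for a public object."""
    return f"https://{bucket}.s3.{region}.amazonaws.com/{key}"

print(s3_object_url("your-bucket-name", "us-east-1", "styles/thumbnail/public/cat.jpg"))
# https://your-bucket-name.s3.us-east-1.amazonaws.com/styles/thumbnail/public/cat.jpg
```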
Step 4: Configure S3FS
Add Configuration to settings.php
Add your AWS credentials and S3 settings to sites/default/settings.php:
```php
// AWS S3 Configuration
$settings['s3fs.access_key'] = 'YOUR_AWS_ACCESS_KEY_ID';
$settings['s3fs.secret_key'] = 'YOUR_AWS_SECRET_ACCESS_KEY';
$config['s3fs.settings']['bucket'] = 'your-bucket-name';
$config['s3fs.settings']['region'] = 'us-east-1'; // Your bucket's region
$config['s3fs.settings']['endpoint'] = 'https://s3.amazonaws.com';

// Use S3 for public files
$settings['s3fs.use_s3_for_public'] = TRUE;

// Optional: Use S3 for private files too
$settings['s3fs.use_s3_for_private'] = TRUE;
```
Security warning: Never commit AWS credentials to version control. Use environment variables instead:
```php
$settings['s3fs.access_key'] = getenv('AWS_ACCESS_KEY_ID');
$settings['s3fs.secret_key'] = getenv('AWS_SECRET_ACCESS_KEY');
```
Step 5: Configure S3FS in the Drupal Admin UI
Navigate to S3FS Settings
- Go to Administration → Configuration → Media → S3 File System settings
- Or visit: `/admin/config/media/s3fs`
Validate the Connection
- Click the “Validate” button
- You should see: ✅ Successfully connected to bucket
Configure File System Paths
- Public File Path: `s3://`
- Private File Path: `s3://private` (if using private files)
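Conceptually, the stream wrapper maps `s3://` URIs onto object keys in your bucket. A simplified sketch of that mapping (S3FS itself also honors root-folder and scheme settings, which this ignores):

```python
def uri_to_key(uri: str) -> str:
    """Map an s3:// stream-wrapper URI to an S3 object key (simplified)."""
    prefix = "s3://"
    if not uri.startswith(prefix):
        raise ValueError(f"not an s3 stream URI: {uri}")
    return uri[len(prefix):]

print(uri_to_key("s3://private/report.pdf"))  # private/report.pdf
print(uri_to_key("s3://2024-01/image.png"))   # 2024-01/image.png
```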
Advanced Settings
```php
// Image style derivatives are generated and stored on S3 automatically
// once public takeover ($settings['s3fs.use_s3_for_public']) is enabled.

// Enable CORS for cross-origin requests
$config['s3fs.settings']['cors'] = TRUE;

// Set a custom domain (e.g., if using CloudFront)
$config['s3fs.settings']['domain'] = 'cdn.yoursite.com';

// Set long-lived cache headers for static assets
$config['s3fs.settings']['cache_control_header'] = 'public, max-age=31536000';
```
Step 6: Migrate Existing Files (Optional)
If you already have local files you’d like to move to S3, use Drush:
```bash
# Copy existing public files to S3
drush s3fs-copy-local

# Refresh the S3FS file metadata cache
drush s3fs-refresh-cache
```
Manual Migration via AWS CLI
You can also sync files directly using the AWS CLI:
```bash
# Install the AWS CLI
pip install awscli

# Sync local Drupal files to S3.
# Caution: --delete removes objects under the destination prefix that no
# longer exist locally.
aws s3 sync sites/default/files/ s3://your-bucket-name/public/ --delete
```
Step 7: Advanced Configuration
CloudFront CDN Integration
For better global performance, front your S3 bucket with CloudFront:
```php
// Use a CloudFront domain for all file URLs
$config['s3fs.settings']['domain'] = 'd1234567890123.cloudfront.net';
$config['s3fs.settings']['domain_root'] = 'public';
```
CORS Configuration
If you need cross-origin file access (e.g., fonts loaded from another domain), add a CORS rule to your S3 bucket:
```json
[
  {
    "AllowedHeaders": ["*"],
    "AllowedMethods": ["GET", "HEAD"],
    "AllowedOrigins": ["https://yoursite.com"],
    "ExposeHeaders": ["ETag"],
    "MaxAgeSeconds": 3000
  }
]
```
Security Best Practices
- Use IAM roles instead of long-lived access keys where possible (e.g., on EC2/ECS)
- Store credentials in environment variables — never in code or version control
- Enable bucket versioning to protect against accidental deletions
- Enable access logging for security auditing and compliance
- Use private buckets and serve files through signed URLs or CloudFront with OAI
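To connect the auditing bullet to something concrete, here is an illustrative sketch that flags over-broad policy statements (wildcard actions or principals). It is not a substitute for proper tooling such as IAM Access Analyzer:

```python
def overly_broad(policy: dict) -> list:
    """Return Allow statements that use wildcard actions or principals."""
    flagged = []
    for stmt in policy.get("Statement", []):
        if stmt.get("Effect") != "Allow":
            continue
        actions = stmt.get("Action", [])
        if isinstance(actions, str):
            actions = [actions]
        if "*" in actions or "s3:*" in actions or stmt.get("Principal") == "*":
            flagged.append(stmt)
    return flagged

# The public-read policy from Step 1 is intentionally broad (Principal "*"):
public_read = {
    "Statement": [{
        "Effect": "Allow",
        "Principal": "*",
        "Action": "s3:GetObject",
        "Resource": "arn:aws:s3:::your-bucket-name/*",
    }]
}
print(len(overly_broad(public_read)))  # 1
```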
Performance Optimization Tips
- Use CloudFront — distribute content globally for faster delivery
- Enable metadata caching — reduces S3 API calls significantly
- Optimize image styles — generate and cache derivatives on S3 rather than locally
- Set long cache headers — use `max-age=31536000` (one year) for static assets
- Monitor costs — set up AWS billing alerts to track S3 usage
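To make cost monitoring concrete, a rough monthly estimate can be sketched from storage, transfer, and request volumes. The unit prices below are placeholders, not current AWS rates; check the S3 pricing page for your region:

```python
# Rough monthly S3 cost sketch. The unit prices are HYPOTHETICAL placeholders;
# look up current prices for your region on the AWS pricing page.
STORAGE_PER_GB = 0.023      # $/GB-month stored (assumed)
TRANSFER_PER_GB = 0.09      # $/GB out to the internet (assumed)
REQUESTS_PER_1000 = 0.0004  # $/1000 GET requests (assumed)

def monthly_cost(storage_gb: float, transfer_gb: float, get_requests: int) -> float:
    return (storage_gb * STORAGE_PER_GB
            + transfer_gb * TRANSFER_PER_GB
            + get_requests / 1000 * REQUESTS_PER_1000)

# e.g. 100 GB stored, 50 GB served, 2M image requests per month:
print(round(monthly_cost(100, 50, 2_000_000), 2))  # 7.6
```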
Conclusion
Integrating Amazon S3 with Drupal using the S3 File System module provides a robust, production-ready solution for file management at any scale. Once configured, you get:
- Unlimited storage without server constraints
- Global content delivery via CloudFront integration
- Cost optimization through AWS’s pay-as-you-go model
- Enterprise-grade reliability with 11 nines of durability
The initial setup requires careful attention to IAM permissions and bucket policies — but once in place, S3FS handles all file operations transparently, while maintaining Drupal’s familiar file management interface.
For high-traffic sites or applications with significant media requirements, this integration is essential for maintaining performance and keeping infrastructure costs predictable.
Have questions about S3FS setup or run into issues during implementation? Feel free to reach out. 🙂
