# Setup & Configuration Guide
Complete installation and configuration guide for the StoryFlow email management system, including development setup, production deployment, and operational procedures.
## Prerequisites
Before setting up the email system, ensure you have:
### Required Services
- PostgreSQL 14+: Database with proper permissions
- Resend Account: Email delivery service
- Node.js 18+: Application runtime
- Redis (optional): For caching and session storage
### Environment Access
- Admin access to database for running migrations
- Domain ownership for email authentication setup
- Monitoring and logging infrastructure
## Installation
### 1. Database Setup
The email system shares the main application database. Run the email migration:
```bash
# Navigate to your web-app directory
cd apps/web-app

# Apply email system migration
npm run db:migrate -- --file=20250711_153000_email_templates_system.sql
```
**Manual Migration (if needed):**
```bash
# Connect to your database
psql -h your-host -d storyflow -U your-user
# Run the migration file
\i lib/database/migrations/20250711_153000_email_templates_system.sql
```
**Verify Installation:**
```sql
-- Check that all tables were created
SELECT table_name FROM information_schema.tables
WHERE table_schema = 'public'
AND table_name LIKE 'email_%';
-- Should return 7 tables:
-- email_templates, email_template_versions, email_automation_rules,
-- email_send_queue, user_email_preferences, email_analytics, email_ab_tests
```
### 2. Environment Configuration
Add the following environment variables to your `.env` file:
```bash
# Resend API Configuration
RESEND_API_KEY=re_your_api_key_here
RESEND_FROM_EMAIL="StoryFlow <noreply@workbeehive.com>"
# Application URLs (for email links)
NEXT_PUBLIC_APP_URL=https://mystoryflow.com
# Email Processing Configuration (optional)
EMAIL_QUEUE_BATCH_SIZE=50
EMAIL_RETRY_MAX_ATTEMPTS=3
EMAIL_PROCESSING_INTERVAL=60000 # 1 minute in milliseconds
# Analytics Configuration (optional)
EMAIL_ANALYTICS_RETENTION_DAYS=365
EMAIL_CACHE_TTL=300 # 5 minutes
```
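To catch configuration mistakes before any email is queued, a small startup check can assert that the required variables are present. The sketch below is illustrative (the `assertEmailEnv` helper is not part of the existing service); it only assumes the variable names listed above and the `re_` prefix Resend keys use:

```typescript
// lib/email/env-check.ts — illustrative startup validation, not part of the shipped service
const REQUIRED_VARS = ['RESEND_API_KEY', 'RESEND_FROM_EMAIL', 'NEXT_PUBLIC_APP_URL'] as const

export function assertEmailEnv(): void {
  const missing = REQUIRED_VARS.filter((name) => !process.env[name])
  if (missing.length > 0) {
    throw new Error(`Missing email environment variables: ${missing.join(', ')}`)
  }
  if (!process.env.RESEND_API_KEY!.startsWith('re_')) {
    throw new Error('RESEND_API_KEY does not look like a Resend key (expected a "re_" prefix)')
  }
}
```

Call `assertEmailEnv()` once during application bootstrap (or at the top of the queue processor) so misconfiguration fails fast instead of surfacing as silent send failures.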
### 3. Resend Service Setup
#### Create Resend Account
1. Sign up at [resend.com](https://resend.com)
2. Verify your email address
3. Add your domain for email sending
#### Domain Configuration
**Add Domain:**
```bash
# In the Resend dashboard, add your domain:
#   Domain: workbeehive.com
```
**DNS Records:**
Add these DNS records to your domain:
```dns
# SPF Record
Type: TXT
Name: @
Value: "v=spf1 include:_spf.resend.com ~all"
# DKIM Record
Type: TXT
Name: resend._domainkey
Value: [Provided by Resend]
# DMARC Record (recommended)
Type: TXT
Name: _dmarc
Value: "v=DMARC1; p=quarantine; rua=mailto:dmarc@workbeehive.com"
```
**Verify Domain:**
```bash
# Test domain verification
dig TXT workbeehive.com
# Should show your SPF record
# Check DKIM record
dig TXT resend._domainkey.workbeehive.com
```
#### API Key Setup
```bash
# Generate an API key in the Resend dashboard:
#   API Keys → Create API Key
#   Name: StoryFlow Production
#   Permissions: Full Access (or sending only for production)

# Test the API key
curl -X POST 'https://api.resend.com/emails' \
  -H 'Authorization: Bearer your-api-key' \
  -H 'Content-Type: application/json' \
  -d '{
    "from": "test@workbeehive.com",
    "to": ["your-email@example.com"],
    "subject": "Test Email",
    "html": "<p>Testing Resend integration</p>"
  }'
```
### 4. Application Integration
The email service is already integrated into the web-app. Verify the integration:
```typescript
// Test email automation service
import { emailAutomationService } from '@/lib/email/email-automation-service'

// Trigger a test email (in development)
await emailAutomationService.triggerUserSignup('test-user-id', {
  user_name: 'Test User',
  onboarding_url: '/onboarding'
})
```
## Development Setup
### Local Development
**Install Dependencies:**
```bash
cd apps/web-app
npm install resend
```
**Development Environment:**
```bash
# Use local database
DATABASE_URL=postgresql://postgres:password@localhost:5432/storyflow_dev
# Use Resend test API key
RESEND_API_KEY=re_test_key
RESEND_FROM_EMAIL="Dev StoryFlow <dev@workbeehive.com>"
# Local app URL
NEXT_PUBLIC_APP_URL=http://localhost:3000
```
**Start Development Services:**
```bash
# Start the web application
npm run dev
# In another terminal, start email queue processor (optional for dev)
npm run email:process-queue
```
### Testing Email Templates
**Create Test Template:**
```bash
# Access the admin dashboard
#   http://localhost:3000/dashboard/admin/email

# Create a test template with:
#   Name: test_template
#   Display Name: Test Template
#   Category: system
#   Subject: Test Email: {{user_name}}
#   HTML: <p>Hello {{user_name}}! This is a test.</p>
```
**Test Template Rendering:**
```typescript
// In your development console or test file
const testVariables = {
  user_name: 'John Doe',
  test_url: 'http://localhost:3000/test'
}

// Test template rendering
const rendered = emailAutomationService.renderTemplate(
  'Hello {{user_name}}! Visit: {{test_url}}',
  testVariables
)
console.log(rendered) // "Hello John Doe! Visit: http://localhost:3000/test"
```
## Production Deployment
### Pre-Deployment Checklist
**Database:**
- [ ] Migration applied successfully
- [ ] All indexes created
- [ ] RLS policies enabled
- [ ] Default templates exist
**Environment:**
- [ ] Production API keys configured
- [ ] Domain DNS records verified
- [ ] Application URLs are HTTPS
- [ ] Monitoring and alerting configured
**Testing:**
- [ ] Send test emails to different providers (see the sketch after this checklist)
- [ ] Verify delivery rates and inbox placement
- [ ] Test unsubscribe flow
- [ ] Validate template rendering
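The first two testing items can be partially automated with a small script that pushes a test message to seed inboxes at the major providers. This is a sketch using the `resend` SDK installed earlier; the recipient addresses are placeholders for inboxes you control.

```typescript
// scripts/predeploy-test-send.ts — hypothetical helper for the testing checklist above
import { Resend } from 'resend'

const resend = new Resend(process.env.RESEND_API_KEY)

// Seed inboxes at a few major providers to spot-check inbox placement (placeholders).
const recipients = ['you@gmail.com', 'you@outlook.com', 'you@yahoo.com']

async function main() {
  for (const to of recipients) {
    const { data, error } = await resend.emails.send({
      from: process.env.RESEND_FROM_EMAIL!,
      to,
      subject: 'StoryFlow pre-deployment test',
      html: '<p>If this lands in the inbox (not spam), delivery looks healthy.</p>',
    })
    console.log(to, error ? `failed: ${error.message}` : `sent: ${data?.id}`)
  }
}

main().catch(console.error)
```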
### Deployment Steps
**1. Deploy Database Changes:**
```bash
# Backup production database
pg_dump -h prod-host -U user -d storyflow > backup_before_email.sql
# Apply migration to production
psql -h prod-host -U user -d storyflow -f lib/database/migrations/20250711_153000_email_templates_system.sql
# Verify deployment
psql -h prod-host -U user -d storyflow -c "SELECT COUNT(*) FROM email_templates;"
```
**2. Deploy Application Code:**
```bash
# Deploy with your normal process (Vercel, Docker, etc.)
# Ensure environment variables are configured

# Verify the email service is available
curl https://mystoryflow.com/api/admin/email/templates \
  -H "Authorization: Bearer admin-token"
```
**3. Start Background Services:**
If using a separate email queue processor:
```bash
# Docker example
docker run -d \
  --name email-processor \
  --env-file .env.production \
  your-app:latest \
  npm run email:process-queue

# Or a systemd service
sudo systemctl start storyflow-email-processor
sudo systemctl enable storyflow-email-processor
```
**4. Monitor Deployment:**
```bash
# Check email queue processing
curl https://mystoryflow.com/api/admin/email/analytics | jq '.queue_status'
# Should return: {"pending": 0, "processing": 0}
```
## Queue Processing
### Background Service
The email queue can be processed in several ways:
**Option 1: Cron Job**
```bash
# Add to crontab (runs every minute)
* * * * * cd /path/to/app && npm run email:process-queue >> /var/log/email-queue.log 2>&1
```
**Option 2: Separate Node.js Process**
```javascript
// scripts/email-queue-processor.js
import { emailAutomationService } from '../lib/email/email-automation-service.js'

async function processQueue() {
  try {
    console.log('Processing email queue...')
    await emailAutomationService.processEmailQueue(50)
  } catch (error) {
    console.error('Queue processing error:', error)
  }
}

// Process every minute
setInterval(processQueue, 60000)
processQueue() // Initial run
```
**Option 3: Serverless Function**
```javascript
// api/cron/process-emails.js (Vercel)
import { emailAutomationService } from '@/lib/email/email-automation-service'

export default async function handler(req, res) {
  if (req.headers.authorization !== `Bearer ${process.env.CRON_SECRET}`) {
    return res.status(401).json({ error: 'Unauthorized' })
  }

  try {
    await emailAutomationService.processEmailQueue(50)
    res.json({ success: true, message: 'Queue processed' })
  } catch (error) {
    console.error('Queue processing error:', error)
    res.status(500).json({ error: 'Processing failed' })
  }
}
```
**Configure Vercel Cron:**
```json
// vercel.json
{
  "crons": [
    {
      "path": "/api/cron/process-emails",
      "schedule": "* * * * *"
    }
  ]
}
```
## Monitoring & Alerting
### Health Checks
**Email System Health:**
```bash
# Check queue status
curl https://mystoryflow.com/api/admin/email/analytics | jq '.queue_status'
# Check recent delivery rates
curl https://mystoryflow.com/api/admin/email/analytics?period=1 | jq '.overall_stats.delivery_rate'
# Should be > 95%
```
**Database Health:**
```sql
-- Check for stuck emails (pending > 1 hour)
SELECT COUNT(*) FROM email_send_queue
WHERE status = 'pending'
  AND scheduled_for < NOW() - INTERVAL '1 hour';

-- Check recent error rates
SELECT
  DATE(created_at) as date,
  COUNT(*) FILTER (WHERE status = 'failed') as failed,
  COUNT(*) as total,
  ROUND(COUNT(*) FILTER (WHERE status = 'failed')::numeric / COUNT(*) * 100, 2) as error_rate
FROM email_send_queue
WHERE created_at > NOW() - INTERVAL '7 days'
GROUP BY DATE(created_at)
ORDER BY date DESC;
```
### Alerting Setup
**Prometheus Metrics** (if using Prometheus):
```javascript
// lib/email/metrics.js
import client from 'prom-client'

export const emailMetrics = {
  queueSize: new client.Gauge({
    name: 'email_queue_size',
    help: 'Number of emails in queue by status',
    labelNames: ['status']
  }),
  deliveryRate: new client.Gauge({
    name: 'email_delivery_rate',
    help: 'Email delivery rate percentage'
  }),
  processingTime: new client.Histogram({
    name: 'email_processing_duration_seconds',
    help: 'Time spent processing email queue',
    buckets: [0.1, 0.5, 1, 2, 5, 10]
  })
}

// Update metrics during queue processing.
// getQueueStats() and getEmailAnalytics() are assumed to be the app's own helpers (not shown here).
export async function updateEmailMetrics() {
  const queueStats = await getQueueStats()
  emailMetrics.queueSize.set({ status: 'pending' }, queueStats.pending)
  emailMetrics.queueSize.set({ status: 'processing' }, queueStats.processing)

  const analytics = await getEmailAnalytics(1) // Last 1 day
  emailMetrics.deliveryRate.set(parseFloat(analytics.overall_stats.delivery_rate))
}
```
**Alert Rules:**
```yaml
# alerts.yml (Prometheus)
groups:
  - name: email_system
    rules:
      - alert: EmailQueueBacklog
        expr: email_queue_size{status="pending"} > 100
        for: 5m
        annotations:
          summary: "Email queue backlog detected"
          description: "{{ $value }} emails pending in queue"

      - alert: EmailDeliveryRateDropped
        expr: email_delivery_rate < 95
        for: 10m
        annotations:
          summary: "Email delivery rate dropped below 95%"
          description: "Current delivery rate: {{ $value }}%"
```
### Log Analysis
**Email Processing Logs:**
```bash
# Search for failed email sends
grep "Failed to send email" /var/log/app.log
# Monitor queue processing
grep "Processing.*emails from queue" /var/log/app.log | tail -10
# Check template rendering errors
grep "Template rendering failed" /var/log/app.log
```
**Log Aggregation** (ELK Stack example):
```json
{
  "mappings": {
    "properties": {
      "@timestamp": { "type": "date" },
      "level": { "type": "keyword" },
      "component": { "type": "keyword" },
      "email_id": { "type": "keyword" },
      "template_id": { "type": "keyword" },
      "user_id": { "type": "keyword" },
      "status": { "type": "keyword" },
      "error_message": { "type": "text" }
    }
  }
}
```
## Troubleshooting
### Common Issues
**1. Emails Not Sending**
*Symptoms:* Queue shows pending emails but nothing is sent
*Check:*
```bash
# Verify the API key
curl -X POST 'https://api.resend.com/emails' \
  -H "Authorization: Bearer $RESEND_API_KEY" \
  -H 'Content-Type: application/json' \
  -d '{"from":"test@workbeehive.com","to":["test@example.com"],"subject":"Test","html":"<p>Test</p>"}'

# Check that the queue processor is running
ps aux | grep email

# Check application logs
tail -f /var/log/app.log | grep -i email
```
*Solutions:*
- Verify Resend API key is correct
- Check domain verification status
- Ensure queue processor is running
- Check for rate limiting (a retry/backoff sketch follows this list)
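If rate limiting turns out to be the culprit, wrapping the send call in a retry with exponential backoff usually clears the backlog without manual intervention. The sketch below is generic: `sendOnce` stands in for whatever function performs the actual send, and the attempt limit mirrors `EMAIL_RETRY_MAX_ATTEMPTS` from the configuration above.

```typescript
// Illustrative retry-with-backoff wrapper; sendOnce() is a placeholder for the real send call.
async function sendWithBackoff(sendOnce: () => Promise<void>, maxAttempts = 3): Promise<void> {
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      await sendOnce()
      return
    } catch (error) {
      if (attempt === maxAttempts) throw error
      const delayMs = 1000 * 2 ** (attempt - 1) // 1s, 2s, 4s, ...
      console.warn(`Send attempt ${attempt} failed, retrying in ${delayMs}ms`, error)
      await new Promise((resolve) => setTimeout(resolve, delayMs))
    }
  }
}
```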
**2. Low Delivery Rate**
*Symptoms:* Emails sent but not reaching inboxes
*Check:*
```bash
# Check bounce and complaint rates
curl https://mystoryflow.com/api/admin/email/analytics | jq '.overall_stats'
```

```sql
-- Review recent failed emails
SELECT error_message, COUNT(*)
FROM email_send_queue
WHERE status = 'failed'
  AND created_at > NOW() - INTERVAL '24 hours'
GROUP BY error_message;
```
*Solutions:*
- Review email content for spam triggers
- Check domain reputation
- Verify DNS records (SPF, DKIM, DMARC)
- Monitor blacklist status
**3. Template Rendering Errors**
*Symptoms:* Emails fail with template variable errors
*Check:*
```sql
-- Find emails with template errors
SELECT template_id, error_message, COUNT(*)
FROM email_send_queue
WHERE status = 'failed'
AND error_message LIKE '%variable%'
GROUP BY template_id, error_message;
```
*Solutions:*
- Validate required variables are provided (see the sketch after this list)
- Check for missing template variables
- Review template syntax
- Test templates with sample data
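Most of these failures can be caught before the email is ever queued by comparing the variables a template expects against the variables supplied by the trigger. The sketch below assumes the template's required variable names are available (for example from its stored definition); it is not the service's existing validation code.

```typescript
// Illustrative pre-queue check; requiredVariables would come from the template record.
function findMissingVariables(
  requiredVariables: string[],
  provided: Record<string, unknown>
): string[] {
  return requiredVariables.filter(
    (name) => provided[name] === undefined || provided[name] === null || provided[name] === ''
  )
}

// Example: onboarding_url is missing, so fail fast instead of queueing a broken email.
const missing = findMissingVariables(['user_name', 'onboarding_url'], { user_name: 'Test User' })
if (missing.length > 0) {
  throw new Error(`Template variables missing: ${missing.join(', ')}`)
}
```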
### Performance Tuning
**Database Optimization:**
```sql
-- Analyze email table performance
ANALYZE email_send_queue;
ANALYZE email_templates;
-- Check index usage
SELECT schemaname, tablename, indexname, idx_scan
FROM pg_stat_user_indexes
WHERE schemaname = 'public'
AND tablename LIKE 'email_%'
ORDER BY idx_scan DESC;
-- Archive old email data (optional)
DELETE FROM email_send_queue
WHERE created_at < NOW() - INTERVAL '90 days'
AND status IN ('sent', 'failed');
```
**Queue Processing Optimization:**
```javascript
// Queue-processing tuning (assumes the node-postgres driver for pooling)
import { Pool } from 'pg'

// Adjust batch size based on performance
const OPTIMAL_BATCH_SIZE = process.env.NODE_ENV === 'production' ? 100 : 10

// Use connection pooling
const emailProcessingPool = new Pool({
  connectionString: process.env.DATABASE_URL,
  max: 5, // Limit concurrent connections
  idleTimeoutMillis: 30000
})

// Implement graceful shutdown
process.on('SIGTERM', async () => {
  console.log('Shutting down email processor...')
  await emailProcessingPool.end()
  process.exit(0)
})
```
### Backup & Recovery
**Database Backup:**
```bash
# Backup email-related tables
pg_dump -h host -U user -d storyflow \
  -t email_templates \
  -t email_template_versions \
  -t email_automation_rules \
  -t email_send_queue \
  -t user_email_preferences \
  -t email_analytics \
  -t email_ab_tests \
  > email_system_backup.sql

# Restore from backup
psql -h host -U user -d storyflow < email_system_backup.sql
```
**Configuration Backup:**
```bash
# Backup environment variables
env | grep -E "(RESEND|EMAIL)" > email_env_backup.txt
# Backup DNS records
dig TXT workbeehive.com > dns_records_backup.txt
dig TXT resend._domainkey.workbeehive.com >> dns_records_backup.txt
```
## Security Considerations
### Email Security
**Content Security:**
- Sanitize all user-generated content in emails
- Use parameterized templates to prevent injection
- Validate email addresses before sending (see the sketch after this list)
- Implement rate limiting on email sending
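As one concrete shape for the first three points, a lightweight guard can run before anything is queued. The helpers below are illustrative stand-ins, not the project's actual sanitizer; the regex is a deliberately simple format check, not a full RFC 5322 validator.

```typescript
// Illustrative pre-queue guard: validate the recipient and escape user-supplied values.
const EMAIL_PATTERN = /^[^\s@]+@[^\s@]+\.[^\s@]+$/

export function isValidEmail(address: string): boolean {
  return EMAIL_PATTERN.test(address.trim())
}

// Escape user-generated text before it is interpolated into HTML templates.
export function escapeHtml(value: string): string {
  return value
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;')
    .replace(/"/g, '&quot;')
    .replace(/'/g, '&#39;')
}
```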
**Authentication:**
- Use strong API keys for Resend
- Rotate API keys regularly
- Implement webhook signature verification (a sketch follows this list)
- Use HTTPS for all email-related endpoints
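For webhook signature verification, the exact header names and signing scheme come from Resend's webhook documentation, so treat the HMAC check below as an illustration of the shape (raw body, shared secret, constant-time comparison) rather than the definitive implementation.

```typescript
// Illustrative HMAC signature check; header and secret handling are assumptions.
import { createHmac, timingSafeEqual } from 'node:crypto'

export function verifyWebhookSignature(rawBody: string, signatureHeader: string, secret: string): boolean {
  const expected = createHmac('sha256', secret).update(rawBody).digest('hex')
  const received = Buffer.from(signatureHeader)
  const computed = Buffer.from(expected)
  return received.length === computed.length && timingSafeEqual(received, computed)
}
```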
**Privacy Compliance:**
```sql
-- GDPR compliance: Delete user email data
CREATE OR REPLACE FUNCTION delete_user_email_data(user_uuid UUID)
RETURNS void AS $$
BEGIN
  -- Delete user's email preferences
  DELETE FROM user_email_preferences WHERE user_id = user_uuid;

  -- Anonymize sent emails (keep for analytics but remove PII)
  UPDATE email_send_queue
  SET to_email = 'deleted@example.com',
      template_variables = '{}',
      html_content = '[Content Deleted]',
      text_content = '[Content Deleted]'
  WHERE user_id = user_uuid;
END;
$$ LANGUAGE plpgsql SECURITY DEFINER;
```
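Calling the function from the application might look like the sketch below; the node-postgres pool mirrors the one in the queue-processing example and is an assumption about how the app reaches Postgres.

```typescript
// Illustrative call site for the delete_user_email_data() function defined above.
import { Pool } from 'pg'

const pool = new Pool({ connectionString: process.env.DATABASE_URL })

export async function deleteUserEmailData(userId: string): Promise<void> {
  await pool.query('SELECT delete_user_email_data($1)', [userId])
}
```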
### Access Control
**API Security:**
```javascript
// Rate limiting for email APIs (Express middleware)
import rateLimit from 'express-rate-limit'

const emailApiLimiter = rateLimit({
  windowMs: 60 * 1000, // 1 minute
  max: 100, // Limit each IP to 100 requests per windowMs
  message: 'Too many email API requests'
})

app.use('/api/admin/email', emailApiLimiter)
```
**Database Security:**
```sql
-- Create dedicated email service user
CREATE USER email_service WITH PASSWORD 'strong_password';
-- Grant minimal required permissions
GRANT SELECT, INSERT, UPDATE ON email_send_queue TO email_service;
GRANT SELECT ON email_templates TO email_service;
GRANT SELECT ON user_email_preferences TO email_service;
-- Revoke unnecessary permissions
REVOKE DELETE ON email_templates FROM email_service;
```
For ongoing maintenance and operational procedures, monitor the system regularly and keep the documentation updated as the system evolves.