# Database Migrations Guide

## Overview

This document describes the database migration process for CODITECT Dev Context v2.0. Django migrations track changes to your database schema and allow you to version-control your database structure.
## Prerequisites

- Django 5.0.8 installed
- PostgreSQL 15+ database accessible
- Database credentials configured in `.env`
## Initial Setup

### 1. Install Dependencies

```bash
# Development environment
pip install -r requirements.txt

# Production environment
pip install -r requirements-production.txt
```
### 2. Configure Database

Create a `.env` file with database credentials:

```bash
# Database Configuration
DATABASE_URL=postgresql://user:password@localhost:5432/coditect_dev_context

# For Google Cloud SQL
# DATABASE_URL=postgresql://user:password@/dbname?host=/cloudsql/PROJECT:REGION:INSTANCE

# Django Secret Key
SECRET_KEY=your-secret-key-here

# Environment
DEBUG=False
ALLOWED_HOSTS=localhost,127.0.0.1,.coditect.ai
```
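To make the `DATABASE_URL` format concrete, here is a minimal sketch of splitting such a URL into a Django-style `DATABASES` entry. The `database_settings_from_url` helper is illustrative only — in a real project a library such as `dj-database-url` is commonly used instead:

```python
from urllib.parse import urlsplit

def database_settings_from_url(url: str) -> dict:
    """Translate a postgresql:// URL into a Django-style DATABASES entry."""
    parts = urlsplit(url)
    if parts.scheme != "postgresql":
        raise ValueError(f"unsupported scheme: {parts.scheme!r}")
    return {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": parts.path.lstrip("/"),
        "USER": parts.username or "",
        "PASSWORD": parts.password or "",
        "HOST": parts.hostname or "",
        "PORT": str(parts.port or 5432),
    }

settings = database_settings_from_url(
    "postgresql://user:password@localhost:5432/coditect_dev_context"
)
print(settings["NAME"])  # coditect_dev_context
```

Note the Cloud SQL variant above uses an empty host plus a `?host=/cloudsql/...` query parameter, which this simple sketch does not handle.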
## Migration Commands

### Create Initial Migrations

Generate migration files from the models:

```bash
python manage.py makemigrations backend
```

This creates a migration file at `backend/migrations/0001_initial.py` containing all 26 models.

### View Migration SQL

See what SQL will be executed:

```bash
python manage.py sqlmigrate backend 0001
```

### Apply Migrations

Execute migrations to create the database tables:

```bash
python manage.py migrate
```

### Check Migration Status

See which migrations have been applied:

```bash
python manage.py showmigrations
```
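The `showmigrations` output can also be checked programmatically, e.g. in a deployment gate script. The sketch below parses Django's default `--list` style, where applied migrations are marked `[X]` and pending ones `[ ]`; the script itself is an illustration, not part of the project:

```python
def pending_migrations(output: str) -> list:
    """Return migration names still marked unapplied ("[ ]")."""
    pending = []
    for line in output.splitlines():
        line = line.strip()
        if line.startswith("[ ]"):
            pending.append(line[3:].strip())
    return pending

# Sample output as produced by `python manage.py showmigrations backend`
sample = (
    "backend\n"
    " [X] 0001_initial\n"
    " [ ] 0002_add_project_tags\n"
)
print(pending_migrations(sample))  # ['0002_add_project_tags']
```

A CI step could fail the build when the returned list is non-empty, catching models that were changed without a committed migration.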
## Migration Workflow

### For Development

- Modify models in `backend/models.py`
- Create a migration: `python manage.py makemigrations backend`
- Review the migration file in `backend/migrations/`
- Test the migration: `python manage.py migrate`
- Commit the migration file to git
### For Production (GKE)

- Review migrations in a staging environment first
- Back up the database before applying migrations
- Apply migrations during deployment:

  ```bash
  kubectl exec -it deployment/coditect-backend -- python manage.py migrate
  ```

- Verify the database schema matches expectations
## Row-Level Security Setup

After the initial migrations, manually create the PostgreSQL RLS policies.

### Connect to PostgreSQL

```bash
# Local
psql -U postgres -d coditect_dev_context

# Google Cloud SQL (via proxy)
cloud_sql_proxy -instances=PROJECT:REGION:INSTANCE=tcp:5432 &
psql -h localhost -U postgres -d coditect_dev_context
```
### Create RLS Policies

```sql
-- Enable RLS on all tables with tenant_id
ALTER TABLE backend_user ENABLE ROW LEVEL SECURITY;
ALTER TABLE backend_team ENABLE ROW LEVEL SECURITY;
ALTER TABLE backend_project ENABLE ROW LEVEL SECURITY;
ALTER TABLE backend_session ENABLE ROW LEVEL SECURITY;
ALTER TABLE backend_repository ENABLE ROW LEVEL SECURITY;
ALTER TABLE backend_role ENABLE ROW LEVEL SECURITY;
ALTER TABLE backend_taskstatus ENABLE ROW LEVEL SECURITY;
-- ... (repeat for all tenant-scoped tables)

-- Create policies for tenant isolation
CREATE POLICY tenant_isolation_user ON backend_user
    FOR ALL TO authenticated_users
    USING (tenant_id = current_setting('app.current_tenant_id')::uuid);

CREATE POLICY tenant_isolation_team ON backend_team
    FOR ALL TO authenticated_users
    USING (tenant_id = current_setting('app.current_tenant_id')::uuid);

CREATE POLICY tenant_isolation_project ON backend_project
    FOR ALL TO authenticated_users
    USING (tenant_id = current_setting('app.current_tenant_id')::uuid);
-- ... (repeat for all tenant-scoped tables)

-- Create a role for the Django application
CREATE ROLE django_app WITH LOGIN PASSWORD 'secure-password';
GRANT CONNECT ON DATABASE coditect_dev_context TO django_app;
GRANT USAGE ON SCHEMA public TO django_app;
GRANT SELECT, INSERT, UPDATE, DELETE ON ALL TABLES IN SCHEMA public TO django_app;
GRANT USAGE, SELECT ON ALL SEQUENCES IN SCHEMA public TO django_app;
```
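These policies only isolate tenants if the application sets `app.current_tenant_id` on each database session before querying. Below is a hypothetical sketch of that application-side step; the `TenantSession` and `FakeCursor` names are illustrative (a real Django setup would typically do this in middleware via `django.db.connection.cursor()`):

```python
class TenantSession:
    """Pins the PostgreSQL session variable read by the RLS policies."""

    def __init__(self, cursor):
        self.cursor = cursor

    def activate(self, tenant_id: str) -> None:
        # set_config(name, value, is_local=false) keeps the setting for the
        # whole session, matching current_setting('app.current_tenant_id')
        # in the USING clauses above.
        self.cursor.execute(
            "SELECT set_config('app.current_tenant_id', %s, false)",
            [tenant_id],
        )

class FakeCursor:
    """Stand-in for a DB-API cursor so this sketch runs without PostgreSQL."""

    def __init__(self):
        self.executed = []

    def execute(self, sql, params=None):
        self.executed.append((sql, params))

cursor = FakeCursor()
TenantSession(cursor).activate("11111111-1111-1111-1111-111111111111")
```

Passing the tenant id as a bound parameter (rather than interpolating it into the SQL string) avoids injection through a forged tenant value.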
## Common Migration Scenarios

### Adding a New Field

```python
# backend/models.py
class Project(models.Model):
    # ... existing fields
    tags = models.JSONField(default=list, blank=True)  # NEW FIELD
```

```bash
python manage.py makemigrations backend --name add_project_tags
python manage.py migrate
```
### Modifying a Field

```python
# Change max_length from 255 to 500
class Project(models.Model):
    name = models.CharField(max_length=500)  # Changed from 255
```

```bash
python manage.py makemigrations backend --name increase_project_name_length
python manage.py migrate
```
### Adding an Index

```python
class Task(models.Model):
    class Meta:
        indexes = [
            models.Index(fields=['status', 'assigned_to']),  # NEW INDEX
            models.Index(fields=['due_date']),
        ]
```

```bash
python manage.py makemigrations backend --name add_task_indexes
python manage.py migrate
```
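Column order in the composite index above matters: a B-tree on `(status, assigned_to)` is sorted by `status` first, so a query filtering only on `status` can still range-scan it, while one filtering only on `assigned_to` cannot. A toy model of that sorted structure (illustrative only, not how PostgreSQL is implemented):

```python
import bisect

# Toy model of a composite B-tree index on (status, assigned_to):
# a list of key tuples kept in sorted order.
index = sorted([("done", 7), ("open", 2), ("todo", 2), ("open", 5)])

def rows_with_status(status):
    """Range-scan on the leftmost column - works because status sorts first."""
    lo = bisect.bisect_left(index, (status,))
    hi = bisect.bisect_right(index, (status, float("inf")))
    return index[lo:hi]

print(rows_with_status("open"))  # [('open', 2), ('open', 5)]
```

Finding all rows for a given `assigned_to` in this structure would require scanning the whole list, which is why the order `['status', 'assigned_to']` should match the most common filter.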
### Data Migration

Create an empty migration and add data operations:

```bash
python manage.py makemigrations backend --empty --name populate_default_task_statuses
```

Edit the migration file:

```python
from django.db import migrations

def create_default_statuses(apps, schema_editor):
    TaskStatus = apps.get_model('backend', 'TaskStatus')
    Tenant = apps.get_model('backend', 'Tenant')
    for tenant in Tenant.objects.all():
        TaskStatus.objects.bulk_create([
            TaskStatus(tenant=tenant, name='To Do', slug='todo', color='#6B7280', sort_order=1),
            TaskStatus(tenant=tenant, name='In Progress', slug='in-progress', color='#3B82F6', sort_order=2),
            TaskStatus(tenant=tenant, name='Done', slug='done', color='#10B981', is_completed=True, sort_order=3),
        ])

class Migration(migrations.Migration):
    dependencies = [
        ('backend', '0001_initial'),
    ]
    operations = [
        # A no-op reverse keeps the migration reversible with `migrate`
        migrations.RunPython(create_default_statuses, migrations.RunPython.noop),
    ]
```
## Rollback Migrations

### Rollback Last Migration

```bash
python manage.py migrate backend 0001  # Roll back to migration 0001
```

### Rollback All Migrations

```bash
python manage.py migrate backend zero  # Unapply all backend migrations (DANGER!)
```

### Fake Migration (Mark as Applied Without Running)

```bash
python manage.py migrate backend 0001 --fake
```
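Django decides what `migrate backend 0001` must unapply from its `django_migrations` bookkeeping table. A rough sketch of that selection (the rows are illustrative, and the real ordering follows the migration dependency graph, not just name order):

```python
def to_unapply(recorded, app, target):
    """Migrations recorded for `app` that sort after `target` -
    roughly what `migrate <app> <target>` would roll back."""
    return sorted(name for a, name in recorded if a == app and name > target)

# Illustrative contents of the django_migrations table: (app, name) pairs.
recorded = [
    ("backend", "0001_initial"),
    ("backend", "0002_add_project_tags"),
    ("backend", "0003_add_task_indexes"),
    ("auth", "0001_initial"),
]
print(to_unapply(recorded, "backend", "0001_initial"))
# ['0002_add_project_tags', '0003_add_task_indexes']
```

This is also why `--fake` is dangerous: it edits only this bookkeeping, so the recorded state can drift from the actual schema.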
## Migration Best Practices

### 1. Always Review Generated Migrations

Django auto-generates migrations, but review them before committing:

```bash
cat backend/migrations/0002_add_field.py
```

Check for:

- Unintended field changes
- Missing dependencies
- Performance implications (large table alterations)
### 2. Test Migrations in Staging First

Never apply untested migrations to production:

```bash
# Staging
python manage.py migrate --database=staging

# Production (after verification)
python manage.py migrate --database=production
```
### 3. Backup Before Migrating

Always back up the production database before migrating:

```bash
# Google Cloud SQL
gcloud sql backups create --instance=INSTANCE_NAME

# PostgreSQL (manual)
pg_dump -h localhost -U postgres coditect_dev_context > backup_$(date +%Y%m%d).sql
```
### 4. Zero-Downtime Migrations

For large tables, use multi-step migrations:

Step 1: Add the new column as nullable:

```python
new_field = models.CharField(max_length=255, null=True, blank=True)
```

Step 2: Run a data migration to populate the new column.

Step 3: Make the column non-nullable:

```python
new_field = models.CharField(max_length=255)
```
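Step 2 is the delicate part on a large table: batching keeps each write transaction short so rows are never locked for long. A minimal sketch of the batched backfill logic, using plain dicts in place of model instances (the populate rule and the batch size of 500 are assumptions):

```python
def backfill(rows, compute, batch_size=500):
    """Populate row['new_field'] where missing, in bounded batches.
    Returns the number of rows updated; in a real migration each flushed
    batch would be a bulk_update inside its own short transaction."""
    updated = 0
    batch = []
    for row in rows:
        if row.get("new_field") is None:
            row["new_field"] = compute(row)
            batch.append(row)
        if len(batch) >= batch_size:
            updated += len(batch)
            batch.clear()  # flush point: bulk_update(batch) in real code
    updated += len(batch)
    return updated

rows = [{"name": f"row{i}", "new_field": None} for i in range(3)]
print(backfill(rows, lambda r: r["name"].upper(), batch_size=2))  # 3
```

Because the column is still nullable during this step, old application code that does not write `new_field` keeps working while the backfill runs.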
### 5. Squash Migrations Periodically

Combine multiple migrations into one:

```bash
python manage.py squashmigrations backend 0001 0010
```
## Troubleshooting

### Migration Conflicts

If two developers create migrations simultaneously:

```bash
# Merge the conflicting migrations
python manage.py makemigrations --merge
```

### Fake Migrations

If a migration has already been applied manually:

```bash
python manage.py migrate backend 0002 --fake
```
### Reset Migrations (Development Only)

WARNING: Destroys all data!

```bash
# Remove all data from the tables (flush truncates; it does not drop tables)
python manage.py flush --no-input

# Delete migration files
rm backend/migrations/0*.py

# Recreate migrations
python manage.py makemigrations backend
python manage.py migrate
```
## Production Deployment Checklist

- Back up the database before migrating
- Test migrations in a staging environment
- Review migration SQL with `sqlmigrate`
- Check for breaking changes (removed fields, renamed tables)
- Plan a rollback strategy in case the migration fails
- Schedule a maintenance window for large migrations
- Monitor the database during and after migration
- Verify the application works after migration
- Update RLS policies if new tenant-scoped tables were added
## Google Cloud SQL Specific

### Connect via Cloud SQL Proxy

```bash
# Download the proxy
wget https://dl.google.com/cloudsql/cloud_sql_proxy.linux.amd64 -O cloud_sql_proxy
chmod +x cloud_sql_proxy

# Start the proxy
./cloud_sql_proxy -instances=PROJECT:REGION:INSTANCE=tcp:5432
```

### Run Migrations via kubectl

```bash
# Get the pod name
kubectl get pods -l app=coditect-backend

# Run the migration
kubectl exec -it POD_NAME -- python manage.py migrate

# View migration status
kubectl exec -it POD_NAME -- python manage.py showmigrations
```
## Next Steps

- Create initial migrations: `python manage.py makemigrations backend`
- Apply migrations: `python manage.py migrate`
- Set up RLS policies: run the SQL scripts above
- Create a superuser: `python manage.py createsuperuser`
- Load initial data: create fixtures or data migrations
- Test the API: access `/api/docs/` for Swagger documentation

---

**Version:** 2.0.0
**Last Updated:** 2025-11-26
**Author:** CODITECT Development Team