
Automation Architecture Design - CODITECT Cloud Backend/Frontend

Executive Summary

Purpose: Transform deployment troubleshooting knowledge into automated prevention and validation systems using specialized CODITECT agents, skills, commands, scripts, hooks, and workflows.

Context: During deployment of end-to-end user registration (December 2, 2025), we discovered 8 distinct issues that required manual troubleshooting. This design document specifies automation components to prevent these issues and similar problems in future deployments.

Business Value:

  • 40-60% reduction in manual deployment troubleshooting
  • Prevent recurring issues through automated validation
  • Faster deployment cycles (from hours to minutes)
  • Higher deployment success rate (>95% first-time success)
  • Knowledge preservation encoded in automation

Implementation Timeline: 5 weeks (Week 1: Foundation; Weeks 2-3: Core Components; Week 4: Integration; Week 5: Production)


Problem Statement

Issues Discovered During Deployment (December 2, 2025)

Context: Deploying the complete end-to-end user registration flow revealed 8 issues that delayed the production deployment by several hours and required incremental fixes.

Issue Categories:

  1. Configuration Consistency Issues (3 issues)

    • Wrong secret key names in migration job
    • Missing DJANGO_SECRET_KEY environment variable
    • Wrong Django settings module path
  2. Infrastructure Assumptions (2 issues)

    • Wrong Kubernetes nodepool name
    • Wrong Docker registry (GCR vs Artifact Registry)
  3. Code Synchronization Issues (2 issues)

    • Frontend/backend field name mismatch (name vs full_name)
    • Django migration conflicts (parallel branch creation)
  4. Database Schema Issues (1 issue)

    • Missing stripe_customer_id column

Root Cause Analysis:

All issues share common patterns:

  • Manual configuration copying prone to errors
  • Lack of automated validation before deployment
  • No contract testing between frontend/backend
  • No migration conflict detection in CI/CD
  • Infrastructure assumptions not validated against actual state

User Requirement:

"we need to be more wholistic in solving these issues, rather than one at a time, when we identify an issue, it should be added to the project plan and the tasklist and when issues have a common grouping they should be solved wholistically rather than just one at a time, tests should be created to test the individual solutions"


Solution Architecture

Automation Components Overview

┌────────────────────────────────────────────────────────────────┐
│ CODITECT Cloud Platform │
│ Automation Architecture │
└────────────────────────────────────────────────────────────────┘

┌─────────────────┐ ┌─────────────────┐ ┌─────────────────┐
│ Specialized │ │ Skills │ │ Commands │
│ Agents │ │ (Capabilities) │ │ (User Interface)│
│ │ │ │ │ │
│ • Django │ │ • Migration │ │ /validate- │
│ Deployment │ │ Validation │ │ deployment │
│ • Contract │ │ • K8s Manifest │ │ /generate- │
│ Validator │ │ Generation │ │ migration-job │
│ • K8s Migration │ │ • TS from Django│ │ /check-contract │
│ Specialist │ │ • Env Checker │ │ /sync-types │
│ │ │ • Contract Tests│ │ │
└────────┬────────┘ └────────┬────────┘ └────────┬────────┘
│ │ │
└────────────────────┴────────────────────┘

┌────────────────────┴────────────────────┐
│ │
┌────────▼────────┐ ┌──────────▼─────────┐
│ Scripts │ │ Hooks │
│ (Automation) │ │ (Triggers) │
│ │ │ │
│ • Generate │ │ Pre-commit: │
│ Migration Job │ │ • Migration checks │
│ • Validate │ │ • Contract tests │
│ Secrets │ │ │
│ • Check │ │ Pre-deployment: │
│ Conflicts │ │ • Readiness check │
│ • Generate TS │ │ • Secret validation│
│ • Compare Envs │ │ │
└──────────────────┘ └────────────────────┘

┌────────────────────────────────────────────────────────────────┐
│ Workflows │
│ (End-to-End Orchestration) │
│ │
│ Backend Deployment → Contract Sync → Pre-Deployment Validation │
└────────────────────────────────────────────────────────────────┘

Component Interaction Flow

Developer Action
               ↓
┌─────────────────────────────┐
│ Pre-commit Hook │
│ • Check migrations │
│ • Validate contracts │
└──────────────┬──────────────┘
               ↓
┌─────────────────────────────┐
│ Contract Sync Workflow │
│ • Generate TS types │
│ • Run contract tests │
└──────────────┬──────────────┘
               ↓
┌─────────────────────────────┐
│ Pre-Deployment Validation │
│ • Validate secrets │
│ • Check env consistency │
│ • Verify migrations │
└──────────────┬──────────────┘
               ↓
┌─────────────────────────────┐
│ Backend Deployment Workflow │
│ • Generate migration job │
│ • Apply migrations │
│ • Deploy backend │
└──────────────┬──────────────┘
               ↓
Production ✅

Component Specifications

1. Specialized Agents

1.1 django-deployment-specialist

Purpose: Expert in Django-specific deployment patterns, migration strategies, and environment configuration.

Capabilities:

  • Django migration planning and conflict resolution
  • Environment-specific settings validation
  • Database schema evolution strategies
  • Migration job manifest generation
  • Rollback planning and execution

Responsibilities:

  • Analyze Django migrations for conflicts
  • Generate Kubernetes migration jobs
  • Validate environment variables
  • Recommend migration strategies (online vs offline)
  • Create rollback plans

Integration Points:

  • Uses django-migration-validation skill
  • Uses k8s-manifest-generation skill
  • Invoked by /validate-deployment-readiness command
  • Invoked by backend-deployment-workflow

Input Interface:

{
  "action": "validate_migrations|generate_job|plan_rollback",
  "django_project_root": "/path/to/project",
  "target_environment": "staging|production",
  "kubernetes_namespace": "coditect-staging",
  "deployment_name": "coditect-backend"
}

Output Interface:

{
  "status": "success|warning|error",
  "migrations_valid": true|false,
  "conflicts_detected": [],
  "generated_job": "path/to/migration-job.yaml",
  "rollback_plan": {
    "steps": [],
    "estimated_downtime": "5 minutes"
  },
  "recommendations": []
}

Invocation Pattern:

# Via Task tool
Task(subagent_type="django-deployment-specialist",
     description="Validate Django migrations before deployment",
     prompt="Analyze Django migrations in coditect-cloud-backend for conflicts and generate migration job for staging environment")

# Via command
/validate-deployment-readiness --target staging

Implementation Details:

  • UAF v2.0 agent specification
  • Python integration via scripts/django-deployment.py
  • Kubernetes API integration for cluster state inspection
  • Django management command integration

Error Handling:

  • Migration conflict detection with merge suggestions
  • Environment variable validation with specific error messages
  • Secret key validation with actual vs expected comparison
  • Rollback capability if migrations fail

Testing Strategy:

  • Unit tests for migration analysis logic
  • Integration tests with actual Django projects
  • Kubernetes manifest generation validation
  • Mock Kubernetes API for testing

1.2 frontend-backend-contract-validator

Purpose: Ensure perfect alignment between frontend TypeScript interfaces and backend Django models.

Capabilities:

  • TypeScript type generation from Django models
  • API contract validation (request/response schemas)
  • Field name consistency checking (snake_case vs camelCase)
  • Breaking change detection
  • Contract test generation

Responsibilities:

  • Generate TypeScript types from Django models
  • Validate API request/response formats
  • Detect field name mismatches
  • Generate contract tests
  • Report breaking changes

Integration Points:

  • Uses typescript-from-django-models skill
  • Uses contract-test-generator skill
  • Invoked by /check-frontend-backend-contract command
  • Invoked by contract-sync-workflow

Input Interface:

{
  "action": "generate_types|validate_contract|detect_breaks",
  "django_models_path": "/path/to/models.py",
  "typescript_types_path": "/path/to/types.ts",
  "api_spec": "/path/to/openapi.yaml",
  "frontend_root": "/path/to/frontend"
}

Output Interface:

{
  "status": "success|warning|error",
  "types_generated": true|false,
  "mismatches_found": [
    {
      "field": "full_name",
      "backend": "full_name (snake_case)",
      "frontend": "name (camelCase)",
      "severity": "error|warning"
    }
  ],
  "breaking_changes": [],
  "contract_tests_generated": "/path/to/contract.test.ts",
  "recommendations": []
}

Invocation Pattern:

# Via Task tool
Task(subagent_type="frontend-backend-contract-validator",
     description="Validate frontend/backend contract alignment",
     prompt="Check alignment between Django models in backend and TypeScript types in frontend, generate missing types")

# Via command
/check-frontend-backend-contract

Implementation Details:

  • Django model introspection via AST parsing
  • TypeScript type generation with proper snake_case → camelCase conversion
  • OpenAPI schema validation
  • Jest contract test generation

Error Handling:

  • Field mismatch detection with automatic fix suggestions
  • Breaking change detection with migration path
  • Validation failure with clear remediation steps

Testing Strategy:

  • Unit tests for type generation
  • Integration tests with actual Django models and TypeScript
  • Contract test validation
  • Breaking change detection tests

1.3 kubernetes-migration-specialist

Purpose: Expert in Kubernetes Job patterns for database migrations with zero-downtime strategies.

Capabilities:

  • Kubernetes Job manifest generation
  • Secret and ConfigMap validation
  • Node affinity and resource optimization
  • Job monitoring and failure recovery
  • Zero-downtime migration strategies

Responsibilities:

  • Generate migration Job manifests
  • Validate Kubernetes secrets
  • Check node availability and affinity
  • Monitor Job execution
  • Implement rollback if Job fails

Integration Points:

  • Uses k8s-manifest-generation skill
  • Uses environment-consistency-checker skill
  • Invoked by /generate-migration-job command
  • Invoked by backend-deployment-workflow

Input Interface:

{
  "action": "generate_job|validate_secrets|monitor_job",
  "deployment_name": "coditect-backend",
  "namespace": "coditect-staging",
  "command": ["python", "manage.py", "migrate"],
  "image": "us-central1-docker.pkg.dev/.../image:tag",
  "env_from_deployment": true
}

Output Interface:

{
  "status": "success|warning|error",
  "job_manifest": "path/to/migration-job.yaml",
  "secrets_valid": true|false,
  "secret_issues": [],
  "node_affinity_valid": true|false,
  "job_status": "pending|running|succeeded|failed",
  "recommendations": []
}

Invocation Pattern:

# Via Task tool
Task(subagent_type="kubernetes-migration-specialist",
     description="Generate and validate Kubernetes migration job",
     prompt="Generate migration job for coditect-backend in staging, validate all secrets and environment variables")

# Via command
/generate-migration-job --deployment coditect-backend --namespace coditect-staging

Implementation Details:

  • Kubernetes API integration for cluster inspection
  • Deployment manifest analysis for env variable extraction
  • Secret validation against actual cluster secrets
  • Job status monitoring with progress reporting

Error Handling:

  • Secret key validation with specific missing keys
  • Node affinity validation against actual nodepools
  • Image pull validation
  • Job failure detection with log extraction

Testing Strategy:

  • Unit tests for manifest generation
  • Integration tests with actual Kubernetes cluster
  • Secret validation tests
  • Job monitoring tests

2. Skills (Reusable Capabilities)

2.1 django-migration-validation

Purpose: Comprehensive Django migration analysis and validation.

Capabilities:

def validate_migrations(project_root: str) -> MigrationAnalysis:
    """
    Analyze Django migrations for conflicts, missing dependencies,
    and backward compatibility issues.

    Returns:
        MigrationAnalysis with:
        - conflicts: List of conflicting migrations
        - missing_dependencies: Migrations with broken dependencies
        - backward_incompatible: Migrations that cannot be rolled back
        - merge_suggestions: Recommended merge migrations
    """

def check_for_conflicts() -> List[MigrationConflict]:
    """Detect multiple leaf nodes in migration graph."""

def suggest_merge_migration() -> MergeMigrationPlan:
    """Generate merge migration plan for conflicts."""

def validate_backward_compatibility() -> CompatibilityReport:
    """Check if migrations can be safely rolled back."""

Dependencies:

  • Django project structure
  • Python 3.10+
  • Git (for migration history)

Integration:

  • Used by django-deployment-specialist agent
  • Invoked by /validate-migrations command
  • Called in pre-commit-migration-check hook

Implementation:

# skill.py
from typing import Any, Dict, List

class DjangoMigrationValidation:
    def __init__(self, project_root: str):
        self.project_root = project_root
        self.apps = self._discover_apps()

    def validate_all(self) -> Dict[str, Any]:
        """Run all validation checks."""
        return {
            "conflicts": self.check_conflicts(),
            "dependencies": self.check_dependencies(),
            "compatibility": self.check_compatibility()
        }

    def check_conflicts(self) -> List[Dict]:
        """
        Check for migration conflicts by:
        1. Building migration graph for each app
        2. Finding multiple leaf nodes
        3. Identifying conflicting branches
        """
        conflicts = []
        for app in self.apps:
            graph = self._build_migration_graph(app)
            leaves = self._find_leaf_nodes(graph)
            if len(leaves) > 1:
                conflicts.append({
                    "app": app,
                    "leaves": leaves,
                    "merge_needed": True
                })
        return conflicts
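The `_find_leaf_nodes` helper the snippet relies on is not shown; a minimal sketch, assuming each app's migration graph is represented as a mapping from migration name to its dependency list, could look like this:

```python
from typing import Dict, List, Set

def find_leaf_nodes(graph: Dict[str, List[str]]) -> List[str]:
    """Return migrations that no other migration depends on.

    `graph` maps each migration name to the names it depends on.
    More than one leaf means the migration history has diverged
    (e.g. parallel branches) and a merge migration is required.
    """
    depended_on: Set[str] = set()
    for deps in graph.values():
        depended_on.update(deps)
    return sorted(name for name in graph if name not in depended_on)

# Two parallel branches off 0001_initial -> two leaves -> conflict
graph = {
    "0001_initial": [],
    "0002_add_renewal_fields": ["0001_initial"],
    "0006_phase1_step6_license_sessions": ["0001_initial"],
}
leaves = find_leaf_nodes(graph)
```

This mirrors the conflict actually hit in the `licenses` app, where two branches were created in parallel.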

Testing:

  • Unit tests with sample migration files
  • Integration tests with actual Django projects
  • Conflict detection accuracy tests

2.2 k8s-manifest-generation

Purpose: Generate Kubernetes manifests (Job, Deployment, Service) with validation.

Capabilities:

def generate_migration_job(
    deployment_name: str,
    namespace: str,
    command: List[str],
    copy_env_from_deployment: bool = True
) -> str:
    """
    Generate Kubernetes Job manifest for migrations by:
    1. Reading existing Deployment
    2. Copying environment variables
    3. Validating secrets exist
    4. Generating Job YAML

    Returns: Path to generated job YAML file
    """

def validate_secrets_exist(namespace: str, secret_refs: List[str]) -> ValidationResult:
    """Verify all referenced secrets exist in namespace."""

def extract_env_from_deployment(deployment_name: str, namespace: str) -> List[EnvVar]:
    """Extract all environment variables from Deployment."""

Dependencies:

  • Kubernetes API access
  • kubectl configured
  • PyYAML

Integration:

  • Used by kubernetes-migration-specialist agent
  • Invoked by /generate-migration-job command
  • Called in backend-deployment-workflow

Implementation:

# skill.py
from typing import List

from kubernetes import client, config
import yaml

class K8sManifestGeneration:
    def __init__(self):
        config.load_kube_config()
        self.apps_v1 = client.AppsV1Api()
        self.core_v1 = client.CoreV1Api()
        self.batch_v1 = client.BatchV1Api()

    def generate_migration_job(
        self,
        deployment_name: str,
        namespace: str,
        command: List[str]
    ) -> str:
        # 1. Get deployment
        deployment = self.apps_v1.read_namespaced_deployment(
            name=deployment_name,
            namespace=namespace
        )

        # 2. Extract environment variables
        container = deployment.spec.template.spec.containers[0]
        env_vars = self._extract_env_vars(container)

        # 3. Validate secrets
        self._validate_secret_refs(namespace, env_vars)

        # 4. Generate Job manifest
        job_manifest = {
            "apiVersion": "batch/v1",
            "kind": "Job",
            "metadata": {
                "name": f"{deployment_name}-migration",
                "namespace": namespace
            },
            "spec": {
                "template": {
                    "spec": {
                        "restartPolicy": "OnFailure",
                        "containers": [{
                            "name": "migrations",
                            "image": container.image,
                            "command": command,
                            "env": env_vars
                        }]
                    }
                }
            }
        }

        # 5. Write to file
        output_path = f"k8s/{deployment_name}-migration-job.yaml"
        with open(output_path, 'w') as f:
            yaml.dump(job_manifest, f)

        return output_path
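The `_extract_env_vars` helper above must preserve both literal values and `valueFrom` secret/ConfigMap references, since losing a `secretKeyRef` is exactly the class of error (wrong secret key names, missing DJANGO_SECRET_KEY) this skill exists to prevent. A simplified sketch over plain dicts (the real kubernetes client returns typed objects, so this is illustrative only):

```python
from typing import Dict, List

def extract_env_vars(container: Dict) -> List[Dict]:
    """Copy a container's env list, keeping literal values and
    secretKeyRef/configMapKeyRef sources intact, so the migration
    Job sees exactly the same environment as the Deployment."""
    env_vars = []
    for env in container.get("env", []):
        entry = {"name": env["name"]}
        if "value" in env:
            entry["value"] = env["value"]
        if "valueFrom" in env:
            entry["valueFrom"] = env["valueFrom"]
        env_vars.append(entry)
    return env_vars

container = {"env": [
    {"name": "DJANGO_SETTINGS_MODULE", "value": "config.settings.staging"},
    {"name": "DJANGO_SECRET_KEY", "valueFrom": {
        "secretKeyRef": {"name": "backend-secrets", "key": "django-secret-key"}}},
]}
copied = extract_env_vars(container)
```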

Testing:

  • Unit tests with mock Kubernetes API
  • Integration tests with actual cluster
  • Secret validation tests
  • Manifest syntax validation

2.3 typescript-from-django-models

Purpose: Generate TypeScript interfaces from Django models with proper field mapping.

Capabilities:

def generate_types(
    models_file: str,
    output_file: str,
    naming_convention: str = "camelCase"
) -> TypeGenerationResult:
    """
    Generate TypeScript interfaces from Django models.

    Handles:
    - Field type mapping (CharField → string, IntegerField → number)
    - Naming convention conversion (snake_case → camelCase)
    - Optional/required field detection
    - Relationship mapping (ForeignKey → number)
    """

def map_django_field_to_typescript(field: DjangoField) -> str:
    """Map Django field type to TypeScript type."""

def convert_naming_convention(name: str, convention: str) -> str:
    """Convert between snake_case and camelCase."""

Dependencies:

  • Python AST module
  • Django ORM introspection

Integration:

  • Used by frontend-backend-contract-validator agent
  • Invoked by /sync-types command
  • Called in contract-sync-workflow

Implementation:

# skill.py
import ast
from typing import Dict, List

class TypeScriptFromDjangoModels:
    # Django to TypeScript type mapping
    TYPE_MAP = {
        'CharField': 'string',
        'TextField': 'string',
        'EmailField': 'string',
        'IntegerField': 'number',
        'FloatField': 'number',
        'BooleanField': 'boolean',
        'DateTimeField': 'string',  # ISO format
        'ForeignKey': 'number',     # ID reference
    }

    def generate_types(self, models_file: str, output_file: str) -> Dict:
        # 1. Parse Django models file
        with open(models_file, 'r') as f:
            tree = ast.parse(f.read())

        # 2. Extract model classes
        models = self._extract_models(tree)

        # 3. Generate TypeScript interfaces
        ts_interfaces = []
        for model in models:
            interface = self._generate_interface(model)
            ts_interfaces.append(interface)

        # 4. Write to output file
        with open(output_file, 'w') as f:
            f.write("// Auto-generated from Django models\n\n")
            f.write("\n\n".join(ts_interfaces))

        return {
            "interfaces_generated": len(ts_interfaces),
            "output_file": output_file
        }

    def _generate_interface(self, model: Dict) -> str:
        """Generate TypeScript interface from Django model."""
        interface_name = model['name']
        fields = []

        for field_name, field_type, required in model['fields']:
            # Convert snake_case to camelCase
            ts_field_name = self._to_camel_case(field_name)
            # Map Django type to TypeScript type
            ts_type = self.TYPE_MAP.get(field_type, 'any')
            # Add optional marker if not required
            optional = '' if required else '?'

            fields.append(f"  {ts_field_name}{optional}: {ts_type};")

        return f"export interface {interface_name} {{\n" + \
               "\n".join(fields) + "\n}"

    def _to_camel_case(self, snake_str: str) -> str:
        """Convert snake_case to camelCase."""
        components = snake_str.split('_')
        return components[0] + ''.join(x.title() for x in components[1:])

Testing:

  • Unit tests with sample Django models
  • Type mapping accuracy tests
  • Naming convention conversion tests
  • Integration tests with actual projects

2.4 environment-consistency-checker

Purpose: Validate environment variable consistency across deployments, jobs, and local environments.

Capabilities:

def compare_environments(
    deployment_name: str,
    job_name: str,
    namespace: str
) -> ConsistencyReport:
    """
    Compare environment variables between:
    - Deployment
    - Migration Job
    - Local .env file

    Returns report of missing, mismatched, or extra variables.
    """

def validate_required_vars(env_vars: Dict[str, str], required: List[str]) -> ValidationResult:
    """Check if all required environment variables are present."""

def suggest_fixes(inconsistencies: List[Inconsistency]) -> List[FixSuggestion]:
    """Generate fix suggestions for inconsistencies."""

Dependencies:

  • Kubernetes API
  • python-dotenv

Integration:

  • Used by django-deployment-specialist and kubernetes-migration-specialist agents
  • Invoked by /validate-deployment-readiness command
  • Called in pre-deployment-validation workflow

Implementation:

# skill.py
from typing import Dict, List

from dotenv import dotenv_values
from kubernetes import client, config

class EnvironmentConsistencyChecker:
    def __init__(self):
        config.load_kube_config()
        self.apps_v1 = client.AppsV1Api()
        self.batch_v1 = client.BatchV1Api()

    def compare_environments(
        self,
        deployment_name: str,
        job_name: str,
        namespace: str,
        local_env_file: str = ".env"
    ) -> Dict:
        # 1. Extract env vars from Deployment
        deployment_env = self._get_deployment_env(deployment_name, namespace)

        # 2. Extract env vars from Job
        job_env = self._get_job_env(job_name, namespace)

        # 3. Load local .env file
        local_env = dotenv_values(local_env_file)

        # 4. Compare
        inconsistencies = []

        # Check: Job has all Deployment vars
        for key, value in deployment_env.items():
            if key not in job_env:
                inconsistencies.append({
                    "type": "missing_in_job",
                    "key": key,
                    "deployment_value": value,
                    "fix": f"Add {key} to Job manifest"
                })

        # Check: Job doesn't have extra vars not in Deployment
        for key in job_env:
            if key not in deployment_env:
                inconsistencies.append({
                    "type": "extra_in_job",
                    "key": key,
                    "fix": f"Remove {key} from Job or add to Deployment"
                })

        return {
            "deployment_env_count": len(deployment_env),
            "job_env_count": len(job_env),
            "local_env_count": len(local_env),
            "inconsistencies": inconsistencies,
            "status": "consistent" if not inconsistencies else "inconsistent"
        }
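The `validate_required_vars` capability declared above is not implemented in the snippet; a minimal sketch (the report shape is an assumption, not the skill's final interface):

```python
from typing import Dict, List

def validate_required_vars(env_vars: Dict[str, str], required: List[str]) -> Dict:
    """Report which required environment variables are absent."""
    missing = [key for key in required if key not in env_vars]
    return {"valid": not missing, "missing": missing}

# DJANGO_SECRET_KEY missing -> exactly the failure seen in the Dec 2 deployment
result = validate_required_vars(
    {"DB_USER": "app", "DB_NAME": "coditect"},
    ["DB_USER", "DB_NAME", "DJANGO_SECRET_KEY"],
)
```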

Testing:

  • Unit tests with mock environments
  • Integration tests with actual Kubernetes
  • Inconsistency detection accuracy tests

2.5 contract-test-generator

Purpose: Generate Jest/Pytest contract tests to validate API request/response schemas.

Capabilities:

def generate_contract_tests(
    api_spec: str,           # OpenAPI YAML
    output_dir: str,
    framework: str = "jest"  # or "pytest"
) -> ContractTestResult:
    """
    Generate contract tests from OpenAPI specification.

    Creates tests for:
    - Request validation (required fields, types)
    - Response validation (status codes, schemas)
    - Field name consistency
    """

def generate_jest_tests(endpoints: List[Endpoint]) -> List[str]:
    """Generate Jest tests for frontend."""

def generate_pytest_tests(endpoints: List[Endpoint]) -> List[str]:
    """Generate Pytest tests for backend."""

Dependencies:

  • OpenAPI specification
  • Jest (for frontend tests)
  • Pytest (for backend tests)

Integration:

  • Used by frontend-backend-contract-validator agent
  • Invoked by /check-frontend-backend-contract command
  • Called in contract-sync-workflow

Implementation:

# skill.py
from typing import Dict, List

import yaml

class ContractTestGenerator:
    def generate_contract_tests(
        self,
        api_spec_file: str,
        output_dir: str,
        framework: str = "jest"
    ) -> Dict:
        # 1. Load OpenAPI spec
        with open(api_spec_file, 'r') as f:
            spec = yaml.safe_load(f)

        # 2. Extract endpoints
        endpoints = self._extract_endpoints(spec)

        # 3. Generate tests based on framework
        if framework == "jest":
            tests = self._generate_jest_tests(endpoints)
        elif framework == "pytest":
            tests = self._generate_pytest_tests(endpoints)
        else:
            raise ValueError(f"Unsupported framework: {framework}")

        # 4. Write test files
        files_written = []
        for test_name, test_content in tests.items():
            output_path = f"{output_dir}/{test_name}"
            with open(output_path, 'w') as f:
                f.write(test_content)
            files_written.append(output_path)

        return {
            "tests_generated": len(tests),
            "files_written": files_written,
            "framework": framework
        }

    def _generate_jest_tests(self, endpoints: List[Dict]) -> Dict[str, str]:
        """Generate Jest contract tests."""
        tests = {}

        for endpoint in endpoints:
            method = endpoint['method']
            path = endpoint['path']
            test_name = f"{method}_{path.replace('/', '_')}.contract.test.ts"

            test_content = f"""
import {{ {endpoint['operation_id']} }} from '../api';

describe('Contract: {method} {path}', () => {{
  it('should match request schema', async () => {{
    const request = {endpoint['example_request']};
    // Validate request fields
    expect(request).toHaveProperty('{endpoint['required_fields'][0]}');
  }});

  it('should match response schema', async () => {{
    const response = await {endpoint['operation_id']}({{...}});
    expect(response.status).toBe({endpoint['success_status']});
    expect(response.data).toMatchObject({endpoint['response_schema']});
  }});
}});
"""
            tests[test_name] = test_content

        return tests

Testing:

  • Unit tests for test generation logic
  • Validation that generated tests compile
  • Integration tests with actual API

3. Commands (User-Facing Interface)

3.1 /validate-deployment-readiness

Purpose: Comprehensive pre-deployment validation check.

Syntax:

/validate-deployment-readiness [--target staging|production] [--fix]

Options:

  • --target: Environment to validate (staging or production)
  • --fix: Automatically fix issues where possible

What It Does:

  1. Validate Django migrations (no conflicts)
  2. Check environment variable consistency
  3. Validate Kubernetes secrets
  4. Check frontend/backend contract alignment
  5. Verify Docker image exists
  6. Check database connectivity

Output:

========================================
Deployment Readiness Check
========================================
Target Environment: staging

✅ Django Migrations
• No conflicts detected
• 3 pending migrations

✅ Environment Variables
• Deployment and Job consistent
• All required variables present

✅ Kubernetes Secrets
• backend-secrets exists
• All keys present: db-user, db-password, db-name, django-secret-key

⚠️ Frontend/Backend Contract
• 1 field mismatch found: name vs full_name
• Fix available: /sync-types

✅ Docker Image
• us-central1-docker.pkg.dev/.../backend:v1.0.6-staging exists

✅ Database Connectivity
• Connection successful

========================================
Overall Status: READY WITH WARNINGS
Recommendation: Fix contract mismatch before deploying
========================================

Implementation:

  • Orchestrates multiple agents and skills
  • Aggregates validation results
  • Provides actionable fix commands
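The roll-up from individual checks to the overall verdict shown above can be sketched as follows (check names and the result shape are illustrative assumptions, not the command's actual internals):

```python
from typing import Dict, List

def overall_status(checks: List[Dict]) -> str:
    """Roll individual check results up into a single readiness verdict."""
    statuses = {check["status"] for check in checks}
    if "error" in statuses:
        return "NOT READY"
    if "warning" in statuses:
        return "READY WITH WARNINGS"
    return "READY"

# Matches the sample output: one contract warning downgrades the verdict
checks = [
    {"name": "django-migrations", "status": "success"},
    {"name": "kubernetes-secrets", "status": "success"},
    {"name": "frontend-backend-contract", "status": "warning"},
]
verdict = overall_status(checks)
```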

3.2 /generate-migration-job

Purpose: Generate Kubernetes Job manifest for Django migrations.

Syntax:

/generate-migration-job --deployment <name> --namespace <ns> [--output <file>]

Options:

  • --deployment: Name of deployment to copy env vars from
  • --namespace: Kubernetes namespace
  • --output: Output file path (default: k8s/migration-job.yaml)

What It Does:

  1. Read existing Deployment manifest
  2. Extract all environment variables
  3. Validate secrets exist
  4. Generate Job manifest
  5. Write to file

Output:

========================================
Migration Job Generator
========================================

✅ Deployment: coditect-backend (namespace: coditect-staging)
✅ Image: us-central1-docker.pkg.dev/.../backend:v1.0.6-staging
✅ Environment Variables: 8 copied
✅ Secrets Validated: backend-secrets (5 keys)

Generated: k8s/migration-job.yaml

Next Steps:
1. Review manifest: cat k8s/migration-job.yaml
2. Apply: kubectl apply -f k8s/migration-job.yaml
3. Monitor: kubectl logs -f job/django-migrations -n coditect-staging

========================================

Implementation:

  • Uses kubernetes-migration-specialist agent
  • Uses k8s-manifest-generation skill

3.3 /check-frontend-backend-contract

Purpose: Validate alignment between frontend and backend.

Syntax:

/check-frontend-backend-contract [--fix] [--generate-tests]

Options:

  • --fix: Auto-generate TypeScript types from Django models
  • --generate-tests: Create contract tests

What It Does:

  1. Parse Django models
  2. Parse TypeScript interfaces
  3. Compare field names and types
  4. Report mismatches
  5. Optionally generate types and tests

Output:

========================================
Frontend/Backend Contract Validation
========================================

Checking alignment...

❌ MISMATCH FOUND

Backend Model: User (users/models.py:15)
├─ full_name: CharField
├─ company_name: CharField
└─ email: EmailField

Frontend Interface: User (types/index.ts:8)
├─ name: string ← MISMATCH (expected: fullName)
├─ company: string ← MISMATCH (expected: companyName)
└─ email: string ✅

Fix Command: /sync-types --from backend --to frontend

Contract Tests: 0 found (run with --generate-tests)

========================================
Status: MISMATCHES DETECTED
========================================

Implementation:

  • Uses frontend-backend-contract-validator agent
  • Uses typescript-from-django-models skill
  • Uses contract-test-generator skill
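The comparison step behind the mismatch report can be sketched as follows, assuming both sides have already been reduced to plain field-name lists (a simplification of what the validator agent actually inspects):

```python
from typing import Dict, List

def to_camel_case(snake_str: str) -> str:
    parts = snake_str.split('_')
    return parts[0] + ''.join(p.title() for p in parts[1:])

def find_mismatches(backend_fields: List[str], frontend_fields: List[str]) -> List[Dict]:
    """Flag backend fields whose expected camelCase name is missing
    from the frontend interface."""
    frontend = set(frontend_fields)
    mismatches = []
    for field in backend_fields:
        expected = to_camel_case(field)
        if expected not in frontend:
            mismatches.append({"backend": field, "expected_frontend": expected})
    return mismatches

# Mirrors the full_name vs name issue found during deployment
mismatches = find_mismatches(
    ["full_name", "company_name", "email"],
    ["name", "company", "email"],
)
```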

3.4 /validate-migrations

Purpose: Check Django migrations for conflicts and issues.

Syntax:

/validate-migrations [--app <app_name>] [--suggest-merge]

Options:

  • --app: Validate specific Django app
  • --suggest-merge: Generate merge migration if conflicts found

What It Does:

  1. Build migration graph for each app
  2. Detect conflicts (multiple leaf nodes)
  3. Check for missing dependencies
  4. Validate backward compatibility
  5. Suggest merge migrations

Output:

========================================
Django Migration Validation
========================================

App: licenses
├─ ✅ No missing dependencies
├─ ❌ CONFLICT DETECTED
│ ├─ Branch 1: 0002_add_renewal_fields
│ └─ Branch 2: 0006_phase1_step6_license_sessions
└─ Merge Required: YES

Suggested Fix:
python manage.py makemigrations --merge licenses

App: tenants
└─ ✅ All migrations valid

App: users
└─ ✅ All migrations valid

========================================
Status: CONFLICTS FOUND (1 app)
Action Required: Create merge migration for 'licenses' app
========================================

Implementation:

  • Uses django-deployment-specialist agent
  • Uses django-migration-validation skill

3.5 /sync-types

Purpose: Generate TypeScript types from Django models.

Syntax:

/sync-types --from <source> --to <destination> [--convention camelCase|snake_case]

Options:

  • --from: Source (backend or frontend)
  • --to: Destination (backend or frontend)
  • --convention: Naming convention (default: camelCase for frontend)

What It Does:

  1. Parse source models/interfaces
  2. Generate destination types
  3. Apply naming convention conversion
  4. Write to output file
  5. Update imports if needed

Output:

========================================
Type Synchronization
========================================

Source: Django Models (backend/tenants/models.py)
Destination: TypeScript Interfaces (frontend/src/types/index.ts)

Generating types...

✅ Organization
├─ name → name: string
├─ stripe_customer_id → stripeCustomerId?: string
└─ created_at → createdAt: string

✅ User
├─ email → email: string
├─ full_name → fullName: string
└─ company_name → companyName: string

Generated: frontend/src/types/index.ts (2 interfaces)

Next Steps:
1. Review types: cat frontend/src/types/index.ts
2. Update components to use new field names
3. Run contract tests: npm run test:contract

========================================

Implementation:

  • Uses frontend-backend-contract-validator agent
  • Uses typescript-from-django-models skill

3.6 /troubleshoot-deployment

Purpose: Interactive deployment troubleshooting wizard.

Syntax:

/troubleshoot-deployment --issue <issue_type>

Options:

  • --issue: Issue type (migration-failed, pod-pending, image-pull-error, secret-not-found, env-var-missing)

What It Does:

  1. Guide user through troubleshooting steps
  2. Run diagnostic commands
  3. Suggest fixes
  4. Apply fixes if user approves

Output:

========================================
Deployment Troubleshooting Wizard
========================================

Issue: Pod Pending

Running diagnostics...

✅ Checking pod events...
❌ Error: 0/3 nodes available: 3 node(s) didn't match Pod's node affinity

✅ Checking node labels...
Available nodepools:
• primary-node-pool (3 nodes)

❌ Problem Found:
Your pod specifies nodepool "default-pool" but cluster only has "primary-node-pool"

Suggested Fix:
1. Remove node affinity from manifest
OR
2. Change nodepool to "primary-node-pool"

Apply fix? [y/n]: _

========================================

Implementation:

  • Uses all three specialized agents
  • Interactive prompt flow
  • Automated fix application
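The wizard's dispatch can be sketched as a mapping from issue type to diagnostic steps (issue names follow the `--issue` options above; the step lists themselves are illustrative assumptions):

```python
from typing import Dict, List

DIAGNOSTICS: Dict[str, List[str]] = {
    "pod-pending": [
        "kubectl describe pod to read scheduling events",
        "compare requested node affinity against available nodepools",
    ],
    "secret-not-found": [
        "kubectl get secret in the target namespace",
        "compare referenced secret keys against actual keys",
    ],
    "migration-failed": [
        "kubectl logs the migration Job",
        "check for migration conflicts with /validate-migrations",
    ],
}

def diagnostic_plan(issue: str) -> List[str]:
    """Return the ordered diagnostic steps for a known issue type."""
    if issue not in DIAGNOSTICS:
        raise ValueError(f"Unknown issue type: {issue}")
    return DIAGNOSTICS[issue]

plan = diagnostic_plan("pod-pending")
```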

4. Scripts (CLI Automation)

4.1 generate-migration-job-from-deployment.py

Purpose: Generate Kubernetes migration Job from existing Deployment.

Usage:

python scripts/generate-migration-job-from-deployment.py \
    --deployment coditect-backend \
    --namespace coditect-staging \
    --command "python,manage.py,migrate" \
    --output k8s/migration-job.yaml

Algorithm:

def main():
    # 1. Parse arguments
    args = parse_arguments()

    # 2. Get deployment from Kubernetes
    deployment = get_deployment(args.deployment, args.namespace)

    # 3. Extract container spec
    container = deployment.spec.template.spec.containers[0]

    # 4. Extract environment variables
    env_vars = extract_env_vars(container)

    # 5. Validate secrets
    validate_secrets(args.namespace, env_vars)

    # 6. Generate Job manifest
    job = generate_job_manifest(
        name=f"{args.deployment}-migration",
        namespace=args.namespace,
        image=container.image,
        command=args.command.split(','),
        env=env_vars
    )

    # 7. Write to file
    write_yaml(args.output, job)

    print(f"✅ Generated: {args.output}")

Implementation Details:

  • Uses Python Kubernetes client
  • Validates all secret references
  • Handles both ConfigMap and Secret env sources
  • Generates clean, production-ready YAML
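The env-var extraction step (step 4 of the algorithm) can be sketched without a live cluster by operating on a container spec as a plain dict, as returned by `kubectl get deployment -o json`; the helper name is illustrative:

```python
# Sketch of extract_env_vars: keep both literal values and secret/configmap
# references, since the migration Job must inherit all of them.
def extract_env_vars(container: dict) -> list:
    env = []
    for item in container.get("env", []):
        if "value" in item:
            env.append({"name": item["name"], "value": item["value"]})
        elif "valueFrom" in item:
            env.append({"name": item["name"], "valueFrom": item["valueFrom"]})
    return env

container = {
    "image": "us-docker.pkg.dev/proj/repo/backend:latest",
    "env": [
        {"name": "DJANGO_SETTINGS_MODULE", "value": "config.settings.staging"},
        {"name": "DJANGO_SECRET_KEY",
         "valueFrom": {"secretKeyRef": {"name": "backend-secrets",
                                        "key": "django-secret-key"}}},
    ],
}
print(len(extract_env_vars(container)))  # 2
```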

4.2 validate-k8s-secrets.py

Purpose: Validate Kubernetes secrets against requirements.

Usage:

python scripts/validate-k8s-secrets.py \
    --namespace coditect-staging \
    --secret backend-secrets \
    --required db-user,db-password,db-name,django-secret-key

Algorithm:

def main():
    # 1. Parse arguments
    args = parse_arguments()

    # 2. Get secret from Kubernetes
    secret = get_secret(args.secret, args.namespace)

    # 3. Check required keys
    required_keys = args.required.split(',')
    missing_keys = []

    for key in required_keys:
        if key not in secret.data:
            missing_keys.append(key)

    # 4. Report results
    if missing_keys:
        print(f"❌ Missing keys: {', '.join(missing_keys)}")
        sys.exit(1)
    else:
        print(f"✅ All {len(required_keys)} keys present")
        sys.exit(0)

Implementation Details:

  • Returns proper exit codes for CI/CD
  • Lists available keys if validation fails
  • Handles secret not found gracefully
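The core key check can be sketched without the Kubernetes client by treating the secret's data as a plain dict; `find_missing_keys` is an illustrative name:

```python
# Sketch of the key check at the heart of validate-k8s-secrets.py.
def find_missing_keys(secret_data: dict, required: str) -> list:
    """Return the required keys (comma-separated, as on the CLI) absent from the secret."""
    required_keys = [k.strip() for k in required.split(",")]
    return [k for k in required_keys if k not in secret_data]

secret_data = {"db-user": "...", "db-password": "...", "db-name": "..."}
missing = find_missing_keys(secret_data, "db-user,db-password,db-name,django-secret-key")
print(missing)  # ['django-secret-key']
```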

4.3 check-migration-conflicts.py

Purpose: Detect Django migration conflicts.

Usage:

python scripts/check-migration-conflicts.py \
    --project-root /path/to/django/project \
    --suggest-merge

Algorithm:

def main():
    # 1. Parse arguments
    args = parse_arguments()

    # 2. Discover Django apps
    apps = discover_apps(args.project_root)

    # 3. Check each app for conflicts
    conflicts = []
    for app in apps:
        migration_graph = build_migration_graph(app)
        leaf_nodes = find_leaf_nodes(migration_graph)

        if len(leaf_nodes) > 1:
            conflicts.append({
                'app': app,
                'leaves': leaf_nodes
            })

    # 4. Report conflicts
    if conflicts:
        print(f"❌ Conflicts found in {len(conflicts)} apps:")
        for conflict in conflicts:
            print(f"  • {conflict['app']}: {conflict['leaves']}")

        if args.suggest_merge:
            suggest_merge_commands(conflicts)

        sys.exit(1)
    else:
        print("✅ No migration conflicts detected")
        sys.exit(0)

Implementation Details:

  • Builds migration graph using Django internals
  • Detects parallel branches
  • Suggests merge commands
  • CI/CD friendly exit codes
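Leaf detection, the heart of the conflict check, can be sketched on a toy graph; the dict-of-dependencies shape below is a simplification of Django's real migration graph:

```python
# Sketch of find_leaf_nodes: a leaf is any migration no other migration
# depends on; two leaves in one app means parallel branches (a conflict).
def find_leaf_nodes(graph: dict) -> list:
    depended_on = {dep for deps in graph.values() for dep in deps}
    return sorted(m for m in graph if m not in depended_on)

# Parallel branches created from 0002 on two git branches:
graph = {
    "0001_initial": [],
    "0002_add_plan": ["0001_initial"],
    "0003_add_seats": ["0002_add_plan"],
    "0003_add_expiry": ["0002_add_plan"],
}
print(find_leaf_nodes(graph))  # ['0003_add_expiry', '0003_add_seats'] -> conflict
```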

4.4 generate-typescript-types.py

Purpose: Generate TypeScript types from Django models.

Usage:

python scripts/generate-typescript-types.py \
    --models backend/tenants/models.py \
    --output frontend/src/types/generated.ts \
    --convention camelCase

Algorithm:

def main():
    # 1. Parse arguments
    args = parse_arguments()

    # 2. Parse Django models
    models = parse_django_models(args.models)

    # 3. Generate TypeScript interfaces
    interfaces = []
    for model in models:
        interface = generate_interface(
            model,
            naming_convention=args.convention
        )
        interfaces.append(interface)

    # 4. Write to output file
    with open(args.output, 'w') as f:
        f.write("// Auto-generated from Django models\n\n")
        f.write("\n\n".join(interfaces))

    print(f"✅ Generated {len(interfaces)} interfaces")

Implementation Details:

  • Uses Python AST for parsing
  • Handles Django field types
  • Converts naming conventions
  • Preserves optional/required fields
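A minimal sketch of the per-field conversion, assuming a small hand-written type map; `DJANGO_TO_TS` and `ts_field` are illustrative, and real Django field coverage would be much broader:

```python
# Hypothetical Django-field-type to TypeScript-type map.
DJANGO_TO_TS = {
    "CharField": "string",
    "TextField": "string",
    "EmailField": "string",
    "IntegerField": "number",
    "BooleanField": "boolean",
    "DateTimeField": "string",  # serialized as ISO-8601
}

def ts_field(name: str, field_type: str, nullable: bool = False) -> str:
    """Render one Django field as a camelCase TypeScript interface member."""
    parts = name.split("_")
    camel = parts[0] + "".join(p.capitalize() for p in parts[1:])
    optional = "?" if nullable else ""
    return f"{camel}{optional}: {DJANGO_TO_TS.get(field_type, 'unknown')};"

print(ts_field("stripe_customer_id", "CharField", nullable=True))
# stripeCustomerId?: string;
```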

4.5 compare-env-vars.py

Purpose: Compare environment variables across environments.

Usage:

python scripts/compare-env-vars.py \
    --deployment coditect-backend \
    --job django-migrations \
    --namespace coditect-staging \
    --local-env .env

Algorithm:

def main():
    # 1. Parse arguments
    args = parse_arguments()

    # 2. Extract env vars from each source
    deployment_env = get_deployment_env(args.deployment, args.namespace)
    job_env = get_job_env(args.job, args.namespace)
    local_env = load_dotenv(args.local_env)

    # 3. Compare
    comparison = compare_environments(deployment_env, job_env, local_env)

    # 4. Report differences
    if comparison['inconsistencies']:
        print("❌ Inconsistencies found:")
        for issue in comparison['inconsistencies']:
            print(f"  • {issue['type']}: {issue['key']}")
        sys.exit(1)
    else:
        print("✅ All environments consistent")
        sys.exit(0)

Implementation Details:

  • Extracts from Kubernetes API
  • Loads local .env files
  • Detailed diff reporting
  • Suggests fixes for each inconsistency
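The comparison step can be sketched for two sources (the real script also folds in the local `.env`); each source is shown as a dict of variable names to values or references:

```python
# Sketch of compare_environments for deployment-vs-job: an inconsistency is
# a variable missing from one source or differing in value between them.
def compare_environments(deployment: dict, job: dict) -> list:
    issues = []
    for key in sorted(set(deployment) | set(job)):
        if key not in job:
            issues.append({"type": "missing-in-job", "key": key})
        elif key not in deployment:
            issues.append({"type": "missing-in-deployment", "key": key})
        elif deployment[key] != job[key]:
            issues.append({"type": "value-mismatch", "key": key})
    return issues

deployment = {"DJANGO_SECRET_KEY": "ref:backend-secrets", "DB_NAME": "coditect"}
job = {"DB_NAME": "coditect"}
print(compare_environments(deployment, job))
# [{'type': 'missing-in-job', 'key': 'DJANGO_SECRET_KEY'}]
```

This would have caught the missing `DJANGO_SECRET_KEY` in the migration job before deployment.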

4.6 validate-deployment-readiness.py

Purpose: Comprehensive pre-deployment validation orchestrator.

Usage:

python scripts/validate-deployment-readiness.py \
    --target staging \
    --fix

Algorithm:

def main():
    # 1. Parse arguments
    args = parse_arguments()

    # 2. Run all validation checks
    results = {
        'migrations': validate_migrations(),
        'environment': validate_environment(args.target),
        'secrets': validate_secrets(args.target),
        'contract': validate_contract(),
        'image': validate_image(args.target),
        'database': validate_database(args.target)
    }

    # 3. Apply fixes if requested
    if args.fix:
        for check, result in results.items():
            if result['status'] == 'error' and result['fixable']:
                apply_fix(check, result['fix_command'])

    # 4. Generate report
    generate_report(results)

    # 5. Exit with appropriate code
    if any(r['status'] == 'error' for r in results.values()):
        sys.exit(1)
    else:
        sys.exit(0)

Implementation Details:

  • Orchestrates all validation scripts
  • Aggregates results
  • Applies fixes when safe
  • Generates comprehensive report
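The aggregation logic can be sketched as follows; the result shape (`status`, `fixable`) mirrors the algorithm above, and `summarize` is an illustrative helper:

```python
# Sketch of report aggregation: collect failed checks and flag which of
# them are safe to auto-fix with --fix.
def summarize(results: dict) -> dict:
    errors = [name for name, r in results.items() if r["status"] == "error"]
    fixable = [name for name in errors if results[name].get("fixable")]
    return {"passed": not errors, "errors": errors, "auto_fixable": fixable}

results = {
    "migrations": {"status": "ok"},
    "secrets": {"status": "error", "fixable": False},
    "contract": {"status": "error", "fixable": True},
}
print(summarize(results))
```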

5. Hooks (Event Triggers)

5.1 pre-commit-migration-check

Purpose: Validate migrations before commit.

Trigger: git commit

Location: .git/hooks/pre-commit

Implementation:

#!/bin/bash
# .git/hooks/pre-commit

echo "Checking Django migrations..."

# Run migration conflict detection
if ! python scripts/check-migration-conflicts.py --project-root .; then
    echo "❌ Migration conflicts detected. Resolve conflicts before committing."
    echo "Run: python manage.py makemigrations --merge"
    exit 1
fi

echo "✅ Migrations valid"
exit 0

What It Prevents:

  • Committing conflicting migrations
  • Pushing broken migration graphs

5.2 pre-commit-contract-test

Purpose: Run contract tests before commit.

Trigger: git commit

Location: .git/hooks/pre-commit

Implementation:

#!/bin/bash
# .git/hooks/pre-commit

echo "Running contract tests..."

# Check if models changed
MODELS_CHANGED=$(git diff --cached --name-only | grep "models.py")

if [ -n "$MODELS_CHANGED" ]; then
    echo "Django models changed, validating frontend/backend contract..."

    # Run contract validation
    if ! python scripts/validate-contract.py; then
        echo "❌ Contract validation failed. Update TypeScript types."
        echo "Run: /sync-types --from backend --to frontend"
        exit 1
    fi
fi

echo "✅ Contract tests passed"
exit 0

What It Prevents:

  • Committing backend changes without updating frontend types
  • Breaking API contracts

5.3 post-merge-type-sync

Purpose: Auto-sync TypeScript types after merging backend changes.

Trigger: git merge completion

Location: .git/hooks/post-merge

Implementation:

#!/bin/bash
# .git/hooks/post-merge

echo "Checking if type sync needed..."

# Check if models changed in merge
MODELS_CHANGED=$(git diff-tree -r --name-only --no-commit-id ORIG_HEAD HEAD | grep "models.py")

if [ -n "$MODELS_CHANGED" ]; then
    echo "Django models changed in merge, syncing TypeScript types..."

    # Auto-generate types
    python scripts/generate-typescript-types.py \
        --models backend/tenants/models.py \
        --output frontend/src/types/generated.ts

    echo "✅ Types synced. Review frontend/src/types/generated.ts"
    echo "Run: git add frontend/src/types/generated.ts && git commit"
fi

What It Automates:

  • Type generation after merging backend changes
  • Reduces manual type synchronization

5.4 pre-deployment-validation

Purpose: Validate deployment readiness before pushing to cluster.

Trigger: Manual (./scripts/deploy.sh)

Location: scripts/deploy.sh

Implementation:

#!/bin/bash
# scripts/deploy.sh

set -e

TARGET_ENV=$1

echo "========================================="
echo "Deployment Pre-Flight Checks"
echo "========================================="

# Run comprehensive validation. With `set -e` active, a failing command exits
# the script before a later `$?` check could run, so test the command directly.
if ! python scripts/validate-deployment-readiness.py --target "$TARGET_ENV"; then
    echo "❌ Deployment validation failed. Fix issues before deploying."
    exit 1
fi

echo "✅ All checks passed. Proceeding with deployment..."

# Continue with actual deployment
kubectl apply -f "k8s/$TARGET_ENV/"

What It Prevents:

  • Deploying with migration conflicts
  • Deploying with missing secrets
  • Deploying with environment inconsistencies

5.5 post-deployment-validation

Purpose: Verify deployment succeeded.

Trigger: Deployment completion

Location: scripts/deploy.sh

Implementation:

#!/bin/bash
# scripts/deploy.sh (continuation)

NAMESPACE="coditect-$TARGET_ENV"

echo "Waiting for deployment to stabilize..."
kubectl rollout status deployment/coditect-backend -n "$NAMESPACE" --timeout=300s

echo "Running post-deployment checks..."

# Check pod health
POD_STATUS=$(kubectl get pods -n "$NAMESPACE" -l app=coditect-backend -o jsonpath='{.items[0].status.phase}')

if [ "$POD_STATUS" != "Running" ]; then
    echo "❌ Pod not running. Status: $POD_STATUS"
    kubectl logs -n "$NAMESPACE" -l app=coditect-backend --tail=50
    exit 1
fi

# Check database migrations applied
echo "Verifying migrations..."
APPLIED=$(kubectl exec -n "$NAMESPACE" deployment/coditect-backend -- python manage.py showmigrations --plan | grep -c "\[X\]")
echo "Applied migrations: $APPLIED"

echo "✅ Deployment successful and verified"

What It Validates:

  • Deployment rollout succeeded
  • Pods are running
  • Migrations were applied

6. Workflows (End-to-End Orchestration)

6.1 backend-deployment-workflow

Purpose: Complete backend deployment with all validations.

Trigger: Manual or CI/CD

Steps:

1. Pre-Deployment Validation
├─ /validate-deployment-readiness --target staging
├─ Check migrations (no conflicts)
├─ Validate environment consistency
└─ Verify secrets exist

2. Migration Job Generation
├─ /generate-migration-job --deployment coditect-backend --namespace coditect-staging
└─ Review generated manifest

3. Apply Migrations
├─ kubectl apply -f k8s/migration-job.yaml
├─ Monitor: kubectl logs -f job/django-migrations -n coditect-staging
└─ Verify completion

4. Deploy Backend
├─ kubectl apply -f k8s/staging/backend-deployment.yaml
├─ Wait for rollout: kubectl rollout status deployment/coditect-backend -n coditect-staging
└─ Verify pods running

5. Post-Deployment Validation
├─ Check pod health
├─ Verify migrations applied
├─ Test API endpoints
└─ Monitor logs for errors

6. Smoke Tests
├─ curl -X POST /api/v1/auth/register/ (test registration)
├─ Check database (user created)
└─ Verify frontend can connect

Implementation:

#!/bin/bash
# workflows/backend-deployment.sh

set -e

TARGET_ENV=$1

echo "========================================="
echo "Backend Deployment Workflow"
echo "Target: $TARGET_ENV"
echo "========================================="

# Step 1: Pre-deployment validation
echo "Step 1/6: Pre-deployment validation..."
python scripts/validate-deployment-readiness.py --target "$TARGET_ENV"

# Step 2: Generate migration job
echo "Step 2/6: Generating migration job..."
python scripts/generate-migration-job-from-deployment.py \
    --deployment coditect-backend \
    --namespace "coditect-$TARGET_ENV" \
    --command "python,manage.py,migrate" \
    --output k8s/migration-job.yaml

# Step 3: Apply migrations
echo "Step 3/6: Applying migrations..."
kubectl apply -f k8s/migration-job.yaml
kubectl wait --for=condition=complete job/django-migrations -n "coditect-$TARGET_ENV" --timeout=300s

# Step 4: Deploy backend
echo "Step 4/6: Deploying backend..."
kubectl apply -f "k8s/$TARGET_ENV/backend-deployment.yaml"
kubectl rollout status deployment/coditect-backend -n "coditect-$TARGET_ENV" --timeout=300s

# Step 5: Post-deployment validation
echo "Step 5/6: Post-deployment validation..."
python scripts/validate-deployment-success.py --target "$TARGET_ENV"

# Step 6: Smoke tests
echo "Step 6/6: Running smoke tests..."
python scripts/run-smoke-tests.py --target "$TARGET_ENV"

echo "========================================="
echo "✅ Deployment Complete"
echo "========================================="

6.2 contract-sync-workflow

Purpose: Keep frontend and backend in perfect alignment.

Trigger: Backend model changes

Steps:

1. Detect Backend Changes
└─ Check if Django models modified

2. Generate TypeScript Types
├─ /sync-types --from backend --to frontend
└─ Review generated types

3. Update Frontend Components
├─ Update field names (snake_case → camelCase)
└─ Update API calls

4. Generate Contract Tests
├─ /check-frontend-backend-contract --generate-tests
└─ Review generated tests

5. Run Contract Tests
├─ npm run test:contract (frontend)
├─ pytest tests/contract/ (backend)
└─ Verify all pass

6. Commit Changes
├─ git add frontend/src/types/ tests/contract/
└─ git commit -m "sync: Update frontend types from backend models"

Implementation:

#!/bin/bash
# workflows/contract-sync.sh

set -e

echo "========================================="
echo "Contract Synchronization Workflow"
echo "========================================="

# Step 1: Detect backend changes
echo "Step 1/6: Detecting backend changes..."
# grep exits non-zero on no match; `|| true` keeps `set -e` from aborting here
MODELS_CHANGED=$(git diff --name-only | grep "models.py" || true)

if [ -z "$MODELS_CHANGED" ]; then
    echo "No model changes detected. Exiting."
    exit 0
fi

echo "Models changed: $MODELS_CHANGED"

# Step 2: Generate TypeScript types
echo "Step 2/6: Generating TypeScript types..."
python scripts/generate-typescript-types.py \
    --models backend/tenants/models.py \
    --output frontend/src/types/generated.ts

# Step 3: Update frontend components (manual step)
echo "Step 3/6: Review and update frontend components..."
echo "Files to review:"
git grep -l "full_name\|company_name" frontend/src/ || true

read -p "Press enter after updating frontend components..."

# Step 4: Generate contract tests
echo "Step 4/6: Generating contract tests..."
python scripts/generate-contract-tests.py \
    --spec openapi.yaml \
    --output-frontend frontend/src/tests/contract/ \
    --output-backend backend/tests/contract/

# Step 5: Run contract tests (subshells leave the working directory unchanged)
echo "Step 5/6: Running contract tests..."
(cd frontend && npm run test:contract)
(cd backend && pytest tests/contract/)

# Step 6: Commit changes
echo "Step 6/6: Committing changes..."
git add frontend/src/types/ frontend/src/tests/contract/ backend/tests/contract/
git commit -m "sync: Update frontend types and contract tests from backend models"

echo "========================================="
echo "✅ Contract Synchronization Complete"
echo "========================================="

6.3 pre-deployment-validation-workflow

Purpose: Comprehensive validation before any deployment.

Trigger: CI/CD pipeline or manual deployment

Steps:

1. Code Quality Checks
├─ Linting (backend: flake8, frontend: eslint)
├─ Type checking (backend: mypy, frontend: tsc)
└─ Security scanning (bandit, npm audit)

2. Migration Validation
├─ /validate-migrations
├─ Check for conflicts
└─ Validate backward compatibility

3. Contract Validation
├─ /check-frontend-backend-contract
├─ Verify field alignment
└─ Run contract tests

4. Environment Validation
├─ Compare env vars (deployment vs job)
├─ Validate secrets exist
└─ Check required variables present

5. Infrastructure Validation
├─ Verify Docker image exists
├─ Check cluster resources available
└─ Validate node affinity (if specified)

6. Database Connectivity
├─ Test database connection
└─ Verify migrations can be applied

7. Generate Report
├─ Aggregate all results
├─ Highlight critical issues
└─ Provide fix commands

Implementation:

#!/bin/bash
# workflows/pre-deployment-validation.sh

set -e

TARGET_ENV=$1
REPORT_FILE="deployment-validation-report-$(date +%Y%m%d-%H%M%S).txt"

echo "=========================================" | tee "$REPORT_FILE"
echo "Pre-Deployment Validation Workflow" | tee -a "$REPORT_FILE"
echo "Target: $TARGET_ENV" | tee -a "$REPORT_FILE"
echo "=========================================" | tee -a "$REPORT_FILE"

FAILED_CHECKS=0

# Step 1: Code quality
echo "Step 1/7: Code quality checks..." | tee -a "$REPORT_FILE"
if ! flake8 backend/; then
    echo "❌ Backend linting failed" | tee -a "$REPORT_FILE"
    FAILED_CHECKS=$((FAILED_CHECKS+1))
fi

# Run frontend lint in a subshell so the working directory is unchanged
if ! (cd frontend && npm run lint); then
    echo "❌ Frontend linting failed" | tee -a "$REPORT_FILE"
    FAILED_CHECKS=$((FAILED_CHECKS+1))
fi

# Step 2: Migration validation
echo "Step 2/7: Migration validation..." | tee -a "$REPORT_FILE"
if ! python scripts/check-migration-conflicts.py; then
    echo "❌ Migration conflicts detected" | tee -a "$REPORT_FILE"
    FAILED_CHECKS=$((FAILED_CHECKS+1))
fi

# Step 3: Contract validation
echo "Step 3/7: Contract validation..." | tee -a "$REPORT_FILE"
if ! python scripts/validate-contract.py; then
    echo "❌ Contract validation failed" | tee -a "$REPORT_FILE"
    FAILED_CHECKS=$((FAILED_CHECKS+1))
fi

# Step 4: Environment validation
echo "Step 4/7: Environment validation..." | tee -a "$REPORT_FILE"
if ! python scripts/compare-env-vars.py --target "$TARGET_ENV"; then
    echo "❌ Environment inconsistencies found" | tee -a "$REPORT_FILE"
    FAILED_CHECKS=$((FAILED_CHECKS+1))
fi

# Step 5: Infrastructure validation
echo "Step 5/7: Infrastructure validation..." | tee -a "$REPORT_FILE"
if ! python scripts/validate-infrastructure.py --target "$TARGET_ENV"; then
    echo "❌ Infrastructure validation failed" | tee -a "$REPORT_FILE"
    FAILED_CHECKS=$((FAILED_CHECKS+1))
fi

# Step 6: Database connectivity
echo "Step 6/7: Database connectivity..." | tee -a "$REPORT_FILE"
if ! python scripts/check-database-connection.py --target "$TARGET_ENV"; then
    echo "❌ Database connection failed" | tee -a "$REPORT_FILE"
    FAILED_CHECKS=$((FAILED_CHECKS+1))
fi

# Step 7: Generate report
echo "Step 7/7: Generating report..." | tee -a "$REPORT_FILE"
echo "=========================================" | tee -a "$REPORT_FILE"
if [ $FAILED_CHECKS -eq 0 ]; then
    echo "✅ ALL CHECKS PASSED" | tee -a "$REPORT_FILE"
    echo "Deployment APPROVED for $TARGET_ENV" | tee -a "$REPORT_FILE"
    exit 0
else
    echo "❌ $FAILED_CHECKS CHECKS FAILED" | tee -a "$REPORT_FILE"
    echo "Deployment BLOCKED for $TARGET_ENV" | tee -a "$REPORT_FILE"
    exit 1
fi

Implementation Plan

Phase 1: Foundation (Week 1)

Goal: Core infrastructure and scripts

Tasks:

  1. Create script structure: scripts/ directory organization
  2. Implement validate-k8s-secrets.py
  3. Implement check-migration-conflicts.py
  4. Implement compare-env-vars.py
  5. Create pre-commit hook template
  6. Document script usage

Deliverables:

  • 3 working scripts with tests
  • Pre-commit hook template
  • Script documentation

Success Criteria:

  • All scripts pass unit tests
  • Scripts integrated into local development

Phase 2: Agent Development (Week 2)

Goal: Specialized agents with UAF v2.0

Tasks:

  1. Create django-deployment-specialist agent specification
  2. Create frontend-backend-contract-validator agent specification
  3. Create kubernetes-migration-specialist agent specification
  4. Implement agent integration with existing scripts
  5. Test agent invocation patterns

Deliverables:

  • 3 agent specifications in .coditect/agents/
  • Agent integration tests
  • Agent usage documentation

Success Criteria:

  • Agents can be invoked via Task tool
  • Agents correctly orchestrate scripts
  • Agent outputs are actionable

Phase 3: Skills & Commands (Week 3)

Goal: Reusable capabilities and user interface

Tasks:

  1. Implement django-migration-validation skill
  2. Implement k8s-manifest-generation skill
  3. Implement typescript-from-django-models skill
  4. Create /validate-deployment-readiness command
  5. Create /generate-migration-job command
  6. Create /check-frontend-backend-contract command

Deliverables:

  • 3 skills with Python implementations
  • 3 commands with command files
  • Integration tests

Success Criteria:

  • Skills callable from agents and commands
  • Commands work end-to-end
  • All integration tests pass

Phase 4: Workflows & Hooks (Week 4)

Goal: End-to-end automation and prevention

Tasks:

  1. Implement backend-deployment-workflow
  2. Implement contract-sync-workflow
  3. Implement pre-deployment-validation-workflow
  4. Create all git hooks
  5. Integrate hooks into CI/CD

Deliverables:

  • 3 workflow scripts
  • 5 git hooks installed
  • CI/CD integration

Success Criteria:

  • Workflows run successfully end-to-end
  • Hooks prevent problematic commits
  • CI/CD uses workflows

Phase 5: Production Hardening (Week 5)

Goal: Production readiness and documentation

Tasks:

  1. Add comprehensive error handling
  2. Add logging and monitoring
  3. Create user documentation
  4. Create troubleshooting guide
  5. Conduct end-to-end testing
  6. Train team on new tools

Deliverables:

  • Production-ready components
  • Complete documentation
  • Team training materials
  • Test coverage report

Success Criteria:

  • 90% test coverage
  • All documentation complete
  • Team trained and using tools
  • Zero regressions in production

Expected Benefits

Time Savings

Before Automation:

  • Manual troubleshooting: 2-4 hours per deployment
  • Manual type synchronization: 30 minutes per model change
  • Manual migration validation: 15 minutes per migration
  • Manual environment validation: 20 minutes per deployment
  • Total: ~3-5 hours per deployment

After Automation:

  • Automated validation: 2 minutes
  • Automated type generation: 30 seconds
  • Automated migration checks: 10 seconds
  • Automated environment checks: 15 seconds
  • Total: ~3 minutes per deployment

Time Savings: 95%+ reduction in deployment troubleshooting

Quality Improvements

Before Automation:

  • First-time deployment success rate: ~50%
  • Average troubleshooting iterations: 3-5
  • Contract mismatches: 20% of deployments
  • Migration conflicts: 10% of deployments

After Automation:

  • First-time deployment success rate: >95%
  • Average troubleshooting iterations: 0-1
  • Contract mismatches: <1% (prevented by hooks)
  • Migration conflicts: 0% (prevented by pre-commit)

Knowledge Preservation

Current State:

  • Troubleshooting knowledge in developer heads
  • Manual documentation (often outdated)
  • Repeated mistakes

Future State:

  • Knowledge encoded in automation
  • Self-documenting code
  • Mistakes prevented by automation

Testing Strategy

Unit Tests

Coverage:

  • Each script has dedicated unit tests
  • Each skill has unit tests with mocks
  • Each agent specification validated

Tools:

  • pytest for Python
  • Jest for TypeScript generation
  • Mock Kubernetes API

Example:

# tests/test_validate_secrets.py
def test_validate_secrets_all_present():
    # Mock Kubernetes API
    with mock_kubernetes_api():
        result = validate_secrets(
            namespace="test",
            secret="test-secrets",
            required=["key1", "key2"]
        )
        assert result.valid == True

def test_validate_secrets_missing_key():
    with mock_kubernetes_api():
        result = validate_secrets(
            namespace="test",
            secret="test-secrets",
            required=["key1", "missing-key"]
        )
        assert result.valid == False
        assert "missing-key" in result.missing_keys

Integration Tests

Coverage:

  • End-to-end workflows
  • Agent orchestration
  • Command execution

Tools:

  • pytest with real Kubernetes cluster (staging)
  • Test fixtures for deployments
  • Cleanup after tests

Example:

# tests/integration/test_deployment_workflow.py
def test_backend_deployment_workflow():
    # 1. Setup test deployment
    create_test_deployment()

    # 2. Run workflow
    result = run_workflow("backend-deployment", target="test")

    # 3. Verify deployment successful
    assert result.success == True
    assert deployment_exists("coditect-backend", "test")

    # 4. Cleanup
    delete_test_deployment()

Contract Tests

Coverage:

  • Frontend/backend alignment
  • API request/response schemas
  • Field name consistency

Tools:

  • Jest for frontend
  • Pytest for backend
  • OpenAPI validation

Example:

// tests/contract/user.contract.test.ts
describe('User Contract', () => {
  it('should match backend User model', async () => {
    const user: User = {
      email: 'test@example.com',
      fullName: 'Test User',   // camelCase
      companyName: 'Test Co'   // camelCase
    };

    const response = await createUser(user);
    expect(response.status).toBe(201);
    expect(response.data).toHaveProperty('email');
    expect(response.data).toHaveProperty('full_name'); // backend returns snake_case
  });
});

Monitoring & Observability

Metrics

Key Metrics:

  • Deployment success rate
  • Time to deploy
  • Validation failures by type
  • Hook trigger frequency
  • Contract test pass rate

Implementation:

# metrics.py
from prometheus_client import Counter, Histogram

deployment_total = Counter('coditect_deployments_total', 'Total deployments')
deployment_success = Counter('coditect_deployments_success', 'Successful deployments')
deployment_duration = Histogram('coditect_deployment_duration_seconds', 'Deployment duration')
validation_failures = Counter('coditect_validation_failures', 'Validation failures', ['type'])

Logging

Log Levels:

  • INFO: Normal operations
  • WARNING: Non-critical issues (e.g., optional checks failed)
  • ERROR: Critical issues (e.g., deployment blocked)

Log Format:

{
  "timestamp": "2025-12-02T08:00:00Z",
  "level": "ERROR",
  "component": "validate-deployment-readiness",
  "target": "staging",
  "check": "migration-validation",
  "status": "failed",
  "details": "Migration conflict detected in licenses app",
  "fix_command": "/validate-migrations --suggest-merge"
}
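Emitting a record in this shape needs only the standard library; the sketch below omits the timestamp field to stay deterministic, and the helper name is illustrative:

```python
# Sketch of structured log emission matching the format above.
import json
import logging

logger = logging.getLogger("validate-deployment-readiness")

def log_check_failure(check, target, details, fix_command):
    """Serialize one failed-check record as a JSON log line."""
    record = {
        "level": "ERROR",
        "component": logger.name,
        "target": target,
        "check": check,
        "status": "failed",
        "details": details,
        "fix_command": fix_command,
    }
    line = json.dumps(record)
    logger.error(line)
    return line

line = log_check_failure("migration-validation", "staging",
                         "Migration conflict detected in licenses app",
                         "/validate-migrations --suggest-merge")
```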

Alerting

Alert Conditions:

  • Deployment failure rate >10%
  • Validation failures >5 per hour
  • Contract tests failing
  • Migration conflicts detected

Alert Channels:

  • Slack #deployments channel
  • Email to on-call engineer
  • PagerDuty (critical only)

Security Considerations

Secret Handling

Requirements:

  • Never log secret values
  • Validate secret references only (not contents)
  • Use Kubernetes RBAC for secret access
  • Rotate secrets regularly

Implementation:

def validate_secret_exists(namespace: str, secret_name: str, key: str) -> bool:
    """
    Validate secret key exists WITHOUT logging its value.
    """
    try:
        secret = core_v1.read_namespaced_secret(secret_name, namespace)
        return key in secret.data
    except ApiException:
        return False
    # NEVER: print(secret.data[key])  # DON'T LOG SECRET VALUES

Access Control

Requirements:

  • Limit who can run deployment workflows
  • Audit all deployment actions
  • Require approval for production deployments

Implementation:

  • GitHub branch protection rules
  • Required reviews for production merges
  • Audit log in CI/CD pipeline

Validation

Requirements:

  • Validate all user inputs
  • Sanitize file paths
  • Prevent command injection

Implementation:

def validate_namespace(namespace: str) -> bool:
    """
    Validate namespace name to prevent injection.
    """
    import re
    # Kubernetes namespace regex
    pattern = r'^[a-z0-9]([-a-z0-9]*[a-z0-9])?$'
    return re.match(pattern, namespace) is not None

Documentation Requirements

User Documentation

Required:

  1. Quick Start Guide
  2. Command Reference
  3. Workflow Guides
  4. Troubleshooting Guide
  5. FAQ

Location: docs/automation/

Developer Documentation

Required:

  1. Agent Development Guide
  2. Skill Development Guide
  3. Script Development Guide
  4. Testing Guide
  5. Architecture Diagrams

Location: docs/development/

API Documentation

Required:

  1. Script CLI reference
  2. Agent invocation patterns
  3. Skill interfaces
  4. Hook installation guide

Location: docs/api/


Rollout Strategy

Phase 1: Staging Deployment (Week 1-2)

Scope:

  • Install all components in staging environment
  • Train team on new tools
  • Run parallel with manual processes

Success Criteria:

  • All scripts work in staging
  • Team comfortable with commands
  • Zero regressions

Phase 2: Production Soft Launch (Week 3-4)

Scope:

  • Enable pre-commit hooks
  • Use workflows for non-critical deployments
  • Monitor closely

Success Criteria:

  • Hooks prevent bad commits
  • Workflows run successfully
  • Team reports time savings

Phase 3: Production Hard Launch (Week 5)

Scope:

  • Make workflows mandatory for all deployments
  • Enforce all validation gates
  • Deprecate manual deployment processes

Success Criteria:

  • 100% deployments use workflows
  • 95% first-time success rate
  • Team fully adopted tools

Maintenance Plan

Regular Tasks

Daily:

  • Monitor deployment success rate
  • Check validation failure logs
  • Review generated manifests

Weekly:

  • Update agent prompts based on feedback
  • Review and improve error messages
  • Update documentation

Monthly:

  • Review and update component inventory
  • Audit security practices
  • Performance optimization

Continuous Improvement

Feedback Loop:

  1. Collect feedback from developers
  2. Identify pain points
  3. Prioritize improvements
  4. Implement enhancements
  5. Measure impact

Version Control:

  • Semantic versioning for components
  • Changelog for each release
  • Deprecation policy for old components

Appendix A: Component Inventory

Agents (3)

  1. django-deployment-specialist - Django deployment expertise
  2. frontend-backend-contract-validator - Contract alignment
  3. kubernetes-migration-specialist - K8s Job patterns

Skills (5)

  1. django-migration-validation - Migration analysis
  2. k8s-manifest-generation - Manifest generation
  3. typescript-from-django-models - Type generation
  4. environment-consistency-checker - Env var validation
  5. contract-test-generator - Contract test creation

Commands (6)

  1. /validate-deployment-readiness - Pre-deployment validation
  2. /generate-migration-job - Job manifest generation
  3. /check-frontend-backend-contract - Contract validation
  4. /validate-migrations - Migration conflict detection
  5. /sync-types - TypeScript type generation
  6. /troubleshoot-deployment - Interactive troubleshooting

Scripts (6)

  1. generate-migration-job-from-deployment.py - Job generation
  2. validate-k8s-secrets.py - Secret validation
  3. check-migration-conflicts.py - Conflict detection
  4. generate-typescript-types.py - Type generation
  5. compare-env-vars.py - Environment comparison
  6. validate-deployment-readiness.py - Comprehensive validation

Hooks (5)

  1. pre-commit-migration-check - Prevent conflicting migrations
  2. pre-commit-contract-test - Validate contracts
  3. post-merge-type-sync - Auto-sync types
  4. pre-deployment-validation - Pre-deployment checks
  5. post-deployment-validation - Post-deployment verification

Workflows (3)

  1. backend-deployment-workflow - Complete backend deployment
  2. contract-sync-workflow - Frontend/backend alignment
  3. pre-deployment-validation-workflow - Comprehensive validation

Appendix B: Success Metrics

Deployment Efficiency

| Metric | Before | Target | Measurement |
|---|---|---|---|
| Time to deploy | 3-5 hours | 15 minutes | CI/CD logs |
| First-time success | 50% | >95% | Deployment tracking |
| Troubleshooting iterations | 3-5 | 0-1 | Developer surveys |
| Manual validation time | 45 minutes | 3 minutes | Time tracking |

Quality Metrics

| Metric | Before | Target | Measurement |
|---|---|---|---|
| Contract mismatches | 20% | <1% | Contract test results |
| Migration conflicts | 10% | 0% | Git hook triggers |
| Secret validation failures | 15% | <2% | Validation logs |
| Environment inconsistencies | 25% | <3% | Comparison results |

Team Metrics

| Metric | Before | Target | Measurement |
|---|---|---|---|
| Developer satisfaction | 6/10 | 9/10 | Surveys |
| Deployment confidence | 5/10 | 9/10 | Surveys |
| Knowledge sharing | 4/10 | 9/10 | Team feedback |
| Onboarding time | 2 weeks | 3 days | HR tracking |

Appendix C: Risk Mitigation

Risk: Automation Complexity

Likelihood: Medium
Impact: Medium
Mitigation:

  • Start with simple scripts
  • Gradual rollout
  • Comprehensive documentation
  • Team training

Risk: False Positives

Likelihood: Medium
Impact: Low
Mitigation:

  • Tunable validation thresholds
  • Override mechanism for edge cases
  • Continuous improvement of detection logic

Risk: Dependency on Kubernetes API

Likelihood: Low
Impact: High
Mitigation:

  • Graceful degradation if API unavailable
  • Caching of cluster state
  • Manual override option

Risk: Learning Curve

Likelihood: High
Impact: Low
Mitigation:

  • Interactive tutorials
  • Command suggestions
  • Clear error messages
  • 1-on-1 training

Document Version: 1.0
Created: December 2, 2025
Author: AI Assistant (Claude Code)
Status: Complete - Ready for Review
Next Action: Add to tasklist.md and project-plan.md