CI/CD Pipeline for BIO-QMS Platform

Document Version: 1.0.0 | Last Updated: 2026-02-16 | Owner: DevOps Team | Compliance: FDA 21 CFR Part 11, HIPAA, SOC 2 Type II


Table of Contents

  1. Overview
  2. E.1.1: GitHub Actions CI Pipeline
  3. E.1.2: Automated Deployment Pipeline
  4. E.1.3: Container Image Build and Registry
  5. E.1.4: Database Migration CI Step
  6. Appendices

Overview

Architecture

The BIO-QMS CI/CD pipeline is designed for a regulated SaaS platform serving pharmaceutical and biotech companies. The pipeline enforces quality gates, security scanning, and compliance audit trails at every stage.

Technology Stack:

  • Frontend: Vite + React 19, TypeScript
  • Backend: Django REST Framework, Python 3.11
  • Database: PostgreSQL 15, Redis
  • Infrastructure: GCP (Cloud Run, Cloud SQL, Memorystore)
  • CI/CD: GitHub Actions
  • Container Registry: Google Artifact Registry

Compliance Requirements:

  • FDA 21 CFR Part 11: Electronic records and signatures
  • HIPAA: Protected health information security
  • SOC 2 Type II: Security, availability, confidentiality

Pipeline Stages:

  1. CI: Continuous Integration (lint, test, build, scan)
  2. CD: Continuous Deployment (staging auto, production manual)
  3. Registry: Container build, sign, scan, store
  4. Migrations: Database schema validation and deployment

E.1.1: GitHub Actions CI Pipeline

Complete CI Workflow

File: .github/workflows/ci.yml

name: CI Pipeline

on:
push:
branches:
- main
- develop
- 'release/**'
pull_request:
branches:
- main
- develop
workflow_dispatch:
inputs:
skip_tests:
description: 'Skip test execution (use cautiously)'
required: false
type: boolean
default: false

env:
NODE_VERSION: '20'
PYTHON_VERSION: '3.11'
POSTGRES_VERSION: '15'
MINIMUM_COVERAGE: 80

# Cancel in-progress runs for same branch
concurrency:
group: ci-${{ github.ref }}
cancel-in-progress: true

jobs:
# ============================================
# FRONTEND CI JOBS
# ============================================

frontend-lint:
name: Frontend Lint
runs-on: ubuntu-latest
timeout-minutes: 10

steps:
- name: Checkout code
uses: actions/checkout@v4
with:
fetch-depth: 0 # Full history for better analysis

- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: ${{ env.NODE_VERSION }}
cache: 'npm'
cache-dependency-path: frontend/package-lock.json

- name: Install dependencies
working-directory: frontend
run: npm ci --prefer-offline --no-audit

- name: Run ESLint
working-directory: frontend
run: |
npm run lint -- --format=json --output-file=eslint-report.json || true
npm run lint

- name: Upload ESLint report
if: always()
uses: actions/upload-artifact@v4
with:
name: eslint-report
path: frontend/eslint-report.json
retention-days: 30

- name: Run Prettier check
working-directory: frontend
run: npm run format:check

frontend-typecheck:
name: Frontend Type Check
runs-on: ubuntu-latest
timeout-minutes: 10

steps:
- name: Checkout code
uses: actions/checkout@v4

- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: ${{ env.NODE_VERSION }}
cache: 'npm'
cache-dependency-path: frontend/package-lock.json

- name: Install dependencies
working-directory: frontend
run: npm ci --prefer-offline --no-audit

- name: Run TypeScript compiler
working-directory: frontend
run: |
npx tsc --noEmit --incremental false --pretty
echo "✓ TypeScript type checking passed"

frontend-test:
name: Frontend Tests
runs-on: ubuntu-latest
timeout-minutes: 15

steps:
- name: Checkout code
uses: actions/checkout@v4

- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: ${{ env.NODE_VERSION }}
cache: 'npm'
cache-dependency-path: frontend/package-lock.json

- name: Install dependencies
working-directory: frontend
run: npm ci --prefer-offline --no-audit

- name: Run Vitest with coverage
if: ${{ !inputs.skip_tests }}
working-directory: frontend
run: |
npm run test:coverage -- --run --reporter=verbose --reporter=json --outputFile=test-results.json
env:
CI: true

- name: Check coverage threshold
if: ${{ !inputs.skip_tests }}
working-directory: frontend
run: |
COVERAGE=$(cat coverage/coverage-summary.json | jq '.total.lines.pct')
echo "Coverage: ${COVERAGE}%"
if (( $(echo "$COVERAGE < ${{ env.MINIMUM_COVERAGE }}" | bc -l) )); then
echo "❌ Coverage ${COVERAGE}% is below minimum ${{ env.MINIMUM_COVERAGE }}%"
exit 1
fi
echo "✓ Coverage threshold met"

- name: Upload coverage to Codecov
if: ${{ !inputs.skip_tests }}
uses: codecov/codecov-action@v4
with:
files: frontend/coverage/coverage-final.json
flags: frontend
name: frontend-coverage
fail_ci_if_error: true
env:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}

- name: Upload test results
if: always()
uses: actions/upload-artifact@v4
with:
name: frontend-test-results
path: |
frontend/test-results.json
frontend/coverage/
retention-days: 30

frontend-build:
name: Frontend Build
runs-on: ubuntu-latest
timeout-minutes: 15
needs: [frontend-lint, frontend-typecheck]

steps:
- name: Checkout code
uses: actions/checkout@v4

- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: ${{ env.NODE_VERSION }}
cache: 'npm'
cache-dependency-path: frontend/package-lock.json

- name: Install dependencies
working-directory: frontend
run: npm ci --prefer-offline --no-audit

- name: Build production bundle
working-directory: frontend
run: |
npm run build
echo "✓ Build completed"
env:
NODE_ENV: production

- name: Analyze bundle size
working-directory: frontend
run: |
du -sh dist/
echo "Bundle contents:"
ls -lh dist/assets/ | tail -20

- name: Upload build artifacts
uses: actions/upload-artifact@v4
with:
name: frontend-dist
path: frontend/dist/
retention-days: 7

# ============================================
# BACKEND CI JOBS
# ============================================

backend-lint:
name: Backend Lint
runs-on: ubuntu-latest
timeout-minutes: 10

steps:
- name: Checkout code
uses: actions/checkout@v4

- name: Setup Python
uses: actions/setup-python@v5
with:
python-version: ${{ env.PYTHON_VERSION }}
cache: 'pip'
cache-dependency-path: backend/requirements.txt

- name: Install dependencies
working-directory: backend
run: |
pip install --upgrade pip
pip install ruff black isort mypy
pip install -r requirements.txt

- name: Run Ruff
working-directory: backend
run: |
ruff check . --output-format=json --output-file=ruff-report.json || true
ruff check . --statistics
echo "✓ Ruff linting completed"

- name: Run Black
working-directory: backend
run: |
black --check --diff .
echo "✓ Black formatting check passed"

- name: Run isort
working-directory: backend
run: |
isort --check-only --diff .
echo "✓ isort import sorting check passed"

- name: Upload lint reports
if: always()
uses: actions/upload-artifact@v4
with:
name: backend-lint-reports
path: backend/ruff-report.json
retention-days: 30

backend-typecheck:
name: Backend Type Check
runs-on: ubuntu-latest
timeout-minutes: 10

steps:
- name: Checkout code
uses: actions/checkout@v4

- name: Setup Python
uses: actions/setup-python@v5
with:
python-version: ${{ env.PYTHON_VERSION }}
cache: 'pip'
cache-dependency-path: backend/requirements.txt

- name: Install dependencies
working-directory: backend
run: |
pip install --upgrade pip
pip install mypy types-requests django-stubs djangorestframework-stubs
pip install -r requirements.txt

- name: Run mypy
working-directory: backend
run: |
mypy . --config-file=mypy.ini --junit-xml=mypy-report.xml
echo "✓ mypy type checking passed"

- name: Upload mypy report
if: always()
uses: actions/upload-artifact@v4
with:
name: mypy-report
path: backend/mypy-report.xml
retention-days: 30

backend-test:
name: Backend Tests
runs-on: ubuntu-latest
timeout-minutes: 20

services:
postgres:
image: postgres:15-alpine
env:
POSTGRES_DB: bio_qms_test
POSTGRES_USER: postgres
POSTGRES_PASSWORD: postgres
options: >-
--health-cmd pg_isready
--health-interval 10s
--health-timeout 5s
--health-retries 5
ports:
- 5432:5432

redis:
image: redis:7-alpine
options: >-
--health-cmd "redis-cli ping"
--health-interval 10s
--health-timeout 5s
--health-retries 5
ports:
- 6379:6379

steps:
- name: Checkout code
uses: actions/checkout@v4

- name: Setup Python
uses: actions/setup-python@v5
with:
python-version: ${{ env.PYTHON_VERSION }}
cache: 'pip'
cache-dependency-path: backend/requirements.txt

- name: Install dependencies
working-directory: backend
run: |
pip install --upgrade pip
pip install -r requirements.txt
pip install pytest pytest-cov pytest-django pytest-xdist

- name: Run Django checks
working-directory: backend
run: |
python manage.py check --deploy
python manage.py check --database default
echo "✓ Django checks passed"
env:
DATABASE_URL: postgresql://postgres:postgres@localhost:5432/bio_qms_test
REDIS_URL: redis://localhost:6379/0
SECRET_KEY: test-secret-key-for-ci
DEBUG: 'False'

- name: Run pytest with coverage
if: ${{ !inputs.skip_tests }}
working-directory: backend
run: |
pytest \
--cov=. \
--cov-report=xml \
--cov-report=html \
--cov-report=term-missing \
--junit-xml=pytest-report.xml \
--verbose \
--numprocesses=auto \
--maxfail=5
env:
DATABASE_URL: postgresql://postgres:postgres@localhost:5432/bio_qms_test
REDIS_URL: redis://localhost:6379/0
SECRET_KEY: test-secret-key-for-ci
DEBUG: 'True'
TESTING: 'True'

- name: Check coverage threshold
if: ${{ !inputs.skip_tests }}
working-directory: backend
run: |
COVERAGE=$(python -c "import xml.etree.ElementTree as ET; tree = ET.parse('coverage.xml'); root = tree.getroot(); print(root.attrib['line-rate'])")
COVERAGE_PCT=$(echo "$COVERAGE * 100" | bc)
echo "Coverage: ${COVERAGE_PCT}%"
if (( $(echo "$COVERAGE_PCT < ${{ env.MINIMUM_COVERAGE }}" | bc -l) )); then
echo "❌ Coverage ${COVERAGE_PCT}% is below minimum ${{ env.MINIMUM_COVERAGE }}%"
exit 1
fi
echo "✓ Coverage threshold met"

- name: Upload coverage to Codecov
if: ${{ !inputs.skip_tests }}
uses: codecov/codecov-action@v4
with:
files: backend/coverage.xml
flags: backend
name: backend-coverage
fail_ci_if_error: true
env:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}

- name: Upload test results
if: always()
uses: actions/upload-artifact@v4
with:
name: backend-test-results
path: |
backend/pytest-report.xml
backend/coverage.xml
backend/htmlcov/
retention-days: 30

# ============================================
# SECURITY SCANNING JOBS
# ============================================

security-scan-frontend:
name: Security Scan - Frontend
runs-on: ubuntu-latest
timeout-minutes: 15

steps:
- name: Checkout code
uses: actions/checkout@v4

- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: ${{ env.NODE_VERSION }}

- name: Run npm audit
working-directory: frontend
run: |
npm audit --audit-level=moderate --json > npm-audit.json || true
npm audit --audit-level=moderate
continue-on-error: true

- name: Upload npm audit report
if: always()
uses: actions/upload-artifact@v4
with:
name: npm-audit-report
path: frontend/npm-audit.json
retention-days: 30

security-scan-backend:
name: Security Scan - Backend
runs-on: ubuntu-latest
timeout-minutes: 15

steps:
- name: Checkout code
uses: actions/checkout@v4

- name: Setup Python
uses: actions/setup-python@v5
with:
python-version: ${{ env.PYTHON_VERSION }}

- name: Install pip-audit
run: pip install pip-audit

- name: Run pip-audit
working-directory: backend
run: |
# Audit the pinned requirements rather than the (empty) runner environment
pip-audit -r requirements.txt --format=json --output=pip-audit.json || true
pip-audit -r requirements.txt
continue-on-error: true

- name: Upload pip-audit report
if: always()
uses: actions/upload-artifact@v4
with:
name: pip-audit-report
path: backend/pip-audit.json
retention-days: 30

security-scan-trivy:
name: Security Scan - Trivy
runs-on: ubuntu-latest
timeout-minutes: 15
needs: [frontend-build]

steps:
- name: Checkout code
uses: actions/checkout@v4

- name: Run Trivy vulnerability scanner - Filesystem
uses: aquasecurity/trivy-action@master
with:
scan-type: 'fs'
scan-ref: '.'
format: 'sarif'
output: 'trivy-fs-results.sarif'
severity: 'CRITICAL,HIGH'

- name: Upload Trivy results to GitHub Security
uses: github/codeql-action/upload-sarif@v3
with:
sarif_file: 'trivy-fs-results.sarif'

- name: Run Trivy vulnerability scanner - Config files
uses: aquasecurity/trivy-action@master
with:
scan-type: 'config'
scan-ref: '.'
format: 'table'
exit-code: '0'

# ============================================
# COMPLIANCE & AUDIT TRAIL
# ============================================

compliance-audit:
name: Compliance Audit Trail
runs-on: ubuntu-latest
if: github.event_name == 'push' && (github.ref == 'refs/heads/main' || startsWith(github.ref, 'refs/heads/release/'))
needs: [frontend-test, backend-test, security-scan-trivy]

steps:
- name: Checkout code
uses: actions/checkout@v4

- name: Generate audit record
run: |
cat > audit-record.json <<EOF
{
"event_type": "ci_pipeline_execution",
"timestamp": "$(date -u +"%Y-%m-%dT%H:%M:%SZ")",
"commit_sha": "${{ github.sha }}",
"commit_message": $(jq -n --arg msg "$(git log -1 --pretty=%B)" '$msg'),
"branch": "${{ github.ref_name }}",
"actor": "${{ github.actor }}",
"workflow": "${{ github.workflow }}",
"run_id": "${{ github.run_id }}",
"run_number": "${{ github.run_number }}",
"repository": "${{ github.repository }}",
"event_name": "${{ github.event_name }}",
"jobs_status": {
"frontend_tests": "${{ needs.frontend-test.result }}",
"backend_tests": "${{ needs.backend-test.result }}",
"security_scan": "${{ needs.security-scan-trivy.result }}"
},
"compliance_frameworks": ["FDA_21_CFR_Part_11", "HIPAA", "SOC2_Type_II"],
"artifacts_signed": false,
"code_review_required": true,
"approval_status": "pending_deployment"
}
EOF

- name: Upload audit record
uses: actions/upload-artifact@v4
with:
name: compliance-audit-record
path: audit-record.json
retention-days: 2555 # 7 years for FDA compliance

- name: Authenticate to Google Cloud
uses: google-github-actions/auth@v2
with:
credentials_json: ${{ secrets.GCP_SA_KEY }}

- name: Upload to audit bucket
run: |
gsutil cp audit-record.json gs://bio-qms-audit-logs/ci-pipeline/${{ github.run_id }}/

# ============================================
# CI SUMMARY JOB
# ============================================

ci-complete:
name: CI Pipeline Complete
runs-on: ubuntu-latest
if: always()
needs:
- frontend-lint
- frontend-typecheck
- frontend-test
- frontend-build
- backend-lint
- backend-typecheck
- backend-test
- security-scan-frontend
- security-scan-backend
- security-scan-trivy

steps:
- name: Check job results
run: |
echo "Frontend Lint: ${{ needs.frontend-lint.result }}"
echo "Frontend Type Check: ${{ needs.frontend-typecheck.result }}"
echo "Frontend Tests: ${{ needs.frontend-test.result }}"
echo "Frontend Build: ${{ needs.frontend-build.result }}"
echo "Backend Lint: ${{ needs.backend-lint.result }}"
echo "Backend Type Check: ${{ needs.backend-typecheck.result }}"
echo "Backend Tests: ${{ needs.backend-test.result }}"
echo "Security Scan Frontend: ${{ needs.security-scan-frontend.result }}"
echo "Security Scan Backend: ${{ needs.security-scan-backend.result }}"
echo "Security Scan Trivy: ${{ needs.security-scan-trivy.result }}"

if [[ "${{ needs.frontend-lint.result }}" != "success" ]] || \
[[ "${{ needs.frontend-typecheck.result }}" != "success" ]] || \
[[ "${{ needs.frontend-test.result }}" != "success" ]] || \
[[ "${{ needs.frontend-build.result }}" != "success" ]] || \
[[ "${{ needs.backend-lint.result }}" != "success" ]] || \
[[ "${{ needs.backend-typecheck.result }}" != "success" ]] || \
[[ "${{ needs.backend-test.result }}" != "success" ]] || \
[[ "${{ needs.security-scan-trivy.result }}" != "success" ]]; then
echo "❌ CI Pipeline Failed"
exit 1
fi

echo "✅ CI Pipeline Passed - All quality gates met"

- name: Post summary
run: |
cat >> $GITHUB_STEP_SUMMARY <<EOF
## CI Pipeline Results

| Job | Status |
|-----|--------|
| Frontend Lint | ${{ needs.frontend-lint.result }} |
| Frontend Type Check | ${{ needs.frontend-typecheck.result }} |
| Frontend Tests | ${{ needs.frontend-test.result }} |
| Frontend Build | ${{ needs.frontend-build.result }} |
| Backend Lint | ${{ needs.backend-lint.result }} |
| Backend Type Check | ${{ needs.backend-typecheck.result }} |
| Backend Tests | ${{ needs.backend-test.result }} |
| Security Scan (Frontend) | ${{ needs.security-scan-frontend.result }} |
| Security Scan (Backend) | ${{ needs.security-scan-backend.result }} |
| Security Scan (Trivy) | ${{ needs.security-scan-trivy.result }} |

**Compliance:** Audit record generated and stored for 7 years

**Next Steps:** If all jobs passed, this build is eligible for deployment to staging.
EOF
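
The two coverage-gate steps above parse different report formats (Vitest's `coverage-summary.json` and coverage.py's Cobertura `coverage.xml`) but enforce the same threshold. A standalone Python sketch of that shared logic, assuming those report shapes:

```python
import json
import xml.etree.ElementTree as ET

MINIMUM_COVERAGE = 80.0  # mirrors env.MINIMUM_COVERAGE in the workflow


def frontend_line_coverage(summary_json: str) -> float:
    """Total line coverage (%) from an istanbul/Vitest coverage-summary.json."""
    return float(json.loads(summary_json)["total"]["lines"]["pct"])


def backend_line_coverage(cobertura_xml: str) -> float:
    """Line coverage (%) from coverage.py's Cobertura XML (line-rate is 0..1)."""
    root = ET.fromstring(cobertura_xml)
    return float(root.attrib["line-rate"]) * 100.0


def meets_threshold(pct: float, minimum: float = MINIMUM_COVERAGE) -> bool:
    """The gate both CI legs apply before allowing the build to proceed."""
    return pct >= minimum
```

Either percentage failing `meets_threshold` is what makes the corresponding "Check coverage threshold" step exit non-zero.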

PR Status Checks Configuration

File: .github/workflows/pr-checks.yml

name: PR Status Checks

on:
pull_request:
types: [opened, synchronize, reopened, ready_for_review]

permissions:
contents: read
pull-requests: write
statuses: write

jobs:
pr-validation:
name: PR Validation
runs-on: ubuntu-latest
if: github.event.pull_request.draft == false

steps:
- name: Checkout code
uses: actions/checkout@v4

- name: Check PR title format
run: |
PR_TITLE="${{ github.event.pull_request.title }}"
if ! echo "$PR_TITLE" | grep -qE '^(feat|fix|docs|style|refactor|perf|test|build|ci|chore|revert)(\(.+\))?: .+'; then
echo "❌ PR title must follow Conventional Commits format"
echo "Examples:"
echo " - feat(auth): add SSO login"
echo " - fix(api): resolve CORS issue"
echo " - docs: update deployment guide"
exit 1
fi
echo "✓ PR title format is valid"

- name: Check PR description
run: |
PR_BODY="${{ github.event.pull_request.body }}"
if [ -z "$PR_BODY" ] || [ ${#PR_BODY} -lt 50 ]; then
echo "❌ PR description must be at least 50 characters"
exit 1
fi
echo "✓ PR description is adequate"

- name: Check for linked issues
run: |
PR_BODY="${{ github.event.pull_request.body }}"
if ! echo "$PR_BODY" | grep -qE '(Closes|Fixes|Resolves) #[0-9]+'; then
echo "⚠️ Warning: No linked issue found (recommended)"
else
echo "✓ PR links to issue"
fi

- name: Check for breaking changes
id: breaking_check
run: |
PR_BODY="${{ github.event.pull_request.body }}"
if echo "$PR_BODY" | grep -qi 'BREAKING CHANGE'; then
echo "⚠️ BREAKING CHANGE detected - requires additional review"
echo "breaking_change=true" >> $GITHUB_OUTPUT
fi

- name: Label PR with breaking change
if: steps.breaking_check.outputs.breaking_change == 'true'
uses: actions/github-script@v7
with:
github-token: ${{ secrets.GITHUB_TOKEN }}
script: |
github.rest.issues.addLabels({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: context.issue.number,
labels: ['breaking-change']
})

code-review-assignment:
name: Auto-assign Reviewers
runs-on: ubuntu-latest
if: github.event.pull_request.draft == false

steps:
- name: Assign reviewers based on CODEOWNERS
uses: actions/github-script@v7
with:
github-token: ${{ secrets.GITHUB_TOKEN }}
script: |
const { data: files } = await github.rest.pulls.listFiles({
owner: context.repo.owner,
repo: context.repo.repo,
pull_number: context.issue.number
});

const reviewers = new Set();

// Add specific reviewers based on file paths
for (const file of files) {
if (file.filename.startsWith('backend/')) {
reviewers.add('backend-team');
}
if (file.filename.startsWith('frontend/')) {
reviewers.add('frontend-team');
}
if (file.filename.includes('migration')) {
reviewers.add('database-team');
}
if (file.filename.includes('Dockerfile') || file.filename.includes('.github/workflows')) {
reviewers.add('devops-team');
}
}

if (reviewers.size > 0) {
await github.rest.pulls.requestReviewers({
owner: context.repo.owner,
repo: context.repo.repo,
pull_number: context.issue.number,
team_reviewers: Array.from(reviewers)
});
}

size-label:
name: Label PR by Size
runs-on: ubuntu-latest

steps:
- name: Calculate PR size
uses: actions/github-script@v7
with:
github-token: ${{ secrets.GITHUB_TOKEN }}
script: |
const { data: files } = await github.rest.pulls.listFiles({
owner: context.repo.owner,
repo: context.repo.repo,
pull_number: context.issue.number
});

const additions = files.reduce((sum, file) => sum + file.additions, 0);
const deletions = files.reduce((sum, file) => sum + file.deletions, 0);
const total = additions + deletions;

let size = 'XS';
if (total > 1000) size = 'XL';
else if (total > 500) size = 'L';
else if (total > 100) size = 'M';
else if (total > 10) size = 'S';

await github.rest.issues.addLabels({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: context.issue.number,
labels: [`size/${size}`]
});

console.log(`PR size: ${size} (${additions} additions, ${deletions} deletions)`);
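
The size bucketing above is plain threshold logic; the same rules in a standalone Python sketch (thresholds copied from the workflow):

```python
def pr_size_label(additions: int, deletions: int) -> str:
    """Bucket a PR by total changed lines, mirroring the size-label job."""
    total = additions + deletions
    if total > 1000:
        return "XL"
    if total > 500:
        return "L"
    if total > 100:
        return "M"
    if total > 10:
        return "S"
    return "XS"
```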

Branch Protection Rules

Configure in GitHub repository settings:

# Repository Settings > Branches > Branch protection rules
# Rule for: main

required_status_checks:
strict: true
contexts:
- "Frontend Lint"
- "Frontend Type Check"
- "Frontend Tests"
- "Frontend Build"
- "Backend Lint"
- "Backend Type Check"
- "Backend Tests"
- "Security Scan - Trivy"
- "CI Pipeline Complete"

required_pull_request_reviews:
required_approving_review_count: 2
dismiss_stale_reviews: true
require_code_owner_reviews: true
require_last_push_approval: true

enforce_admins: true
allow_force_pushes: false
allow_deletions: false
required_linear_history: true
required_signatures: false # Enable if using commit signing

Code Coverage Enforcement

File: .github/workflows/coverage-report.yml

name: Coverage Report

on:
pull_request:
types: [opened, synchronize]
push:
branches: [main]

jobs:
coverage-comment:
name: Post Coverage Comment
runs-on: ubuntu-latest
if: github.event_name == 'pull_request'

steps:
- name: Download coverage artifacts
uses: actions/download-artifact@v4
with:
# Match the artifact names uploaded by ci.yml; note that pulling
# artifacts from a different workflow run requires the run-id and
# github-token inputs of download-artifact@v4
pattern: '*-test-results'
merge-multiple: true

- name: Generate coverage comment
uses: actions/github-script@v7
with:
github-token: ${{ secrets.GITHUB_TOKEN }}
script: |
const fs = require('fs');

let frontendCoverage = null;
let backendCoverage = null;

try {
const frontendData = JSON.parse(fs.readFileSync('frontend/coverage/coverage-summary.json'));
frontendCoverage = frontendData.total.lines.pct;
} catch (e) {
console.log('Frontend coverage not available');
}

try {
const xml = fs.readFileSync('backend/coverage.xml', 'utf8');
const match = xml.match(/line-rate="([0-9.]+)"/);
if (match) {
backendCoverage = parseFloat(match[1]) * 100;
}
} catch (e) {
console.log('Backend coverage not available');
}

// Keep coverage numeric so the >= 80 checks compare numbers, not strings
const fmt = (pct) => pct === null ? 'N/A' : pct.toFixed(2);
const ok = (pct) => pct !== null && pct >= 80;

const body = `## Code Coverage Report

| Component | Coverage | Status |
|-----------|----------|--------|
| Frontend | ${fmt(frontendCoverage)}% | ${ok(frontendCoverage) ? '✅' : '❌'} |
| Backend | ${fmt(backendCoverage)}% | ${ok(backendCoverage) ? '✅' : '❌'} |

**Minimum Required:** 80%

${ok(frontendCoverage) && ok(backendCoverage) ? '✅ Coverage requirements met' : '⚠️ Coverage below threshold - please add more tests'}
`;

const { data: comments } = await github.rest.issues.listComments({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: context.issue.number
});

const botComment = comments.find(comment =>
comment.user.type === 'Bot' && comment.body.includes('Code Coverage Report')
);

if (botComment) {
await github.rest.issues.updateComment({
owner: context.repo.owner,
repo: context.repo.repo,
comment_id: botComment.id,
body: body
});
} else {
await github.rest.issues.createComment({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: context.issue.number,
body: body
});
}

E.1.2: Automated Deployment Pipeline

Complete CD Workflow

File: .github/workflows/cd.yml

name: CD Pipeline

on:
push:
branches:
- main
workflow_dispatch:
inputs:
environment:
description: 'Target environment'
required: true
type: choice
options:
- staging
- production
skip_tests:
description: 'Skip smoke tests (not recommended)'
required: false
type: boolean
default: false

env:
GCP_PROJECT_ID: bio-qms-prod
GCP_REGION: us-central1
ARTIFACT_REGISTRY: us-central1-docker.pkg.dev
SERVICE_NAME: bio-qms

concurrency:
group: deploy-${{ github.event.inputs.environment || 'staging' }}
cancel-in-progress: false # Never cancel deployments

jobs:
# ============================================
# BUILD & PUSH IMAGES
# ============================================

build-backend:
name: Build Backend Image
runs-on: ubuntu-latest
timeout-minutes: 30
outputs:
image: ${{ steps.image.outputs.image }}
version: ${{ steps.version.outputs.version }}

steps:
- name: Checkout code
uses: actions/checkout@v4
with:
fetch-depth: 0

- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3

- name: Authenticate to Google Cloud
uses: google-github-actions/auth@v2
with:
credentials_json: ${{ secrets.GCP_SA_KEY }}

- name: Configure Docker for Artifact Registry
run: |
gcloud auth configure-docker ${{ env.ARTIFACT_REGISTRY }}

- name: Generate version
id: version
run: |
VERSION=$(git describe --tags --always --dirty)
TIMESTAMP=$(date -u +"%Y%m%d%H%M%S")
FULL_VERSION="${VERSION}-${TIMESTAMP}"
echo "version=${FULL_VERSION}" >> $GITHUB_OUTPUT
echo "build_date=$(date -u +"%Y-%m-%dT%H:%M:%SZ")" >> $GITHUB_OUTPUT
echo "Version: ${FULL_VERSION}"

- name: Build and push backend image
uses: docker/build-push-action@v5
with:
context: ./backend
file: ./backend/Dockerfile
push: true
tags: |
${{ env.ARTIFACT_REGISTRY }}/${{ env.GCP_PROJECT_ID }}/bio-qms/backend:${{ steps.version.outputs.version }}
${{ env.ARTIFACT_REGISTRY }}/${{ env.GCP_PROJECT_ID }}/bio-qms/backend:${{ github.sha }}
${{ env.ARTIFACT_REGISTRY }}/${{ env.GCP_PROJECT_ID }}/bio-qms/backend:latest
cache-from: type=gha
cache-to: type=gha,mode=max
# BUILD_DATE comes from the version step above: $(date ...) is not
# shell-expanded inside the action's build-args input
build-args: |
BUILD_DATE=${{ steps.version.outputs.build_date }}
VCS_REF=${{ github.sha }}
VERSION=${{ steps.version.outputs.version }}

- name: Set image output
id: image
run: |
IMAGE="${{ env.ARTIFACT_REGISTRY }}/${{ env.GCP_PROJECT_ID }}/bio-qms/backend:${{ steps.version.outputs.version }}"
echo "image=${IMAGE}" >> $GITHUB_OUTPUT

- name: Scan image with Trivy
uses: aquasecurity/trivy-action@master
with:
image-ref: ${{ steps.image.outputs.image }}
format: 'sarif'
output: 'trivy-backend-results.sarif'
severity: 'CRITICAL,HIGH'

- name: Upload Trivy results
uses: github/codeql-action/upload-sarif@v3
with:
sarif_file: 'trivy-backend-results.sarif'

build-frontend:
name: Build Frontend Image
runs-on: ubuntu-latest
timeout-minutes: 30
outputs:
image: ${{ steps.image.outputs.image }}
version: ${{ steps.version.outputs.version }}

steps:
- name: Checkout code
uses: actions/checkout@v4
with:
fetch-depth: 0

- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3

- name: Authenticate to Google Cloud
uses: google-github-actions/auth@v2
with:
credentials_json: ${{ secrets.GCP_SA_KEY }}

- name: Configure Docker for Artifact Registry
run: |
gcloud auth configure-docker ${{ env.ARTIFACT_REGISTRY }}

- name: Generate version
id: version
run: |
VERSION=$(git describe --tags --always --dirty)
TIMESTAMP=$(date -u +"%Y%m%d%H%M%S")
FULL_VERSION="${VERSION}-${TIMESTAMP}"
echo "version=${FULL_VERSION}" >> $GITHUB_OUTPUT
echo "build_date=$(date -u +"%Y-%m-%dT%H:%M:%SZ")" >> $GITHUB_OUTPUT
echo "Version: ${FULL_VERSION}"

- name: Build and push frontend image
uses: docker/build-push-action@v5
with:
context: ./frontend
file: ./frontend/Dockerfile
push: true
tags: |
${{ env.ARTIFACT_REGISTRY }}/${{ env.GCP_PROJECT_ID }}/bio-qms/frontend:${{ steps.version.outputs.version }}
${{ env.ARTIFACT_REGISTRY }}/${{ env.GCP_PROJECT_ID }}/bio-qms/frontend:${{ github.sha }}
${{ env.ARTIFACT_REGISTRY }}/${{ env.GCP_PROJECT_ID }}/bio-qms/frontend:latest
cache-from: type=gha
cache-to: type=gha,mode=max
# BUILD_DATE comes from the version step above: $(date ...) is not
# shell-expanded inside the action's build-args input
build-args: |
BUILD_DATE=${{ steps.version.outputs.build_date }}
VCS_REF=${{ github.sha }}
VERSION=${{ steps.version.outputs.version }}

- name: Set image output
id: image
run: |
IMAGE="${{ env.ARTIFACT_REGISTRY }}/${{ env.GCP_PROJECT_ID }}/bio-qms/frontend:${{ steps.version.outputs.version }}"
echo "image=${IMAGE}" >> $GITHUB_OUTPUT

# ============================================
# STAGING DEPLOYMENT
# ============================================

deploy-staging:
name: Deploy to Staging
runs-on: ubuntu-latest
timeout-minutes: 20
needs: [build-backend, build-frontend]
if: github.ref == 'refs/heads/main' && github.event.inputs.environment != 'production'
environment:
name: staging
url: https://staging.bio-qms.com

steps:
- name: Checkout code
uses: actions/checkout@v4

- name: Authenticate to Google Cloud
uses: google-github-actions/auth@v2
with:
credentials_json: ${{ secrets.GCP_SA_KEY }}

- name: Setup Cloud SDK
uses: google-github-actions/setup-gcloud@v2

- name: Deploy backend to Cloud Run (blue)
id: deploy_backend_blue
run: |
# Deploy a new revision of the existing staging service with no traffic,
# tagged "blue", so the later traffic switch can target that tag
gcloud run deploy ${{ env.SERVICE_NAME }}-backend-staging \
--image=${{ needs.build-backend.outputs.image }} \
--region=${{ env.GCP_REGION }} \
--platform=managed \
--allow-unauthenticated \
--set-env-vars="ENVIRONMENT=staging,VERSION=${{ needs.build-backend.outputs.version }}" \
--set-secrets="DATABASE_URL=staging-database-url:latest,SECRET_KEY=staging-secret-key:latest" \
--memory=2Gi \
--cpu=2 \
--min-instances=1 \
--max-instances=10 \
--timeout=300 \
--no-traffic \
--tag=blue

BLUE_URL=$(gcloud run services describe ${{ env.SERVICE_NAME }}-backend-staging \
--region=${{ env.GCP_REGION }} \
--format=json | jq -r '.status.traffic[] | select(.tag=="blue") | .url')
echo "blue_url=${BLUE_URL}" >> $GITHUB_OUTPUT

- name: Deploy frontend to Cloud Run (blue)
id: deploy_frontend_blue
run: |
gcloud run deploy ${{ env.SERVICE_NAME }}-frontend-staging \
--image=${{ needs.build-frontend.outputs.image }} \
--region=${{ env.GCP_REGION }} \
--platform=managed \
--allow-unauthenticated \
--set-env-vars="API_URL=${{ steps.deploy_backend_blue.outputs.blue_url }},VERSION=${{ needs.build-frontend.outputs.version }}" \
--memory=512Mi \
--cpu=1 \
--min-instances=1 \
--max-instances=10 \
--timeout=60 \
--no-traffic \
--tag=blue

BLUE_URL=$(gcloud run services describe ${{ env.SERVICE_NAME }}-frontend-staging \
--region=${{ env.GCP_REGION }} \
--format=json | jq -r '.status.traffic[] | select(.tag=="blue") | .url')
echo "blue_url=${BLUE_URL}" >> $GITHUB_OUTPUT

- name: Run smoke tests on blue deployment
run: |
# Backend health check
BACKEND_HEALTH="${{ steps.deploy_backend_blue.outputs.blue_url }}/health"
for i in {1..30}; do
if curl -f "$BACKEND_HEALTH" | jq -e '.status == "healthy"'; then
echo "✓ Backend health check passed"
break
fi
if [ $i -eq 30 ]; then
echo "❌ Backend health check failed after 30 attempts"
exit 1
fi
sleep 10
done

# Frontend health check
FRONTEND_URL="${{ steps.deploy_frontend_blue.outputs.blue_url }}"
if curl -f "$FRONTEND_URL" | grep -q "<!DOCTYPE html>"; then
echo "✓ Frontend is serving content"
else
echo "❌ Frontend health check failed"
exit 1
fi

# API integration test
API_TEST="${{ steps.deploy_backend_blue.outputs.blue_url }}/api/v1/healthz"
if curl -f "$API_TEST" -H "Accept: application/json" | jq -e '.database == "connected"'; then
echo "✓ Database connection verified"
else
echo "❌ Database connection failed"
exit 1
fi

- name: Tag current revision for rollback, then switch traffic to blue
run: |
for SVC in ${{ env.SERVICE_NAME }}-backend-staging ${{ env.SERVICE_NAME }}-frontend-staging; do
# Tag the revision currently serving 100% so it can be restored with
# "--to-tags=rollback=100" if the blue deployment misbehaves
CURRENT=$(gcloud run services describe "$SVC" --region=${{ env.GCP_REGION }} \
--format=json | jq -r '[.status.traffic[] | select(.percent == 100) | .revisionName] | first // empty')
if [ -n "$CURRENT" ]; then
gcloud run services update-traffic "$SVC" --region=${{ env.GCP_REGION }} \
--update-tags=rollback="$CURRENT"
fi

# Route all traffic to the blue-tagged revision
gcloud run services update-traffic "$SVC" --region=${{ env.GCP_REGION }} \
--to-tags=blue=100
done

echo "✓ Traffic switched to blue deployment"

- name: Generate deployment record
run: |
cat > deployment-record.json <<EOF
{
"environment": "staging",
"timestamp": "$(date -u +"%Y-%m-%dT%H:%M:%SZ")",
"version": "${{ needs.build-backend.outputs.version }}",
"commit_sha": "${{ github.sha }}",
"backend_image": "${{ needs.build-backend.outputs.image }}",
"frontend_image": "${{ needs.build-frontend.outputs.image }}",
"backend_url": "${{ steps.deploy_backend_blue.outputs.blue_url }}",
"frontend_url": "${{ steps.deploy_frontend_blue.outputs.blue_url }}",
"deployed_by": "${{ github.actor }}",
"deployment_strategy": "blue-green",
"smoke_tests_passed": true
}
EOF
gsutil cp deployment-record.json gs://bio-qms-audit-logs/deployments/staging/${{ github.run_id }}.json

- name: Post deployment notification
uses: slackapi/slack-github-action@v1
with:
payload: |
{
"text": "Staging Deployment Completed",
"blocks": [
{
"type": "header",
"text": {
"type": "plain_text",
"text": "✅ Staging Deployment Successful"
}
},
{
"type": "section",
"fields": [
{
"type": "mrkdwn",
"text": "*Environment:*\nStaging"
},
{
"type": "mrkdwn",
"text": "*Version:*\n${{ needs.build-backend.outputs.version }}"
},
{
"type": "mrkdwn",
"text": "*Deployed by:*\n${{ github.actor }}"
},
{
"type": "mrkdwn",
"text": "*Commit:*\n<https://github.com/${{ github.repository }}/commit/${{ github.sha }}|${{ github.sha }}>"
}
]
},
{
"type": "section",
"text": {
"type": "mrkdwn",
"text": "*URLs:*\n• Frontend: ${{ steps.deploy_frontend_blue.outputs.blue_url }}\n• Backend: ${{ steps.deploy_backend_blue.outputs.blue_url }}"
}
}
]
}
env:
SLACK_WEBHOOK_URL: ${{ secrets.SLACK_WEBHOOK_URL }}

# ============================================
# PRODUCTION DEPLOYMENT
# ============================================

deploy-production:
name: Deploy to Production
runs-on: ubuntu-latest
timeout-minutes: 30
needs: [build-backend, build-frontend]
if: github.event.inputs.environment == 'production'
environment:
name: production
url: https://app.bio-qms.com

steps:
- name: Checkout code
uses: actions/checkout@v4

- name: Authenticate to Google Cloud
uses: google-github-actions/auth@v2
with:
credentials_json: ${{ secrets.GCP_SA_KEY }}

- name: Setup Cloud SDK
uses: google-github-actions/setup-gcloud@v2

- name: Verify approvals
run: |
echo "This deployment requires manual approval"
echo "Version: ${{ needs.build-backend.outputs.version }}"
echo "Commit: ${{ github.sha }}"

- name: Deploy backend to Cloud Run (blue)
id: deploy_backend_blue
run: |
gcloud run deploy ${{ env.SERVICE_NAME }}-backend-prod-blue \
--image=${{ needs.build-backend.outputs.image }} \
--region=${{ env.GCP_REGION }} \
--platform=managed \
--allow-unauthenticated \
--set-env-vars="ENVIRONMENT=production,VERSION=${{ needs.build-backend.outputs.version }}" \
--set-secrets="DATABASE_URL=prod-database-url:latest,SECRET_KEY=prod-secret-key:latest" \
--memory=4Gi \
--cpu=4 \
--min-instances=3 \
--max-instances=50 \
--timeout=300 \
--no-traffic \
--tag=blue

BLUE_URL=$(gcloud run services describe ${{ env.SERVICE_NAME }}-backend-prod-blue \
--region=${{ env.GCP_REGION }} \
--format='value(status.url)')
echo "blue_url=${BLUE_URL}" >> $GITHUB_OUTPUT

- name: Deploy frontend to Cloud Run (blue)
id: deploy_frontend_blue
run: |
gcloud run deploy ${{ env.SERVICE_NAME }}-frontend-prod-blue \
--image=${{ needs.build-frontend.outputs.image }} \
--region=${{ env.GCP_REGION }} \
--platform=managed \
--allow-unauthenticated \
--set-env-vars="API_URL=${{ steps.deploy_backend_blue.outputs.blue_url }},VERSION=${{ needs.build-frontend.outputs.version }}" \
--memory=1Gi \
--cpu=2 \
--min-instances=3 \
--max-instances=50 \
--timeout=60 \
--no-traffic \
--tag=blue

BLUE_URL=$(gcloud run services describe ${{ env.SERVICE_NAME }}-frontend-prod-blue \
--region=${{ env.GCP_REGION }} \
--format='value(status.url)')
echo "blue_url=${BLUE_URL}" >> $GITHUB_OUTPUT

- name: Run comprehensive smoke tests
if: ${{ !inputs.skip_tests }}
run: |
echo "Running production smoke tests..."

# Backend health check
BACKEND_HEALTH="${{ steps.deploy_backend_blue.outputs.blue_url }}/health"
for i in {1..60}; do
if curl -f "$BACKEND_HEALTH" | jq -e '.status == "healthy"'; then
echo "✓ Backend health check passed"
break
fi
if [ $i -eq 60 ]; then
echo "❌ Backend health check failed"
exit 1
fi
sleep 10
done

# Database connectivity
API_DB="${{ steps.deploy_backend_blue.outputs.blue_url }}/api/v1/healthz"
if curl -f "$API_DB" | jq -e '.database == "connected" and .redis == "connected"'; then
echo "✓ Database and Redis connected"
else
echo "❌ Infrastructure health check failed"
exit 1
fi

# Critical API endpoints
API_BASE="${{ steps.deploy_backend_blue.outputs.blue_url }}/api/v1"
curl -f "$API_BASE/auth/status" || exit 1
curl -f "$API_BASE/documents/health" || exit 1
curl -f "$API_BASE/tenants/health" || exit 1

# Frontend serving
FRONTEND="${{ steps.deploy_frontend_blue.outputs.blue_url }}"
if curl -f "$FRONTEND" | grep -q "<!DOCTYPE html>"; then
echo "✓ Frontend serving HTML"
else
echo "❌ Frontend health check failed"
exit 1
fi

# Load test (light)
echo "Running light load test..."
for i in {1..10}; do
curl -f "$BACKEND_HEALTH" &
done
wait
echo "✓ Load test passed"

- name: Gradual traffic shift (canary)
run: |
echo "Starting gradual traffic shift..."

# 10% traffic to blue
gcloud run services update-traffic ${{ env.SERVICE_NAME }}-backend-prod \
--region=${{ env.GCP_REGION }} \
--to-tags=blue=10,green=90
gcloud run services update-traffic ${{ env.SERVICE_NAME }}-frontend-prod \
--region=${{ env.GCP_REGION }} \
--to-tags=blue=10,green=90
echo "✓ 10% traffic shifted to blue"
sleep 120

# Monitor for 2 minutes, check error rates
# (In production, integrate with monitoring system)

# 50% traffic to blue
gcloud run services update-traffic ${{ env.SERVICE_NAME }}-backend-prod \
--region=${{ env.GCP_REGION }} \
--to-tags=blue=50,green=50
gcloud run services update-traffic ${{ env.SERVICE_NAME }}-frontend-prod \
--region=${{ env.GCP_REGION }} \
--to-tags=blue=50,green=50
echo "✓ 50% traffic shifted to blue"
sleep 180

# 100% traffic to blue
gcloud run services update-traffic ${{ env.SERVICE_NAME }}-backend-prod \
--region=${{ env.GCP_REGION }} \
--to-tags=blue=100
gcloud run services update-traffic ${{ env.SERVICE_NAME }}-frontend-prod \
--region=${{ env.GCP_REGION }} \
--to-tags=blue=100
echo "✓ 100% traffic shifted to blue"

- name: Tag rollback revision
run: |
gcloud run services update-traffic ${{ env.SERVICE_NAME }}-backend-prod \
--region=${{ env.GCP_REGION }} \
--update-tags=rollback=green

gcloud run services update-traffic ${{ env.SERVICE_NAME }}-frontend-prod \
--region=${{ env.GCP_REGION }} \
--update-tags=rollback=green

echo "✓ Rollback tag created"

- name: Generate deployment record
run: |
cat > deployment-record.json <<EOF
{
"environment": "production",
"timestamp": "$(date -u +"%Y-%m-%dT%H:%M:%SZ")",
"version": "${{ needs.build-backend.outputs.version }}",
"commit_sha": "${{ github.sha }}",
"backend_image": "${{ needs.build-backend.outputs.image }}",
"frontend_image": "${{ needs.build-frontend.outputs.image }}",
"backend_url": "${{ steps.deploy_backend_blue.outputs.blue_url }}",
"frontend_url": "${{ steps.deploy_frontend_blue.outputs.blue_url }}",
"deployed_by": "${{ github.actor }}",
"deployment_strategy": "blue-green-canary",
"canary_stages": ["10%", "50%", "100%"],
"smoke_tests_passed": true,
"compliance_check": "passed"
}
EOF
gsutil cp deployment-record.json gs://bio-qms-audit-logs/deployments/production/${{ github.run_id }}.json

- name: Post deployment notification
uses: slackapi/slack-github-action@v1
with:
payload: |
{
"text": "Production Deployment Completed",
"blocks": [
{
"type": "header",
"text": {
"type": "plain_text",
"text": "✅ Production Deployment Successful"
}
},
{
"type": "section",
"fields": [
{
"type": "mrkdwn",
"text": "*Environment:*\nProduction"
},
{
"type": "mrkdwn",
"text": "*Version:*\n${{ needs.build-backend.outputs.version }}"
},
{
"type": "mrkdwn",
"text": "*Deployed by:*\n${{ github.actor }}"
},
{
"type": "mrkdwn",
"text": "*Commit:*\n<https://github.com/${{ github.repository }}/commit/${{ github.sha }}|${{ github.sha }}>"
}
]
},
{
"type": "section",
"text": {
"type": "mrkdwn",
"text": "⚠️ *PRODUCTION DEPLOYMENT* - Monitor dashboards closely for 30 minutes"
}
}
]
}
env:
SLACK_WEBHOOK_URL: ${{ secrets.SLACK_WEBHOOK_PRODUCTION }}

# ============================================
# ROLLBACK WORKFLOW
# ============================================

rollback:
name: Rollback Deployment
runs-on: ubuntu-latest
if: failure()
# build-backend is listed so its version output is available to the notification below
needs: [build-backend, deploy-staging, deploy-production]

steps:
- name: Authenticate to Google Cloud
uses: google-github-actions/auth@v2
with:
credentials_json: ${{ secrets.GCP_SA_KEY }}

- name: Setup Cloud SDK
uses: google-github-actions/setup-gcloud@v2

- name: Rollback to previous revision
run: |
ENV=${{ github.event.inputs.environment || 'staging' }}
echo "Rolling back $ENV environment..."

# Rollback backend
gcloud run services update-traffic ${{ env.SERVICE_NAME }}-backend-${ENV} \
--region=${{ env.GCP_REGION }} \
--to-tags=rollback=100

# Rollback frontend
gcloud run services update-traffic ${{ env.SERVICE_NAME }}-frontend-${ENV} \
--region=${{ env.GCP_REGION }} \
--to-tags=rollback=100

echo "✓ Rollback completed"

- name: Notify rollback
uses: slackapi/slack-github-action@v1
with:
payload: |
{
"text": "⚠️ ROLLBACK EXECUTED",
"blocks": [
{
"type": "header",
"text": {
"type": "plain_text",
"text": "⚠️ Deployment Rollback"
}
},
{
"type": "section",
"text": {
"type": "mrkdwn",
"text": "Deployment failed. Rolled back to previous revision.\n\n*Environment:* ${{ github.event.inputs.environment || 'staging' }}\n*Failed Version:* ${{ needs.build-backend.outputs.version }}\n*Commit:* ${{ github.sha }}"
}
}
]
}
env:
SLACK_WEBHOOK_URL: ${{ secrets.SLACK_WEBHOOK_ALERTS }}
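The canary step above notes that error rates should be monitored between traffic shifts. A minimal sketch of the gate logic that would sit between shifts, assuming an error-rate metric is available from the monitoring system (stage list, threshold, and function name are illustrative, not part of the pipeline):

```python
# Sketch of the canary gate used between traffic-shift stages.
# All names and thresholds are illustrative, not part of the workflow above.

CANARY_STAGES = [10, 50, 100]  # percent of traffic routed to the blue revision

def next_canary_action(current_pct: int, error_rate: float,
                       max_error_rate: float = 0.01) -> str:
    """Decide what to do after observing the canary at current_pct traffic.

    Returns 'rollback' if the observed error rate exceeds the threshold,
    'done' once 100% of traffic is on blue, otherwise 'advance:<pct>'.
    """
    if error_rate > max_error_rate:
        return "rollback"
    if current_pct >= 100:
        return "done"
    # Advance to the next configured stage.
    next_pct = min(p for p in CANARY_STAGES if p > current_pct)
    return f"advance:{next_pct}"
```

A wrapper would call this after each observation window and either run the next `gcloud run services update-traffic` command or fall through to the rollback job.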

Manual Rollback Workflow

File: .github/workflows/rollback.yml

name: Manual Rollback

on:
workflow_dispatch:
inputs:
environment:
description: 'Environment to rollback'
required: true
type: choice
options:
- staging
- production
reason:
description: 'Reason for rollback'
required: true
type: string

jobs:
rollback:
name: Execute Rollback
runs-on: ubuntu-latest
environment: ${{ github.event.inputs.environment }}

steps:
- name: Verify rollback reason
run: |
echo "Rollback Reason: ${{ github.event.inputs.reason }}"
if [ -z "${{ github.event.inputs.reason }}" ]; then
echo "❌ Rollback reason is required"
exit 1
fi

- name: Authenticate to Google Cloud
uses: google-github-actions/auth@v2
with:
credentials_json: ${{ secrets.GCP_SA_KEY }}

- name: Setup Cloud SDK
uses: google-github-actions/setup-gcloud@v2

- name: Get current revision info
run: |
ENV=${{ github.event.inputs.environment }}

# Get current backend revision
BACKEND_CURRENT=$(gcloud run services describe bio-qms-backend-${ENV} \
--region=us-central1 \
--format='value(status.traffic[0].revisionName)')

# Get rollback backend revision (filter the traffic entries with jq;
# gcloud's --format expressions cannot filter list items by tag)
BACKEND_ROLLBACK=$(gcloud run services describe bio-qms-backend-${ENV} \
--region=us-central1 --format=json | \
jq -r '.status.traffic[] | select(.tag == "rollback") | .revisionName')

echo "Backend Current: $BACKEND_CURRENT"
echo "Backend Rollback: $BACKEND_ROLLBACK"

# Get current frontend revision
FRONTEND_CURRENT=$(gcloud run services describe bio-qms-frontend-${ENV} \
--region=us-central1 \
--format='value(status.traffic[0].revisionName)')

# Get rollback frontend revision
FRONTEND_ROLLBACK=$(gcloud run services describe bio-qms-frontend-${ENV} \
--region=us-central1 --format=json | \
jq -r '.status.traffic[] | select(.tag == "rollback") | .revisionName')

echo "Frontend Current: $FRONTEND_CURRENT"
echo "Frontend Rollback: $FRONTEND_ROLLBACK"

- name: Execute rollback
run: |
ENV=${{ github.event.inputs.environment }}

# Rollback backend
gcloud run services update-traffic bio-qms-backend-${ENV} \
--region=us-central1 \
--to-tags=rollback=100

# Rollback frontend
gcloud run services update-traffic bio-qms-frontend-${ENV} \
--region=us-central1 \
--to-tags=rollback=100

echo "✓ Rollback completed"

- name: Create rollback audit record
run: |
cat > rollback-record.json <<EOF
{
"event_type": "manual_rollback",
"timestamp": "$(date -u +"%Y-%m-%dT%H:%M:%SZ")",
"environment": "${{ github.event.inputs.environment }}",
"reason": "${{ github.event.inputs.reason }}",
"executed_by": "${{ github.actor }}",
"workflow_run_id": "${{ github.run_id }}"
}
EOF
gsutil cp rollback-record.json gs://bio-qms-audit-logs/rollbacks/${{ github.run_id }}.json

- name: Notify rollback
uses: slackapi/slack-github-action@v1
with:
payload: |
{
"text": "⚠️ MANUAL ROLLBACK EXECUTED",
"blocks": [
{
"type": "header",
"text": {
"type": "plain_text",
"text": "⚠️ Manual Rollback Executed"
}
},
{
"type": "section",
"fields": [
{
"type": "mrkdwn",
"text": "*Environment:*\n${{ github.event.inputs.environment }}"
},
{
"type": "mrkdwn",
"text": "*Executed by:*\n${{ github.actor }}"
},
{
"type": "mrkdwn",
"text": "*Reason:*\n${{ github.event.inputs.reason }}"
}
]
}
]
}
env:
SLACK_WEBHOOK_URL: ${{ secrets.SLACK_WEBHOOK_ALERTS }}
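Because rollback records feed the Part 11 audit trail, it is cheap insurance to validate them before upload. A sketch of such a check, using the field list from the record above (the helper name is ours):

```python
import json

# Fields written by the "Create rollback audit record" step above.
REQUIRED_FIELDS = {
    "event_type", "timestamp", "environment",
    "reason", "executed_by", "workflow_run_id",
}

def validate_rollback_record(raw: str) -> list[str]:
    """Return the list of required fields missing from a rollback record."""
    record = json.loads(raw)
    missing = sorted(REQUIRED_FIELDS - record.keys())
    # An empty reason is as bad as a missing one for audit purposes.
    if "reason" in record and not str(record["reason"]).strip():
        missing.append("reason")
    return missing
```

Run as a workflow step, a non-empty result would fail the job before `gsutil cp` writes an incomplete record.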

Environment Configuration

Note: GitHub Actions does not read environment protection rules from files in the repository. The YAML below documents the intended configuration, which is applied through the repository's Settings → Environments page or the REST API.

File: .github/environments/staging.yml

# Staging environment protection rules
environment:
name: staging
url: https://staging.bio-qms.com

# Auto-deploy on merge to main
deployment_branch_policy:
protected_branches: false
custom_branch_policies: true

# No approval required for staging
reviewers: []

# Environment-specific secrets
secrets:
- GCP_SA_KEY
- DATABASE_URL
- SECRET_KEY
- SLACK_WEBHOOK_URL

File: .github/environments/production.yml

# Production environment protection rules
environment:
name: production
url: https://app.bio-qms.com

# Require manual approval
deployment_branch_policy:
protected_branches: true
custom_branch_policies: false

# Require 2 approvals from specific teams
reviewers:
- type: team
id: engineering-leads
- type: team
id: compliance-team

wait_timer: 0 # No wait timer, but manual approval required

# Environment-specific secrets
secrets:
- GCP_SA_KEY
- DATABASE_URL
- SECRET_KEY
- SLACK_WEBHOOK_PRODUCTION
- SLACK_WEBHOOK_ALERTS
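Because these protection rules ultimately live in repository settings, they can also be applied programmatically via `PUT /repos/{owner}/{repo}/environments/{environment_name}`. A sketch of the payload builder mirroring the production rules above (team IDs are placeholders; look them up via the org teams API):

```python
# Payload for PUT /repos/{owner}/{repo}/environments/{environment_name}.
# Team IDs are placeholders, not real BIO-QMS team IDs.

def environment_protection_payload(team_ids, protected_branches=True,
                                   wait_timer=0):
    """Build the protection-rule payload mirroring production.yml above."""
    return {
        "wait_timer": wait_timer,
        "reviewers": [{"type": "Team", "id": tid} for tid in team_ids],
        "deployment_branch_policy": {
            "protected_branches": protected_branches,
            "custom_branch_policies": not protected_branches,
        },
    }
```

Posting this payload with any authenticated HTTP client (or `gh api --input`) keeps the settings reproducible alongside the documentation above.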

E.1.3: Container Image Build and Registry

Backend Dockerfile

File: backend/Dockerfile

# =================================
# Stage 1: Build Dependencies
# =================================
FROM python:3.11-slim as builder

LABEL maintainer="devops@bio-qms.com"

# Build arguments
ARG BUILD_DATE
ARG VCS_REF
ARG VERSION

# Metadata labels
LABEL org.opencontainers.image.created="${BUILD_DATE}"
LABEL org.opencontainers.image.version="${VERSION}"
LABEL org.opencontainers.image.revision="${VCS_REF}"
LABEL org.opencontainers.image.title="BIO-QMS Backend"
LABEL org.opencontainers.image.description="Django REST API for BIO-QMS Platform"
LABEL org.opencontainers.image.vendor="BIO-QMS"
LABEL org.opencontainers.image.licenses="Proprietary"

# Set environment variables
ENV PYTHONUNBUFFERED=1 \
PYTHONDONTWRITEBYTECODE=1 \
PIP_NO_CACHE_DIR=1 \
PIP_DISABLE_PIP_VERSION_CHECK=1

# Install system dependencies
RUN apt-get update && apt-get install -y --no-install-recommends \
build-essential \
libpq-dev \
libssl-dev \
libffi-dev \
&& rm -rf /var/lib/apt/lists/*

# Create virtual environment
RUN python -m venv /opt/venv
ENV PATH="/opt/venv/bin:$PATH"

# Copy requirements first (better caching)
COPY requirements.txt /tmp/requirements.txt

# Install Python dependencies
RUN pip install --upgrade pip setuptools wheel && \
pip install --no-cache-dir -r /tmp/requirements.txt

# =================================
# Stage 2: Runtime
# =================================
FROM python:3.11-slim

# Copy build arguments for runtime labels
ARG BUILD_DATE
ARG VCS_REF
ARG VERSION

# Set environment variables
ENV PYTHONUNBUFFERED=1 \
PYTHONDONTWRITEBYTECODE=1 \
PATH="/opt/venv/bin:$PATH" \
DJANGO_SETTINGS_MODULE=bio_qms.settings.production

# Install runtime system dependencies
RUN apt-get update && apt-get install -y --no-install-recommends \
libpq5 \
curl \
&& rm -rf /var/lib/apt/lists/*

# Create non-root user
RUN groupadd -r django && \
useradd -r -g django -u 1000 -d /app -s /bin/bash django

# Create app directory
WORKDIR /app

# Copy virtual environment from builder
COPY --from=builder /opt/venv /opt/venv

# Copy application code
COPY --chown=django:django . /app/

# Create required directories
RUN mkdir -p /app/logs /app/static /app/media && \
chown -R django:django /app

# Switch to non-root user
USER django

# Collect static files. Placeholder values satisfy settings that require
# these env vars at import time; collectstatic itself touches neither.
RUN SECRET_KEY=build-placeholder \
DATABASE_URL=postgres://user:pass@localhost:5432/placeholder \
python manage.py collectstatic --noinput --clear

# Health check
HEALTHCHECK --interval=30s --timeout=10s --start-period=40s --retries=3 \
CMD curl -f http://localhost:8000/health || exit 1

# Expose port
EXPOSE 8000

# Run gunicorn
CMD ["gunicorn", \
"--bind", "0.0.0.0:8000", \
"--workers", "4", \
"--worker-class", "gevent", \
"--worker-connections", "1000", \
"--timeout", "120", \
"--keep-alive", "5", \
"--max-requests", "1000", \
"--max-requests-jitter", "100", \
"--access-logfile", "-", \
"--error-logfile", "-", \
"--log-level", "info", \
"bio_qms.wsgi:application"]

Frontend Dockerfile

File: frontend/Dockerfile

# =================================
# Stage 1: Build
# =================================
FROM node:20-alpine as builder

LABEL maintainer="devops@bio-qms.com"

# Build arguments
ARG BUILD_DATE
ARG VCS_REF
ARG VERSION
ARG API_URL

# Metadata labels
LABEL org.opencontainers.image.created="${BUILD_DATE}"
LABEL org.opencontainers.image.version="${VERSION}"
LABEL org.opencontainers.image.revision="${VCS_REF}"
LABEL org.opencontainers.image.title="BIO-QMS Frontend"
LABEL org.opencontainers.image.description="React SPA for BIO-QMS Platform"
LABEL org.opencontainers.image.vendor="BIO-QMS"
LABEL org.opencontainers.image.licenses="Proprietary"

# Set working directory
WORKDIR /app

# Copy package files
COPY package*.json ./

# Install dependencies
RUN npm ci --prefer-offline --no-audit --legacy-peer-deps

# Copy source code
COPY . .

# Build production bundle
ENV NODE_ENV=production
ENV VITE_API_URL=${API_URL}
RUN npm run build

# List build output
RUN ls -lah /app/dist

# =================================
# Stage 2: Runtime (Nginx)
# =================================
FROM nginx:1.25-alpine

# Copy build arguments
ARG BUILD_DATE
ARG VCS_REF
ARG VERSION

# Install curl for healthcheck
RUN apk add --no-cache curl

# Copy built files from builder
COPY --from=builder /app/dist /usr/share/nginx/html

# Copy nginx configuration
COPY nginx.conf /etc/nginx/nginx.conf
COPY default.conf /etc/nginx/conf.d/default.conf

# Create nginx cache directory
RUN mkdir -p /var/cache/nginx/client_temp && \
chown -R nginx:nginx /var/cache/nginx && \
chown -R nginx:nginx /usr/share/nginx/html

# Health check
HEALTHCHECK --interval=30s --timeout=3s --start-period=10s --retries=3 \
CMD curl -f http://localhost:80/health || exit 1

# Expose port
EXPOSE 80

# Note: the master process stays root so nginx can bind port 80 and write
# /var/run/nginx.pid; workers drop privileges via `user nginx;` in nginx.conf.
# A `USER nginx` directive here would break startup. To run fully unprivileged,
# use nginxinc/nginx-unprivileged and listen on a port above 1024.

# Start nginx
CMD ["nginx", "-g", "daemon off;"]

Nginx Configuration for Frontend

File: frontend/nginx.conf

user  nginx;
worker_processes auto;
error_log /var/log/nginx/error.log warn;
pid /var/run/nginx.pid;

events {
worker_connections 1024;
use epoll;
}

http {
include /etc/nginx/mime.types;
default_type application/octet-stream;

log_format main '$remote_addr - $remote_user [$time_local] "$request" '
'$status $body_bytes_sent "$http_referer" '
'"$http_user_agent" "$http_x_forwarded_for"';

access_log /var/log/nginx/access.log main;

sendfile on;
tcp_nopush on;
tcp_nodelay on;
keepalive_timeout 65;
types_hash_max_size 2048;
client_max_body_size 20M;

# Gzip compression
gzip on;
gzip_vary on;
gzip_proxied any;
gzip_comp_level 6;
gzip_types text/plain text/css text/xml text/javascript
application/json application/javascript application/xml+rss
application/rss+xml font/truetype font/opentype
application/vnd.ms-fontobject image/svg+xml;

# Security headers
add_header X-Frame-Options "SAMEORIGIN" always;
add_header X-Content-Type-Options "nosniff" always;
add_header X-XSS-Protection "1; mode=block" always;
add_header Referrer-Policy "strict-origin-when-cross-origin" always;

include /etc/nginx/conf.d/*.conf;
}

File: frontend/default.conf

server {
listen 80;
server_name _;
root /usr/share/nginx/html;
index index.html;

# Health check endpoint
location /health {
access_log off;
return 200 "healthy\n";
add_header Content-Type text/plain;
}

# Static assets with long cache
location ~* \.(jpg|jpeg|png|gif|ico|css|js|svg|woff|woff2|ttf|eot)$ {
expires 1y;
add_header Cache-Control "public, immutable";
}

# index.html with no-cache
location = /index.html {
expires -1;
add_header Cache-Control "no-store, no-cache, must-revalidate, proxy-revalidate, max-age=0";
}

# SPA fallback
location / {
try_files $uri $uri/ /index.html;
}

# Security: deny access to hidden files
location ~ /\. {
deny all;
access_log off;
log_not_found off;
}
}
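The cache rules above are easy to regress: hashed assets must be immutable, while `index.html` must never be cached (SPA fallback responses also get the no-cache headers, because `try_files` internally redirects to `/index.html`, which re-matches the exact-match location). A small helper mirroring those rules can back an integration test (a sketch; the names are ours):

```python
import re

# Mirrors the location blocks in default.conf above.
STATIC_ASSET_RE = re.compile(
    r"\.(jpg|jpeg|png|gif|ico|css|js|svg|woff|woff2|ttf|eot)$", re.IGNORECASE
)

NO_CACHE = "no-store, no-cache, must-revalidate, proxy-revalidate, max-age=0"

def expected_cache_control(path: str) -> str:
    """Expected Cache-Control for a request path, per the nginx rules above."""
    if path == "/index.html":
        return NO_CACHE
    if STATIC_ASSET_RE.search(path):
        return "public, immutable"
    # SPA fallback serves index.html, which must not be cached either.
    return NO_CACHE
```

An integration test would fetch each path class from the deployed frontend and compare the response header against this expectation.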

Docker Build Script

File: scripts/docker-build.sh

#!/bin/bash
set -euo pipefail

# Color output
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
NC='\033[0m' # No Color

# Configuration
GCP_PROJECT="bio-qms-prod"
REGISTRY="us-central1-docker.pkg.dev"
REPO="${REGISTRY}/${GCP_PROJECT}/bio-qms"
# Assumes Docker is already authenticated to Artifact Registry:
#   gcloud auth configure-docker us-central1-docker.pkg.dev

# Generate version
VERSION=$(git describe --tags --always --dirty)
TIMESTAMP=$(date -u +"%Y%m%d%H%M%S")
FULL_VERSION="${VERSION}-${TIMESTAMP}"
COMMIT_SHA=$(git rev-parse --short HEAD)
BUILD_DATE=$(date -u +"%Y-%m-%dT%H:%M:%SZ")

echo -e "${GREEN}Building BIO-QMS Docker Images${NC}"
echo "Version: ${FULL_VERSION}"
echo "Commit: ${COMMIT_SHA}"
echo "Build Date: ${BUILD_DATE}"
echo ""

# Build backend
echo -e "${YELLOW}Building backend image...${NC}"
docker build \
--file backend/Dockerfile \
--tag "${REPO}/backend:${FULL_VERSION}" \
--tag "${REPO}/backend:${COMMIT_SHA}" \
--tag "${REPO}/backend:latest" \
--build-arg BUILD_DATE="${BUILD_DATE}" \
--build-arg VCS_REF="${COMMIT_SHA}" \
--build-arg VERSION="${FULL_VERSION}" \
--cache-from "${REPO}/backend:latest" \
--progress=plain \
backend/

echo -e "${GREEN}✓ Backend image built${NC}"
echo ""

# Build frontend
echo -e "${YELLOW}Building frontend image...${NC}"
docker build \
--file frontend/Dockerfile \
--tag "${REPO}/frontend:${FULL_VERSION}" \
--tag "${REPO}/frontend:${COMMIT_SHA}" \
--tag "${REPO}/frontend:latest" \
--build-arg BUILD_DATE="${BUILD_DATE}" \
--build-arg VCS_REF="${COMMIT_SHA}" \
--build-arg VERSION="${FULL_VERSION}" \
--build-arg API_URL="https://api.bio-qms.com" \
--cache-from "${REPO}/frontend:latest" \
--progress=plain \
frontend/

echo -e "${GREEN}✓ Frontend image built${NC}"
echo ""

# Scan images
echo -e "${YELLOW}Scanning images for vulnerabilities...${NC}"
# --exit-code 1 plus `set -e` above aborts the build on HIGH/CRITICAL findings
trivy image --exit-code 1 --severity HIGH,CRITICAL "${REPO}/backend:${FULL_VERSION}"
trivy image --exit-code 1 --severity HIGH,CRITICAL "${REPO}/frontend:${FULL_VERSION}"

echo -e "${GREEN}✓ Security scan completed${NC}"
echo ""

# Push images
echo -e "${YELLOW}Pushing images to Artifact Registry...${NC}"
docker push "${REPO}/backend:${FULL_VERSION}"
docker push "${REPO}/backend:${COMMIT_SHA}"
docker push "${REPO}/backend:latest"

docker push "${REPO}/frontend:${FULL_VERSION}"
docker push "${REPO}/frontend:${COMMIT_SHA}"
docker push "${REPO}/frontend:latest"

echo -e "${GREEN}✓ Images pushed successfully${NC}"
echo ""

# Output image references
echo -e "${GREEN}Image References:${NC}"
echo "Backend: ${REPO}/backend:${FULL_VERSION}"
echo "Frontend: ${REPO}/frontend:${FULL_VERSION}"
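Downstream tooling that consumes these tags can validate the `<git describe>-<UTC timestamp>` shape the script produces before trusting it. A sketch (the regex encodes our assumption about `git describe` output; adjust if tags contain trailing 14-digit runs):

```python
import re

# <git describe output>-<14-digit UTC timestamp>,
# e.g. "v1.4.2-3-gabc1234-20260216093000"
FULL_VERSION_RE = re.compile(r"^(?P<describe>.+)-(?P<timestamp>\d{14})$")

def parse_full_version(tag: str) -> dict:
    """Split a FULL_VERSION tag into its git-describe and timestamp parts."""
    m = FULL_VERSION_RE.match(tag)
    if not m:
        raise ValueError(f"not a FULL_VERSION tag: {tag!r}")
    return m.groupdict()
```
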

Artifact Registry Setup

File: terraform/artifact-registry.tf

# Google Artifact Registry for BIO-QMS

resource "google_artifact_registry_repository" "bio_qms" {
project = var.gcp_project_id
location = var.gcp_region
repository_id = "bio-qms"
description = "Docker images for BIO-QMS platform"
format = "DOCKER"

labels = {
environment = "production"
project = "bio-qms"
compliance = "hipaa-soc2"
}

# Cleanup policy
cleanup_policy_dry_run = false

cleanup_policies {
id = "delete-old-images"
action = "DELETE"

condition {
tag_state = "UNTAGGED"
older_than = "604800s" # 7 days
}
}

cleanup_policies {
id = "keep-recent-tagged"
action = "KEEP"

most_recent_versions {
keep_count = 10
}
}
}

# IAM bindings for CI/CD
resource "google_artifact_registry_repository_iam_member" "ci_writer" {
project = google_artifact_registry_repository.bio_qms.project
location = google_artifact_registry_repository.bio_qms.location
repository = google_artifact_registry_repository.bio_qms.name
role = "roles/artifactregistry.writer"
member = "serviceAccount:${var.ci_service_account}"
}

# Enable vulnerability scanning
resource "google_artifact_registry_repository_iam_member" "scanner" {
project = google_artifact_registry_repository.bio_qms.project
location = google_artifact_registry_repository.bio_qms.location
repository = google_artifact_registry_repository.bio_qms.name
role = "roles/artifactregistry.reader"
member = "serviceAccount:service-${data.google_project.project.number}@compute-system.iam.gserviceaccount.com"
}

# Output
output "artifact_registry_url" {
value = "${google_artifact_registry_repository.bio_qms.location}-docker.pkg.dev/${google_artifact_registry_repository.bio_qms.project}/${google_artifact_registry_repository.bio_qms.repository_id}"
}

Image Signing with Cosign

File: .github/workflows/image-signing.yml

name: Sign Container Images

on:
workflow_call:
inputs:
backend_image:
required: true
type: string
frontend_image:
required: true
type: string

jobs:
sign-images:
name: Sign Images with Cosign
runs-on: ubuntu-latest
# Keyless Cosign signing exchanges the job's OIDC token for a certificate
permissions:
id-token: write
contents: read

steps:
- name: Install Cosign
uses: sigstore/cosign-installer@v3

- name: Authenticate to Google Cloud
uses: google-github-actions/auth@v2
with:
credentials_json: ${{ secrets.GCP_SA_KEY }}

- name: Configure Docker
run: |
gcloud auth configure-docker us-central1-docker.pkg.dev

- name: Sign backend image
env:
COSIGN_EXPERIMENTAL: 1
run: |
cosign sign --yes ${{ inputs.backend_image }}
echo "✓ Backend image signed"

- name: Sign frontend image
env:
COSIGN_EXPERIMENTAL: 1
run: |
cosign sign --yes ${{ inputs.frontend_image }}
echo "✓ Frontend image signed"

- name: Verify signatures
env:
COSIGN_EXPERIMENTAL: 1
run: |
# Keyless verification in Cosign 2.x requires pinning the expected identity
cosign verify \
--certificate-oidc-issuer "https://token.actions.githubusercontent.com" \
--certificate-identity-regexp "^https://github.com/${{ github.repository }}/" \
${{ inputs.backend_image }}
cosign verify \
--certificate-oidc-issuer "https://token.actions.githubusercontent.com" \
--certificate-identity-regexp "^https://github.com/${{ github.repository }}/" \
${{ inputs.frontend_image }}
echo "✓ Signatures verified"

- name: Generate and attach SBOMs
run: |
# Syft generates SPDX SBOMs; `cosign attach sbom` requires a file via --sbom
curl -sSfL https://raw.githubusercontent.com/anchore/syft/main/install.sh | sudo sh -s -- -b /usr/local/bin
syft ${{ inputs.backend_image }} -o spdx-json > backend-sbom.spdx.json
cosign attach sbom --sbom backend-sbom.spdx.json ${{ inputs.backend_image }}
syft ${{ inputs.frontend_image }} -o spdx-json > frontend-sbom.spdx.json
cosign attach sbom --sbom frontend-sbom.spdx.json ${{ inputs.frontend_image }}
echo "✓ SBOMs generated and attached"
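Cosign binds signatures and SBOMs to the image digest, so deploy steps should pin images by digest rather than by mutable tag. A helper that splits a reference into its parts (a sketch; the function is ours):

```python
def parse_image_ref(ref: str) -> dict:
    """Split a container image reference into repository, tag, and digest.

    Handles refs like:
      repo/image:tag
      repo/image@sha256:abc...
      registry:5000/image        (port in the registry host, no tag)
    """
    digest = None
    if "@" in ref:
        ref, digest = ref.split("@", 1)
    # A ':' only denotes a tag if it appears after the last '/';
    # otherwise it is a registry port.
    tag = None
    slash = ref.rfind("/")
    colon = ref.rfind(":")
    if colon > slash:
        ref, tag = ref[:colon], ref[colon + 1:]
    return {"repository": ref, "tag": tag, "digest": digest}
```

A deploy gate could require `digest` to be non-None (and verified) before passing the reference to `gcloud run deploy --image`.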

E.1.4: Database Migration CI Step

Django Migration CI Workflow

File: .github/workflows/migrations.yml

name: Database Migrations CI

on:
pull_request:
paths:
- 'backend/**/migrations/**'
- 'backend/**/models.py'
push:
branches: [main]
paths:
- 'backend/**/migrations/**'

env:
POSTGRES_VERSION: '15'

jobs:
# ============================================
# MIGRATION VALIDATION
# ============================================

validate-migrations:
name: Validate Migrations
runs-on: ubuntu-latest
timeout-minutes: 15

services:
postgres:
image: postgres:15-alpine
env:
POSTGRES_DB: bio_qms_test
POSTGRES_USER: postgres
POSTGRES_PASSWORD: postgres
options: >-
--health-cmd pg_isready
--health-interval 10s
--health-timeout 5s
--health-retries 5
ports:
- 5432:5432

steps:
- name: Checkout code
uses: actions/checkout@v4
with:
fetch-depth: 0

- name: Setup Python
uses: actions/setup-python@v5
with:
python-version: '3.11'
cache: 'pip'

- name: Install dependencies
working-directory: backend
run: |
pip install --upgrade pip
pip install -r requirements.txt

- name: Check for missing migrations
working-directory: backend
run: |
python manage.py makemigrations --check --dry-run --no-input
env:
DATABASE_URL: postgresql://postgres:postgres@localhost:5432/bio_qms_test
SECRET_KEY: test-secret-key

- name: Validate migration files
working-directory: backend
run: |
# Check for proper migration naming
for migration in $(find . -path "*/migrations/*.py" -not -name "__init__.py"); do
if [[ ! $migration =~ [0-9]{4}_.+\.py$ ]]; then
echo "❌ Invalid migration filename: $migration"
exit 1
fi
done
echo "✓ All migration files have valid names"

- name: Check for conflicts
working-directory: backend
run: |
# Detect migration conflicts
python manage.py makemigrations --check --dry-run
env:
DATABASE_URL: postgresql://postgres:postgres@localhost:5432/bio_qms_test
SECRET_KEY: test-secret-key

- name: Run migrations (forward)
working-directory: backend
run: |
python manage.py migrate --no-input
echo "✓ Forward migrations applied"
env:
DATABASE_URL: postgresql://postgres:postgres@localhost:5432/bio_qms_test
SECRET_KEY: test-secret-key

- name: Generate SQL for new migrations
if: github.event_name == 'pull_request'
working-directory: backend
run: |
# Get list of new migrations in this PR
git diff --name-only origin/${{ github.base_ref }}...HEAD | \
grep "migrations/.*\.py$" | \
grep -v "__init__.py" > new_migrations.txt || true

if [ -s new_migrations.txt ]; then
echo "New migrations found:"
cat new_migrations.txt

# Generate SQL for each migration
mkdir -p migration_sql
while IFS= read -r migration; do
# git diff paths are repo-relative (backend/<app>/migrations/<file>.py),
# so the app label is the second path component
app=$(echo "$migration" | cut -d'/' -f2)
migration_name=$(basename "$migration" .py)
echo "Generating SQL for $app.$migration_name"
python manage.py sqlmigrate "$app" "$migration_name" > "migration_sql/${app}_${migration_name}.sql" || true
done < new_migrations.txt
fi
env:
DATABASE_URL: postgresql://postgres:postgres@localhost:5432/bio_qms_test
SECRET_KEY: test-secret-key

- name: Upload SQL files
if: github.event_name == 'pull_request'
uses: actions/upload-artifact@v4
with:
name: migration-sql
path: backend/migration_sql/
retention-days: 30

# ============================================
# BACKWARD COMPATIBILITY TEST
# ============================================

test-backward-compatibility:
name: Test Backward Compatibility
runs-on: ubuntu-latest
timeout-minutes: 20

services:
postgres:
image: postgres:15-alpine
env:
POSTGRES_DB: bio_qms_test
POSTGRES_USER: postgres
POSTGRES_PASSWORD: postgres
options: >-
--health-cmd pg_isready
--health-interval 10s
--health-timeout 5s
--health-retries 5
ports:
- 5432:5432

steps:
- name: Checkout code
uses: actions/checkout@v4
with:
fetch-depth: 0

- name: Setup Python
uses: actions/setup-python@v5
with:
python-version: '3.11'
cache: 'pip'

- name: Install dependencies
working-directory: backend
run: |
pip install --upgrade pip
pip install -r requirements.txt

- name: Apply migrations up to previous version
working-directory: backend
run: |
# Checkout previous commit
git checkout origin/${{ github.base_ref }}
python manage.py migrate --no-input
echo "✓ Previous migrations applied"
env:
DATABASE_URL: postgresql://postgres:postgres@localhost:5432/bio_qms_test
SECRET_KEY: test-secret-key

- name: Run tests on previous version
working-directory: backend
run: |
pytest tests/ --maxfail=1
echo "✓ Previous version tests passed"
env:
DATABASE_URL: postgresql://postgres:postgres@localhost:5432/bio_qms_test
SECRET_KEY: test-secret-key
TESTING: 'True'

- name: Apply new migrations
working-directory: backend
run: |
# Return to current branch
git checkout ${{ github.sha }}
python manage.py migrate --no-input
echo "✓ New migrations applied"
env:
DATABASE_URL: postgresql://postgres:postgres@localhost:5432/bio_qms_test
SECRET_KEY: test-secret-key

- name: Run tests on new version
working-directory: backend
run: |
pytest tests/ --maxfail=1
echo "✓ New version tests passed"
env:
DATABASE_URL: postgresql://postgres:postgres@localhost:5432/bio_qms_test
SECRET_KEY: test-secret-key
TESTING: 'True'

- name: Test rollback
working-directory: backend
run: |
# Prove the newest migration is reversible by stepping its app back one migration
LAST_APP=$(python -c "
import os
import django
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'bio_qms.settings')
django.setup()
from django.db.migrations.recorder import MigrationRecorder
migration = MigrationRecorder.Migration.objects.order_by('-id').first()
print(migration.app)
" || echo "")

if [ -n "$LAST_APP" ]; then
APPLIED=$(python manage.py showmigrations "$LAST_APP" --list | grep -c '\[X\]')
if [ "$APPLIED" -gt 1 ]; then
# The second-to-last applied migration in that app is the rollback target
PREV=$(python manage.py showmigrations "$LAST_APP" --list | grep '\[X\]' | tail -2 | head -1 | awk '{print $2}')
python manage.py migrate "$LAST_APP" "$PREV" --no-input
echo "✓ Rollback to $LAST_APP.$PREV successful"
else
echo "Only one migration applied in $LAST_APP; skipping rollback test"
fi
fi
env:
DATABASE_URL: postgresql://postgres:postgres@localhost:5432/bio_qms_test
SECRET_KEY: test-secret-key

# ============================================
# SCHEMA DIFF CHECK
# ============================================

schema-diff:
name: Schema Diff vs Production
runs-on: ubuntu-latest
if: github.event_name == 'pull_request'
timeout-minutes: 15

services:
postgres_old:
image: postgres:15-alpine
env:
POSTGRES_DB: bio_qms_old
POSTGRES_USER: postgres
POSTGRES_PASSWORD: postgres
options: >-
--health-cmd pg_isready
--health-interval 10s
--health-timeout 5s
--health-retries 5
ports:
- 5433:5432

postgres_new:
image: postgres:15-alpine
env:
POSTGRES_DB: bio_qms_new
POSTGRES_USER: postgres
POSTGRES_PASSWORD: postgres
options: >-
--health-cmd pg_isready
--health-interval 10s
--health-timeout 5s
--health-retries 5
ports:
- 5434:5432

steps:
- name: Checkout code
uses: actions/checkout@v4
with:
fetch-depth: 0

- name: Setup Python
uses: actions/setup-python@v5
with:
python-version: '3.11'
cache: 'pip'

- name: Install dependencies
run: |
pip install --upgrade pip
pip install -r backend/requirements.txt
pip install migra

- name: Apply old schema
working-directory: backend
run: |
git checkout origin/${{ github.base_ref }}
# DATABASE_URL points the default database alias at the "old" instance
# on port 5433, so no extra --database alias is required in settings
python manage.py migrate --no-input
env:
DATABASE_URL: postgresql://postgres:postgres@localhost:5433/bio_qms_old
SECRET_KEY: test-secret-key

- name: Apply new schema
working-directory: backend
run: |
git checkout ${{ github.sha }}
# Same pattern: the "new" instance is selected via DATABASE_URL (port 5434)
python manage.py migrate --no-input
env:
DATABASE_URL: postgresql://postgres:postgres@localhost:5434/bio_qms_new
SECRET_KEY: test-secret-key

- name: Generate schema diff
run: |
migra \
postgresql://postgres:postgres@localhost:5433/bio_qms_old \
postgresql://postgres:postgres@localhost:5434/bio_qms_new \
--unsafe > schema_diff.sql || true

if [ -s schema_diff.sql ]; then
echo "Schema changes detected:"
cat schema_diff.sql
else
echo "No schema changes detected"
fi

- name: Upload schema diff
uses: actions/upload-artifact@v4
with:
name: schema-diff
path: schema_diff.sql
retention-days: 30

- name: Comment on PR with schema diff
uses: actions/github-script@v7
with:
github-token: ${{ secrets.GITHUB_TOKEN }}
script: |
const fs = require('fs');
let diff = '';
try {
diff = fs.readFileSync('schema_diff.sql', 'utf8');
} catch (e) {
diff = 'No schema changes detected';
}

const body = `## Database Schema Changes

\`\`\`sql
${diff}
\`\`\`

**Review Notes:**
- Verify all changes are intentional
- Check for potential breaking changes
- Ensure backward compatibility
- Validate indexes and constraints
`;

github.rest.issues.createComment({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: context.issue.number,
body: body
});

# ============================================
# DATA MIGRATION TEST
# ============================================

test-data-migration:
name: Test Data Migration
runs-on: ubuntu-latest
timeout-minutes: 30

services:
postgres:
image: postgres:15-alpine
env:
POSTGRES_DB: bio_qms_test
POSTGRES_USER: postgres
POSTGRES_PASSWORD: postgres
options: >-
--health-cmd pg_isready
--health-interval 10s
--health-timeout 5s
--health-retries 5
ports:
- 5432:5432

steps:
- name: Checkout code
uses: actions/checkout@v4
with:
# Full history is required so origin/${{ github.base_ref }} can be
# checked out in the "Apply previous migrations" step below
fetch-depth: 0

- name: Setup Python
uses: actions/setup-python@v5
with:
python-version: '3.11'
cache: 'pip'

- name: Install dependencies
working-directory: backend
run: |
pip install --upgrade pip
pip install -r requirements.txt
pip install faker

- name: Apply previous migrations
working-directory: backend
run: |
git checkout origin/${{ github.base_ref }}
python manage.py migrate --no-input
env:
DATABASE_URL: postgresql://postgres:postgres@localhost:5432/bio_qms_test
SECRET_KEY: test-secret-key

- name: Generate test data
working-directory: backend
run: |
python manage.py shell <<EOF
from faker import Faker
from django.contrib.auth import get_user_model
from apps.documents.models import Document
from apps.tenants.models import Tenant

fake = Faker()
User = get_user_model()

# Create test tenant
tenant = Tenant.objects.create(
name=fake.company(),
slug=fake.slug(),
is_active=True
)

# Create test users
for i in range(10):
User.objects.create_user(
username=fake.user_name(),
email=fake.email(),
first_name=fake.first_name(),
last_name=fake.last_name(),
tenant=tenant
)

# Create test documents
for i in range(100):
Document.objects.create(
title=fake.sentence(),
content=fake.text(),
tenant=tenant
)

print(f'Created {User.objects.count()} users')
print(f'Created {Document.objects.count()} documents')
EOF
env:
DATABASE_URL: postgresql://postgres:postgres@localhost:5432/bio_qms_test
SECRET_KEY: test-secret-key

- name: Apply new migrations
working-directory: backend
run: |
git checkout ${{ github.sha }}
python manage.py migrate --no-input
echo "✓ Migrations applied"
env:
DATABASE_URL: postgresql://postgres:postgres@localhost:5432/bio_qms_test
SECRET_KEY: test-secret-key

- name: Verify data integrity
working-directory: backend
run: |
python manage.py shell <<EOF
from django.contrib.auth import get_user_model
from apps.documents.models import Document
from apps.tenants.models import Tenant

User = get_user_model()

user_count = User.objects.count()
doc_count = Document.objects.count()
tenant_count = Tenant.objects.count()

print(f'Users: {user_count}')
print(f'Documents: {doc_count}')
print(f'Tenants: {tenant_count}')

if user_count < 10:
raise Exception(f'Expected at least 10 users, got {user_count}')

if doc_count < 100:
raise Exception(f'Expected at least 100 documents, got {doc_count}')

print('✓ Data integrity verified')
EOF
env:
DATABASE_URL: postgresql://postgres:postgres@localhost:5432/bio_qms_test
SECRET_KEY: test-secret-key

# ============================================
# MIGRATION PERFORMANCE TEST
# ============================================

test-migration-performance:
name: Test Migration Performance
runs-on: ubuntu-latest
timeout-minutes: 45

services:
postgres:
image: postgres:15-alpine
env:
POSTGRES_DB: bio_qms_test
POSTGRES_USER: postgres
POSTGRES_PASSWORD: postgres
options: >-
--health-cmd pg_isready
--health-interval 10s
--health-timeout 5s
--health-retries 5
ports:
- 5432:5432

steps:
- name: Checkout code
uses: actions/checkout@v4
with:
fetch-depth: 0

- name: Setup Python
uses: actions/setup-python@v5
with:
python-version: '3.11'
cache: 'pip'

- name: Install dependencies
working-directory: backend
run: |
pip install --upgrade pip
pip install -r requirements.txt
pip install faker

- name: Generate large dataset
working-directory: backend
run: |
# Seed against the previous version's schema so the benchmark below
# measures only the migrations introduced by this change
git checkout origin/${{ github.base_ref }}
python manage.py migrate --no-input
python manage.py shell <<EOF
import time
from faker import Faker
from django.contrib.auth import get_user_model
from apps.documents.models import Document
from apps.tenants.models import Tenant

fake = Faker()
User = get_user_model()

# Create tenant
tenant = Tenant.objects.create(
name='Test Corp',
slug='test-corp',
is_active=True
)

# Bulk create users
users = []
for i in range(1000):
users.append(User(
username=f'user{i}',
email=f'user{i}@example.com',
tenant=tenant
))
User.objects.bulk_create(users, batch_size=100)

# Bulk create documents
docs = []
for i in range(10000):
docs.append(Document(
title=f'Document {i}',
content=fake.text(),
tenant=tenant
))
Document.objects.bulk_create(docs, batch_size=1000)

print(f'Created {User.objects.count()} users')
print(f'Created {Document.objects.count()} documents')
EOF
env:
DATABASE_URL: postgresql://postgres:postgres@localhost:5432/bio_qms_test
SECRET_KEY: test-secret-key

- name: Benchmark migrations
working-directory: backend
run: |
git checkout ${{ github.sha }}
START_TIME=$(date +%s)
python manage.py migrate --no-input
END_TIME=$(date +%s)
DURATION=$((END_TIME - START_TIME))

echo "Migration duration: ${DURATION} seconds"

if [ $DURATION -gt 300 ]; then
echo "⚠️ Warning: Migrations took longer than 5 minutes"
echo "Consider optimizing migrations or scheduling downtime"
else
echo "✓ Migration performance acceptable"
fi

echo "duration=${DURATION}" >> $GITHUB_OUTPUT
id: benchmark
env:
DATABASE_URL: postgresql://postgres:postgres@localhost:5432/bio_qms_test
SECRET_KEY: test-secret-key

- name: Comment performance results
if: github.event_name == 'pull_request'
uses: actions/github-script@v7
with:
github-token: ${{ secrets.GITHUB_TOKEN }}
script: |
const duration = ${{ steps.benchmark.outputs.duration }};
const emoji = duration > 300 ? '⚠️' : '✅';

const body = `## Migration Performance

${emoji} **Duration:** ${duration} seconds

**Dataset:**
- 1,000 users
- 10,000 documents

${duration > 300 ? '⚠️ **Warning:** Migration took longer than 5 minutes on test dataset. Consider scheduling maintenance window for production deployment.' : '✅ Performance is acceptable for production deployment.'}
`;

github.rest.issues.createComment({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: context.issue.number,
body: body
});
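The rollback step above splits a dotted `app_label.migration_name` value with the shell parameter expansions `${VAR%.*}` and `${VAR##*.}`. For local tooling, the same split can be sketched in Python (the label value is purely illustrative):

```python
def split_migration_label(label: str) -> tuple[str, str]:
    """Split 'app_label.migration_name' at the last dot,
    mirroring ${VAR%.*} and ${VAR##*.} in the workflow."""
    app_label, _, migration_name = label.rpartition('.')
    return app_label, migration_name

print(split_migration_label('documents.0012_add_signature_field'))
# → ('documents', '0012_add_signature_field')
```

Splitting at the *last* dot keeps the behavior correct even if an app label were ever dotted, matching the greedy `%.*` expansion.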

Pre-Deployment Migration Script

File: backend/scripts/pre_deploy_migration.py

#!/usr/bin/env python
"""
Pre-deployment migration validation script.

Validates migrations before deployment to production.
"""

import os
import subprocess
import sys

import django
from django.db import connection
from django.db.migrations.executor import MigrationExecutor


def check_migrations():
"""Check for unapplied migrations."""
executor = MigrationExecutor(connection)
targets = executor.loader.graph.leaf_nodes()
plan = executor.migration_plan(targets)

if plan:
print("⚠️ Unapplied migrations detected:")
for migration, _ in plan:
print(f" - {migration}")
return False
else:
print("✓ All migrations applied")
return True


def check_conflicts():
"""Check for migration conflicts."""
try:
result = subprocess.run(
['python', 'manage.py', 'makemigrations', '--check', '--dry-run'],
capture_output=True,
text=True,
check=True
)
print("✓ No migration conflicts")
return True
except subprocess.CalledProcessError as e:
print("❌ Migration conflicts detected:")
print(e.stderr)
return False


def validate_schema():
"""Validate database schema."""
try:
subprocess.run(
['python', 'manage.py', 'check', '--database', 'default'],
check=True
)
print("✓ Schema validation passed")
return True
except subprocess.CalledProcessError:
print("❌ Schema validation failed")
return False


def generate_sql():
"""Generate SQL for pending migrations."""
executor = MigrationExecutor(connection)
targets = executor.loader.graph.leaf_nodes()
plan = executor.migration_plan(targets)

if not plan:
return True

print("\nGenerated SQL for pending migrations:")
for migration, _ in plan:
app_label, migration_name = migration.app_label, migration.name
try:
result = subprocess.run(
['python', 'manage.py', 'sqlmigrate', app_label, migration_name],
capture_output=True,
text=True,
check=True
)
print(f"\n-- {migration} --")
print(result.stdout)
except subprocess.CalledProcessError as e:
print(f"❌ Failed to generate SQL for {migration}")
print(e.stderr)
return False

return True


def main():
"""Main validation flow."""
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'bio_qms.settings.production')
django.setup()

print("Starting pre-deployment migration validation...\n")

checks = [
("Migration conflicts", check_conflicts),
("Schema validation", validate_schema),
("Migration status", check_migrations),
]

failed = []
for check_name, check_func in checks:
print(f"\nRunning: {check_name}")
if not check_func():
failed.append(check_name)

if not failed:
print("\n" + "="*50)
print("✅ All validation checks passed")
print("="*50)
generate_sql()
sys.exit(0)
else:
print("\n" + "="*50)
print("❌ Validation failed:")
for check in failed:
print(f" - {check}")
print("="*50)
sys.exit(1)


if __name__ == '__main__':
main()

Migration Rollback Script

File: backend/scripts/rollback_migration.py

#!/usr/bin/env python
"""
Automated migration rollback script.

Usage:
python rollback_migration.py <app_label> <migration_name>
python rollback_migration.py --to-previous
"""

import os
import sys
import django
import argparse
from django.db import connection
from django.db.migrations.executor import MigrationExecutor
from django.db.migrations.recorder import MigrationRecorder


def get_previous_migration(app_label, migration_name):
"""Get the migration before the specified one."""
# applied_migrations() has no guaranteed ordering; read the recorder
# table directly, ordered by application order (id)
applied = [(m.app, m.name) for m in MigrationRecorder.Migration.objects.order_by('id')]

# Find the migration to rollback to
target_index = None
for i, (app, name) in enumerate(applied):
if app == app_label and name == migration_name:
target_index = i
break

if target_index is None:
print(f"❌ Migration {app_label}.{migration_name} not found in applied migrations")
return None

if target_index == 0:
print(f"❌ No previous migration to rollback to")
return None

previous_app, previous_name = applied[target_index - 1]
return previous_app, previous_name


def rollback_migration(app_label, migration_name):
"""Rollback to specified migration."""
print(f"Rolling back to {app_label}.{migration_name}...")

executor = MigrationExecutor(connection)
target = [(app_label, migration_name)]

try:
executor.migrate(target)
print(f"✓ Successfully rolled back to {app_label}.{migration_name}")
return True
except Exception as e:
print(f"❌ Rollback failed: {str(e)}")
return False


def rollback_to_previous():
"""Rollback to the previous migration."""
# Read the recorder table in application order (applied_migrations()
# has no guaranteed ordering, so applied[-1] would be unreliable)
applied = [(m.app, m.name) for m in MigrationRecorder.Migration.objects.order_by('id')]

if len(applied) < 2:
print("❌ Not enough migrations to rollback")
return False

# Get the last two migrations
current_app, current_name = applied[-1]
previous_app, previous_name = applied[-2]

print(f"Current migration: {current_app}.{current_name}")
print(f"Rolling back to: {previous_app}.{previous_name}")

return rollback_migration(previous_app, previous_name)


def main():
parser = argparse.ArgumentParser(description='Rollback database migrations')
parser.add_argument('app_label', nargs='?', help='App label')
parser.add_argument('migration_name', nargs='?', help='Migration name')
parser.add_argument('--to-previous', action='store_true',
help='Rollback to the previous migration')

args = parser.parse_args()

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'bio_qms.settings.production')
django.setup()

if args.to_previous:
success = rollback_to_previous()
elif args.app_label and args.migration_name:
success = rollback_migration(args.app_label, args.migration_name)
else:
parser.print_help()
sys.exit(1)

sys.exit(0 if success else 1)


if __name__ == '__main__':
main()
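The previous-migration lookup in the script above depends on a live database. The search itself can be factored into a pure function and exercised against an ordered list of `(app, name)` tuples — a standalone sketch, not part of the script:

```python
def find_previous(applied, app_label, migration_name):
    """Given migrations in application order, return the entry applied
    immediately before (app_label, migration_name), or None if the
    target is absent or was the first migration applied."""
    try:
        index = applied.index((app_label, migration_name))
    except ValueError:
        return None  # target was never applied
    if index == 0:
        return None  # nothing earlier to roll back to
    return applied[index - 1]

# Illustrative history, in application order
history = [
    ('tenants', '0001_initial'),
    ('documents', '0001_initial'),
    ('documents', '0002_add_status'),
]
print(find_previous(history, 'documents', '0002_add_status'))
# → ('documents', '0001_initial')
```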

Appendices

A. Environment Variables

File: .env.example

# Django Settings
DJANGO_SETTINGS_MODULE=bio_qms.settings.production
SECRET_KEY=change-me-in-production
DEBUG=False
ALLOWED_HOSTS=app.bio-qms.com,staging.bio-qms.com

# Database
DATABASE_URL=postgresql://user:password@host:5432/bio_qms_prod
DATABASE_CONN_MAX_AGE=600

# Redis
REDIS_URL=redis://host:6379/0
CELERY_BROKER_URL=redis://host:6379/1

# GCP
GCP_PROJECT_ID=bio-qms-prod
GCP_REGION=us-central1
GCS_BUCKET_NAME=bio-qms-media

# Cloud Run
CLOUD_RUN_SERVICE_NAME=bio-qms-backend-prod
CLOUD_RUN_REGION=us-central1

# Artifact Registry
ARTIFACT_REGISTRY_URL=us-central1-docker.pkg.dev/bio-qms-prod/bio-qms

# Monitoring
SENTRY_DSN=https://...@sentry.io/...
SENTRY_ENVIRONMENT=production

# Compliance
AUDIT_LOG_BUCKET=bio-qms-audit-logs
RETENTION_PERIOD_YEARS=7

# Email
EMAIL_BACKEND=django.core.mail.backends.smtp.EmailBackend
EMAIL_HOST=smtp.sendgrid.net
EMAIL_PORT=587
EMAIL_USE_TLS=True
EMAIL_HOST_USER=apikey
EMAIL_HOST_PASSWORD=SG.xxx

# Slack Notifications
SLACK_WEBHOOK_URL=https://hooks.slack.com/services/...
SLACK_WEBHOOK_PRODUCTION=https://hooks.slack.com/services/...
SLACK_WEBHOOK_ALERTS=https://hooks.slack.com/services/...
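A startup-time guard that fails fast when required variables from this file are unset can be sketched as follows (the variable list is illustrative, not exhaustive):

```python
import os

# Illustrative subset of the variables in .env.example
REQUIRED_ENV_VARS = [
    'SECRET_KEY',
    'DATABASE_URL',
    'REDIS_URL',
    'GCP_PROJECT_ID',
]

def check_required_env(environ=os.environ):
    """Return the names of required variables that are unset or empty."""
    return [name for name in REQUIRED_ENV_VARS if not environ.get(name)]

missing = check_required_env({'SECRET_KEY': 'x', 'DATABASE_URL': 'postgresql://...'})
print(missing)  # → ['REDIS_URL', 'GCP_PROJECT_ID']
```

Calling this at settings import time turns a silently misconfigured deploy into an immediate, attributable failure.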

B. Required GitHub Secrets

# GCP Authentication
GCP_SA_KEY: JSON service account key with roles:
- roles/run.admin
- roles/artifactregistry.writer
- roles/storage.objectAdmin

# Codecov
CODECOV_TOKEN: Coverage reporting token

# Notifications
SLACK_WEBHOOK_URL: Staging notifications
SLACK_WEBHOOK_PRODUCTION: Production notifications
SLACK_WEBHOOK_ALERTS: Alert notifications

C. Cloud Run Service Configuration

File: infrastructure/cloud-run/backend-service.yaml

apiVersion: serving.knative.dev/v1
kind: Service
metadata:
name: bio-qms-backend-prod
namespace: default
labels:
cloud.googleapis.com/location: us-central1
annotations:
run.googleapis.com/ingress: all
run.googleapis.com/ingress-status: all
spec:
template:
metadata:
annotations:
autoscaling.knative.dev/minScale: '3'
autoscaling.knative.dev/maxScale: '50'
run.googleapis.com/cpu-throttling: 'false'
run.googleapis.com/startup-cpu-boost: 'true'
spec:
containerConcurrency: 80
timeoutSeconds: 300
serviceAccountName: bio-qms-backend@bio-qms-prod.iam.gserviceaccount.com
containers:
- image: us-central1-docker.pkg.dev/bio-qms-prod/bio-qms/backend:latest
ports:
- name: http1
containerPort: 8000
env:
- name: ENVIRONMENT
value: production
- name: PORT
value: '8000'
resources:
limits:
cpu: '4'
memory: 4Gi
livenessProbe:
httpGet:
path: /health
port: 8000
initialDelaySeconds: 30
periodSeconds: 10
timeoutSeconds: 5
failureThreshold: 3
startupProbe:
httpGet:
path: /health
port: 8000
initialDelaySeconds: 0
periodSeconds: 10
timeoutSeconds: 5
failureThreshold: 30
traffic:
- percent: 100
latestRevision: true

D. Monitoring Dashboard

File: infrastructure/monitoring/dashboard.json

{
"displayName": "BIO-QMS CI/CD Pipeline",
"mosaicLayout": {
"columns": 12,
"tiles": [
{
"width": 6,
"height": 4,
"widget": {
"title": "CI Pipeline Success Rate",
"xyChart": {
"dataSets": [
{
"timeSeriesQuery": {
"timeSeriesFilter": {
"filter": "resource.type=\"github_workflow\"",
"aggregation": {
"alignmentPeriod": "3600s",
"perSeriesAligner": "ALIGN_RATE"
}
}
}
}
]
}
}
},
{
"width": 6,
"height": 4,
"widget": {
"title": "Deployment Frequency",
"scorecard": {
"timeSeriesQuery": {
"timeSeriesFilter": {
"filter": "resource.type=\"cloud_run_revision\"",
"aggregation": {
"alignmentPeriod": "86400s",
"perSeriesAligner": "ALIGN_COUNT"
}
}
}
}
}
},
{
"width": 6,
"height": 4,
"widget": {
"title": "Build Duration",
"xyChart": {
"dataSets": [
{
"timeSeriesQuery": {
"timeSeriesFilter": {
"filter": "metric.type=\"github.com/workflow/duration\"",
"aggregation": {
"alignmentPeriod": "3600s",
"perSeriesAligner": "ALIGN_MEAN"
}
}
}
}
]
}
}
},
{
"width": 6,
"height": 4,
"widget": {
"title": "Failed Deployments",
"xyChart": {
"dataSets": [
{
"timeSeriesQuery": {
"timeSeriesFilter": {
"filter": "metric.type=\"github.com/workflow/failure\"",
"aggregation": {
"alignmentPeriod": "3600s",
"perSeriesAligner": "ALIGN_COUNT"
}
}
}
}
]
}
}
}
]
}
}

E. Compliance Checklist

Pre-Deployment Compliance Verification:

- All CI jobs passed (lint, typecheck, test, build)
- Code coverage meets 80% threshold
- Security scans show no critical/high vulnerabilities
- PR approved by 2 reviewers (including 1 compliance team member)
- Migration validation passed
- Schema diff reviewed and approved
- Backward compatibility verified
- Audit records generated and stored
- Container images signed with cosign
- SBOM generated and attached
- Environment secrets rotated (if needed)
- Rollback procedure tested
- Monitoring dashboards updated
- Runbook updated with new version info
- Stakeholders notified of deployment window

Post-Deployment Compliance Verification:

- Smoke tests passed
- Health checks green for 30 minutes
- Error rates within acceptable thresholds
- Performance metrics baseline maintained
- Audit log entries created
- Deployment notification sent
- Documentation updated
- Incident response team notified (for production)
- Change management record updated
- Compliance dashboard updated

F. Rollback Decision Tree

Deployment Issue Detected
├── Error Rate > 5%?
│   ├── Yes → ROLLBACK IMMEDIATELY
│   └── No  → Continue monitoring
├── Response Time > 2x baseline?
│   ├── Yes → ROLLBACK IMMEDIATELY
│   └── No  → Continue monitoring
├── Health Check Failing?
│   ├── Yes → ROLLBACK IMMEDIATELY
│   └── No  → Continue monitoring
├── Database Connection Issues?
│   ├── Yes → ROLLBACK IMMEDIATELY + Check migrations
│   └── No  → Continue monitoring
└── Security Vulnerability Detected?
    ├── Critical   → ROLLBACK IMMEDIATELY
    ├── High       → Assess impact, consider rollback
    └── Medium/Low → Schedule fix for next release
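The tree can be encoded as a function so the rollback call is mechanical rather than judgment-based during an incident. Thresholds are taken from the tree; the parameter names are illustrative:

```python
def rollback_decision(error_rate, response_time_ratio,
                      health_check_failing, db_connection_issues,
                      vulnerability_severity=None):
    """Return the action prescribed by the rollback decision tree."""
    if error_rate > 0.05:            # Error Rate > 5%
        return 'ROLLBACK'
    if response_time_ratio > 2.0:    # Response Time > 2x baseline
        return 'ROLLBACK'
    if health_check_failing:
        return 'ROLLBACK'
    if db_connection_issues:
        return 'ROLLBACK'            # and check migrations
    if vulnerability_severity == 'critical':
        return 'ROLLBACK'
    if vulnerability_severity == 'high':
        return 'ASSESS'
    if vulnerability_severity in ('medium', 'low'):
        return 'SCHEDULE_FIX'
    return 'MONITOR'

print(rollback_decision(0.01, 1.2, False, False, 'high'))  # → ASSESS
```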

G. Performance Benchmarks

| Metric | Target | Acceptable | Action Required |
|--------|--------|------------|-----------------|
| CI Pipeline Duration | < 10 min | < 15 min | > 15 min: Optimize |
| Build Time (Backend) | < 3 min | < 5 min | > 5 min: Review Dockerfile |
| Build Time (Frontend) | < 2 min | < 3 min | > 3 min: Review Vite config |
| Migration Duration (100k records) | < 5 min | < 10 min | > 10 min: Schedule downtime |
| Deployment Time (Staging) | < 5 min | < 10 min | > 10 min: Investigate |
| Deployment Time (Production) | < 15 min | < 30 min | > 30 min: Optimize canary |
| Image Size (Backend) | < 500 MB | < 1 GB | > 1 GB: Multi-stage optimize |
| Image Size (Frontend) | < 50 MB | < 100 MB | > 100 MB: Review assets |
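For dashboards or CI annotations, the benchmark table can be applied programmatically. A sketch using the CI Pipeline Duration row's thresholds (extending to other rows is mechanical):

```python
def classify_metric(value, target, acceptable):
    """Classify a measured value against its target and acceptable bounds."""
    if value < target:
        return 'target'
    if value < acceptable:
        return 'acceptable'
    return 'action_required'

# CI Pipeline Duration: target < 10 min, acceptable < 15 min
print(classify_metric(12, 10, 15))  # → acceptable
print(classify_metric(18, 10, 15))  # → action_required
```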

H. Contact Information

| Role | Contact | Escalation |
|------|---------|------------|
| DevOps Lead | devops-lead@bio-qms.com | CTO |
| Compliance Officer | compliance@bio-qms.com | VP Engineering |
| Database Admin | dba@bio-qms.com | DevOps Lead |
| Security Team | security@bio-qms.com | CISO |
| On-Call Engineer | oncall@bio-qms.com | Engineering Manager |

Document History

| Version | Date | Author | Changes |
|---------|------|--------|---------|
| 1.0.0 | 2026-02-16 | Claude (Sonnet 4.5) | Initial comprehensive CI/CD pipeline documentation |

End of Document