
Validation Approval and Sign-Off Workflow

Document Control

| Field | Value |
|-------|-------|
| Document ID | COMP-VAL-WORKFLOW-001 |
| Version | 1.0 |
| Effective Date | 2026-02-16 |
| Classification | Compliance Framework |
| Review Cycle | Annual |
| Owner | Quality Assurance |
| Regulatory Basis | 21 CFR Part 11, EU Annex 11, GAMP 5 |

Table of Contents

  1. Overview
  2. Three-Role Approval Hierarchy
  3. Part 11 Compliant Electronic Signatures
  4. Deviation Handling
  5. Validation Summary Report Auto-Generation
  6. Workflow State Machine
  7. Technical Implementation
  8. Audit Trail Requirements
  9. Appendices

Overview

Purpose

This document defines the validation approval and sign-off workflow for the BIO-QMS SaaS platform, ensuring compliance with FDA 21 CFR Part 11, EU Annex 11, and GAMP 5 guidance for computerized systems in GxP environments.

The workflow implements a three-tier approval hierarchy with electronic signatures, deviation management, and automated validation summary report generation to support regulatory inspection readiness.

Scope

This workflow applies to:

  • IQ/OQ/PQ validation protocols for BIO-QMS platform components
  • Change control validation for system updates and patches
  • Periodic review validation (annual re-qualification)
  • Third-party integration validation (LIMS, ELN, ERP systems)
  • Data migration validation for SaaS tenant provisioning

Regulatory Context

| Regulation | Requirement | Implementation |
|------------|-------------|----------------|
| 21 CFR Part 11.10(a) | Validation of systems to ensure accuracy, reliability, consistent intended performance | Three-role validation with independent review |
| 21 CFR Part 11.50 | Signature manifestations (printed name, date/time, meaning) | Electronic signature capture with full audit trail |
| 21 CFR Part 11.70 | Signature/record linking to prevent falsification by ordinary means | SHA-256 hash binding signature to document version |
| 21 CFR Part 11.200 | Two distinct identification components (ID + password) | Multi-factor authentication for all signatures |
| EU Annex 11 Clause 4 | Validation documentation including URS, risk assessment, test scripts | Automated traceability matrix generation |
| GAMP 5 Appendix D3 | Roles and responsibilities in validation lifecycle | Defined qualification requirements per role |

Key Principles

  1. Segregation of Duties: Test executor ≠ reviewer ≠ approver
  2. Traceability: Every requirement → test case → result → signature
  3. Non-Repudiation: Cryptographic binding prevents signature denial
  4. Completeness: All deviations documented, assessed, and closed before approval
  5. Timeliness: SLA enforcement with escalation for delayed approvals

Three-Role Approval Hierarchy

Role Definitions

1. Test Executor

Responsibility: Execute validation test cases, record results, capture evidence

Qualifications (per GAMP 5 Section 3.2):

  • Bachelor's degree in Life Sciences, Engineering, or Computer Science OR equivalent experience
  • Minimum 1 year experience in GxP environment
  • Training on:
    • BIO-QMS platform functionality (user workflows)
    • Test case execution procedures
    • Evidence capture requirements (screenshots, logs, data exports)
    • Deviation documentation

Activities:

  • Execute test steps as documented in validation protocol
  • Record actual results for each test step
  • Capture screenshots, system logs, and data exports as evidence
  • Document any deviations from expected results
  • Apply electronic signature to confirm "Executed By" status
  • Escalate blocked tests or critical deviations immediately

Access Level: Read/write access to validation execution module; cannot review or approve own tests

SLA: Complete assigned test cases within 3 business days of assignment


2. QA Reviewer

Responsibility: Review test execution completeness, verify evidence quality, assess deviations

Qualifications:

  • Bachelor's degree in Life Sciences, Quality Assurance, or related field
  • Minimum 3 years experience in QA role within regulated industry
  • Training on:
    • GxP regulations (21 CFR Part 11, EU Annex 11, GAMP 5)
    • Deviation classification and impact assessment
    • Evidence review criteria (ALCOA+ principles: Attributable, Legible, Contemporaneous, Original, Accurate, Complete, Consistent, Enduring, Available)
    • Root cause analysis techniques

Activities:

  • Review each test case for completeness:
    • All test steps executed
    • Actual results documented
    • Evidence attached and legible
    • Pass/fail determination justified
  • Verify evidence quality against ALCOA+ criteria
  • Classify deviations (Critical/Major/Minor)
  • Request re-execution if evidence insufficient
  • Verify corrective actions closed before final review
  • Apply electronic signature to confirm "Reviewed By" status
  • Escalate systemic issues to Quality Head

Access Level: Read-only access to test execution data; write access to review comments; cannot execute tests or provide final approval

SLA: Complete review within 2 business days of test execution completion

Independence Requirement: Reviewer must not have executed the test cases under review (segregation of duties enforced by system)


3. Quality Head Approver

Responsibility: Final sign-off, validate overall compliance, release system for production use

Qualifications:

  • Advanced degree in Quality Assurance, Life Sciences, or related field OR 10+ years progressive QA experience
  • Minimum 5 years in Quality leadership role
  • Certified Quality Auditor (ASQ CQA) or equivalent preferred
  • Training on:
    • FDA inspection preparedness
    • CAPA system management
    • Risk assessment methodologies (FMEA, HACCP)
    • Computerized system validation (CSV) strategies

Activities:

  • Review validation summary report for overall compliance
  • Verify all critical deviations resolved and closed
  • Assess residual risk from minor/major deviations
  • Confirm traceability matrix complete (all requirements tested)
  • Review and approve deviation dispositions
  • Make go/no-go decision for production release
  • Apply electronic signature to confirm "Approved By" status
  • Authorize archival of validation package

Access Level: Read-only access to all validation data; write access to final approval decision; cannot execute or review tests

SLA: Complete approval within 3 business days of QA review completion

Authority: Can reject validation package and require re-execution or additional testing


Role Assignment and Management

Assignment Criteria

System-Enforced Rules:

  1. No Self-Review: User cannot review their own test executions
  2. No Self-Approval: User cannot approve validations they executed or reviewed
  3. Role Hierarchy: Quality Head must have higher organizational authority than QA Reviewer
  4. Minimum Training: System blocks signature if required training records not current
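
As a sketch, the system-enforced rules above could be checked at assignment time roughly as follows. The `Assignment` structure and the `training_current` set are hypothetical stand-ins for the platform's user and training records; the role-hierarchy rule (Rule 3) needs organizational data and is omitted here:

```python
from dataclasses import dataclass, field

@dataclass
class Assignment:
    """Hypothetical role assignment for one validation protocol."""
    executors: set[str]          # user IDs of Test Executors (1-N users)
    reviewer: str                # single QA Reviewer
    approver: str                # single Quality Head Approver
    training_current: set[str] = field(default_factory=set)  # users whose training is current

def check_segregation(a: Assignment) -> list[str]:
    """Return a list of rule violations; an empty list means the assignment is valid."""
    violations = []
    # Rules 1 and 2: no self-review, no self-approval
    if a.reviewer in a.executors:
        violations.append("Reviewer cannot review own test executions")
    if a.approver in a.executors or a.approver == a.reviewer:
        violations.append("Approver cannot approve validations they executed or reviewed")
    # Rule 4: block any role whose required training is not current
    for user in a.executors | {a.reviewer, a.approver}:
        if user not in a.training_current:
            violations.append(f"Training not current for {user}")
    return violations
```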

Assignment Process:

Validation Protocol Creation:
→ Assign Test Executor (1-N users for parallel execution)
→ Assign QA Reviewer (single user per protocol)
→ Assign Quality Head Approver (single user per protocol)
→ System validates segregation of duties
→ Notify all assignees via email + in-app notification

Backup Approver Procedures

Scenario: Assigned approver unavailable (vacation, leave, turnover)

Process:

  1. Quality Head designates backup approvers in system (must meet same qualifications)
  2. Backup approver list maintained in System Configuration → User Roles → Backup Approvers
  3. If primary approver inactive >3 business days, system auto-notifies backup
  4. Backup approver assumes full responsibility for approval decision
  5. Audit trail records reason for backup approver activation
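
The ">3 business days inactive" trigger in step 3 can be sketched with plain calendar arithmetic (function names are illustrative; a production check would also account for site holidays):

```python
from datetime import date, timedelta

def business_days_between(start: date, end: date) -> int:
    """Count Monday-Friday days strictly after `start`, up to and including `end`."""
    days = 0
    d = start
    while d < end:
        d += timedelta(days=1)
        if d.weekday() < 5:  # 0=Mon .. 4=Fri
            days += 1
    return days

def should_activate_backup(last_active: date, today: date, threshold: int = 3) -> bool:
    """True when the primary approver has been inactive more than `threshold` business days."""
    return business_days_between(last_active, today) > threshold
```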

Delegation Rules:

  • Test Executor: Can delegate to peer with equivalent training
  • QA Reviewer: Must escalate to QA Manager for delegation approval
  • Quality Head: Delegation requires written authorization (electronic memo in system)

Emergency Escalation:

  • If all assigned approvers unavailable, system escalates to VP Quality
  • VP Quality can override SLA or reassign roles
  • All escalations logged in audit trail with justification

Role Qualification Maintenance

Initial Qualification:

  • HR verifies education/experience requirements
  • Mandatory training completion within 30 days of role assignment
  • Knowledge assessment (80% passing score)
  • Supervised execution/review/approval (minimum 3 validation packages)

Ongoing Qualification:

  • Annual refresher training on GxP regulations and system updates
  • Performance monitoring: deviation acceptance rate, SLA compliance, audit findings
  • Disqualification triggers:
    • Training lapse >30 days
    • Quality incident involving signature integrity
    • Performance below threshold (>20% SLA misses, >10% incorrect deviation classifications)
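
The disqualification triggers above reduce to three threshold checks, sketched below with hypothetical inputs (the real system would pull these counts from performance-monitoring records):

```python
def check_disqualification(sla_misses: int, total_tasks: int,
                           misclassified: int, total_classifications: int,
                           days_since_training: int) -> list[str]:
    """Evaluate the disqualification triggers; returns the list of triggered reasons."""
    triggers = []
    if days_since_training > 30:
        triggers.append("Training lapse >30 days")
    if total_tasks and sla_misses / total_tasks > 0.20:
        triggers.append(">20% SLA misses")
    if total_classifications and misclassified / total_classifications > 0.10:
        triggers.append(">10% incorrect deviation classifications")
    return triggers
```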

Qualification Records:

  • Stored in System Administration → Training Records → Validation Approvers
  • Includes: training completion dates, assessment scores, supervised activity logs
  • Retention: Duration of employment + 7 years (per FDA guidance)

Part 11 Compliant Electronic Signatures

Two-Factor Authentication

Authentication Components (21 CFR 11.200):

  1. User ID (Identification Code):

    • Unique identifier assigned to individual user
    • Format: firstname.lastname or employee ID
    • Cannot be shared, transferred, or reused after employee departure
  2. Password (Something You Know):

    • Minimum 12 characters, complexity requirements enforced
    • Must include uppercase, lowercase, number, special character
    • Password history: cannot reuse last 10 passwords
    • Expiration: 90 days
    • Account lockout: 5 failed attempts, 30-minute lockout
  3. Optional MFA (Something You Have):

    • TOTP-based authenticator app (Google Authenticator, Authy)
    • SMS code to registered mobile number
    • Hardware token (YubiKey) for Quality Head role

Implementation:

# Signature authentication flow
from datetime import timedelta
from typing import Optional

def authenticate_for_signature(user_id: str, password: str, mfa_code: Optional[str]) -> SignatureToken:
    """
    Authenticate user for electronic signature application.

    Returns a time-limited signature token valid for 5 minutes.
    """
    # Step 1: Verify credentials
    user = authenticate_user(user_id, password)
    if not user:
        raise AuthenticationError("Invalid credentials")

    # Step 2: Verify MFA if required for role
    if user.role == "Quality_Head" and not verify_mfa(user, mfa_code):
        raise AuthenticationError("MFA verification failed")

    # Step 3: Check qualification status
    if not user.is_qualified_for_validation():
        raise AuthenticationError("User not qualified for validation signatures")

    # Step 4: Generate time-limited signature token
    return SignatureToken(
        user_id=user.id,
        role=user.role,
        valid_until=now() + timedelta(minutes=5),
        token=generate_secure_token()
    )
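
The password complexity rules listed above can be expressed as a single policy check. This is a minimal sketch: history comparison against stored hashes, expiration, and lockout counters require persisted state and are elided:

```python
import re

def password_meets_policy(pw: str) -> bool:
    """Check the stated complexity rules: at least 12 characters,
    with uppercase, lowercase, digit, and special character."""
    return (len(pw) >= 12
            and re.search(r"[A-Z]", pw) is not None
            and re.search(r"[a-z]", pw) is not None
            and re.search(r"[0-9]", pw) is not None
            and re.search(r"[^A-Za-z0-9]", pw) is not None)
```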

Signature Meaning Declaration

Signature Manifestations (21 CFR 11.50):

Each electronic signature must display:

  1. Signer's Printed Name: Full legal name from HR system
  2. Date and Time: ISO 8601 format, UTC timezone, NTP-synchronized
  3. Meaning of Signature: One of three declarations

Three Signature Meanings:

| Meaning | Declaration Text | Role | Attestation |
|---------|------------------|------|-------------|
| Executed By | "I certify that I have executed the test cases documented in this protocol according to written procedures, and that the recorded results and evidence are accurate and complete." | Test Executor | Confirms test execution completeness |
| Reviewed By | "I certify that I have reviewed the test execution results and evidence, verified completeness and compliance with acceptance criteria, and assessed all deviations according to documented procedures." | QA Reviewer | Confirms independent review |
| Approved By | "I certify that I have reviewed the validation summary report, verified that all critical deviations are resolved, assessed residual risk as acceptable, and approve this system for production use in accordance with GxP requirements." | Quality Head | Authorizes production release |

Signature Display Example:

┌────────────────────────────────────────────────────────────────┐
│ ELECTRONIC SIGNATURE RECORD                                    │
├────────────────────────────────────────────────────────────────┤
│ Signer Name:       Jane Smith                                  │
│ User ID:           jane.smith                                  │
│ Date/Time:         2026-02-16T14:23:17Z                        │
│ Signature Meaning: Reviewed By                                 │
│                                                                │
│ Attestation:                                                   │
│ "I certify that I have reviewed the test execution results     │
│ and evidence, verified completeness and compliance with        │
│ acceptance criteria, and assessed all deviations according     │
│ to documented procedures."                                     │
│                                                                │
│ Document:      IQ Protocol BIO-QMS-IQ-001 v1.2                 │
│ Document Hash: SHA-256:a3f5b8c9d2e1f4a7b6c5d8e9f1a2b3c4d5      │
│ Signature ID:  SIG-2026-0216-142317-JS                         │
└────────────────────────────────────────────────────────────────┘

Timestamp Synchronization

NTP Server Configuration:

  • Primary NTP: time.google.com (Stratum 1)
  • Secondary NTP: time.nist.gov (Stratum 1)
  • Time drift monitoring: Alert if drift >2 seconds from authoritative source
  • Synchronization interval: Every 15 minutes

Timestamp Format:

  • ISO 8601 extended format: YYYY-MM-DDTHH:MM:SSZ
  • Always UTC (no local time conversion)
  • Millisecond precision captured in audit trail (displayed as seconds in UI)
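
The two representations above (millisecond precision for the audit trail, whole seconds in the UI) can be derived from one timezone-aware datetime; `signature_timestamp` is an illustrative helper, not platform API:

```python
from datetime import datetime, timezone

def signature_timestamp(dt: datetime):
    """Return (audit_ts, ui_ts): ISO 8601 extended format in UTC,
    millisecond precision for the audit trail, seconds for the UI."""
    utc = dt.astimezone(timezone.utc)
    audit = utc.strftime("%Y-%m-%dT%H:%M:%S.") + f"{utc.microsecond // 1000:03d}Z"
    ui = utc.strftime("%Y-%m-%dT%H:%M:%SZ")
    return audit, ui
```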

Clock Validation:

def validate_system_clock() -> ClockValidationResult:
    """
    Validate system clock synchronization with NTP.
    Run hourly via cron job.
    """
    ntp_time = query_ntp_server("time.google.com")
    system_time = get_system_time_utc()
    drift_seconds = abs((ntp_time - system_time).total_seconds())

    if drift_seconds > 2.0:
        alert_ops_team(f"Clock drift detected: {drift_seconds}s")
        return ClockValidationResult(valid=False, drift=drift_seconds)

    return ClockValidationResult(valid=True, drift=drift_seconds)

Signature Binding to Document Version

Cryptographic Hash Linking (21 CFR 11.70):

Each signature is cryptographically bound to the specific document version at the time of signing to prevent falsification by ordinary means.

Hash Computation:

import hashlib
import json

def compute_document_hash(protocol_id: str, version: str) -> str:
    """
    Compute SHA-256 hash of validation protocol content.

    Includes: test steps, expected results, acceptance criteria,
    attachments (sorted), metadata (protocol ID, version, title).
    """
    content = {
        "protocol_id": protocol_id,
        "version": version,
        "title": get_protocol_title(protocol_id, version),
        "test_cases": get_test_cases_sorted(protocol_id, version),
        "attachments": get_attachment_hashes_sorted(protocol_id, version),
        "metadata": get_protocol_metadata(protocol_id, version)
    }

    # Canonical JSON serialization (sorted keys, no whitespace)
    canonical_json = json.dumps(content, sort_keys=True, separators=(',', ':'))

    # Compute SHA-256 hash
    return hashlib.sha256(canonical_json.encode('utf-8')).hexdigest()

Signature Record Structure:

{
  "signature_id": "SIG-2026-0216-142317-JS",
  "user_id": "jane.smith",
  "user_name": "Jane Smith",
  "role": "QA_Reviewer",
  "meaning": "Reviewed By",
  "timestamp": "2026-02-16T14:23:17.483Z",
  "document_id": "BIO-QMS-IQ-001",
  "document_version": "1.2",
  "document_hash": "a3f5b8c9d2e1f4a7b6c5d8e9f1a2b3c4d5e6f7a8b9c0d1e2f3a4b5c6d7e8f9a0",
  "signature_token_hash": "SHA-256 hash of authentication token (non-reversible)",
  "ip_address": "203.0.113.42",
  "user_agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7)...",
  "attestation_text": "I certify that I have reviewed...",
  "supersedes": null,
  "audit_trail_entry_id": "AUD-2026-0216-142317-001"
}

Verification Process:

def verify_signature(signature_id: str) -> SignatureVerificationResult:
    """
    Verify electronic signature integrity and binding to document.
    """
    sig = get_signature_record(signature_id)
    doc = get_document_version(sig.document_id, sig.document_version)

    # Recompute document hash
    current_hash = compute_document_hash(sig.document_id, sig.document_version)

    # Compare with signed hash
    if current_hash != sig.document_hash:
        return SignatureVerificationResult(
            valid=False,
            reason="Document content changed after signature"
        )

    # Verify signer qualification at time of signature
    if not was_user_qualified_at(sig.user_id, sig.timestamp):
        return SignatureVerificationResult(
            valid=False,
            reason="Signer was not qualified at time of signature"
        )

    return SignatureVerificationResult(valid=True)

Non-Repudiation

Cryptographic Measures:

  1. Signature Token Hashing: One-way hash of authentication token stored (cannot reverse-engineer password)
  2. Document Hash Binding: Signature linked to immutable document hash (cannot claim document changed)
  3. Audit Trail Correlation: Every signature creates immutable audit entry with transaction metadata
  4. IP Address Logging: Network origin of signature action captured
  5. Session Recording: User session ID correlates signature to continuous authenticated session

Legal Enforceability:

Per 21 CFR 11.50(a), signed electronic records shall contain information associated with the signing that clearly indicates:

  • The printed name of the signer ✓
  • The date and time when the signature was executed ✓
  • The meaning of the signature ✓

Users acknowledge legal equivalence of electronic signatures to handwritten signatures via:

  • System access agreement (signed at onboarding)
  • Signature meaning attestation (displayed and confirmed at each signature action)
  • Annual re-attestation (part of GxP training refresher)

User Attestation Flow:

  1. User clicks "Apply Signature" button
  2. System displays signature meaning declaration
  3. User confirms: "I understand this signature is legally binding"
  4. User re-enters password (authentication)
  5. System creates signature record
  6. Signature displayed with hash verification QR code

Sequential Enforcement

Workflow State Guards:

Electronic signatures must be applied in sequence. System enforces:

| Signature | Prerequisite | System Guard |
|-----------|--------------|--------------|
| Executed By | All test cases in protocol completed | Block if any test case status = "Not Started" or "In Progress" |
| Reviewed By | Executed By signature present | Block if Executed By signature missing or timestamp <1 hour ago (cooling period) |
| Approved By | Reviewed By signature present AND all critical deviations closed | Block if Reviewed By missing OR open critical/major deviations exist |

Implementation:

from typing import Optional, Tuple

def can_apply_signature(protocol_id: str, role: str, user_id: str) -> Tuple[bool, Optional[str]]:
    """
    Determine if user can apply signature based on workflow state.

    Returns: (allowed, blocking_reason)
    """
    protocol = get_protocol(protocol_id)

    # Check role-specific prerequisites
    if role == "Test_Executor":
        incomplete = get_incomplete_test_cases(protocol_id)
        if incomplete:
            return (False, f"{len(incomplete)} test cases not completed")

    elif role == "QA_Reviewer":
        exec_sig = get_signature(protocol_id, role="Test_Executor")
        if not exec_sig:
            return (False, "No Executed By signature found")

        # Enforce cooling period (prevents rubber-stamping)
        hours_since_execution = (now() - exec_sig.timestamp).total_seconds() / 3600
        if hours_since_execution < 1:
            return (False, f"Cooling period: wait {60 - int(hours_since_execution * 60)} minutes")

        # Check segregation of duties
        if exec_sig.user_id == user_id:
            return (False, "Cannot review own test execution")

    elif role == "Quality_Head":
        review_sig = get_signature(protocol_id, role="QA_Reviewer")
        if not review_sig:
            return (False, "No Reviewed By signature found")

        # Check deviations
        open_deviations = get_open_deviations(protocol_id, severity=["Critical", "Major"])
        if open_deviations:
            return (False, f"{len(open_deviations)} open critical/major deviations")

        # Check segregation of duties
        exec_sig = get_signature(protocol_id, role="Test_Executor")
        if exec_sig.user_id == user_id or review_sig.user_id == user_id:
            return (False, "Cannot approve validation you executed or reviewed")

    return (True, None)

UI Enforcement:

  • "Apply Signature" button disabled with tooltip showing blocking reason
  • Workflow progress bar shows current state and next required action
  • Email notification sent to next approver when prerequisite signatures complete

Deviation Handling

Deviation Detection

Automated Detection:

System automatically detects deviations when:

  1. Actual Result ≠ Expected Result: Text comparison with fuzzy matching threshold (90%)
  2. Test Execution Error: Exception, timeout, or system unavailability during test
  3. Evidence Missing: Required screenshot, log file, or data export not attached
  4. Acceptance Criteria Not Met: Quantitative threshold exceeded (e.g., response time >2s)

Manual Deviation Logging:

Test Executor can manually flag deviation if:

  • Unexpected system behavior observed (not covered by test step)
  • Environmental issue affects test validity (network outage, third-party service down)
  • Test case ambiguity requires clarification

Deviation Detection Algorithm:

from typing import Optional

def detect_deviation(test_step: TestStep, actual_result: str) -> Optional[Deviation]:
    """
    Compare actual result to expected result and detect deviation.
    """
    expected = test_step.expected_result

    # Exact match check
    if actual_result.strip() == expected.strip():
        return None

    # Fuzzy match check (handles minor formatting differences)
    similarity = fuzzy_match_ratio(actual_result, expected)
    if similarity >= 0.90:
        return None  # Close enough, not a deviation

    # Numeric threshold check (for performance tests)
    if test_step.acceptance_criteria_type == "numeric":
        actual_value = extract_numeric_value(actual_result)
        threshold = test_step.acceptance_criteria["max_value"]
        if actual_value <= threshold:
            return None

    # Deviation detected
    return Deviation(
        test_step_id=test_step.id,
        expected_result=expected,
        actual_result=actual_result,
        detected_at=now(),
        detected_by="system",
        status="Open",
        severity=None  # QA Reviewer classifies severity
    )

Deviation Classification

Three-Tier Severity Model:

| Severity | Definition | Examples | Impact on Validation |
|----------|------------|----------|----------------------|
| Critical | Deviation affects patient safety, data integrity, or GxP compliance | Audit trail gap; data loss or corruption; access control bypass; electronic signature failure | Blocks approval until resolved and re-tested |
| Major | Deviation affects system functionality but has documented workaround or mitigation | UI display error (data correct in DB); report formatting issue; performance degradation (3s vs 2s SLA) | Requires assessment before approval; may require re-test |
| Minor | Deviation is cosmetic or does not affect GxP compliance | Typo in UI label; color scheme inconsistency; non-GxP documentation error | Document only; does not block approval |

Classification Criteria (GAMP 5 Appendix D4):

Critical Deviation Triggers:
- Audit trail:
  - Missing audit entry for GxP action
  - Audit entry timestamp incorrect (>5 second drift)
  - Audit entry cannot be exported or reviewed
- Data integrity:
  - Data modification without audit trail
  - Data deletion without retention
  - Data accessible to unauthorized user
- Electronic signature:
  - Signature not bound to document version
  - Signature meaning not displayed
  - Signature timestamp not NTP-synchronized
- System availability:
  - System crash or data loss
  - Backup/restore failure
  - Disaster recovery failure

Major Deviation Triggers:
- Functional defect with workaround:
  - Feature not working as specified but alternative process available
  - Performance SLA miss (degraded but usable)
  - Integration error with manual intervention possible
- Incomplete evidence:
  - Screenshot quality poor but data export confirms result
  - Log file missing but audit trail confirms action
- Process deviation:
  - Test executed out of sequence (but result valid)
  - Test environment variance (but controlled and documented)

Minor Deviation Triggers:
- Cosmetic issues:
  - UI alignment, color, font inconsistencies
  - Label typos or unclear wording (non-GxP)
- Documentation errors:
  - Test case typo (corrected during execution)
  - Expected result ambiguous (but actual result clearly correct)

Classification Process:

  1. Initial Detection: System or Test Executor identifies deviation
  2. QA Reviewer Classification: Assigns Critical/Major/Minor severity within 1 business day
  3. Quality Head Review: Confirms classification for Critical deviations
  4. Risk Assessment: Document impact on patient safety, data integrity, compliance

Classification Form Template:

## Deviation Classification

**Deviation ID**: DEV-2026-0216-001
**Test Step**: IQ-001-TS-042 "Verify audit trail captures user login event"
**Expected Result**: Audit trail entry shows login timestamp, user ID, IP address
**Actual Result**: Audit trail entry shows login timestamp, user ID; IP address field blank

**Severity**: [Critical / Major / Minor]

**Justification**:
[Explain why this severity level is appropriate based on criteria above]

**Impact Assessment**:
- Patient Safety: [None / Low / Medium / High]
- Data Integrity: [None / Low / Medium / High]
- GxP Compliance: [None / Low / Medium / High]

**Classified By**: Jane Smith (QA Reviewer)
**Classification Date**: 2026-02-16T15:30:00Z
**Reviewed By**: John Doe (Quality Head) [for Critical only]

Root Cause Analysis

Five Whys Technique (Required for Critical and Major Deviations):

## Root Cause Analysis Template

**Deviation ID**: DEV-2026-0216-001
**Severity**: Critical

### Problem Statement
Audit trail entry for user login event missing IP address field.

### Five Whys Analysis

1. **Why did the audit trail entry lack an IP address?**
→ The audit logging function did not capture the IP address from the request context.

2. **Why did the logging function not capture the IP address?**
→ The IP address was not passed as a parameter to the audit logging function.

3. **Why was the IP address not passed as a parameter?**
→ The authentication service refactored in v1.8.2 did not update the audit logging call signature.

4. **Why was the audit logging call signature not updated during refactoring?**
→ The code review checklist did not include verification of audit trail completeness.

5. **Why was audit trail verification not in the code review checklist?**
→ **Root Cause**: Code review checklist not aligned with 21 CFR Part 11 requirements for audit trail content.

### Contributing Factors
- Insufficient automated testing for audit trail completeness
- Lack of regression testing after authentication service refactor
- No validation test case for IP address capture in audit trail

### Corrective Action (CAPA)
See CAPA-2026-002 for detailed corrective and preventive actions.

**Analyzed By**: Jane Smith (QA Reviewer)
**Analysis Date**: 2026-02-16T16:00:00Z
**Approved By**: John Doe (Quality Head)

Fishbone Diagram (for Complex Deviations):

For systemic issues or deviations with multiple contributing factors, use Ishikawa (fishbone) diagram to categorize root causes:

Categories:
- People: Training gaps, staffing, competency
- Process: Procedures inadequate, not followed, unclear
- Technology: Software defect, infrastructure issue, integration failure
- Materials: Test data quality, environment configuration
- Measurement: Test case design, acceptance criteria ambiguity
- Environment: GCP service outage, network latency, third-party dependency

Impact Assessment

GxP Impact Evaluation:

For each deviation, assess impact on three dimensions:

| Dimension | Assessment Questions | Risk Levels |
|-----------|----------------------|-------------|
| Patient Safety | Could this deviation result in patient harm if system used in production? | None / Low / Medium / High |
| Data Integrity | Could this deviation result in data loss, corruption, or unauthorized modification? | None / Low / Medium / High |
| Regulatory Compliance | Does this deviation violate 21 CFR Part 11, EU Annex 11, or GAMP 5 requirements? | None / Low / Medium / High |

Risk Matrix:

High Impact (any dimension) → Critical Severity (blocks approval)
Medium Impact + Medium Impact → Major Severity (requires assessment)
Low Impact (all dimensions) → Minor Severity (document only)
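
The matrix above can be sketched as a small mapping function. Note one assumption: the matrix does not state what a single Medium rating maps to, so this sketch treats it as Major conservatively:

```python
def severity_from_impact(patient: str, data: str, compliance: str) -> str:
    """Map three impact ratings (None/Low/Medium/High) to a deviation severity.
    Assumption: a single Medium rating maps to Major (the matrix only lists
    the Medium + Medium case explicitly)."""
    impacts = [patient, data, compliance]
    if "High" in impacts:
        return "Critical"
    if "Medium" in impacts:
        return "Major"
    return "Minor"
```

For example, the DEV-2026-0216-001 assessment below (Medium / High / High) maps to Critical.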

System Validation Status Impact:

| Deviation Severity | Validation Status | Production Release | Re-Test Required |
|--------------------|-------------------|--------------------|------------------|
| Critical (Open) | FAILED | BLOCKED | Yes - full protocol |
| Critical (Closed) | CONDITIONAL PASS | Allowed after re-test | Yes - affected test cases |
| Major (Open) | CONDITIONAL PASS | Requires QH approval | Case-by-case |
| Major (Closed) | CONDITIONAL PASS | Allowed | Recommended |
| Minor (Open/Closed) | PASS WITH DEVIATIONS | Allowed | No |

Impact Assessment Documentation:

## GxP Impact Assessment

**Deviation ID**: DEV-2026-0216-001
**Severity**: Critical

### Patient Safety Impact: Medium
**Rationale**: Missing IP address in audit trail does not directly harm patients, but hinders investigation of unauthorized access that could lead to incorrect patient data.

### Data Integrity Impact: High
**Rationale**: Cannot trace data modifications to specific network location, reducing ability to detect and investigate unauthorized data changes (21 CFR 11.10(e) violation).

### Regulatory Compliance Impact: High
**Rationale**: 21 CFR 11.10(e) requires audit trail to include "device used" (IP address is key identifier). EU Annex 11 Clause 9 requires "location" in audit trail.

### Overall Risk Level: HIGH → Critical Severity Confirmed

### System Validation Status: FAILED (blocks production release)

**Assessed By**: Jane Smith (QA Reviewer)
**Assessment Date**: 2026-02-16T16:15:00Z

Corrective Action Assignment

CAPA Integration:

Every Critical and Major deviation triggers a CAPA record in the BIO-QMS CAPA module:

from datetime import timedelta

def create_capa_from_deviation(deviation_id: str) -> str:
    """
    Auto-create CAPA record when Critical/Major deviation classified.
    """
    deviation = get_deviation(deviation_id)

    capa = CAPA.create(
        title=f"Validation Deviation: {deviation.test_step_id}",
        source="Validation",
        source_reference=deviation_id,
        problem_statement=deviation.actual_result,
        severity=deviation.severity,
        detected_date=deviation.detected_at,
        detected_by=deviation.detected_by,
        status="Open"
    )

    # Auto-assign to development team lead
    capa.assign_to(role="Development_Lead", due_date=now() + timedelta(days=5))

    # Link deviation to CAPA
    deviation.capa_id = capa.id
    deviation.save()

    # Notify assignee
    send_notification(
        to=capa.assigned_to,
        subject=f"CAPA Assigned: {capa.id}",
        body=f"Validation deviation {deviation_id} requires corrective action."
    )

    return capa.id

CAPA Workflow:

  1. Deviation classified (Critical/Major)
  2. Auto-create CAPA record
  3. Assign to Development Lead (5 business days SLA)
  4. Root cause analysis documented
  5. Corrective action plan approved by QA
  6. Corrective action implemented (code fix, config change, etc.)
  7. Verification testing in dev/staging environment
  8. CAPA closed by QA Reviewer
  9. Re-test validation protocol in production-like environment
  10. Deviation status → Closed

CAPA Tracking:

| CAPA ID | Deviation ID | Severity | Assigned To | Due Date | Status | Closure Date |
|---------|--------------|----------|-------------|----------|--------|--------------|
| CAPA-2026-002 | DEV-2026-0216-001 | Critical | Dev Lead | 2026-02-21 | Closed | 2026-02-20 |
| CAPA-2026-003 | DEV-2026-0216-005 | Major | QA Lead | 2026-02-23 | In Progress | - |

Escalation:

  • Critical CAPA overdue >2 days: Escalate to VP Engineering + VP Quality
  • Major CAPA overdue >5 days: Escalate to QA Manager
  • CAPA closure requires: Code review, unit tests, regression tests, QA verification
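
The escalation thresholds above can be sketched as a lookup; calendar days are used here for brevity, since the document does not specify business versus calendar days for CAPA overdue counts:

```python
from datetime import date
from typing import Optional

def escalation_target(severity: str, due_date: date, today: date) -> Optional[str]:
    """Return the escalation recipients for an overdue CAPA, or None."""
    overdue = (today - due_date).days
    if severity == "Critical" and overdue > 2:
        return "VP Engineering + VP Quality"
    if severity == "Major" and overdue > 5:
        return "QA Manager"
    return None
```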

Re-Test Procedure

Trigger Conditions:

Re-test required when:

  1. Critical deviation corrective action implemented: Full re-test of affected protocol section
  2. Major deviation with code change: Re-test affected test cases + regression test cases
  3. Multiple Minor deviations in same functional area: Risk-based re-test decision by QA Reviewer

Re-Test Scope Determination:

def determine_retest_scope(deviation_id: str) -> RetestScope:
    """
    Determine which test cases require re-execution after CAPA closure.
    """
    deviation = get_deviation(deviation_id)
    capa = get_capa(deviation.capa_id)

    if deviation.severity == "Critical":
        # Critical: re-test entire protocol section
        section = get_protocol_section(deviation.test_step_id)
        return RetestScope(
            test_cases=section.get_all_test_cases(),
            rationale="Critical deviation requires full section re-test"
        )

    elif deviation.severity == "Major":
        # Major: re-test affected + related test cases
        affected = get_test_case(deviation.test_step_id)
        related = get_related_test_cases(affected, max_distance=1)  # dependency graph
        return RetestScope(
            test_cases=[affected] + related,
            rationale="Major deviation requires affected + related test re-test"
        )

    else:  # Minor
        return RetestScope(
            test_cases=[],
            rationale="Minor deviation does not require re-test"
        )

Re-Test Execution:

  1. Create Re-Test Protocol Amendment:

    • Document: "Amendment 1 to IQ-001: Re-Test After CAPA-2026-002"
    • References original protocol and deviation
    • Specifies re-test scope and rationale
  2. Execute Re-Test:

    • Same Test Executor or peer (to verify reproducibility)
    • Must execute in production-like environment (not dev)
    • Capture fresh evidence (new screenshots, logs, exports)
  3. Re-Test Review:

    • QA Reviewer verifies deviation resolved
    • Confirms no new deviations introduced (regression)
    • Updates deviation status to "Closed - Verified"
  4. Deviation Closure:

    • Requires: CAPA closed + Re-test passed + QA approval
    • Audit trail records closure timestamp and approver
    • Deviation remains in VSR with "Closed" status
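The closure preconditions in step 4 (CAPA closed + re-test passed + QA approval) can be expressed as a guard function. This is a minimal sketch with assumed field names, not the platform's data model.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

# Illustrative model of deviation-closure inputs; field names are assumptions.
@dataclass
class DeviationClosureInput:
    capa_status: Optional[str]    # e.g. "Closed", "In Progress", or None if no CAPA
    retest_result: Optional[str]  # "PASS", "FAIL", or None if no re-test required
    qa_approved: bool

def can_close_deviation(d: DeviationClosureInput) -> Tuple[bool, str]:
    """Return (allowed, blocking_reason); reason is "" when closure is allowed."""
    if d.capa_status is not None and d.capa_status != "Closed":
        return (False, "CAPA not closed")
    if d.retest_result is not None and d.retest_result != "PASS":
        return (False, "Re-test not passed")
    if not d.qa_approved:
        return (False, "QA approval missing")
    return (True, "")
```

Evaluating the guards in this order surfaces the earliest unmet precondition as the blocking reason, mirroring the sequencing of step 4.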

Re-Test Tracking:

## Re-Test Log

**Original Protocol**: IQ-001 v1.2
**Amendment**: Amendment 1 - Re-Test After CAPA-2026-002
**Re-Test Date**: 2026-02-20

| Test Case | Original Result | Deviation | Re-Test Result | Status |
|-----------|----------------|-----------|----------------|--------|
| IQ-001-TS-042 | FAIL | DEV-2026-0216-001 | PASS | Closed |
| IQ-001-TS-043 | PASS | - | - | No Re-Test |
| IQ-001-TS-044 | PASS | - | PASS | Regression Test |

**Re-Test Executed By**: John Smith, 2026-02-20T10:00:00Z
**Re-Test Reviewed By**: Jane Smith, 2026-02-20T14:30:00Z
**Deviation Closure Approved By**: John Doe, 2026-02-20T16:00:00Z

Validation Summary Report Auto-Generation

Report Structure

VSR Template (per GAMP 5 Appendix D7):

# Validation Summary Report
# BIO-QMS Platform IQ Protocol

**Report ID**: VSR-BIO-QMS-IQ-001
**Protocol ID**: IQ-001
**Protocol Version**: 1.2
**System**: BIO-QMS SaaS Platform
**Validation Type**: Installation Qualification (IQ)
**Report Date**: 2026-02-16
**Generated By**: Automated VSR Generator v2.1

---

## 1. Executive Summary

This Validation Summary Report documents the results of Installation Qualification testing for the BIO-QMS SaaS Platform version 2.3.0, deployed on Google Cloud Platform (GCP) in production environment.

**Overall Conclusion**: [PASS / CONDITIONAL PASS / FAIL]

**Summary Statistics**:
- Total Test Cases: 127
- Passed: 123 (96.9%)
- Failed: 3 (2.4%)
- Blocked: 1 (0.8%)
- Deviations: 9 (5 Critical, 3 Major, 1 Minor)
- Open Deviations: 0
- Closed Deviations: 9

**Recommendation**: [System approved for production use / System requires corrective action before approval]

---

## 2. Validation Scope

### 2.1 System Description
BIO-QMS is a cloud-based Quality Management System for pharmaceutical and biotechnology companies, providing document control, change control, CAPA, deviation management, and training management in compliance with FDA 21 CFR Part 11 and EU Annex 11.

### 2.2 Validation Objectives
- Verify GCP infrastructure provisioned per design specifications
- Confirm database schema deployed correctly
- Validate backup and disaster recovery mechanisms
- Verify audit trail functionality
- Confirm electronic signature compliance with 21 CFR Part 11

### 2.3 Test Environment
- **GCP Project**: coditect-bio-qms-prod
- **Region**: us-central1
- **Database**: Cloud SQL PostgreSQL 14.5
- **Application**: GKE cluster (3 nodes, n1-standard-4)
- **Version**: BIO-QMS v2.3.0
- **Test Period**: 2026-02-10 to 2026-02-16

---

## 3. Test Results Summary

### 3.1 Pass/Fail Statistics

| Test Category | Total | Pass | Fail | Blocked | Pass Rate |
|---------------|-------|------|------|---------|-----------|
| Infrastructure | 28 | 28 | 0 | 0 | 100% |
| Database | 19 | 18 | 1 | 0 | 94.7% |
| Application | 45 | 42 | 2 | 1 | 93.3% |
| Security | 20 | 20 | 0 | 0 | 100% |
| Audit Trail | 15 | 15 | 0 | 0 | 100% |
| **Total** | **127** | **123** | **3** | **1** | **96.9%** |

### 3.2 Deviation Summary

| Deviation ID | Test Case | Severity | Status | CAPA | Closure Date |
|--------------|-----------|----------|--------|------|--------------|
| DEV-2026-0216-001 | IQ-001-TS-042 | Critical | Closed | CAPA-2026-002 | 2026-02-20 |
| DEV-2026-0216-002 | IQ-001-TS-087 | Critical | Closed | CAPA-2026-003 | 2026-02-19 |
| DEV-2026-0216-003 | IQ-001-TS-091 | Critical | Closed | CAPA-2026-004 | 2026-02-21 |
| DEV-2026-0216-004 | IQ-001-TS-103 | Critical | Closed | CAPA-2026-005 | 2026-02-20 |
| DEV-2026-0216-005 | IQ-001-TS-114 | Critical | Closed | CAPA-2026-006 | 2026-02-22 |
| DEV-2026-0216-006 | IQ-001-TS-052 | Major | Closed | CAPA-2026-007 | 2026-02-18 |
| DEV-2026-0216-007 | IQ-001-TS-068 | Major | Closed | CAPA-2026-008 | 2026-02-19 |
| DEV-2026-0216-008 | IQ-001-TS-099 | Major | Closed | CAPA-2026-009 | 2026-02-21 |
| DEV-2026-0216-009 | IQ-001-TS-023 | Minor | Closed | N/A | 2026-02-17 |

**Deviation Closure Rate**: 100% (9/9 closed)

---

## 4. Requirements Traceability Matrix

[Auto-generated from requirements → test cases → results mapping]

| Requirement ID | Requirement Description | Test Case(s) | Result | Deviation |
|----------------|------------------------|--------------|--------|-----------|
| REQ-IQ-001 | GCP project provisioned in us-central1 region | IQ-001-TS-001 | PASS | - |
| REQ-IQ-002 | Cloud SQL PostgreSQL 14.5 instance deployed | IQ-001-TS-005 | PASS | - |
| REQ-IQ-003 | Database schema matches design specification | IQ-001-TS-009 | FAIL | DEV-2026-0216-002 (Closed) |
| REQ-IQ-004 | GKE cluster with 3 nodes, n1-standard-4 | IQ-001-TS-015 | PASS | - |
| REQ-IQ-042 | Audit trail captures user login with IP address | IQ-001-TS-042 | FAIL | DEV-2026-0216-001 (Closed) |
| ... | ... | ... | ... | ... |

**Traceability Coverage**: 100% (all 127 requirements have test cases with results)

---

## 5. Deviation Details

### 5.1 Critical Deviations

#### DEV-2026-0216-001: Audit Trail Missing IP Address

**Test Case**: IQ-001-TS-042 "Verify audit trail captures user login event"
**Expected Result**: Audit trail entry shows login timestamp, user ID, IP address
**Actual Result**: Audit trail entry shows login timestamp, user ID; IP address field blank

**Impact Assessment**:
- Patient Safety: Medium (cannot trace unauthorized access to patient data)
- Data Integrity: High (reduced forensic capability for data modification investigations)
- Regulatory Compliance: High (21 CFR 11.10(e) violation - "device used" not captured)

**Root Cause**: Code review checklist not aligned with 21 CFR Part 11 audit trail requirements

**Corrective Action**: CAPA-2026-002
- Updated authentication service to capture IP address from request headers
- Added IP address parameter to audit logging function
- Updated code review checklist with 21 CFR Part 11 audit trail verification
- Added automated test for audit trail IP address capture

**Re-Test Result**: PASS (2026-02-20)
**Closure Date**: 2026-02-20
**Closure Approved By**: John Doe (Quality Head)

---

[Additional Critical Deviations documented similarly...]

---

### 5.2 Major Deviations

[Documented with less detail than Critical; same structure]

---

### 5.3 Minor Deviations

[Brief summary only]

---

## 6. Risk Assessment

### 6.1 Residual Risks

After corrective actions and re-testing, the following residual risks remain:

| Risk ID | Description | Likelihood | Impact | Mitigation | Acceptance |
|---------|-------------|------------|--------|------------|------------|
| RISK-001 | GCP region outage >4 hours | Low | High | DR site in us-east1 (RTO 4h) | Accepted by Quality Head |
| RISK-002 | Third-party LIMS integration failure | Medium | Medium | Manual data entry fallback procedure | Accepted by Quality Head |

**Overall Residual Risk Level**: LOW (acceptable for production release)

---

## 7. Overall Conclusion

Based on the validation testing results, all critical deviations have been resolved through corrective actions and verified via re-testing. The BIO-QMS Platform version 2.3.0 meets the requirements defined in the User Requirements Specification (URS-BIO-QMS-001) and Functional Specification (FS-BIO-QMS-001).

**Validation Status**: PASS WITH DEVIATIONS (all deviations closed)

**Recommendation**: The system is approved for production use in GxP-regulated environments.

---

## 8. Approval Signatures

### 8.1 Executed By

**Signer**: John Smith
**User ID**: john.smith
**Date/Time**: 2026-02-16T17:00:00Z
**Signature Meaning**: Executed By

Attestation: "I certify that I have executed the test cases documented in this protocol according to written procedures, and that the recorded results and evidence are accurate and complete."

---

### 8.2 Reviewed By

**Signer**: Jane Smith
**User ID**: jane.smith
**Date/Time**: 2026-02-20T14:30:00Z
**Signature Meaning**: Reviewed By

Attestation: "I certify that I have reviewed the test execution results and evidence, verified completeness and compliance with acceptance criteria, and assessed all deviations according to documented procedures."

---

### 8.3 Approved By

**Signer**: John Doe
**User ID**: john.doe
**Date/Time**: 2026-02-22T10:00:00Z
**Signature Meaning**: Approved By

Attestation: "I certify that I have reviewed the validation summary report, verified that all critical deviations are resolved, assessed residual risk as acceptable, and approve this system for production use in accordance with GxP requirements."

---

## 9. Appendices

### Appendix A: Test Execution Evidence
[Links to screenshot archives, log files, data exports]

### Appendix B: Deviation Investigation Reports
[Full root cause analyses and CAPA documentation]

### Appendix C: Re-Test Protocols
[Amendment protocols for deviation re-testing]

### Appendix D: Traceability Matrix (Full)
[Complete requirements → test cases → results mapping]

---

**Document Hash**: SHA-256:f9a8b7c6d5e4f3a2b1c0d9e8f7a6b5c4d3e2f1a0b9c8d7e6f5a4b3c2d1e0f9a8
**Generated**: 2026-02-22T10:15:00Z
**Generator Version**: VSR Auto-Generator v2.1.0

Automated Compilation

Data Sources:

def generate_validation_summary_report(protocol_id: str) -> VSR:
    """
    Auto-generate Validation Summary Report from protocol execution data.
    """
    protocol = get_protocol(protocol_id)
    test_results = get_all_test_results(protocol_id)
    deviations = get_all_deviations(protocol_id)
    signatures = get_all_signatures(protocol_id)
    requirements = get_requirements_traceability(protocol_id)

    # Compute statistics
    passed = len([r for r in test_results if r.status == "PASS"])
    stats = {
        "total_tests": len(test_results),
        "passed": passed,
        "failed": len([r for r in test_results if r.status == "FAIL"]),
        "blocked": len([r for r in test_results if r.status == "BLOCKED"]),
        # Guard against division by zero when no results are recorded
        "pass_rate": passed / len(test_results) * 100 if test_results else 0.0
    }

    deviation_stats = {
        "total": len(deviations),
        "critical": len([d for d in deviations if d.severity == "Critical"]),
        "major": len([d for d in deviations if d.severity == "Major"]),
        "minor": len([d for d in deviations if d.severity == "Minor"]),
        "open": len([d for d in deviations if d.status == "Open"]),
        "closed": len([d for d in deviations if d.status == "Closed"])
    }

    # Determine overall conclusion
    if deviation_stats["open"] > 0:
        conclusion = "FAIL"
        recommendation = "System requires corrective action before approval"
    elif deviation_stats["critical"] > 0 or stats["pass_rate"] < 90:
        conclusion = "CONDITIONAL PASS"
        recommendation = "System approved pending risk assessment"
    else:
        conclusion = "PASS"
        recommendation = "System approved for production use"

    # Generate VSR document
    vsr = VSR(
        protocol_id=protocol_id,
        protocol_version=protocol.version,
        generation_date=now(),
        statistics=stats,
        deviation_summary=deviation_stats,
        conclusion=conclusion,
        recommendation=recommendation,
        requirements_traceability=requirements,
        deviations=deviations,
        signatures=signatures
    )

    return vsr
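The "Traceability Coverage" figure reported in the VSR can be derived from the requirements-to-test-case mapping. A minimal sketch follows; the input shape (dicts with `test_cases` and `results` keys) is an assumption for illustration, not the platform's actual model.

```python
def traceability_coverage(requirements: list) -> float:
    """
    Percentage of requirements that map to at least one test case with a
    recorded result. Each requirement is assumed to be a dict with
    "test_cases" (list of test case IDs) and "results" (list of recorded
    PASS/FAIL/BLOCKED statuses).
    """
    if not requirements:
        return 0.0
    covered = sum(1 for r in requirements if r["test_cases"] and r["results"])
    return covered / len(requirements) * 100
```

A coverage of 100% corresponds to the VSR statement that all requirements have test cases with results; anything lower flags orphan requirements before approval.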

PDF Generation

Digital Signature Embedding:

def generate_vsr_pdf(vsr: VSR, output_path: str):
    """
    Generate PDF with Part 11 compliant digital signatures.

    Uses ReportLab for PDF generation and endesive for digital signatures.
    """
    import io
    import hashlib

    from reportlab.lib.pagesizes import letter
    from reportlab.pdfgen import canvas
    from endesive import pdf as endesive_pdf

    # Step 1: Generate unsigned PDF
    pdf_buffer = io.BytesIO()
    c = canvas.Canvas(pdf_buffer, pagesize=letter)

    # Render VSR content
    render_vsr_header(c, vsr)
    render_executive_summary(c, vsr)
    render_test_results(c, vsr)
    render_deviations(c, vsr)
    render_traceability_matrix(c, vsr)
    render_signatures(c, vsr)  # Includes signature metadata

    c.save()
    pdf_bytes = pdf_buffer.getvalue()

    # Step 2: Apply digital signatures (one per approver)
    for signature in vsr.signatures:
        cert = get_user_certificate(signature.user_id)
        private_key = get_user_private_key(signature.user_id)

        # endesive returns an incremental update that must be appended
        # to the existing PDF bytes
        signed_increment = endesive_pdf.cms.sign(
            pdf_bytes,
            dct={
                "sigflags": 3,
                "contact": signature.user_name,
                "location": "BIO-QMS Platform",
                "reason": signature.meaning,
                "signingdate": signature.timestamp.isoformat()
            },
            key=private_key,
            cert=cert,
            othercerts=[],
            hashalgo="sha256"
        )
        pdf_bytes = pdf_bytes + signed_increment

    # Step 3: Write final signed PDF
    with open(output_path, "wb") as f:
        f.write(pdf_bytes)

    # Step 4: Compute document hash
    doc_hash = hashlib.sha256(pdf_bytes).hexdigest()
    vsr.pdf_hash = doc_hash
    vsr.save()

    return output_path

Digital Signature Verification:

Adobe Reader and other PDF viewers will display signature validity:

  • Green checkmark: Signature valid, document unchanged since signing
  • Red X: Signature invalid or document modified after signing
  • Yellow triangle: Signature valid but certificate untrusted (requires adding BIO-QMS CA to trusted roots)
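Independent of the PDF viewer's signature check, the stored document hash allows a server-side integrity check: recompute SHA-256 over the signed PDF and compare it to the hash recorded with the VSR (`pdf_hash`). A minimal sketch:

```python
import hashlib

def verify_document_hash(pdf_bytes: bytes, recorded_hash: str) -> bool:
    """
    Compare the SHA-256 digest of the signed PDF against the hash stored
    with the VSR record. Any modification after signing changes the digest
    and fails the comparison.
    """
    return hashlib.sha256(pdf_bytes).hexdigest() == recorded_hash
```

This complements, rather than replaces, cryptographic signature verification: it detects tampering with the stored file, while the embedded signatures bind each approver to the content.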

Workflow State Machine

State Definitions

Seven Workflow States:

| State | Description | Entry Criteria | Exit Criteria | SLA |
|-------|-------------|----------------|---------------|-----|
| Draft | Protocol created but not released for execution | Protocol created | Test Executor assigned and notified | N/A |
| Execution | Test cases being executed | Test Executor assigned | All test cases completed + Executed By signature | 3 business days |
| Review | QA reviewing test results and evidence | Executed By signature applied | Reviewed By signature applied OR rejected | 2 business days |
| Correction | Deviations being corrected via CAPA | Reviewer identified Critical/Major deviations | All Critical/Major CAPAs closed + re-test passed | 10 business days |
| Re-Review | QA reviewing corrections and re-test results | Corrections completed | Reviewed By signature applied | 1 business day |
| Approval | Quality Head reviewing VSR | Reviewed By signature applied + all Critical deviations closed | Approved By signature applied OR rejected | 3 business days |
| Released | VSR approved, system released for production | Approved By signature applied | N/A (terminal state) | N/A |

Additional States (Exception Handling):

  • Rejected: Protocol rejected by Reviewer or Approver; returns to Draft for revision
  • Cancelled: Protocol cancelled before completion (e.g., system version deprecated)
  • On Hold: Protocol execution paused (e.g., awaiting environment fix, regulatory guidance)
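The transitions implied by the state definitions can be summarized as an adjacency map. This is an illustrative reconstruction (the guard logic in the Transition Guards subsection is authoritative); On Hold resumption paths are simplified and the map itself is an assumption, not platform code.

```python
# Hypothetical adjacency map of allowed workflow transitions, derived from
# the state table and exception states above.
ALLOWED_TRANSITIONS = {
    "Draft": {"Execution", "Cancelled"},
    "Execution": {"Review", "On Hold", "Cancelled"},
    "Review": {"Approval", "Correction", "Rejected", "On Hold"},
    "Correction": {"Re-Review", "On Hold", "Cancelled"},
    "Re-Review": {"Approval", "Correction", "Rejected"},
    "Approval": {"Released", "Rejected"},
    "Released": set(),       # terminal state
    "Rejected": {"Draft"},   # returns to Draft for revision
}

def is_allowed(from_state: str, to_state: str) -> bool:
    """Structural check only; business guards are enforced separately."""
    return to_state in ALLOWED_TRANSITIONS.get(from_state, set())
```

Separating the structural map from the business guards keeps the "can this edge ever exist" question cheap to answer, while preconditions (signatures, open deviations) are checked per protocol.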

State Transition Rules

State Machine Diagram:

[State machine diagram: Draft → Execution → Review → (Correction → Re-Review) → Approval → Released, with Rejected, Cancelled, and On Hold as exception paths]

Transition Guards

System-Enforced Preconditions:

from typing import Optional, Tuple

class ValidationWorkflowStateMachine:
    """
    State machine for validation protocol workflow.
    """

    def can_transition(self, from_state: str, to_state: str, protocol_id: str) -> Tuple[bool, Optional[str]]:
        """
        Check if state transition is allowed.

        Returns: (allowed, blocking_reason)
        """
        protocol = get_protocol(protocol_id)

        # Draft → Execution
        if from_state == "Draft" and to_state == "Execution":
            if not protocol.test_executor_assigned:
                return (False, "No Test Executor assigned")
            if not protocol.test_cases:
                return (False, "No test cases defined")

        # Execution → Review
        elif from_state == "Execution" and to_state == "Review":
            incomplete = get_incomplete_test_cases(protocol_id)
            if incomplete:
                return (False, f"{len(incomplete)} test cases not completed")

            exec_sig = get_signature(protocol_id, role="Test_Executor")
            if not exec_sig:
                return (False, "No 'Executed By' signature found")

        # Review → Approval (fast path, no deviations)
        elif from_state == "Review" and to_state == "Approval":
            critical_major_deviations = get_open_deviations(
                protocol_id,
                severity=["Critical", "Major"]
            )
            if critical_major_deviations:
                return (False, f"{len(critical_major_deviations)} open Critical/Major deviations - must go to Correction state")

            review_sig = get_signature(protocol_id, role="QA_Reviewer")
            if not review_sig:
                return (False, "No 'Reviewed By' signature found")

        # Review → Correction (deviation path)
        elif from_state == "Review" and to_state == "Correction":
            deviations = get_deviations(protocol_id, severity=["Critical", "Major"])
            if not deviations:
                return (False, "No Critical/Major deviations to correct")

        # Correction → Re-Review
        elif from_state == "Correction" and to_state == "Re-Review":
            open_capas = get_open_capas_for_protocol(protocol_id)
            if open_capas:
                return (False, f"{len(open_capas)} CAPAs still open")

            retest_required = get_retest_required_test_cases(protocol_id)
            retest_completed = get_retest_completed(protocol_id)
            # Every required re-test must be completed; additional regression
            # re-tests beyond the required set do not block the transition
            if not set(retest_required).issubset(set(retest_completed)):
                return (False, "Re-test not completed for all required test cases")

        # Re-Review → Approval
        elif from_state == "Re-Review" and to_state == "Approval":
            open_deviations = get_open_deviations(protocol_id)
            if open_deviations:
                return (False, f"{len(open_deviations)} deviations still open")

        # Approval → Released
        elif from_state == "Approval" and to_state == "Released":
            approval_sig = get_signature(protocol_id, role="Quality_Head")
            if not approval_sig:
                return (False, "No 'Approved By' signature found")

        # Rejection transitions
        elif to_state == "Rejected":
            rejection = get_rejection_record(protocol_id)
            if not rejection or not rejection.reason:
                return (False, "Rejection requires documented reason")

        return (True, None)

    def transition(self, protocol_id: str, to_state: str, user_id: str, reason: Optional[str] = None):
        """
        Execute state transition with audit trail.
        """
        protocol = get_protocol(protocol_id)
        from_state = protocol.workflow_state

        # Check guards
        allowed, blocking_reason = self.can_transition(from_state, to_state, protocol_id)
        if not allowed:
            raise StateTransitionError(f"Cannot transition from {from_state} to {to_state}: {blocking_reason}")

        # Execute transition
        protocol.workflow_state = to_state
        protocol.save()

        # Record in audit trail
        AuditTrail.create(
            entity_type="ValidationProtocol",
            entity_id=protocol_id,
            action="StateTransition",
            old_value=from_state,
            new_value=to_state,
            user_id=user_id,
            timestamp=now(),
            reason=reason
        )

        # Trigger notifications
        self.notify_state_transition(protocol_id, from_state, to_state)

        # Start SLA timer
        self.start_sla_timer(protocol_id, to_state)

Timeout Handling

SLA Monitoring:

| State | SLA | Warning Threshold | Escalation Threshold |
|-------|-----|-------------------|----------------------|
| Execution | 3 business days | 2 days (66%) | 3 days (100%) |
| Review | 2 business days | 1.5 days (75%) | 2 days (100%) |
| Correction | 10 business days | 7 days (70%) | 10 days (100%) |
| Re-Review | 1 business day | 0.5 days (50%) | 1 day (100%) |
| Approval | 3 business days | 2 days (66%) | 3 days (100%) |
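The monitoring code below references an `SLA_CONFIG` mapping. One plausible shape, mirroring the table above, is sketched here; the exact structure is an assumption (warning thresholds are expressed as fractions of the SLA).

```python
# Assumed shape of the SLA_CONFIG mapping consumed by check_sla_compliance().
# sla_days and warning_threshold values mirror the SLA monitoring table.
SLA_CONFIG = {
    "Execution":  {"sla_days": 3,  "warning_threshold": 0.66},
    "Review":     {"sla_days": 2,  "warning_threshold": 0.75},
    "Correction": {"sla_days": 10, "warning_threshold": 0.70},
    "Re-Review":  {"sla_days": 1,  "warning_threshold": 0.50},
    "Approval":   {"sla_days": 3,  "warning_threshold": 0.66},
}
```

Keeping thresholds as fractions lets the monitor compute the warning point as `sla_duration * warning_threshold` without duplicating absolute day counts.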

Escalation Actions:

from datetime import timedelta

def check_sla_compliance(protocol_id: str):
    """
    Monitor SLA compliance and trigger escalations.
    Run every hour via Cloud Scheduler.
    """
    protocol = get_protocol(protocol_id)

    if protocol.workflow_state in ["Released", "Cancelled", "Rejected"]:
        return  # Terminal states, no SLA

    sla_config = SLA_CONFIG[protocol.workflow_state]
    time_in_state = now() - protocol.state_entered_at
    sla_duration = timedelta(days=sla_config["sla_days"])

    # Warning threshold
    if time_in_state >= sla_duration * sla_config["warning_threshold"]:
        notify_assignee(
            protocol_id,
            subject=f"SLA Warning: {protocol.workflow_state} for {protocol.title}",
            body=f"Protocol has been in {protocol.workflow_state} state for {time_in_state.days} days. SLA is {sla_config['sla_days']} days."
        )

    # Escalation threshold
    if time_in_state >= sla_duration:
        escalate_to_manager(
            protocol_id,
            subject=f"SLA BREACH: {protocol.workflow_state} for {protocol.title}",
            body=f"Protocol has exceeded SLA ({sla_config['sla_days']} days). Immediate action required."
        )

        # Record SLA breach in audit trail
        AuditTrail.create(
            entity_type="ValidationProtocol",
            entity_id=protocol_id,
            action="SLA_Breach",
            old_value=str(time_in_state),
            new_value=str(sla_duration),
            user_id="system",
            timestamp=now()
        )

Notification System

Notification Triggers:

| Event | Recipients | Channel | Template |
|-------|------------|---------|----------|
| Protocol assigned to Test Executor | Test Executor | Email + In-App | "You have been assigned validation protocol {protocol_id}" |
| Test execution completed | QA Reviewer | Email + In-App | "Test execution completed for {protocol_id}, ready for review" |
| Deviation classified as Critical | Quality Head + Dev Lead | Email + Slack | "Critical deviation detected in {protocol_id}: {deviation_summary}" |
| Review completed | Quality Head | Email + In-App | "QA review completed for {protocol_id}, ready for approval" |
| VSR approved | Test Executor, QA Reviewer, Project Manager | Email | "Validation approved: {protocol_id} released for production" |
| SLA warning (threshold reached) | Assignee | Email + In-App | "SLA warning: {state} task for {protocol_id} due in {hours} hours" |
| SLA breach (100%) | Assignee + Manager | Email + Slack | "SLA BREACH: {state} task for {protocol_id} overdue by {hours} hours" |

Email Template Example:

Subject: [BIO-QMS] Validation Protocol Assigned: IQ-001

Dear John Smith,

You have been assigned as Test Executor for the following validation protocol:

Protocol ID: IQ-001
Protocol Title: BIO-QMS Platform Installation Qualification
Version: 1.2
System: BIO-QMS v2.3.0
Due Date: 2026-02-19 (3 business days)

Please log into the BIO-QMS platform and begin test execution:
https://bio-qms.coditect.ai/validation/protocols/IQ-001

Test Execution Checklist:
□ Review test cases and acceptance criteria
□ Verify test environment ready (production-like)
□ Execute test steps in sequence
□ Capture screenshots and logs as evidence
□ Document any deviations immediately
□ Apply "Executed By" electronic signature when complete

If you have questions, contact your QA Reviewer: Jane Smith (jane.smith@example.com)

Best regards,
BIO-QMS Validation System

Slack Integration:

def send_slack_notification(channel: str, message: str, severity: str = "info"):
    """
    Send notification to Slack channel.
    """
    import os

    from slack_sdk import WebClient

    client = WebClient(token=os.environ["SLACK_BOT_TOKEN"])

    color_map = {
        "info": "#36a64f",      # green
        "warning": "#ff9800",   # orange
        "critical": "#f44336"   # red
    }

    client.chat_postMessage(
        channel=channel,
        attachments=[
            {
                "color": color_map[severity],
                "text": message,
                "footer": "BIO-QMS Validation System",
                "footer_icon": "https://bio-qms.coditect.ai/static/logo.png",
                "ts": int(now().timestamp())
            }
        ]
    )

Technical Implementation

Database Schema

Tables:

-- Validation protocols
CREATE TABLE validation_protocols (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    -- Uniqueness is enforced on (protocol_number, version) below so a
    -- protocol can exist in multiple versions
    protocol_number VARCHAR(50) NOT NULL,
    version VARCHAR(20) NOT NULL,
    title VARCHAR(255) NOT NULL,
    validation_type VARCHAR(50) NOT NULL,  -- IQ, OQ, PQ, Change Control
    system_name VARCHAR(255) NOT NULL,
    system_version VARCHAR(50),
    workflow_state VARCHAR(50) NOT NULL DEFAULT 'Draft',
    state_entered_at TIMESTAMP NOT NULL DEFAULT NOW(),
    test_executor_id UUID REFERENCES users(id),
    qa_reviewer_id UUID REFERENCES users(id),
    quality_head_id UUID REFERENCES users(id),
    created_at TIMESTAMP NOT NULL DEFAULT NOW(),
    created_by UUID REFERENCES users(id),
    UNIQUE(protocol_number, version)
);

-- Test cases
CREATE TABLE test_cases (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    protocol_id UUID REFERENCES validation_protocols(id) ON DELETE CASCADE,
    test_case_number VARCHAR(50) NOT NULL,
    section VARCHAR(100),
    description TEXT NOT NULL,
    test_steps JSONB NOT NULL,  -- Array of {step_number, action, expected_result}
    acceptance_criteria TEXT,
    status VARCHAR(50) NOT NULL DEFAULT 'Not Started',  -- Not Started, In Progress, PASS, FAIL, BLOCKED
    executed_at TIMESTAMP,
    executed_by UUID REFERENCES users(id),
    UNIQUE(protocol_id, test_case_number)
);

-- Test results
CREATE TABLE test_results (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    test_case_id UUID REFERENCES test_cases(id) ON DELETE CASCADE,
    step_number INT NOT NULL,
    actual_result TEXT NOT NULL,
    evidence_urls TEXT[],  -- Array of screenshot/log/export URLs
    status VARCHAR(50) NOT NULL,  -- PASS, FAIL
    executed_at TIMESTAMP NOT NULL DEFAULT NOW(),
    executed_by UUID REFERENCES users(id),
    UNIQUE(test_case_id, step_number)
);

-- Deviations
CREATE TABLE deviations (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    deviation_number VARCHAR(50) UNIQUE NOT NULL,
    test_case_id UUID REFERENCES test_cases(id),
    test_result_id UUID REFERENCES test_results(id),
    expected_result TEXT NOT NULL,
    actual_result TEXT NOT NULL,
    severity VARCHAR(50),  -- Critical, Major, Minor (set by QA Reviewer)
    status VARCHAR(50) NOT NULL DEFAULT 'Open',  -- Open, Under Investigation, CAPA Created, Closed
    detected_at TIMESTAMP NOT NULL DEFAULT NOW(),
    detected_by UUID REFERENCES users(id),
    classified_at TIMESTAMP,
    classified_by UUID REFERENCES users(id),
    capa_id UUID REFERENCES capas(id),
    root_cause TEXT,
    impact_assessment TEXT,
    closure_rationale TEXT,
    closed_at TIMESTAMP,
    closed_by UUID REFERENCES users(id)
);

-- Electronic signatures
CREATE TABLE electronic_signatures (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    signature_id VARCHAR(100) UNIQUE NOT NULL,
    protocol_id UUID REFERENCES validation_protocols(id),
    user_id UUID REFERENCES users(id) NOT NULL,
    user_name VARCHAR(255) NOT NULL,
    role VARCHAR(50) NOT NULL,  -- Test_Executor, QA_Reviewer, Quality_Head
    meaning VARCHAR(50) NOT NULL,  -- Executed By, Reviewed By, Approved By
    timestamp TIMESTAMP NOT NULL DEFAULT NOW(),
    document_hash VARCHAR(64) NOT NULL,  -- SHA-256 of protocol content
    signature_token_hash VARCHAR(64) NOT NULL,  -- Hash of auth token (non-reversible)
    ip_address INET NOT NULL,
    user_agent TEXT,
    attestation_text TEXT NOT NULL,
    supersedes UUID REFERENCES electronic_signatures(id),  -- For signature amendments
    audit_trail_entry_id UUID REFERENCES audit_trail(id)
);

-- Validation Summary Reports
CREATE TABLE validation_summary_reports (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    protocol_id UUID REFERENCES validation_protocols(id) UNIQUE,
    report_number VARCHAR(50) UNIQUE NOT NULL,
    generation_date TIMESTAMP NOT NULL DEFAULT NOW(),
    overall_conclusion VARCHAR(50) NOT NULL,  -- PASS, CONDITIONAL PASS, FAIL
    recommendation TEXT NOT NULL,
    statistics JSONB NOT NULL,  -- {total_tests, passed, failed, blocked, pass_rate, deviations}
    pdf_path VARCHAR(500),
    pdf_hash VARCHAR(64),  -- SHA-256 of signed PDF
    generated_by VARCHAR(50) DEFAULT 'system'
);

-- SLA tracking
CREATE TABLE sla_tracking (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    protocol_id UUID REFERENCES validation_protocols(id),
    workflow_state VARCHAR(50) NOT NULL,
    entered_at TIMESTAMP NOT NULL,
    exited_at TIMESTAMP,
    sla_days INT NOT NULL,
    warning_sent_at TIMESTAMP,
    escalation_sent_at TIMESTAMP,
    sla_breached BOOLEAN DEFAULT FALSE
);
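The `test_steps` JSONB column stores an array of `{step_number, action, expected_result}` objects. A small application-side validation sketch for that shape (the helper name is illustrative, not part of the platform):

```python
import json

def validate_test_steps(payload: str) -> bool:
    """
    Check that a test_steps JSONB payload matches the documented shape:
    a JSON array of objects each containing step_number, action, and
    expected_result keys.
    """
    steps = json.loads(payload)
    required = {"step_number", "action", "expected_result"}
    return isinstance(steps, list) and all(
        isinstance(s, dict) and required <= s.keys() for s in steps
    )
```

Validating the shape before insert keeps malformed step arrays out of the database, where JSONB would otherwise accept any well-formed JSON.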

API Endpoints

RESTful API:

# Validation Protocol Management
POST /api/v1/validation/protocols # Create protocol
GET /api/v1/validation/protocols/{id} # Get protocol details
PUT /api/v1/validation/protocols/{id} # Update protocol
DELETE /api/v1/validation/protocols/{id} # Delete protocol (Draft only)
GET /api/v1/validation/protocols # List protocols (filterable)

# Workflow State Transitions
POST /api/v1/validation/protocols/{id}/transition # Transition to new state
GET /api/v1/validation/protocols/{id}/workflow # Get workflow history

# Test Case Execution
GET /api/v1/validation/protocols/{id}/test-cases # List test cases
POST /api/v1/validation/test-cases/{id}/execute # Record test execution
POST /api/v1/validation/test-cases/{id}/evidence # Upload evidence

# Deviation Management
POST /api/v1/validation/deviations # Create deviation
GET /api/v1/validation/deviations/{id} # Get deviation details
PUT /api/v1/validation/deviations/{id}/classify # Classify deviation severity
PUT /api/v1/validation/deviations/{id}/close # Close deviation
GET /api/v1/validation/protocols/{id}/deviations # List deviations for protocol

# Electronic Signatures
POST /api/v1/validation/protocols/{id}/sign # Apply electronic signature
GET /api/v1/validation/protocols/{id}/signatures # Get signature history
POST /api/v1/validation/signatures/{id}/verify # Verify signature integrity

# Validation Summary Report
GET /api/v1/validation/protocols/{id}/vsr # Generate VSR (JSON)
GET /api/v1/validation/protocols/{id}/vsr/pdf # Generate VSR (PDF)

Signature API Example:

@app.post("/api/v1/validation/protocols/{protocol_id}/sign")
async def apply_signature(
    protocol_id: str,
    request: SignatureRequest,
    http_request: Request,  # raw FastAPI request, needed for client IP and user agent
    current_user: User = Depends(get_current_user)
):
    """
    Apply electronic signature to validation protocol.

    Request body:
    {
        "password": "user_password",
        "mfa_code": "123456",
        "meaning": "Reviewed By"
    }
    """
    # Step 1: Authenticate for signature
    try:
        signature_token = authenticate_for_signature(
            user_id=current_user.id,
            password=request.password,
            mfa_code=request.mfa_code
        )
    except AuthenticationError as e:
        raise HTTPException(status_code=401, detail=str(e))

    # Step 2: Check if user can apply this signature
    protocol = get_protocol(protocol_id)
    role = get_role_for_meaning(request.meaning)

    allowed, reason = can_apply_signature(protocol_id, role, current_user.id)
    if not allowed:
        raise HTTPException(status_code=403, detail=reason)

    # Step 3: Compute document hash
    doc_hash = compute_document_hash(protocol_id, protocol.version)

    # Step 4: Create signature record
    signature = ElectronicSignature.create(
        signature_id=generate_signature_id(),
        protocol_id=protocol_id,
        user_id=current_user.id,
        user_name=current_user.full_name,
        role=role,
        meaning=request.meaning,
        timestamp=now(),
        document_hash=doc_hash,
        signature_token_hash=hash_token(signature_token.token),
        ip_address=http_request.client.host,
        user_agent=http_request.headers.get("user-agent"),
        attestation_text=ATTESTATION_TEXT[request.meaning]
    )

    # Step 5: Transition workflow state if applicable
    if request.meaning == "Executed By":
        transition_workflow(protocol_id, to_state="Review", user_id=current_user.id)
    elif request.meaning == "Reviewed By":
        # Check if deviations exist
        critical_major_devs = get_open_deviations(protocol_id, severity=["Critical", "Major"])
        next_state = "Correction" if critical_major_devs else "Approval"
        transition_workflow(protocol_id, to_state=next_state, user_id=current_user.id)
    elif request.meaning == "Approved By":
        transition_workflow(protocol_id, to_state="Released", user_id=current_user.id)
        # Generate final VSR PDF
        generate_vsr_pdf_async.delay(protocol_id)

    return {
        "signature_id": signature.signature_id,
        "timestamp": signature.timestamp,
        "workflow_state": get_protocol(protocol_id).workflow_state
    }
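Steps 2 and 5 of the endpoint above hinge on mapping a signature meaning to its signer role and to the workflow state entered after signing. The routing can be isolated as a pure function; this is a sketch following the three-role hierarchy in this document, and the helper names are illustrative rather than the platform's actual API.

```python
# Meanings map 1:1 to signer roles per the three-role hierarchy.
MEANING_TO_ROLE = {
    "Executed By": "Test_Executor",
    "Reviewed By": "QA_Reviewer",
    "Approved By": "Quality_Head",
}

def next_state_after_signature(meaning: str, open_critical_major: int) -> str:
    """
    State entered once the signature is applied: review after execution,
    the deviation branch after review, release after approval.
    """
    if meaning == "Executed By":
        return "Review"
    if meaning == "Reviewed By":
        return "Correction" if open_critical_major > 0 else "Approval"
    if meaning == "Approved By":
        return "Released"
    raise ValueError(f"Unknown signature meaning: {meaning}")
```

Keeping the routing pure makes the deviation branch (Correction vs. the fast path to Approval) unit-testable without a database.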

Audit Trail Requirements

Audit Trail Content

Per 21 CFR 11.10(e), audit trails must include:

  1. Date and time: When action performed (NTP-synchronized)
  2. User identification: Who performed action (user ID + name)
  3. Action: What was done (StateTransition, SignatureApplied, DeviationClassified, etc.)
  4. Old and new values: What changed (from "Execution" to "Review")
  5. Reason for change: Why action taken (required for signatures, deviations, state transitions)
  6. Device used: IP address, user agent

Audit Trail Table:

CREATE TABLE audit_trail (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    timestamp TIMESTAMP NOT NULL DEFAULT NOW(),
    user_id UUID REFERENCES users(id) NOT NULL,
    user_name VARCHAR(255) NOT NULL,
    action VARCHAR(100) NOT NULL,
    entity_type VARCHAR(100) NOT NULL,  -- ValidationProtocol, TestCase, Deviation, Signature
    entity_id UUID NOT NULL,
    old_value TEXT,
    new_value TEXT,
    reason TEXT,
    ip_address INET NOT NULL,
    user_agent TEXT,
    session_id VARCHAR(100)
);

CREATE INDEX idx_audit_trail_timestamp ON audit_trail(timestamp DESC);
CREATE INDEX idx_audit_trail_entity ON audit_trail(entity_type, entity_id);
CREATE INDEX idx_audit_trail_user ON audit_trail(user_id);

Audit Trail Entry Example:

{
    "id": "a1b2c3d4-e5f6-7890-abcd-ef1234567890",
    "timestamp": "2026-02-16T14:23:17.483Z",
    "user_id": "jane.smith",
    "user_name": "Jane Smith",
    "action": "SignatureApplied",
    "entity_type": "ValidationProtocol",
    "entity_id": "IQ-001",
    "old_value": null,
    "new_value": "Reviewed By",
    "reason": "Test execution results reviewed and found complete with 5 deviations classified",
    "ip_address": "203.0.113.42",
    "user_agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36",
    "session_id": "sess_a1b2c3d4e5f67890"
}

Audit Trail Immutability

Protection Mechanisms:

  1. Insert-Only Table: No UPDATE or DELETE permissions granted (even to DBAs)
  2. Database Triggers: Block any UPDATE/DELETE attempts
  3. Hash Chain: Each audit entry includes hash of previous entry (tamper detection)
  4. Periodic Export: Daily export to GCS bucket with object versioning and retention lock
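The daily export (mechanism 4) can be sketched as follows. This is a minimal illustration: the serialization and manifest-hash step is shown concretely, while the upload target (bucket name, object path, versioning and retention-lock configuration) is an assumption noted in the comments, not the platform's actual deployment.

```python
import hashlib
import json

def export_audit_batch(entries: list) -> tuple:
    """Serialize one day's audit entries to newline-delimited JSON and
    compute a SHA-256 manifest hash to record alongside the export, so
    the exported copy is itself tamper-evident."""
    payload = "\n".join(
        json.dumps(e, sort_keys=True) for e in entries
    ).encode("utf-8")
    return payload, hashlib.sha256(payload).hexdigest()

# Upload step (sketch; bucket and object naming are assumptions): with
# google-cloud-storage, the payload would be written to a bucket that has
# object versioning and a retention policy lock enabled, e.g.
#   bucket.blob(f"audit/{export_date}.jsonl").upload_from_string(payload)
```

Recording the manifest hash in a separate system of record lets an inspector confirm the GCS copy matches what was exported.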

Hash Chain Implementation:

-- digest() requires the pgcrypto extension:
-- CREATE EXTENSION IF NOT EXISTS pgcrypto;

-- Add hash columns
ALTER TABLE audit_trail ADD COLUMN entry_hash VARCHAR(64);
ALTER TABLE audit_trail ADD COLUMN previous_hash VARCHAR(64);

-- Trigger to compute hash chain
CREATE OR REPLACE FUNCTION compute_audit_trail_hash()
RETURNS TRIGGER AS $$
DECLARE
    prev_hash VARCHAR(64);
BEGIN
    -- Get hash of previous entry. Concurrent inserts must be serialized
    -- (e.g. via an advisory lock) so the chain always has a single tip.
    SELECT entry_hash INTO prev_hash
    FROM audit_trail
    ORDER BY timestamp DESC
    LIMIT 1;

    -- Compute hash of current entry
    NEW.previous_hash := COALESCE(prev_hash, '0000000000000000000000000000000000000000000000000000000000000000');
    NEW.entry_hash := encode(
        digest(
            NEW.id::text || NEW.timestamp::text || NEW.user_id::text ||
            NEW.action || NEW.entity_type || NEW.entity_id::text ||
            COALESCE(NEW.old_value, '') || COALESCE(NEW.new_value, '') ||
            NEW.previous_hash,
            'sha256'
        ),
        'hex'
    );

    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

Hash Chain Verification:

import hashlib

def verify_audit_trail_integrity() -> AuditTrailVerificationResult:
    """
    Verify the audit trail hash chain has not been tampered with.
    Run daily via Cloud Scheduler.

    Note: the string serialization below must match the database
    trigger byte-for-byte (including the text formatting of UUIDs and
    timestamps), or verification will report false tampering.
    """
    entries = AuditTrail.query.order_by(AuditTrail.timestamp).all()

    for i, entry in enumerate(entries):
        # Recompute hash
        computed_hash = hashlib.sha256(
            f"{entry.id}{entry.timestamp}{entry.user_id}"
            f"{entry.action}{entry.entity_type}{entry.entity_id}"
            f"{entry.old_value or ''}{entry.new_value or ''}"
            f"{entry.previous_hash}".encode()
        ).hexdigest()

        # Compare with stored hash
        if computed_hash != entry.entry_hash:
            return AuditTrailVerificationResult(
                valid=False,
                tampered_entry_id=entry.id,
                tampered_at_index=i
            )

        # Verify chain linkage
        if i > 0 and entry.previous_hash != entries[i - 1].entry_hash:
            return AuditTrailVerificationResult(
                valid=False,
                broken_chain_at_index=i
            )

    return AuditTrailVerificationResult(valid=True)
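The chaining and verification logic can be exercised in isolation, independent of the database. This self-contained sketch (field values are placeholders) shows that altering any field of any entry breaks verification:

```python
import hashlib

GENESIS = "0" * 64  # previous_hash of the first entry

def entry_hash(entry: dict, previous_hash: str) -> str:
    # Same field order as the database trigger; serialization must match
    # byte-for-byte or verification reports false tampering.
    material = (
        f"{entry['id']}{entry['timestamp']}{entry['user_id']}"
        f"{entry['action']}{entry['entity_type']}{entry['entity_id']}"
        f"{entry.get('old_value') or ''}{entry.get('new_value') or ''}"
        f"{previous_hash}"
    )
    return hashlib.sha256(material.encode()).hexdigest()

def chain(entries: list) -> list:
    """Link entries: each entry stores the previous entry's hash."""
    prev = GENESIS
    for e in entries:
        e["previous_hash"] = prev
        e["entry_hash"] = entry_hash(e, prev)
        prev = e["entry_hash"]
    return entries

def verify(entries: list) -> bool:
    """Recompute every hash and check the chain linkage."""
    prev = GENESIS
    for e in entries:
        if e["previous_hash"] != prev or entry_hash(e, prev) != e["entry_hash"]:
            return False
        prev = e["entry_hash"]
    return True

entries = chain([
    {"id": "1", "timestamp": "t1", "user_id": "u1", "action": "A",
     "entity_type": "P", "entity_id": "e1"},
    {"id": "2", "timestamp": "t2", "user_id": "u1", "action": "B",
     "entity_type": "P", "entity_id": "e1"},
])
assert verify(entries)
entries[0]["action"] = "X"   # tamper with the first entry
assert not verify(entries)   # detected: entry 0's hash no longer matches
```

Because each entry's hash feeds into the next, tampering with an early entry invalidates every subsequent link, which is what makes deletion or back-dating detectable.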

Appendices

Appendix A: Glossary

| Term | Definition |
|------|------------|
| 21 CFR Part 11 | FDA regulation governing electronic records and electronic signatures |
| CAPA | Corrective Action / Preventive Action |
| GAMP 5 | Good Automated Manufacturing Practice guide for validation of computerized systems |
| GxP | Good Practices (GMP, GLP, GCP, GDP) - regulatory standards for pharma/biotech |
| IQ | Installation Qualification - verifies system installed correctly per design |
| OQ | Operational Qualification - verifies system operates correctly across full range |
| PQ | Performance Qualification - verifies system performs correctly in production environment |
| SLA | Service Level Agreement - time limit for completing an activity |
| VSR | Validation Summary Report - document summarizing validation results |

Appendix B: Reference Documents

| Document ID | Title | Location |
|-------------|-------|----------|
| URS-BIO-QMS-001 | User Requirements Specification | docs/validation/urs/ |
| FS-BIO-QMS-001 | Functional Specification | docs/validation/fs/ |
| DS-BIO-QMS-001 | Design Specification | docs/validation/ds/ |
| IQ-001 | Installation Qualification Protocol | docs/validation/protocols/IQ-001.md |
| OQ-001 | Operational Qualification Protocol | docs/validation/protocols/OQ-001.md |
| PQ-001 | Performance Qualification Protocol | docs/validation/protocols/PQ-001.md |
| SOP-VAL-001 | Validation SOP | docs/sops/SOP-VAL-001-validation-execution.md |
| SOP-ESIG-001 | Electronic Signature SOP | docs/sops/SOP-ESIG-001-electronic-signatures.md |

Appendix C: Regulatory References

FDA Guidance:

  • Guidance for Industry: Part 11, Electronic Records; Electronic Signatures — Scope and Application (August 2003)
  • General Principles of Software Validation; Final Guidance for Industry and FDA Staff (January 2002)

EU Guidance:

  • EudraLex Volume 4, Annex 11: Computerized Systems (June 2011)

Industry Standards:

  • ISPE GAMP 5: A Risk-Based Approach to Compliant GxP Computerized Systems (Second Edition, 2022)
  • PIC/S Good Practices for Computerized Systems in Regulated GXP Environments (PI 011-3, September 2007)

ISO Standards:

  • ISO 9001:2015 - Quality Management Systems
  • ISO/IEC 27001:2022 - Information Security Management

Appendix D: Revision History

| Version | Date | Author | Changes |
|---------|------|--------|---------|
| 1.0 | 2026-02-16 | Claude (Sonnet 4.5) | Initial creation - comprehensive validation workflow for BIO-QMS platform |

END OF DOCUMENT

Document Hash: SHA-256: [computed upon finalization]
Generated By: CODITECT Compliance Framework Specialist
Generation Date: 2026-02-16
Classification: Compliance Framework - Internal Use
Retention: Permanent (GxP record)