
Tenant Data Export and Migration

Document ID: CODITECT-BIO-TDEM-001
Version: 1.0.0
Effective Date: 2026-02-16
Classification: Internal - Restricted
Owner: Chief Data Officer (CDO)


Document Control

Approval History

Role | Name | Signature | Date
Chief Data Officer | [Pending] | [Digital Signature] | YYYY-MM-DD
Chief Information Security Officer | [Pending] | [Digital Signature] | YYYY-MM-DD
VP Quality Assurance | [Pending] | [Digital Signature] | YYYY-MM-DD
VP Engineering | [Pending] | [Digital Signature] | YYYY-MM-DD
Regulatory Affairs Director | [Pending] | [Digital Signature] | YYYY-MM-DD

Revision History

Version | Date | Author | Changes | Approval Status
1.0.0 | 2026-02-16 | CDO Office | Initial release | Draft

Distribution List

  • Executive Leadership Team
  • Information Security Team
  • Quality Assurance Team
  • Engineering Leadership
  • Compliance and Regulatory Affairs
  • Customer Success Team
  • Legal Department
  • Internal Audit

Review Schedule

Review Type | Frequency | Next Review Date | Responsible Party
Annual Review | 12 months | 2027-02-16 | CDO
Regulatory Update Review | As needed | N/A | Regulatory Affairs
Post-Migration Review | After each M&A | N/A | Data Engineering Lead
Export Format Review | Quarterly | 2026-05-16 | Technical Architect

1. Purpose and Scope

1.1 Purpose

This Tenant Data Export and Migration specification establishes comprehensive requirements and procedures for:

  1. Data Portability - GDPR Article 20 compliant machine-readable exports
  2. Regulatory Compliance - FDA Part 11 human-readable retrieval requirements
  3. Tenant Migration - Complete data transfer for M&A scenarios
  4. Business Continuity - Pre-deletion export and disaster recovery
  5. Audit Trail Integrity - Tamper-evident export packages with cryptographic verification

1.2 Scope

This specification covers:

In Scope:

  • Complete tenant data extraction across all system domains
  • Export format specifications (JSON, CSV, XML, PDF)
  • Export integrity verification and cryptographic signing
  • Tenant-to-tenant migration workflows
  • User mapping and conflict resolution
  • Migration rollback procedures
  • Export API specifications and TypeScript interfaces
  • Compliance validation and audit logging

Out of Scope:

  • Individual user data subject access requests (covered in GDPR-DSAR-001)
  • System-wide backup and disaster recovery (covered in DR-PLAN-001)
  • Real-time data replication between environments
  • Third-party system integrations (covered in INT-SPEC-001)

1.3 Regulatory Requirements

Regulation | Requirement | Implementation
GDPR Article 20 | Right to data portability in a structured, commonly used, machine-readable format | JSON export with full schema documentation
FDA 21 CFR Part 11 §11.10(b) | Records retrievable in both human-readable and electronic form | PDF + JSON/CSV dual format
HIPAA §164.308(a)(7)(ii)(A) | Data backup plan with retrievable exact copies | SHA-256 integrity verification
SOC 2 CC6.5 | Data retention and disposal policies | Export before deletion with 90-day retention

2. Export Architecture

2.1 Export System Components

/**
 * Core export system architecture
 * Reference: D.6.4 Tenant Data Export
 */

// Export job orchestration
interface ExportJob {
  id: string; // cuid
  tenantId: string; // Source tenant
  status: ExportJobStatus;
  type: ExportType;
  format: ExportFormat[]; // Multiple formats in one job
  scope: ExportScope;

  // Scheduling
  triggeredBy: string; // User ID or 'system'
  triggeredAt: Date;
  startedAt?: Date;
  completedAt?: Date;

  // Progress tracking
  totalEntities: number;
  processedEntities: number;
  progressPercent: number;

  // Results
  exportPackageUrl?: string; // GCS signed URL
  manifestHash: string; // SHA-256 of manifest.json
  packageSize: number; // Bytes

  // Integrity
  integrityVerified: boolean;
  verificationReport?: IntegrityReport;

  // Retention
  expiresAt: Date; // Auto-delete after 90 days
  downloadCount: number;
  lastDownloadedAt?: Date;
}

enum ExportJobStatus {
  QUEUED = 'QUEUED',
  VALIDATING = 'VALIDATING',
  EXTRACTING = 'EXTRACTING',
  PACKAGING = 'PACKAGING',
  VERIFYING = 'VERIFYING',
  COMPLETED = 'COMPLETED',
  FAILED = 'FAILED',
  EXPIRED = 'EXPIRED'
}

enum ExportType {
  FULL = 'FULL', // All tenant data
  INCREMENTAL = 'INCREMENTAL', // Since last export
  SELECTIVE = 'SELECTIVE', // Specific domains
  PRE_DELETION = 'PRE_DELETION' // Automatic before tenant deletion
}

enum ExportFormat {
  JSON = 'JSON', // Structured with relationships
  CSV = 'CSV', // Flat per-table
  XML = 'XML', // HL7 FHIR or standard
  PDF = 'PDF' // Human-readable archive
}

interface ExportScope {
  domains: ExportDomain[]; // Which data domains to include
  dateRange?: {
    start: Date;
    end: Date;
  };
  includeDeleted: boolean; // Soft-deleted records
  includeAttachments: boolean; // Binary files
  includeAuditTrail: boolean; // Full audit history
}

enum ExportDomain {
  CORE = 'CORE', // Organizations, users, roles
  QMS = 'QMS', // Documents, CAPAs, deviations
  WORKFLOW = 'WORKFLOW', // State machines, transitions
  COMPLIANCE = 'COMPLIANCE', // Signatures, audit trails
  INTEGRATION = 'INTEGRATION', // Webhooks, API configs
  ATTACHMENTS = 'ATTACHMENTS', // Uploaded files
  CONFIGURATION = 'CONFIGURATION', // Templates, custom fields
  ANALYTICS = 'ANALYTICS' // Reports, dashboards
}

2.2 Export Package Structure

export-{tenantId}-{timestamp}.zip
├── manifest.json                  # Export metadata + file hashes
├── schema/
│   ├── json-schema.json           # JSON export schema definition
│   ├── csv-schema.json            # CSV column definitions
│   └── relationships.json         # Foreign key mappings
├── data/
│   ├── json/
│   │   ├── core.json              # Nested JSON with relationships
│   │   ├── qms.json
│   │   ├── workflow.json
│   │   ├── compliance.json
│   │   └── integration.json
│   ├── csv/
│   │   ├── organizations.csv
│   │   ├── users.csv
│   │   ├── roles.csv
│   │   ├── work_orders.csv
│   │   ├── approvals.csv
│   │   ├── signatures.csv
│   │   ├── audit_trail.csv
│   │   └── ... (one per table)
│   ├── xml/
│   │   ├── fhir-bundle.xml        # HL7 FHIR for clinical data
│   │   └── standard.xml           # Generic XML
│   └── pdf/
│       ├── tenant-overview.pdf    # Human-readable summary
│       ├── data-dictionary.pdf    # Schema documentation
│       └── documents/             # Individual QMS documents as PDF
├── attachments/
│   └── {entityType}/
│       └── {entityId}/
│           └── (unknown)          # Original uploaded files
├── integrity/
│   ├── checksums.sha256           # SHA-256 per file
│   ├── signature.p7s              # PKCS#7 detached signature
│   └── verification-report.json   # Integrity check results
└── metadata/
    ├── export-job.json            # Export job details
    ├── tenant-info.json           # Tenant configuration
    └── statistics.json            # Row counts, sizes
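The `integrity/checksums.sha256` entry can be produced by walking the staging directory before zipping. A minimal Node.js sketch, assuming a `packageDir` laid out as above; the `listFiles` and `buildChecksums` names are illustrative, not part of the platform API:

```typescript
import * as crypto from 'node:crypto';
import * as fs from 'node:fs';
import * as path from 'node:path';

// Recursively list files relative to `root`, skipping the integrity/
// folder so the checksum file does not attempt to include itself.
function listFiles(root: string, dir: string = root): string[] {
  return fs.readdirSync(dir, { withFileTypes: true }).flatMap((entry) => {
    const full = path.join(dir, entry.name);
    if (entry.isDirectory()) {
      return path.relative(root, full) === 'integrity' ? [] : listFiles(root, full);
    }
    return [path.relative(root, full)];
  });
}

// Emit the conventional `sha256sum` format: "<hex digest>  <relative path>"
// per line, sorted for deterministic output.
function buildChecksums(packageDir: string): string {
  return listFiles(packageDir)
    .sort()
    .map((rel) => {
      const digest = crypto
        .createHash('sha256')
        .update(fs.readFileSync(path.join(packageDir, rel)))
        .digest('hex');
      return `${digest}  ${rel}`;
    })
    .join('\n');
}
```

The output format is compatible with `sha256sum -c`, so a recipient can verify the package with standard tooling.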

2.3 Manifest File Schema

/**
 * Export package manifest
 * Provides inventory and integrity verification
 */
interface ExportManifest {
  version: string; // Manifest schema version (1.0.0)
  generatedAt: Date;
  tenantId: string;
  tenantName: string;

  exportJob: {
    id: string;
    type: ExportType;
    format: ExportFormat[];
    scope: ExportScope;
    triggeredBy: string;
  };

  // Data inventory
  entities: EntityInventory[];

  // File inventory with hashes
  files: FileInventory[];

  // Integrity verification
  integrity: {
    manifestHash: string; // SHA-256 of this manifest (excluding this field)
    packageHash: string; // SHA-256 of entire zip
    signatureAlgorithm: string; // 'ECDSA-SHA256'
    publicKeyId: string; // For signature verification
    timestampToken?: string; // RFC 3161
  };

  // Compliance metadata
  compliance: {
    gdprArticle20Compliant: boolean;
    fdaPart11Compliant: boolean;
    hipaaCompliant: boolean;
    retentionPolicy: string;
    expiresAt: Date;
  };
}

interface EntityInventory {
  entityType: string; // e.g., 'work_orders'
  rowCount: number; // Total records exported
  deletedCount: number; // Soft-deleted records included
  dateRange: {
    earliest: Date;
    latest: Date;
  };
  sizeBytes: number;
}

interface FileInventory {
  path: string; // Relative path in zip
  sizeBytes: number;
  sha256: string;
  contentType: string;
  encoding?: string;
}
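Because `integrity.manifestHash` covers the manifest that contains it, it must be computed over a canonical serialization with that field blanked out; verification repeats the same procedure and compares digests. A minimal sketch; the `computeManifestHash` name and the sort-keys canonicalization are illustrative conventions, not mandated by this spec:

```typescript
import * as crypto from 'node:crypto';

// Recursively sort object keys so serialization is deterministic
// regardless of property insertion order.
function canonicalize(value: any): any {
  if (value && typeof value === 'object' && !Array.isArray(value)) {
    const sorted: any = {};
    for (const key of Object.keys(value).sort()) {
      sorted[key] = canonicalize(value[key]);
    }
    return sorted;
  }
  return value;
}

// Hash the manifest with integrity.manifestHash blanked, so the stored
// digest can later be recomputed and compared byte-for-byte.
function computeManifestHash(manifest: any): string {
  const copy = JSON.parse(JSON.stringify(manifest)); // Deep copy; Dates must already be ISO strings
  if (copy.integrity) copy.integrity.manifestHash = '';
  return crypto
    .createHash('sha256')
    .update(JSON.stringify(canonicalize(copy)))
    .digest('hex');
}
```

Any deterministic canonicalization works, as long as exporter and verifier agree on it.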

3. Export Formats

3.1 JSON Export Format

Purpose: Machine-readable, structured export preserving relationships

Schema: JSON Schema Draft 2020-12 compliant

/**
 * JSON export preserves nested relationships
 * UDOM-compatible for document content
 */
interface JSONExport {
  metadata: {
    version: string; // Export schema version
    tenantId: string;
    exportedAt: Date;
    scope: ExportScope;
  };

  data: {
    core: CoreData;
    qms: QMSData;
    workflow: WorkflowData;
    compliance: ComplianceData;
    integration: IntegrationData;
    configuration: ConfigurationData;
    analytics: AnalyticsData;
  };
}

// Core domain export
interface CoreData {
  organizations: Organization[];
  users: User[];
  roles: Role[];
  permissions: Permission[];
  teams: Team[];

  // Nested relationships preserved
  userRoles: Array<{
    userId: string;
    roleId: string;
    organizationId: string;
    assignedAt: Date;
    assignedBy: string;
  }>;
}

interface User {
  id: string;
  email: string;
  name: string;
  active: boolean;
  department?: string;

  // Embedded relationships
  roles: Array<{
    roleId: string;
    roleName: string;
    organizationId: string;
  }>;

  experiences: Array<{
    experienceId: string;
    experienceName: string;
    rating: number;
    certifiedAt?: Date;
    expiresAt?: Date;
  }>;

  // Audit metadata
  createdAt: Date;
  createdBy: string;
  updatedAt: Date;
  updatedBy: string;
  version: number;
}

// QMS domain export
interface QMSData {
  workOrders: WorkOrder[];
  jobPlans: JobPlan[];
  changeItems: ChangeItem[];
  approvals: Approval[];

  // Registry entities
  parties: Party[];
  persons: Person[];
  vendors: Vendor[];
  assets: Asset[];
  tools: Tool[];
  materials: Material[];
  experiences: Experience[];

  // Scheduling
  schedules: Schedule[];
  timeEntries: TimeEntry[];
}

interface WorkOrder {
  id: string;
  tenantId: string;

  // Hierarchy with resolved references
  masterId?: string;
  masterWorkOrder?: {
    id: string;
    summary: string;
  };
  children: Array<{
    id: string;
    summary: string;
    sequenceOrder: number;
  }>;

  // Full nested data
  sourceType: string;
  originator: Party; // Embedded party data
  item: ChangeItem; // Embedded change item
  assigner: Party;
  assignee: Party;

  summary: string;
  detail: string;

  jobPlan?: JobPlan; // Embedded if present
  schedule?: Schedule;

  priority: number;
  status: string;
  regulatoryFlag: boolean;
  complianceClass?: string;
  riskLevel?: string;

  // Related entities
  approvals: Approval[];
  timeEntries: TimeEntry[];
  minRequirements: WorkOrderMinimumRequirement[];
  dependencies: WorkOrderDependency[];
  auditEvents: AuditTrailEntry[];

  // Versioning
  createdAt: Date;
  updatedAt: Date;
  version: number;
}

// Workflow domain export
interface WorkflowData {
  stateMachines: StateMachine[];
  states: State[];
  transitions: Transition[];
  transitionHistory: TransitionHistory[];
  workflowTemplates: WorkflowTemplate[];
}

// Compliance domain export
interface ComplianceData {
  electronicSignatures: ElectronicSignature[];
  auditTrail: AuditTrailEntry[];
  trainingRecords: TrainingRecord[];
  validationProtocols: ValidationProtocol[];
  changeControls: ChangeControl[];
}

interface ElectronicSignature {
  id: string;
  tenantId: string;
  signer: {
    id: string;
    name: string;
    email: string;
  };
  meaning: string;
  signedAt: Date;
  reason?: string;
  authMethod: string;
  sessionId?: string;
  ipAddress?: string;

  // Cryptographic binding
  payloadHash?: string;
  signatureBytes?: string; // Base64 ECDSA
  keyId?: string;

  // Single-use enforcement
  consumed: boolean;
  consumedAt?: Date;
  consumedBy?: string;

  // Timestamping
  timestampToken?: string;
  timestampSource?: string;

  // Associated approvals
  approvals: Array<{
    id: string;
    workOrderId: string;
    role: string;
    decision: string;
  }>;
}

interface AuditTrailEntry {
  id: string;
  tenantId: string;
  entityType: string;
  entityId: string;
  action: string;

  performer: {
    id: string;
    name: string;
    email: string;
  };
  performerType: 'HUMAN' | 'AGENT' | 'SYSTEM';
  agentSessionId?: string;
  correlationId?: string;

  performedAt: Date;
  previousVal?: any; // JSON diff
  newVal?: any;

  // Hash chain integrity
  entryHash?: string;
  previousHash?: string;
  sequence?: number;
}

// Integration domain export
interface IntegrationData {
  webhooks: Webhook[];
  apiKeys: APIKey[]; // Redacted
  externalSystems: ExternalSystem[];
  integrationMappings: IntegrationMapping[];
}

interface APIKey {
  id: string;
  name: string;
  keyPrefix: string; // First 8 chars only
  keyHash: string; // SHA-256 of full key (not exportable)
  redacted: true; // Always true in exports
  scopes: string[];
  createdAt: Date;
  expiresAt?: Date;
  lastUsedAt?: Date;

  // NEVER export the actual key
  note: 'API key value not included in export for security. Regenerate after import.';
}

// Configuration domain export
interface ConfigurationData {
  workflowTemplates: WorkflowTemplate[];
  documentCategories: DocumentCategory[];
  customFields: CustomField[];
  organizationSettings: OrganizationSettings;
  notificationTemplates: NotificationTemplate[];
}

// Analytics domain export
interface AnalyticsData {
  savedReports: SavedReport[];
  dashboards: Dashboard[];
  savedQueries: SavedQuery[];
  kpiDefinitions: KPIDefinition[];
}

3.2 CSV Export Format

Purpose: Excel-compatible flat export with foreign key references

Encoding: UTF-8 with BOM for Excel compatibility

Schema Documentation: schema/csv-schema.json describes all columns

/**
 * CSV export format specifications
 * One file per table, foreign keys as string IDs
 */

// organizations.csv
interface OrganizationCSV {
  id: string;
  tenant_id: string;
  name: string;
  type: string;
  active: boolean;
  created_at: string; // ISO 8601
  updated_at: string;
  version: number;
}

// users.csv
interface UserCSV {
  id: string;
  tenant_id: string;
  email: string;
  name: string;
  active: boolean;
  department: string;
  created_at: string;
  created_by: string; // User ID
  updated_at: string;
  updated_by: string;
}

// work_orders.csv
interface WorkOrderCSV {
  id: string;
  tenant_id: string;
  master_id: string; // Foreign key to work_orders.id
  sequence_order: number;
  source_type: string;
  originator_id: string; // Foreign key to parties.id
  item_id: string; // Foreign key to change_items.id
  summary: string;
  detail: string;
  assigner_id: string; // Foreign key to parties.id
  assignee_id: string; // Foreign key to parties.id
  job_plan_id: string; // Foreign key to job_plans.id
  schedule_id: string; // Foreign key to schedules.id
  priority: number;
  status: string;
  regulatory_flag: boolean;
  compliance_class: string;
  risk_level: string;
  created_at: string;
  updated_at: string;
  version: number;
}

// audit_trail.csv
interface AuditTrailCSV {
  id: string;
  tenant_id: string;
  entity_type: string;
  entity_id: string;
  action: string;
  performed_by: string; // Foreign key to persons.id
  performer_type: string;
  agent_session_id: string;
  correlation_id: string;
  performed_at: string;
  previous_val: string; // JSON.stringify
  new_val: string; // JSON.stringify
  entry_hash: string;
  previous_hash: string;
  sequence: number;
  work_order_id: string; // Foreign key to work_orders.id
}

// CSV export rules
const CSV_EXPORT_RULES = {
  encoding: 'UTF-8',
  bom: true, // Byte Order Mark for Excel
  delimiter: ',',
  quoteChar: '"',
  escapeChar: '"',
  lineTerminator: '\r\n', // Windows CRLF for Excel
  header: true,

  // Null handling
  nullValue: '', // Empty string for NULL

  // Boolean handling
  booleanTrue: 'true',
  booleanFalse: 'false',

  // Date handling
  dateFormat: 'ISO 8601', // YYYY-MM-DDTHH:mm:ss.sssZ

  // JSON handling
  jsonFields: ['previous_val', 'new_val'], // Stringify these fields

  // Array handling
  arrayDelimiter: ';' // For array fields, join with semicolon
};
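The rules above can be exercised with a small field serializer. A hedged sketch, assuming the RFC 4180-style quoting implied by `quoteChar`/`escapeChar`; the `csvField` and `csvRow` names are illustrative:

```typescript
// Serialize one value per CSV_EXPORT_RULES: empty string for null,
// lowercase booleans, ISO 8601 dates, arrays joined with ';', and
// RFC 4180 quoting where '"' is escaped by doubling it.
function csvField(value: unknown): string {
  let text: string;
  if (value === null || value === undefined) text = '';
  else if (typeof value === 'boolean') text = value ? 'true' : 'false';
  else if (value instanceof Date) text = value.toISOString();
  else if (Array.isArray(value)) text = value.map((v) => String(v)).join(';');
  else text = String(value);

  // Quote only when the field contains the delimiter, a quote, or a line break.
  if (/[",\r\n]/.test(text)) {
    text = `"${text.replace(/"/g, '""')}"`;
  }
  return text;
}

// A full row uses the ',' delimiter and the CRLF line terminator.
function csvRow(values: unknown[]): string {
  return values.map(csvField).join(',') + '\r\n';
}
```

The UTF-8 BOM would be written once at the start of each file, not per row.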

3.3 XML Export Format

Purpose: HL7 FHIR for clinical data, standard XML for business data

<?xml version="1.0" encoding="UTF-8"?>
<TenantExport xmlns="https://coditect.ai/schemas/bio-qms/export/v1"
              xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
              xsi:schemaLocation="https://coditect.ai/schemas/bio-qms/export/v1 export-schema.xsd">

  <Metadata>
    <Version>1.0.0</Version>
    <TenantId>cuid_12345</TenantId>
    <TenantName>Acme Pharmaceuticals</TenantName>
    <ExportedAt>2026-02-16T14:30:00Z</ExportedAt>
    <ExportJobId>cuid_export_001</ExportJobId>
    <ExportType>FULL</ExportType>
  </Metadata>

  <Core>
    <Organizations>
      <Organization>
        <Id>org_001</Id>
        <Name>Acme Pharmaceuticals</Name>
        <Type>ENTERPRISE</Type>
        <Active>true</Active>
        <CreatedAt>2025-01-01T00:00:00Z</CreatedAt>
      </Organization>
    </Organizations>

    <Users>
      <User>
        <Id>user_001</Id>
        <Email>john.doe@acme.com</Email>
        <Name>John Doe</Name>
        <Active>true</Active>
        <Department>Quality Assurance</Department>
        <Roles>
          <Role>
            <RoleId>role_qa_manager</RoleId>
            <RoleName>QA Manager</RoleName>
            <OrganizationId>org_001</OrganizationId>
          </Role>
        </Roles>
      </User>
    </Users>
  </Core>

  <QMS>
    <WorkOrders>
      <WorkOrder>
        <Id>wo_001</Id>
        <Summary>Calibration of HPLC System</Summary>
        <Detail>Annual calibration per SOP-CAL-001</Detail>
        <Status>COMPLETED</Status>
        <RegulatoryFlag>true</RegulatoryFlag>
        <ComplianceClass>GMP</ComplianceClass>
        <Approvals>
          <Approval>
            <Id>appr_001</Id>
            <Role>QA</Role>
            <Decision>APPROVED</Decision>
            <DecisionTimestamp>2026-02-15T16:00:00Z</DecisionTimestamp>
            <Signature>
              <SignatureId>sig_001</SignatureId>
              <Meaning>Approval of calibration completion</Meaning>
              <SignedAt>2026-02-15T16:00:00Z</SignedAt>
              <PayloadHash>sha256:abc123...</PayloadHash>
            </Signature>
          </Approval>
        </Approvals>
      </WorkOrder>
    </WorkOrders>
  </QMS>

  <Compliance>
    <AuditTrail>
      <Entry>
        <Id>audit_001</Id>
        <EntityType>work_orders</EntityType>
        <EntityId>wo_001</EntityId>
        <Action>status_change</Action>
        <PerformedBy>
          <PersonId>user_001</PersonId>
          <PersonName>John Doe</PersonName>
        </PerformedBy>
        <PerformerType>HUMAN</PerformerType>
        <PerformedAt>2026-02-15T16:00:00Z</PerformedAt>
        <PreviousValue>
          <Status>IN_PROGRESS</Status>
        </PreviousValue>
        <NewValue>
          <Status>COMPLETED</Status>
        </NewValue>
        <EntryHash>sha256:def456...</EntryHash>
        <Sequence>42</Sequence>
      </Entry>
    </AuditTrail>
  </Compliance>

</TenantExport>

3.4 PDF Export Format

Purpose: Human-readable archive with digital signatures per document

Components:

  1. Tenant Overview PDF - Executive summary with statistics
  2. Data Dictionary PDF - Schema documentation
  3. Individual Document PDFs - QMS documents as signed PDFs

/**
 * PDF export generation
 * Uses pdfkit or similar for generation
 */
interface PDFExportConfig {
  // Document metadata
  title: string;
  author: string; // Tenant name
  subject: string; // 'Tenant Data Export'
  keywords: string[];
  creator: 'CODITECT BIO-QMS Platform';
  producer: 'CODITECT PDF Export v1.0';

  // Digital signature
  signPDF: boolean; // Apply PDF signature
  signingCertificate?: string; // PEM certificate
  signingKey?: string; // PEM private key
  timestampServer?: string; // RFC 3161 TSA URL

  // Layout
  pageSize: 'LETTER' | 'A4';
  margins: {
    top: number;
    bottom: number;
    left: number;
    right: number;
  };

  // Header/footer
  includeHeader: boolean;
  includeFooter: boolean;
  includePageNumbers: boolean;
  includeWatermark: boolean;
  watermarkText?: string; // 'EXPORT COPY'
}

// Tenant overview PDF sections
const TENANT_OVERVIEW_SECTIONS = [
  'Cover Page',
  'Table of Contents',
  'Export Summary',
  'Tenant Information',
  'User Summary',
  'QMS Statistics',
  'Compliance Status',
  'Data Inventory',
  'Export Integrity Verification',
  'Appendix: Schema Summary'
];

4. Export Integrity Verification

4.1 Integrity Verification Process

/**
 * Multi-layer integrity verification
 * Ensures export package is complete and tamper-free
 */
interface IntegrityVerification {
  // Row count verification
  rowCountVerification: RowCountCheck[];

  // Hash verification
  fileHashVerification: FileHashCheck[];

  // Cross-reference verification
  foreignKeyVerification: ForeignKeyCheck[];

  // Size verification
  sizeVerification: SizeCheck;

  // Cryptographic signature
  signatureVerification: SignatureCheck;

  // Overall result
  passed: boolean;
  failureReasons: string[];
  verifiedAt: Date;
  verifiedBy: string;
}

interface RowCountCheck {
  entityType: string;
  sourceRowCount: number; // From database
  exportRowCount: number; // From export file
  deletedRowCount: number; // Soft-deleted included
  match: boolean;
  discrepancy?: number;
}

interface FileHashCheck {
  filePath: string;
  expectedHash: string; // From manifest
  actualHash: string; // Computed from file
  algorithm: 'SHA-256';
  match: boolean;
}

interface ForeignKeyCheck {
  tableName: string;
  foreignKeyColumn: string;
  referencedTable: string;
  referencedColumn: string;
  totalReferences: number;
  resolvedReferences: number;
  unresolvedReferences: string[]; // IDs that don't resolve
  resolutionRate: number; // Percentage
  passed: boolean; // 100% required
}

interface SizeCheck {
  estimatedSizeBytes: number;
  actualSizeBytes: number;
  deviationPercent: number;
  threshold: number; // Percent; flag if deviation exceeds 20
  passed: boolean;
}

interface SignatureCheck {
  signaturePresent: boolean;
  algorithm: string; // 'ECDSA-SHA256'
  publicKeyId: string;
  signatureValid: boolean;
  timestampValid: boolean;
  signedAt: Date;
  verifiedAt: Date;
}
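The size check is simple enough to show end to end. A sketch consistent with the `SizeCheck` interface above; the `checkSize` name is illustrative, and treating a zero estimate as a 100% deviation is an assumption, not spec behavior:

```typescript
// Compare estimated vs actual package size; fail when deviation
// exceeds the 20% threshold from the SizeCheck interface.
function checkSize(
  estimatedSizeBytes: number,
  actualSizeBytes: number
): { estimatedSizeBytes: number; actualSizeBytes: number; deviationPercent: number; threshold: number; passed: boolean } {
  const threshold = 20;
  const deviationPercent =
    estimatedSizeBytes === 0
      ? 100 // Assumption: no estimate available counts as maximal deviation
      : (Math.abs(actualSizeBytes - estimatedSizeBytes) / estimatedSizeBytes) * 100;
  return {
    estimatedSizeBytes,
    actualSizeBytes,
    deviationPercent,
    threshold,
    passed: deviationPercent <= threshold
  };
}
```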

4.2 Integrity Verification Implementation

// Node built-ins used below; `prisma` is assumed to be the shared
// PrismaClient instance from the application's database module.
import * as crypto from 'node:crypto';
import * as fs from 'node:fs';
import * as path from 'node:path';

/**
 * Export integrity verifier
 * Runs after export package creation
 */
class ExportIntegrityVerifier {

  async verifyExport(
    exportJobId: string,
    packagePath: string
  ): Promise<IntegrityVerification> {

    const manifest = await this.readManifest(packagePath);
    const checks: IntegrityVerification = {
      rowCountVerification: [],
      fileHashVerification: [],
      foreignKeyVerification: [],
      sizeVerification: {} as SizeCheck,
      signatureVerification: {} as SignatureCheck,
      passed: true,
      failureReasons: [],
      verifiedAt: new Date(),
      verifiedBy: 'system'
    };

    // 1. Verify row counts
    checks.rowCountVerification = await this.verifyRowCounts(
      manifest.tenantId,
      manifest.entities
    );

    // 2. Verify file hashes
    checks.fileHashVerification = await this.verifyFileHashes(
      packagePath,
      manifest.files
    );

    // 3. Verify foreign key integrity (relationship definitions are
    // loaded from the package itself, so only the manifest is passed)
    checks.foreignKeyVerification = await this.verifyForeignKeys(
      packagePath,
      manifest
    );

    // 4. Verify package size
    checks.sizeVerification = await this.verifySize(
      packagePath,
      manifest.entities
    );

    // 5. Verify cryptographic signature
    checks.signatureVerification = await this.verifySignature(
      packagePath,
      manifest.integrity
    );

    // Aggregate results
    checks.passed = this.aggregateResults(checks);

    // Store verification report
    await this.storeVerificationReport(exportJobId, checks);

    return checks;
  }

  private async verifyRowCounts(
    tenantId: string,
    entities: EntityInventory[]
  ): Promise<RowCountCheck[]> {
    const checks: RowCountCheck[] = [];

    for (const entity of entities) {
      // Query source database for the actual count. Note: entity.entityType
      // is the snake_case table name and must be mapped to the Prisma model
      // delegate (e.g. 'work_orders' -> prisma.workOrder) for this dynamic
      // access to resolve.
      const sourceCount = await prisma[entity.entityType].count({
        where: { tenantId }
      });

      const check: RowCountCheck = {
        entityType: entity.entityType,
        sourceRowCount: sourceCount,
        exportRowCount: entity.rowCount,
        deletedRowCount: entity.deletedCount,
        match: sourceCount === entity.rowCount,
        discrepancy: sourceCount - entity.rowCount
      };

      checks.push(check);
    }

    return checks;
  }

  private async verifyFileHashes(
    packagePath: string,
    files: FileInventory[]
  ): Promise<FileHashCheck[]> {
    const checks: FileHashCheck[] = [];

    for (const file of files) {
      const filePath = path.join(packagePath, file.path);
      const actualHash = await this.computeFileSHA256(filePath);

      checks.push({
        filePath: file.path,
        expectedHash: file.sha256,
        actualHash,
        algorithm: 'SHA-256',
        match: actualHash === file.sha256
      });
    }

    return checks;
  }

  private async verifyForeignKeys(
    packagePath: string,
    manifest: any // Not used directly; relationships are read from the package JSON
  ): Promise<ForeignKeyCheck[]> {
    const checks: ForeignKeyCheck[] = [];

    // Load JSON data
    const jsonData = await this.loadJSONExport(packagePath);

    // Define foreign key relationships
    const relationships = [
      {
        table: 'work_orders',
        fk: 'masterId',
        refTable: 'work_orders',
        refColumn: 'id'
      },
      {
        table: 'work_orders',
        fk: 'originatorId',
        refTable: 'parties',
        refColumn: 'id'
      },
      {
        table: 'approvals',
        fk: 'workOrderId',
        refTable: 'work_orders',
        refColumn: 'id'
      }
      // ... all relationships
    ];

    for (const rel of relationships) {
      const check = await this.verifyForeignKeyRelationship(
        jsonData,
        rel
      );
      checks.push(check);
    }

    return checks;
  }

  private async verifySignature(
    packagePath: string,
    integrityInfo: any
  ): Promise<SignatureCheck> {
    // Load signature file
    const signaturePath = path.join(
      packagePath,
      'integrity/signature.p7s'
    );

    if (!fs.existsSync(signaturePath)) {
      return {
        signaturePresent: false,
        algorithm: '',
        publicKeyId: '',
        signatureValid: false,
        timestampValid: false,
        signedAt: new Date(),
        verifiedAt: new Date()
      };
    }

    // Verify the detached signature over manifest.json.
    // Simplified: this treats signature.p7s as a raw detached signature;
    // a production verifier must parse the PKCS#7/CMS envelope. For an
    // ECDSA key, crypto.verify takes the key directly (the RSA-PSS
    // padding option applies only to RSA keys).
    const manifestPath = path.join(packagePath, 'manifest.json');
    const manifestData = fs.readFileSync(manifestPath);
    const signature = fs.readFileSync(signaturePath);

    const publicKey = await this.loadPublicKey(
      integrityInfo.publicKeyId
    );

    const signatureValid = crypto.verify(
      'sha256',
      manifestData,
      publicKey,
      signature
    );

    // Verify timestamp if present
    let timestampValid = false;
    if (integrityInfo.timestampToken) {
      timestampValid = await this.verifyRFC3161Timestamp(
        integrityInfo.timestampToken,
        manifestData
      );
    }

    return {
      signaturePresent: true,
      algorithm: integrityInfo.signatureAlgorithm,
      publicKeyId: integrityInfo.publicKeyId,
      signatureValid,
      timestampValid,
      signedAt: new Date(integrityInfo.signedAt),
      verifiedAt: new Date()
    };
  }

  private computeFileSHA256(filePath: string): Promise<string> {
    return new Promise((resolve, reject) => {
      const hash = crypto.createHash('sha256');
      const stream = fs.createReadStream(filePath);

      stream.on('data', (data) => hash.update(data));
      stream.on('end', () => resolve(hash.digest('hex')));
      stream.on('error', reject);
    });
  }
}

5. Tenant-to-Tenant Migration

5.1 Migration Architecture

/**
 * Tenant migration for M&A scenarios
 * Merges source tenant data into target tenant
 */
interface MigrationJob {
  id: string;

  // Source and target
  sourceTenantId: string;
  targetTenantId: string;
  exportPackageUrl: string; // Source tenant export

  // Migration configuration
  strategy: MigrationStrategy;
  conflictResolution: ConflictResolution;
  userMappingStrategy: UserMappingStrategy;

  // Status tracking
  status: MigrationJobStatus;
  phase: MigrationPhase;

  // Progress
  totalSteps: number;
  completedSteps: number;
  currentStep: string;
  progressPercent: number;

  // Results
  migrationReport?: MigrationReport;
  rollbackAvailable: boolean;
  rollbackExpiresAt: Date; // 72 hours

  // Scheduling
  triggeredBy: string;
  triggeredAt: Date;
  startedAt?: Date;
  completedAt?: Date;
}

enum MigrationStrategy {
  MERGE = 'MERGE', // Merge data into existing tenant
  REPLACE = 'REPLACE', // Replace target tenant data
  APPEND = 'APPEND' // Append without deduplication
}

enum ConflictResolution {
  SOURCE_WINS = 'SOURCE_WINS', // Source data takes precedence
  TARGET_WINS = 'TARGET_WINS', // Keep target data
  MANUAL = 'MANUAL', // Require manual resolution
  FAIL = 'FAIL' // Abort on conflict
}

enum UserMappingStrategy {
  EMAIL_MATCH = 'EMAIL_MATCH', // Match users by email
  MANUAL_MAP = 'MANUAL_MAP', // Provide explicit mapping
  CREATE_NEW = 'CREATE_NEW', // Create all as new users
  HYBRID = 'HYBRID' // Email match + create new
}

enum MigrationJobStatus {
  PENDING = 'PENDING',
  VALIDATING = 'VALIDATING',
  DRY_RUN = 'DRY_RUN',
  APPROVED = 'APPROVED',
  MIGRATING = 'MIGRATING',
  COMPLETED = 'COMPLETED',
  FAILED = 'FAILED',
  ROLLED_BACK = 'ROLLED_BACK'
}

enum MigrationPhase {
  VALIDATION = 'VALIDATION', // Validate export package
  USER_MAPPING = 'USER_MAPPING', // Map source users to target
  CONFLICT_DETECTION = 'CONFLICT_DETECTION',
  DRY_RUN = 'DRY_RUN', // Simulate migration
  BACKUP = 'BACKUP', // Backup target before migration
  DATA_IMPORT = 'DATA_IMPORT', // Import source data
  RELATIONSHIP_REWIRING = 'RELATIONSHIP_REWIRING',
  AUDIT_LOGGING = 'AUDIT_LOGGING',
  VERIFICATION = 'VERIFICATION',
  COMPLETION = 'COMPLETION'
}

5.2 Migration Report

/**
 * Comprehensive migration results
 * Documents what was migrated and conflicts encountered
 */
interface MigrationReport {
  migrationJobId: string;
  sourceTenantId: string;
  targetTenantId: string;

  // Summary statistics
  summary: {
    totalRecords: number;
    migratedRecords: number;
    skippedRecords: number;
    failedRecords: number;
    conflictsDetected: number;
    conflictsResolved: number;
    conflictsPending: number;
  };

  // User mapping results
  userMapping: UserMappingResult[];

  // Entity migration results
  entityResults: EntityMigrationResult[];

  // Conflicts
  conflicts: MigrationConflict[];

  // Audit trail
  migrationAuditEvents: AuditTrailEntry[];

  // Rollback information
  rollbackSnapshot: {
    snapshotId: string;
    createdAt: Date;
    expiresAt: Date;
    sizeBytes: number;
  };
}

interface UserMappingResult {
  sourceUserId: string;
  sourceEmail: string;
  sourceName: string;

  targetUserId?: string; // Undefined when no target mapping was resolved
  targetEmail?: string;
  mappingStrategy: 'EMAIL_MATCH' | 'MANUAL' | 'NEW_USER';

  rolesTransferred: Array<{
    sourceRoleId: string;
    targetRoleId: string;
    roleName: string;
  }>;

  conflictsDetected: string[]; // e.g., 'Role already assigned'
}

interface EntityMigrationResult {
  entityType: string;
  totalRecords: number;
  migratedRecords: number;
  skippedRecords: number;
  failedRecords: number;

  // ID remapping
  idRemapping: Array<{
    sourceId: string;
    targetId: string;
  }>;

  // Errors
  errors: Array<{
    sourceId: string;
    errorType: string;
    errorMessage: string;
  }>;
}

interface MigrationConflict {
  id: string;
  conflictType: ConflictType;
  entityType: string;
  sourceEntityId: string;
  targetEntityId?: string;

  description: string;

  // Conflict data
  sourceData: any;
  targetData?: any;

  // Resolution
  resolutionStrategy: ConflictResolution;
  resolved: boolean;
  resolvedAt?: Date;
  resolvedBy?: string;
  resolution?: string;
}

enum ConflictType {
  DUPLICATE_WORK_ORDER_NUMBER = 'DUPLICATE_WORK_ORDER_NUMBER',
  USER_EMAIL_CONFLICT = 'USER_EMAIL_CONFLICT',
  ROLE_NAME_CONFLICT = 'ROLE_NAME_CONFLICT',
  WORKFLOW_STATE_MISMATCH = 'WORKFLOW_STATE_MISMATCH',
  DOCUMENT_CATEGORY_CONFLICT = 'DOCUMENT_CATEGORY_CONFLICT',
  CUSTOM_FIELD_TYPE_MISMATCH = 'CUSTOM_FIELD_TYPE_MISMATCH'
}

5.3 Migration Implementation

/**
 * Tenant migration orchestrator
 * Handles complete migration workflow
 */
class TenantMigrationOrchestrator {

  async migrateTenant(
    migrationJob: MigrationJob
  ): Promise<MigrationReport> {

    const report: MigrationReport = this.initializeReport(migrationJob);

    try {
      // Phase 1: Validation
      await this.validateExportPackage(migrationJob);

      // Phase 2: User mapping
      report.userMapping = await this.mapUsers(
        migrationJob,
        migrationJob.userMappingStrategy
      );

      // Phase 3: Conflict detection
      report.conflicts = await this.detectConflicts(migrationJob);

      // Phase 4: Dry run
      if (migrationJob.phase === MigrationPhase.DRY_RUN) {
        return await this.performDryRun(migrationJob, report);
      }

      // Phase 5: Backup target tenant
      const snapshot = await this.createRollbackSnapshot(
        migrationJob.targetTenantId
      );
      report.rollbackSnapshot = snapshot;

      // Phase 6: Data import
      report.entityResults = await this.importData(
        migrationJob,
        report.userMapping
      );

      // Phase 7: Relationship rewiring
      await this.rewireRelationships(
        migrationJob,
        report.entityResults
      );

      // Phase 8: Audit logging
      report.migrationAuditEvents = await this.logMigration(
        migrationJob,
        report
      );

      // Phase 9: Verification
      const verified = await this.verifyMigration(
        migrationJob,
        report
      );

      if (!verified) {
        throw new Error('Migration verification failed');
      }

      // Update summary
      report.summary = this.computeSummary(report);

      return report;

    } catch (error) {
      // Automatic rollback on failure
      if (report.rollbackSnapshot) {
        await this.rollbackMigration(
          migrationJob.id,
          report.rollbackSnapshot.snapshotId
        );
      }

      throw error;
    }
  }

  private async mapUsers(
    migrationJob: MigrationJob,
    strategy: UserMappingStrategy
  ): Promise<UserMappingResult[]> {

    const exportData = await this.loadExportData(
      migrationJob.exportPackageUrl
    );

    const sourceUsers = exportData.data.core.users;
    const mappingResults: UserMappingResult[] = [];

    for (const sourceUser of sourceUsers) {

      let targetUserId: string | undefined;
      // Default to MANUAL: if no branch below assigns a target (e.g.
      // EMAIL_MATCH with no matching email, or MANUAL_MAP), the user is
      // left unmapped and flagged for manual resolution.
      let mappingStrategyUsed: UserMappingResult['mappingStrategy'] = 'MANUAL';

      if (strategy === UserMappingStrategy.EMAIL_MATCH ||
          strategy === UserMappingStrategy.HYBRID) {

        // Try to find an existing user by email (`prisma` is the shared
        // PrismaClient instance)
        const existingUser = await prisma.person.findUnique({
          where: {
            tenantId_email: {
              tenantId: migrationJob.targetTenantId,
              email: sourceUser.email
            }
          }
        });

        if (existingUser) {
          targetUserId = existingUser.id;
          mappingStrategyUsed = 'EMAIL_MATCH';
        } else if (strategy === UserMappingStrategy.HYBRID) {
          // Create new user
          const newUser = await prisma.person.create({
            data: {
              tenantId: migrationJob.targetTenantId,
              email: sourceUser.email,
              name: sourceUser.name,
              department: sourceUser.department,
              active: sourceUser.active
            }
          });
          targetUserId = newUser.id;
          mappingStrategyUsed = 'NEW_USER';
        }
      } else if (strategy === UserMappingStrategy.CREATE_NEW) {
        // Always create new users
        const newUser = await prisma.person.create({
          data: {
            tenantId: migrationJob.targetTenantId,
            email: `migrated.${sourceUser.email}`, // Avoid email conflicts
            name: `[Migrated] ${sourceUser.name}`,
            department: sourceUser.department,
            active: sourceUser.active
          }
        });
        targetUserId = newUser.id;
        mappingStrategyUsed = 'NEW_USER';
      }

      // Map roles
      const rolesTransferred = await this.transferUserRoles(
        sourceUser,
        targetUserId,
        migrationJob.targetTenantId
      );

      mappingResults.push({
        sourceUserId: sourceUser.id,
        sourceEmail: sourceUser.email,
        sourceName: sourceUser.name,
        targetUserId,
        targetEmail: targetUserId ? sourceUser.email : undefined,
        mappingStrategy: mappingStrategyUsed,
        rolesTransferred,
        conflictsDetected: targetUserId
          ? []
          : ['No target user mapped; manual resolution required']
      });
    }

    return mappingResults;
  }

private async detectConflicts(
migrationJob: MigrationJob
): Promise<MigrationConflict[]> {

const conflicts: MigrationConflict[] = [];
const exportData = await this.loadExportData(
migrationJob.exportPackageUrl
);

// Check for duplicate work order numbers
const sourceWorkOrders = exportData.data.qms.workOrders;
for (const sourceWO of sourceWorkOrders) {
// Work orders have no globally unique number, so match on summary + createdAt as a duplicate heuristic
const existing = await prisma.workOrder.findFirst({
where: {
tenantId: migrationJob.targetTenantId,
summary: sourceWO.summary,
createdAt: sourceWO.createdAt
}
});

if (existing) {
conflicts.push({
id: `conflict_${sourceWO.id}`,
conflictType: ConflictType.DUPLICATE_WORK_ORDER_NUMBER,
entityType: 'work_orders',
sourceEntityId: sourceWO.id,
targetEntityId: existing.id,
description: `Work order with summary "${sourceWO.summary}" already exists`,
sourceData: sourceWO,
targetData: existing,
resolutionStrategy: migrationJob.conflictResolution,
resolved: false
});
}
}

// Check workflow state machine compatibility
const sourceWorkflows = exportData.data.workflow.stateMachines;
for (const sourceWF of sourceWorkflows) {
const targetWF = await prisma.stateMachine.findFirst({
where: {
tenantId: migrationJob.targetTenantId,
name: sourceWF.name
}
});

if (targetWF && !this.areWorkflowsCompatible(sourceWF, targetWF)) {
conflicts.push({
id: `conflict_wf_${sourceWF.id}`,
conflictType: ConflictType.WORKFLOW_STATE_MISMATCH,
entityType: 'state_machines',
sourceEntityId: sourceWF.id,
targetEntityId: targetWF.id,
description: `Workflow "${sourceWF.name}" has incompatible states`,
sourceData: sourceWF,
targetData: targetWF,
resolutionStrategy: migrationJob.conflictResolution,
resolved: false
});
}
}

return conflicts;
}

private async createRollbackSnapshot(
tenantId: string
): Promise<any> {

// Create full export of target tenant before migration
const snapshotExportJob = await prisma.exportJob.create({
data: {
tenantId,
status: ExportJobStatus.QUEUED,
type: ExportType.FULL,
format: [ExportFormat.JSON],
scope: {
domains: Object.values(ExportDomain),
includeDeleted: true,
includeAttachments: true,
includeAuditTrail: true
},
triggeredBy: 'migration_system',
triggeredAt: new Date(),
expiresAt: new Date(Date.now() + 72 * 60 * 60 * 1000) // 72 hours
}
});

// Execute export synchronously
const exporter = new TenantExporter();
await exporter.executeExport(snapshotExportJob.id);

// Return snapshot info
const completedJob = await prisma.exportJob.findUnique({
where: { id: snapshotExportJob.id }
});

return {
snapshotId: completedJob!.id,
createdAt: completedJob!.completedAt!,
expiresAt: completedJob!.expiresAt,
sizeBytes: completedJob!.packageSize
};
}

async rollbackMigration(
migrationJobId: string,
snapshotId: string
): Promise<void> {

const migrationJob = await prisma.migrationJob.findUnique({
where: { id: migrationJobId }
});

if (!migrationJob) {
throw new Error('Migration job not found');
}

// Load snapshot export
const snapshot = await prisma.exportJob.findUnique({
where: { id: snapshotId }
});

if (!snapshot || snapshot.status !== ExportJobStatus.COMPLETED) {
throw new Error('Snapshot not available for rollback');
}

// Delete all data imported during migration
await this.deleteImportedData(migrationJob);

// Restore from snapshot
const importer = new TenantImporter();
await importer.importFromExport(
migrationJob.targetTenantId,
snapshot.exportPackageUrl!,
{
strategy: MigrationStrategy.REPLACE,
conflictResolution: ConflictResolution.SOURCE_WINS
}
);

// Log rollback event
await prisma.auditTrail.create({
data: {
tenantId: migrationJob.targetTenantId,
entityType: 'migration_jobs',
entityId: migrationJobId,
action: 'ROLLBACK',
performedBy: 'migration_system',
performerType: PerformerType.SYSTEM,
performedAt: new Date(),
newVal: {
snapshotId,
reason: 'Migration failed or requested rollback'
}
}
});

// Update migration job status
await prisma.migrationJob.update({
where: { id: migrationJobId },
data: { status: MigrationJobStatus.ROLLED_BACK }
});
}
}

6. Export API Specifications

6.1 Export API Endpoints

/**
* RESTful API for tenant data export
* Base path: /api/v1/tenants/:tenantId/export
*/

// POST /api/v1/tenants/:tenantId/export
// Create new export job (async)
interface CreateExportRequest {
type: ExportType;
format: ExportFormat[];
scope: ExportScope;
notifyOnComplete?: boolean; // Send email when done
notificationEmail?: string;
}

interface CreateExportResponse {
exportJobId: string;
status: ExportJobStatus;
estimatedCompletionTime: Date; // Based on data volume
pollUrl: string; // GET endpoint for status
webhookUrl?: string; // POST notification when complete
}

// GET /api/v1/tenants/:tenantId/export/:jobId/status
// Poll export job status
interface ExportStatusResponse {
exportJobId: string;
status: ExportJobStatus;
progressPercent: number;
currentStep: string;
totalEntities: number;
processedEntities: number;
estimatedTimeRemaining: number; // Seconds
errors: Array<{
entityType: string;
errorMessage: string;
timestamp: Date;
}>;
}

// GET /api/v1/tenants/:tenantId/export/:jobId/download
// Download completed export package
// Returns: 302 redirect to signed GCS URL with 1-hour expiration
interface ExportDownloadResponse {
downloadUrl: string; // Signed URL
expiresAt: Date; // URL expiration
packageSize: number; // Bytes
contentType: 'application/zip';
filename: string; // export-{tenantId}-{timestamp}.zip
}

// GET /api/v1/tenants/:tenantId/export
// List all export jobs for tenant
interface ListExportsResponse {
exports: Array<{
exportJobId: string;
type: ExportType;
format: ExportFormat[];
status: ExportJobStatus;
triggeredBy: string;
triggeredAt: Date;
completedAt?: Date;
packageSize?: number;
expiresAt: Date;
}>;
pagination: {
page: number;
pageSize: number;
totalPages: number;
totalCount: number;
};
}

// DELETE /api/v1/tenants/:tenantId/export/:jobId
// Cancel or delete export job
interface DeleteExportResponse {
deleted: boolean;
exportJobId: string;
message: string;
}
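For consumers of these endpoints, the async create-then-poll flow can be sketched as a small client helper. This is illustrative only: the fetchStatus callback, the backoff values, and the helper name are not part of the API surface defined above.

```typescript
// Hypothetical polling helper; fetchStatus wraps a GET to the status endpoint above
type StatusFetcher = () => Promise<{ status: string; progressPercent: number }>;

async function pollExportUntilDone(
  fetchStatus: StatusFetcher,
  { intervalMs = 1000, maxAttempts = 60 } = {}
): Promise<{ status: string; progressPercent: number }> {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const s = await fetchStatus();
    // Assumed terminal ExportJobStatus values end the poll loop immediately
    if (s.status === 'COMPLETED' || s.status === 'FAILED') {
      return s;
    }
    await new Promise(resolve => setTimeout(resolve, intervalMs));
  }
  throw new Error('Export did not reach a terminal state within the polling window');
}
```

A real client would add jitter and honor estimatedTimeRemaining from ExportStatusResponse rather than polling at a fixed interval.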

6.2 Migration API Endpoints

/**
* RESTful API for tenant migration
* Base path: /api/v1/tenants/:targetTenantId/import
*/

// POST /api/v1/tenants/:targetTenantId/import/validate
// Dry run - validate migration without importing
interface ValidateMigrationRequest {
exportPackageUrl: string; // Source tenant export
strategy: MigrationStrategy;
conflictResolution: ConflictResolution;
userMappingStrategy: UserMappingStrategy;
userMapping?: Array<{ // Optional manual mapping
sourceUserId: string;
targetUserId: string;
}>;
}

interface ValidateMigrationResponse {
valid: boolean;
conflicts: MigrationConflict[];
userMappingPreview: UserMappingResult[];
estimatedRecords: number;
estimatedDuration: number; // Seconds
warnings: string[];
blockers: string[]; // Issues that prevent migration
}

// POST /api/v1/tenants/:targetTenantId/import
// Execute tenant migration (async)
interface ExecuteMigrationRequest {
exportPackageUrl: string;
strategy: MigrationStrategy;
conflictResolution: ConflictResolution;
userMappingStrategy: UserMappingStrategy;
userMapping?: Array<{
sourceUserId: string;
targetUserId: string;
}>;
notifyOnComplete?: boolean;
}

interface ExecuteMigrationResponse {
migrationJobId: string;
status: MigrationJobStatus;
estimatedCompletionTime: Date;
pollUrl: string;
rollbackAvailableUntil: Date; // 72 hours
}

// GET /api/v1/tenants/:targetTenantId/import/:jobId/status
// Poll migration job status
interface MigrationStatusResponse {
migrationJobId: string;
status: MigrationJobStatus;
phase: MigrationPhase;
progressPercent: number;
currentStep: string;
completedSteps: number;
totalSteps: number;
estimatedTimeRemaining: number;
conflicts: MigrationConflict[];
errors: Array<{
phase: MigrationPhase;
errorMessage: string;
timestamp: Date;
}>;
}

// GET /api/v1/tenants/:targetTenantId/import/:jobId/report
// Download migration report
interface MigrationReportResponse {
migrationReport: MigrationReport;
downloadUrl: string; // PDF report download
}

// POST /api/v1/tenants/:targetTenantId/import/:jobId/rollback
// Rollback migration within 72 hours
interface RollbackMigrationRequest {
reason: string;
confirmRollback: boolean; // Must be true
}

interface RollbackMigrationResponse {
rolled_back: boolean;
migrationJobId: string;
snapshotRestored: string; // Snapshot ID
message: string;
}
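A client would typically gate execution on the dry-run result above. A minimal sketch, assuming only the valid, blockers, and conflicts fields of ValidateMigrationResponse; the helper name is illustrative:

```typescript
// Minimal shape reproduced from ValidateMigrationResponse for the sketch
interface DryRunResult {
  valid: boolean;
  blockers: string[];
  conflicts: { resolved: boolean }[];
}

function canExecuteMigration(r: DryRunResult): boolean {
  // Proceed only when validation passed, nothing blocks the import,
  // and every detected conflict already has a resolution
  return r.valid && r.blockers.length === 0 && r.conflicts.every(c => c.resolved);
}
```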

7. Prisma Implementation

7.1 Export Data Extraction

/**
* Prisma queries for complete tenant data extraction
* Ensures 100% data coverage with efficient queries
*/
class TenantDataExtractor {

async extractCompleteData(
tenantId: string,
scope: ExportScope
): Promise<JSONExport> {

const exportData: JSONExport = {
metadata: {
version: '1.0.0',
tenantId,
exportedAt: new Date(),
scope
},
data: {
core: await this.extractCoreData(tenantId, scope),
qms: await this.extractQMSData(tenantId, scope),
workflow: await this.extractWorkflowData(tenantId, scope),
compliance: await this.extractComplianceData(tenantId, scope),
integration: await this.extractIntegrationData(tenantId, scope),
configuration: await this.extractConfigurationData(tenantId, scope),
analytics: await this.extractAnalyticsData(tenantId, scope)
}
};

return exportData;
}

private async extractCoreData(
tenantId: string,
scope: ExportScope
): Promise<CoreData> {

// Extract all core entities with relationships
const [organizations, users, roles, permissions, teams] = await Promise.all([
// Organizations
prisma.organization.findMany({
where: { tenantId },
include: {
settings: true,
customFields: true
}
}),

// Users with roles and experiences
prisma.person.findMany({
where: {
tenantId,
...(scope.includeDeleted ? {} : { deletedAt: null })
},
include: {
// roles is assumed to be the user-role join relation; the transform below
// reads ur.role.name, so the nested role must be loaded here
roles: {
include: {
role: true
}
},
experiences: {
include: {
experience: true
}
},
parties: true,
teamMembers: {
include: {
team: true
}
}
}
}),

// Roles
prisma.role.findMany({
where: { tenantId },
include: {
permissions: {
include: {
permission: true
}
}
}
}),

// Permissions
prisma.permission.findMany({
where: { tenantId }
}),

// Teams
prisma.team.findMany({
where: { tenantId },
include: {
members: {
include: {
person: true
}
}
}
})
]);

// Transform to export format with embedded relationships
const transformedUsers: User[] = users.map(user => ({
id: user.id,
email: user.email,
name: user.name,
active: user.active,
department: user.department || undefined,

roles: user.roles?.map(ur => ({
roleId: ur.roleId,
roleName: ur.role.name,
organizationId: ur.organizationId
})) || [],

experiences: user.experiences.map(ue => ({
experienceId: ue.experienceId,
experienceName: ue.experience.name,
rating: ue.rating,
certifiedAt: ue.certifiedAt || undefined,
expiresAt: ue.expiresAt || undefined
})),

createdAt: user.createdAt,
createdBy: user.createdBy,
updatedAt: user.updatedAt,
updatedBy: user.updatedBy,
version: user.version
}));

return {
organizations,
users: transformedUsers,
roles,
permissions,
teams,
userRoles: [] // Extracted from users
};
}

private async extractQMSData(
tenantId: string,
scope: ExportScope
): Promise<QMSData> {

// Extract all QMS entities
const workOrders = await prisma.workOrder.findMany({
where: {
tenantId,
...(scope.dateRange ? {
createdAt: {
gte: scope.dateRange.start,
lte: scope.dateRange.end
}
} : {})
},
include: {
master: true,
children: true,
originator: {
include: {
person: true,
vendor: true,
team: true
}
},
item: {
include: {
asset: true
}
},
assigner: {
include: {
person: true
}
},
assignee: {
include: {
person: true
}
},
jobPlan: {
include: {
tools: {
include: {
tool: true
}
},
experiences: {
include: {
experience: true
}
},
persons: {
include: {
person: true
}
},
materials: {
include: {
material: true
}
}
}
},
schedule: true,
approvals: {
include: {
approver: true,
signature: true
}
},
timeEntries: {
include: {
person: true,
vendor: true
}
},
minRequirements: true,
dependencies: {
include: {
dependsOn: true
}
},
dependents: true,
auditEvents: {
include: {
performer: true
}
}
}
});

// Additional QMS entities
const [jobPlans, parties, persons, vendors, assets, tools, materials, experiences] =
await Promise.all([
prisma.jobPlan.findMany({ where: { tenantId }, include: {...} }),
prisma.party.findMany({ where: { tenantId }, include: {...} }),
prisma.person.findMany({ where: { tenantId }, include: {...} }),
prisma.vendor.findMany({ where: { tenantId } }),
prisma.asset.findMany({ where: { tenantId } }),
prisma.tool.findMany({ where: { tenantId } }),
prisma.material.findMany({ where: { tenantId } }),
prisma.experience.findMany({ where: { tenantId } })
]);

return {
workOrders,
jobPlans,
changeItems: [], // Embedded in work orders
approvals: [], // Embedded in work orders
parties,
persons,
vendors,
assets,
tools,
materials,
experiences,
schedules: [],
timeEntries: []
};
}

private async extractComplianceData(
tenantId: string,
scope: ExportScope
): Promise<ComplianceData> {

if (!scope.includeAuditTrail) {
return {
electronicSignatures: [],
auditTrail: [],
trainingRecords: [],
validationProtocols: [],
changeControls: []
};
}

// Extract compliance entities
const [signatures, auditTrail] = await Promise.all([
prisma.electronicSignature.findMany({
where: {
tenantId,
...(scope.dateRange ? {
signedAt: {
gte: scope.dateRange.start,
lte: scope.dateRange.end
}
} : {})
},
include: {
signer: true,
approvals: {
include: {
workOrder: true
}
}
}
}),

prisma.auditTrail.findMany({
where: {
tenantId,
...(scope.dateRange ? {
performedAt: {
gte: scope.dateRange.start,
lte: scope.dateRange.end
}
} : {})
},
include: {
performer: true,
workOrder: true
},
orderBy: {
performedAt: 'asc'
}
})
]);

return {
electronicSignatures: signatures,
auditTrail,
trainingRecords: [], // TODO: Add when training module exists
validationProtocols: [],
changeControls: []
};
}
}

8. Compliance Validation

8.1 GDPR Article 20 Compliance

Requirement: Right to data portability in structured, commonly used, machine-readable format

Implementation:

/**
* GDPR Article 20 compliance validator
* Ensures export meets data portability requirements
*/
class GDPRArticle20Validator {

async validateExport(exportPackage: ExportManifest): Promise<boolean> {

const checks = [
this.checkMachineReadableFormat(exportPackage),
this.checkStructuredFormat(exportPackage),
this.checkCommonlyUsedFormat(exportPackage),
this.checkDataCompleteness(exportPackage),
this.checkWithoutUndueDelay(exportPackage)
];

const results = await Promise.all(checks);
return results.every(r => r === true);
}

private checkMachineReadableFormat(manifest: ExportManifest): boolean {
// Must include JSON or CSV format
return manifest.exportJob.format.includes(ExportFormat.JSON) ||
manifest.exportJob.format.includes(ExportFormat.CSV);
}

private checkStructuredFormat(manifest: ExportManifest): boolean {
// Must include schema documentation
const schemaFiles = manifest.files.filter(f =>
f.path.startsWith('schema/')
);
return schemaFiles.length >= 2; // json-schema.json + relationships.json
}

private checkCommonlyUsedFormat(manifest: ExportManifest): boolean {
// JSON, CSV, XML are all commonly used
return true;
}

private checkDataCompleteness(manifest: ExportManifest): boolean {
// All tenant data must be included
const requiredEntities = [
'organizations', 'users', 'roles', 'work_orders',
'approvals', 'signatures', 'audit_trail'
];

const exportedEntities = manifest.entities.map(e => e.entityType);
return requiredEntities.every(req => exportedEntities.includes(req));
}

private checkWithoutUndueDelay(manifest: ExportManifest): boolean {
// Export must complete within 1 month of request
const requestDate = new Date(manifest.exportJob.triggeredAt);
const completionDate = manifest.generatedAt;
const daysDiff = (completionDate.getTime() - requestDate.getTime()) / (1000 * 60 * 60 * 24);

return daysDiff <= 30;
}
}

8.2 FDA Part 11 Compliance

Requirement: Records retrievable in both human-readable and electronic form

Implementation:

/**
* FDA Part 11 §11.10(b) compliance validator
*/
class FDAPart11Validator {

async validateExport(exportPackage: ExportManifest): Promise<boolean> {

return (
this.checkHumanReadableForm(exportPackage) &&
this.checkElectronicForm(exportPackage) &&
this.checkRetrievability(exportPackage) &&
this.checkRetentionPeriod(exportPackage)
);
}

private checkHumanReadableForm(manifest: ExportManifest): boolean {
// Must include PDF format
return manifest.exportJob.format.includes(ExportFormat.PDF);
}

private checkElectronicForm(manifest: ExportManifest): boolean {
// Must include JSON or CSV
return manifest.exportJob.format.includes(ExportFormat.JSON);
}

private checkRetrievability(manifest: ExportManifest): boolean {
// Export package must be downloadable
return manifest.files.length > 0;
}

private checkRetentionPeriod(manifest: ExportManifest): boolean {
// Export must be retained per policy (90 days minimum)
const retentionDays = (manifest.compliance.expiresAt.getTime() -
manifest.generatedAt.getTime()) / (1000 * 60 * 60 * 24);

return retentionDays >= 90;
}
}

9. Operational Procedures

9.1 Scheduled Incremental Exports

/**
* Automated weekly incremental exports to GCS
* Cron: Every Sunday at 2:00 AM UTC
*/
class ScheduledExportService {

@Cron('0 2 * * 0') // Every Sunday at 2:00 AM
async executeWeeklyIncrementalExport() {

// Get all active tenants
const tenants = await prisma.tenant.findMany({
where: { active: true }
});

for (const tenant of tenants) {

// Find last export
const lastExport = await prisma.exportJob.findFirst({
where: {
tenantId: tenant.id,
status: ExportJobStatus.COMPLETED,
type: ExportType.INCREMENTAL
},
orderBy: {
completedAt: 'desc'
}
});

const sinceDate = lastExport?.completedAt || new Date(0);

// Create incremental export job
const exportJob = await prisma.exportJob.create({
data: {
tenantId: tenant.id,
status: ExportJobStatus.QUEUED,
type: ExportType.INCREMENTAL,
format: [ExportFormat.JSON],
scope: {
domains: Object.values(ExportDomain),
dateRange: {
start: sinceDate,
end: new Date()
},
includeDeleted: false,
includeAttachments: true,
includeAuditTrail: true
},
triggeredBy: 'scheduled_export_system',
triggeredAt: new Date(),
expiresAt: new Date(Date.now() + 90 * 24 * 60 * 60 * 1000)
}
});

// Execute export
const exporter = new TenantExporter();
await exporter.executeExport(exportJob.id);

// Upload to GCS backup bucket
await this.uploadToBackupBucket(exportJob.id, tenant.id);
}
}

private async uploadToBackupBucket(
exportJobId: string,
tenantId: string
): Promise<void> {

const exportJob = await prisma.exportJob.findUnique({
where: { id: exportJobId }
});

if (!exportJob || !exportJob.exportPackageUrl) {
return;
}

// GCS bucket: coditect-bio-qms-tenant-backups
const bucket = 'coditect-bio-qms-tenant-backups';
const destinationPath = `${tenantId}/incremental/${exportJobId}.zip`;

// Copy from temp export location to backup bucket
await storage.bucket(bucket).upload(exportJob.exportPackageUrl, {
destination: destinationPath,
metadata: {
contentType: 'application/zip',
metadata: {
tenantId,
exportJobId,
exportType: ExportType.INCREMENTAL,
createdAt: exportJob.completedAt?.toISOString()
}
}
});
}
}

9.2 Pre-Deletion Export

/**
* Automatic full export before tenant deletion
* Called from D.6.3 tenant deletion workflow
*/
class PreDeletionExportService {

async executePreDeletionExport(tenantId: string): Promise<string> {

// Create full export job
const exportJob = await prisma.exportJob.create({
data: {
tenantId,
status: ExportJobStatus.QUEUED,
type: ExportType.PRE_DELETION,
format: [ExportFormat.JSON, ExportFormat.CSV, ExportFormat.PDF],
scope: {
domains: Object.values(ExportDomain),
includeDeleted: true,
includeAttachments: true,
includeAuditTrail: true
},
triggeredBy: 'pre_deletion_system',
triggeredAt: new Date(),
expiresAt: new Date(Date.now() + 365 * 24 * 60 * 60 * 1000) // 1 year
}
});

// Execute export synchronously
const exporter = new TenantExporter();
await exporter.executeExport(exportJob.id);

// Upload to long-term retention bucket
await this.uploadToRetentionBucket(exportJob.id, tenantId);

// Notify admin
await this.notifyPreDeletionExportComplete(tenantId, exportJob.id);

return exportJob.id;
}

private async uploadToRetentionBucket(
exportJobId: string,
tenantId: string
): Promise<void> {

const bucket = 'coditect-bio-qms-deleted-tenant-archives';
const destinationPath = `${tenantId}/pre-deletion-${exportJobId}.zip`;

const exportJob = await prisma.exportJob.findUnique({
where: { id: exportJobId }
});

if (!exportJob?.exportPackageUrl) {
throw new Error(`Export package not found for job ${exportJobId}`);
}

await storage.bucket(bucket).upload(exportJob.exportPackageUrl, {
destination: destinationPath,
metadata: {
contentType: 'application/zip',
metadata: {
tenantId,
exportJobId,
exportType: ExportType.PRE_DELETION,
deletionDate: new Date().toISOString(),
retentionPeriod: '7 years' // Regulatory requirement
}
},
storageClass: 'COLDLINE' // Cost-optimized long-term storage
});
}
}

10. Security Controls

10.1 Export Access Control

/**
* RBAC for export operations
* Only tenant admins can trigger exports
*/
const EXPORT_PERMISSIONS = {
CREATE_EXPORT: 'tenant:export:create',
VIEW_EXPORT: 'tenant:export:view',
DOWNLOAD_EXPORT: 'tenant:export:download',
DELETE_EXPORT: 'tenant:export:delete',
TRIGGER_MIGRATION: 'tenant:migration:execute',
ROLLBACK_MIGRATION: 'tenant:migration:rollback'
};

// Middleware: Check export permissions
async function checkExportPermission(
req: Request,
res: Response,
next: NextFunction
) {

const { tenantId } = req.params;
const userId = req.user!.id;

// Check if user has export permission for this tenant
const hasPermission = await rbac.checkPermission(
userId,
tenantId,
EXPORT_PERMISSIONS.CREATE_EXPORT
);

if (!hasPermission) {
return res.status(403).json({
error: 'Insufficient permissions to create export',
required: EXPORT_PERMISSIONS.CREATE_EXPORT
});
}

next();
}

// Audit log all export operations
async function auditExportOperation(
tenantId: string,
userId: string,
operation: string,
details: any
) {

await prisma.auditTrail.create({
data: {
tenantId,
entityType: 'export_jobs',
entityId: details.exportJobId || 'N/A',
action: operation,
performedBy: userId,
performerType: PerformerType.HUMAN,
performedAt: new Date(),
newVal: details
}
});
}

10.2 Export Data Redaction

/**
* Sensitive data redaction in exports
* API keys, passwords, tokens are never exported
*/
class ExportDataRedactor {

redactSensitiveData(data: any): any {

// Redact API keys
if (data.integration?.apiKeys) {
data.integration.apiKeys = data.integration.apiKeys.map((key: APIKey) => ({
...key,
keyValue: undefined, // Never export actual key
keyHash: key.keyHash, // Hash only
keyPrefix: key.keyPrefix, // First 8 chars for identification
redacted: true,
note: 'API key value not included in export for security. Regenerate after import.'
}));
}

// Redact OAuth tokens
if (data.integration?.oauthTokens) {
data.integration.oauthTokens = data.integration.oauthTokens.map((token: any) => ({
...token,
accessToken: '[REDACTED]',
refreshToken: '[REDACTED]',
redacted: true
}));
}

// Redact passwords (should never be in export, but defensive)
if (data.core?.users) {
data.core.users = data.core.users.map((user: any) => ({
...user,
password: undefined,
passwordHash: undefined
}));
}

// Redact PII per request
// (Caller can request PII redaction for GDPR anonymization)

return data;
}
}

11. Monitoring and Alerting

11.1 Export Metrics

/**
* Prometheus metrics for export operations
*/
const exportMetrics = {
exportJobsCreated: new Counter({
name: 'export_jobs_created_total',
help: 'Total number of export jobs created',
labelNames: ['tenant_id', 'export_type', 'format']
}),

exportJobsCompleted: new Counter({
name: 'export_jobs_completed_total',
help: 'Total number of export jobs completed successfully',
labelNames: ['tenant_id', 'export_type']
}),

exportJobsFailed: new Counter({
name: 'export_jobs_failed_total',
help: 'Total number of export jobs failed',
labelNames: ['tenant_id', 'export_type', 'error_type']
}),

exportDuration: new Histogram({
name: 'export_duration_seconds',
help: 'Export job duration in seconds',
labelNames: ['tenant_id', 'export_type'],
buckets: [60, 300, 600, 1800, 3600, 7200] // 1min to 2hrs
}),

exportPackageSize: new Histogram({
name: 'export_package_size_bytes',
help: 'Export package size in bytes',
labelNames: ['tenant_id', 'export_type'],
buckets: [1e6, 10e6, 100e6, 1e9, 10e9] // 1MB to 10GB
}),

migrationJobsCompleted: new Counter({
name: 'migration_jobs_completed_total',
help: 'Total number of migration jobs completed',
labelNames: ['source_tenant_id', 'target_tenant_id', 'strategy']
}),

migrationConflicts: new Counter({
name: 'migration_conflicts_total',
help: 'Total number of migration conflicts detected',
labelNames: ['conflict_type', 'resolution_strategy']
}),

migrationRollbacks: new Counter({
name: 'migration_rollbacks_total',
help: 'Total number of migration rollbacks',
labelNames: ['target_tenant_id', 'reason']
})
};

11.2 Alerting Rules

# Prometheus alerting rules for export/migration
groups:
- name: export_alerts
interval: 60s
rules:
- alert: ExportJobFailureRate
expr: |
rate(export_jobs_failed_total[5m]) > 0.1
for: 10m
labels:
severity: warning
annotations:
summary: "High export job failure rate"
description: "Export jobs are failing at {{ $value }} per second"

- alert: ExportJobStuck
expr: |
# Assumes a gauge recording when each job entered its current state
time() - export_job_state_entered_timestamp_seconds{status="EXTRACTING"} > 7200
for: 1h
labels:
severity: critical
annotations:
summary: "Export job stuck in EXTRACTING state"
description: "Export job {{ $labels.export_job_id }} has been extracting for over 2 hours"

- alert: MigrationConflictsSurge
expr: |
rate(migration_conflicts_total[30m]) > 10
for: 15m
labels:
severity: warning
annotations:
summary: "High migration conflict rate"
description: "Migration conflicts are occurring at {{ $value }} per second"

- alert: MigrationRollback
expr: |
increase(migration_rollbacks_total[1h]) > 0
labels:
severity: critical
annotations:
summary: "Migration rollback occurred"
description: "Migration for tenant {{ $labels.target_tenant_id }} was rolled back"

12. Testing and Validation

12.1 Export Completeness Tests

/**
* Integration tests for export completeness
* Ensures 100% data coverage
*/
describe('Tenant Data Export', () => {

it('should export all work orders with relationships', async () => {
// Seed test tenant
const tenant = await createTestTenant();
const workOrders = await seedWorkOrders(tenant.id, 100);

// Execute export
const exportJob = await exportService.createExport(tenant.id, {
type: ExportType.FULL,
format: [ExportFormat.JSON],
scope: {
domains: [ExportDomain.QMS],
includeDeleted: false,
includeAttachments: true,
includeAuditTrail: true
}
});

await exportService.executeExport(exportJob.id);

// Load export package
const exportData = await loadExportPackage(exportJob.exportPackageUrl);

// Verify row counts
expect(exportData.data.qms.workOrders.length).toBe(100);

// Verify relationships preserved
for (const wo of exportData.data.qms.workOrders) {
expect(wo.originator).toBeDefined();
expect(wo.assignee).toBeDefined();
expect(wo.item).toBeDefined();

if (wo.approvals.length > 0) {
expect(wo.approvals[0].signature).toBeDefined();
}
}
});

it('should verify export integrity', async () => {
const tenant = await createTestTenant();
await seedCompleteDataset(tenant.id);

const exportJob = await exportService.createExport(tenant.id, {
type: ExportType.FULL,
format: [ExportFormat.JSON, ExportFormat.CSV],
scope: {
domains: Object.values(ExportDomain),
includeDeleted: true,
includeAttachments: true,
includeAuditTrail: true
}
});

await exportService.executeExport(exportJob.id);

// Verify integrity
const verifier = new ExportIntegrityVerifier();
const verification = await verifier.verifyExport(
exportJob.id,
exportJob.exportPackageUrl
);

// All checks must pass
expect(verification.passed).toBe(true);
expect(verification.failureReasons).toHaveLength(0);

// Row counts must match
for (const check of verification.rowCountVerification) {
expect(check.match).toBe(true);
expect(check.discrepancy).toBe(0);
}

// File hashes must match
for (const check of verification.fileHashVerification) {
expect(check.match).toBe(true);
}

// Foreign keys must resolve
for (const check of verification.foreignKeyVerification) {
expect(check.passed).toBe(true);
expect(check.resolutionRate).toBe(1.0);
}
});
});

12.2 Migration Tests

describe('Tenant Migration', () => {

it('should migrate tenant data without conflicts', async () => {
// Create source and target tenants
const sourceTenant = await createTestTenant('Source Corp');
const targetTenant = await createTestTenant('Target Corp');

// Seed source tenant
await seedCompleteDataset(sourceTenant.id);

// Export source
const exportJob = await exportService.createExport(sourceTenant.id, {
type: ExportType.FULL,
format: [ExportFormat.JSON],
scope: { /* full scope */ }
});
await exportService.executeExport(exportJob.id);

// Execute migration
const migrationJob = await migrationService.createMigration({
sourceTenantId: sourceTenant.id,
targetTenantId: targetTenant.id,
exportPackageUrl: exportJob.exportPackageUrl,
strategy: MigrationStrategy.MERGE,
conflictResolution: ConflictResolution.FAIL,
userMappingStrategy: UserMappingStrategy.CREATE_NEW
});

const report = await migrationService.executeMigration(migrationJob.id);

// Verify migration success
expect(report.summary.failedRecords).toBe(0);
expect(report.summary.conflictsDetected).toBe(0);

// Verify data in target
const targetWorkOrders = await prisma.workOrder.findMany({
where: { tenantId: targetTenant.id }
});

expect(targetWorkOrders.length).toBeGreaterThan(0);
});

it('should rollback migration on failure', async () => {
const sourceTenant = await createTestTenant();
const targetTenant = await createTestTenant();

// Seed both tenants with conflicting data
await seedConflictingData(sourceTenant.id, targetTenant.id);

// Capture the target tenant's pre-migration record count
const targetCountBefore = await prisma.workOrder.count({
where: { tenantId: targetTenant.id }
});

// Export source
const exportJob = await exportService.createExport(sourceTenant.id, {
type: ExportType.FULL,
format: [ExportFormat.JSON],
scope: { /* full scope */ }
});
await exportService.executeExport(exportJob.id);

// Attempt migration with FAIL conflict resolution
const migrationJob = await migrationService.createMigration({
sourceTenantId: sourceTenant.id,
targetTenantId: targetTenant.id,
exportPackageUrl: exportJob.exportPackageUrl,
strategy: MigrationStrategy.MERGE,
conflictResolution: ConflictResolution.FAIL,
userMappingStrategy: UserMappingStrategy.EMAIL_MATCH
});

// Should fail due to conflicts
await expect(
migrationService.executeMigration(migrationJob.id)
).rejects.toThrow('Migration failed: unresolved conflicts');

// Verify rollback occurred
const updatedJob = await prisma.migrationJob.findUnique({
where: { id: migrationJob.id }
});

expect(updatedJob!.status).toBe(MigrationJobStatus.ROLLED_BACK);

// Verify target tenant restored to pre-migration state
const targetDataAfter = await prisma.workOrder.findMany({
where: { tenantId: targetTenant.id }
});

expect(targetDataAfter.length).toBe(targetCountBefore);
});
});

13. Documentation and Training

### 13.1 User Documentation

**Export User Guide**: `docs/user-guides/tenant-data-export.md`

Topics:

- How to trigger an on-demand export
- Understanding export formats (JSON, CSV, XML, PDF)
- Downloading and verifying export packages
- Scheduling automated exports
- Retention policies and expiration
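The "downloading and verifying" step can be sketched as follows. This is a minimal illustration only: it assumes a manifest that lists each file's SHA-256 digest, and the names `ExportManifest`, `sha256Hex`, and `verifyPackage` are hypothetical. The normative package layout, signing scheme, and verification procedure are defined by the export format specification, not by this sketch.

```typescript
import { createHash } from "node:crypto";

// Hypothetical manifest shape for illustration; the real export package
// layout is defined in the export format specification.
interface ExportManifest {
  schemaVersion: string;
  files: { path: string; sha256: string }[];
}

// Compute the hex-encoded SHA-256 digest of a file's contents.
function sha256Hex(contents: Buffer | string): string {
  return createHash("sha256").update(contents).digest("hex");
}

// Verify that every file listed in the manifest matches its recorded digest.
// `readFile` is injected so the sketch stays storage-agnostic.
function verifyPackage(
  manifest: ExportManifest,
  readFile: (path: string) => Buffer | string
): { ok: boolean; mismatches: string[] } {
  const mismatches = manifest.files
    .filter((f) => sha256Hex(readFile(f.path)) !== f.sha256)
    .map((f) => f.path);
  return { ok: mismatches.length === 0, mismatches };
}
```

In practice the download tooling would run this check automatically and refuse to open a package with any mismatched digest.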

**Migration User Guide**: `docs/user-guides/tenant-migration.md`

Topics:

- When to use tenant migration (M&A scenarios)
- Pre-migration checklist
- Executing dry run validation
- Resolving migration conflicts
- Post-migration verification
- Rollback procedures
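The dry-run step above can be illustrated with a small sketch of conflict detection without data import. The record shape and the natural-key matching rule (`externalId` here) are assumptions for illustration; the actual matching and conflict rules belong to the migration service.

```typescript
// Hypothetical record shape for illustration; real domain records are richer.
interface MigRecord {
  externalId: string;
  payload: unknown;
}

interface DryRunResult {
  wouldImport: number;  // records that would insert cleanly
  conflicts: string[];  // keys present in both source and target
}

// Simulate a MERGE migration without writing anything: report what would
// import cleanly and which keys would require conflict resolution.
function dryRunMerge(source: MigRecord[], target: MigRecord[]): DryRunResult {
  const targetKeys = new Set(target.map((r) => r.externalId));
  const conflicts = source
    .map((r) => r.externalId)
    .filter((key) => targetKeys.has(key));
  return { wouldImport: source.length - conflicts.length, conflicts };
}
```

A dry run that reports zero conflicts under the FAIL resolution strategy is a reasonable precondition before scheduling the real migration window.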

### 13.2 Runbooks

**Export Failure Runbook**: `docs/runbooks/export-failure-recovery.md`

Steps:

1. Check the export job status in the database
2. Review error logs to identify the specific failure
3. Verify tenant data integrity
4. Retry the export with a reduced scope
5. Escalate to engineering if the failure persists
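The reduced-scope retry in step 4 can be sketched as a per-domain retry loop that also isolates which domain is failing. `retryByDomain` and the injected `runExport` callback are hypothetical names for illustration, not the real export service API.

```typescript
// Retry a failed full export one domain at a time. Domains that still fail
// are returned for escalation; the rest can be re-exported in bulk.
type Domain = string;

async function retryByDomain(
  domains: Domain[],
  runExport: (scope: Domain[]) => Promise<void>
): Promise<{ succeeded: Domain[]; failed: Domain[] }> {
  const succeeded: Domain[] = [];
  const failed: Domain[] = [];
  for (const domain of domains) {
    try {
      await runExport([domain]); // reduced scope: a single domain
      succeeded.push(domain);
    } catch {
      failed.push(domain); // candidate for engineering escalation (step 5)
    }
  }
  return { succeeded, failed };
}
```

Retrying per domain trades throughput for diagnosability: a persistent failure is pinned to a specific domain rather than to the whole tenant dataset.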

**Migration Rollback Runbook**: `docs/runbooks/migration-rollback.md`

Steps:

1. Verify rollback snapshot availability
2. Notify stakeholders of the rollback decision
3. Execute the rollback API call
4. Verify the target tenant is restored to its pre-migration state
5. Document the rollback reason and lessons learned
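The verification in step 4 can be sketched as a comparison between the rollback snapshot and the target tenant's current state. The shapes and the function name `findCountDrift` are assumptions for illustration; comparing per-domain record counts is only a cheap first-pass check, and the full procedure would also compare row-level checksums.

```typescript
// Per-domain record counts, e.g. { "workOrders" -> 10, "users" -> 4 }.
type DomainCounts = Map<string, number>;

// Return the domains whose current record count differs from the snapshot.
// An empty result is necessary (but not sufficient) evidence of a clean
// restore; checksum comparison completes the verification.
function findCountDrift(snapshot: DomainCounts, current: DomainCounts): string[] {
  const drifted: string[] = [];
  for (const [domain, count] of snapshot) {
    if (current.get(domain) !== count) drifted.push(domain);
  }
  return drifted.sort();
}
```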

## 14. Appendix

### 14.1 Export Schema Versioning

| Schema Version | Release Date | Changes | Backward Compatible |
|----------------|--------------|---------|---------------------|
| 1.0.0 | 2026-02-16 | Initial release | N/A |
| 1.1.0 | TBD | Add training records domain | Yes (additive only) |
| 2.0.0 | TBD | Breaking: change ID format to UUIDs | No |
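The compatibility rule implied by the table (minor bumps are additive and safe to read, major bumps are breaking) could be gated in an importer roughly as follows. The function name and the major-version-equality rule are assumptions for illustration; the normative compatibility check is defined by the schema versioning policy.

```typescript
// Accept an export package only when its schema major version matches the
// importer's, per the additive-minor / breaking-major convention above.
function canImport(packageSchemaVersion: string, importerSchemaVersion: string): boolean {
  const major = (v: string): number => Number(v.split(".")[0]);
  return major(packageSchemaVersion) === major(importerSchemaVersion);
}
```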

### 14.2 Glossary

| Term | Definition |
|------|------------|
| Export Job | Asynchronous task that extracts tenant data into an export package |
| Export Package | ZIP archive containing tenant data in multiple formats with integrity verification |
| Migration Job | Asynchronous task that imports data from one tenant to another |
| Conflict | Data discrepancy between source and target tenants requiring resolution |
| Rollback Snapshot | Pre-migration backup of the target tenant enabling rollback within 72 hours |
| Dry Run | Migration validation without actual data import |
| User Mapping | Association between source tenant users and target tenant users |
| Integrity Verification | Multi-layer checks ensuring export completeness and tamper evidence |

**Document Prepared By**: Chief Data Officer Office
**Technical Review**: Chief Architect, VP Engineering
**Compliance Review**: Regulatory Affairs, Legal Department
**Approval Status**: Draft - Pending Executive Approval

Next Steps:

  1. Executive approval (target: 2026-02-20)
  2. Engineering implementation (target: 2026-03-01)
  3. QA validation and testing (target: 2026-03-15)
  4. User training and documentation (target: 2026-03-20)
  5. Production deployment (target: 2026-04-01)

End of Document