Audit Trail Search and Reporting API

Document ID: CODITECT-BIO-AUDIT-SEARCH-001
Version: 1.0.0
Effective Date: 2026-02-16
Classification: Internal - Restricted
Owner: Chief Information Security Officer (CISO)


Document Control

Approval History

| Role | Name | Signature | Date |
|------|------|-----------|------|
| Chief Information Security Officer | [Pending] | [Digital Signature] | YYYY-MM-DD |
| VP Quality Assurance | [Pending] | [Digital Signature] | YYYY-MM-DD |
| Privacy Officer | [Pending] | [Digital Signature] | YYYY-MM-DD |
| Compliance Officer | [Pending] | [Digital Signature] | YYYY-MM-DD |

Revision History

| Version | Date | Author | Changes | Approval Status |
|---------|------|--------|---------|-----------------|
| 1.0.0 | 2026-02-16 | CISO Office | Initial release | Draft |

Distribution List

  • Executive Leadership Team
  • Information Security Team
  • Quality Assurance Team
  • Compliance and Regulatory Affairs
  • Privacy Office
  • Internal Audit
  • External Auditors (as needed)

Review Schedule

| Review Type | Frequency | Next Review Date | Responsible Party |
|-------------|-----------|------------------|-------------------|
| Annual Review | 12 months | 2027-02-16 | CISO |
| Quarterly Performance Review | 3 months | 2026-05-16 | Engineering Leadership |
| Regulatory Update Review | As needed | N/A | Compliance Officer |
| Retention Policy Review | Quarterly | 2026-05-16 | Legal & Compliance |

1. Purpose and Scope

1.1 Purpose

This specification establishes the audit trail search, reporting, and retention management capabilities for the CODITECT Biosciences Quality Management System (BIO-QMS) Platform to ensure:

  1. Comprehensive Search - Efficient retrieval of audit records with flexible filtering and full-text search
  2. Multi-Format Export - Support for CSV, PDF, JSON, and XML exports with regulatory submission compatibility
  3. Retention Compliance - Automated retention management per FDA, HIPAA, and SOC 2 requirements
  4. Regulatory Reporting - Pre-built report templates for compliance audits and investigations
  5. Performance Optimization - Sub-2-second typical searches with materialized views and caching

1.2 Scope

This specification applies to:

In Scope:

  • REST and GraphQL search APIs for audit trail queries
  • Export pipeline supporting CSV, PDF (digitally signed), JSON, and XML (eCTD compatible)
  • Retention policy management with per-regulation and per-tenant overrides
  • Pre-built compliance report templates and scheduled generation
  • Anomaly detection for suspicious access patterns
  • Performance optimization with materialized views and caching
  • Export job queue with progress tracking

Out of Scope:

  • Audit trail write operations (covered in electronic-record-controls.md)
  • Cryptographic integrity verification (covered in crypto-standards-policy.md)
  • PHI-specific breach notification (covered in hipaa-audit-reporting.md)
  • Real-time monitoring dashboards (covered in soc2-continuous-monitoring.md)

1.3 Regulatory Alignment

| Regulation | Requirement | Implementation |
|------------|-------------|----------------|
| FDA 21 CFR Part 11.10(e) | Ability to generate accurate and complete copies of records | Multi-format export with integrity verification |
| FDA 21 CFR Part 11.50(b) | Retention of records throughout retention period | Automated retention management with legal hold |
| HIPAA §164.312(b) | Hardware and software mechanisms to record/examine activity | Comprehensive search API with anomaly detection |
| HIPAA §164.316(b)(2)(i) | 6-year retention of audit logs | Configurable retention with HIPAA-specific policies |
| SOC 2 CC6.1 | Logical and physical access controls monitored and evaluated | Access pattern reporting and anomaly detection |
| SOC 2 CC7.2 | System operations monitoring | Scheduled compliance reports and dashboards |

2. Architecture Overview

2.1 System Components

```typescript
// Core architecture components
interface AuditSearchArchitecture {
  searchLayer: {
    restAPI: SearchAPI;
    graphqlAPI: GraphQLSearchAPI;
    rateLimiter: RateLimiter;
    cacheManager: CacheManager;
  };

  exportPipeline: {
    jobQueue: ExportJobQueue;
    formatters: {
      csv: CSVFormatter;
      pdf: PDFFormatter;
      json: JSONFormatter;
      xml: XMLFormatter;
    };
    signatureService: DigitalSignatureService;
    storageService: ExportStorageService;
  };

  retentionManager: {
    policyEngine: RetentionPolicyEngine;
    purgeOrchestrator: PurgeOrchestrator;
    legalHoldService: LegalHoldService;
    complianceValidator: RetentionComplianceValidator;
  };

  reportingEngine: {
    templateRegistry: ReportTemplateRegistry;
    scheduler: ReportScheduler;
    generator: ReportGenerator;
    distributionService: ReportDistributionService;
    anomalyDetector: AnomalyDetectionEngine;
  };

  performanceLayer: {
    materializedViews: MaterializedViewManager;
    queryOptimizer: QueryOptimizer;
    indexManager: IndexManager;
    metricsCollector: PerformanceMetricsCollector;
  };
}
```

2.2 Data Flow

```
┌─────────────────┐
│  User Request   │
│ (Search/Export) │
└────────┬────────┘
         │
         ▼
┌─────────────────┐
│  Rate Limiter   │◄──── 100/min user, 1000/min tenant
└────────┬────────┘
         │
         ▼
┌─────────────────┐
│   Cache Check   │◄──── TTL-based result cache
└────────┬────────┘
         │ (cache miss)
         ▼
┌─────────────────┐
│ Query Optimizer │◄──── Materialized views
└────────┬────────┘
         │
         ▼
┌─────────────────┐
│   PostgreSQL    │◄──── Optimized indexes
│  audit_events   │
└────────┬────────┘
         │
         ▼
┌─────────────────┐
│ Result Formatter│
│ (REST/GraphQL)  │
└────────┬────────┘
         │
         ├──► Export Pipeline (if export requested)
         │         │
         │         ▼
         │    ┌─────────────────┐
         │    │  Export Queue   │
         │    └────────┬────────┘
         │             │
         │             ▼
         │    ┌─────────────────┐
         │    │   Formatters    │
         │    │ CSV/PDF/JSON/XML│
         │    └────────┬────────┘
         │             │
         │             ▼
         │    ┌─────────────────┐
         │    │Digital Signature│
         │    │   (PDF only)    │
         │    └────────┬────────┘
         │             │
         │             ▼
         │    ┌─────────────────┐
         │    │  Cloud Storage  │
         │    │    (GCS/S3)     │
         │    └─────────────────┘
         │
         └──► User Response
```
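The cache-check step in the flow above keys results on the normalized filter set, so identical queries arriving within the TTL are served without touching PostgreSQL. One way that key derivation might look is sketched below; the `auditCacheKey` helper and its format are illustrative, not part of the platform API.

```typescript
// Illustrative sketch: derive a deterministic cache key from search filters.
// Sorting entries makes the key independent of parameter order, so the same
// query with reordered arguments still produces a cache hit.
export function auditCacheKey(
  filters: Record<string, string | number | undefined>
): string {
  const parts = Object.entries(filters)
    .filter(([, v]) => v !== undefined)       // drop unset filters
    .sort(([a], [b]) => a.localeCompare(b))   // canonical ordering
    .map(([k, v]) => `${k}=${v}`);
  return `audit-search:${parts.join('&')}`;
}
```

Because `tenant_id` is a required filter, the key is naturally tenant-scoped, which keeps cached results from leaking across tenants.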

3. Search API Specification

3.1 REST API Endpoints

3.1.1 Search Endpoint

OpenAPI 3.0 Specification:

```yaml
openapi: 3.0.3
info:
  title: BIO-QMS Audit Trail Search API
  version: 1.0.0
  description: Regulatory-compliant audit trail search and retrieval

servers:
  - url: https://api.coditect-bio.com/v1
    description: Production environment
  - url: https://api-staging.coditect-bio.com/v1
    description: Staging environment

security:
  - bearerAuth: []

paths:
  /audit-trail/search:
    get:
      summary: Search audit trail events
      operationId: searchAuditTrail
      tags:
        - Audit Trail
      parameters:
        - name: tenant_id
          in: query
          required: true
          schema:
            type: string
            format: uuid
          description: Tenant identifier (automatically scoped to user's tenants)

        - name: user_id
          in: query
          schema:
            type: string
            format: uuid
          description: Filter by specific user

        - name: resource_type
          in: query
          schema:
            type: string
            enum:
              - document
              - validation_protocol
              - sop
              - training_record
              - signature
              - user
              - role
              - tenant
              - encryption_key
          description: Type of resource accessed

        - name: resource_id
          in: query
          schema:
            type: string
            format: uuid
          description: Specific resource identifier

        - name: action_type
          in: query
          schema:
            type: string
            enum:
              - create
              - read
              - update
              - delete
              - approve
              - reject
              - sign
              - export
              - print
              - login
              - logout
              - password_change
              - permission_grant
              - permission_revoke
          description: Type of action performed

        - name: start_date
          in: query
          schema:
            type: string
            format: date-time
          description: Start of date range (ISO 8601)

        - name: end_date
          in: query
          schema:
            type: string
            format: date-time
          description: End of date range (ISO 8601)

        - name: full_text_query
          in: query
          schema:
            type: string
          description: Full-text search across old_value/new_value JSONB fields

        - name: severity
          in: query
          schema:
            type: string
            enum:
              - info
              - warning
              - critical
          description: Event severity level

        - name: ip_address
          in: query
          schema:
            type: string
            format: ipv4
          description: Filter by originating IP address

        - name: cursor
          in: query
          schema:
            type: string
          description: Pagination cursor (opaque token)

        - name: limit
          in: query
          schema:
            type: integer
            minimum: 1
            maximum: 1000
            default: 100
          description: Number of results per page

        - name: sort_by
          in: query
          schema:
            type: string
            enum:
              - timestamp_asc
              - timestamp_desc
              - user_id
              - resource_type
            default: timestamp_desc
          description: Sort order for results

      responses:
        '200':
          description: Successful search
          content:
            application/json:
              schema:
                $ref: '#/components/schemas/SearchResponse'

        '400':
          description: Invalid query parameters
          content:
            application/json:
              schema:
                $ref: '#/components/schemas/ErrorResponse'

        '401':
          description: Unauthorized - invalid or missing authentication

        '403':
          description: Forbidden - insufficient permissions

        '429':
          description: Rate limit exceeded
          headers:
            X-RateLimit-Limit:
              schema:
                type: integer
              description: Request limit per window
            X-RateLimit-Remaining:
              schema:
                type: integer
              description: Remaining requests in window
            X-RateLimit-Reset:
              schema:
                type: integer
              description: Unix timestamp when limit resets

        '500':
          description: Internal server error

  /audit-trail/export:
    post:
      summary: Request audit trail export
      operationId: exportAuditTrail
      tags:
        - Audit Trail
      requestBody:
        required: true
        content:
          application/json:
            schema:
              $ref: '#/components/schemas/ExportRequest'

      responses:
        '202':
          description: Export job accepted
          content:
            application/json:
              schema:
                $ref: '#/components/schemas/ExportJobResponse'

        '400':
          description: Invalid export request

        '401':
          description: Unauthorized

        '403':
          description: Forbidden - insufficient permissions

  /audit-trail/export/{job_id}:
    get:
      summary: Get export job status
      operationId: getExportJobStatus
      tags:
        - Audit Trail
      parameters:
        - name: job_id
          in: path
          required: true
          schema:
            type: string
            format: uuid

      responses:
        '200':
          description: Export job status
          content:
            application/json:
              schema:
                $ref: '#/components/schemas/ExportJobStatus'

        '404':
          description: Export job not found

components:
  securitySchemes:
    bearerAuth:
      type: http
      scheme: bearer
      bearerFormat: JWT

  schemas:
    SearchResponse:
      type: object
      required:
        - data
        - pagination
        - metadata
      properties:
        data:
          type: array
          items:
            $ref: '#/components/schemas/AuditEvent'
        pagination:
          $ref: '#/components/schemas/PaginationInfo'
        metadata:
          $ref: '#/components/schemas/SearchMetadata'

    AuditEvent:
      type: object
      required:
        - id
        - tenant_id
        - timestamp
        - user_id
        - action_type
        - resource_type
        - chain_hash
      properties:
        id:
          type: string
          format: uuid
          description: Unique event identifier
        tenant_id:
          type: string
          format: uuid
        timestamp:
          type: string
          format: date-time
          description: ISO 8601 UTC timestamp
        user_id:
          type: string
          format: uuid
        user_full_name:
          type: string
        action_type:
          type: string
        resource_type:
          type: string
        resource_id:
          type: string
          format: uuid
        old_value:
          type: object
          nullable: true
          description: JSONB field with previous state
        new_value:
          type: object
          nullable: true
          description: JSONB field with new state
        ip_address:
          type: string
          format: ipv4
        user_agent:
          type: string
        session_id:
          type: string
          format: uuid
        severity:
          type: string
          enum: [info, warning, critical]
        chain_hash:
          type: string
          description: SHA-256 hash linking to previous event
        signature:
          type: string
          description: ECDSA P-256 signature for tamper detection
        metadata:
          type: object
          additionalProperties: true

    PaginationInfo:
      type: object
      required:
        - total_count
        - has_next_page
        - has_previous_page
      properties:
        total_count:
          type: integer
          description: Total matching records
        has_next_page:
          type: boolean
        has_previous_page:
          type: boolean
        next_cursor:
          type: string
          nullable: true
        previous_cursor:
          type: string
          nullable: true

    SearchMetadata:
      type: object
      properties:
        query_time_ms:
          type: integer
          description: Query execution time in milliseconds
        cache_hit:
          type: boolean
          description: Whether result was served from cache
        indexes_used:
          type: array
          items:
            type: string

    ExportRequest:
      type: object
      required:
        - format
        - search_filters
      properties:
        format:
          type: string
          enum: [csv, pdf, json, xml]
        search_filters:
          type: object
          description: Same filter parameters as search endpoint
        columns:
          type: array
          items:
            type: string
          description: Columns to include in export (CSV only)
        include_signature:
          type: boolean
          default: true
          description: Add digital signature (PDF only)
        notification_email:
          type: string
          format: email
          description: Email for completion notification

    ExportJobResponse:
      type: object
      properties:
        job_id:
          type: string
          format: uuid
        status:
          type: string
          enum: [queued, processing, completed, failed]
        estimated_completion_time:
          type: string
          format: date-time

    ExportJobStatus:
      type: object
      properties:
        job_id:
          type: string
          format: uuid
        status:
          type: string
          enum: [queued, processing, completed, failed]
        progress_percent:
          type: integer
          minimum: 0
          maximum: 100
        created_at:
          type: string
          format: date-time
        completed_at:
          type: string
          format: date-time
          nullable: true
        download_url:
          type: string
          format: uri
          nullable: true
          description: Pre-signed URL valid for 24 hours
        error_message:
          type: string
          nullable: true

    ErrorResponse:
      type: object
      properties:
        error:
          type: object
          properties:
            code:
              type: string
            message:
              type: string
            details:
              type: object
              additionalProperties: true
```
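As a concrete illustration of the parameter contract above, the sketch below normalizes the `limit` query parameter exactly as the schema describes (default 100, bounds 1-1000). The helper name and the choice to reject rather than clamp are illustrative; they mirror the 400 response the spec defines for invalid parameters.

```typescript
// Illustrative: validate the `limit` parameter per the OpenAPI schema above
// (default: 100, minimum: 1, maximum: 1000). Out-of-range or non-integer
// values are rejected, which the API surfaces as a 400 error.
export function normalizeLimit(raw?: string): number {
  if (raw === undefined) return 100; // schema default
  const n = Number(raw);
  if (!Number.isInteger(n) || n < 1 || n > 1000) {
    throw new RangeError(`limit must be an integer in [1, 1000], got: ${raw}`);
  }
  return n;
}
```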

3.1.2 TypeScript Client SDK

// Client SDK for BIO-QMS Audit Trail API
import axios, { AxiosInstance, AxiosRequestConfig } from 'axios';

export interface SearchFilters {
tenant_id: string;
user_id?: string;
resource_type?: ResourceType;
resource_id?: string;
action_type?: ActionType;
start_date?: Date;
end_date?: Date;
full_text_query?: string;
severity?: 'info' | 'warning' | 'critical';
ip_address?: string;
cursor?: string;
limit?: number;
sort_by?: 'timestamp_asc' | 'timestamp_desc' | 'user_id' | 'resource_type';
}

export type ResourceType =
| 'document'
| 'validation_protocol'
| 'sop'
| 'training_record'
| 'signature'
| 'user'
| 'role'
| 'tenant'
| 'encryption_key';

export type ActionType =
| 'create'
| 'read'
| 'update'
| 'delete'
| 'approve'
| 'reject'
| 'sign'
| 'export'
| 'print'
| 'login'
| 'logout'
| 'password_change'
| 'permission_grant'
| 'permission_revoke';

export interface AuditEvent {
id: string;
tenant_id: string;
timestamp: string;
user_id: string;
user_full_name: string;
action_type: ActionType;
resource_type: ResourceType;
resource_id: string;
old_value?: Record<string, any>;
new_value?: Record<string, any>;
ip_address: string;
user_agent: string;
session_id: string;
severity: 'info' | 'warning' | 'critical';
chain_hash: string;
signature: string;
metadata?: Record<string, any>;
}

export interface SearchResponse {
data: AuditEvent[];
pagination: {
total_count: number;
has_next_page: boolean;
has_previous_page: boolean;
next_cursor?: string;
previous_cursor?: string;
};
metadata: {
query_time_ms: number;
cache_hit: boolean;
indexes_used: string[];
};
}

export interface ExportRequest {
format: 'csv' | 'pdf' | 'json' | 'xml';
search_filters: SearchFilters;
columns?: string[];
include_signature?: boolean;
notification_email?: string;
}

export interface ExportJobStatus {
job_id: string;
status: 'queued' | 'processing' | 'completed' | 'failed';
progress_percent: number;
created_at: string;
completed_at?: string;
download_url?: string;
error_message?: string;
}

export class AuditTrailClient {
private client: AxiosInstance;

constructor(
baseURL: string,
private authToken: string
) {
this.client = axios.create({
baseURL,
headers: {
'Authorization': `Bearer ${authToken}`,
'Content-Type': 'application/json'
}
});

// Add rate limit retry logic
this.client.interceptors.response.use(
response => response,
async error => {
if (error.response?.status === 429) {
const resetTime = parseInt(error.response.headers['x-ratelimit-reset']);
const waitTime = (resetTime * 1000) - Date.now();

if (waitTime > 0 && waitTime < 60000) {
await new Promise(resolve => setTimeout(resolve, waitTime));
return this.client.request(error.config);
}
}
throw error;
}
);
}

async search(filters: SearchFilters): Promise<SearchResponse> {
const params = new URLSearchParams();

Object.entries(filters).forEach(([key, value]) => {
if (value !== undefined) {
if (value instanceof Date) {
params.append(key, value.toISOString());
} else {
params.append(key, String(value));
}
}
});

const response = await this.client.get<SearchResponse>(
`/audit-trail/search?${params.toString()}`
);

return response.data;
}

async exportAuditTrail(request: ExportRequest): Promise<string> {
const response = await this.client.post<{ job_id: string }>(
'/audit-trail/export',
request
);

return response.data.job_id;
}

async getExportStatus(jobId: string): Promise<ExportJobStatus> {
const response = await this.client.get<ExportJobStatus>(
`/audit-trail/export/${jobId}`
);

return response.data;
}

async *searchAll(filters: Omit<SearchFilters, 'cursor'>): AsyncGenerator<AuditEvent> {
let cursor: string | undefined = undefined;

do {
const response = await this.search({ ...filters, cursor });

for (const event of response.data) {
yield event;
}

cursor = response.pagination.next_cursor;
} while (cursor);
}
}
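The 429 interceptor in the client above retries only when the advertised reset time is in the future and less than a minute away; longer waits are propagated to the caller as errors. That decision is easiest to reason about as a pure function (the `retryWaitMs` name is illustrative, not part of the SDK):

```typescript
// Illustrative: the interceptor's backoff decision as a pure function.
// Returns milliseconds to wait before retrying, or null when the client
// should surface the 429 to the caller instead of retrying.
export function retryWaitMs(resetUnixSeconds: number, nowMs: number): number | null {
  const waitTime = resetUnixSeconds * 1000 - nowMs;
  // Retry only for short waits; anything >= 60s propagates as an error.
  return waitTime > 0 && waitTime < 60000 ? waitTime : null;
}
```

Isolating the rule this way also makes the SDK's retry behavior unit-testable without mocking HTTP traffic.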

3.2 GraphQL API

"""
Audit trail search and reporting API
"""

type Query {
"""
Search audit trail events with flexible filtering
"""
searchAuditTrail(
tenant_id: UUID!
user_id: UUID
resource_type: ResourceType
resource_id: UUID
action_type: ActionType
start_date: DateTime
end_date: DateTime
full_text_query: String
severity: Severity
ip_address: String
cursor: String
limit: Int = 100
sort_by: SortOrder = TIMESTAMP_DESC
): AuditTrailSearchResult!

"""
Get export job status
"""
exportJobStatus(job_id: UUID!): ExportJobStatus!
}

type Mutation {
"""
Request audit trail export
"""
exportAuditTrail(input: ExportRequestInput!): ExportJobResponse!
}

enum ResourceType {
DOCUMENT
VALIDATION_PROTOCOL
SOP
TRAINING_RECORD
SIGNATURE
USER
ROLE
TENANT
ENCRYPTION_KEY
}

enum ActionType {
CREATE
READ
UPDATE
DELETE
APPROVE
REJECT
SIGN
EXPORT
PRINT
LOGIN
LOGOUT
PASSWORD_CHANGE
PERMISSION_GRANT
PERMISSION_REVOKE
}

enum Severity {
INFO
WARNING
CRITICAL
}

enum SortOrder {
TIMESTAMP_ASC
TIMESTAMP_DESC
USER_ID
RESOURCE_TYPE
}

enum ExportFormat {
CSV
PDF
JSON
XML
}

enum ExportJobState {
QUEUED
PROCESSING
COMPLETED
FAILED
}

type AuditEvent {
id: UUID!
tenant_id: UUID!
timestamp: DateTime!
user: User!
action_type: ActionType!
resource_type: ResourceType!
resource_id: UUID!
old_value: JSON
new_value: JSON
ip_address: String!
user_agent: String!
session_id: UUID!
severity: Severity!
chain_hash: String!
signature: String!
metadata: JSON
}

type User {
id: UUID!
full_name: String!
email: String!
role: String!
}

type AuditTrailSearchResult {
data: [AuditEvent!]!
pagination: PaginationInfo!
metadata: SearchMetadata!
}

type PaginationInfo {
total_count: Int!
has_next_page: Boolean!
has_previous_page: Boolean!
next_cursor: String
previous_cursor: String
}

type SearchMetadata {
query_time_ms: Int!
cache_hit: Boolean!
indexes_used: [String!]!
}

input ExportRequestInput {
format: ExportFormat!
search_filters: SearchFiltersInput!
columns: [String!]
include_signature: Boolean = true
notification_email: String
}

input SearchFiltersInput {
tenant_id: UUID!
user_id: UUID
resource_type: ResourceType
resource_id: UUID
action_type: ActionType
start_date: DateTime
end_date: DateTime
full_text_query: String
severity: Severity
ip_address: String
}

type ExportJobResponse {
job_id: UUID!
status: ExportJobState!
estimated_completion_time: DateTime!
}

type ExportJobStatus {
job_id: UUID!
status: ExportJobState!
progress_percent: Int!
created_at: DateTime!
completed_at: DateTime
download_url: String
error_message: String
}

scalar UUID
scalar DateTime
scalar JSON
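A request against the schema above is an ordinary JSON POST carrying the query text and variables. The sketch below assembles such a body for `searchAuditTrail`; the selection set is a plausible subset of the fields defined above, and the helper itself is illustrative rather than a shipped SDK function.

```typescript
// Illustrative: build the JSON body for a searchAuditTrail GraphQL request.
const SEARCH_QUERY = `
  query Search($tenant_id: UUID!, $limit: Int) {
    searchAuditTrail(tenant_id: $tenant_id, limit: $limit) {
      data { id timestamp action_type resource_type }
      pagination { total_count next_cursor }
      metadata { query_time_ms cache_hit }
    }
  }
`;

export function buildSearchBody(tenantId: string, limit = 100): string {
  return JSON.stringify({
    query: SEARCH_QUERY,
    variables: { tenant_id: tenantId, limit },
  });
}
```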

4. Export Pipeline Architecture

4.1 Export Formatters

4.1.1 CSV Formatter

```typescript
import * as fs from 'fs';
import { createObjectCsvWriter } from 'csv-writer';
import { AuditEvent } from './types';

export interface CSVExportOptions {
  columns?: string[];
  includeHeaders?: boolean;
  delimiter?: string;
}

const DEFAULT_CSV_COLUMNS = [
  'timestamp',
  'user',
  'action',
  'resource',
  'old_value',
  'new_value',
  'ip_address',
  'chain_hash'
];

export class CSVFormatter {
  async format(
    events: AuditEvent[],
    options: CSVExportOptions = {}
  ): Promise<Buffer> {
    const {
      columns = DEFAULT_CSV_COLUMNS,
      includeHeaders = true,
      delimiter = ','
    } = options;

    // Unique temp path so concurrent export jobs do not overwrite each other
    const tmpPath = `/tmp/audit-export-${Date.now()}-${Math.random().toString(36).slice(2)}.csv`;

    const csvWriter = createObjectCsvWriter({
      path: tmpPath,
      header: columns.map(col => ({
        id: col,
        title: this.getColumnTitle(col)
      })),
      fieldDelimiter: delimiter
    });

    const records = events.map(event => this.flattenEvent(event, columns));
    await csvWriter.writeRecords(records);

    const output = fs.readFileSync(tmpPath);
    fs.unlinkSync(tmpPath); // clean up the temp file after reading
    return output;
  }

  private flattenEvent(event: AuditEvent, columns: string[]): Record<string, any> {
    const flattened: Record<string, any> = {};

    columns.forEach(col => {
      switch (col) {
        case 'timestamp':
          flattened[col] = event.timestamp;
          break;
        case 'user':
          flattened[col] = event.user_full_name;
          break;
        case 'action':
          flattened[col] = event.action_type;
          break;
        case 'resource':
          flattened[col] = `${event.resource_type}:${event.resource_id}`;
          break;
        case 'old_value':
          flattened[col] = JSON.stringify(event.old_value);
          break;
        case 'new_value':
          flattened[col] = JSON.stringify(event.new_value);
          break;
        case 'ip_address':
          flattened[col] = event.ip_address;
          break;
        case 'chain_hash':
          flattened[col] = event.chain_hash;
          break;
        default:
          flattened[col] = event.metadata?.[col];
      }
    });

    return flattened;
  }

  private getColumnTitle(column: string): string {
    const titles: Record<string, string> = {
      timestamp: 'Timestamp (UTC)',
      user: 'User',
      action: 'Action Type',
      resource: 'Resource',
      old_value: 'Previous Value',
      new_value: 'New Value',
      ip_address: 'IP Address',
      chain_hash: 'Chain Hash'
    };

    return titles[column] || column;
  }
}
```
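Because `old_value` and `new_value` are serialized JSON, exported cells routinely contain commas and embedded double quotes; `csv-writer` applies the standard quoting for these automatically. For reference, the RFC 4180-style rule it follows looks like this (a standalone sketch, not part of the formatter):

```typescript
// Illustrative: RFC 4180-style escaping for a single CSV field. Fields
// containing the delimiter, a double quote, or a newline are wrapped in
// double quotes, with embedded quotes doubled.
export function escapeCsvField(value: string, delimiter = ','): string {
  if (value.includes(delimiter) || value.includes('"') || value.includes('\n')) {
    return `"${value.replace(/"/g, '""')}"`;
  }
  return value;
}
```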

4.1.2 PDF Formatter with Digital Signatures

import PDFDocument from 'pdfkit';
import { AuditEvent } from './types';
import { DigitalSignatureService } from './signature-service';

export interface PDFExportOptions {
includeSignature?: boolean;
includeTOC?: boolean;
pageSize?: 'letter' | 'a4';
orientation?: 'portrait' | 'landscape';
metadata?: {
title?: string;
author?: string;
subject?: string;
};
}

export class PDFFormatter {
constructor(
private signatureService: DigitalSignatureService
) {}

async format(
events: AuditEvent[],
options: PDFExportOptions = {}
): Promise<Buffer> {
const {
includeSignature = true,
includeTOC = true,
pageSize = 'letter',
orientation = 'portrait',
metadata = {}
} = options;

const doc = new PDFDocument({
size: pageSize,
layout: orientation,
info: {
Title: metadata.title || 'Audit Trail Export',
Author: metadata.author || 'CODITECT BIO-QMS',
Subject: metadata.subject || 'Regulatory Compliance Report',
Creator: 'CODITECT BIO-QMS Platform',
Producer: 'CODITECT BIO-QMS v1.0.0',
CreationDate: new Date()
}
});

const buffers: Buffer[] = [];
doc.on('data', buffers.push.bind(buffers));

// Cover page
this.addCoverPage(doc, events);

// Table of contents
if (includeTOC) {
doc.addPage();
this.addTableOfContents(doc, events);
}

// Audit events
doc.addPage();
this.addAuditEvents(doc, events);

// Signature page
if (includeSignature) {
doc.addPage();
await this.addSignaturePage(doc, events);
}

doc.end();

return new Promise((resolve, reject) => {
doc.on('end', () => {
const pdfBuffer = Buffer.concat(buffers);

if (includeSignature) {
resolve(this.signatureService.signPDF(pdfBuffer));
} else {
resolve(pdfBuffer);
}
});

doc.on('error', reject);
});
}

private addCoverPage(doc: PDFKit.PDFDocument, events: AuditEvent[]): void {
doc.fontSize(24)
.font('Helvetica-Bold')
.text('Audit Trail Export', { align: 'center' });

doc.moveDown(2);

doc.fontSize(12)
.font('Helvetica')
.text(`Generated: ${new Date().toISOString()}`, { align: 'center' })
.text(`Total Events: ${events.length.toLocaleString()}`, { align: 'center' });

doc.moveDown(2);

const dateRange = this.getDateRange(events);
doc.text(`Date Range: ${dateRange.start} to ${dateRange.end}`, { align: 'center' });

doc.moveDown(2);

// Compliance statement
doc.fontSize(10)
.text('This document contains a complete and accurate copy of audit trail records as required by FDA 21 CFR Part 11.10(e) and HIPAA §164.316(b)(2)(i).', {
align: 'justify',
width: 450
});
}

private addTableOfContents(doc: PDFKit.PDFDocument, events: AuditEvent[]): void {
doc.fontSize(18)
.font('Helvetica-Bold')
.text('Table of Contents', { align: 'center' });

doc.moveDown(2);

const sections = [
{ title: 'Cover Page', page: 1 },
{ title: 'Table of Contents', page: 2 },
{ title: 'Audit Events', page: 3 },
{ title: 'Digital Signature', page: 3 + Math.ceil(events.length / 25) }
];

doc.fontSize(12).font('Helvetica');

sections.forEach(section => {
doc.text(`${section.title} ${' '.repeat(20)} ${section.page}`, {
continued: false
});
});
}

private addAuditEvents(doc: PDFKit.PDFDocument, events: AuditEvent[]): void {
doc.fontSize(18)
.font('Helvetica-Bold')
.text('Audit Events', { align: 'center' });

doc.moveDown(1);

events.forEach((event, index) => {
if (index > 0 && index % 25 === 0) {
doc.addPage();
}

doc.fontSize(10)
.font('Helvetica-Bold')
.text(`Event ${index + 1}:`, { continued: false });

doc.font('Helvetica')
.text(`Timestamp: ${event.timestamp}`)
.text(`User: ${event.user_full_name}`)
.text(`Action: ${event.action_type}`)
.text(`Resource: ${event.resource_type} (${event.resource_id})`)
.text(`IP Address: ${event.ip_address}`)
.text(`Chain Hash: ${event.chain_hash}`);

doc.moveDown(0.5);
});
}

private async addSignaturePage(
doc: PDFKit.PDFDocument,
events: AuditEvent[]
): Promise<void> {
doc.fontSize(18)
.font('Helvetica-Bold')
.text('Digital Signature', { align: 'center' });

doc.moveDown(2);

const exportHash = await this.signatureService.hashEvents(events);

doc.fontSize(12)
.font('Helvetica')
.text('Export Hash (SHA-256):', { continued: false })
.fontSize(10)
.font('Courier')
.text(exportHash, { width: 450 });

doc.moveDown(2);

doc.fontSize(12)
.font('Helvetica')
.text('This document has been digitally signed using ECDSA P-256 to ensure integrity and authenticity.', {
align: 'justify',
width: 450
});
}

private getDateRange(events: AuditEvent[]): { start: string; end: string } {
const timestamps = events.map(e => new Date(e.timestamp).getTime());
const start = new Date(Math.min(...timestamps)).toISOString();
const end = new Date(Math.max(...timestamps)).toISOString();

return { start, end };
}
}

4.1.3 XML Formatter (FDA eCTD Compatible)

import { create } from 'xmlbuilder2';
import { AuditEvent } from './types';

export interface XMLExportOptions {
ectdCompliant?: boolean;
includeSchema?: boolean;
}

export class XMLFormatter {
async format(
events: AuditEvent[],
options: XMLExportOptions = {}
): Promise<Buffer> {
const {
ectdCompliant = true,
includeSchema = true
} = options;

const root = create({ version: '1.0', encoding: 'UTF-8' })
.ele('audit_trail_export', {
'xmlns': 'http://coditect.ai/bio-qms/audit-trail/v1',
'xmlns:xsi': 'http://www.w3.org/2001/XMLSchema-instance',
...(includeSchema ? {
'xsi:schemaLocation': 'http://coditect.ai/bio-qms/audit-trail/v1 audit-trail-v1.xsd'
} : {})
});

// Metadata
const metadata = root.ele('metadata');
metadata.ele('export_timestamp').txt(new Date().toISOString());
metadata.ele('total_events').txt(String(events.length));
metadata.ele('compliance_frameworks')
.ele('framework').txt('FDA 21 CFR Part 11').up()
.ele('framework').txt('HIPAA 45 CFR §164.312(b)').up()
.ele('framework').txt('SOC 2 Type II').up();

// Events
const eventsEle = root.ele('events');

events.forEach(event => {
const eventEle = eventsEle.ele('event', { id: event.id });

eventEle.ele('timestamp').txt(event.timestamp);

const userEle = eventEle.ele('user');
userEle.ele('user_id').txt(event.user_id);
userEle.ele('full_name').txt(event.user_full_name);

eventEle.ele('action_type').txt(event.action_type);
eventEle.ele('resource_type').txt(event.resource_type);
eventEle.ele('resource_id').txt(event.resource_id);

if (event.old_value) {
eventEle.ele('old_value').dat(JSON.stringify(event.old_value));
}

if (event.new_value) {
eventEle.ele('new_value').dat(JSON.stringify(event.new_value));
}

const contextEle = eventEle.ele('context');
contextEle.ele('ip_address').txt(event.ip_address);
contextEle.ele('user_agent').txt(event.user_agent);
contextEle.ele('session_id').txt(event.session_id);

eventEle.ele('severity').txt(event.severity);

const integrityEle = eventEle.ele('integrity');
integrityEle.ele('chain_hash').txt(event.chain_hash);
integrityEle.ele('signature').txt(event.signature);

if (event.metadata) {
const metaEle = eventEle.ele('metadata');
Object.entries(event.metadata).forEach(([key, value]) => {
metaEle.ele(key).txt(String(value));
});
}
});

return Buffer.from(root.end({ prettyPrint: true }), 'utf-8');
}
}

4.2 Export Job Queue

import { Queue, Worker, Job } from 'bullmq';
import { Redis } from 'ioredis';
import { AuditEvent } from './types';
import { CSVFormatter, PDFFormatter, JSONFormatter, XMLFormatter } from './formatters';
import { StorageService } from './storage-service';

export interface ExportJob {
job_id: string;
tenant_id: string;
user_id: string;
format: 'csv' | 'pdf' | 'json' | 'xml';
search_filters: any;
options: any;
notification_email?: string;
}

export class ExportJobQueue {
private queue: Queue<ExportJob>;
private worker: Worker<ExportJob>;

constructor(
private redisConnection: Redis,
private formatters: {
csv: CSVFormatter;
pdf: PDFFormatter;
json: JSONFormatter;
xml: XMLFormatter;
},
private storageService: StorageService
) {
this.queue = new Queue<ExportJob>('audit-trail-exports', {
connection: redisConnection,
defaultJobOptions: {
attempts: 3,
backoff: {
type: 'exponential',
delay: 5000
},
removeOnComplete: {
age: 86400, // 24 hours
count: 1000
},
removeOnFail: {
age: 604800 // 7 days
}
}
});

this.worker = new Worker<ExportJob>(
'audit-trail-exports',
async (job: Job<ExportJob>) => this.processExport(job),
{
connection: redisConnection,
concurrency: 5,
limiter: {
max: 10,
duration: 60000 // 10 exports per minute
}
}
);

this.setupEventHandlers();
}

async enqueue(exportJob: ExportJob): Promise<string> {
const job = await this.queue.add(
'export',
exportJob,
{
jobId: exportJob.job_id
}
);

return job.id!;
}

async getStatus(jobId: string): Promise<any> {
const job = await this.queue.getJob(jobId);

if (!job) {
throw new Error('Export job not found');
}

const state = await job.getState();
const progress = job.progress;

return {
job_id: jobId,
status: state,
progress_percent: typeof progress === 'number' ? progress : 0,
created_at: new Date(job.timestamp).toISOString(),
completed_at: job.finishedOn ? new Date(job.finishedOn).toISOString() : null,
download_url: job.returnvalue?.download_url,
error_message: job.failedReason
};
}

private async processExport(job: Job<ExportJob>): Promise<any> {
const { format, search_filters, options, tenant_id } = job.data;

// Update progress: fetching events
await job.updateProgress(10);

const events = await this.fetchEvents(search_filters);

// Update progress: formatting
await job.updateProgress(50);

let exportBuffer: Buffer;
let contentType: string;
let fileExtension: string;

switch (format) {
case 'csv':
exportBuffer = await this.formatters.csv.format(events, options);
contentType = 'text/csv';
fileExtension = 'csv';
break;

case 'pdf':
exportBuffer = await this.formatters.pdf.format(events, options);
contentType = 'application/pdf';
fileExtension = 'pdf';
break;

case 'json':
exportBuffer = await this.formatters.json.format(events, options);
contentType = 'application/json';
fileExtension = 'json';
break;

case 'xml':
exportBuffer = await this.formatters.xml.format(events, options);
contentType = 'application/xml';
fileExtension = 'xml';
break;

default:
// Exhaustiveness guard: ensures exportBuffer is definitely assigned
throw new Error(`Unsupported export format: ${format}`);
}

// Update progress: uploading
await job.updateProgress(90);

const filename = `audit-export-${job.data.job_id}.${fileExtension}`;
const downloadUrl = await this.storageService.upload(
exportBuffer,
filename,
contentType,
tenant_id
);

// Update progress: complete
await job.updateProgress(100);

// Send notification email if requested
if (job.data.notification_email) {
await this.sendNotification(job.data.notification_email, downloadUrl);
}

return { download_url: downloadUrl };
}

private async fetchEvents(filters: any): Promise<AuditEvent[]> {
// This would call the actual database query
// Implementation depends on your data layer
throw new Error('Not implemented');
}

private async sendNotification(email: string, downloadUrl: string): Promise<void> {
// Send email notification with download link
// Implementation depends on your email service
}

private setupEventHandlers(): void {
this.worker.on('completed', (job) => {
console.log(`Export job ${job.id} completed successfully`);
});

this.worker.on('failed', (job, err) => {
console.error(`Export job ${job?.id} failed:`, err);
});
}
}

5. Retention Management

5.1 Retention Policy Engine

export interface RetentionPolicy {
policy_id: string;
name: string;
tenant_id?: string; // null = global policy
regulation: 'FDA_21_CFR_PART_11' | 'HIPAA' | 'SOC2' | 'CUSTOM';
retention_days: number;
minimum_retention_days: number; // Regulatory floor
applies_to: {
resource_types?: string[];
action_types?: string[];
severity_levels?: string[];
};
priority: number; // Higher priority wins
legal_hold_exempt: boolean;
created_at: string;
updated_at: string;
created_by: string;
}

export const DEFAULT_RETENTION_POLICIES: RetentionPolicy[] = [
{
policy_id: 'fda-21-cfr-part-11',
name: 'FDA 21 CFR Part 11 - Electronic Records',
regulation: 'FDA_21_CFR_PART_11',
retention_days: 2555, // 7 years
minimum_retention_days: 2555,
applies_to: {
resource_types: [
'validation_protocol',
'sop',
'signature',
'training_record'
]
},
priority: 100,
legal_hold_exempt: false,
created_at: '2026-01-01T00:00:00Z',
updated_at: '2026-01-01T00:00:00Z',
created_by: 'system'
},
{
policy_id: 'hipaa-audit-logs',
name: 'HIPAA - Audit Log Retention',
regulation: 'HIPAA',
retention_days: 2190, // 6 years
minimum_retention_days: 2190,
applies_to: {
resource_types: [
'phi_document',
'patient_record',
'clinical_data'
]
},
priority: 90,
legal_hold_exempt: false,
created_at: '2026-01-01T00:00:00Z',
updated_at: '2026-01-01T00:00:00Z',
created_by: 'system'
},
{
policy_id: 'soc2-minimum',
name: 'SOC 2 - Minimum Retention',
regulation: 'SOC2',
retention_days: 365, // 1 year
minimum_retention_days: 365,
applies_to: {},
priority: 50,
legal_hold_exempt: false,
created_at: '2026-01-01T00:00:00Z',
updated_at: '2026-01-01T00:00:00Z',
created_by: 'system'
}
];

export class RetentionPolicyEngine {
async getApplicablePolicy(
event: AuditEvent,
tenantId: string
): Promise<RetentionPolicy> {
// Get all policies that apply to this event
const policies = await this.loadPolicies(tenantId);

const applicablePolicies = policies.filter(policy =>
this.policyApplies(policy, event)
);

// Sort by priority (highest first)
applicablePolicies.sort((a, b) => b.priority - a.priority);

// Return highest priority policy
if (applicablePolicies.length === 0) {
throw new Error('No applicable retention policy found');
}

return applicablePolicies[0];
}

async calculateRetentionDate(event: AuditEvent, tenantId: string): Promise<Date> {
const policy = await this.getApplicablePolicy(event, tenantId);

const eventDate = new Date(event.timestamp);
const retentionDate = new Date(eventDate);
retentionDate.setDate(retentionDate.getDate() + policy.retention_days);

return retentionDate;
}

async isEligibleForPurge(
event: AuditEvent,
tenantId: string
): Promise<{ eligible: boolean; reason?: string }> {
// Check legal hold
const hasLegalHold = await this.checkLegalHold(event.id, tenantId);
if (hasLegalHold) {
return {
eligible: false,
reason: 'Event is under legal hold'
};
}

// Check retention policy
const retentionDate = await this.calculateRetentionDate(event, tenantId);
const now = new Date();

if (now < retentionDate) {
return {
eligible: false,
reason: `Retention period not expired (expires: ${retentionDate.toISOString()})`
};
}

return { eligible: true };
}

private policyApplies(policy: RetentionPolicy, event: AuditEvent): boolean {
const { applies_to } = policy;

// If no filters, policy applies to everything
if (!applies_to.resource_types?.length &&
!applies_to.action_types?.length &&
!applies_to.severity_levels?.length) {
return true;
}

// Check resource type
if (applies_to.resource_types?.length &&
!applies_to.resource_types.includes(event.resource_type)) {
return false;
}

// Check action type
if (applies_to.action_types?.length &&
!applies_to.action_types.includes(event.action_type)) {
return false;
}

// Check severity
if (applies_to.severity_levels?.length &&
!applies_to.severity_levels.includes(event.severity)) {
return false;
}

return true;
}

private async loadPolicies(tenantId: string): Promise<RetentionPolicy[]> {
// Load global + tenant-specific policies
// Implementation depends on your data layer
return DEFAULT_RETENTION_POLICIES;
}

private async checkLegalHold(eventId: string, tenantId: string): Promise<boolean> {
// Check if event is under legal hold
// Implementation depends on your data layer
return false;
}
}
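To make the policy-resolution and retention-date logic above concrete, the following sketch re-implements the two core calculations in isolation. It is illustrative only: `MiniPolicy`, `pickPolicy`, and `retentionDate` are not part of this specification's API, the policy shape is trimmed to the fields the math needs, and UTC date arithmetic is used to keep the day count exact.

```typescript
// Trimmed-down policy shape (illustrative, not the spec's RetentionPolicy)
interface MiniPolicy {
  policy_id: string;
  retention_days: number;
  priority: number;
  resource_types?: string[]; // empty/undefined = applies to everything
}

// Highest-priority policy whose resource_type filter matches wins,
// mirroring RetentionPolicyEngine.getApplicablePolicy()
function pickPolicy(policies: MiniPolicy[], resourceType: string): MiniPolicy {
  const applicable = policies.filter(
    p => !p.resource_types?.length || p.resource_types.includes(resourceType)
  );
  if (applicable.length === 0) throw new Error('No applicable retention policy found');
  return applicable.sort((a, b) => b.priority - a.priority)[0];
}

// Event date + retention_days, mirroring calculateRetentionDate()
// (UTC arithmetic avoids local-time DST drift)
function retentionDate(eventTimestamp: string, policy: MiniPolicy): Date {
  const d = new Date(eventTimestamp);
  d.setUTCDate(d.getUTCDate() + policy.retention_days);
  return d;
}

const policies: MiniPolicy[] = [
  { policy_id: 'fda-21-cfr-part-11', retention_days: 2555, priority: 100, resource_types: ['sop'] },
  { policy_id: 'soc2-minimum', retention_days: 365, priority: 50 } // catch-all
];

// An SOP event matches both policies; FDA (priority 100) wins over SOC 2 (priority 50)
const winner = pickPolicy(policies, 'sop');
const expiry = retentionDate('2026-01-15T00:00:00Z', winner); // event date + 2555 days
```

An event with any other resource type falls through to the SOC 2 catch-all, which is why the default policy set always includes one policy with an empty `applies_to`.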

5.2 Purge Orchestrator

import { RetentionPolicyEngine } from './retention-policy-engine';
import { AuditEvent } from './types';

export interface PurgeRequest {
request_id: string;
tenant_id: string;
requested_by: string;
requested_at: string;
dry_run: boolean;
filters?: {
resource_types?: string[];
before_date?: string;
};
}

export interface PurgeResult {
request_id: string;
total_evaluated: number;
eligible_for_purge: number;
purged: number;
skipped: number;
errors: number;
details: {
event_id: string;
status: 'purged' | 'skipped' | 'error';
reason?: string;
}[];
started_at: string;
completed_at: string;
}

export class PurgeOrchestrator {
constructor(
private retentionEngine: RetentionPolicyEngine
) {}

async executePurge(request: PurgeRequest): Promise<PurgeResult> {
const result: PurgeResult = {
request_id: request.request_id,
total_evaluated: 0,
eligible_for_purge: 0,
purged: 0,
skipped: 0,
errors: 0,
details: [],
started_at: new Date().toISOString(),
completed_at: ''
};

// Get events to evaluate
const events = await this.getEventsForPurge(request);
result.total_evaluated = events.length;

for (const event of events) {
try {
const eligibility = await this.retentionEngine.isEligibleForPurge(
event,
request.tenant_id
);

if (eligibility.eligible) {
result.eligible_for_purge++;

if (!request.dry_run) {
await this.purgeEvent(event);
result.purged++;

result.details.push({
event_id: event.id,
status: 'purged'
});
} else {
result.details.push({
event_id: event.id,
status: 'skipped',
reason: 'Dry run mode'
});
}
} else {
result.skipped++;
result.details.push({
event_id: event.id,
status: 'skipped',
reason: eligibility.reason
});
}
} catch (error) {
result.errors++;
result.details.push({
event_id: event.id,
status: 'error',
reason: error instanceof Error ? error.message : String(error)
});
}
}

result.completed_at = new Date().toISOString();

// Log purge operation
await this.logPurgeOperation(request, result);

return result;
}

private async getEventsForPurge(request: PurgeRequest): Promise<AuditEvent[]> {
// Query database for events matching purge filters
// Implementation depends on your data layer
return [];
}

private async purgeEvent(event: AuditEvent): Promise<void> {
// Soft delete: move to archive table or mark as deleted
// NEVER hard delete audit events - retain for forensics
await this.archiveEvent(event);
}

private async archiveEvent(event: AuditEvent): Promise<void> {
// Move to cold storage / archive table
// Implementation depends on your data layer
}

private async logPurgeOperation(
request: PurgeRequest,
result: PurgeResult
): Promise<void> {
// Create audit event for the purge operation itself
// This is a meta-audit-event
console.log('Purge operation completed:', result);
}
}

6. Reporting Engine

6.1 Pre-Built Report Templates

export interface ReportTemplate {
template_id: string;
name: string;
description: string;
category: 'user_activity' | 'resource_history' | 'compliance' | 'access_patterns';
query_builder: (params: Record<string, any>) => any;
columns: ReportColumn[];
default_parameters?: Record<string, any>;
}

export interface ReportColumn {
name: string;
field: string;
type: 'string' | 'number' | 'date' | 'boolean' | 'json';
aggregation?: 'count' | 'sum' | 'avg' | 'min' | 'max';
format?: (value: any) => string;
}

export const REPORT_TEMPLATES: ReportTemplate[] = [
{
template_id: 'user-activity-summary',
name: 'User Activity Summary',
description: 'Summary of all actions performed by users within a date range',
category: 'user_activity',
query_builder: (params) => ({
tenant_id: params.tenant_id,
start_date: params.start_date,
end_date: params.end_date,
user_id: params.user_id
}),
columns: [
{ name: 'User', field: 'user_full_name', type: 'string' },
{ name: 'Total Actions', field: 'action_count', type: 'number', aggregation: 'count' },
{ name: 'Creates', field: 'create_count', type: 'number' },
{ name: 'Updates', field: 'update_count', type: 'number' },
{ name: 'Deletes', field: 'delete_count', type: 'number' },
{ name: 'Last Activity', field: 'last_activity', type: 'date' }
]
},

{
template_id: 'resource-history',
name: 'Resource Change History',
description: 'Complete change history for a specific resource',
category: 'resource_history',
query_builder: (params) => ({
tenant_id: params.tenant_id,
resource_type: params.resource_type,
resource_id: params.resource_id
}),
columns: [
{ name: 'Timestamp', field: 'timestamp', type: 'date' },
{ name: 'User', field: 'user_full_name', type: 'string' },
{ name: 'Action', field: 'action_type', type: 'string' },
{ name: 'Previous Value', field: 'old_value', type: 'json' },
{ name: 'New Value', field: 'new_value', type: 'json' },
{ name: 'Chain Hash', field: 'chain_hash', type: 'string' }
]
},

{
template_id: 'compliance-summary',
name: 'Compliance Summary',
description: 'Audit completeness and compliance metrics',
category: 'compliance',
query_builder: (params) => ({
tenant_id: params.tenant_id,
start_date: params.start_date,
end_date: params.end_date
}),
columns: [
{ name: 'Metric', field: 'metric_name', type: 'string' },
{ name: 'Value', field: 'metric_value', type: 'number' },
{ name: 'Target', field: 'target_value', type: 'number' },
{ name: 'Status', field: 'status', type: 'string' }
]
},

{
template_id: 'access-patterns',
name: 'Access Pattern Analysis',
description: 'Analysis of access patterns and anomalies',
category: 'access_patterns',
query_builder: (params) => ({
tenant_id: params.tenant_id,
start_date: params.start_date,
end_date: params.end_date,
min_severity: params.min_severity || 'warning'
}),
columns: [
{ name: 'Pattern Type', field: 'pattern_type', type: 'string' },
{ name: 'User', field: 'user_full_name', type: 'string' },
{ name: 'Occurrences', field: 'occurrence_count', type: 'number' },
{ name: 'First Seen', field: 'first_seen', type: 'date' },
{ name: 'Last Seen', field: 'last_seen', type: 'date' },
{ name: 'Risk Score', field: 'risk_score', type: 'number' }
]
}
];
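The `columns` metadata above is what a report renderer consumes. The sketch below shows one plausible rendering pass over a single result row; `renderRow` and the `Column` alias are illustrative names, not part of this specification.

```typescript
// Trimmed-down column shape (illustrative subset of ReportColumn)
interface Column {
  name: string;
  field: string;
  type: 'string' | 'number' | 'date' | 'boolean' | 'json';
  format?: (value: any) => string;
}

// Render one query-result row into display strings using column metadata:
// custom format() wins, JSON columns are serialized, everything else is stringified
function renderRow(columns: Column[], row: Record<string, any>): Record<string, string> {
  const out: Record<string, string> = {};
  for (const col of columns) {
    const value = row[col.field];
    if (col.format) {
      out[col.name] = col.format(value);
    } else if (col.type === 'json') {
      out[col.name] = JSON.stringify(value);
    } else {
      out[col.name] = String(value);
    }
  }
  return out;
}

const columns: Column[] = [
  { name: 'User', field: 'user_full_name', type: 'string' },
  { name: 'Total Actions', field: 'action_count', type: 'number' },
  { name: 'New Value', field: 'new_value', type: 'json' }
];

const rendered = renderRow(columns, {
  user_full_name: 'A. Chen',
  action_count: 42,
  new_value: { status: 'approved' }
});
// rendered['New Value'] === '{"status":"approved"}'
```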

6.2 Anomaly Detection Engine

export interface AnomalyRule {
rule_id: string;
name: string;
description: string;
severity: 'info' | 'warning' | 'critical';
enabled: boolean;
detector: (events: AuditEvent[]) => AnomalyDetection[];
}

export interface AnomalyDetection {
rule_id: string;
severity: 'info' | 'warning' | 'critical';
title: string;
description: string;
affected_events: string[];
risk_score: number;
detected_at: string;
metadata?: Record<string, any>;
}

export const ANOMALY_RULES: AnomalyRule[] = [
{
rule_id: 'off-hours-access',
name: 'Off-Hours Access',
description: 'Detect access to sensitive resources outside business hours',
severity: 'warning',
enabled: true,
detector: (events) => {
const anomalies: AnomalyDetection[] = [];

const offHoursEvents = events.filter(event => {
const hour = new Date(event.timestamp).getUTCHours();
return hour < 6 || hour > 22; // Before 06:00 or from 23:00 UTC
});

if (offHoursEvents.length > 0) {
anomalies.push({
rule_id: 'off-hours-access',
severity: 'warning',
title: 'Off-Hours Access Detected',
description: `${offHoursEvents.length} events occurred outside business hours`,
affected_events: offHoursEvents.map(e => e.id),
risk_score: Math.min(offHoursEvents.length * 10, 100),
detected_at: new Date().toISOString()
});
}

return anomalies;
}
},

{
rule_id: 'bulk-operations',
name: 'Bulk Operations',
description: 'Detect unusually large number of operations in short time',
severity: 'critical',
enabled: true,
detector: (events) => {
const anomalies: AnomalyDetection[] = [];

// Group by user and 5-minute windows
const userWindows = new Map<string, Map<string, AuditEvent[]>>();

events.forEach(event => {
const windowKey = Math.floor(new Date(event.timestamp).getTime() / 300000).toString();

if (!userWindows.has(event.user_id)) {
userWindows.set(event.user_id, new Map());
}

const windows = userWindows.get(event.user_id)!;
if (!windows.has(windowKey)) {
windows.set(windowKey, []);
}

windows.get(windowKey)!.push(event);
});

// Detect windows with >50 operations
userWindows.forEach((windows, userId) => {
windows.forEach((windowEvents, windowKey) => {
if (windowEvents.length > 50) {
anomalies.push({
rule_id: 'bulk-operations',
severity: 'critical',
title: 'Bulk Operation Detected',
description: `User performed ${windowEvents.length} operations in 5 minutes`,
affected_events: windowEvents.map(e => e.id),
risk_score: Math.min(windowEvents.length, 100),
detected_at: new Date().toISOString(),
metadata: {
user_id: userId,
operation_count: windowEvents.length,
window_start: new Date(parseInt(windowKey) * 300000).toISOString()
}
});
}
});
});

return anomalies;
}
},

{
rule_id: 'unusual-access-pattern',
name: 'Unusual Access Pattern',
description: 'Detect access from new IP addresses or locations',
severity: 'warning',
enabled: true,
detector: (events) => {
const anomalies: AnomalyDetection[] = [];

// Group by user
const userIPs = new Map<string, Set<string>>();

events.forEach(event => {
if (!userIPs.has(event.user_id)) {
userIPs.set(event.user_id, new Set());
}
userIPs.get(event.user_id)!.add(event.ip_address);
});

// Detect users with >5 different IPs (potential account sharing)
userIPs.forEach((ips, userId) => {
if (ips.size > 5) {
const userEvents = events.filter(e => e.user_id === userId);

anomalies.push({
rule_id: 'unusual-access-pattern',
severity: 'warning',
title: 'Multiple IP Addresses Detected',
description: `User accessed system from ${ips.size} different IP addresses`,
affected_events: userEvents.map(e => e.id),
risk_score: ips.size * 10,
detected_at: new Date().toISOString(),
metadata: {
user_id: userId,
ip_count: ips.size,
ip_addresses: Array.from(ips)
}
});
}
});

return anomalies;
}
}
];

export class AnomalyDetectionEngine {
async detectAnomalies(
events: AuditEvent[],
rules: AnomalyRule[] = ANOMALY_RULES
): Promise<AnomalyDetection[]> {
const allAnomalies: AnomalyDetection[] = [];

for (const rule of rules.filter(r => r.enabled)) {
try {
const anomalies = rule.detector(events);
allAnomalies.push(...anomalies);
} catch (error) {
console.error(`Error in anomaly rule ${rule.rule_id}:`, error);
}
}

// Sort by risk score (highest first)
return allAnomalies.sort((a, b) => b.risk_score - a.risk_score);
}
}
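End to end, a detector is just a pure function from events to flagged IDs plus a risk score. The reduced sketch below exercises the off-hours window and scoring from the 'off-hours-access' rule against synthetic events; `MiniEvent` and `detectOffHours` are illustrative names, not part of the engine's API.

```typescript
// Minimal event shape for illustration
interface MiniEvent { id: string; timestamp: string; }

// Reduced version of the 'off-hours-access' detector:
// flag events before 06:00 or from 23:00 UTC (same window as the rule above)
function detectOffHours(events: MiniEvent[]): string[] {
  return events
    .filter(e => {
      const hour = new Date(e.timestamp).getUTCHours();
      return hour < 6 || hour > 22;
    })
    .map(e => e.id);
}

const events: MiniEvent[] = [
  { id: 'e1', timestamp: '2026-02-16T03:15:00Z' }, // 03:15 UTC - off hours
  { id: 'e2', timestamp: '2026-02-16T14:30:00Z' }, // 14:30 UTC - business hours
  { id: 'e3', timestamp: '2026-02-16T23:05:00Z' }  // 23:05 UTC - off hours
];

const flagged = detectOffHours(events); // ['e1', 'e3']
const riskScore = Math.min(flagged.length * 10, 100); // same scoring as the rule above
```

Because detectors are side-effect free, rules can be unit-tested with fixture events and added to `ANOMALY_RULES` without touching the engine.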

7. Performance Optimization

7.1 PostgreSQL Materialized Views

-- Materialized view for user activity summary
CREATE MATERIALIZED VIEW mv_user_activity_summary AS
SELECT
tenant_id,
user_id,
user_full_name,
DATE_TRUNC('day', timestamp) AS activity_date,
COUNT(*) AS total_actions,
COUNT(*) FILTER (WHERE action_type = 'create') AS create_count,
COUNT(*) FILTER (WHERE action_type = 'read') AS read_count,
COUNT(*) FILTER (WHERE action_type = 'update') AS update_count,
COUNT(*) FILTER (WHERE action_type = 'delete') AS delete_count,
MAX(timestamp) AS last_activity,
COUNT(DISTINCT resource_id) AS unique_resources_accessed
FROM audit_events
WHERE timestamp >= CURRENT_DATE - INTERVAL '90 days'
GROUP BY tenant_id, user_id, user_full_name, DATE_TRUNC('day', timestamp);

CREATE UNIQUE INDEX ON mv_user_activity_summary (tenant_id, user_id, activity_date);
CREATE INDEX ON mv_user_activity_summary (activity_date);
CREATE INDEX ON mv_user_activity_summary (tenant_id, activity_date);

-- Refresh on a 6-hour schedule (CONCURRENTLY requires the unique index above)
REFRESH MATERIALIZED VIEW CONCURRENTLY mv_user_activity_summary;

-- Materialized view for resource change history
CREATE MATERIALIZED VIEW mv_resource_history AS
SELECT
tenant_id,
resource_type,
resource_id,
COUNT(*) AS change_count,
MIN(timestamp) AS first_change,
MAX(timestamp) AS last_change,
COUNT(DISTINCT user_id) AS unique_contributors,
ARRAY_AGG(DISTINCT action_type) AS action_types_performed
FROM audit_events
WHERE timestamp >= CURRENT_DATE - INTERVAL '365 days'
GROUP BY tenant_id, resource_type, resource_id;

CREATE UNIQUE INDEX ON mv_resource_history (tenant_id, resource_type, resource_id);
CREATE INDEX ON mv_resource_history (tenant_id, last_change);

-- Materialized view for compliance metrics
CREATE MATERIALIZED VIEW mv_compliance_metrics AS
SELECT
tenant_id,
DATE_TRUNC('month', timestamp) AS metric_month,
COUNT(*) AS total_events,
COUNT(*) FILTER (WHERE action_type IN ('sign', 'approve')) AS signature_events,
COUNT(*) FILTER (WHERE severity = 'critical') AS critical_events,
COUNT(DISTINCT user_id) AS active_users,
AVG(
CASE
WHEN chain_hash IS NOT NULL AND signature IS NOT NULL THEN 1.0
ELSE 0.0
END
) * 100 AS integrity_score
FROM audit_events
GROUP BY tenant_id, DATE_TRUNC('month', timestamp);

CREATE UNIQUE INDEX ON mv_compliance_metrics (tenant_id, metric_month);
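The views above must be refreshed by an external scheduler to meet the 6-hour freshness target. One option, assuming the pg_cron extension is available in the deployment (an assumption, not a platform requirement), is:

```sql
-- Assumes the pg_cron extension is installed in this database
CREATE EXTENSION IF NOT EXISTS pg_cron;

-- Refresh each materialized view every 6 hours (minute 0 of hours 0, 6, 12, 18).
-- CONCURRENTLY requires the unique indexes defined above.
SELECT cron.schedule(
'refresh-mv-user-activity',
'0 */6 * * *',
$$REFRESH MATERIALIZED VIEW CONCURRENTLY mv_user_activity_summary$$
);
SELECT cron.schedule(
'refresh-mv-resource-history',
'0 */6 * * *',
$$REFRESH MATERIALIZED VIEW CONCURRENTLY mv_resource_history$$
);
SELECT cron.schedule(
'refresh-mv-compliance-metrics',
'0 */6 * * *',
$$REFRESH MATERIALIZED VIEW CONCURRENTLY mv_compliance_metrics$$
);
```

Any equivalent scheduler (Kubernetes CronJob, systemd timer) works; the only requirement is that the "Materialized view freshness <6 hours" target in Appendix C is met.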

7.2 Query Optimizer

export class QueryOptimizer {
async optimizeSearch(filters: SearchFilters): Promise<{ query: string; params: any[] }> {
// Route aggregate-style queries to a materialized view before building SQL
if (this.shouldUseMaterializedView(filters)) {
return this.buildMaterializedViewQuery(filters);
}

// Determine the best index strategy for these filters (informational; useful for plan logging)
const strategy = this.selectIndexStrategy(filters);

// Build an optimized, parameterized SQL query
let query = 'SELECT * FROM audit_events WHERE 1=1';
const params: any[] = [];
let paramIndex = 1;

// Tenant filter (always present, use tenant_id index)
query += ` AND tenant_id = $${paramIndex++}`;
params.push(filters.tenant_id);

// Date range (use timestamp index)
if (filters.start_date) {
query += ` AND timestamp >= $${paramIndex++}`;
params.push(filters.start_date);
}

if (filters.end_date) {
query += ` AND timestamp <= $${paramIndex++}`;
params.push(filters.end_date);
}

// User filter (use composite index: tenant_id, user_id, timestamp)
if (filters.user_id) {
query += ` AND user_id = $${paramIndex++}`;
params.push(filters.user_id);
}

// Resource filters (use composite index: tenant_id, resource_type, resource_id)
if (filters.resource_type) {
query += ` AND resource_type = $${paramIndex++}`;
params.push(filters.resource_type);
}

if (filters.resource_id) {
query += ` AND resource_id = $${paramIndex++}`;
params.push(filters.resource_id);
}

// Full-text search (ILIKE over casted JSONB; back with a pg_trgm GIN index for performance)
if (filters.full_text_query) {
query += ` AND (
old_value::text ILIKE $${paramIndex} OR
new_value::text ILIKE $${paramIndex}
)`;
params.push(`%${filters.full_text_query}%`);
paramIndex++;
}

// Sort order
const sortColumn = this.getSortColumn(filters.sort_by || 'timestamp_desc');
query += ` ORDER BY ${sortColumn}`;

// Pagination
query += ` LIMIT ${filters.limit || 100}`;

return { query, params };
}

private shouldUseMaterializedView(filters: SearchFilters): boolean {
// Use materialized view for aggregate queries or common reports
return Boolean(
!filters.resource_id &&
!filters.full_text_query &&
filters.start_date &&
new Date(filters.start_date) < new Date(Date.now() - 86400000 * 7) // Older than 7 days
);
}

private selectIndexStrategy(filters: SearchFilters): string {
// Determine which composite index will be most effective
if (filters.user_id && filters.start_date) {
return 'user_activity'; // idx_audit_tenant_user_timestamp
}

if (filters.resource_type && filters.resource_id) {
return 'resource_history'; // idx_audit_tenant_resource
}

if (filters.start_date) {
return 'temporal'; // idx_audit_tenant_timestamp
}

return 'default'; // idx_audit_tenant_id
}

private getSortColumn(sortBy: string): string {
const sortMap: Record<string, string> = {
timestamp_asc: 'timestamp ASC',
timestamp_desc: 'timestamp DESC',
user_id: 'user_id, timestamp DESC',
resource_type: 'resource_type, timestamp DESC'
};

return sortMap[sortBy] || 'timestamp DESC';
}

private buildMaterializedViewQuery(filters: SearchFilters): { query: string; params: any[] } {
// Query the pre-aggregated materialized views (Section 7.1) instead of raw audit_events
// Implementation depends on your data layer
throw new Error('Not implemented');
}
}
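The essential property of the builder above is that every user-supplied value becomes a numbered placeholder rather than spliced SQL. The condensed sketch below isolates that construction with a reduced filter set (`MiniFilters` and `buildQuery` are illustrative names, not part of this specification):

```typescript
// Reduced filter set for illustration
interface MiniFilters {
  tenant_id: string;
  user_id?: string;
  start_date?: string;
}

// Condensed version of the parameterized query construction above:
// values go into params[], only $n placeholders go into the SQL text
function buildQuery(filters: MiniFilters): { query: string; params: any[] } {
  let query = 'SELECT * FROM audit_events WHERE 1=1';
  const params: any[] = [];
  let i = 1;

  query += ` AND tenant_id = $${i++}`;
  params.push(filters.tenant_id);

  if (filters.start_date) {
    query += ` AND timestamp >= $${i++}`;
    params.push(filters.start_date);
  }
  if (filters.user_id) {
    query += ` AND user_id = $${i++}`;
    params.push(filters.user_id);
  }

  query += ' ORDER BY timestamp DESC LIMIT 100';
  return { query, params };
}

const { query, params } = buildQuery({
  tenant_id: 't-001',
  user_id: 'u-042',
  start_date: '2026-01-01T00:00:00Z'
});
// params: ['t-001', '2026-01-01T00:00:00Z', 'u-042'] bound to $1..$3
```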

7.3 Cache Strategy

import { Redis } from 'ioredis';

export class CacheManager {
constructor(private redis: Redis) {}

async getCached(
cacheKey: string
): Promise<SearchResponse | null> {
const cached = await this.redis.get(cacheKey);

if (cached) {
return JSON.parse(cached);
}

return null;
}

async setCached(
cacheKey: string,
data: SearchResponse,
ttl: number = 300 // 5 minutes default
): Promise<void> {
await this.redis.setex(
cacheKey,
ttl,
JSON.stringify(data)
);
}

buildCacheKey(filters: SearchFilters): string {
// Create deterministic cache key from filters
const sortedFilters = Object.keys(filters)
.sort()
.reduce((acc, key) => {
acc[key] = filters[key as keyof SearchFilters];
return acc;
}, {} as any);

return `audit-search:${JSON.stringify(sortedFilters)}`;
}

async invalidateCache(pattern: string): Promise<void> {
// Invalidate cache entries matching pattern.
// KEYS blocks the Redis event loop on large keyspaces; iterate with SCAN instead.
const keys: string[] = [];
let cursor = '0';

do {
const [next, batch] = await this.redis.scan(cursor, 'MATCH', pattern, 'COUNT', 100);
cursor = next;
keys.push(...batch);
} while (cursor !== '0');

if (keys.length > 0) {
await this.redis.del(...keys);
}
}
}
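The reason `buildCacheKey` sorts the filter keys is that JavaScript object serialization preserves insertion order, so two logically identical filter objects would otherwise produce different cache keys. The standalone sketch below demonstrates the determinism property (the free function mirrors the method above; it is illustrative, not part of the class):

```typescript
// Condensed version of CacheManager.buildCacheKey(): sort keys first so
// logically identical filter objects always map to the same cache entry
function buildCacheKey(filters: Record<string, any>): string {
  const sorted = Object.keys(filters)
    .sort()
    .reduce((acc, key) => {
      acc[key] = filters[key];
      return acc;
    }, {} as Record<string, any>);

  return `audit-search:${JSON.stringify(sorted)}`;
}

// Same filters, different insertion order -> identical key (cache hit)
const a = buildCacheKey({ tenant_id: 't-001', user_id: 'u-042' });
const b = buildCacheKey({ user_id: 'u-042', tenant_id: 't-001' });
// a === b
```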

8. Compliance Mapping

8.1 Regulatory Requirements Matrix

| Requirement | Implementation | Evidence Location |
|---|---|---|
| FDA 21 CFR Part 11.10(e) - Generate accurate and complete copies | CSV/PDF/JSON/XML export with integrity verification | ExportJobQueue, PDFFormatter with digital signatures |
| FDA 21 CFR Part 11.50(b) - Retention throughout required period | Automated retention policy engine with regulatory minimums | RetentionPolicyEngine, DEFAULT_RETENTION_POLICIES |
| HIPAA §164.312(b) - Audit controls to record activity | Comprehensive search API with full-text and structured queries | SearchAPI, REST/GraphQL endpoints |
| HIPAA §164.316(b)(2)(i) - 6-year retention | HIPAA retention policy (2190 days) | RetentionPolicy (policy_id: 'hipaa-audit-logs') |
| SOC 2 CC6.1 - Logical access monitoring | Access pattern analysis and anomaly detection | AnomalyDetectionEngine, ANOMALY_RULES |
| SOC 2 CC6.2 - Reviewed and investigated logging | Pre-built reports and scheduled distribution | REPORT_TEMPLATES, ReportScheduler |
| SOC 2 CC6.3 - Access revocation logged | User activity summary reports | 'user-activity-summary' template |
| SOC 2 CC7.2 - System monitoring | Compliance scorecard and metrics | 'compliance-summary' template |

8.2 Control Evidence Package

export interface ControlEvidence {
control_id: string;
framework: 'FDA_21_CFR_PART_11' | 'HIPAA' | 'SOC2';
control_description: string;
implementation_status: 'implemented' | 'partial' | 'planned';
evidence: {
type: 'automated' | 'manual' | 'policy';
description: string;
location: string;
last_validated: string;
}[];
test_procedure?: string;
responsible_party: string;
}

export const CONTROL_EVIDENCE: ControlEvidence[] = [
{
control_id: 'FDA-PART11-10e',
framework: 'FDA_21_CFR_PART_11',
control_description: 'Ability to generate accurate and complete copies of records in both human readable and electronic form',
implementation_status: 'implemented',
evidence: [
{
type: 'automated',
description: 'Multi-format export pipeline with CSV, PDF, JSON, and XML support',
location: '/audit-trail/export API endpoint',
last_validated: '2026-02-16'
},
{
type: 'automated',
description: 'Digital signature verification for PDF exports using ECDSA P-256',
location: 'PDFFormatter.addSignaturePage()',
last_validated: '2026-02-16'
},
{
type: 'automated',
description: 'Integrity verification via chain hash validation',
location: 'audit_events.chain_hash column + signature verification',
last_validated: '2026-02-16'
}
],
test_procedure: 'Request export in each format, verify completeness against source data, validate digital signatures',
responsible_party: 'CISO / VP Engineering'
},

{
control_id: 'HIPAA-164-312b',
framework: 'HIPAA',
control_description: 'Implement hardware, software, and/or procedural mechanisms that record and examine activity in information systems containing ePHI',
implementation_status: 'implemented',
evidence: [
{
type: 'automated',
description: 'Comprehensive audit trail capturing all PHI access operations',
location: 'audit_events table with JSONB old_value/new_value',
last_validated: '2026-02-16'
},
{
type: 'automated',
description: 'Full-text search across audit events',
location: '/audit-trail/search?full_text_query parameter',
last_validated: '2026-02-16'
},
{
type: 'automated',
description: 'Anomaly detection for suspicious access patterns',
location: 'AnomalyDetectionEngine with 3 active rules',
last_validated: '2026-02-16'
}
],
test_procedure: 'Perform PHI access operation, verify audit event creation, search for event, validate anomaly detection',
responsible_party: 'Privacy Officer / CISO'
}
];

9. Implementation Checklist

9.1 Development Tasks

  • Search API

    • Implement REST endpoints with OpenAPI spec
    • Implement GraphQL schema and resolvers
    • Add rate limiting middleware (100/min user, 1000/min tenant)
    • Implement cursor-based pagination
    • Add full-text search on JSONB fields
    • Implement query optimizer with index selection
    • Add response caching with Redis
  • Export Pipeline

    • Implement CSV formatter with configurable columns
    • Implement PDF formatter with digital signatures (ECDSA P-256)
    • Implement JSON formatter
    • Implement XML formatter (eCTD compatible)
    • Create export job queue with BullMQ
    • Add progress tracking and notifications
    • Implement streaming export for >1M records
  • Retention Management

    • Create retention policy engine
    • Implement default policies (FDA: 7yr, HIPAA: 6yr, SOC2: 1yr)
    • Add tenant-specific policy overrides
    • Implement legal hold mechanism
    • Create purge orchestrator with dry-run mode
    • Add retention countdown dashboard
  • Reporting

    • Implement 4 pre-built report templates
    • Add scheduled report generation
    • Implement anomaly detection engine with 3 rules
    • Create compliance scorecard
    • Add report distribution via email
  • Performance

    • Create materialized views (user activity, resource history, compliance metrics)
    • Set up materialized view refresh schedule
    • Implement cache strategy with TTL
    • Add query performance monitoring
    • Optimize database indexes

9.2 Testing Requirements

  • Functional Testing

    • Search API with all filter combinations
    • Export in all 4 formats (CSV, PDF, JSON, XML)
    • Retention policy application
    • Purge operation (dry-run and actual)
    • Report generation from templates
    • Anomaly detection triggers
  • Performance Testing

    • Search response time <2s (typical query)
    • Export generation <30s (complex reports)
    • Pagination with 1M+ records
    • Concurrent export jobs (5 workers)
    • Rate limit enforcement
  • Security Testing

    • Authorization checks (tenant isolation)
    • PDF digital signature validation
    • Export download URL expiration (24 hours)
    • Rate limit bypass attempts
  • Compliance Testing

    • Export completeness (no data loss)
    • Retention period enforcement
    • Legal hold preventing purge
    • Audit trail for purge operations

9.3 Documentation Requirements

  • API documentation (OpenAPI + GraphQL schema)
  • TypeScript SDK usage examples
  • Report template customization guide
  • Retention policy configuration guide
  • Anomaly detection rule authoring guide
  • Export format specifications
  • Performance tuning guide

10. Revision History

| Version | Date | Author | Changes |
|---|---|---|---|
| 1.0.0 | 2026-02-16 | CISO Office | Initial release for task D.5.2 |

11. Appendices

Appendix A: Rate Limiting Configuration

// Rate limiter configuration
export const RATE_LIMITS = {
search: {
perUser: {
points: 100,
duration: 60 // per minute
},
perTenant: {
points: 1000,
duration: 60
}
},
export: {
perUser: {
points: 10,
duration: 3600 // per hour
},
perTenant: {
points: 50,
duration: 3600
}
}
};
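The points/duration pairs above describe a fixed-window quota. The sketch below implements those semantics in a minimal in-memory limiter to make the configuration concrete; it is illustrative only — a production deployment would use a shared, Redis-backed limiter so counts are accurate across API nodes.

```typescript
// Minimal fixed-window limiter matching the points/duration semantics of RATE_LIMITS.
// Illustrative: in-memory state is per-process, not cluster-wide.
class FixedWindowLimiter {
  private counts = new Map<string, { windowStart: number; used: number }>();

  constructor(private points: number, private durationSec: number) {}

  // Returns true if the request is allowed; false means respond 429
  consume(key: string, now: number = Date.now()): boolean {
    const windowMs = this.durationSec * 1000;
    const entry = this.counts.get(key);

    if (!entry || now - entry.windowStart >= windowMs) {
      this.counts.set(key, { windowStart: now, used: 1 }); // new window
      return true;
    }
    if (entry.used < this.points) {
      entry.used++;
      return true;
    }
    return false; // quota exhausted for this window
  }
}

// Search tier: 100 requests per user per minute
const limiter = new FixedWindowLimiter(100, 60);
let allowed = 0;
for (let i = 0; i < 105; i++) {
  if (limiter.consume('user-1', 1_000_000)) allowed++;
}
// allowed === 100; requests 101-105 are rejected until the window resets
```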

Appendix B: Export Job Queue Configuration

// BullMQ queue configuration
export const QUEUE_CONFIG = {
concurrency: 5,
maxJobsPerWorker: 2,
lockDuration: 600000, // 10 minutes
stalledInterval: 30000,
maxStalledCount: 2,
limiter: {
max: 10,
duration: 60000 // 10 exports per minute
}
};

Appendix C: Performance Targets

| Metric | Target | Monitoring |
|---|---|---|
| Typical search response time | <2s | P95 latency |
| Complex report generation | <30s | P95 latency |
| Export job queue throughput | 10/min | Jobs completed per minute |
| Cache hit rate | >70% | Redis cache hit ratio |
| Materialized view freshness | <6 hours | Last refresh timestamp |
| API availability | >99.9% | Uptime monitoring |

Document Classification: Internal - Restricted
Distribution: Information Security, Compliance, Engineering Leadership
Next Review: 2026-05-16