BIO-QMS Platform Evidence Documentation

Track J: Product Management & Analytics

Executive Summary

This document provides comprehensive implementation evidence for Track J (Product Management & Analytics) of the BIO-QMS platform. The analytics infrastructure enables data-driven product decisions while maintaining full compliance with FDA 21 CFR Part 11, HIPAA, and SOC 2 requirements.

Key Capabilities:

  • Privacy-compliant event telemetry with real-time streaming
  • Feature adoption tracking and cohort retention analysis
  • Work order performance metrics and quality trend dashboards
  • Compliance-safe A/B testing with statistical rigor
  • Feedback collection and roadmap integration

Technology Stack:

  • Backend: NestJS + Prisma + PostgreSQL + BigQuery
  • Event Streaming: Google Cloud Pub/Sub
  • Analytics Engine: BigQuery + Looker Studio
  • Frontend SDK: React hooks + PostHog-compatible API
  • Statistical Analysis: SciPy + Bayesian methods
  • Infrastructure: GCP (Cloud Run, Cloud Storage, Secret Manager)

Regulatory Considerations:

  • All telemetry opt-in with granular consent management
  • PHI/PII data minimization and pseudonymization
  • Audit trail for all analytics queries touching regulated data
  • Feature flags with compliance approval gates
  • A/B tests require regulatory review for GxP-impacting changes

Table of Contents

  1. J.1: Product Telemetry Infrastructure

    • J.1.1: Event Collection Pipeline
    • J.1.2: Data Warehouse Integration
    • J.1.3: Privacy-Compliant Telemetry
    • J.1.4: Real-Time Event Processing
    • J.1.5: Telemetry Health Monitoring
  2. J.2: Feature Analytics & Adoption

    • J.2.1: Feature Usage Tracking
    • J.2.2: User Journey Analysis
    • J.2.3: Cohort Analysis Engine
    • J.2.4: Feature Impact Measurement
    • J.2.5: Adoption Dashboard
  3. J.3: Work Order Analytics Dashboard

    • J.3.1: Work Order Performance Metrics
    • J.3.2: Quality Trend Analysis
    • J.3.3: Compliance Metrics Dashboard
    • J.3.4: Operational Efficiency Scoring
    • J.3.5: Executive Summary Generation
  4. J.4: A/B Testing & Experimentation

    • J.4.1: Experiment Framework
    • J.4.2: Feature Flag Integration
    • J.4.3: Statistical Analysis Engine
    • J.4.4: Experiment Governance
  5. J.5: Feedback & Roadmap Management

    • J.5.1: In-App Feedback Collection
    • J.5.2: Feedback Prioritization Engine
    • J.5.3: Roadmap Integration

J.1: Product Telemetry Infrastructure

J.1.1: Event Collection Pipeline

Overview

The BIO-QMS telemetry pipeline implements a dual-stream architecture:

  1. Client-side events (user interactions, page views, UI errors)
  2. Server-side events (API calls, business logic, system events)

Both streams merge into Google Cloud Pub/Sub for unified event processing, with BigQuery as the analytical warehouse.
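Illustratively, the merge step amounts to normalizing both streams into the same envelope before publishing. The sketch below is hypothetical (the `toEnvelope` function and its field choices are not part of the pipeline code shown later in this section):

```typescript
// Hypothetical sketch: client- and server-side events are normalized into one
// canonical envelope before landing on the shared Pub/Sub topic.
type Stream = 'client' | 'server';

interface RawEvent {
  event_name: string;
  tenant_id: string;
  properties?: Record<string, unknown>;
}

function toEnvelope(source: Stream, raw: RawEvent) {
  return {
    ...raw,
    properties: raw.properties ?? {},
    // Server receipt time is stamped at merge time for both streams.
    received_at: new Date().toISOString(),
    // The originating stream is kept for downstream filtering.
    source,
  };
}
```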

Event Schema v2.0

All events conform to a canonical JSON schema based on the Segment Spec v1.0 with regulatory extensions:

{
  "event_id": "evt_01H8XZQK9M5N2P3R4S5T6U7V8",
  "event_name": "work_order.status_changed",
  "event_version": "2.0.0",
  "timestamp": "2026-02-17T14:32:15.234Z",
  "received_at": "2026-02-17T14:32:15.456Z",

  "session_id": "ses_A1B2C3D4E5F6G7H8I9J0K",
  "user_id": "usr_bio_12345",
  "anonymous_id": "anon_9876543210abcdef",
  "tenant_id": "tnt_acme_pharma",

  "context": {
    "page": {
      "url": "/work-orders/WO-2026-0042",
      "path": "/work-orders/:id",
      "title": "Work Order WO-2026-0042",
      "referrer": "/dashboard"
    },
    "user_agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7)...",
    "ip": "10.0.1.45",
    "locale": "en-US",
    "timezone": "America/New_York",
    "library": {
      "name": "bio-qms-analytics",
      "version": "1.2.0"
    },
    "device": {
      "type": "desktop",
      "manufacturer": "Apple",
      "model": "MacBook Pro"
    }
  },

  "properties": {
    "work_order_id": "WO-2026-0042",
    "previous_status": "IN_PROGRESS",
    "new_status": "PENDING_REVIEW",
    "duration_seconds": 1847,
    "assigned_to": "usr_bio_67890",
    "department": "manufacturing",
    "is_regulatory": true,
    "regulatory_impact": "medium"
  },

  "traits": {
    "role": "operator",
    "subscription_tier": "enterprise",
    "account_age_days": 127,
    "feature_flags": ["ai_copilot_v2", "batch_processing"]
  },

  "integrations": {
    "BigQuery": true,
    "Looker": true,
    "Amplitude": false
  },

  "regulatory": {
    "contains_phi": false,
    "contains_pii": false,
    "audit_category": "business_event",
    "retention_days": 2555,
    "requires_signature": false
  }
}

Core Event Fields

| Field | Type | Required | Description | Example |
|---|---|---|---|---|
| event_id | string | Yes | Unique identifier (ULID format) | evt_01H8XZ... |
| event_name | string | Yes | Event name (dot notation) | work_order.created |
| event_version | string | Yes | Schema version (semver) | 2.0.0 |
| timestamp | ISO8601 | Yes | Client/server event time (UTC) | 2026-02-17T14:32:15.234Z |
| received_at | ISO8601 | Yes | Server receipt time (UTC) | 2026-02-17T14:32:15.456Z |
| session_id | string | No | Session identifier | ses_A1B2... |
| user_id | string | No | Authenticated user ID | usr_bio_12345 |
| anonymous_id | string | Yes | Anonymous identifier (cookie) | anon_98765... |
| tenant_id | string | Yes | Multi-tenant organization ID | tnt_acme_pharma |
| context | object | No | Contextual metadata | See schema |
| properties | object | No | Event-specific data | See schema |
| traits | object | No | User attributes (cached) | See schema |
| integrations | object | No | Destination routing | {"BigQuery": true} |
| regulatory | object | No | Compliance metadata | See schema |
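The required/optional split above mirrors the NOT NULL constraints in the BigQuery schema (J.1.2) and can be expressed as a small guard. This is an illustrative sketch, not part of the shipped SDK; the name `hasRequiredFields` is hypothetical:

```typescript
// Hypothetical guard enforcing the required fields from the table above.
const REQUIRED_FIELDS = [
  'event_id',
  'event_name',
  'event_version',
  'timestamp',
  'received_at',
  'anonymous_id',
  'tenant_id',
] as const;

function hasRequiredFields(event: Record<string, unknown>): boolean {
  return REQUIRED_FIELDS.every(
    (field) => event[field] !== undefined && event[field] !== null,
  );
}
```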

Frontend Event Collection

React Analytics Hook:

// libs/analytics/src/hooks/useAnalytics.ts
import { useCallback, useEffect } from 'react';
import { analytics } from '../client/analytics';
import { useAuth } from '@bio-qms/auth';
import { useTenant } from '@bio-qms/tenant';

// Stand-in for the shared context type used by the analytics client.
type EventContext = Record<string, unknown>;

export interface TrackEventOptions {
  properties?: Record<string, any>;
  context?: Partial<EventContext>;
  integrations?: Record<string, boolean>;
}

export function useAnalytics() {
  const { user } = useAuth();
  const { tenant } = useTenant();

  // Auto-track page views. Note: window.location is not reactive, so this
  // effect fires only on mount; in a SPA, derive the path from the router
  // (e.g. react-router's useLocation) so client-side navigations re-fire it.
  useEffect(() => {
    analytics.page(document.title, {
      url: window.location.href,
      path: window.location.pathname,
    });
  }, []);

  const track = useCallback(
    (eventName: string, options: TrackEventOptions = {}) => {
      analytics.track(
        eventName,
        {
          ...options.properties,
          tenant_id: tenant?.id,
        },
        {
          context: {
            ...options.context,
            traits: {
              role: user?.role,
              subscription_tier: tenant?.subscriptionTier,
              feature_flags: user?.featureFlags || [],
            },
          },
          integrations: options.integrations,
        },
      );
    },
    [user, tenant],
  );

  const identify = useCallback((userId: string, traits?: Record<string, any>) => {
    analytics.identify(userId, traits);
  }, []);

  const page = useCallback((name?: string, properties?: Record<string, any>) => {
    analytics.page(name, properties);
  }, []);

  return { track, identify, page };
}

Analytics Client Implementation:

// libs/analytics/src/client/analytics.ts
import { v4 as uuidv4 } from 'uuid';
import { ulid } from 'ulid';

interface AnalyticsConfig {
  writeKey: string;
  endpoint: string;
  flushInterval: number;
  maxBatchSize: number;
  retryAttempts: number;
}

// Canonical event envelope; named AnalyticsEvent so it does not shadow the DOM Event type.
interface AnalyticsEvent {
  event_id: string;
  event_name: string;
  event_version: string;
  timestamp: string;
  session_id: string;
  anonymous_id: string;
  user_id?: string;
  tenant_id?: string;
  context: Record<string, any>;
  properties: Record<string, any>;
  traits: Record<string, any>;
  integrations: Record<string, boolean>;
  regulatory: Record<string, any>;
}

class AnalyticsClient {
  private config: AnalyticsConfig;
  private queue: AnalyticsEvent[] = [];
  private flushTimer: ReturnType<typeof setInterval> | null = null;
  private retryCount = 0;
  private sessionId: string;
  private anonymousId: string;

  constructor(config: AnalyticsConfig) {
    this.config = config;
    this.sessionId = this.getOrCreateSessionId();
    this.anonymousId = this.getOrCreateAnonymousId();
    this.startFlushTimer();
  }

  track(eventName: string, properties?: Record<string, any>, options?: any): void {
    const event: AnalyticsEvent = {
      event_id: ulid(),
      event_name: eventName,
      event_version: '2.0.0',
      timestamp: new Date().toISOString(),
      session_id: this.sessionId,
      anonymous_id: this.anonymousId,
      user_id: options?.userId,
      tenant_id: options?.tenantId,
      context: {
        page: {
          url: window.location.href,
          path: window.location.pathname,
          title: document.title,
          referrer: document.referrer,
        },
        user_agent: navigator.userAgent,
        locale: navigator.language,
        timezone: Intl.DateTimeFormat().resolvedOptions().timeZone,
        library: {
          name: 'bio-qms-analytics',
          version: '1.2.0',
        },
        ...(options?.context || {}),
      },
      properties: properties || {},
      traits: options?.context?.traits || {},
      integrations: options?.integrations || { BigQuery: true },
      regulatory: this.inferRegulatoryMetadata(eventName, properties),
    };

    this.enqueue(event);
  }

  page(name?: string, properties?: Record<string, any>): void {
    this.track('page_viewed', {
      name: name || document.title,
      ...properties,
    });
  }

  identify(userId: string, traits?: Record<string, any>): void {
    this.track('user_identified', {
      user_id: userId,
      ...traits,
    });
  }

  private enqueue(event: AnalyticsEvent): void {
    this.queue.push(event);

    if (this.queue.length >= this.config.maxBatchSize) {
      this.flush();
    }
  }

  private async flush(): Promise<void> {
    if (this.queue.length === 0) return;

    const batch = this.queue.splice(0, this.config.maxBatchSize);

    try {
      const res = await fetch(this.config.endpoint, {
        method: 'POST',
        headers: {
          'Content-Type': 'application/json',
          'Authorization': `Bearer ${this.config.writeKey}`,
        },
        body: JSON.stringify({ batch }),
      });
      if (!res.ok) {
        throw new Error(`HTTP ${res.status}`);
      }
      this.retryCount = 0;
    } catch (error) {
      console.error('[Analytics] Flush failed:', error);
      // Re-queue on failure, up to the configured retry limit
      if (this.retryCount < this.config.retryAttempts) {
        this.retryCount++;
        this.queue.unshift(...batch);
      } else {
        console.error('[Analytics] Retry limit reached; dropping batch');
        this.retryCount = 0;
      }
    }
  }

  private startFlushTimer(): void {
    this.flushTimer = setInterval(() => {
      this.flush();
    }, this.config.flushInterval);
  }

  private getOrCreateSessionId(): string {
    const key = 'bio_qms_session_id';
    let sessionId = sessionStorage.getItem(key);
    if (!sessionId) {
      sessionId = `ses_${uuidv4().replace(/-/g, '')}`;
      sessionStorage.setItem(key, sessionId);
    }
    return sessionId;
  }

  private getOrCreateAnonymousId(): string {
    const key = 'bio_qms_anonymous_id';
    let anonymousId = localStorage.getItem(key);
    if (!anonymousId) {
      anonymousId = `anon_${uuidv4().replace(/-/g, '')}`;
      localStorage.setItem(key, anonymousId);
    }
    return anonymousId;
  }

  private inferRegulatoryMetadata(eventName: string, properties?: Record<string, any>) {
    // Detect PHI/PII based on event name and properties
    const containsPHI =
      eventName.includes('patient') ||
      eventName.includes('health') ||
      properties?.contains_phi === true;

    const containsPII =
      eventName.includes('user') ||
      properties?.email !== undefined ||
      properties?.contains_pii === true;

    return {
      contains_phi: containsPHI,
      contains_pii: containsPII,
      audit_category: this.categorizeEvent(eventName),
      retention_days: containsPHI ? 2555 : 730, // 7 years for PHI, 2 years default
      requires_signature: false,
    };
  }

  private categorizeEvent(eventName: string): string {
    if (eventName.startsWith('work_order.')) return 'business_event';
    if (eventName.startsWith('user.')) return 'user_event';
    if (eventName.startsWith('system.')) return 'system_event';
    if (eventName.startsWith('error.')) return 'error_event';
    return 'interaction_event';
  }
}

// Singleton instance
export const analytics = new AnalyticsClient({
  writeKey: process.env.REACT_APP_ANALYTICS_WRITE_KEY || '',
  endpoint: process.env.REACT_APP_ANALYTICS_ENDPOINT || '/api/v1/analytics/events',
  flushInterval: 5000, // 5 seconds
  maxBatchSize: 50,
  retryAttempts: 3,
});

Usage Example:

// Example: Track work order status change
import { useAnalytics } from '@bio-qms/analytics';

function WorkOrderStatusButton({ workOrder }) {
  const { track } = useAnalytics();

  const handleStatusChange = async (newStatus: string) => {
    const startTime = Date.now();

    try {
      await updateWorkOrderStatus(workOrder.id, newStatus);

      track('work_order.status_changed', {
        work_order_id: workOrder.id,
        previous_status: workOrder.status,
        new_status: newStatus,
        duration_seconds: (Date.now() - startTime) / 1000,
        is_regulatory: workOrder.isRegulatory,
        department: workOrder.department,
      });
    } catch (error) {
      track('work_order.status_change_failed', {
        work_order_id: workOrder.id,
        attempted_status: newStatus,
        error_message: error.message,
      });
    }
  };

  return <Button onClick={() => handleStatusChange('APPROVED')}>Approve</Button>;
}

Backend Event Collection

NestJS Analytics Service:

// apps/api/src/modules/analytics/analytics.service.ts
import { Injectable, Logger } from '@nestjs/common';
import { ConfigService } from '@nestjs/config';
import { PubSub } from '@google-cloud/pubsub';
import { ulid } from 'ulid';

export interface ServerEvent {
  event_id?: string;
  event_name: string;
  event_version?: string;
  timestamp?: string;
  user_id?: string;
  tenant_id: string;
  properties: Record<string, any>;
  regulatory?: {
    contains_phi?: boolean;
    contains_pii?: boolean;
    audit_category?: string;
    retention_days?: number;
  };
}

@Injectable()
export class AnalyticsService {
  private readonly logger = new Logger(AnalyticsService.name);
  private readonly pubsub: PubSub;
  private readonly topicName: string;

  constructor(private readonly config: ConfigService) {
    this.pubsub = new PubSub({
      projectId: this.config.get('GCP_PROJECT_ID'),
    });
    this.topicName = this.config.get('ANALYTICS_PUBSUB_TOPIC') || 'bio-qms-events';
  }

  async track(event: ServerEvent): Promise<void> {
    const enrichedEvent = {
      event_id: event.event_id || ulid(),
      event_name: event.event_name,
      event_version: event.event_version || '2.0.0',
      timestamp: event.timestamp || new Date().toISOString(),
      received_at: new Date().toISOString(),
      user_id: event.user_id,
      tenant_id: event.tenant_id,
      context: {
        library: {
          name: 'bio-qms-backend',
          version: this.config.get('APP_VERSION'),
        },
        server: {
          hostname: process.env.HOSTNAME,
          region: this.config.get('GCP_REGION'),
        },
      },
      properties: event.properties,
      regulatory: {
        contains_phi: event.regulatory?.contains_phi || false,
        contains_pii: event.regulatory?.contains_pii || false,
        audit_category: event.regulatory?.audit_category || 'system_event',
        retention_days: event.regulatory?.retention_days || 730,
        requires_signature: false,
      },
    };

    try {
      const topic = this.pubsub.topic(this.topicName);
      const messageBuffer = Buffer.from(JSON.stringify(enrichedEvent));

      await topic.publishMessage({
        data: messageBuffer,
        attributes: {
          event_name: event.event_name,
          tenant_id: event.tenant_id,
          timestamp: enrichedEvent.timestamp,
        },
      });

      this.logger.debug(`Event tracked: ${event.event_name}`);
    } catch (error) {
      this.logger.error(`Failed to track event: ${event.event_name}`, error.stack);
    }
  }

  async trackWorkOrderEvent(
    eventName: string,
    workOrderId: string,
    tenantId: string,
    userId: string,
    properties: Record<string, any>,
  ): Promise<void> {
    await this.track({
      event_name: `work_order.${eventName}`,
      tenant_id: tenantId,
      user_id: userId,
      properties: {
        work_order_id: workOrderId,
        ...properties,
      },
      regulatory: {
        audit_category: 'business_event',
        retention_days: 2555, // 7 years for GxP records
      },
    });
  }

  async trackAPICall(
    method: string,
    path: string,
    statusCode: number,
    durationMs: number,
    tenantId: string,
    userId?: string,
  ): Promise<void> {
    await this.track({
      event_name: 'api.request_completed',
      tenant_id: tenantId,
      user_id: userId,
      properties: {
        method,
        path,
        status_code: statusCode,
        duration_ms: durationMs,
      },
      regulatory: {
        audit_category: 'system_event',
        retention_days: 365,
      },
    });
  }
}

Analytics Interceptor (Auto-track API calls):

// apps/api/src/common/interceptors/analytics.interceptor.ts
import {
  Injectable,
  NestInterceptor,
  ExecutionContext,
  CallHandler,
} from '@nestjs/common';
import { Observable } from 'rxjs';
import { tap } from 'rxjs/operators';
import { AnalyticsService } from '../modules/analytics/analytics.service';

@Injectable()
export class AnalyticsInterceptor implements NestInterceptor {
  constructor(private readonly analytics: AnalyticsService) {}

  intercept(context: ExecutionContext, next: CallHandler): Observable<any> {
    const req = context.switchToHttp().getRequest();

    // Don't track analytics endpoints (avoid recursion)
    if (req.path.startsWith('/api/v1/analytics')) {
      return next.handle();
    }

    const startTime = Date.now();

    return next.handle().pipe(
      tap(() => {
        const res = context.switchToHttp().getResponse();
        const durationMs = Date.now() - startTime;

        this.analytics.trackAPICall(
          req.method,
          req.path,
          res.statusCode,
          durationMs,
          req.user?.tenantId || 'unknown',
          req.user?.id,
        );
      }),
    );
  }
}

Usage in Work Order Service:

// apps/api/src/modules/work-orders/work-orders.service.ts
import { Injectable, NotFoundException } from '@nestjs/common';
import { PrismaService } from '../prisma/prisma.service';
import { AnalyticsService } from '../analytics/analytics.service';

@Injectable()
export class WorkOrdersService {
  constructor(
    private readonly prisma: PrismaService,
    private readonly analytics: AnalyticsService,
  ) {}

  async updateStatus(
    workOrderId: string,
    newStatus: string,
    tenantId: string,
    userId: string,
  ) {
    // findFirst (not findUnique): id + tenantId is not a compound unique key
    const workOrder = await this.prisma.workOrder.findFirst({
      where: { id: workOrderId, tenantId },
    });

    if (!workOrder) {
      throw new NotFoundException('Work order not found');
    }

    const previousStatus = workOrder.status;

    const updated = await this.prisma.workOrder.update({
      where: { id: workOrderId },
      data: { status: newStatus, updatedAt: new Date() },
    });

    // Track analytics event
    await this.analytics.trackWorkOrderEvent(
      'status_changed',
      workOrderId,
      tenantId,
      userId,
      {
        previous_status: previousStatus,
        new_status: newStatus,
        is_regulatory: workOrder.isRegulatory,
        department: workOrder.department,
      },
    );

    return updated;
  }
}

Event Ingestion API Endpoint

// apps/api/src/modules/analytics/analytics.controller.ts
import { Controller, Post, Body, UseGuards, Req } from '@nestjs/common';
import { JwtAuthGuard } from '../auth/guards/jwt-auth.guard';
import { AnalyticsService } from './analytics.service';
import { Throttle } from '@nestjs/throttler';

interface BatchEventDto {
  batch: Array<{
    event_name: string;
    timestamp: string;
    properties: Record<string, any>;
    context?: Record<string, any>;
  }>;
}

@Controller('api/v1/analytics')
@UseGuards(JwtAuthGuard)
export class AnalyticsController {
  constructor(private readonly analytics: AnalyticsService) {}

  @Post('events')
  // 100 requests per minute; @nestjs/throttler exports Throttle (ttl in ms for v5+)
  @Throttle({ default: { limit: 100, ttl: 60000 } })
  async ingestEvents(@Req() req, @Body() dto: BatchEventDto) {
    const { user } = req;

    for (const event of dto.batch) {
      await this.analytics.track({
        event_name: event.event_name,
        timestamp: event.timestamp,
        user_id: user.id,
        tenant_id: user.tenantId,
        properties: event.properties,
      });
    }

    return { success: true, events_received: dto.batch.length };
  }
}

J.1.2: Data Warehouse Integration

BigQuery Schema Design

Events Table (Partitioned by Day):

-- BigQuery Schema: analytics.events
CREATE TABLE `bio-qms-prod.analytics.events`
(
  event_id STRING NOT NULL,
  event_name STRING NOT NULL,
  event_version STRING NOT NULL,
  timestamp TIMESTAMP NOT NULL,
  received_at TIMESTAMP NOT NULL,

  session_id STRING,
  user_id STRING,
  anonymous_id STRING NOT NULL,
  tenant_id STRING NOT NULL,

  -- Context (nested)
  context STRUCT<
    page STRUCT<
      url STRING,
      path STRING,
      title STRING,
      referrer STRING
    >,
    user_agent STRING,
    ip STRING,
    locale STRING,
    timezone STRING,
    library STRUCT<
      name STRING,
      version STRING
    >,
    device STRUCT<
      type STRING,
      manufacturer STRING,
      model STRING
    >
  >,

  -- Event properties (flexible JSON)
  properties JSON,

  -- User traits (cached)
  traits STRUCT<
    role STRING,
    subscription_tier STRING,
    account_age_days INT64,
    feature_flags ARRAY<STRING>
  >,

  -- Regulatory metadata
  regulatory STRUCT<
    contains_phi BOOL,
    contains_pii BOOL,
    audit_category STRING,
    retention_days INT64,
    requires_signature BOOL
  >,

  _ingestion_time TIMESTAMP DEFAULT CURRENT_TIMESTAMP()
)
-- BigQuery does not support computed partition columns; partition directly
-- on the event timestamp instead.
PARTITION BY DATE(timestamp)
CLUSTER BY tenant_id, event_name, user_id
OPTIONS(
  partition_expiration_days=2555,
  require_partition_filter=true,
  description="BIO-QMS product analytics events"
);

-- Note: BigQuery has no secondary indexes (CREATE INDEX is not valid DDL).
-- The CLUSTER BY columns above (tenant_id, event_name, user_id) serve the
-- common lookup patterns instead.
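Because `require_partition_filter=true`, every query against `analytics.events` must bound the partition column or BigQuery rejects it. A small illustrative helper (the function name `buildRecentEventsQuery` is hypothetical, not part of the codebase) makes that invariant hard to forget:

```typescript
// Hypothetical helper: builds a query against analytics.events that always
// bounds the partition column, as require_partition_filter=true demands.
function buildRecentEventsQuery(days: number): string {
  return `
    SELECT COUNT(*) AS n
    FROM \`bio-qms-prod.analytics.events\`
    WHERE tenant_id = @tenantId
      AND DATE(timestamp) >= DATE_SUB(CURRENT_DATE(), INTERVAL ${days} DAY)
  `;
}
```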

Aggregated Tables (Materialized Views):

-- Daily active users by tenant
-- Note: materialized views cannot use COUNT(DISTINCT ...);
-- APPROX_COUNT_DISTINCT is the supported (approximate) alternative.
CREATE MATERIALIZED VIEW `bio-qms-prod.analytics.daily_active_users`
PARTITION BY event_date
AS
SELECT
  DATE(timestamp) AS event_date,
  tenant_id,
  APPROX_COUNT_DISTINCT(user_id) AS dau,
  APPROX_COUNT_DISTINCT(session_id) AS sessions,
  COUNT(*) AS total_events
FROM `bio-qms-prod.analytics.events`
WHERE user_id IS NOT NULL
GROUP BY event_date, tenant_id;

-- Feature usage metrics
-- Note: APPROX_QUANTILES is not allowed in materialized views, so this rollup
-- is a standard table refreshed by a scheduled query.
CREATE OR REPLACE TABLE `bio-qms-prod.analytics.feature_usage_daily`
PARTITION BY event_date
AS
SELECT
  DATE(timestamp) AS event_date,
  tenant_id,
  event_name,
  COUNT(DISTINCT user_id) AS unique_users,
  COUNT(*) AS event_count,
  APPROX_QUANTILES(
    CAST(JSON_EXTRACT_SCALAR(properties, '$.duration_seconds') AS FLOAT64), 100
  )[OFFSET(50)] AS median_duration_seconds
FROM `bio-qms-prod.analytics.events`
WHERE event_name LIKE 'feature.%'
  AND DATE(timestamp) >= '1970-01-01' -- satisfies require_partition_filter
GROUP BY event_date, tenant_id, event_name;

-- Work order performance metrics
CREATE MATERIALIZED VIEW `bio-qms-prod.analytics.work_order_metrics_daily`
PARTITION BY event_date
AS
SELECT
  DATE(timestamp) AS event_date,
  tenant_id,
  JSON_EXTRACT_SCALAR(properties, '$.department') AS department,
  APPROX_COUNT_DISTINCT(JSON_EXTRACT_SCALAR(properties, '$.work_order_id')) AS work_orders_count,
  AVG(CAST(JSON_EXTRACT_SCALAR(properties, '$.duration_seconds') AS FLOAT64)) AS avg_cycle_time_seconds,
  COUNTIF(JSON_EXTRACT_SCALAR(properties, '$.new_status') = 'COMPLETED') AS completed_count,
  COUNTIF(JSON_EXTRACT_SCALAR(properties, '$.new_status') = 'REJECTED') AS rejected_count
FROM `bio-qms-prod.analytics.events`
WHERE event_name = 'work_order.status_changed'
GROUP BY event_date, tenant_id, department;

Cloud Pub/Sub to BigQuery Dataflow

Dataflow Pipeline Configuration:

// infrastructure/dataflow/pubsub-to-bigquery.ts
// The Google-provided PubSub-Subscription-to-BigQuery template is launched
// via the Dataflow REST API (v1b3) through the googleapis client; there is no
// templates.create method, so templates.launch is used.
import { google } from 'googleapis';

export async function deployPubSubToBigQueryPipeline() {
  const auth = new google.auth.GoogleAuth({
    scopes: ['https://www.googleapis.com/auth/cloud-platform'],
  });
  const dataflow = google.dataflow({ version: 'v1b3', auth });

  await dataflow.projects.locations.templates.launch({
    projectId: 'bio-qms-prod',
    location: 'us-central1',
    gcsPath: 'gs://dataflow-templates/latest/PubSub_Subscription_to_BigQuery',
    requestBody: {
      jobName: 'pubsub-events-to-bigquery',
      parameters: {
        inputSubscription: 'projects/bio-qms-prod/subscriptions/analytics-events-sub',
        outputTableSpec: 'bio-qms-prod:analytics.events',
        outputDeadletterTable: 'bio-qms-prod:analytics.events_deadletter',
        javascriptTextTransformGcsPath: 'gs://bio-qms-dataflow/transform.js',
        javascriptTextTransformFunctionName: 'transformEvent',
      },
      environment: {
        tempLocation: 'gs://bio-qms-dataflow/temp',
        maxWorkers: 10,
        numWorkers: 2,
        machineType: 'n1-standard-2',
        zone: 'us-central1-a',
      },
    },
  });
}

Event Transform Function (JavaScript UDF for Dataflow):

// infrastructure/dataflow/transform.js
function transformEvent(eventJson) {
const event = JSON.parse(eventJson);

// Validate required fields
if (!event.event_id || !event.event_name || !event.tenant_id) {
throw new Error('Missing required fields');
}

// Ensure timestamp is valid
const timestamp = new Date(event.timestamp);
if (isNaN(timestamp.getTime())) {
throw new Error('Invalid timestamp');
}

// Transform to BigQuery schema
return JSON.stringify({
event_id: event.event_id,
event_name: event.event_name,
event_version: event.event_version || '2.0.0',
timestamp: event.timestamp,
received_at: event.received_at || new Date().toISOString(),
session_id: event.session_id || null,
user_id: event.user_id || null,
anonymous_id: event.anonymous_id,
tenant_id: event.tenant_id,
context: event.context || {},
properties: event.properties || {},
traits: event.traits || {},
regulatory: event.regulatory || {
contains_phi: false,
contains_pii: false,
audit_category: 'interaction_event',
retention_days: 730,
requires_signature: false,
},
});
}

Terraform Infrastructure

# infrastructure/terraform/analytics.tf
resource "google_pubsub_topic" "analytics_events" {
name = "bio-qms-events"
project = var.gcp_project_id

message_retention_duration = "604800s" # 7 days

labels = {
environment = var.environment
service = "analytics"
}
}

resource "google_pubsub_subscription" "analytics_events_bigquery" {
name = "analytics-events-sub"
topic = google_pubsub_topic.analytics_events.name
project = var.gcp_project_id

ack_deadline_seconds = 60
message_retention_duration = "604800s"

retry_policy {
minimum_backoff = "10s"
maximum_backoff = "600s"
}

dead_letter_policy {
dead_letter_topic = google_pubsub_topic.analytics_deadletter.id
max_delivery_attempts = 5
}
}

resource "google_bigquery_dataset" "analytics" {
dataset_id = "analytics"
project = var.gcp_project_id
location = "US"
description = "BIO-QMS product analytics data warehouse"

default_table_expiration_ms = 220752000000 # 2555 days (7 years)

access {
role = "OWNER"
user_by_email = google_service_account.analytics.email
}

access {
role = "READER"
special_group = "projectReaders"
}

labels = {
environment = var.environment
compliance = "gxp"
}
}

resource "google_bigquery_table" "events" {
dataset_id = google_bigquery_dataset.analytics.dataset_id
table_id = "events"
project = var.gcp_project_id

time_partitioning {
type = "DAY"
field = "timestamp"
expiration_ms = 220752000000 # 2555 days
require_partition_filter = true
}

clustering = ["tenant_id", "event_name", "user_id"]

schema = file("${path.module}/schemas/events.json")

labels = {
environment = var.environment
pii = "true"
}
}

J.1.3: Privacy-Compliant Telemetry

Prisma Schema:

// prisma/schema.prisma
model User {
  id       String @id @default(cuid())
  email    String @unique
  tenantId String

  // Analytics consent
  analyticsConsent AnalyticsConsent?

  @@index([tenantId])
}

model AnalyticsConsent {
  id     String @id @default(cuid())
  userId String @unique
  user   User   @relation(fields: [userId], references: [id], onDelete: Cascade)

  // Granular consent flags
  allowProductAnalytics      Boolean @default(false)
  allowPerformanceMonitoring Boolean @default(false)
  allowErrorTracking         Boolean @default(true) // Always true for support
  allowMarketingAnalytics    Boolean @default(false)

  // IP address handling
  anonymizeIP Boolean @default(true)

  // Consent metadata
  consentVersion String   @default("1.0")
  consentDate    DateTime @default(now())
  updatedAt      DateTime @updatedAt
  ipAddress      String? // IP where consent was given
  userAgent      String?

  // GDPR/CCPA
  dataRetentionDays Int @default(730) // 2 years default

  @@index([userId])
}

model AnalyticsOptOut {
  id          String   @id @default(cuid())
  anonymousId String   @unique
  reason      String?
  optedOutAt  DateTime @default(now())

  @@index([anonymousId])
}

Consent Service:

// apps/api/src/modules/analytics/consent.service.ts
import { Injectable } from '@nestjs/common';
import { PrismaService } from '../prisma/prisma.service';

export interface UpdateConsentDto {
  allowProductAnalytics?: boolean;
  allowPerformanceMonitoring?: boolean;
  allowErrorTracking?: boolean;
  allowMarketingAnalytics?: boolean;
  anonymizeIP?: boolean;
  dataRetentionDays?: number;
}

@Injectable()
export class ConsentService {
  constructor(private readonly prisma: PrismaService) {}

  async getConsent(userId: string) {
    return this.prisma.analyticsConsent.findUnique({
      where: { userId },
    });
  }

  async updateConsent(
    userId: string,
    dto: UpdateConsentDto,
    ipAddress: string,
    userAgent: string,
  ) {
    return this.prisma.analyticsConsent.upsert({
      where: { userId },
      create: {
        userId,
        ...dto,
        consentVersion: '1.0',
        ipAddress,
        userAgent,
      },
      update: {
        ...dto,
        updatedAt: new Date(),
      },
    });
  }

  async checkConsent(userId: string, consentType: keyof UpdateConsentDto): Promise<boolean> {
    const consent = await this.getConsent(userId);
    if (!consent) return false;

    return consent[consentType] === true;
  }

  async optOut(anonymousId: string, reason?: string) {
    return this.prisma.analyticsOptOut.create({
      data: {
        anonymousId,
        reason,
      },
    });
  }

  async isOptedOut(anonymousId: string): Promise<boolean> {
    const optOut = await this.prisma.analyticsOptOut.findUnique({
      where: { anonymousId },
    });
    return !!optOut;
  }
}

Consent-Aware Analytics Middleware:

// apps/api/src/modules/analytics/analytics.service.ts (updated)
// Note: ServerEvent is assumed here to carry optional anonymous_id and
// context fields in addition to the interface shown earlier.
@Injectable()
export class AnalyticsService {
  private readonly logger = new Logger(AnalyticsService.name);

  constructor(
    private readonly config: ConfigService,
    private readonly consent: ConsentService,
  ) {}

  async track(event: ServerEvent): Promise<void> {
    // Check opt-out for anonymous users
    if (event.anonymous_id) {
      const isOptedOut = await this.consent.isOptedOut(event.anonymous_id);
      if (isOptedOut) {
        this.logger.debug(`Event blocked: user opted out (${event.anonymous_id})`);
        return;
      }
    }

    // Check consent for authenticated users
    if (event.user_id) {
      const consentType = this.getRequiredConsentType(event.event_name);
      const hasConsent = await this.consent.checkConsent(event.user_id, consentType);

      if (!hasConsent) {
        this.logger.debug(`Event blocked: missing consent (${event.user_id}, ${consentType})`);
        return;
      }
    }

    // Anonymize IP if required
    if (event.context?.ip && event.user_id) {
      const consent = await this.consent.getConsent(event.user_id);
      if (consent?.anonymizeIP) {
        event.context.ip = this.anonymizeIP(event.context.ip);
      }
    }

    // Continue with event tracking...
  }

  private getRequiredConsentType(eventName: string): keyof UpdateConsentDto {
    if (eventName.startsWith('error.')) return 'allowErrorTracking';
    if (eventName.startsWith('performance.')) return 'allowPerformanceMonitoring';
    if (eventName.startsWith('marketing.')) return 'allowMarketingAnalytics';
    return 'allowProductAnalytics';
  }

  private anonymizeIP(ip: string): string {
    // IPv4: mask last octet
    if (ip.includes('.')) {
      const parts = ip.split('.');
      return `${parts[0]}.${parts[1]}.${parts[2]}.0`;
    }
    // IPv6: keep the first three hextets (/48), masking the last 80 bits
    if (ip.includes(':')) {
      const parts = ip.split(':');
      return `${parts.slice(0, 3).join(':')}::`;
    }
    return '0.0.0.0';
  }
}
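The masking behaves as follows; this is a self-contained restatement of the `anonymizeIP` logic above, reproduced so the example runs on its own:

```typescript
// Self-contained copy of the anonymizeIP logic, for illustration only.
function anonymizeIP(ip: string): string {
  // IPv4: zero the last octet
  if (ip.includes('.')) {
    const parts = ip.split('.');
    return `${parts[0]}.${parts[1]}.${parts[2]}.0`;
  }
  // IPv6: keep the first three hextets (/48), mask the rest
  if (ip.includes(':')) {
    const parts = ip.split(':');
    return `${parts.slice(0, 3).join(':')}::`;
  }
  return '0.0.0.0';
}

// anonymizeIP('203.0.113.42')         → '203.0.113.0'
// anonymizeIP('2001:db8:85a3:8d3::1') → '2001:db8:85a3::'
```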

Data Minimization

PII Scrubbing Function:

// libs/analytics/src/utils/scrub-pii.ts
const PII_PATTERNS = {
email: /[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}/g,
phone: /(\+?1[-.\s]?)?(\(?\d{3}\)?[-.\s]?)?\d{3}[-.\s]?\d{4}/g,
ssn: /\d{3}-?\d{2}-?\d{4}/g,
credit_card: /\d{4}[-\s]?\d{4}[-\s]?\d{4}[-\s]?\d{4}/g,
ip_address: /\b(?:\d{1,3}\.){3}\d{1,3}\b/g,
};

const PHI_KEYWORDS = [
'patient',
'diagnosis',
'medication',
'prescription',
'treatment',
'health_record',
'medical_condition',
];

export function scrubPII(obj: any): any {
if (typeof obj !== 'object' || obj === null) {
if (typeof obj === 'string') {
return scrubString(obj);
}
return obj;
}

if (Array.isArray(obj)) {
return obj.map(scrubPII);
}

const scrubbed: any = {};
for (const [key, value] of Object.entries(obj)) {
// Redact sensitive keys
if (isSensitiveKey(key)) {
scrubbed[key] = '[REDACTED]';
} else {
scrubbed[key] = scrubPII(value);
}
}

return scrubbed;
}

function scrubString(str: string): string {
let result = str;

for (const [type, pattern] of Object.entries(PII_PATTERNS)) {
result = result.replace(pattern, `[${type.toUpperCase()}_REDACTED]`);
}

return result;
}

function isSensitiveKey(key: string): boolean {
const lowerKey = key.toLowerCase();

const sensitiveKeys = [
'password',
'secret',
'token',
'api_key',
'ssn',
'social_security',
'credit_card',
'cvv',
'pin',
'phone_number',
'date_of_birth',
'dob',
];

return sensitiveKeys.some(k => lowerKey.includes(k)) ||
PHI_KEYWORDS.some(k => lowerKey.includes(k));
}
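A quick, self-contained illustration of the pattern-based redaction; the email pattern below is copied from `PII_PATTERNS` so the snippet runs on its own, and `scrubEmails` is a trimmed stand-in for `scrubString`:

```typescript
// Copy of the email pattern from PII_PATTERNS, for illustration only.
const EMAIL_PATTERN = /[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}/g;

// Trimmed stand-in for scrubString covering a single pattern.
function scrubEmails(str: string): string {
  return str.replace(EMAIL_PATTERN, '[EMAIL_REDACTED]');
}

// Free-text properties are scrubbed like this before entering the event queue:
const scrubbed = scrubEmails('Contact jane.doe@acme-pharma.com for details');
// scrubbed === 'Contact [EMAIL_REDACTED] for details'
```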

Usage in Analytics Client:

// libs/analytics/src/client/analytics.ts (updated)
import { scrubPII } from '../utils/scrub-pii';

class AnalyticsClient {
  track(eventName: string, properties?: Record<string, any>, options?: any): void {
    // Scrub PII from properties before sending
    const scrubbedProperties = scrubPII(properties || {});

    const event = {
      // ... other fields
      properties: scrubbedProperties,
      // ...
    };

    this.enqueue(event);
  }
}

GDPR Data Export & Deletion

GDPR Controller:

// apps/api/src/modules/analytics/gdpr.controller.ts
import { Controller, Get, Delete, UseGuards, Req, Res } from '@nestjs/common';
import { JwtAuthGuard } from '../auth/guards/jwt-auth.guard';
import { PrismaService } from '../prisma/prisma.service';
import { BigQuery } from '@google-cloud/bigquery';
import { Response } from 'express';

@Controller('api/v1/analytics/gdpr')
@UseGuards(JwtAuthGuard)
export class GDPRController {
  private readonly bigquery: BigQuery;

  constructor(private readonly prisma: PrismaService) {
    this.bigquery = new BigQuery({ projectId: 'bio-qms-prod' });
  }

  @Get('export')
  async exportUserData(@Req() req, @Res() res: Response) {
    const userId = req.user.id;

    // Query all events for this user. The events table sets
    // require_partition_filter, so the partition column must be bounded.
    const query = `
      SELECT *
      FROM \`bio-qms-prod.analytics.events\`
      WHERE user_id = @userId
        AND timestamp >= TIMESTAMP('1970-01-01')
      ORDER BY timestamp DESC
    `;

    const [rows] = await this.bigquery.query({
      query,
      params: { userId },
    });

    // Stream as JSON
    res.setHeader('Content-Type', 'application/json');
    res.setHeader('Content-Disposition', `attachment; filename="analytics-export-${userId}.json"`);
    res.send(JSON.stringify(rows, null, 2));
  }

  @Delete('delete')
  async deleteUserData(@Req() req) {
    const userId = req.user.id;

    // Pseudonymize in BigQuery (soft delete by adding a deletion flag)
    const query = `
      UPDATE \`bio-qms-prod.analytics.events\`
      SET properties = JSON_SET(properties, '$.gdpr_deleted', true),
          user_id = NULL,
          anonymous_id = 'deleted'
      WHERE user_id = @userId
        AND timestamp >= TIMESTAMP('1970-01-01')
    `;

    await this.bigquery.query({
      query,
      params: { userId },
    });

    // Delete consent record (deleteMany avoids an error if none exists)
    await this.prisma.analyticsConsent.deleteMany({
      where: { userId },
    });

    return { success: true, message: 'Analytics data deleted' };
  }
}

J.1.4: Real-Time Event Processing

Cloud Functions for Real-Time Alerts

Anomaly Detection Function:

// functions/analytics-anomaly-detection/src/index.ts
import { EventFunction } from '@google-cloud/functions-framework';
import { PubSub } from '@google-cloud/pubsub';
import { BigQuery } from '@google-cloud/bigquery';

interface PubSubMessage {
  data: string;
  attributes: Record<string, string>;
}

export const detectAnomalies: EventFunction = async (message: PubSubMessage) => {
  const eventData = JSON.parse(Buffer.from(message.data, 'base64').toString());

  // Check for error rate spikes
  if (eventData.event_name.startsWith('error.')) {
    await checkErrorRateAnomaly(eventData);
  }

  // Check for usage drops (feature adoption)
  if (eventData.event_name.startsWith('feature.')) {
    await checkUsageDropAnomaly(eventData);
  }

  // Check for performance degradation (guard against events without properties)
  if ((eventData.properties?.duration_ms ?? 0) > 5000) {
    await alertSlowPerformance(eventData);
  }
};

async function checkErrorRateAnomaly(event: any) {
  const bigquery = new BigQuery();

  // Get error rate for last hour
  const query = `
    SELECT COUNT(*) as error_count
    FROM \`bio-qms-prod.analytics.events\`
    WHERE event_name LIKE 'error.%'
      AND tenant_id = @tenantId
      AND timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 HOUR)
  `;

  const [rows] = await bigquery.query({
    query,
    params: { tenantId: event.tenant_id },
  });

  const errorCount = rows[0].error_count;

  // Compare to baseline (previous 24 hours)
  const baselineQuery = `
    SELECT AVG(hourly_errors) as baseline
    FROM (
      SELECT COUNT(*) as hourly_errors
      FROM \`bio-qms-prod.analytics.events\`
      WHERE event_name LIKE 'error.%'
        AND tenant_id = @tenantId
        AND timestamp BETWEEN TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 24 HOUR)
        AND TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 HOUR)
      GROUP BY TIMESTAMP_TRUNC(timestamp, HOUR)
    )
  `;

  const [baselineRows] = await bigquery.query({
    query: baselineQuery,
    params: { tenantId: event.tenant_id },
  });

  const baseline = baselineRows[0].baseline || 0;

  // Alert if error rate is 3x baseline
  if (errorCount > baseline * 3 && errorCount > 10) {
    await sendAlert({
      type: 'error_rate_spike',
      tenant_id: event.tenant_id,
      current_rate: errorCount,
      baseline_rate: baseline,
      severity: 'high',
    });
  }
}

async function sendAlert(alert: any) {
  const pubsub = new PubSub();
  const topic = pubsub.topic('analytics-alerts');

  await topic.publishMessage({
    data: Buffer.from(JSON.stringify(alert)),
    attributes: {
      type: alert.type,
      severity: alert.severity,
    },
  });
}

Deploy Configuration:

# functions/analytics-anomaly-detection/function.yaml
runtime: nodejs20
entryPoint: detectAnomalies
eventTrigger:
  eventType: google.pubsub.topic.publish
  resource: projects/bio-qms-prod/topics/bio-qms-events
  failurePolicy:
    retry: {}
environmentVariables:
  GCP_PROJECT_ID: bio-qms-prod
  ALERT_TOPIC: analytics-alerts

Streaming Analytics with Dataflow

Real-Time Aggregation Pipeline:

# infrastructure/dataflow/realtime_metrics.py
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions
from apache_beam.transforms import window
import json

class ParseEvent(beam.DoFn):
    def process(self, message):
        # ReadFromPubSub yields the raw message payload as bytes by default
        event = json.loads(message.decode('utf-8'))
        yield {
            'tenant_id': event.get('tenant_id'),
            'event_name': event.get('event_name'),
            'timestamp': event.get('timestamp'),
            'user_id': event.get('user_id'),
        }

class CountByTenant(beam.DoFn):
    def process(self, element):
        tenant_id, events = element
        events = list(events)  # grouped values arrive as an iterable
        yield {
            'tenant_id': tenant_id,
            'event_count': len(events),
            'unique_users': len(set(e['user_id'] for e in events if e.get('user_id'))),
            'window_end': events[0]['timestamp'],
        }

def run_pipeline():
    options = PipelineOptions([
        '--project=bio-qms-prod',
        '--region=us-central1',
        '--runner=DataflowRunner',
        '--streaming',
        '--temp_location=gs://bio-qms-dataflow/temp',
    ])

    with beam.Pipeline(options=options) as p:
        events = (
            p
            | 'Read from Pub/Sub' >> beam.io.ReadFromPubSub(
                subscription='projects/bio-qms-prod/subscriptions/analytics-events-sub'
            )
            | 'Parse JSON' >> beam.ParDo(ParseEvent())
        )

        # Real-time metrics (5-minute windows)
        realtime_metrics = (
            events
            | 'Window 5min' >> beam.WindowInto(window.FixedWindows(5 * 60))
            | 'Key by tenant' >> beam.Map(lambda e: (e['tenant_id'], e))
            | 'Group by tenant' >> beam.GroupByKey()
            | 'Count events' >> beam.ParDo(CountByTenant())
            | 'Write to BigQuery' >> beam.io.WriteToBigQuery(
                'bio-qms-prod:analytics.realtime_metrics',
                schema='tenant_id:STRING,event_count:INTEGER,unique_users:INTEGER,window_end:TIMESTAMP',
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )

if __name__ == '__main__':
    run_pipeline()

J.1.5: Telemetry Health Monitoring

Monitoring Dashboard (Looker Studio)

BigQuery Views for Monitoring:

-- Event ingestion health metrics
CREATE OR REPLACE VIEW `bio-qms-prod.analytics.telemetry_health` AS
SELECT
TIMESTAMP_TRUNC(timestamp, HOUR) AS hour,
COUNT(*) AS events_received,
COUNT(DISTINCT tenant_id) AS active_tenants,
COUNT(DISTINCT user_id) AS active_users,

-- Latency metrics (received_at - timestamp)
AVG(TIMESTAMP_DIFF(received_at, timestamp, MILLISECOND)) AS avg_latency_ms,
APPROX_QUANTILES(TIMESTAMP_DIFF(received_at, timestamp, MILLISECOND), 100)[OFFSET(95)] AS p95_latency_ms,
APPROX_QUANTILES(TIMESTAMP_DIFF(received_at, timestamp, MILLISECOND), 100)[OFFSET(99)] AS p99_latency_ms,

-- Event types
COUNT(DISTINCT event_name) AS unique_event_types,
COUNTIF(event_name LIKE 'error.%') AS error_events,
COUNTIF(regulatory.contains_phi) AS phi_events,
COUNTIF(regulatory.contains_pii) AS pii_events,

-- Data quality
COUNTIF(event_id IS NULL) AS missing_event_id,
COUNTIF(timestamp IS NULL) AS missing_timestamp,
COUNTIF(tenant_id IS NULL) AS missing_tenant_id,

FROM `bio-qms-prod.analytics.events`
WHERE timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
GROUP BY hour
ORDER BY hour DESC;

-- Event schema version distribution
CREATE OR REPLACE VIEW `bio-qms-prod.analytics.schema_versions` AS
SELECT
event_version,
COUNT(*) AS event_count,
COUNT(DISTINCT tenant_id) AS tenant_count,
MIN(timestamp) AS first_seen,
MAX(timestamp) AS last_seen
FROM `bio-qms-prod.analytics.events`
WHERE timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
GROUP BY event_version
ORDER BY last_seen DESC;

-- Data quality report
CREATE OR REPLACE VIEW `bio-qms-prod.analytics.data_quality` AS
SELECT
DATE(timestamp) AS date,

-- Completeness
COUNT(*) AS total_events,
COUNTIF(user_id IS NOT NULL) AS events_with_user_id,
COUNTIF(session_id IS NOT NULL) AS events_with_session_id,
COUNTIF(context.page.url IS NOT NULL) AS events_with_page_url,

-- Validity
COUNTIF(TIMESTAMP_DIFF(received_at, timestamp, SECOND) < 0) AS future_timestamps,
COUNTIF(TIMESTAMP_DIFF(received_at, timestamp, SECOND) > 3600) AS delayed_events_1h,
COUNTIF(JSON_EXTRACT(properties, '$') IS NULL) AS empty_properties,

-- Consistency
COUNTIF(LENGTH(event_id) != 26) AS invalid_event_id_format,
COUNTIF(event_name NOT LIKE '%.%') AS invalid_event_name_format,

FROM `bio-qms-prod.analytics.events`
WHERE timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
GROUP BY date
ORDER BY date DESC;

Alerting Rules (Cloud Monitoring)

Alert Policy Configuration:

# infrastructure/monitoring/analytics-alerts.yaml
alerts:
  - name: "Analytics Ingestion Failure"
    condition:
      query: |
        SELECT COUNT(*) as event_count
        FROM `bio-qms-prod.analytics.events`
        WHERE timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 5 MINUTE)
      threshold: 10  # Alert if < 10 events in 5 minutes
      comparison: LESS_THAN
    notification:
      - channel: pagerduty-oncall
      - channel: slack-analytics
    severity: CRITICAL

  - name: "High Event Latency"
    condition:
      query: |
        SELECT AVG(TIMESTAMP_DIFF(received_at, timestamp, SECOND)) as avg_latency_sec
        FROM `bio-qms-prod.analytics.events`
        WHERE timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 5 MINUTE)
      threshold: 60  # Alert if avg latency > 60 seconds
      comparison: GREATER_THAN
    notification:
      - channel: slack-analytics
    severity: WARNING

  - name: "Data Quality Degradation"
    condition:
      query: |
        SELECT
          COUNTIF(user_id IS NULL) / COUNT(*) * 100 as missing_user_id_pct
        FROM `bio-qms-prod.analytics.events`
        WHERE timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 HOUR)
      threshold: 50  # Alert if > 50% of events missing user_id
      comparison: GREATER_THAN
    notification:
      - channel: slack-analytics
    severity: WARNING

  - name: "PHI Leak Detection"
    condition:
      query: |
        SELECT COUNT(*) as phi_events
        FROM `bio-qms-prod.analytics.events`
        WHERE timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 5 MINUTE)
          AND regulatory.contains_phi = true
          AND event_name NOT LIKE 'work_order.%'  # PHI expected in work orders
      threshold: 0  # Alert on any unexpected PHI
      comparison: GREATER_THAN
    notification:
      - channel: pagerduty-security
      - channel: slack-compliance
    severity: CRITICAL

Health Check API Endpoint

// apps/api/src/modules/analytics/health.controller.ts
import { Controller, Get } from '@nestjs/common';
import { BigQuery } from '@google-cloud/bigquery';

@Controller('api/v1/analytics/health')
export class AnalyticsHealthController {
private readonly bigquery: BigQuery;

constructor() {
this.bigquery = new BigQuery({ projectId: 'bio-qms-prod' });
}

@Get()
async getHealth() {
const checks = await Promise.all([
this.checkIngestionRate(),
this.checkLatency(),
this.checkDataQuality(),
]);

const overallStatus = checks.every(c => c.status === 'healthy') ? 'healthy' : 'degraded';

return {
status: overallStatus,
timestamp: new Date().toISOString(),
checks,
};
}

private async checkIngestionRate() {
const query = `
SELECT COUNT(*) as count
FROM \`bio-qms-prod.analytics.events\`
WHERE timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 5 MINUTE)
`;

const [rows] = await this.bigquery.query({ query });
const count = rows[0].count;

return {
name: 'ingestion_rate',
status: count > 10 ? 'healthy' : 'unhealthy',
value: count,
threshold: 10,
message: `${count} events in last 5 minutes`,
};
}

private async checkLatency() {
const query = `
SELECT AVG(TIMESTAMP_DIFF(received_at, timestamp, SECOND)) as avg_latency
FROM \`bio-qms-prod.analytics.events\`
WHERE timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 5 MINUTE)
`;

const [rows] = await this.bigquery.query({ query });
const latency = rows[0].avg_latency || 0;

return {
name: 'latency',
status: latency < 60 ? 'healthy' : 'degraded',
value: latency,
threshold: 60,
unit: 'seconds',
message: `Average latency: ${latency.toFixed(2)}s`,
};
}

private async checkDataQuality() {
const query = `
SELECT
COUNTIF(user_id IS NULL) / COUNT(*) * 100 as missing_user_id_pct
FROM \`bio-qms-prod.analytics.events\`
WHERE timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 HOUR)
`;

const [rows] = await this.bigquery.query({ query });
const missingPct = rows[0].missing_user_id_pct || 0;

return {
name: 'data_quality',
status: missingPct < 50 ? 'healthy' : 'degraded',
value: missingPct,
threshold: 50,
unit: 'percent',
message: `${missingPct.toFixed(1)}% events missing user_id`,
};
}
}

J.2: Feature Analytics & Adoption

J.2.1: Feature Usage Tracking

Feature Event Naming Convention

feature.<feature_name>.<action>

Examples:
- feature.ai_copilot.enabled
- feature.ai_copilot.suggestion_accepted
- feature.ai_copilot.suggestion_rejected
- feature.batch_processing.job_created
- feature.batch_processing.job_completed
- feature.e_signature.signature_added
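
The convention can be enforced with a small helper when events are emitted (a sketch; `featureEvent` is a hypothetical utility, not part of the shipped SDK):

```typescript
// Hypothetical helper enforcing the feature.<feature_name>.<action> convention.
// Segments must be snake_case so warehouse rollups on "feature.%" stay clean.
function featureEvent(featureName: string, action: string): string {
  const segment = /^[a-z][a-z0-9_]*$/;
  if (!segment.test(featureName) || !segment.test(action)) {
    throw new Error(`Invalid feature event segment: ${featureName}.${action}`);
  }
  return `feature.${featureName}.${action}`;
}

console.log(featureEvent('ai_copilot', 'suggestion_accepted'));
// → feature.ai_copilot.suggestion_accepted
```

Validating segments up front keeps malformed names out of the warehouse, where they would otherwise break `LIKE 'feature.%'` aggregations.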

Feature Registry

Prisma Schema:

// prisma/schema.prisma
model Feature {
id String @id @default(cuid())
name String @unique
displayName String
description String
category String // 'core', 'premium', 'experimental'

// Launch metadata
launchedAt DateTime?
sunsetAt DateTime?
status String @default("beta") // 'alpha', 'beta', 'ga', 'deprecated'

// Adoption targets
targetAdoptionPct Float? // Target adoption rate (e.g., 0.75 for 75%)
targetDaysToAdopt Int? // Target days to first use (e.g., 7)

// Feature flags
isGated Boolean @default(false)
requiredTier String? // 'starter', 'professional', 'enterprise'

// Metrics (cached)
adoptionRate Float? @default(0)
activeUsers30d Int? @default(0)
lastMetricsUpdate DateTime?

createdAt DateTime @default(now())
updatedAt DateTime @updatedAt

@@index([category, status])
}

model FeatureAdoption {
id String @id @default(cuid())
featureName String
userId String
tenantId String

// Adoption lifecycle
firstUsedAt DateTime @default(now())
lastUsedAt DateTime @default(now())
usageCount Int @default(1)

// Stickiness (7-day rolling window)
daysActiveL7 Int @default(1)
daysActiveL30 Int @default(1)

// Engagement
isActive Boolean @default(true)
abandonedAt DateTime?

@@unique([featureName, userId, tenantId])
@@index([featureName, tenantId])
@@index([userId, tenantId])
}

Feature Tracking Service

// apps/api/src/modules/analytics/feature-tracking.service.ts
import { Injectable, Logger, NotFoundException } from '@nestjs/common';
import { PrismaService } from '../prisma/prisma.service';
import { AnalyticsService } from './analytics.service';
import { Cron, CronExpression } from '@nestjs/schedule';

@Injectable()
export class FeatureTrackingService {
private readonly logger = new Logger(FeatureTrackingService.name);

constructor(
private readonly prisma: PrismaService,
private readonly analytics: AnalyticsService,
) {}

async trackFeatureUsage(
featureName: string,
action: string,
userId: string,
tenantId: string,
properties?: Record<string, any>,
): Promise<void> {
// Track analytics event
await this.analytics.track({
event_name: `feature.${featureName}.${action}`,
user_id: userId,
tenant_id: tenantId,
properties: {
feature_name: featureName,
action,
...properties,
},
regulatory: {
audit_category: 'feature_event',
retention_days: 730,
},
});

// Update adoption record
await this.upsertAdoption(featureName, userId, tenantId);
}

private async upsertAdoption(
featureName: string,
userId: string,
tenantId: string,
): Promise<void> {
const now = new Date();
const sevenDaysAgo = new Date(now.getTime() - 7 * 24 * 60 * 60 * 1000);

await this.prisma.featureAdoption.upsert({
where: {
featureName_userId_tenantId: {
featureName,
userId,
tenantId,
},
},
create: {
featureName,
userId,
tenantId,
firstUsedAt: now,
lastUsedAt: now,
usageCount: 1,
daysActiveL7: 1,
daysActiveL30: 1,
},
update: {
lastUsedAt: now,
usageCount: { increment: 1 },
isActive: true,
abandonedAt: null,
},
});
}

// Daily cron job to update feature metrics
@Cron(CronExpression.EVERY_DAY_AT_1AM)
async updateFeatureMetrics(): Promise<void> {
this.logger.log('Starting daily feature metrics update...');

const features = await this.prisma.feature.findMany();

for (const feature of features) {
await this.updateFeatureAdoptionRate(feature.name);
}

this.logger.log('Feature metrics update complete');
}

private async updateFeatureAdoptionRate(featureName: string): Promise<void> {
const thirtyDaysAgo = new Date(Date.now() - 30 * 24 * 60 * 60 * 1000);

// Count active users in last 30 days
const activeUsers = await this.prisma.featureAdoption.count({
where: {
featureName,
lastUsedAt: { gte: thirtyDaysAgo },
},
});

// Count total eligible users (from all tenants with access)
const totalUsers = await this.prisma.user.count({
where: {
tenant: {
features: {
some: { name: featureName },
},
},
},
});

const adoptionRate = totalUsers > 0 ? activeUsers / totalUsers : 0;

await this.prisma.feature.update({
where: { name: featureName },
data: {
adoptionRate,
activeUsers30d: activeUsers,
lastMetricsUpdate: new Date(),
},
});
}

async getFeatureAdoptionMetrics(featureName: string) {
const feature = await this.prisma.feature.findUnique({
where: { name: featureName },
});

if (!feature) {
throw new NotFoundException(`Feature not found: ${featureName}`);
}

// Get adoption curve (daily cumulative adopters)
const adoptionCurve = await this.prisma.$queryRaw`
SELECT
DATE(first_used_at) as date,
COUNT(DISTINCT user_id) as new_adopters,
SUM(COUNT(DISTINCT user_id)) OVER (ORDER BY DATE(first_used_at)) as cumulative_adopters
FROM feature_adoption
WHERE feature_name = ${featureName}
GROUP BY date
ORDER BY date ASC
`;

// Get retention cohorts
const retentionCohorts = await this.getRetentionCohorts(featureName);

// Get stickiness (DAU/MAU ratio)
const stickiness = await this.getStickiness(featureName);

return {
feature,
adoptionCurve,
retentionCohorts,
stickiness,
};
}

private async getStickiness(featureName: string) {
const result = await this.prisma.$queryRaw`
SELECT
AVG(days_active_l7 / 7.0) as dau_wau_ratio,
AVG(days_active_l30 / 30.0) as dau_mau_ratio
FROM feature_adoption
WHERE feature_name = ${featureName}
AND last_used_at >= NOW() - INTERVAL '30 days'
`;

return result[0];
}

private async getRetentionCohorts(featureName: string) {
// Weekly cohort retention. Cohort size is computed once per cohort in a
// separate CTE so retention_rate is relative to the full cohort, not just
// the users retained in a given week.
const cohorts = await this.prisma.$queryRaw`
WITH cohorts AS (
SELECT
user_id,
DATE_TRUNC('week', first_used_at) as cohort_week
FROM feature_adoption
WHERE feature_name = ${featureName}
),
cohort_sizes AS (
SELECT
cohort_week,
COUNT(DISTINCT user_id) as cohort_size
FROM cohorts
GROUP BY cohort_week
),
activity AS (
SELECT
user_id,
DATE_TRUNC('week', timestamp) as activity_week
FROM analytics.events
WHERE event_name LIKE 'feature.' || ${featureName} || '.%'
GROUP BY user_id, activity_week
)
SELECT
c.cohort_week,
cs.cohort_size,
FLOOR(EXTRACT(EPOCH FROM (a.activity_week - c.cohort_week)) / 604800) as weeks_since_signup,
COUNT(DISTINCT a.user_id) as retained_users,
COUNT(DISTINCT a.user_id)::FLOAT / cs.cohort_size as retention_rate
FROM cohorts c
JOIN cohort_sizes cs ON cs.cohort_week = c.cohort_week
JOIN activity a
ON a.user_id = c.user_id
AND a.activity_week >= c.cohort_week
GROUP BY c.cohort_week, cs.cohort_size, weeks_since_signup
ORDER BY c.cohort_week, weeks_since_signup
`;

return cohorts;
}
}
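
The DAU/MAU-style ratios computed in `getStickiness` reduce to a mean of per-user active-day shares; a minimal sketch of the same arithmetic (names are illustrative, not part of the service API):

```typescript
// Stickiness ≈ average fraction of days in the window a user was active,
// mirroring AVG(days_active_l30 / 30.0) in the SQL above.
function stickiness(daysActive: number[], windowDays: number): number {
  if (daysActive.length === 0) return 0;
  const sum = daysActive.reduce((acc, d) => acc + d / windowDays, 0);
  return sum / daysActive.length;
}

console.log(stickiness([30, 15, 3], 30)); // ≈ 0.533
```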

React Hook for Feature Tracking

// libs/analytics/src/hooks/useFeatureTracking.ts
import { useCallback } from 'react';
import { useAnalytics } from './useAnalytics';

export function useFeatureTracking(featureName: string) {
const { track } = useAnalytics();

const trackFeatureUsage = useCallback((
action: string,
properties?: Record<string, any>
) => {
track(`feature.${featureName}.${action}`, {
feature_name: featureName,
action,
...properties,
});
}, [featureName, track]);

const trackFeatureEnabled = useCallback(() => {
trackFeatureUsage('enabled');
}, [trackFeatureUsage]);

const trackFeatureDisabled = useCallback(() => {
trackFeatureUsage('disabled');
}, [trackFeatureUsage]);

return {
trackFeatureUsage,
trackFeatureEnabled,
trackFeatureDisabled,
};
}

Usage Example:

// Example: AI Copilot feature tracking
import { useFeatureTracking } from '@bio-qms/analytics';

function AICopilotPanel() {
const { trackFeatureUsage } = useFeatureTracking('ai_copilot');

const handleSuggestionAccepted = (suggestion: string) => {
trackFeatureUsage('suggestion_accepted', {
suggestion_type: 'work_order_description',
suggestion_length: suggestion.length,
});

// Apply suggestion...
};

const handleSuggestionRejected = () => {
trackFeatureUsage('suggestion_rejected');
};

return (
<div>
{/* AI Copilot UI */}
</div>
);
}

J.2.2: User Journey Analysis

Journey Mapping

Define Key User Journeys:

// apps/api/src/modules/analytics/journey-definitions.ts
export const USER_JOURNEYS = {
work_order_creation: {
name: 'Work Order Creation',
steps: [
{ name: 'Dashboard Visit', event: 'page.dashboard_viewed' },
{ name: 'New WO Button Click', event: 'work_order.create_clicked' },
{ name: 'Form Opened', event: 'work_order.form_opened' },
{ name: 'Form Filled', event: 'work_order.form_filled' },
{ name: 'WO Created', event: 'work_order.created' },
],
maxDuration: 600, // 10 minutes
},

work_order_approval: {
name: 'Work Order Approval',
steps: [
{ name: 'Email Notification', event: 'notification.email_sent' },
{ name: 'Email Opened', event: 'notification.email_opened' },
{ name: 'WO Viewed', event: 'work_order.viewed' },
{ name: 'Review Started', event: 'work_order.review_started' },
{ name: 'Approved/Rejected', event: 'work_order.status_changed' },
],
maxDuration: 86400, // 24 hours
},

feature_onboarding_ai_copilot: {
name: 'AI Copilot Onboarding',
steps: [
{ name: 'Feature Announcement', event: 'feature.ai_copilot.announcement_shown' },
{ name: 'Tutorial Started', event: 'feature.ai_copilot.tutorial_started' },
{ name: 'First Suggestion', event: 'feature.ai_copilot.first_suggestion_shown' },
{ name: 'First Acceptance', event: 'feature.ai_copilot.suggestion_accepted' },
{ name: 'Onboarding Complete', event: 'feature.ai_copilot.onboarding_completed' },
],
maxDuration: 1800, // 30 minutes
},
};
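
Given these definitions, whether a session completed a journey can be checked by scanning its events in order; a sketch under the assumption that timestamps are epoch seconds and events are session-scoped:

```typescript
interface JourneyStep { name: string; event: string; }
interface Journey { name: string; steps: JourneyStep[]; maxDuration: number; }
interface SessionEvent { event_name: string; timestamp: number; } // epoch seconds

// True when every step occurs in order and the whole journey fits inside
// maxDuration, measured from the first matched step.
function completesJourney(journey: Journey, events: SessionEvent[]): boolean {
  const sorted = [...events].sort((a, b) => a.timestamp - b.timestamp);
  let stepIdx = 0;
  let startTs: number | null = null;
  for (const e of sorted) {
    if (e.event_name !== journey.steps[stepIdx].event) continue;
    if (startTs === null) startTs = e.timestamp;
    if (e.timestamp - startTs > journey.maxDuration) return false;
    stepIdx++;
    if (stepIdx === journey.steps.length) return true;
  }
  return false;
}
```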

Funnel Analysis

BigQuery Funnel Query:

-- Work order creation funnel
WITH funnel_events AS (
SELECT
session_id,
user_id,
tenant_id,
timestamp,
event_name,
ROW_NUMBER() OVER (PARTITION BY session_id ORDER BY timestamp) as step_order
FROM `bio-qms-prod.analytics.events`
WHERE DATE(timestamp) >= CURRENT_DATE() - 30
AND event_name IN (
'page.dashboard_viewed',
'work_order.create_clicked',
'work_order.form_opened',
'work_order.form_filled',
'work_order.created'
)
),
funnel_sessions AS (
SELECT
session_id,
user_id,
tenant_id,
MAX(CASE WHEN event_name = 'page.dashboard_viewed' THEN 1 ELSE 0 END) as step_1,
MAX(CASE WHEN event_name = 'work_order.create_clicked' THEN 1 ELSE 0 END) as step_2,
MAX(CASE WHEN event_name = 'work_order.form_opened' THEN 1 ELSE 0 END) as step_3,
MAX(CASE WHEN event_name = 'work_order.form_filled' THEN 1 ELSE 0 END) as step_4,
MAX(CASE WHEN event_name = 'work_order.created' THEN 1 ELSE 0 END) as step_5
FROM funnel_events
GROUP BY session_id, user_id, tenant_id
)
SELECT
'Dashboard Visit' as step_name,
1 as step_number,
COUNT(*) as users,
1.0 as conversion_rate,
NULL as drop_off_rate
FROM funnel_sessions
WHERE step_1 = 1

UNION ALL

SELECT
'New WO Button Click' as step_name,
2 as step_number,
COUNT(*) as users,
COUNT(*) / (SELECT COUNT(*) FROM funnel_sessions WHERE step_1 = 1) as conversion_rate,
1 - (COUNT(*) / (SELECT COUNT(*) FROM funnel_sessions WHERE step_1 = 1)) as drop_off_rate
FROM funnel_sessions
WHERE step_2 = 1

UNION ALL

SELECT
'Form Opened' as step_name,
3 as step_number,
COUNT(*) as users,
COUNT(*) / (SELECT COUNT(*) FROM funnel_sessions WHERE step_1 = 1) as conversion_rate,
1 - (COUNT(*) / (SELECT COUNT(*) FROM funnel_sessions WHERE step_2 = 1)) as drop_off_rate
FROM funnel_sessions
WHERE step_3 = 1

UNION ALL

SELECT
'Form Filled' as step_name,
4 as step_number,
COUNT(*) as users,
COUNT(*) / (SELECT COUNT(*) FROM funnel_sessions WHERE step_1 = 1) as conversion_rate,
1 - (COUNT(*) / (SELECT COUNT(*) FROM funnel_sessions WHERE step_3 = 1)) as drop_off_rate
FROM funnel_sessions
WHERE step_4 = 1

UNION ALL

SELECT
'WO Created' as step_name,
5 as step_number,
COUNT(*) as users,
COUNT(*) / (SELECT COUNT(*) FROM funnel_sessions WHERE step_1 = 1) as conversion_rate,
1 - (COUNT(*) / (SELECT COUNT(*) FROM funnel_sessions WHERE step_4 = 1)) as drop_off_rate
FROM funnel_sessions
WHERE step_5 = 1

ORDER BY step_number;
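
The per-step arithmetic in this query can be mirrored client-side: conversion is measured against step 1, drop-off against the previous step. A minimal sketch (the `FunnelRow` shape is an assumption, not the API's response type):

```typescript
interface FunnelRow { stepName: string; users: number; }

// Mirrors the SQL: conversionRate = users / step-1 users,
// dropOffRate = 1 - users / previous-step users (null for the first step).
function funnelMetrics(rows: FunnelRow[]) {
  return rows.map((row, i) => ({
    ...row,
    conversionRate: rows[0].users > 0 ? row.users / rows[0].users : 0,
    dropOffRate:
      i === 0 ? null : rows[i - 1].users > 0 ? 1 - row.users / rows[i - 1].users : null,
  }));
}

const metrics = funnelMetrics([
  { stepName: 'Dashboard Visit', users: 1000 },
  { stepName: 'New WO Button Click', users: 400 },
  { stepName: 'WO Created', users: 300 },
]);
// conversionRate: 1, 0.4, 0.3; dropOffRate: null, 0.6, 0.25
```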

Funnel Analysis Service:

// apps/api/src/modules/analytics/funnel.service.ts
import { Injectable } from '@nestjs/common';
import { BigQuery } from '@google-cloud/bigquery';
import { USER_JOURNEYS } from './journey-definitions';

interface FunnelStep {
stepName: string;
stepNumber: number;
users: number;
conversionRate: number;
dropOffRate: number;
}

@Injectable()
export class FunnelService {
private readonly bigquery: BigQuery;

constructor() {
this.bigquery = new BigQuery({ projectId: 'bio-qms-prod' });
}

async analyzeFunnel(
journeyName: keyof typeof USER_JOURNEYS,
tenantId?: string,
startDate?: string,
endDate?: string,
): Promise<FunnelStep[]> {
const journey = USER_JOURNEYS[journeyName];

if (!journey) {
throw new Error(`Unknown journey: ${journeyName}`);
}

const eventNames = journey.steps.map(s => s.event);
const stepCases = journey.steps.map((step, index) =>
`MAX(CASE WHEN event_name = '${step.event}' THEN 1 ELSE 0 END) as step_${index + 1}`
).join(',\n ');

const stepSelects = journey.steps.map((step, index) => {
const stepNum = index + 1;
const prevStep = index > 0 ? index : index + 1;

return `
SELECT
'${step.name}' as step_name,
${stepNum} as step_number,
COUNT(*) as users,
COUNT(*) / (SELECT COUNT(*) FROM funnel_sessions WHERE step_1 = 1) as conversion_rate,
${index > 0 ? `1 - (COUNT(*) / (SELECT COUNT(*) FROM funnel_sessions WHERE step_${prevStep} = 1))` : 'NULL'} as drop_off_rate
FROM funnel_sessions
WHERE step_${stepNum} = 1
`;
}).join('\n UNION ALL\n');

const query = `
WITH funnel_events AS (
SELECT
session_id,
user_id,
tenant_id,
timestamp,
event_name
FROM \`bio-qms-prod.analytics.events\`
WHERE DATE(timestamp) BETWEEN @startDate AND @endDate
${tenantId ? 'AND tenant_id = @tenantId' : ''}
AND event_name IN UNNEST(@eventNames)
),
funnel_sessions AS (
SELECT
session_id,
user_id,
tenant_id,
${stepCases}
FROM funnel_events
GROUP BY session_id, user_id, tenant_id
)
${stepSelects}
ORDER BY step_number
`;

const [rows] = await this.bigquery.query({
query,
params: {
eventNames,
// Only pass tenantId when the query references it; an undefined
// parameter value cannot be type-inferred by the BigQuery client.
...(tenantId ? { tenantId } : {}),
startDate: startDate || '2026-01-01',
endDate: endDate || '2026-12-31',
},
});

return rows as FunnelStep[];
}

async getDropOffPoints(
journeyName: keyof typeof USER_JOURNEYS,
tenantId?: string,
): Promise<Array<{ stepName: string; dropOffRate: number }>> {
const funnel = await this.analyzeFunnel(journeyName, tenantId);

return funnel
.filter(step => step.dropOffRate !== null)
.sort((a, b) => b.dropOffRate - a.dropOffRate)
.slice(0, 3); // Top 3 drop-off points
}
}


J.2.3: Cohort Analysis Engine

Cohort Definition

Cohorts group users by a shared characteristic (signup date, feature adoption, subscription tier) to analyze behavior patterns over time.

Cohort Types:

| Cohort Type | Definition | Use Case |
| --- | --- | --- |
| Acquisition Cohort | Users who signed up in the same week/month | Retention analysis |
| Feature Cohort | Users who adopted a feature in the same period | Feature stickiness |
| Behavioral Cohort | Users who performed a specific action | Conversion optimization |
| Tier Cohort | Users on the same subscription tier | Upgrade/downgrade patterns |
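
For acquisition cohorts, the grouping key is simply the signup date truncated to its period; a sketch of weekly truncation to the Monday of the ISO week (the helper name is illustrative):

```typescript
// Sketch: assign a user to a weekly acquisition cohort by truncating the
// signup date to the Monday of its week (UTC).
function weekCohort(signupDate: Date): string {
  const d = new Date(Date.UTC(
    signupDate.getUTCFullYear(), signupDate.getUTCMonth(), signupDate.getUTCDate(),
  ));
  const day = d.getUTCDay() || 7;         // map Sunday (0) to 7
  d.setUTCDate(d.getUTCDate() - day + 1); // step back to Monday
  return d.toISOString().slice(0, 10);
}

console.log(weekCohort(new Date('2026-03-05'))); // a Thursday
// → 2026-03-02
```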

Cohort Retention Analysis

BigQuery Cohort Retention Query:

-- Weekly cohort retention for work order creators
WITH cohorts AS (
SELECT
user_id,
tenant_id,
DATE_TRUNC(DATE(MIN(timestamp)), WEEK) as cohort_week
FROM `bio-qms-prod.analytics.events`
WHERE event_name = 'work_order.created'
GROUP BY user_id, tenant_id
),
activity AS (
SELECT DISTINCT
user_id,
tenant_id,
DATE_TRUNC(DATE(timestamp), WEEK) as activity_week
FROM `bio-qms-prod.analytics.events`
WHERE event_name = 'work_order.created'
),
cohort_activity AS (
SELECT
c.cohort_week,
a.activity_week,
c.user_id,
c.tenant_id,
DATE_DIFF(a.activity_week, c.cohort_week, WEEK) as weeks_since_cohort
FROM cohorts c
INNER JOIN activity a
ON c.user_id = a.user_id
AND c.tenant_id = a.tenant_id
AND a.activity_week >= c.cohort_week
)
SELECT
cohort_week,
COUNT(DISTINCT CASE WHEN weeks_since_cohort = 0 THEN user_id END) as week_0,
COUNT(DISTINCT CASE WHEN weeks_since_cohort = 1 THEN user_id END) as week_1,
COUNT(DISTINCT CASE WHEN weeks_since_cohort = 2 THEN user_id END) as week_2,
COUNT(DISTINCT CASE WHEN weeks_since_cohort = 3 THEN user_id END) as week_3,
COUNT(DISTINCT CASE WHEN weeks_since_cohort = 4 THEN user_id END) as week_4,

-- Retention percentages
ROUND(COUNT(DISTINCT CASE WHEN weeks_since_cohort = 1 THEN user_id END) /
NULLIF(COUNT(DISTINCT CASE WHEN weeks_since_cohort = 0 THEN user_id END), 0) * 100, 1) as week_1_retention_pct,
ROUND(COUNT(DISTINCT CASE WHEN weeks_since_cohort = 2 THEN user_id END) /
NULLIF(COUNT(DISTINCT CASE WHEN weeks_since_cohort = 0 THEN user_id END), 0) * 100, 1) as week_2_retention_pct,
ROUND(COUNT(DISTINCT CASE WHEN weeks_since_cohort = 3 THEN user_id END) /
NULLIF(COUNT(DISTINCT CASE WHEN weeks_since_cohort = 0 THEN user_id END), 0) * 100, 1) as week_3_retention_pct,
ROUND(COUNT(DISTINCT CASE WHEN weeks_since_cohort = 4 THEN user_id END) /
NULLIF(COUNT(DISTINCT CASE WHEN weeks_since_cohort = 0 THEN user_id END), 0) * 100, 1) as week_4_retention_pct
FROM cohort_activity
GROUP BY cohort_week
ORDER BY cohort_week DESC
LIMIT 12;
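
Downstream consumers typically pivot rows like these into a retention matrix keyed by cohort; a sketch assuming hypothetical field names (`cohortWeek`, `weeksSince`, `users`):

```typescript
interface RetentionRow { cohortWeek: string; weeksSince: number; users: number; }

// Pivot long-form retention rows into { cohortWeek: [pct_week0, pct_week1, ...] }
// with percentages relative to week 0, rounded to one decimal place.
function retentionMatrix(rows: RetentionRow[]): Record<string, number[]> {
  const matrix: Record<string, number[]> = {};
  for (const r of rows) {
    (matrix[r.cohortWeek] ??= [])[r.weeksSince] = r.users;
  }
  for (const [week, counts] of Object.entries(matrix)) {
    const base = counts[0] || 0;
    matrix[week] = counts.map(c => (base ? Math.round(((c ?? 0) / base) * 1000) / 10 : 0));
  }
  return matrix;
}
```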

Cohort Service:

// apps/api/src/modules/analytics/cohort.service.ts
import { Injectable } from '@nestjs/common';
import { BigQuery } from '@google-cloud/bigquery';

export interface CohortRetention {
cohortPeriod: string;
cohortSize: number;
retention: Record<string, number>; // { "week_1": 75.3, "week_2": 62.1, ... }
}

@Injectable()
export class CohortService {
private readonly bigquery: BigQuery;

constructor() {
this.bigquery = new BigQuery({ projectId: 'bio-qms-prod' });
}

async getFeatureCohortRetention(
featureName: string,
tenantId?: string,
periodType: 'week' | 'month' = 'week',
): Promise<CohortRetention[]> {
const periodFunc = periodType === 'week' ? 'WEEK' : 'MONTH';

const query = `
WITH cohorts AS (
SELECT
user_id,
tenant_id,
DATE_TRUNC(DATE(MIN(timestamp)), ${periodFunc}) as cohort_period
FROM \`bio-qms-prod.analytics.events\`
WHERE event_name LIKE 'feature.${featureName}.%'
${tenantId ? 'AND tenant_id = @tenantId' : ''}
GROUP BY user_id, tenant_id
),
activity AS (
SELECT DISTINCT
user_id,
tenant_id,
DATE_TRUNC(DATE(timestamp), ${periodFunc}) as activity_period
FROM \`bio-qms-prod.analytics.events\`
WHERE event_name LIKE 'feature.${featureName}.%'
${tenantId ? 'AND tenant_id = @tenantId' : ''}
),
cohort_activity AS (
SELECT
c.cohort_period,
a.activity_period,
c.user_id,
DATE_DIFF(a.activity_period, c.cohort_period, ${periodFunc}) as periods_since_cohort
FROM cohorts c
INNER JOIN activity a
ON c.user_id = a.user_id
AND c.tenant_id = a.tenant_id
AND a.activity_period >= c.cohort_period
)
SELECT
cohort_period,
COUNT(DISTINCT CASE WHEN periods_since_cohort = 0 THEN user_id END) as cohort_size,

-- Dynamic retention by period
STRUCT(
COUNT(DISTINCT CASE WHEN periods_since_cohort = 1 THEN user_id END) /
NULLIF(COUNT(DISTINCT CASE WHEN periods_since_cohort = 0 THEN user_id END), 0) * 100 as period_1,
COUNT(DISTINCT CASE WHEN periods_since_cohort = 2 THEN user_id END) /
NULLIF(COUNT(DISTINCT CASE WHEN periods_since_cohort = 0 THEN user_id END), 0) * 100 as period_2,
COUNT(DISTINCT CASE WHEN periods_since_cohort = 3 THEN user_id END) /
NULLIF(COUNT(DISTINCT CASE WHEN periods_since_cohort = 0 THEN user_id END), 0) * 100 as period_3,
COUNT(DISTINCT CASE WHEN periods_since_cohort = 4 THEN user_id END) /
NULLIF(COUNT(DISTINCT CASE WHEN periods_since_cohort = 0 THEN user_id END), 0) * 100 as period_4,
COUNT(DISTINCT CASE WHEN periods_since_cohort = 8 THEN user_id END) /
NULLIF(COUNT(DISTINCT CASE WHEN periods_since_cohort = 0 THEN user_id END), 0) * 100 as period_8,
COUNT(DISTINCT CASE WHEN periods_since_cohort = 12 THEN user_id END) /
NULLIF(COUNT(DISTINCT CASE WHEN periods_since_cohort = 0 THEN user_id END), 0) * 100 as period_12
) as retention

FROM cohort_activity
GROUP BY cohort_period
ORDER BY cohort_period DESC
LIMIT 12
`;

const [rows] = await this.bigquery.query({
query,
// Only pass tenantId when present; an undefined parameter value
// cannot be type-inferred by the BigQuery client.
params: tenantId ? { tenantId } : {},
});

return rows.map(row => ({
cohortPeriod: row.cohort_period,
cohortSize: row.cohort_size,
retention: {
period_1: row.retention.period_1,
period_2: row.retention.period_2,
period_3: row.retention.period_3,
period_4: row.retention.period_4,
period_8: row.retention.period_8,
period_12: row.retention.period_12,
},
}));
}

async getBehavioralCohorts(
eventName: string,
tenantId?: string,
): Promise<Array<{
cohort: string;
userCount: number;
avgEventsPerUser: number;
retentionDay7: number;
retentionDay30: number;
}>> {
const query = `
WITH user_behavior AS (
SELECT
user_id,
tenant_id,
MIN(timestamp) as first_event,
COUNT(*) as event_count
FROM \`bio-qms-prod.analytics.events\`
WHERE event_name = @eventName
${tenantId ? 'AND tenant_id = @tenantId' : ''}
GROUP BY user_id, tenant_id
),
cohorts AS (
SELECT
user_id,
CASE
WHEN event_count = 1 THEN 'One-time users'
WHEN event_count BETWEEN 2 AND 5 THEN 'Casual users'
WHEN event_count BETWEEN 6 AND 20 THEN 'Regular users'
ELSE 'Power users'
END as cohort
FROM user_behavior
),
retention AS (
SELECT
c.cohort,
c.user_id,
MAX(CASE WHEN DATE_DIFF(CURRENT_DATE(), DATE(ub.first_event), DAY) >= 7
AND EXISTS (
SELECT 1 FROM \`bio-qms-prod.analytics.events\` e
WHERE e.user_id = c.user_id
AND e.event_name = @eventName
AND DATE(e.timestamp) BETWEEN DATE_ADD(DATE(ub.first_event), INTERVAL 7 DAY) AND DATE_ADD(DATE(ub.first_event), INTERVAL 14 DAY)
)
THEN 1 ELSE 0 END) as retained_day_7,
MAX(CASE WHEN DATE_DIFF(CURRENT_DATE(), DATE(ub.first_event), DAY) >= 30
AND EXISTS (
SELECT 1 FROM \`bio-qms-prod.analytics.events\` e
WHERE e.user_id = c.user_id
AND e.event_name = @eventName
AND DATE(e.timestamp) BETWEEN DATE_ADD(DATE(ub.first_event), INTERVAL 30 DAY) AND DATE_ADD(DATE(ub.first_event), INTERVAL 37 DAY)
)
THEN 1 ELSE 0 END) as retained_day_30
FROM cohorts c
JOIN user_behavior ub ON c.user_id = ub.user_id
GROUP BY c.cohort, c.user_id
)
SELECT
c.cohort,
COUNT(DISTINCT c.user_id) as user_count,
AVG(ub.event_count) as avg_events_per_user,
AVG(r.retained_day_7) * 100 as retention_day_7,
AVG(r.retained_day_30) * 100 as retention_day_30
FROM cohorts c
JOIN user_behavior ub ON c.user_id = ub.user_id
LEFT JOIN retention r ON c.cohort = r.cohort AND c.user_id = r.user_id
GROUP BY c.cohort
ORDER BY user_count DESC
`;

const [rows] = await this.bigquery.query({
query,
params: tenantId ? { eventName, tenantId } : { eventName },
});

// BigQuery returns snake_case column names; map to the promised camelCase shape
return rows.map(row => ({
cohort: row.cohort,
userCount: row.user_count,
avgEventsPerUser: row.avg_events_per_user,
retentionDay7: row.retention_day_7,
retentionDay30: row.retention_day_30,
}));
}

async compareCohortsAcrossTenants(featureName: string): Promise<any> {
const query = `
WITH first_events AS (
SELECT
tenant_id,
user_id,
DATE_TRUNC(DATE(MIN(timestamp)), WEEK) as cohort_week
FROM \`bio-qms-prod.analytics.events\`
WHERE event_name LIKE 'feature.${featureName}.%'
GROUP BY tenant_id, user_id
),
tenant_cohorts AS (
SELECT
tenant_id,
cohort_week,
COUNT(DISTINCT user_id) as cohort_size
FROM first_events
GROUP BY tenant_id, cohort_week
),
tenant_retention AS (
SELECT
tc.tenant_id,
tc.cohort_week,
tc.cohort_size,
COUNT(DISTINCT CASE WHEN DATE_DIFF(DATE(e.timestamp), tc.cohort_week, WEEK) = 4 THEN e.user_id END) / tc.cohort_size * 100 as week_4_retention
FROM tenant_cohorts tc
LEFT JOIN \`bio-qms-prod.analytics.events\` e
ON tc.tenant_id = e.tenant_id
AND e.event_name LIKE 'feature.${featureName}.%'
AND DATE(e.timestamp) >= tc.cohort_week
GROUP BY tc.tenant_id, tc.cohort_week, tc.cohort_size
)
SELECT
tenant_id,
AVG(cohort_size) as avg_cohort_size,
AVG(week_4_retention) as avg_week_4_retention,
MIN(week_4_retention) as min_retention,
MAX(week_4_retention) as max_retention
FROM tenant_retention
WHERE cohort_week >= DATE_SUB(CURRENT_DATE(), INTERVAL 12 WEEK)
GROUP BY tenant_id
ORDER BY avg_week_4_retention DESC
`;

const [rows] = await this.bigquery.query({ query });
return rows;
}
}
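The per-cohort rows returned by `getRetentionCohorts` can be collapsed into a single blended retention curve by weighting each cohort by its size. A minimal sketch — the `CohortRow` shape mirrors the method's return type, while `blendRetention` is an illustrative helper, not part of the service:

```typescript
interface CohortRow {
  cohortPeriod: string;
  cohortSize: number;
  retention: Record<string, number | null>; // e.g. { period_1: 62.5, ... }
}

// Weighted-average retention across cohorts: each cohort contributes in
// proportion to its size, and cohorts too young to have a value for a
// given period (null/missing) are excluded from that period's average.
function blendRetention(rows: CohortRow[], periods: number[]): Record<string, number | null> {
  const blended: Record<string, number | null> = {};
  for (const p of periods) {
    const key = `period_${p}`;
    let weighted = 0;
    let totalSize = 0;
    for (const row of rows) {
      const value = row.retention[key];
      if (value === null || value === undefined) continue;
      weighted += value * row.cohortSize;
      totalSize += row.cohortSize;
    }
    blended[key] = totalSize > 0 ? weighted / totalSize : null;
  }
  return blended;
}
```

Weighting by size prevents a tiny pilot cohort from skewing the blended curve as much as a plain average of percentages would.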

J.2.4: Feature Impact Measurement

A/B Test Correlation Analysis

When a feature is released via A/B test, measure its impact on key business metrics:

// apps/api/src/modules/analytics/feature-impact.service.ts
import { Injectable } from '@nestjs/common';
import { BigQuery } from '@google-cloud/bigquery';

export interface FeatureImpact {
featureName: string;
variant: 'control' | 'treatment';

// Engagement metrics
avgSessionDuration: number;
avgEventsPerSession: number;
bounceRate: number;

// Business metrics
workOrdersCreated: number;
workOrdersCompleted: number;
avgCycleTime: number;

// User satisfaction
npsScore: number | null;
feedbackSentimentScore: number | null;
}

@Injectable()
export class FeatureImpactService {
private readonly bigquery: BigQuery;

constructor() {
this.bigquery = new BigQuery({ projectId: 'bio-qms-prod' });
}

async measureFeatureImpact(
featureName: string,
variantA: string,
variantB: string,
startDate: string,
endDate: string,
): Promise<{ control: FeatureImpact; treatment: FeatureImpact; uplift: Record<string, number> }> {
const controlImpact = await this.getVariantMetrics(featureName, variantA, startDate, endDate);
const treatmentImpact = await this.getVariantMetrics(featureName, variantB, startDate, endDate);

// Guard against division by zero when a control metric is 0
const pctChange = (treatment: number, control: number) =>
control ? ((treatment - control) / control) * 100 : 0;

const uplift = {
sessionDuration: pctChange(treatmentImpact.avgSessionDuration, controlImpact.avgSessionDuration),
eventsPerSession: pctChange(treatmentImpact.avgEventsPerSession, controlImpact.avgEventsPerSession),
workOrdersCreated: pctChange(treatmentImpact.workOrdersCreated, controlImpact.workOrdersCreated),
cycleTime: pctChange(treatmentImpact.avgCycleTime, controlImpact.avgCycleTime),
};

return {
control: controlImpact,
treatment: treatmentImpact,
uplift,
};
}

private async getVariantMetrics(
featureName: string,
variant: string,
startDate: string,
endDate: string,
): Promise<FeatureImpact> {
const query = `
WITH variant_users AS (
SELECT DISTINCT user_id
FROM \`bio-qms-prod.analytics.events\`
WHERE event_name = 'experiment.assigned'
AND JSON_EXTRACT_SCALAR(properties, '$.experiment_name') = @featureName
AND JSON_EXTRACT_SCALAR(properties, '$.variant') = @variant
AND DATE(timestamp) BETWEEN @startDate AND @endDate
),
session_metrics AS (
SELECT
e.session_id,
TIMESTAMP_DIFF(MAX(e.timestamp), MIN(e.timestamp), SECOND) as session_duration,
COUNT(*) as events_per_session
FROM \`bio-qms-prod.analytics.events\` e
WHERE e.user_id IN (SELECT user_id FROM variant_users)
AND DATE(e.timestamp) BETWEEN @startDate AND @endDate
GROUP BY e.session_id
),
work_order_metrics AS (
SELECT
COUNTIF(event_name = 'work_order.created') as wo_created,
COUNTIF(event_name = 'work_order.completed') as wo_completed,
AVG(CAST(JSON_EXTRACT_SCALAR(properties, '$.cycle_time_seconds') AS FLOAT64)) as avg_cycle_time
FROM \`bio-qms-prod.analytics.events\`
WHERE user_id IN (SELECT user_id FROM variant_users)
AND event_name IN ('work_order.created', 'work_order.completed')
AND DATE(timestamp) BETWEEN @startDate AND @endDate
),
feedback_metrics AS (
SELECT
AVG(CAST(JSON_EXTRACT_SCALAR(properties, '$.nps_score') AS INT64)) as nps_score,
AVG(CAST(JSON_EXTRACT_SCALAR(properties, '$.sentiment_score') AS FLOAT64)) as sentiment_score
FROM \`bio-qms-prod.analytics.events\`
WHERE user_id IN (SELECT user_id FROM variant_users)
AND event_name = 'feedback.submitted'
AND DATE(timestamp) BETWEEN @startDate AND @endDate
)
SELECT
@variant as variant,
(SELECT AVG(session_duration) FROM session_metrics) as avg_session_duration,
(SELECT AVG(events_per_session) FROM session_metrics) as avg_events_per_session,
(SELECT COUNTIF(events_per_session = 1) / COUNT(*) * 100 FROM session_metrics) as bounce_rate,
(SELECT wo_created FROM work_order_metrics) as work_orders_created,
(SELECT wo_completed FROM work_order_metrics) as work_orders_completed,
(SELECT avg_cycle_time FROM work_order_metrics) as avg_cycle_time,
(SELECT nps_score FROM feedback_metrics) as nps_score,
(SELECT sentiment_score FROM feedback_metrics) as feedback_sentiment_score
`;

const [rows] = await this.bigquery.query({
query,
params: {
featureName,
variant,
startDate,
endDate,
},
});

// BigQuery returns snake_case columns; map explicitly to the FeatureImpact shape
const row = rows[0];
return {
featureName,
variant: variant as FeatureImpact['variant'],
avgSessionDuration: row.avg_session_duration,
avgEventsPerSession: row.avg_events_per_session,
bounceRate: row.bounce_rate,
workOrdersCreated: row.work_orders_created,
workOrdersCompleted: row.work_orders_completed,
avgCycleTime: row.avg_cycle_time,
npsScore: row.nps_score,
feedbackSentimentScore: row.feedback_sentiment_score,
};
}

async getFeatureROI(featureName: string): Promise<{
developmentCost: number;
monthlyActiveUsers: number;
timeValuePerUser: number;
totalValueGenerated: number;
roi: number;
}> {
// This would integrate with your project management/JIRA to get dev cost
const developmentCost = 50000; // Example: $50k development cost

const query = `
WITH user_time_saved AS (
SELECT
COUNT(DISTINCT user_id) as active_users,
AVG(CAST(JSON_EXTRACT_SCALAR(properties, '$.time_saved_seconds') AS FLOAT64)) as avg_time_saved_per_use,
COUNT(*) as total_uses
FROM \`bio-qms-prod.analytics.events\`
WHERE event_name LIKE 'feature.${featureName}.%'
AND DATE(timestamp) >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
)
SELECT
active_users,
avg_time_saved_per_use,
total_uses,
(avg_time_saved_per_use / 3600) * 50 as hourly_value, -- Assume $50/hour value
(avg_time_saved_per_use / 3600) * 50 * total_uses as total_value_generated
FROM user_time_saved
`;

const [rows] = await this.bigquery.query({ query });
const { active_users, hourly_value, total_value_generated } = rows[0];

const roi = ((total_value_generated - developmentCost) / developmentCost) * 100;

return {
developmentCost,
monthlyActiveUsers: active_users,
timeValuePerUser: hourly_value,
totalValueGenerated: total_value_generated,
roi,
};
}
}
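Before acting on an uplift figure, the difference should survive a significance check. The platform's stated approach is SciPy plus Bayesian methods, but for rate-type metrics (e.g. work orders completed per assigned user) the frequentist check can be sketched directly in TypeScript. Both `twoProportionZTest` and the normal-CDF approximation below are illustrative, not existing platform code:

```typescript
// Two-proportion z-test: is the gap between treatment and control conversion
// rates larger than sampling noise would explain? Returns the two-sided p-value.
function twoProportionZTest(
  successesA: number, totalA: number,
  successesB: number, totalB: number,
): { z: number; pValue: number } {
  const pA = successesA / totalA;
  const pB = successesB / totalB;
  const pooled = (successesA + successesB) / (totalA + totalB);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / totalA + 1 / totalB));
  const z = (pB - pA) / se;
  return { z, pValue: 2 * (1 - standardNormalCdf(Math.abs(z))) };
}

// Abramowitz–Stegun polynomial approximation of the standard normal CDF
// (accurate to ~1e-7, sufficient for dashboard-grade p-values).
function standardNormalCdf(x: number): number {
  const t = 1 / (1 + 0.2316419 * Math.abs(x));
  const d = 0.3989423 * Math.exp(-x * x / 2);
  const tail = d * t * (0.3193815 + t * (-0.3565638 + t * (1.781478 + t * (-1.821256 + t * 1.330274))));
  return x >= 0 ? 1 - tail : tail;
}
```

For GxP-impacting experiments the p-value would feed the regulatory review gate described in the summary rather than auto-rolling out a variant.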

J.2.5: Adoption Dashboard

React Dashboard Component

// apps/frontend/src/pages/analytics/FeatureAdoptionDashboard.tsx
import React, { useEffect, useState } from 'react';
import { BarChart, Bar, LineChart, Line, XAxis, YAxis, CartesianGrid, Tooltip, Legend, ResponsiveContainer } from 'recharts';
import { api } from '@/lib/api';

interface Feature {
name: string;
displayName: string;
adoptionRate: number;
activeUsers30d: number;
targetAdoptionPct: number;
status: string;
}

interface AdoptionCurveData {
date: string;
newAdopters: number;
cumulativeAdopters: number;
}

export function FeatureAdoptionDashboard() {
const [features, setFeatures] = useState<Feature[]>([]);
const [selectedFeature, setSelectedFeature] = useState<string | null>(null);
const [adoptionCurve, setAdoptionCurve] = useState<AdoptionCurveData[]>([]);
const [cohortRetention, setCohortRetention] = useState<any[]>([]);

useEffect(() => {
loadFeatures();
}, []);

useEffect(() => {
if (selectedFeature) {
loadFeatureMetrics(selectedFeature);
}
}, [selectedFeature]);

const loadFeatures = async () => {
const response = await api.get('/api/v1/analytics/features');
setFeatures(response.data);
};

const loadFeatureMetrics = async (featureName: string) => {
const response = await api.get(`/api/v1/analytics/features/${featureName}/metrics`);
setAdoptionCurve(response.data.adoptionCurve);
setCohortRetention(response.data.retentionCohorts);
};

return (
<div className="p-6">
<h1 className="text-3xl font-bold mb-6">Feature Adoption Dashboard</h1>

{/* Feature Overview Cards */}
<div className="grid grid-cols-1 md:grid-cols-3 gap-4 mb-8">
{features.map(feature => (
<div
key={feature.name}
className={`p-4 border rounded-lg cursor-pointer hover:shadow-lg transition-shadow ${
selectedFeature === feature.name ? 'border-blue-500 bg-blue-50' : 'border-gray-200'
}`}
onClick={() => setSelectedFeature(feature.name)}
>
<div className="flex justify-between items-start mb-2">
<h3 className="font-semibold text-lg">{feature.displayName}</h3>
<span className={`px-2 py-1 text-xs rounded ${
feature.status === 'ga' ? 'bg-green-100 text-green-800' :
feature.status === 'beta' ? 'bg-yellow-100 text-yellow-800' :
'bg-gray-100 text-gray-800'
}`}>
{feature.status.toUpperCase()}
</span>
</div>

<div className="space-y-2">
<div>
<div className="flex justify-between text-sm mb-1">
<span className="text-gray-600">Adoption Rate</span>
<span className="font-semibold">{(feature.adoptionRate * 100).toFixed(1)}%</span>
</div>
<div className="w-full bg-gray-200 rounded-full h-2">
<div
className={`h-2 rounded-full ${
feature.adoptionRate >= feature.targetAdoptionPct ? 'bg-green-500' : 'bg-blue-500'
}`}
style={{ width: `${Math.min(feature.adoptionRate * 100, 100)}%` }}
/>
</div>
{feature.targetAdoptionPct > 0 && (
<div className="text-xs text-gray-500 mt-1">
Target: {(feature.targetAdoptionPct * 100).toFixed(0)}%
</div>
)}
</div>

<div className="flex justify-between text-sm">
<span className="text-gray-600">Active Users (30d)</span>
<span className="font-semibold">{feature.activeUsers30d}</span>
</div>
</div>
</div>
))}
</div>

{selectedFeature && (
<>
{/* Adoption Curve */}
<div className="bg-white p-6 rounded-lg shadow mb-8">
<h2 className="text-xl font-semibold mb-4">Adoption Curve</h2>
<ResponsiveContainer width="100%" height={300}>
<LineChart data={adoptionCurve}>
<CartesianGrid strokeDasharray="3 3" />
<XAxis dataKey="date" />
<YAxis yAxisId="left" />
<YAxis yAxisId="right" orientation="right" />
<Tooltip />
<Legend />
<Line yAxisId="left" type="monotone" dataKey="newAdopters" stroke="#8884d8" name="New Adopters" />
<Line yAxisId="right" type="monotone" dataKey="cumulativeAdopters" stroke="#82ca9d" name="Cumulative Adopters" />
</LineChart>
</ResponsiveContainer>
</div>

{/* Cohort Retention Heatmap */}
<div className="bg-white p-6 rounded-lg shadow">
<h2 className="text-xl font-semibold mb-4">Cohort Retention</h2>
<div className="overflow-x-auto">
<table className="min-w-full">
<thead>
<tr className="bg-gray-100">
<th className="px-4 py-2 text-left">Cohort</th>
<th className="px-4 py-2 text-right">Size</th>
<th className="px-4 py-2 text-right">Week 1</th>
<th className="px-4 py-2 text-right">Week 2</th>
<th className="px-4 py-2 text-right">Week 3</th>
<th className="px-4 py-2 text-right">Week 4</th>
<th className="px-4 py-2 text-right">Week 8</th>
<th className="px-4 py-2 text-right">Week 12</th>
</tr>
</thead>
<tbody>
{cohortRetention.map((cohort, idx) => (
<tr key={idx} className="border-b">
<td className="px-4 py-2">{cohort.cohortPeriod}</td>
<td className="px-4 py-2 text-right font-semibold">{cohort.cohortSize}</td>
{[1, 2, 3, 4, 8, 12].map(week => {
const retention = cohort.retention[`period_${week}`];
const color = retention >= 75 ? 'bg-green-100' :
retention >= 50 ? 'bg-yellow-100' :
retention >= 25 ? 'bg-orange-100' :
'bg-red-100';
return (
<td key={week} className={`px-4 py-2 text-right ${color}`}>
{retention ? `${retention.toFixed(1)}%` : '-'}
</td>
);
})}
</tr>
))}
</tbody>
</table>
</div>
</div>
</>
)}
</div>
);
}

J.3: Work Order Analytics Dashboard

J.3.1: Work Order Performance Metrics

Core WO Metrics

| Metric | Definition | Target | Calculation |
| --- | --- | --- | --- |
| Cycle Time | Time from DRAFT to COMPLETED | < 7 days | COMPLETED_AT - CREATED_AT |
| Throughput | WOs completed per week | 50+ | COUNT(*) per week |
| SLA Compliance | % of WOs completed within SLA | > 95% | COMPLETED_ON_TIME / TOTAL * 100 |
| First-Time Approval Rate | % approved without rejection | > 90% | NEVER_REJECTED / TOTAL * 100 |
| Avg Review Time | Time in PENDING_REVIEW state | < 4 hours | APPROVED_AT - REVIEW_STARTED_AT |
| Rejection Rate | % of WOs rejected at least once | < 5% | REJECTED_COUNT / TOTAL * 100 |
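The calculations above can be sketched as a pure function over completed work orders. `WorkOrderRecord` is a simplified, hypothetical shape — its field names do not necessarily match the platform's Prisma schema:

```typescript
// Hypothetical, simplified work-order record for illustration only.
interface WorkOrderRecord {
  createdAt: Date;
  completedAt: Date | null;
  slaDeadline: Date;
  rejectionCount: number;
}

// Computes three of the table's metrics over completed work orders:
// average cycle time (days), SLA compliance %, and first-time approval %.
function computeWoMetrics(orders: WorkOrderRecord[]) {
  const completed = orders.filter(o => o.completedAt !== null);
  const cycleDays = completed.map(
    o => (o.completedAt!.getTime() - o.createdAt.getTime()) / 86_400_000,
  );
  const avgCycleDays = cycleDays.reduce((a, b) => a + b, 0) / (cycleDays.length || 1);
  const onTime = completed.filter(o => o.completedAt! <= o.slaDeadline).length;
  const neverRejected = completed.filter(o => o.rejectionCount === 0).length;
  return {
    avgCycleDays,
    slaCompliancePct: completed.length ? (onTime / completed.length) * 100 : 0,
    firstTimeApprovalPct: completed.length ? (neverRejected / completed.length) * 100 : 0,
  };
}
```

In production these aggregates come from the BigQuery views below; the function is useful mainly as an executable specification of each formula.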

BigQuery Performance Views

-- Work order performance metrics (daily aggregates)
-- Note: built as a partitioned table refreshed by a scheduled query; BigQuery
-- materialized views do not support COUNT(DISTINCT), APPROX_QUANTILES, or
-- non-deterministic functions such as CURRENT_DATE()
CREATE OR REPLACE TABLE `bio-qms-prod.analytics.wo_performance_daily`
PARTITION BY date
AS
SELECT
DATE(timestamp) as date,
tenant_id,
JSON_EXTRACT_SCALAR(properties, '$.department') as department,

-- Volume metrics
COUNT(DISTINCT JSON_EXTRACT_SCALAR(properties, '$.work_order_id')) as wo_count,
COUNTIF(JSON_EXTRACT_SCALAR(properties, '$.new_status') = 'COMPLETED') as wo_completed,
COUNTIF(JSON_EXTRACT_SCALAR(properties, '$.new_status') = 'REJECTED') as wo_rejected,

-- Cycle time metrics (seconds)
AVG(CAST(JSON_EXTRACT_SCALAR(properties, '$.cycle_time_seconds') AS FLOAT64)) as avg_cycle_time,
APPROX_QUANTILES(CAST(JSON_EXTRACT_SCALAR(properties, '$.cycle_time_seconds') AS FLOAT64), 100)[OFFSET(50)] as median_cycle_time,
APPROX_QUANTILES(CAST(JSON_EXTRACT_SCALAR(properties, '$.cycle_time_seconds') AS FLOAT64), 100)[OFFSET(95)] as p95_cycle_time,

-- SLA metrics
COUNTIF(CAST(JSON_EXTRACT_SCALAR(properties, '$.is_sla_compliant') AS BOOL)) as sla_compliant_count,
COUNTIF(CAST(JSON_EXTRACT_SCALAR(properties, '$.is_sla_compliant') AS BOOL)) /
NULLIF(COUNTIF(JSON_EXTRACT_SCALAR(properties, '$.new_status') = 'COMPLETED'), 0) * 100 as sla_compliance_pct,

-- Regulatory breakdown
COUNTIF(CAST(JSON_EXTRACT_SCALAR(properties, '$.is_regulatory') AS BOOL)) as regulatory_wo_count,
COUNTIF(NOT CAST(JSON_EXTRACT_SCALAR(properties, '$.is_regulatory') AS BOOL)) as non_regulatory_wo_count

FROM `bio-qms-prod.analytics.events`
WHERE event_name = 'work_order.status_changed'
AND DATE(timestamp) >= DATE_SUB(CURRENT_DATE(), INTERVAL 90 DAY)
GROUP BY date, tenant_id, department;

-- Work order state transition times
CREATE OR REPLACE VIEW `bio-qms-prod.analytics.wo_state_durations` AS
WITH state_changes AS (
SELECT
JSON_EXTRACT_SCALAR(properties, '$.work_order_id') as work_order_id,
tenant_id,
JSON_EXTRACT_SCALAR(properties, '$.new_status') as state,
timestamp as entered_at,
LEAD(timestamp) OVER (PARTITION BY JSON_EXTRACT_SCALAR(properties, '$.work_order_id') ORDER BY timestamp) as exited_at
FROM `bio-qms-prod.analytics.events`
WHERE event_name = 'work_order.status_changed'
)
SELECT
work_order_id,
tenant_id,
state,
entered_at,
exited_at,
TIMESTAMP_DIFF(exited_at, entered_at, SECOND) as duration_seconds,
TIMESTAMP_DIFF(exited_at, entered_at, HOUR) as duration_hours
FROM state_changes
WHERE exited_at IS NOT NULL;

-- Average time in each state
CREATE OR REPLACE VIEW `bio-qms-prod.analytics.wo_state_avg_durations` AS
SELECT
tenant_id,
state,
COUNT(*) as transition_count,
AVG(duration_seconds) as avg_duration_seconds,
APPROX_QUANTILES(duration_seconds, 100)[OFFSET(50)] as median_duration_seconds,
APPROX_QUANTILES(duration_seconds, 100)[OFFSET(95)] as p95_duration_seconds
FROM `bio-qms-prod.analytics.wo_state_durations`
GROUP BY tenant_id, state;

NestJS Performance Service

// apps/api/src/modules/analytics/wo-performance.service.ts
import { Injectable } from '@nestjs/common';
import { BigQuery } from '@google-cloud/bigquery';

export interface WOPerformanceMetrics {
date: string;
woCount: number;
woCompleted: number;
avgCycleTime: number;
medianCycleTime: number;
slaCompliancePct: number;
rejectionRate: number;
}

export interface StateDurationMetrics {
state: string;
avgDurationSeconds: number;
medianDurationSeconds: number;
p95DurationSeconds: number;
}

@Injectable()
export class WorkOrderPerformanceService {
private readonly bigquery: BigQuery;

constructor() {
this.bigquery = new BigQuery({ projectId: 'bio-qms-prod' });
}

async getDailyMetrics(
tenantId: string,
startDate: string,
endDate: string,
department?: string,
): Promise<WOPerformanceMetrics[]> {
const query = `
SELECT
date,
wo_count,
wo_completed,
avg_cycle_time,
median_cycle_time,
sla_compliance_pct,
(wo_rejected / NULLIF(wo_count, 0)) * 100 as rejection_rate
FROM \`bio-qms-prod.analytics.wo_performance_daily\`
WHERE tenant_id = @tenantId
AND date BETWEEN @startDate AND @endDate
${department ? 'AND department = @department' : ''}
ORDER BY date DESC
`;

const [rows] = await this.bigquery.query({
query,
params: department
? { tenantId, startDate, endDate, department }
: { tenantId, startDate, endDate },
});

// Map snake_case BigQuery columns to the camelCase interface
return rows.map(row => ({
date: row.date,
woCount: row.wo_count,
woCompleted: row.wo_completed,
avgCycleTime: row.avg_cycle_time,
medianCycleTime: row.median_cycle_time,
slaCompliancePct: row.sla_compliance_pct,
rejectionRate: row.rejection_rate,
}));
}

async getStateDurations(tenantId: string): Promise<StateDurationMetrics[]> {
const query = `
SELECT
state,
avg_duration_seconds,
median_duration_seconds,
p95_duration_seconds
FROM \`bio-qms-prod.analytics.wo_state_avg_durations\`
WHERE tenant_id = @tenantId
ORDER BY avg_duration_seconds DESC
`;

const [rows] = await this.bigquery.query({
query,
params: { tenantId },
});

// Map snake_case BigQuery columns to the camelCase interface
return rows.map(row => ({
state: row.state,
avgDurationSeconds: row.avg_duration_seconds,
medianDurationSeconds: row.median_duration_seconds,
p95DurationSeconds: row.p95_duration_seconds,
}));
}

async getBottlenecks(tenantId: string): Promise<Array<{
state: string;
avgDuration: number;
impact: string;
}>> {
const durations = await this.getStateDurations(tenantId);

// Identify bottlenecks (states with > 24 hour avg duration)
const bottlenecks = durations
.filter(d => d.avgDurationSeconds > 86400) // > 24 hours
.map(d => ({
state: d.state,
avgDuration: d.avgDurationSeconds / 3600, // Convert to hours
impact: this.categorizeImpact(d.avgDurationSeconds),
}))
.sort((a, b) => b.avgDuration - a.avgDuration);

return bottlenecks;
}

private categorizeImpact(seconds: number): string {
const days = seconds / 86400;
if (days > 7) return 'Critical';
if (days > 3) return 'High';
if (days > 1) return 'Medium';
return 'Low';
}

async getSLABreakdown(tenantId: string): Promise<{
overall: number;
byDepartment: Array<{ department: string; compliance: number }>;
byType: Array<{ isRegulatory: boolean; compliance: number }>;
}> {
const query = `
WITH recent_data AS (
SELECT *
FROM \`bio-qms-prod.analytics.wo_performance_daily\`
WHERE tenant_id = @tenantId
AND date >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
)
SELECT
(SELECT AVG(sla_compliance_pct) FROM recent_data) as overall_compliance,

ARRAY(
SELECT AS STRUCT department, AVG(sla_compliance_pct) as compliance
FROM recent_data
GROUP BY department
) as by_department,

(SELECT SUM(regulatory_wo_count * sla_compliance_pct / 100) / NULLIF(SUM(regulatory_wo_count), 0) * 100
FROM recent_data) as regulatory_compliance,
(SELECT SUM(non_regulatory_wo_count * sla_compliance_pct / 100) / NULLIF(SUM(non_regulatory_wo_count), 0) * 100
FROM recent_data) as non_regulatory_compliance
`;

const [rows] = await this.bigquery.query({
query,
params: { tenantId },
});

const row = rows[0];

return {
overall: row.overall_compliance,
byDepartment: row.by_department,
byType: [
{ isRegulatory: true, compliance: row.regulatory_compliance },
{ isRegulatory: false, compliance: row.non_regulatory_compliance },
],
};
}
}
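One design note on SLA aggregation: averaging daily percentages, as `getSLABreakdown` does for its overall figure, gives every day equal weight, so a quiet day with a single late work order drags the number disproportionately. A volume-weighted alternative can be sketched with an illustrative `DailyAgg` shape (mirroring the counts in `wo_performance_daily`):

```typescript
// Illustrative daily aggregate: completions and on-time completions per day.
interface DailyAgg {
  woCompleted: number;
  slaCompliantCount: number;
}

// Volume-weighted SLA compliance: days with more completions count more,
// unlike a plain average of per-day percentages.
function weightedSlaCompliance(days: DailyAgg[]): number {
  const completed = days.reduce((s, d) => s + d.woCompleted, 0);
  const compliant = days.reduce((s, d) => s + d.slaCompliantCount, 0);
  return completed > 0 ? (compliant / completed) * 100 : 0;
}
```

With days of (10 completed, 9 on time) and (100 completed, 50 on time), the plain average of daily percentages is 70%, while the weighted figure is ~53.6% — a material difference for SLA reporting.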

J.3.2: Quality Trend Analysis

Quality Metrics

// apps/api/src/modules/analytics/quality-trends.service.ts
import { Injectable } from '@nestjs/common';
import { BigQuery } from '@google-cloud/bigquery';

export interface QualityMetrics {
date: string;
defectRate: number;
capaCount: number;
deviationCount: number;
avgDefectResolutionTime: number;
repeatDefectRate: number;
}

@Injectable()
export class QualityTrendsService {
private readonly bigquery: BigQuery;

constructor() {
this.bigquery = new BigQuery({ projectId: 'bio-qms-prod' });
}

async getDefectTrends(
tenantId: string,
startDate: string,
endDate: string,
): Promise<QualityMetrics[]> {
const query = `
WITH defects AS (
SELECT
DATE(timestamp) as date,
JSON_EXTRACT_SCALAR(properties, '$.defect_type') as defect_type,
JSON_EXTRACT_SCALAR(properties, '$.work_order_id') as work_order_id,
JSON_EXTRACT_SCALAR(properties, '$.root_cause') as root_cause,
CAST(JSON_EXTRACT_SCALAR(properties, '$.resolution_time_seconds') AS FLOAT64) as resolution_time
FROM \`bio-qms-prod.analytics.events\`
WHERE event_name = 'quality.defect_reported'
AND tenant_id = @tenantId
AND DATE(timestamp) BETWEEN @startDate AND @endDate
),
total_wo AS (
SELECT
DATE(timestamp) as date,
COUNT(DISTINCT JSON_EXTRACT_SCALAR(properties, '$.work_order_id')) as total_count
FROM \`bio-qms-prod.analytics.events\`
WHERE event_name = 'work_order.completed'
AND tenant_id = @tenantId
AND DATE(timestamp) BETWEEN @startDate AND @endDate
GROUP BY date
),
capas AS (
SELECT
DATE(timestamp) as date,
COUNT(*) as capa_count
FROM \`bio-qms-prod.analytics.events\`
WHERE event_name = 'capa.created'
AND tenant_id = @tenantId
AND DATE(timestamp) BETWEEN @startDate AND @endDate
GROUP BY date
),
deviations AS (
SELECT
DATE(timestamp) as date,
COUNT(*) as deviation_count
FROM \`bio-qms-prod.analytics.events\`
WHERE event_name = 'deviation.reported'
AND tenant_id = @tenantId
AND DATE(timestamp) BETWEEN @startDate AND @endDate
GROUP BY date
),
repeat_defects AS (
SELECT
d1.date,
COUNT(DISTINCT d1.work_order_id) as repeat_count
FROM defects d1
INNER JOIN defects d2
ON d1.root_cause = d2.root_cause
AND d1.date > d2.date
AND DATE_DIFF(d1.date, d2.date, DAY) <= 30
GROUP BY d1.date
)
SELECT
d.date,
COUNT(DISTINCT d.work_order_id) / NULLIF(tw.total_count, 0) * 100 as defect_rate,
COALESCE(c.capa_count, 0) as capa_count,
COALESCE(dv.deviation_count, 0) as deviation_count,
AVG(d.resolution_time) / 3600 as avg_defect_resolution_time_hours,
COALESCE(rd.repeat_count, 0) / NULLIF(COUNT(DISTINCT d.work_order_id), 0) * 100 as repeat_defect_rate
FROM defects d
LEFT JOIN total_wo tw ON d.date = tw.date
LEFT JOIN capas c ON d.date = c.date
LEFT JOIN deviations dv ON d.date = dv.date
LEFT JOIN repeat_defects rd ON d.date = rd.date
GROUP BY d.date, tw.total_count, c.capa_count, dv.deviation_count, rd.repeat_count
ORDER BY d.date DESC
`;

const [rows] = await this.bigquery.query({
query,
params: { tenantId, startDate, endDate },
});

// Map snake_case BigQuery columns to the camelCase interface
return rows.map(row => ({
date: row.date,
defectRate: row.defect_rate,
capaCount: row.capa_count,
deviationCount: row.deviation_count,
avgDefectResolutionTime: row.avg_defect_resolution_time_hours,
repeatDefectRate: row.repeat_defect_rate,
}));
}

async getCAPAEffectiveness(tenantId: string): Promise<{
totalCapas: number;
closedCapas: number;
avgTimeToClose: number;
recurrenceRate: number;
}> {
const query = `
WITH capas AS (
SELECT
JSON_EXTRACT_SCALAR(properties, '$.capa_id') as capa_id,
JSON_EXTRACT_SCALAR(properties, '$.root_cause') as root_cause,
MIN(timestamp) as created_at,
MAX(CASE WHEN JSON_EXTRACT_SCALAR(properties, '$.status') = 'CLOSED' THEN timestamp END) as closed_at
FROM \`bio-qms-prod.analytics.events\`
WHERE event_name IN ('capa.created', 'capa.status_changed')
AND tenant_id = @tenantId
AND timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 90 DAY)
GROUP BY capa_id, root_cause
),
recurrences AS (
SELECT
c1.capa_id,
COUNT(DISTINCT c2.capa_id) as recurrence_count
FROM capas c1
LEFT JOIN capas c2
ON c1.root_cause = c2.root_cause
AND c2.created_at > c1.closed_at
AND DATE_DIFF(DATE(c2.created_at), DATE(c1.closed_at), DAY) <= 90
WHERE c1.closed_at IS NOT NULL
GROUP BY c1.capa_id
)
SELECT
COUNT(*) as total_capas,
COUNTIF(closed_at IS NOT NULL) as closed_capas,
AVG(TIMESTAMP_DIFF(closed_at, created_at, DAY)) as avg_time_to_close_days,
SUM(CASE WHEN r.recurrence_count > 0 THEN 1 ELSE 0 END) / NULLIF(COUNTIF(closed_at IS NOT NULL), 0) * 100 as recurrence_rate
FROM capas c
LEFT JOIN recurrences r ON c.capa_id = r.capa_id
`;

const [rows] = await this.bigquery.query({
query,
params: { tenantId },
});

// Map snake_case BigQuery columns to the promised camelCase shape
const row = rows[0];
return {
totalCapas: row.total_capas,
closedCapas: row.closed_capas,
avgTimeToClose: row.avg_time_to_close_days,
recurrenceRate: row.recurrence_rate,
};
}

async getTopDefectRootCauses(tenantId: string, limit: number = 10): Promise<Array<{
rootCause: string;
occurrences: number;
percentOfTotal: number;
}>> {
const query = `
WITH defects AS (
SELECT
JSON_EXTRACT_SCALAR(properties, '$.root_cause') as root_cause
FROM \`bio-qms-prod.analytics.events\`
WHERE event_name = 'quality.defect_reported'
AND tenant_id = @tenantId
AND timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 90 DAY)
),
total AS (
SELECT COUNT(*) as total_count
FROM defects
)
SELECT
root_cause,
COUNT(*) as occurrences,
COUNT(*) / (SELECT total_count FROM total) * 100 as percent_of_total
FROM defects
WHERE root_cause IS NOT NULL
GROUP BY root_cause
ORDER BY occurrences DESC
LIMIT @limit
`;

const [rows] = await this.bigquery.query({
query,
params: { tenantId, limit },
});

return rows.map(row => ({
rootCause: row.root_cause,
occurrences: row.occurrences,
percentOfTotal: row.percent_of_total,
}));
}
}
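`getTopDefectRootCauses` returns each cause's share of the total; for quality review, a Pareto view — cumulative share after sorting — makes the "vital few" causes above the 80% line explicit. A small illustrative helper, not part of the service:

```typescript
interface RootCauseCount {
  rootCause: string;
  occurrences: number;
}

// Pareto view of defect root causes: sort descending by occurrences and
// attach each cause's cumulative share of the total.
function paretoRootCauses(counts: RootCauseCount[]) {
  const total = counts.reduce((s, c) => s + c.occurrences, 0);
  const sorted = [...counts].sort((a, b) => b.occurrences - a.occurrences);
  let running = 0;
  return sorted.map(c => {
    running += c.occurrences;
    return { ...c, cumulativePct: total ? (running / total) * 100 : 0 };
  });
}
```

The causes whose `cumulativePct` first crosses 80 are natural CAPA candidates.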

J.3.3: Compliance Metrics Dashboard

Compliance KPIs

// apps/api/src/modules/analytics/compliance-metrics.service.ts
import { Injectable } from '@nestjs/common';
import { BigQuery } from '@google-cloud/bigquery';
import { PrismaService } from '../prisma/prisma.service';

export interface ComplianceMetrics {
trainingCompliancePct: number;
auditReadinessScore: number;
part11CompliancePct: number;
signatureCompletionRate: number;
auditTrailIntegrity: number;
documentControlScore: number;
}

@Injectable()
export class ComplianceMetricsService {
private readonly bigquery: BigQuery;

constructor(private readonly prisma: PrismaService) {
this.bigquery = new BigQuery({ projectId: 'bio-qms-prod' });
}

async getComplianceScorecard(tenantId: string): Promise<ComplianceMetrics> {
const [
trainingCompliance,
auditReadiness,
part11Compliance,
signatureCompletion,
auditTrailIntegrity,
documentControl,
] = await Promise.all([
this.getTrainingCompliance(tenantId),
this.getAuditReadinessScore(tenantId),
this.getPart11Compliance(tenantId),
this.getSignatureCompletionRate(tenantId),
this.getAuditTrailIntegrity(tenantId),
this.getDocumentControlScore(tenantId),
]);

return {
trainingCompliancePct: trainingCompliance,
auditReadinessScore: auditReadiness,
part11CompliancePct: part11Compliance,
signatureCompletionRate: signatureCompletion,
auditTrailIntegrity: auditTrailIntegrity,
documentControlScore: documentControl,
};
}

private async getTrainingCompliance(tenantId: string): Promise<number> {
// Users with required training completed / total users
const result = await this.prisma.$queryRaw`
WITH required_training AS (
SELECT u.id as user_id, rt.training_id
FROM "User" u
CROSS JOIN "RequiredTraining" rt
WHERE u.tenant_id = ${tenantId}
AND u.role IN (SELECT role FROM unnest(rt.required_roles) as role)
),
completed_training AS (
SELECT user_id, training_id
FROM "TrainingCompletion"
WHERE tenant_id = ${tenantId}
AND expires_at > NOW()
)
SELECT
COUNT(DISTINCT ct.user_id || '-' || ct.training_id)::FLOAT /
NULLIF(COUNT(DISTINCT rt.user_id || '-' || rt.training_id), 0) * 100 as compliance_pct
FROM required_training rt
LEFT JOIN completed_training ct
ON rt.user_id = ct.user_id
AND rt.training_id = ct.training_id
`;

return result[0]?.compliance_pct || 0;
}

private async getAuditReadinessScore(tenantId: string): Promise<number> {
// Composite score based on multiple factors
const query = `
WITH metrics AS (
SELECT
-- Documentation completeness (40 points)
(SELECT COUNT(*) FROM "WorkOrder" WHERE tenant_id = $1 AND status = 'COMPLETED' AND description IS NOT NULL AND description != '')::FLOAT /
NULLIF((SELECT COUNT(*) FROM "WorkOrder" WHERE tenant_id = $1 AND status = 'COMPLETED'), 0) * 40 as doc_score,

-- Signature compliance (30 points)
(SELECT COUNT(*) FROM "WorkOrder" WHERE tenant_id = $1 AND is_regulatory = true AND (SELECT COUNT(*) FROM "ESignature" WHERE work_order_id = "WorkOrder".id) >= 2)::FLOAT /
NULLIF((SELECT COUNT(*) FROM "WorkOrder" WHERE tenant_id = $1 AND is_regulatory = true), 0) * 30 as sig_score,

-- Timely closure (20 points)
(SELECT COUNT(*) FROM "WorkOrder" WHERE tenant_id = $1 AND status = 'COMPLETED' AND completed_at <= target_completion_date)::FLOAT /
NULLIF((SELECT COUNT(*) FROM "WorkOrder" WHERE tenant_id = $1 AND status = 'COMPLETED'), 0) * 20 as closure_score,

-- CAPA effectiveness (10 points)
(SELECT COUNT(*) FROM "CAPA" WHERE tenant_id = $1 AND status = 'CLOSED' AND effectiveness_verified = true)::FLOAT /
NULLIF((SELECT COUNT(*) FROM "CAPA" WHERE tenant_id = $1 AND status = 'CLOSED'), 0) * 10 as capa_score
)
SELECT COALESCE(doc_score, 0) + COALESCE(sig_score, 0) + COALESCE(closure_score, 0) + COALESCE(capa_score, 0) as total_score
FROM metrics
`;

// tenantId is passed as a bind parameter ($1) rather than interpolated,
// keeping the raw query injection-safe; ::FLOAT casts avoid integer division
const result = await this.prisma.$queryRawUnsafe<Array<{ total_score: number }>>(query, tenantId);
return result[0]?.total_score || 0;
}

private async getPart11Compliance(tenantId: string): Promise<number> {
// 21 CFR Part 11 compliance checks
const result = await this.prisma.$queryRaw`
WITH part11_checks AS (
SELECT
wo.id,
-- Check 1: Electronic signature present
(SELECT COUNT(*) FROM "ESignature" WHERE work_order_id = wo.id) >= 1 as has_signature,

-- Check 2: Audit trail complete
(SELECT COUNT(*) FROM "AuditLog" WHERE entity_type = 'WorkOrder' AND entity_id = wo.id::text) >= 3 as has_audit_trail,

-- Check 3: User authentication logged
(SELECT COUNT(*) FROM "AuditLog" WHERE entity_type = 'WorkOrder' AND entity_id = wo.id::text AND action = 'USER_AUTHENTICATED') >= 1 as has_auth_log,

-- Check 4: Data integrity (hash validation)
(SELECT COUNT(*) FROM "DataIntegrityCheck" WHERE work_order_id = wo.id AND status = 'PASSED') >= 1 as has_integrity_check

FROM "WorkOrder" wo
WHERE wo.tenant_id = ${tenantId}
AND wo.is_regulatory = true
AND wo.status = 'COMPLETED'
AND wo.completed_at >= NOW() - INTERVAL '90 days'
)
SELECT
(SUM(CASE WHEN has_signature AND has_audit_trail AND has_auth_log AND has_integrity_check THEN 1 ELSE 0 END)::FLOAT /
NULLIF(COUNT(*), 0)) * 100 as compliance_pct
FROM part11_checks
`;

return result[0]?.compliance_pct || 0;
}

private async getSignatureCompletionRate(tenantId: string): Promise<number> {
const result = await this.prisma.$queryRaw`
SELECT
(SELECT COUNT(DISTINCT work_order_id) FROM "ESignature" WHERE tenant_id = ${tenantId})::FLOAT /
NULLIF((SELECT COUNT(*) FROM "WorkOrder" WHERE tenant_id = ${tenantId} AND is_regulatory = true AND status IN ('COMPLETED', 'APPROVED')), 0) * 100 as completion_rate
`;

return result[0]?.completion_rate || 0;
}

private async getAuditTrailIntegrity(tenantId: string): Promise<number> {
// Check for gaps in audit trail timestamps
const query = `
WITH audit_gaps AS (
SELECT
entity_id,
LAG(created_at) OVER (PARTITION BY entity_type, entity_id ORDER BY created_at) as prev_timestamp,
created_at,
EXTRACT(EPOCH FROM (created_at - LAG(created_at) OVER (PARTITION BY entity_type, entity_id ORDER BY created_at))) as gap_seconds
FROM "AuditLog"
WHERE tenant_id = $1
AND created_at >= NOW() - INTERVAL '30 days'
)
SELECT
(COUNT(CASE WHEN gap_seconds IS NULL OR gap_seconds < 86400 THEN 1 END)::FLOAT /
NULLIF(COUNT(*), 0)) * 100 as integrity_score
FROM audit_gaps
`;

// Bind tenantId as a parameter ($1) to keep the raw query injection-safe
const result = await this.prisma.$queryRawUnsafe<Array<{ integrity_score: number }>>(query, tenantId);
return result[0]?.integrity_score || 100;
}

private async getDocumentControlScore(tenantId: string): Promise<number> {
const result = await this.prisma.$queryRaw`
SELECT
(
-- Version control (50%)
(SELECT COUNT(*) FROM "Document" WHERE tenant_id = ${tenantId} AND version >= 2)::FLOAT /
NULLIF((SELECT COUNT(*) FROM "Document" WHERE tenant_id = ${tenantId}), 0) * 50
+
-- Approval status (50%)
(SELECT COUNT(*) FROM "Document" WHERE tenant_id = ${tenantId} AND status = 'APPROVED')::FLOAT /
NULLIF((SELECT COUNT(*) FROM "Document" WHERE tenant_id = ${tenantId}), 0) * 50
) as doc_control_score
`;

return result[0]?.doc_control_score || 0;
}

async getComplianceTrends(tenantId: string, days: number = 90): Promise<Array<{
date: string;
trainingCompliance: number;
auditReadiness: number;
part11Compliance: number;
}>> {
// Historical compliance metrics (calculated daily via cron job and stored)
const result = await this.prisma.$queryRaw`
SELECT
date,
training_compliance_pct,
audit_readiness_score,
part11_compliance_pct
FROM "ComplianceMetricsDaily"
WHERE tenant_id = ${tenantId}
AND date >= CURRENT_DATE - ${days} * INTERVAL '1 day'
ORDER BY date DESC
`;

return result.map(r => ({
date: r.date,
trainingCompliance: r.training_compliance_pct,
auditReadiness: r.audit_readiness_score,
part11Compliance: r.part11_compliance_pct,
}));
}
}
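The 40/30/20/10 weighting inside `getAuditReadinessScore` can be expressed as a pure function, which is also how the score can be unit-tested without a database. The `ReadinessInputs` shape is illustrative; each ratio is the 0–1 fraction produced by the corresponding SQL sub-score:

```typescript
// Illustrative inputs mirroring the four SQL sub-scores (each 0..1):
interface ReadinessInputs {
  docCompleteness: number;   // completed WOs with descriptions / completed WOs
  signatureCoverage: number; // regulatory WOs with >= 2 e-signatures / regulatory WOs
  timelyClosure: number;     // WOs closed by target date / completed WOs
  capaEffectiveness: number; // verified-effective CAPAs / closed CAPAs
}

// Composite audit-readiness score on a 0..100 scale, weighted 40/30/20/10.
function auditReadinessScore(r: ReadinessInputs): number {
  return (
    r.docCompleteness * 40 +
    r.signatureCoverage * 30 +
    r.timelyClosure * 20 +
    r.capaEffectiveness * 10
  );
}
```

Keeping the weighting in one place makes it easy to recalibrate (e.g. raising the signature weight for a Part 11-heavy tenant) without touching the SQL.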
