
Data Quality Monitoring

Implement data quality monitoring and validation for research pipelines and databases

Complexity: Moderate | Duration: 15-30m | Category: Research/Intelligence

Tags: data-quality monitoring validation pipeline-reliability anomaly-detection

Workflow Diagram

Steps

Step 1: Rule definition

Agent: data-engineering - Define data quality rules (completeness, accuracy, consistency, timeliness)
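Quality rules along these four dimensions can be expressed as small, named predicates that run against each record. A minimal sketch in Python (the `QualityRule` type and the example "orders" rules are illustrative, not part of the workflow):

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class QualityRule:
    name: str
    dimension: str                  # completeness, accuracy, consistency, or timeliness
    check: Callable[[dict], bool]   # returns True when the record passes

# Hypothetical rules for an "orders" feed
rules = [
    QualityRule("order_id_present", "completeness",
                lambda r: r.get("order_id") is not None),
    QualityRule("amount_non_negative", "accuracy",
                lambda r: r.get("amount", 0) >= 0),
]

def evaluate(record: dict, rules: list) -> list:
    """Return the names of rules the record violates."""
    return [rule.name for rule in rules if not rule.check(record)]
```

Keeping rules as data rather than inline conditionals makes it easy to report violations per dimension and to extend the rule set without touching the evaluation loop.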

Step 2: Schema validation

Agent: data-engineering - Implement schema validation and type checking
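Schema validation can be sketched as a field-by-field comparison against an expected mapping of names to types (the schema below is a hypothetical example; real pipelines often use a schema library instead):

```python
# Hypothetical expected schema for an "orders" record
EXPECTED_SCHEMA = {"order_id": int, "customer": str, "amount": float}

def validate_schema(record: dict, schema: dict) -> list:
    """Return a list of schema violations (missing fields, wrong types)."""
    errors = []
    for field, expected_type in schema.items():
        if field not in record:
            errors.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            errors.append(
                f"{field}: expected {expected_type.__name__}, "
                f"got {type(record[field]).__name__}"
            )
    return errors
```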

Step 3: Range checks

Agent: data-engineering - Add min/max range checks, outlier detection, and anomaly detection
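A simple way to sketch both checks: bounds checks compare each value against fixed limits, while outlier detection flags values whose z-score exceeds a threshold (3.0 here is a common but arbitrary default):

```python
import statistics

def range_check(values: list, lo: float, hi: float) -> list:
    """Flag indices of values that fall outside the [lo, hi] bounds."""
    return [i for i, v in enumerate(values) if not lo <= v <= hi]

def find_outliers(values: list, z_threshold: float = 3.0) -> list:
    """Flag indices whose z-score magnitude exceeds the threshold."""
    mean = statistics.fmean(values)
    stdev = statistics.pstdev(values)
    if stdev == 0:
        return []   # all values identical; nothing to flag
    return [i for i, v in enumerate(values)
            if abs(v - mean) / stdev > z_threshold]
```

Z-scores assume roughly normal data; for skewed distributions an IQR-based rule is usually more robust.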

Step 4: Deduplication

Agent: data-engineering - Detect and flag duplicate records
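Duplicate detection over a chosen set of key fields can be sketched with a single pass and a seen-set (the key fields are whatever uniquely identifies a record in your pipeline):

```python
def flag_duplicates(records: list, key_fields: list) -> list:
    """Return indices of records whose key fields repeat an earlier record."""
    seen = set()
    dupes = []
    for i, rec in enumerate(records):
        key = tuple(rec.get(f) for f in key_fields)
        if key in seen:
            dupes.append(i)
        else:
            seen.add(key)
    return dupes
```

Flagging rather than dropping duplicates preserves the evidence for the remediation step later in the workflow.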

Step 5: Freshness monitoring

Agent: devops-engineer - Monitor data staleness and pipeline execution delays
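Staleness monitoring reduces to comparing a dataset's last-update timestamp against a maximum allowed age. A minimal sketch (the `now` parameter is injectable for testing; in production it defaults to the current UTC time):

```python
from datetime import datetime, timedelta, timezone

def is_stale(last_updated: datetime, max_age: timedelta,
             now: datetime = None) -> bool:
    """True when the dataset's last update is older than the allowed age."""
    now = now or datetime.now(timezone.utc)
    return now - last_updated > max_age
```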

Step 6: Alerting

Agent: devops-engineer - Configure alerts for quality threshold violations
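Threshold alerting can be sketched as comparing computed quality metrics against configured minimums and emitting one alert per violation (the metric names and thresholds here are illustrative; routing the alerts to a pager or chat channel is left to the alerting backend):

```python
def check_thresholds(metrics: dict, thresholds: dict) -> list:
    """Compare quality metrics against minimum thresholds and emit alerts."""
    alerts = []
    for name, minimum in thresholds.items():
        value = metrics.get(name)
        if value is not None and value < minimum:
            alerts.append({"metric": name, "value": value, "minimum": minimum})
    return alerts
```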

Step 7: Dashboard

Agent: frontend-developer - Build data quality dashboard with trend tracking
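The dashboard's trend view needs the raw quality scores aggregated into a time series. A minimal backend sketch, assuming (day, score) samples keyed by ISO date strings:

```python
from collections import defaultdict

def daily_trend(samples: list) -> list:
    """Average (day, score) samples per day, sorted, for a trend chart."""
    buckets = defaultdict(list)
    for day, score in samples:
        buckets[day].append(score)
    return [(day, sum(v) / len(v)) for day, v in sorted(buckets.items())]
```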

Step 8: Remediation

Agent: data-engineering - Document remediation workflows for quality failures
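Documented remediation workflows can be kept machine-readable so alerts link directly to the right playbook entry. A sketch with a hypothetical failure-type-to-action mapping:

```python
# Hypothetical remediation playbook; entries mirror the documented workflows
REMEDIATION_PLAYBOOK = {
    "schema_violation": "quarantine record and notify upstream owner",
    "stale_data": "rerun ingestion job and verify pipeline scheduler",
}

def remediation_for(failure_type: str) -> str:
    """Look up the documented remediation step; default to manual triage."""
    return REMEDIATION_PLAYBOOK.get(failure_type, "escalate for manual triage")
```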

Usage

To execute this workflow:

/workflow research/intelligence/data-quality-monitoring.workflow

See other workflows in this category for related automation patterns.