19 docs tagged with "data-engineering"

API Data Integration

Integrate an external API as a data source, including authentication, pagination, rate limiting, error handling, and incremental sync.
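
Pagination plus retry-with-backoff is the core loop of this workflow. A minimal sketch, assuming a hypothetical `fetch_page` stub in place of a real HTTP client:

```python
import time

def fetch_page(page, page_size=2):
    """Stub for a hypothetical paginated API; returns (records, has_more)."""
    data = list(range(10))
    start = page * page_size
    chunk = data[start:start + page_size]
    return chunk, start + page_size < len(data)

def fetch_all(max_retries=3, delay=0.0):
    """Walk every page, retrying transient failures with exponential backoff."""
    records, page = [], 0
    while True:
        for attempt in range(max_retries):
            try:
                chunk, has_more = fetch_page(page)
                break
            except IOError:
                time.sleep(delay * 2 ** attempt)  # back off before retrying
        else:
            raise RuntimeError(f"page {page} failed after {max_retries} tries")
        records.extend(chunk)
        if not has_more:
            return records
        page += 1
```

A real integration would swap `fetch_page` for an HTTP call and respect the API's rate-limit headers between pages.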

Backup And Recovery

Implement automated backup and disaster recovery including full/incremental backups, point-in-time recovery, backup testing, and restoration procedures.
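
The full/incremental split comes down to one selection rule. A minimal sketch, assuming file paths mapped to last-modified timestamps (a real tool would stat the filesystem):

```python
def select_for_backup(files, last_backup_ts, full=False):
    """Pick files for the next backup run.

    files: mapping of path -> last-modified unix timestamp.
    A full backup takes everything; an incremental one takes only
    files modified after the previous backup finished.
    """
    if full:
        return sorted(files)
    return sorted(p for p, mtime in files.items() if mtime > last_backup_ts)
```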

CODITECT Workflow Definitions

50 production-ready workflows across AI/ML Development, Data Engineering, Automation & Integration, Analytics & Reporting, and Infrastructure & DevOps.

Data Governance Setup

Implement data governance framework including data catalog, lineage tracking, access control, PII detection, and compliance policies.
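
The PII-detection piece can be reduced to pattern matching over field values. A minimal sketch with two hypothetical patterns (production scanners use far richer rule sets and validation):

```python
import re

# Hypothetical, minimal PII patterns for illustration only.
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def scan_for_pii(text):
    """Return the set of PII categories detected in a text field."""
    return {name for name, pat in PII_PATTERNS.items() if pat.search(text)}
```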

Data Migration

Migrate data between systems/databases including extraction, transformation, validation, incremental sync, and cutover planning.
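
The validation step typically reconciles row counts and content checksums between source and target. A minimal sketch, assuming rows arrive as comparable Python tuples:

```python
import hashlib

def table_checksum(rows):
    """Order-insensitive checksum of a table's rows for reconciliation."""
    digest = hashlib.sha256()
    for row in sorted(repr(r) for r in rows):
        digest.update(row.encode())
    return digest.hexdigest()

def validate_migration(source_rows, target_rows):
    """Compare row counts and content checksums before cutover."""
    if len(source_rows) != len(target_rows):
        return False, "row count mismatch"
    if table_checksum(source_rows) != table_checksum(target_rows):
        return False, "checksum mismatch"
    return True, "ok"
```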

Data Performance Optimization

Optimize data pipeline and query performance including indexing, partitioning, caching, query tuning, and infrastructure scaling.
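
Of the techniques listed, caching is the easiest to illustrate: repeated identical queries should hit a cache instead of the warehouse. A minimal sketch, assuming a stand-in `run_query` with a call counter in place of a real client:

```python
import functools

CALLS = {"count": 0}  # instrumentation to show cache hits

@functools.lru_cache(maxsize=128)
def run_query(sql):
    """Stand-in for an expensive warehouse query; cached by exact SQL text."""
    CALLS["count"] += 1
    return f"result of {sql}"
```

Real deployments use an external cache (and invalidation policy) rather than in-process memoization, but the hit/miss accounting is the same idea.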

Data Pipeline Specialist

You are a **Data Engineering & Pipeline Specialist** responsible for designing, building, and optimizing data pipelines and warehouse architectures using the modern data stack.

Data Quality Checks

Implement comprehensive data quality validation including schema validation, null checks, range checks, uniqueness constraints, and referential integrity.
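
The null, type, and range checks above can be driven by a declarative rule set. A minimal sketch, assuming rows as dicts and a hypothetical per-column schema:

```python
def check_rows(rows, schema):
    """Run null, type, and range checks; return a list of violations.

    schema: column -> dict with optional 'required', 'type', 'min', 'max'.
    Each violation is (row index, column, reason).
    """
    violations = []
    for i, row in enumerate(rows):
        for col, rules in schema.items():
            value = row.get(col)
            if value is None:
                if rules.get("required"):
                    violations.append((i, col, "null"))
                continue
            if "type" in rules and not isinstance(value, rules["type"]):
                violations.append((i, col, "type"))
                continue
            if "min" in rules and value < rules["min"]:
                violations.append((i, col, "below min"))
            if "max" in rules and value > rules["max"]:
                violations.append((i, col, "above max"))
    return violations
```

Uniqueness and referential-integrity checks need visibility across rows (or tables), so they sit in a separate pass over the full dataset.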

Data Quality Specialist

You are a **Data Quality Specialist** responsible for ensuring data reliability, consistency, and trustworthiness across the data platform through automated validation and monitoring.

Data Warehouse Management

Manage a data warehouse, including star/snowflake schema design, fact/dimension tables, slowly changing dimension (SCD) handling, and OLAP optimization.
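
SCD type 2 is the common version of the SCD handling mentioned above: instead of overwriting a dimension row, expire the current version and append a new one. A minimal in-memory sketch, assuming rows as dicts with validity dates:

```python
def apply_scd2(dimension, key, new_attrs, today):
    """SCD type 2: close the current row and append a new version.

    dimension: list of dicts with 'key', 'attrs', 'valid_from', 'valid_to'
    (valid_to of None marks the current version).
    """
    for row in dimension:
        if row["key"] == key and row["valid_to"] is None:
            if row["attrs"] == new_attrs:
                return dimension  # no change; keep the current row
            row["valid_to"] = today  # expire the old version
    dimension.append({"key": key, "attrs": new_attrs,
                      "valid_from": today, "valid_to": None})
    return dimension
```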

ETL Pipeline Creation

Design and implement an Extract-Transform-Load (ETL) pipeline with error handling, incremental loading, idempotency, and monitoring for batch data processing.
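
Incremental loading and idempotency usually combine a watermark with key-based upserts: only rows newer than the watermark are pulled, and rerunning a batch overwrites rather than duplicates. A minimal sketch with in-memory stand-ins for source, target, and pipeline state:

```python
def incremental_load(source_rows, target, state):
    """Idempotent incremental load: watermark filter plus key-based upsert.

    source_rows: iterable of dicts with 'id' and 'updated_at'.
    target: dict keyed by id.  state: dict holding the 'watermark'.
    Returns the number of rows loaded this run.
    """
    watermark = state.get("watermark", 0)
    batch = [r for r in source_rows if r["updated_at"] > watermark]
    for row in batch:
        target[row["id"]] = row  # upsert: rerunning the batch is a no-op
    if batch:
        state["watermark"] = max(r["updated_at"] for r in batch)
    return len(batch)
```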

Multi-System Sync

Synchronize data across multiple systems with conflict resolution, eventual consistency, change detection, and sync monitoring.
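
Last-write-wins is the simplest of the conflict-resolution strategies implied here: when both systems hold the same key, the copy with the newer timestamp wins, and applying the same rule everywhere gives eventual consistency. A minimal two-store sketch:

```python
def sync_stores(store_a, store_b):
    """Two-way sync with last-write-wins conflict resolution.

    Each value is a dict carrying an 'updated_at' timestamp; missing keys
    are copied across, and conflicts resolve to the newer version.
    """
    for key in set(store_a) | set(store_b):
        a, b = store_a.get(key), store_b.get(key)
        if a is None:
            store_a[key] = b
        elif b is None:
            store_b[key] = a
        elif a["updated_at"] >= b["updated_at"]:
            store_b[key] = a
        else:
            store_a[key] = b
```

Last-write-wins silently drops the losing update, so real systems often log conflicts for review or use merge functions per field instead.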

Real-Time Streaming Pipeline

Build real-time data streaming pipeline using Kafka/Kinesis including producers, consumers, stream processing, and exactly-once semantics.
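
Exactly-once semantics in practice is usually at-least-once delivery plus idempotent processing: commit the offset only after the side effect, and dedupe on offset so redelivered messages are skipped. A minimal sketch with stubbed messages in place of a real Kafka/Kinesis consumer:

```python
def consume(messages, sink, processed_offsets):
    """Effectively-once processing over an at-least-once stream.

    messages: iterable of (offset, payload), possibly with redeliveries.
    sink: list standing in for the downstream write.
    processed_offsets: durable set standing in for committed offsets.
    """
    for offset, payload in messages:
        if offset in processed_offsets:
            continue  # duplicate delivery after a crash or rebalance
        sink.append(payload)
        processed_offsets.add(offset)  # "commit" only after the side effect
```

In a real pipeline the offset commit and the sink write must be atomic (a transaction or an idempotent upsert) for this to hold across failures.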

Schema Management

Manage database schema version control, migration generation, rollback capability, and schema documentation for evolving data models.
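
Migration tools boil down to applying pending steps in version order and recording what ran. A minimal sketch, assuming migrations as version-keyed callables and a set standing in for the tool's version table:

```python
def migrate(applied, migrations):
    """Apply pending migrations in version order.

    applied: set of already-run versions (a real tool keeps this in a
    schema-version table).  migrations: version -> zero-arg callable.
    Returns the versions run this time.
    """
    ran = []
    for version, up in sorted(migrations.items()):
        if version in applied:
            continue
        up()                  # run the migration step
        applied.add(version)  # record it so reruns skip it
        ran.append(version)
    return ran
```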