
Real Time Streaming Pipeline

Build a real-time data streaming pipeline on Kafka or Kinesis, covering producers, consumers, stream processing, and exactly-once semantics.

Complexity: Complex | Duration: 30m+ | Category: DevOps

Tags: data-engineering streaming kafka real-time

Workflow Diagram

Steps

Step 1: Streaming platform setup

Agent: devops-engineer - Deploy the Kafka/Kinesis cluster
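For local development, a single-broker Kafka cluster can be stood up with Docker; a minimal sketch assuming the official `apache/kafka` image (the version tag, port mapping, and script path are illustrative assumptions — adjust for your environment):

```shell
# Start a single-node Kafka broker in KRaft mode (no ZooKeeper needed).
# Image tag and port are assumptions; pin versions for your environment.
docker run -d --name kafka -p 9092:9092 apache/kafka:3.7.0

# Verify the broker is reachable (script path inside the official image).
docker exec kafka /opt/kafka/bin/kafka-broker-api-versions.sh \
  --bootstrap-server localhost:9092
```

A production deployment would instead use multiple brokers with replication, which this single-node fragment deliberately omits.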

Step 2: Topic/stream creation

Agent: data-engineer - Create topics with partitions
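Topics are created with a fixed partition count (e.g. `kafka-topics.sh --create --topic events --partitions 6`), and records carrying the same key always land on the same partition, which preserves per-key ordering. A sketch of that key-to-partition mapping, using MD5 as a simplified stand-in for Kafka's murmur2-based default partitioner:

```python
import hashlib

def partition_for_key(key: bytes, num_partitions: int) -> int:
    """Map a record key to a partition deterministically.

    Simplified stand-in for Kafka's default (murmur2-based) partitioner:
    the hash function differs, but the property is the same -- equal keys
    always route to the same partition, preserving per-key ordering.
    """
    digest = hashlib.md5(key).digest()
    return int.from_bytes(digest[:4], "big") % num_partitions

# Same key -> same partition on every call.
p1 = partition_for_key(b"user-42", 6)
p2 = partition_for_key(b"user-42", 6)
```

This is why the partition count matters up front: it caps both the ordering granularity and the maximum consumer parallelism for the topic.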

Step 3: Producer implementation

Agent: backend-architect - Implement event producers
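Producers typically send asynchronously, confirm delivery via per-record callbacks, and flush buffered records before shutdown. An in-memory stand-in (no broker involved, and not a real client library) illustrating that produce/callback/flush pattern:

```python
class SketchProducer:
    """In-memory stand-in for a streaming producer (not a real client).

    Illustrates the common async pattern: produce() enqueues without
    blocking, a delivery callback reports success or failure per record,
    and flush() drains everything before shutdown.
    """

    def __init__(self):
        self._pending = []
        self.delivered = []

    def produce(self, topic, key, value, on_delivery=None):
        # Real clients append to an internal buffer and return immediately.
        self._pending.append((topic, key, value, on_delivery))

    def flush(self):
        # Real clients block here until all buffered records are acked.
        for topic, key, value, cb in self._pending:
            record = (topic, key, value)
            self.delivered.append(record)
            if cb:
                cb(None, record)  # err=None signals successful delivery
        self._pending.clear()

acked = []
producer = SketchProducer()
for i in range(3):
    producer.produce("events", key=f"user-{i}", value=f"payload-{i}",
                     on_delivery=lambda err, rec: acked.append(rec))
producer.flush()
```

A real producer (e.g. the confluent-kafka client) follows the same shape but adds batching, retries, and acknowledgment configuration.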

Step 4: Serialization

Agent: data-engineer - Choose Avro, Protobuf, or JSON with a schema registry
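Whatever format is chosen, each message is serialized against a schema so producers and consumers agree on structure, with a schema registry storing versioned schemas centrally. A minimal JSON sketch with a hand-rolled required-field check (a real deployment would use Avro or Protobuf plus a registry client; the schema and field names here are illustrative assumptions):

```python
import json

# Illustrative schema: just a list of required fields.
EVENT_SCHEMA = {"required": ["event_id", "user_id", "ts"]}

def serialize(event: dict) -> bytes:
    """Validate against the schema, then encode to bytes for the wire."""
    missing = [f for f in EVENT_SCHEMA["required"] if f not in event]
    if missing:
        raise ValueError(f"missing required fields: {missing}")
    return json.dumps(event).encode("utf-8")

def deserialize(payload: bytes) -> dict:
    return json.loads(payload.decode("utf-8"))

event = {"event_id": "e1", "user_id": "u42", "ts": 1700000000}
round_tripped = deserialize(serialize(event))
```

Binary formats with registry-enforced compatibility rules additionally let schemas evolve without breaking existing consumers, which plain JSON does not give you.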

Step 5: Consumer implementation

Agent: backend-architect - Implement consumer groups
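Within a consumer group, each partition is owned by exactly one member, so throughput scales by adding consumers up to the partition count. A round-robin-style assignment sketch (simplified relative to Kafka's real assignors, which also handle rebalancing and stickiness):

```python
def assign_partitions(num_partitions: int, consumers: list) -> dict:
    """Spread partitions across group members, round-robin style.

    Each partition goes to exactly one consumer; consumers beyond the
    partition count would sit idle (simplified vs. Kafka's assignors).
    """
    assignment = {c: [] for c in consumers}
    for p in range(num_partitions):
        owner = consumers[p % len(consumers)]
        assignment[owner].append(p)
    return assignment

plan = assign_partitions(6, ["c1", "c2", "c3"])
```

When a member joins or leaves, the group coordinator recomputes an assignment like this and triggers a rebalance.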

Step 6: Stream processing

Agent: data-engineer - Use Kafka Streams/Flink for transformations
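Stream processors such as Kafka Streams or Flink commonly group events into time windows and aggregate per key. A self-contained sketch of a tumbling-window count over (timestamp, key) pairs, showing the core idea those engines build on:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_ms: int) -> dict:
    """Count events per (window_start, key) over fixed, non-overlapping windows.

    events: iterable of (timestamp_ms, key) pairs. Real engines add state
    stores, watermarks, and late-data handling on top of this core idea.
    """
    counts = defaultdict(int)
    for ts, key in events:
        # Floor the timestamp to the start of its window.
        window_start = (ts // window_ms) * window_ms
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(1000, "a"), (1500, "a"), (2500, "a"), (1200, "b")]
result = tumbling_window_counts(events, window_ms=1000)
```

Tumbling windows never overlap; hopping or sliding windows generalize this by letting one event contribute to several windows.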

Step 7: Exactly-once semantics

Agent: data-engineer - Implement idempotent producers, transactions
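Even with broker-side idempotent producers and transactions, consumers should tolerate redelivery; a common complement is deduplicating on a unique event ID so reprocessing the same record is a no-op. A sketch (the `event_id` field name is an assumption, and in production the seen-ID set would live in durable storage, not memory):

```python
class IdempotentConsumer:
    """Consumer-side dedup: processing the same event twice has no effect.

    Complements broker-side exactly-once features (idempotent producers,
    transactions). The in-memory seen-ID set is a simplification; real
    systems persist it alongside the processing results.
    """

    def __init__(self):
        self._seen = set()
        self.totals = {}

    def process(self, event: dict) -> bool:
        """Apply the event once; return False if it was a duplicate."""
        event_id = event["event_id"]  # field name is an assumption
        if event_id in self._seen:
            return False
        self._seen.add(event_id)
        user = event["user_id"]
        self.totals[user] = self.totals.get(user, 0) + event["amount"]
        return True

consumer = IdempotentConsumer()
consumer.process({"event_id": "e1", "user_id": "u1", "amount": 10})
# Redelivery of e1 (e.g. after a consumer restart) changes nothing:
was_new = consumer.process({"event_id": "e1", "user_id": "u1", "amount": 10})
```

The effective guarantee becomes at-least-once delivery plus idempotent processing, which is observably exactly-once.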

Step 8: Monitoring

Agent: devops-engineer - Monitor lag, throughput, error rates
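The key health signal is consumer lag: the log-end offset minus the committed offset per partition; sustained growth means consumers are falling behind producers. A sketch of the computation (the offsets and alert threshold are illustrative):

```python
def consumer_lag(log_end_offsets: dict, committed_offsets: dict) -> dict:
    """Per-partition lag = newest offset in the log - last committed offset."""
    return {p: log_end_offsets[p] - committed_offsets.get(p, 0)
            for p in log_end_offsets}

# Partition 0 is 50 records behind; partition 1 is fully caught up.
lag = consumer_lag({0: 1200, 1: 950}, {0: 1150, 1: 950})
alert = [p for p, l in lag.items() if l > 40]  # threshold is illustrative
```

In practice these offsets come from the broker's admin/metrics APIs, and lag is tracked per consumer group with alerting on sustained growth rather than a single snapshot.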

Usage

To execute this workflow:

/workflow devops/real-time-streaming-pipeline.workflow

See other workflows in this category for related automation patterns.