Multi-Language Processing
Modern backend development often requires combining the strengths of different programming languages: TypeScript for APIs, Python for data processing and AI, JavaScript for rapid prototyping. Traditional approaches involve complex microservices architectures with intricate communication patterns.
This comprehensive guide explores how to build a unified multi-language data processing pipeline using Motia's step primitive. We'll cover:
- Steps as Core Primitive: How steps unify different languages under a single abstraction.
- Building the Pipeline: A step-by-step guide to creating a cohesive multi-language data processing workflow.
- Unified Execution Model: How steps enable seamless communication between different runtime environments.
- Hands-On Development: How to build, run, and observe your unified multi-language pipeline.
Let's build a production-ready data processing system where steps unify TypeScript, Python, and JavaScript into a single cohesive workflow.
The Power of Steps: A Unified Multi-Language Primitive
At its core, our data processing pipeline demonstrates how steps solve the fundamental challenge of multi-language systems: unifying different programming languages under a single, coherent abstraction. Traditional polyglot architectures require complex inter-process communication and deployment coordination. Motia's step primitive unifies everything.
Steps enable true language unification:
- TypeScript steps: Strong typing and excellent tooling for APIs and orchestration
- Python steps: Rich ecosystem for data processing, ML, and scientific computing
- JavaScript steps: Dynamic processing and rapid development
- Motia's Step Primitive: The unifying abstraction that makes all languages work as a single system
Instead of managing multiple services, steps provide a single programming model. Whether written in TypeScript, Python, or JavaScript, every step follows the same pattern: receive data, process it, emit events. This unification is what makes multi-language development straightforward.
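That pattern can be sketched in a few lines. The step below is purely illustrative (the `ExampleStep` name, topics, and doubling logic are invented for the sketch), but every step in this guide — TypeScript, Python, or JavaScript — has this same shape:

```typescript
// A minimal sketch of the universal step shape: receive, process, emit.
// `Emit` stands in for the emit function Motia injects via the handler context.
type Emit = (event: { topic: string; data: unknown }) => Promise<void>

export const config = {
  type: 'event',
  name: 'ExampleStep',
  subscribes: ['input.ready'],
  emits: ['output.ready'],
  flows: ['data-processing']
} as const

export const handler = async (input: { value: number }, ctx: { emit: Emit }) => {
  // process: any computation in any language fits here
  const processed = { value: input.value * 2 }
  // emit: hand the result to whichever steps subscribe to the topic
  await ctx.emit({ topic: 'output.ready', data: processed })
  return processed
}
```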
The Anatomy of Our Multi-Language Pipeline
Our application consists of six specialized steps, each leveraging the optimal language for its specific task. Let's explore the complete architecture.
api-starter — the TypeScript API entry point:

```typescript
import { z } from 'zod'

const bodySchema = z.object({
  data: z.record(z.unknown()).optional(),
  message: z.string().optional()
})

// API endpoint to start the multi-language pipeline
export const config = {
  type: 'api',
  name: 'AppStarter',
  description: 'Start the multi-language app pipeline',
  method: 'POST',
  path: '/start-app',
  bodySchema,
  responseSchema: {
    200: z.object({
      message: z.string(),
      appId: z.number(),
      traceId: z.string()
    })
  },
  emits: ['app.started'],
  flows: ['data-processing']
} as const

export const handler = async (req: any, { logger, emit, traceId }: any) => {
  logger.info('🚀 Starting multi-language app', { body: req.body, traceId })

  const appData = {
    id: Date.now(),
    input: req.body.data || {},
    started_at: new Date().toISOString(),
    traceId
  }

  // Emit to next step
  await emit({
    topic: 'app.started',
    data: appData
  })

  logger.info('✅ App started successfully', {
    appId: appData.id,
    traceId
  })

  return {
    status: 200,
    body: {
      message: 'Multi-language app started successfully',
      appId: appData.id,
      traceId
    }
  }
}
```
bridge-step — the TypeScript bridge into Python processing:

```typescript
import { z } from 'zod'

// Bridge step to connect app starter to Python processing
export const config = {
  type: 'event',
  name: 'AppBridge',
  description: 'Bridge between app start and Python processing',
  subscribes: ['app.started'],
  emits: ['data.processed'],
  input: z.object({
    id: z.number(),
    input: z.record(z.unknown()),
    started_at: z.string(),
    traceId: z.string()
  }),
  flows: ['data-processing']
} as const

export const handler = async (input: any, { logger, emit }: any) => {
  logger.info('🌉 Processing app data and sending to Python', { appId: input.id })

  // Process data for Python step
  const processedResult = {
    original_id: input.id,
    processed_at: input.started_at,
    result: `Processed: ${JSON.stringify(input.input)}`,
    confidence: 0.95,
    model_version: '1.0'
  }

  // Send to Python processing
  await emit({
    topic: 'data.processed',
    data: processedResult
  })

  logger.info('✅ Data sent to Python processing', {
    originalId: input.id
  })
}
```
python-processor — the Python data processing step:

```python
import time
from datetime import datetime

# Python processing step configuration
config = {
    "type": "event",
    "name": "ProcessDataPython",
    "description": "Process data using Python capabilities",
    "subscribes": ["data.processed"],
    "emits": ["python.done"],
    "flows": ["data-processing"]
}

async def handler(input_data, ctx):
    """
    Python step that processes data and demonstrates Python capabilities
    """
    logger = ctx.logger
    emit = ctx.emit

    # Extract data from input
    original_id = input_data.get("original_id")
    result = input_data.get("result", "")
    logger.info(f"🐍 Python processing data for ID: {original_id}")

    start_time = time.time()

    # Simulate Python data processing
    processed_message = f"Python processed: {result}"

    # Add some Python-specific processing
    data_analysis = {
        "word_count": len(result.split()) if isinstance(result, str) else 0,
        "character_count": len(result) if isinstance(result, str) else 0,
        "processed_timestamp": datetime.now().isoformat(),
        "processing_language": "Python 3.x"
    }

    processing_time = (time.time() - start_time) * 1000  # Convert to milliseconds

    # Create result object
    python_result = {
        "id": original_id,
        "python_message": processed_message,
        "processed_by": ["appStarter", "appBridge", "ProcessDataPython"],
        "processing_time": processing_time,
        "analysis": data_analysis
    }

    # Emit to next step
    await emit({
        "topic": "python.done",
        "data": python_result
    })

    logger.info(f"✅ Python processing completed in {processing_time:.2f}ms")
```
notification-handler — TypeScript notifications after Python processing:

```typescript
import { z } from 'zod'

export const config = {
  type: 'event',
  name: 'NotificationHandler',
  description: 'Send notifications after Python processing',
  subscribes: ['python.done'],
  emits: ['notification.sent'],
  input: z.object({
    id: z.number(),
    python_message: z.string(),
    processed_by: z.array(z.string()),
    processing_time: z.number(),
    analysis: z.record(z.unknown()).optional()
  }),
  flows: ['data-processing']
} as const

export const handler = async (input: any, { logger, emit }: any) => {
  logger.info('📧 Sending notifications after Python processing', { id: input.id })

  // Simulate sending notifications (email, slack, etc.)
  const notification = {
    id: input.id,
    message: `Notification: ${input.python_message}`,
    processed_by: input.processed_by,
    sent_at: new Date().toISOString()
  }

  // Send notification data to final step
  await emit({
    topic: 'notification.sent',
    data: {
      ...notification,
      processing_time: input.processing_time
    }
  })

  logger.info('✅ Notifications sent successfully', { id: input.id })
}
```
finalizer — TypeScript aggregation of the final results:

```typescript
import { z } from 'zod'

// Final step to complete the app - TypeScript
export const config = {
  type: 'event',
  name: 'AppFinalizer',
  description: 'Complete the basic app and log final results',
  subscribes: ['notification.sent'],
  emits: ['app.completed'],
  input: z.object({
    id: z.number(),
    message: z.string(),
    processed_by: z.array(z.string()),
    sent_at: z.string(),
    processing_time: z.number()
  }),
  flows: ['data-processing']
} as const

export const handler = async (input: any, { logger, emit }: any) => {
  logger.info('🏁 Finalizing app', {
    notificationId: input.id,
    message: input.message
  })

  // Create final app summary
  const summary = {
    appId: input.id,
    status: 'completed',
    completed_at: new Date().toISOString(),
    steps_executed: [
      'app-starter',
      'app-bridge',
      'python-processor',
      'notification-handler',
      'app-finalizer'
    ],
    result: input.message
  }

  // Send to JavaScript summary generator
  await emit({
    topic: 'app.completed',
    data: {
      ...summary,
      total_processing_time: input.processing_time
    }
  })

  logger.info('✅ App finalized successfully', {
    appId: input.id,
    totalSteps: summary.steps_executed.length
  })
}
```
summary-generator — the JavaScript final step:

```javascript
// Final summary step - JavaScript
export const config = {
  type: 'event',
  name: 'summaryGenerator',
  description: 'Generate final summary in JavaScript',
  subscribes: ['app.completed'],
  emits: [], // Final step - no further processing needed
  flows: ['data-processing']
}

export const handler = async (input, { logger }) => {
  logger.info('📊 Generating final summary in JavaScript', {
    appId: input.appId,
    status: input.status
  })

  // Calculate processing metrics
  const processingTime = input.total_processing_time || 0
  const stepsCount = input.steps_executed ? input.steps_executed.length : 0

  // Create comprehensive summary
  const summary = {
    appId: input.appId,
    finalStatus: input.status,
    totalSteps: stepsCount,
    processingTimeMs: processingTime,
    languages: ['TypeScript', 'Python', 'JavaScript'],
    summary: `Multi-language app completed successfully with ${stepsCount} steps`,
    result: input.result,
    completedAt: new Date().toISOString(),
    generatedBy: 'javascript-summary-step'
  }

  // Log final summary (final step - no emit needed)
  logger.info('✨ Final summary generated successfully', summary)

  return summary
}
```
Type Definitions
Our unified system uses shared TypeScript types to ensure type safety across the multi-language pipeline:
```typescript
// types/index.ts
export interface AppData {
  id: number
  input: Record<string, unknown>
  started_at: string
  traceId: string
}

export interface ProcessedResult {
  original_id: number
  processed_at: string
  result: string
  confidence: number
  model_version: string
}

export interface PythonResult {
  id: number
  python_message: string
  processed_by: string[]
  processing_time: number
}

export interface NotificationData {
  id: number
  message: string
  processed_by: string[]
  sent_at: string
}

export interface AppSummary {
  appId: number
  status: string
  completed_at: string
  steps_executed: string[]
  result: string
}
```
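With these interfaces in place, handlers can trade their `any` parameters for real types. A minimal sketch of the bridge step's transformation, typed end to end (the interfaces are duplicated here so the snippet stands alone; `toProcessedResult` is a helper name invented for the sketch):

```typescript
// Sketch: typing the bridge step's transformation with the shared interfaces
// instead of `any`. In the app these interfaces live in types/index.ts.
interface AppData {
  id: number
  input: Record<string, unknown>
  started_at: string
  traceId: string
}

interface ProcessedResult {
  original_id: number
  processed_at: string
  result: string
  confidence: number
  model_version: string
}

// Pure transformation: the compiler now checks every field the bridge emits
export const toProcessedResult = (input: AppData): ProcessedResult => ({
  original_id: input.id,
  processed_at: input.started_at,
  result: `Processed: ${JSON.stringify(input.input)}`,
  confidence: 0.95,
  model_version: '1.0'
})
```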
Explore the Workbench
The Motia Workbench provides a visual representation of your multi-language pipeline, making it easy to trace data flow between TypeScript, Python, and JavaScript steps.
You can monitor real-time execution, view logs from all languages in a unified interface, and trace the complete data flow from the TypeScript API through Python processing to JavaScript summary generation.
Event Flow Architecture
The pipeline follows a clear event-driven flow that connects all languages seamlessly:
- app.started - TypeScript API → TypeScript Bridge
- data.processed - TypeScript Bridge → Python Processor
- python.done - Python Processor → TypeScript Notification Handler
- notification.sent - TypeScript Notification Handler → TypeScript Finalizer
- app.completed - TypeScript Finalizer → JavaScript Summary Generator
Each step only needs to know the events it subscribes to and emits, creating loose coupling while maintaining strong data flow guarantees.
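Loose coupling means new consumers attach by topic alone. The hypothetical audit step below (its name and logging are invented for illustration) subscribes to python.done without any change to the existing notification handler:

```typescript
// Hypothetical additional subscriber: because steps are coupled only through
// topics, an audit step can listen to 'python.done' alongside the existing
// notification handler without modifying it.
export const config = {
  type: 'event',
  name: 'AuditLogger',
  description: 'Record every Python result for auditing (illustrative)',
  subscribes: ['python.done'],
  emits: [], // observe only; no downstream events
  flows: ['data-processing']
} as const

export const handler = async (
  input: { id: number; processing_time: number },
  { logger }: { logger: { info: (msg: string, meta?: object) => void } }
) => {
  logger.info('🗒️ Audit record', { id: input.id, ms: input.processing_time })
}
```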
Key Features & Benefits
🧩 Step as Universal Primitive
Every piece of logic—whether TypeScript, Python, or JavaScript—follows the same step pattern, creating true unification.
🌐 Seamless Language Integration
Steps eliminate the complexity of multi-language systems by providing a unified programming model.
📊 Unified Development Experience
Write, debug, and monitor all languages through a single interface and shared execution model.
⚡ Hot Reload Across Languages
Edit any step in any language and see changes instantly across the entire pipeline.
🔄 Event-Driven Communication
Steps communicate through events, enabling loose coupling and independent scaling.
🎯 Single Deployment Model
Deploy all languages together as a cohesive system, not as separate microservices.
🐍 Python Step Naming
Python steps use the _step.py suffix convention for proper module resolution (e.g., simple-python_step.py).
Trying It Out
Ready to build your first multi-language Motia application? Let's get it running.
Create Your Motia App
Start by creating a new Motia project with the interactive setup.
```bash
npx motia@latest create
```
Navigate and Start Development
Move into your project directory and start the development server.
```bash
cd my-app  # Replace with your project name
npm run dev
```
Open the Workbench
Navigate to http://localhost:3000 to access the Workbench and run your workflow.
Test the Multi-Language Pipeline
Send a request to your API endpoint to see the multi-language workflow in action:
```bash
curl -X POST http://localhost:3000/start-app \
  -H "Content-Type: application/json" \
  -d '{"data": {"test": "value"}, "message": "Hello!"}'
```
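If the pipeline starts successfully, the endpoint responds with a body matching the API step's responseSchema; the appId and traceId values below are illustrative:

```json
{
  "message": "Multi-language app started successfully",
  "appId": 1718000000000,
  "traceId": "b3f1c2d4-example-trace-id"
}
```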
Watch in the Workbench as your data flows through:
- TypeScript validation and event emission
- TypeScript bridge processing and forwarding
- Python data processing with rich logging
- TypeScript notification handling
- TypeScript finalization and aggregation
- JavaScript summary generation and metrics
💻 Dive into the Code
Want to explore multi-language workflows further? Check out additional examples and the complete source code:
Multi-Language Examples
Access complete multi-language implementations, configuration examples, and learn how to integrate TypeScript, Python, and JavaScript in production applications.
Conclusion: The Power of Unification Through Steps
This multi-language data processing pipeline demonstrates how steps fundamentally change multi-language development. By providing a single primitive that works across TypeScript, Python, and JavaScript, we've eliminated the traditional complexity of polyglot architectures.
The step primitive enables true unification:
- Universal Pattern - Every step, regardless of language, follows the same receive-process-emit pattern
- Seamless Integration - Add Ruby, Go, Rust, or any language using the same step abstraction
- Unified Deployment - All languages deploy together as a single, coherent system
- Shared Development Model - Write, debug, and monitor everything through the same interface
Key benefits of step-based unification:
- Single Mental Model - Learn the step pattern once, apply it to any language
- Cohesive System - All components work together as parts of one application, not separate services
- Consistent Experience - Development, debugging, and monitoring work the same way across all languages
- Natural Scaling - Each step can scale independently while maintaining system coherence
Extend your pipeline with more steps:
- Add specialized processing steps for different data types and business logic
- Integrate machine learning workflows with Python steps for AI processing
- Build real-time analytics with streaming steps for live data processing
- Connect to enterprise systems through database and API integration steps
- Implement scheduled processing with cron steps for batch operations
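As one example of the scheduled-processing extension, a cron step could restart the flow on a timer. This is a hedged sketch: it assumes Motia's cron step type, and the exact config fields and handler signature may differ in your Motia version.

```typescript
// Hypothetical cron step that re-runs the pipeline nightly by emitting the
// same 'app.started' event the POST /start-app endpoint emits.
// Assumption: a 'cron' step type with a cron-expression field exists;
// verify names against your Motia version before using.
export const config = {
  type: 'cron',
  name: 'NightlyBatch',
  cron: '0 2 * * *', // every day at 02:00
  emits: ['app.started'],
  flows: ['data-processing']
} as const

export const handler = async ({ logger, emit }: any) => {
  // Kick off the same flow the API step starts
  await emit({
    topic: 'app.started',
    data: {
      id: Date.now(),
      input: {},
      started_at: new Date().toISOString(),
      traceId: 'cron'
    }
  })
  logger.info('⏰ Nightly batch started')
}
```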
The step primitive makes all extensions natural and straightforward—every new capability follows the same unified pattern.
Ready to unify your multi-language systems? Start building with steps today!