motia open source backend github what is it what can it be used for
Motia is an open-source backend framework hosted on GitHub at MotiaDev/motia. It unifies APIs, background jobs, queues, workflows, streams, AI agents, observability, and state management around a single core primitive called a "Step."
Core Concept
Motia addresses backend fragmentation by letting developers use one system for all common patterns, supporting JavaScript, TypeScript, and Python (stable), Ruby (beta), and Go (upcoming). Steps act like React components for the backend, handling everything from REST endpoints to event-driven AI workflows with zero configuration and built-in tracing.
Key Uses
- Building event-driven APIs, schedulers, and real-time streaming apps.
- Creating multi-language workflows, such as TypeScript APIs calling Python processors (e.g., chess evaluation with Stockfish).
- Developing AI agents for tasks like web research, email automation, GitHub PR management, or market analysis.
- Production setups with authentication, leaderboards, and one-click cloud deployment.
Getting Started
Run `npx motia@latest create -i` for a new project, then `npx motia dev` to launch a workbench at localhost:3000 with visual debugging. The repo includes 20+ examples, docs, and Discord support.
Follow-ups
- How do I install and start a Motia project?
- What are some example use cases for Motia?
- How does the Step primitive work in Motia?
- Motia vs Express and Celery comparison
- Motia tutorial for building an AI agent backend
Motia projects are bootstrapped and run via npx for quick starts across languages like TypeScript, JavaScript, and Python. Follow the steps below to create and launch your first app.
Installation Steps
- Run `npx motia@latest create` in your terminal and follow the prompts for template, name, and language.
- Navigate to the new project folder.
- Start the dev server with `npx motia dev`, which launches a workbench at http://localhost:3000 for visual debugging, APIs, and tracing.
This sets up REST APIs, event-driven flows, and zero-config tools instantly.
Example Use Cases
Motia excels in unified backends blending APIs, jobs, and AI.
- ChessArena.ai: production app with auth, multi-LLM evaluation (OpenAI/Claude/Gemini), Python Stockfish integration, real-time streaming, and leaderboards.
- AI Research Agent: web research with iterative analysis.
- Streaming Chatbot: real-time AI responses.
- Gmail Automation: smart email processing.
- GitHub PR Manager: automated workflows.
- Finance Agent: market analysis.
- Pet store backend tutorial: APIs, background jobs, workflows, AI decisions, real-time updates.
Step Primitive Explained
A Step is Motia's single core unit, like a React component for the backend. Steps handle APIs, events, crons, queues, workflows, AI agents, state, streams, and observability uniformly. Each Step file has a config (name; a type such as 'api', 'event', or 'cron'; triggers such as path/method/subscribes/cron; and the topics it emits) and a handler function containing the business logic, with access to emit, logger, state, and streams. Motia auto-discovers Steps and wires them together event-driven (e.g., an API Step emits 'message.sent', and an event Step subscribes, processes, and emits the next topic), supports mixing languages (a TypeScript API calling a Python worker), and provides built-in tracing.
Example API + Event flow:

```typescript
// send-message.step.ts -- API Step: POST /messages emits 'message.sent'
export const config = { name: 'SendMessage', type: 'api', path: '/messages', method: 'POST', emits: ['message.sent'] };
export const handler = async (req, { emit }) => {
  await emit({ topic: 'message.sent', data: req.body });
  return { ok: true };
};

// process-message.step.ts -- Event Step: consumes 'message.sent'
export const config = { name: 'ProcessMessage', type: 'event', subscribes: ['message.sent'] };
export const handler = async (input) => { /* process */ };
```
Motia vs Express/Celery
| Aspect | Motia | Express (Node API) | Celery (Python Jobs) |
|---|---|---|---|
| Scope | Unified: APIs/jobs/queues/workflows/AI/streams/observability | HTTP APIs only | Background jobs/queues only |
| Patterns | Single Step primitive, event-driven auto-wiring | Routes/middleware, manual | Tasks/brokers (Redis), manual |
| Multi-lang | JS/TS/Python/Ruby/Go | JS/TS only | Python only |
| Observability | Built-in tracing/workbench | Add-ons (e.g., Prometheus) | Flower/Loguru add-ons |
| Setup/Deploy | npx motia dev; one-click cloud | npm install; manual server | pip install; broker config |
| AI/Agents | Native workflows/streams | Custom integrations | Custom tasks |
Motia eliminates "tool stitching" (Express + BullMQ + cron + Socket.io), using one pattern for all.
AI Agent Tutorial
Start with Motia's pet store guide or AI examples for agentic backends.
- Create a project: `npx motia@latest create` (pick an AI template).
- Define an API Step to receive the query and emit 'research.start'.
- Add Event Steps for agents: e.g., a researcher (fetch web pages, analyze with an LLM) and a synthesizer (summarize), which emits 'research.done'.
- Use the workbench to test traces; add streams for real-time updates.
- Deploy to Motia Cloud.
See the ChessArena.ai source for multi-agent LLM orchestration or AGENTS.md for Cursor/OpenAI integration. Video tutorials cover full builds.
Follow-ups
- Motia pet store tutorial walkthrough
- Build AI agent workflow with Motia Steps
- Motia Step file code examples for streaming
- How to deploy Motia app to production
- Motia observability and debugging features
Motia's pet store tutorial builds a full backend with APIs, jobs, workflows, AI agents, and streaming. It covers pet management, adoption flows, feeding reminders, health decisions, cleanup jobs, and real-time updates.
Pet Store Walkthrough
- Create a project: `npx motia@latest create` (select the pet store template).
- Run `npx motia dev` for the workbench at localhost:3000.
- Build APIs (create/update/list pets), workflows (adoption: quarantine → health check → available), cron jobs (feeding/cleanup), AI agents (symptom evaluation), and streaming (live progress).
- Test visually: APIs return responses, jobs process async, agents decide intelligently, streams push updates.
- Expand to production-ready with auth and persistence.
Full code/examples in docs and GitHub.
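The feeding/cleanup cron jobs could look like the sketch below; the config fields mirror the other Step examples in this document, and the exact cron Step schema is an assumption:

```typescript
// feeding-reminder.step.ts -- hypothetical cron Step sketch (field names follow
// the other Step examples here; the real Motia schema may differ).
type Emit = (e: { topic: string; data?: any }) => Promise<void>;

export const config = {
  name: 'FeedingReminder',
  type: 'cron',
  cron: '0 9 * * *', // every day at 09:00
  emits: ['pet.feed.due'],
};

// On each tick the handler just emits a topic; a downstream event Step
// subscribed to 'pet.feed.due' would send the actual reminder.
export const handler = async ({ emit }: { emit: Emit }) => {
  await emit({ topic: 'pet.feed.due', data: { at: new Date().toISOString() } });
};
```

Keeping the cron Step emit-only leaves the notification logic in a separate, independently testable event Step.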
Building AI Agent Workflow
- Define API Step: POST /research emits 'research.start'.
- Event Steps chain: 'researcher' (web fetch/LLM analyze) → emit 'research.synthesize'; 'synthesizer' (summarize) → emit 'research.done'.
- Use agentic Steps for LLM calls (e.g., OpenAI/Claude).
- Add state for persistence, emit for orchestration.
Example from ChessArena.ai: API → Python Stockfish eval → LLM judge → stream results.
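A minimal in-memory sketch of this emit/subscribe chaining (not Motia's runtime, just an illustration of how topics wire Steps together, using the topic names from the bullets above):

```typescript
// Tiny in-memory dispatcher illustrating topic-based Step chaining.
type Emit = (e: { topic: string; data?: any }) => Promise<void>;
type Handler = (data: any, emit: Emit) => Promise<void>;

const subscribers = new Map<string, Handler[]>();
const subscribe = (topic: string, h: Handler) =>
  subscribers.set(topic, [...(subscribers.get(topic) ?? []), h]);
const emit: Emit = async (e) => {
  for (const h of subscribers.get(e.topic) ?? []) await h(e.data, emit);
};

const results: string[] = [];
// 'researcher' Step: subscribes to research.start, emits research.synthesize
subscribe('research.start', async (data, emit) => {
  results.push(`researched: ${data.query}`);
  await emit({ topic: 'research.synthesize', data });
});
// 'synthesizer' Step: subscribes to research.synthesize, emits research.done
subscribe('research.synthesize', async (data, emit) => {
  results.push('synthesized');
  await emit({ topic: 'research.done', data });
});
subscribe('research.done', async () => { results.push('done'); });

await emit({ topic: 'research.start', data: { query: 'motia' } });
// results is now ['researched: motia', 'synthesized', 'done']
```

Motia discovers and wires real Steps from their config files; here the wiring is done by hand only to show the chain.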
Streaming Step Examples
Configure the stream schema first (streams.ts):

```typescript
// streams.ts
import { z } from 'zod';

export const petCreation = {
  name: 'petCreation',
  schema: z.object({ status: z.string(), progress: z.number() }),
};
```

Then push updates from a Step handler:

```typescript
export const handler = async ({ streams }) => {
  const stream = streams.petCreation; // auto-available from streams.ts
  await stream.push({ status: 'Creating...', progress: 0 });
  // ...process...
  await stream.push({ status: 'Done!', progress: 100 });
};
```
APIs initialize the stream and return its ID so clients can subscribe via SSE; multiple Steps can update the same stream. Agentic Steps can stream LLM progress (e.g., bio/breed generation).
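On the client, those SSE updates might be consumed with the browser's EventSource API; the wire format below (one JSON payload per `data:` frame) is an assumption about the stream endpoint, shown here with a small parser sketch:

```typescript
// Sketch: parse SSE frames into JSON payloads. The frame shape (a JSON object
// per `data:` line) is an assumption, not Motia's documented wire format.
function parseSse(raw: string): Array<{ event?: string; data: any }> {
  return raw.split('\n\n').filter(Boolean).map((frame) => {
    const out: { event?: string; data: any } = { data: null };
    for (const line of frame.split('\n')) {
      if (line.startsWith('event:')) out.event = line.slice(6).trim();
      if (line.startsWith('data:')) out.data = JSON.parse(line.slice(5).trim());
    }
    return out;
  });
}

const frames = parseSse(
  'data: {"status":"Creating...","progress":0}\n\n' +
  'data: {"status":"Done!","progress":100}\n\n'
);
// frames[0].data.progress === 0, frames[1].data.progress === 100
```

In a browser, `new EventSource(url)` with an `onmessage` handler would receive these frames one at a time instead of as a single string.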
Production Deployment
- Motia Cloud: `npx motia deploy` or the web interface (connect localhost:3000, then deploy with env vars). Zero-config for APIs/jobs/agents; monitor traces.
- Self-Hosted: use motia-docker for custom infra (e.g., Kubernetes). Preview in the workbench; supports multi-language Steps, auth, and persistence.
Observability Features
Motia provides zero-config tracing, logs, and workbench visualization.
- Workbench: Visual flow diagrams, chronological logs, traceId filtering (follow request end-to-end), state/logs inspection.
- Tracing: Every API/event/job auto-traced; debug mode for verbose logs.
- Logging: structured and contextual, suited to production monitoring. Click flows/logs for instant debugging; no manual instrumentation needed.
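As a sketch of what structured logging and state access can look like inside a handler (the `logger` and `state` names follow the Step description in this document; their exact signatures are assumptions):

```typescript
// Hypothetical handler sketch: contextual logging plus keyed state access.
type Ctx = {
  logger: { info: (msg: string, meta?: any) => void };
  state: {
    set: (scope: string, key: string, value: any) => Promise<void>;
    get: (scope: string, key: string) => Promise<any>;
  };
};

export const handler = async (input: { petId: string }, { logger, state }: Ctx) => {
  logger.info('reviewing pet', { petId: input.petId }); // structured; tracing attaches the traceId
  await state.set('pets', input.petId, { status: 'under-review' });
  return state.get('pets', input.petId);
};
```

Because logs are structured, the workbench can filter them by traceId without any instrumentation in the handler itself.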
Follow-ups
- Build AI agent for pet health decisions in Motia
- Motia streams configuration and usage tutorial
- Self-hosted deployment with motia-docker
- Common debugging issues in Motia workflows
Motia includes a pet health AI agent in its tutorial that analyzes symptoms like coughing or lethargy to decide on treatment needs. It uses agentic Steps for intelligent decisions without hardcoded rules.
Pet Health AI Agent Build
- Create the pet store project: `npx motia@latest create` (pet template).
- Add a Health Review agentic Step (agentic-health-review.step.ts):
```typescript
// agentic-health-review.step.ts
export const config = {
  name: 'HealthReview',
  type: 'agentic',
  subscribes: ['pet.health.review'],
  emits: ['pet.treatment.needed', 'pet.health.ok'],
};

export const handler = async ({ input, emit, llm }) => {
  const decision = await llm.chat({
    model: 'gpt-4o-mini',
    messages: [
      { role: 'system', content: 'Evaluate pet symptoms...' },
      { role: 'user', content: input.symptoms },
    ],
  });
  if (decision.includes('treatment')) {
    await emit({ topic: 'pet.treatment.needed', data: { petId: input.petId } });
  } else {
    await emit({ topic: 'pet.health.ok' });
  }
};
```
- Trigger via workflow: New pet → emit 'pet.health.review'.
- Test in the workbench: create a pet, view the AI decision trace and the status update.
View the full example on GitHub.
Streams Config and Usage
Define streams in streams.ts for auto-availability in handlers:

```typescript
// streams.ts
import { z } from 'zod';

export const petUpdates = {
  name: 'petUpdates',
  schema: z.object({ status: z.string(), progress: z.number() }),
};
```

In a Step handler:

```typescript
export const handler = async ({ streams }) => {
  const stream = streams.petUpdates;
  await stream.push({ status: 'Updating...', progress: 50 });
  // ...logic...
  await stream.push({ status: 'Complete', progress: 100 });
};
```
Clients subscribe via SSE (using the ID from the API response). Streams support groups, auth (a canAccess function), and ephemeral events (send). Well suited to streaming AI/LLM tokens.
Self-Hosted with motia-docker
- In the project root, run `npx motia docker setup` (creates a Dockerfile and .dockerignore).
- For Python Steps: add a requirements.txt and uncomment the Python section in the Dockerfile.
- Add a docker-compose.yml with Redis:

```yaml
services:
  motia:
    build: .
    ports:
      - "3000:3000"
    environment:
      REDIS_URL: redis://redis:6379
  redis:
    image: redis:alpine
```

- Run `docker-compose up`; scale instances (Redis shares state/events). Deploy to Fly.io/Railway/K8s; load balance for production.
Common Debugging Issues
- Step config errors: invalid type/path (api/event/cron); fix schema validation.
- Dependency conflicts: Node < 20.11 or pnpm issues; use `volta install node@20.11.1` and `pnpm store prune`.
- Infinite recursion: stream wrapper reuse; avoid reusing stream objects.
- Logs disappear: workbench refresh bug; check the console/traceId.
- Workflow stalls: no emit/subscribe match; verify topics in workbench traces.
- Performance: slow Steps (optimize async/DB calls), memory (clean up resources), state access (batch). Use the workbench for visual traces, logs, and filtering.