copilotkit what is its function
CopilotKit is an open-source framework for embedding AI copilots, chatbots, and in-app agents directly into web applications (primarily React/Next.js), giving them access to real-time app state and the ability to act on the UI and backend.
Core Function
CopilotKit’s main function is to provide the in-app “agentic frontend” layer: UI components, hooks, and infrastructure that connect LLMs/agent frameworks to your product’s state and actions.
- It lets you integrate AI copilots that can read your app’s context (forms, tables, session state) and generate contextual suggestions or actions.
- It exposes mechanisms for agents to trigger frontend and backend actions (e.g., updating fields, calling APIs) while keeping a human-in-the-loop UI.
Key Capabilities
- React/Next.js integration: Pre-built chat UI, headless hooks (e.g., `useCopilotChat`), and components to drop into existing apps.
- Agent integration layer: Supports plugging in external agent frameworks like LangGraph, CrewAI, MCP, etc., and standardizes how they emit events to the UI.
- Context grounding: Stream your app’s live state into the copilot so it can reason over current user data, not just static prompts.
- Frontend & backend actions: Define tools the agent can use to manipulate UI state or call backend services, enabling multi-step workflows and task automation.
- Observability & guardrails: Built-in tracing, error surfacing, and hooks for monitoring, plus mechanisms for prompt protection and controlling agent actions.
Typical Use Cases
- SaaS copilots: Embedded assistants that guide users through domain workflows (CRM, project management, analytics, etc.).
- Task automation: In-app agents that can, for example, identify unvalidated users and send emails or update records via your own APIs.
- Co-creation & productivity: Agents that help users write, plan, research, or fill complex forms with generative UI elements inside your product.
How It Fits Architecturally
- You keep your existing backend / agent stack (LangGraph, CrewAI, custom orchestrators); CopilotKit provides the frontend integration and real-time shared state.
- It can be used self-hosted or with CopilotKit Cloud for orchestration and tracing, depending on your infra and compliance needs.
If you want, next step could be: “Given a React/Next.js app and a LangGraph-based multi-agent backend, outline an architecture and minimal code skeleton to integrate CopilotKit as the agentic frontend.”
How does CopilotKit compare to LangChain for agents? What components are included in CopilotKit's React UI? How do you install and initialize CopilotKit in a new app? Which agent frameworks does CopilotKit support? How do you connect CopilotKit to backend services and APIs?
CopilotKit is a frontend and protocol layer for user-facing agents, while LangChain is primarily a backend framework for building LLM chains and tools; they complement rather than substitute for each other.
CopilotKit vs LangChain (for agents)
- LangChain focuses on building chains/agents, tool calling, retrievers, and integrations on the backend; it is not opinionated about UI or human-in-the-loop workflows.
- CopilotKit assumes you already have an agent (LangGraph, LangChain, CrewAI, etc.) and solves the “agent ↔ product UX” layer using AG-UI/MCP/A2A protocols, shared state, and generative UI.
- CopilotKit can consume LangChain/LangGraph-based agents as backend actions (direct LangChain.js, LangServe, or LangGraph CoAgents), making them visible and controllable from your app.
When to use which
- Use LangChain to implement the agent logic, tools, memory, retrieval, workflows, and orchestration.
- Use CopilotKit to expose that agent to users: chat surfaces, progress UIs, UI tool calls, and multi-step interactive flows inside React/Next.js, mobile, or other clients via AG-UI.
React UI components in CopilotKit
Common React-side building blocks include:
- `<CopilotKit>` provider: top-level context that wires your app to a runtime URL / CopilotKit Cloud / AG-UI endpoint and config (agent id, API keys, options).
- `<CopilotChat>`: production-ready chat interface with messages, streaming, tool call visualizations, status, and labels (title, initial message).
- Headless hooks & utilities (e.g., chat/message hooks, state hooks): let you build fully custom UIs while still speaking AG-UI/CopilotKit protocols.
- Generative UI / frontend tools components: let agents render UI elements or trigger UI actions (e.g., suggestions, editable blocks, forms) via tool calls.
Install and initialize in a new app
Minimal Next.js/React setup (direct to CopilotKit Cloud or your own runtime):
- Install packages
npm install @copilotkit/react-core @copilotkit/react-ui
# or
yarn add @copilotkit/react-core @copilotkit/react-ui
- Wrap your app with the provider
// e.g. app/layout.tsx or src/main.tsx
import { CopilotKit } from "@copilotkit/react-core";
import "@copilotkit/react-ui/styles.css";
export default function RootLayout({ children }) {
return (
<html>
<body>
<CopilotKit
runtimeUrl="https://api.copilotkit.ai/runtime" // or your /api/copilotkit
agent="my-agent-id"
>
{children}
</CopilotKit>
</body>
</html>
);
}
- Drop in a chat component
// e.g. app/page.tsx
import { CopilotChat } from "@copilotkit/react-ui";
export default function Page() {
return (
<main>
<CopilotChat
labels={{
title: "Your Assistant",
initial: "Hi! 👋 How can I assist you today?",
}}
/>
</main>
);
}
For CopilotKit Cloud, you add your public API key per the quickstart and configure the `runtimeUrl`/`agent` props accordingly.
Agent frameworks CopilotKit supports
CopilotKit integrates frameworks mainly via the AG-UI protocol and related adapters.
- First-class integrations (AG-UI): LangGraph (CoAgents), CrewAI crews, Mastra, and other frameworks documented in the AG-UI / Python SDK docs.
- LangChain-based stacks: direct LangChain.js actions, LangServe-hosted chains, and LangGraph CoAgents.
- Additional ecosystems via AG-UI/MCP/A2A: frameworks like AutoGen, LlamaIndex, Pydantic AI, and custom agents that expose an AG-UI endpoint can be consumed.
Connecting to backend services and APIs
CopilotKit gives multiple integration patterns depending on how your backend is structured.
1) Direct backend actions (TypeScript / Node.js)
Define backend actions in a CopilotRuntime, then expose them to the UI:
- TypeScript/Node actions: functions the agent can call (e.g., send email, update DB) configured in CopilotRuntime.
- LangChain.js actions: wrap LangChain chains as actions (e.g., call a chain that queries a vector store plus tools).
- LangServe integration: connect to LangChain chains hosted elsewhere via HTTP (LangServe).
2) Python SDK and remote agents
Using copilotkit-python, you define remote actions/agents and expose them via FastAPI/other frameworks as AG-UI endpoints that CopilotKit can consume.
- The SDK lets you wrap agents built in LangGraph, CrewAI, Blaxel, etc., and publish them as `/copilotkit` endpoints.
- Your React app then points `runtimeUrl` at that endpoint, and calls become HTTP/SSE requests into your service.
3) Generic external services / MCP
- CopilotKit can connect to MCP-compatible servers, auto-discover tools/resources, and expose them as native actions in the in-app copilot.
- Developers or end users configure MCP endpoints; the copilot can then call those tools, stream context, and render tool-specific UI.
If you want, next prompt you could use:
“Design a production-ready architecture to use LangGraph + CopilotKit in a Next.js app, including AG-UI endpoints, backend actions (Node/Python), and example code for one multi-step business workflow.”
Design a production-ready architecture to use LangGraph + CopilotKit in a Next.js app, including AG-UI endpoints, backend actions (Node/Python), and example code for one multi-step business workflow. Compare CopilotKit and LangChain for multi-agent orchestration. What are the benefits of CopilotKit over LangChain for React apps? Which CopilotKit React components show agent progress? How do you customize CopilotKit headless UI components? Give a step-by-step guide to install CopilotKit in a new React app.
CopilotKit plus LangGraph in a Next.js app gives a production-ready split where LangGraph handles multi-agent orchestration in the backend and CopilotKit provides the AG-UI–driven, human-in-the-loop React frontend for progress, tools, and shared state.
Architecture: LangGraph + CopilotKit + Next.js
A pragmatic production layout:
- Frontend (Next.js / React)
- Backend “CopilotRuntime” (Node/TypeScript)
- Python agent service (LangGraph)
AG-UI / state flow
- User interacts with `<CopilotChat>` or custom UI.
- CopilotKit sends prompts and UI state → Next.js `/api/copilotkit` → Node actions and/or Python LangGraph via remote actions.
- The LangGraph CoAgent emits graph node transitions and intermediate state via AG-UI → back to CopilotKit → rendered through generative UI / `useCoAgentStateRender`.
Example: Multi-step “Invoice Reconciliation” workflow
Business flow: “Given uploaded purchase orders and bank transactions, reconcile invoices, flag mismatches, and draft emails to suppliers.”
Backend: Node/TypeScript CopilotRuntime (Next.js route)
app/api/copilotkit/route.ts (Next.js 13+/App Router style):
// app/api/copilotkit/route.ts
import { CopilotRuntime } from "@copilotkit/backend";
const BASE_URL = process.env.LANGGRAPH_AGENT_BASE_URL!; // e.g. https://lg-agents.internal
const runtime = new CopilotRuntime({
actions: ({ properties, url }) => [
// Native TS action: mark invoice as reconciled in DB
{
name: "mark_invoice_reconciled",
description: "Mark an invoice as reconciled in the internal database.",
parameters: [
{ name: "invoiceId", type: "string", required: true },
{ name: "amount", type: "number", required: true },
],
handler: async ({ invoiceId, amount }) => {
// call internal service/ORM here
await fetch(process.env.INTERNAL_API_URL + "/invoices/reconcile", {
method: "POST",
headers: { "Content-Type": "application/json" },
body: JSON.stringify({ invoiceId, amount }),
});
return { status: "ok" };
},
},
],
remoteActions: [
{
// LangGraph + copilotkit-python AG-UI endpoint
url: `${BASE_URL}/copilotkit`,
name: "langgraph_invoice_reconciliation",
description: "Multi-step LangGraph agent for reconciling invoices.",
},
],
});
export const GET = runtime.createHandler();
export const POST = runtime.createHandler();
This lets the LangGraph agent orchestrate the high-level workflow and optionally call mark_invoice_reconciled when ready.
Backend: Python + LangGraph service with AG-UI
Illustrative sketch using copilotkit-python and LangGraph CoAgents (exact class and router names vary by SDK version):
# app.py (FastAPI service)
from fastapi import FastAPI
from copilotkit import CopilotKitAgent, copilotkit_router
from my_langgraph import build_invoice_graph # your LangGraph definition
app = FastAPI()
graph = build_invoice_graph()
agent = CopilotKitAgent.from_langgraph(
graph=graph,
name="invoice_reconciliation_agent",
description="Reconcile invoices using POs and bank transactions.",
)
app.include_router(
copilotkit_router(agent),
prefix="/copilotkit",
)
Your LangGraph graph might have nodes like: ingest_docs → match_invoices → detect_anomalies → draft_emails, emitting state (e.g., current node, matched list, pending approvals) that CopilotKit can render via generative UI.
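Framework aside, that node pipeline can be sketched as plain functions threading a shared state dict, which mirrors how LangGraph nodes transform graph state. This is a hypothetical, stdlib-only sketch; the node logic and state keys are placeholders, not CopilotKit or LangGraph APIs:

```python
# Framework-agnostic sketch of the invoice graph's node pipeline.
# Each node takes the shared state dict and returns an updated copy.

def ingest_docs(state: dict) -> dict:
    return {**state, "current_node": "ingest_docs", "docs_loaded": True}

def match_invoices(state: dict) -> dict:
    # Placeholder matching: pair POs and transactions by amount.
    matches = [
        (po, tx)
        for po in state.get("purchase_orders", [])
        for tx in state.get("transactions", [])
        if po["amount"] == tx["amount"]
    ]
    return {**state, "current_node": "match_invoices", "matched": matches}

def detect_anomalies(state: dict) -> dict:
    matched_pos = {po["id"] for po, _ in state["matched"]}
    anomalies = [
        po for po in state.get("purchase_orders", [])
        if po["id"] not in matched_pos
    ]
    return {**state, "current_node": "detect_anomalies", "anomalies": anomalies}

def draft_emails(state: dict) -> dict:
    drafts = [f"Supplier query for {po['id']}" for po in state["anomalies"]]
    return {**state, "current_node": "draft_emails", "drafts": drafts}

def run_pipeline(initial: dict) -> dict:
    state = initial
    for node in (ingest_docs, match_invoices, detect_anomalies, draft_emails):
        state = node(state)  # in LangGraph, edges drive this sequencing
    return state

final_state = run_pipeline({
    "purchase_orders": [{"id": "PO-1", "amount": 100}, {"id": "PO-2", "amount": 55}],
    "transactions": [{"amount": 100}],
})
```

Each intermediate state (with `current_node`, `matched`, `anomalies`) is exactly the kind of payload a CoAgent would stream to the UI as snapshots and deltas.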
Frontend: Next.js + CopilotKit
// app/layout.tsx
import { CopilotKit } from "@copilotkit/react-core";
import "@copilotkit/react-ui/styles.css";
export default function RootLayout({ children }: { children: React.ReactNode }) {
return (
<html>
<body>
<CopilotKit runtimeUrl="/api/copilotkit">
{children}
</CopilotKit>
</body>
</html>
);
}
app/page.tsx with chat and progress UI using CoAgents hooks:
// app/page.tsx
"use client";

import { CopilotChat } from "@copilotkit/react-ui";
import { useCoAgent } from "@copilotkit/react-core";

// Shape of the state emitted by the LangGraph agent (illustrative).
type ReconciliationState = {
current_node?: string;
matched_count?: number;
error_message?: string;
summary?: string;
};

export default function Home() {
const { state } = useCoAgent<ReconciliationState>({
name: "langgraph_invoice_reconciliation",
});

return (
<main className="flex gap-4">
<section className="w-2/3">
<CopilotChat
labels={{
title: "Invoice Copilot",
initial: "Upload POs and bank CSVs, and I’ll reconcile invoices.",
}}
/>
</section>
<aside className="w-1/3 border-l pl-4">
<h2>Agent Progress</h2>
{state?.error_message ? (
<p>Error: {state.error_message}</p>
) : state?.summary ? (
<p>Reconciliation complete: {state.summary}</p>
) : state?.current_node ? (
<div>
<p>Current stage: {state.current_node}</p>
<p>Matched invoices: {state.matched_count ?? 0}</p>
</div>
) : (
<p>No active reconciliation.</p>
)}
</aside>
</main>
);
}
This shows LangGraph’s progress (via CoAgents) alongside the chat, with Node/Python actions behind the scenes.
CopilotKit vs LangChain for multi-agent orchestration (React context)
Conceptual split
- LangChain / LangGraph: implement the agent logic, tools, memory, retrieval, and multi-agent orchestration on the backend.
- CopilotKit: expose those agents to users — chat surfaces, shared state, human-in-the-loop approvals, and generative UI inside the app.
Benefits of CopilotKit over LangChain for React apps
For in-app AI experiences specifically:
- Purpose-built React integration
- Provides `<CopilotChat>`, `<CopilotSidebar>`, `<CopilotPopup>`, `useCopilotChat`, `useCoAgent`, etc., for production-ready chat and agent UIs.
- LangChain ships no official React UI toolkit; React UX is entirely custom.
- AG-UI generative UI & progress: agents stream state and render UI elements in real time, not just text.
- Frontend-aware backend actions: tools can read app context and manipulate UI state as well as call backend services.
- Human-in-the-loop patterns out-of-the-box: approval steps and editable agent output are first-class UI concerns.
CopilotKit React components for agent progress
Progress / state visualization is covered by:
- `<CopilotChat>`: chat surface that streams messages and visualizes tool calls and status inline.
- `<CopilotSidebar>` & `<CopilotPopup>`: alternate layouts for chat + agent cards, useful for narrow side panels with task status.
- Hooks for fine-grained progress:
  - `useCoAgent`: subscribe to CoAgent (LangGraph, Pydantic AI, etc.) state, including node, status, and intermediate outputs.
  - `useCoAgentStateRender`: map agent states to your own React components: progress bars, timelines, step indicators, etc.
  - Generative UI APIs to render agent-emitted UI definitions in real time.
Customizing headless UI components
CopilotKit supports full BYO UI while reusing logic:
- Headless chat
  - `useCopilotChat` gives a messages array, send function, streaming state, tool call events, etc., letting you render a completely custom chat surface.
  - You control markup, styling (Tailwind, MUI, etc.), and concurrency UX (e.g., disabling input during tool calls).
- CoAgents state rendering: `useCoAgentStateRender` lets you swap the default agent-state display for your own components.
- Custom sub-components
- Docs describe “Custom Sub-Components” where you override built-in blocks (e.g., agent state component) instead of using defaults.
- Good for plugging CopilotKit into an existing design system while keeping protocol and wiring intact.
Step-by-step: install CopilotKit in a new React/Next app
High-level steps for a new React or Next.js project:
- Create the app (if needed)
- Next.js: npx create-next-app@latest my-app
- React + Vite: npm create vite@latest my-app -- --template react-ts
- Install CopilotKit packages
cd my-app
npm install @copilotkit/react-core @copilotkit/react-ui @copilotkit/backend
# or: yarn add ...
- Set up the provider
import { CopilotKit } from "@copilotkit/react-core";
import "@copilotkit/react-ui/styles.css";
export default function RootLayout({ children }) {
return (
<html>
<body>
<CopilotKit runtimeUrl="/api/copilotkit">
{children}
</CopilotKit>
</body>
</html>
);
}
- Create backend runtime route
// app/api/copilotkit/route.ts
import { CopilotRuntime } from "@copilotkit/backend";
const runtime = new CopilotRuntime({
actions: () => [
{
name: "ping",
description: "Simple health-check action.",
parameters: [],
handler: async () => ({ status: "ok" }),
},
],
});
export const GET = runtime.createHandler();
export const POST = runtime.createHandler();
- Add a basic chat UI
// app/page.tsx or src/App.tsx
import { CopilotChat } from "@copilotkit/react-ui";
export default function Page() {
return (
<main className="p-4">
<CopilotChat
labels={{
title: "Assistant",
initial: "Hi! Ask me anything about your app.",
}}
/>
</main>
);
}
- Configure environment & production settings
- Set runtime URL and any CopilotKit Cloud / API keys if using a hosted runtime.
- Add auth, rate limiting, logging, and observability on `/api/copilotkit`.
- Configure Node/Python services (LangGraph, DB, queues) via `actions` and `remoteActions`.
If you want, next iteration can focus on: a full LangGraph graph definition for the invoice workflow (with 3–5 nodes), its AG-UI state schema, and the exact TypeScript/Python types for end-to-end safety.
Show a minimal Next.js project structure for LangGraph plus CopilotKit
Provide AG-UI endpoint schemas and expected event payloads Example Node backend action that calls external APIs and returns progress Example Python backend action for heavy data processing with streaming Sample CoAgent workflow for a multi step order fulfillment process flowchart
A minimal production template is: Next.js app (CopilotKit React UI) → /api/copilotkit route (CopilotRuntime + backend actions) → Python LangGraph service exposing an AG-UI endpoint consumed as a remote CoAgent.
Minimal Next.js + LangGraph + CopilotKit structure
For a Next.js App Router project:
my-app/
app/
layout.tsx
page.tsx
api/
copilotkit/
route.ts # CopilotRuntime + AG-UI bridge to LangGraph
agent-backend/ # separate Python service (LangGraph + AG-UI)
app.py
graph.py
package.json
tsconfig.json
next.config.mjs
Key files:
- `app/layout.tsx`: wraps the app in `<CopilotKit runtimeUrl="/api/copilotkit">`.
- `app/page.tsx`: uses `<CopilotChat>` and CoAgents hooks for progress.
- `app/api/copilotkit/route.ts`: defines `CopilotRuntime` with local `actions` and `remoteActions` pointing at the LangGraph AG-UI endpoint.
- `agent-backend/app.py`: FastAPI (or similar) app exposing a `/copilotkit` streaming SSE endpoint using AG-UI and LangGraph.
AG-UI endpoint schemas and event payloads
AG-UI is an event-based protocol over SSE/WebSocket that streams structured JSON events.
HTTP/SSE endpoint contract
- Method/URL: `POST /copilotkit` (or `/agui`) with a JSON body containing the thread id, messages, and state.
- Response: `Content-Type: text/event-stream`, streaming SSE events, each with `data: { ...ag-ui-event-json... }`.
Simplified request schema (conceptual):
{
"thread_id": "string",
"messages": [
{ "id": "string", "role": "user|assistant|system|tool", "content": "..." }
],
"state": { "any": "json" }
}
Core AG-UI event shapes
AG-UI defines ~17 event types, grouped into categories.
- Lifecycle events (run progress)
RUN_STARTED:
{
"type": "RUN_STARTED",
"thread_id": "string",
"run_id": "string",
"timestamp": "iso-8601",
"metadata": { "agent": "order_fulfillment" }
}
- Text message events (chat streaming)
TEXT_MESSAGE_START:
{
"type": "TEXT_MESSAGE_START",
"message_id": "uuid",
"role": "assistant",
"thread_id": "string"
}
TEXT_MESSAGE_CONTENT (token/segment streaming):
{
"type": "TEXT_MESSAGE_CONTENT",
"message_id": "uuid",
"delta": "partial text"
}
TEXT_MESSAGE_END to finalize.
- Tool call events (external APIs / actions)
TOOL_CALL_START:
{
"type": "TOOL_CALL_START",
"tool_call_id": "uuid",
"tool_name": "reserve_inventory",
"arguments": { "order_id": "123", "sku": "SKU-42" }
}
TOOL_CALL_RESULT:
{
"type": "TOOL_CALL_RESULT",
"tool_call_id": "uuid",
"output": { "status": "reserved", "quantity": 3 }
}
- State management events (shared UI/agent state)
STATE_SNAPSHOT:
{
"type": "STATE_SNAPSHOT",
"thread_id": "string",
"snapshot": {
"step": "PACKING",
"order": { "id": "123", "items": [/*...*/] }
}
}
STATE_DELTA (JSON-Patch-style):
{
"type": "STATE_DELTA",
"thread_id": "string",
"delta": [
{ "op": "replace", "path": "/step", "value": "SHIPPING" }
]
}
CopilotKit and AG-UI SDKs handle encoding/decoding; you mostly emit domain-specific state.
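To make the delta semantics concrete, here is a minimal hand-rolled sketch (stdlib only; real clients use a full RFC 6902 JSON Patch library) that applies `replace` operations from a `STATE_DELTA` to a snapshot:

```python
# Sketch: apply RFC 6902 "replace" operations from a STATE_DELTA event to a
# state snapshot. Handles only the common "replace" case; a real client
# supports all six JSON Patch ops.

def apply_state_delta(state: dict, delta: list[dict]) -> dict:
    new_state = {**state}
    for op in delta:
        if op["op"] != "replace":
            raise NotImplementedError(f"unsupported op: {op['op']}")
        keys = op["path"].strip("/").split("/")
        target = new_state
        for key in keys[:-1]:
            target[key] = {**target[key]}  # copy-on-write down the path
            target = target[key]
        target[keys[-1]] = op["value"]
    return new_state

snapshot = {"step": "PACKING", "order": {"id": "123", "items": []}}
event = {
    "type": "STATE_DELTA",
    "thread_id": "thread-123",
    "delta": [{"op": "replace", "path": "/step", "value": "SHIPPING"}],
}
updated = apply_state_delta(snapshot, event["delta"])
```

The copy-on-write walk means the previous snapshot stays intact, which is the behavior a React client wants so state updates trigger re-renders.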
Example Node backend action calling external APIs with progress
Using @copilotkit/backend actions to call an external order management API and emit tool progress events implicitly through the runtime.
// app/api/copilotkit/route.ts
import { CopilotRuntime } from "@copilotkit/backend";
const runtime = new CopilotRuntime({
actions: () => [
// Tool: check shipment status via external API
{
name: "get_shipment_status",
description: "Get shipment status for a given order id.",
parameters: [
{ name: "orderId", type: "string", required: true },
],
// This handler is called when the agent emits a TOOL_CALL for this action
handler: async ({ orderId }) => {
// Step 1: call external shipping API
const res = await fetch(
`${process.env.SHIPPING_API_URL}/shipments/${orderId}`,
{
headers: {
Authorization: `Bearer ${process.env.SHIPPING_API_TOKEN}`
}
}
);
if (!res.ok) {
return {
status: "error",
message: `Failed to fetch shipment: ${res.status}`
};
}
const data = await res.json();
// Returning structured JSON becomes the TOOL_CALL_RESULT payload
return {
status: "ok",
carrier: data.carrier,
trackingNumber: data.tracking_number,
eta: data.eta,
raw: data
};
}
}
]
});
export const GET = runtime.createHandler();
export const POST = runtime.createHandler();
The runtime automatically emits TOOL_CALL_START and TOOL_CALL_RESULT events around this handler; CopilotKit UI can show “Checking shipment status…” and then detailed results.
Example Python backend action (heavy processing with streaming)
Heavy processing (e.g., scoring items or running batch sims) belongs in Python with LangGraph + AG-UI streaming.
Minimal FastAPI AG-UI endpoint with streaming progress:
# agent-backend/app.py
from fastapi import FastAPI, Request
from fastapi.responses import StreamingResponse
import asyncio
import json

app = FastAPI()

def encode(event: dict) -> str:
    # Serialize one AG-UI event as an SSE frame (the AG-UI SDK's
    # EventEncoder does the equivalent with typed event objects).
    return f"data: {json.dumps(event)}\n\n"

@app.post("/copilotkit")
async def copilotkit_endpoint(request: Request):
    payload = await request.json()
    thread_id = payload.get("thread_id")

    async def event_generator():
        # RUN_STARTED
        yield encode({
            "type": "RUN_STARTED",
            "thread_id": thread_id,
            "run_id": "run-123"
        })
        orders = payload.get("state", {}).get("orders", [])
        total = len(orders)
        # Simulate heavy processing with per-order updates
        for idx, order in enumerate(orders, start=1):
            # STATE_DELTA: update current index
            yield encode({
                "type": "STATE_DELTA",
                "thread_id": thread_id,
                "delta": [
                    {"op": "replace", "path": "/progress/current", "value": idx},
                    {"op": "replace", "path": "/progress/total", "value": total}
                ]
            })
            # TEXT_MESSAGE_CONTENT: stream narrative
            yield encode({
                "type": "TEXT_MESSAGE_CONTENT",
                "message_id": "msg-progress",
                "delta": f"Processing order {order['id']} ({idx}/{total})...\n"
            })
            # Heavy computation placeholder
            await asyncio.sleep(0.2)
        # Final STATE_SNAPSHOT
        yield encode({
            "type": "STATE_SNAPSHOT",
            "thread_id": thread_id,
            "snapshot": {
                "status": "DONE",
                "processed_orders": [o["id"] for o in orders]
            }
        })
        # RUN_FINISHED
        yield encode({
            "type": "RUN_FINISHED",
            "thread_id": thread_id,
            "run_id": "run-123"
        })

    return StreamingResponse(event_generator(), media_type="text/event-stream")
CopilotKit listens, updates UI progress via CoAgents hooks, and displays the streamed text.
Sample CoAgent workflow: multi-step order fulfillment (flowchart)
A CoAgent-backed LangGraph workflow for order fulfillment typically has nodes like:
- INTAKE_ORDER: validate input, normalize order schema, log in state.
- CHECK_INVENTORY: tool calls to inventory system; mark items as available / backordered.
- RESERVE_STOCK: reserve units for available items; update state and emit STATE_DELTA.
- PAYMENT_AUTH: call payment gateway tools; handle failures with human-in-the-loop.
- CREATE_SHIPMENT: create shipment in logistics system; return tracking details.
- NOTIFY_CUSTOMER: draft and possibly send email/SMS; allow user edits via generative UI.
- FINALIZE_ORDER: mark order as fulfilled; produce summary state and timeline.
Textual flowchart (CoAgent-friendly):
[INTAKE_ORDER]
|
v
[CHECK_INVENTORY] --(missing items)--> [HUMAN_REVIEW] --(adjust order)--> [RESERVE_STOCK]
|
v
[RESERVE_STOCK]
|
v
[PAYMENT_AUTH] --(failure)--> [HUMAN_REVIEW_PAYMENT]
|
v
[CREATE_SHIPMENT]
|
v
[NOTIFY_CUSTOMER]
|
v
[FINALIZE_ORDER]
CoAgents side: each node emits AG-UI state snapshots/deltas, and CopilotKit renders them via generative UI and useCoAgentStateRender.
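The flowchart above can be modeled as a plain transition table. This is a stdlib-only sketch (node names taken from the flowchart; the condition keys and `walk` helper are illustrative, not a CoAgents API):

```python
# Sketch: the order-fulfillment flow as a transition table. Each node maps
# condition -> next node; "ok" is the happy path, other keys are branches.

TRANSITIONS = {
    "INTAKE_ORDER": {"ok": "CHECK_INVENTORY"},
    "CHECK_INVENTORY": {"ok": "RESERVE_STOCK", "missing_items": "HUMAN_REVIEW"},
    "HUMAN_REVIEW": {"adjust_order": "RESERVE_STOCK"},
    "RESERVE_STOCK": {"ok": "PAYMENT_AUTH"},
    "PAYMENT_AUTH": {"ok": "CREATE_SHIPMENT", "failure": "HUMAN_REVIEW_PAYMENT"},
    "CREATE_SHIPMENT": {"ok": "NOTIFY_CUSTOMER"},
    "NOTIFY_CUSTOMER": {"ok": "FINALIZE_ORDER"},
}

def walk(start: str, conditions: dict[str, str]) -> list[str]:
    """Follow transitions until a terminal node, recording the path taken."""
    path = [start]
    node = start
    while node in TRANSITIONS:
        edges = TRANSITIONS[node]
        condition = conditions.get(node, "ok")
        # Fall back to the node's first edge if the condition has no entry.
        node = edges.get(condition, next(iter(edges.values())))
        path.append(node)
    return path

happy_path = walk("INTAKE_ORDER", {})
detour = walk("INTAKE_ORDER", {"CHECK_INVENTORY": "missing_items"})
```

In a real CoAgent, each step of `walk` would correspond to a node run that emits a STATE_DELTA updating the current step, which is exactly what the frontend stepper renders.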
If you want, next step could be: “Give concrete LangGraph Python code for that order-fulfillment graph (nodes + edges), plus matching TypeScript types and a useCoAgentStateRender snippet that renders the flow as a live stepper.”
Show a minimal Next.js folder layout for LangGraph and CopilotKit integration. List AG-UI event types with sample JSON payloads. Give the schema for AG-UI tool call and state delta events. Show an example SSE versus WebSocket setup for an AG-UI endpoint. Give a Node backend action example calling a third-party REST API and streaming results.
A minimal Next.js + CopilotKit + LangGraph setup has a small app structure, one CopilotRuntime route, and a separate Python AG-UI service; AG-UI then streams JSON events (messages, tool calls, state snapshots/deltas) over SSE or WebSockets, which CopilotKit renders in React.
Minimal Next.js layout (LangGraph + CopilotKit)
A lean App Router layout:
my-app/
app/
layout.tsx # <CopilotKit> provider
page.tsx # <CopilotChat> + CoAgent progress UI
api/
copilotkit/
route.ts # CopilotRuntime, actions + remote AG-UI agent
agent-backend/ # Python LangGraph + AG-UI
app.py
graph.py
package.json
tsconfig.json
next.config.mjs
- `layout.tsx`: wraps with `CopilotKit` pointing to `/api/copilotkit`.
- `page.tsx`: uses `CopilotChat` and hooks like `useCoAgent` or `useCoAgentStateRender` for LangGraph state.
- `route.ts`: constructs `CopilotRuntime` with backend actions and a remote agent that calls the Python AG-UI endpoint.
- `agent-backend/app.py`: FastAPI (or similar) exposes `/copilotkit` with LangGraph + AG-UI events.
AG-UI event types with sample payloads
AG-UI standardizes events for chat, tools, and state; implementations typically expose them via SSE.
1) Run lifecycle
RUN_STARTED:
{
"type": "RUN_STARTED",
"thread_id": "thread-123",
"run_id": "run-abc",
"timestamp": "2025-11-02T10:00:00Z",
"metadata": { "agent": "order_fulfillment" }
}
RUN_FINISHED:
{
"type": "RUN_FINISHED",
"thread_id": "thread-123",
"run_id": "run-abc",
"timestamp": "2025-11-02T10:00:08Z",
"status": "success"
}
2) Text message (streamed chat)
TEXT_MESSAGE_START:
{
"type": "TEXT_MESSAGE_START",
"message_id": "msg-1",
"thread_id": "thread-123",
"role": "assistant"
}
TEXT_MESSAGE_CONTENT (token/segment streaming):
{
"type": "TEXT_MESSAGE_CONTENT",
"message_id": "msg-1",
"delta": "Checking inventory for your order..."
}
TEXT_MESSAGE_END:
{
"type": "TEXT_MESSAGE_END",
"message_id": "msg-1"
}
3) Tool calls (request + result)
TOOL_CALL_START:
{
"type": "TOOL_CALL_START",
"tool_call_id": "tc-1",
"tool_name": "get_shipment_status",
"arguments": { "orderId": "ORD-12345" }
}
TOOL_CALL_RESULT:
{
"type": "TOOL_CALL_RESULT",
"tool_call_id": "tc-1",
"output": {
"status": "in_transit",
"carrier": "DHL",
"trackingNumber": "TRACK-999",
"eta": "2025-11-04"
}
}
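A client has to pair each result with its start event via `tool_call_id`; a minimal correlation sketch (stdlib only, using the event shapes shown above):

```python
# Sketch: correlate TOOL_CALL_START / TOOL_CALL_RESULT events by tool_call_id,
# as an AG-UI client would when rendering "tool running" vs. "tool done" UI.

def correlate_tool_calls(events: list[dict]) -> dict[str, dict]:
    calls: dict[str, dict] = {}
    for event in events:
        if event["type"] == "TOOL_CALL_START":
            calls[event["tool_call_id"]] = {
                "tool_name": event["tool_name"],
                "arguments": event.get("arguments", {}),
                "status": "running",
            }
        elif event["type"] == "TOOL_CALL_RESULT":
            # Tolerate a result arriving without its start (e.g., reconnect).
            call = calls.setdefault(event["tool_call_id"], {"status": "running"})
            call["output"] = event.get("output")
            call["status"] = "done"
    return calls

events = [
    {"type": "TOOL_CALL_START", "tool_call_id": "tc-1",
     "tool_name": "get_shipment_status", "arguments": {"orderId": "ORD-12345"}},
    {"type": "TOOL_CALL_RESULT", "tool_call_id": "tc-1",
     "output": {"status": "in_transit", "carrier": "DHL"}},
]
calls = correlate_tool_calls(events)
```

While a call is `"running"`, the UI shows a spinner against `tool_name`; once `"done"`, it renders `output` as text, a card, or generative UI.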
4) State management (snapshot + delta)
STATE_SNAPSHOT:
{
"type": "STATE_SNAPSHOT",
"thread_id": "thread-123",
"snapshot": {
"step": "CHECK_INVENTORY",
"progress": { "current": 1, "total": 3 },
"orderId": "ORD-12345"
}
}
STATE_DELTA (RFC 6902 patch):
{
"type": "STATE_DELTA",
"thread_id": "thread-123",
"delta": [
{ "op": "replace", "path": "/step", "value": "CREATE_SHIPMENT" },
{ "op": "replace", "path": "/progress/current", "value": 2 }
]
}
5) Messages snapshot
MESSAGES_SNAPSHOT:
{
"type": "MESSAGES_SNAPSHOT",
"thread_id": "thread-123",
"messages": [
{ "id": "u1", "role": "user", "content": "Fulfill order ORD-12345" },
{ "id": "a1", "role": "assistant", "content": "Got it, I'll start." }
]
}
Schemas: tool call and state delta events
Conceptual JSON schemas (simplified) based on the AG-UI docs.
Tool call events
{
"type": "TOOL_CALL_START",
"tool_call_id": "string",
"tool_name": "string",
"arguments": { "any": "json" }
}
{
"type": "TOOL_CALL_RESULT",
"tool_call_id": "string",
"output": { "any": "json" },
"error": { "message": "string", "code": "string" }
}
Key fields:
- `tool_call_id`: correlation id between start and result.
- `tool_name`: matches the tool schema name exposed to the LLM.
- `arguments`: JSON compatible with the tool’s JSON Schema.
- `output`: arbitrary JSON result sent back to the agent/UI.
State delta events
AG-UI uses RFC 6902 JSON Patch for deltas.
{
"type": "STATE_DELTA",
"thread_id": "string",
"delta": [
{
"op": "add | remove | replace | move | copy | test",
"path": "/path/in/state",
"value": {}
}
]
}
Typical pattern:
- Initial STATE_SNAPSHOT with full state.
- Frequent STATE_DELTA events to update progress, current step, selected entity, etc.
SSE vs WebSocket setup for AG-UI endpoint
Most AG-UI examples use SSE (text/event-stream); WebSockets are possible but less standardized.
SSE endpoint (Node/Next.js API route)
Conceptual Next.js handler emitting AG-UI events over SSE:
// app/api/agui/route.ts
import { NextRequest } from "next/server";
export async function POST(req: NextRequest): Promise<Response> {
const payload = await req.json();
const stream = new ReadableStream({
async start(controller) {
const encoder = new TextEncoder();
function sendEvent(event: unknown) {
const data = JSON.stringify(event);
controller.enqueue(encoder.encode(`data: ${data}\n\n`));
}
// Example: RUN_STARTED
sendEvent({
type: "RUN_STARTED",
thread_id: payload.thread_id,
run_id: "run-1"
});
// Example: STREAMED TEXT
sendEvent({
type: "TEXT_MESSAGE_START",
message_id: "msg-1",
role: "assistant"
});
sendEvent({
type: "TEXT_MESSAGE_CONTENT",
message_id: "msg-1",
delta: "Processing your request..."
});
// Finish message and run
sendEvent({ type: "TEXT_MESSAGE_END", message_id: "msg-1" });
sendEvent({
type: "RUN_FINISHED",
thread_id: payload.thread_id,
run_id: "run-1",
status: "success"
});
controller.close();
}
});
return new Response(stream, {
headers: {
"Content-Type": "text/event-stream",
"Cache-Control": "no-cache",
Connection: "keep-alive"
}
});
}
WebSocket endpoint (conceptual)
AG-UI over WebSocket is just events encoded as JSON frames.
High-level shape:
- Client opens `wss://.../agui`.
- Sends initial context:
{ "type": "CLIENT_HELLO", "thread_id": "thread-123", "messages": [] }
- Server responds with the same event set as in SSE, but each as a WebSocket message:
{ "type": "RUN_STARTED", "thread_id": "thread-123", "run_id": "run-1" }
{ "type": "TEXT_MESSAGE_CONTENT", "message_id": "msg-1", "delta": "..." }
{ "type": "STATE_DELTA", "thread_id": "thread-123", "delta": [ ... ] }
SSE tends to be simpler in serverless/Next.js contexts; WebSockets are better for full bidirectional streaming where the client also sends updates mid-run.
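Whichever transport you choose, the client side reduces to parsing frames into event dicts. Here is a minimal SSE-parsing sketch (stdlib only; real clients use an SSE/AG-UI client library and also handle multi-line `data` fields, comments, and reconnection):

```python
# Sketch: parse a raw SSE stream (as produced by endpoints like the ones
# above) into a list of AG-UI event dicts.
import json

def parse_sse(raw: str) -> list[dict]:
    events = []
    for frame in raw.split("\n\n"):          # SSE frames end with a blank line
        for line in frame.splitlines():
            if line.startswith("data: "):     # ignore comments and other fields
                events.append(json.loads(line[len("data: "):]))
    return events

raw = (
    'data: {"type": "RUN_STARTED", "thread_id": "t-1", "run_id": "run-1"}\n\n'
    'data: {"type": "TEXT_MESSAGE_CONTENT", "message_id": "msg-1", "delta": "Hi"}\n\n'
    'data: {"type": "RUN_FINISHED", "thread_id": "t-1", "run_id": "run-1"}\n\n'
)
events = parse_sse(raw)
```

A WebSocket client skips the framing step entirely: each message is already one JSON event, so only the `json.loads` dispatch remains.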
Node backend action calling third-party REST API and streaming results
With CopilotKit, backend actions are the natural place to encapsulate third-party REST calls; the runtime handles tool events, and the agent can stream textual progress via AG-UI.
Minimal example in a Next.js route:
// app/api/copilotkit/route.ts
import { CopilotRuntime } from "@copilotkit/backend";
const runtime = new CopilotRuntime({
actions: () => [
// Tool: search products in external catalog and return shaped results
{
name: "search_products",
description: "Search third-party product catalog by query string.",
parameters: [
{ name: "query", type: "string", description: "Search query", required: true },
{ name: "limit", type: "number", description: "Max items", required: false },
],
// handler is called when the agent issues TOOL_CALL_START
handler: async ({ query, limit = 10 }) => {
const res = await fetch(
`${process.env.CATALOG_API_URL}/search?q=${encodeURIComponent(
query
)}&limit=${limit}`,
{
headers: {
Authorization: `Bearer ${process.env.CATALOG_API_TOKEN}`
}
}
);
if (!res.ok) {
return {
status: "error",
message: `Catalog API returned ${res.status}`
};
}
const data = await res.json();
// You can trim/shape the payload to avoid overloading the LLM/UI
const items = (data.items ?? []).map((item: any) => ({
id: item.id,
title: item.title,
price: item.price,
url: item.url
}));
// Returning structured data becomes the TOOL_CALL_RESULT output
return {
status: "ok",
total: items.length,
items
};
}
}
]
});
export const GET = runtime.createHandler();
export const POST = runtime.createHandler();
Streaming behavior:
- AG-UI emits TOOL_CALL_START when the agent decides to call search_products.
- When the handler returns, TOOL_CALL_RESULT is streamed to the frontend; CopilotKit can render it as text, cards, or generative UI.
If you want, next request can be: “Give concrete TypeScript types and Zod schemas for the AG-UI tool call and state model for an order-fulfillment CoAgent, plus a useCoAgentStateRender example that renders a 5-step progress bar.”