WHAT IS OPEN CODE AI?

OpenCode AI (often just called OpenCode) is an open‑source AI coding agent that runs in your terminal, desktop, or IDE and connects to a variety of LLM providers to help you read, write, refactor, and debug code.[1][2][3]

What it is

  • OpenCode is a Go-based CLI/TUI application that embeds an AI assistant directly into your terminal workflow instead of a browser or proprietary IDE.[2][4]
  • It is fully open source (MIT-licensed) and designed as a general AI coding agent rather than a single-vendor copilot, letting you plug in OpenAI, Anthropic, Google Gemini, Groq, Together, local models, and more.[5][6][2]

Core capabilities

  • Multi-model coding assistant: interactive chat for generating code, explanations, tests, refactors, and debugging, with session history stored locally in SQLite.[2][5]
  • Deep project understanding: integrates the Language Server Protocol (LSP) to scan your repo, understand types, imports, and dependencies, and reason about multi-file changes without copy/paste.[4][5][2]
  • Tooling & commands: provides slash commands like /init, /plan, /undo, /redo, and /share to analyze a project, plan changes, apply edits, and share sessions.[5]

Architecture & integration

  • Modular Go architecture with clear packages for CLI (Cobra), TUI (Bubble Tea), LLM providers, sessions, LSP, and tool integrations, making it extensible for custom workflows.[4][2]
  • Client–server design that can run locally or connect to remote environments, and can be wired to OpenAI-compatible endpoints such as Docker Model Runner or other self‑hosted gateways.[6][7][2]
  • Offers a terminal UI, desktop app, and IDE extensions, so it can sit alongside your existing editor rather than replacing it.[3][1]

Why people use it

  • Vendor and model flexibility: switch between cloud and local models, or route via OpenAI-compatible endpoints, which is attractive for regulated or privacy‑sensitive work.[6][2][5]
  • Strong terminal-first UX: built for developers who live in tmux/zellij and want an AI collaborator with minimal context-juggling friction.[2][4][5]
  • Community-driven: actively developed on GitHub with a growing user base and star count, encouraging contributions, forks, and custom extensions.[5][2]

If you want, a good next question to dig in could be: “Design an architecture to integrate OpenCode as a tenant-aware coding agent into a multi-tenant SaaS dev platform focused on regulated industries.” [8][9][10]


OpenCode vs GitHub Copilot

| Feature | OpenCode | GitHub Copilot |
|---|---|---|
| Open source | MIT license, fully open | Proprietary |
| Model/provider lock-in | None; swap providers via config | Tied to OpenAI/Anthropic models behind Copilot API |
| Primary interface | Terminal TUI, desktop app, IDE extension | VS Code / JetBrains / Neovim extensions |
| Session persistence | Local SQLite sessions + undo/redo stack | Cloud-based, limited local history |
| Offline support | Yes, via LOCAL_ENDPOINT to any OpenAI-compatible server or local model | No; requires GitHub service |
| LSP integration | Built-in; diagnostics exposed to AI | Limited; relies on IDE's LSP |
| Pricing | Free (you pay your LLM provider) | Subscription ($10–$39/mo per seat) |
| MCP tool extensibility | Native MCP support for custom tool servers | Not supported |

[11]


Supported Models & Providers

OpenCode dynamically discovers models from each provider you configure.

| Provider | Example models |
|---|---|
| OpenAI | GPT-4.1, GPT-4.5 Preview, GPT-4o, O1/O3/O4 families |
| Anthropic | Claude 4 Sonnet/Opus, Claude 3.7 Sonnet, Claude 3.5 Sonnet/Haiku |
| Google (Gemini) | Gemini 2.5, 2.5 Flash, 2.0 Flash/Lite |
| Google VertexAI | Gemini 2.5, 2.5 Flash |
| AWS Bedrock | Claude 3.7 Sonnet |
| Azure OpenAI | GPT-4.1, GPT-4o, O1/O3, O4 Mini |
| Groq | Llama 4 Maverick/Scout, Deepseek R1, QWEN QWQ-32b, Llama 3.3 70b |
| GitHub Copilot | GPT-4o, Claude 3.7 Sonnet, Gemini 2.5 Pro, O3 Mini, O4 Mini |
| OpenRouter | Any model exposed via OpenRouter |
| Local / self-hosted | Any model behind an OpenAI-compatible endpoint (LOCAL_ENDPOINT) |
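As a sketch of how provider credentials and a default agent model might be wired together, assuming the .opencode.json config layout of the Go opencode-ai project (key names and defaults may differ between versions, so treat this as illustrative rather than authoritative):

```json
{
  "providers": {
    "openai": { "apiKey": "sk-...", "disabled": false },
    "anthropic": { "apiKey": "sk-ant-..." }
  },
  "agents": {
    "coder": {
      "model": "claude-3.7-sonnet",
      "maxTokens": 5000
    }
  }
}
```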

Installation on macOS / Linux

curl -fsSL https://opencode.ai/install | bash

Homebrew (macOS & Linux)

brew install opencode-ai/tap/opencode

Go (if you have Go 1.24+)

go install github.com/opencode-ai/opencode@latest

Node / NPM

npm install -g opencode

Arch Linux (AUR)

paru -S opencode-ai-bin   # or yay

Fully Offline with Local Models

Yes, OpenCode can run completely offline:

  1. Point LOCAL_ENDPOINT to an OpenAI-compatible server (e.g., LM Studio, Ollama, llama.cpp server, Docker Model Runner).
  2. Configure your agent model:

{
  "agents": {
    "coder": {
      "model": "local.granite-3.3-2b-instruct@q8_0"
    }
  }
}

  3. No cloud API keys are required once the local endpoint is up.
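A minimal sketch of step 1, assuming an Ollama-style server already listening locally (the port and /v1 path are illustrative and depend on which local server you run):

```shell
# Point OpenCode at a local OpenAI-compatible endpoint. Ollama, LM Studio,
# and llama.cpp's server all expose a /v1 OpenAI-style API by default.
export LOCAL_ENDPOINT=http://127.0.0.1:11434/v1

# Then launch OpenCode as usual; "local.*" model names from the config
# above are routed to this endpoint:
# opencode
```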

LSP & Editor Integration

OpenCode embeds an LSP client that can connect to any language server.

Configuration example

{
  "lsp": {
    "go": { "command": "gopls" },
    "typescript": { "command": "typescript-language-server", "args": ["--stdio"] },
    "rust": { "command": "rust-analyzer" }
  }
}

Capabilities exposed to the AI

  • Diagnostics – The AI can call a diagnostics tool to retrieve errors/warnings from the language server and propose fixes.
  • File watching – LSP servers are notified of edits so diagnostics stay current.

Currently OpenCode only surfaces diagnostics to the agent; the full LSP feature set (hover, go-to-def, completions) is implemented under the hood but not yet wired to the AI.

Editor / Desktop / IDE modes

  • TUI – primary interface; runs in any terminal.
  • Desktop app – Electron wrapper with the same TUI.
  • IDE extensions – integrate with existing editors so the agent can run alongside your workflow.[11]

Note: The original opencode-ai/opencode GitHub repo has been archived; development continues under the project name Crush by the Charm team.


Suggested next prompt

"Show me an architecture diagram for running OpenCode with a local LLM inside a secure, tenant-isolated Kubernetes namespace—targeting HIPAA workloads."


Can it be decomposed away from the OpenCode CLI and integrated into any terminal CLI?

Yes, but not as a neat, documented SDK today; you would be effectively embedding or talking to its core “agent server” rather than dropping a small library into any arbitrary CLI.[12][13][14]

How OpenCode is structured

  • The codebase is modular: Cobra cmd/ for the CLI entrypoints, internal/app for core services, internal/llm for providers, internal/session for state, internal/lsp for language servers, and internal/tui for the UI.[14]
  • The CLI already separates a backend “server” (serve / web) from “frontends” (attach, TUI), so other frontends can, in principle, connect over HTTP and reuse the agent logic.[12][14]

Practical decomposition options

  • Sidecar server model:
    • Run opencode serve (or the equivalent in the Crush successor) as a long‑lived background process per project or per user.[14][12]
    • Your own CLI then talks to that process over HTTP (or stdin/stdout if you wrap it) to create sessions, send prompts, and read responses, similar to how the attach subcommand works.[13][12]
  • Library / embedded model (harder):
    • Since the core is under internal/ and not exported as a Go module, you cannot cleanly go get it as a reusable library without forking or refactoring.[14]
    • You could fork the repo and promote selected internal/* packages to a public API, then expose a thin RPC or gRPC surface that your own CLI calls, but this is a structural change on your side.[13][14]

Integration into “any terminal CLI”

  • If “any terminal CLI” is yours, the cleanest pattern is:
    • Keep your UX and commands, and delegate “AI actions” to an opencode/crush sidecar via a stable protocol (HTTP/OpenAI‑style, or your own JSON‑RPC).[15][14]
    • This lets you share provider config, sessions, and LSP wiring with the upstream project while keeping your CLI focused on domain‑specific commands.
  • If “any terminal CLI” includes 3rd‑party tools (git, kubectl, terraform, etc.), the realistic route is:
    • Add a thin wrapper command (e.g., mytool ai ...) that shells out to your agent client, or
    • Inject a shared “AI helper” binary that those tools call with context (current directory, file list, flags), again talking to the OpenCode/Crush backend over HTTP.[16][17]
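A minimal sketch of the wrapper-command pattern above; everything here (mytool_ai, OC_URL, OC_SESSION, the payload field names) is a hypothetical illustration, not a published upstream contract:

```shell
# Hypothetical "mytool ai ..." subcommand that delegates to an OpenCode/Crush
# sidecar over HTTP. OC_URL, OC_SESSION, and the JSON field names are
# illustrative assumptions, not a stable upstream API.
OC_URL="${OC_URL:-http://127.0.0.1:4096}"

mytool_ai() {
  # Gather lightweight repo context to send along with the question.
  ctx=$(git status --short 2>/dev/null | head -n 20)
  payload=$(printf '{"text": "%s", "context": "%s"}' "$*" "$ctx")
  if [ -n "${DRY_RUN:-}" ]; then
    # Dry run: print the request we would send instead of sending it.
    printf 'POST %s/session/%s/message\n%s\n' "$OC_URL" "${OC_SESSION:-<sid>}" "$payload"
  else
    curl -s -X POST "$OC_URL/session/${OC_SESSION:?set OC_SESSION first}/message" \
      -H 'Content-Type: application/json' -d "$payload"
  fi
}
```

Note that printf does not JSON-escape its arguments; for real input you would build the payload with jq instead, as in the shell helpers later in this document.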

Constraints and caveats

  • There is no published “embedded agent SDK” with stable API guarantees; you are leaning on internal structures and CLI contracts that can change.[15][14]
  • For long‑term stability, treating the agent as a separate, versioned service with a narrow protocol is safer than trying to deeply embed the Go code into every CLI you own.[18][13]

Focused next prompt you could use:

“Design an HTTP/JSON RPC interface for running OpenCode/Crush as a multi-tenant coding agent service that arbitrary CLIs can call with repo context and commands.” [19][20][21][22][23][24][25][26]


How to run OpenCode as a standalone server without the CLI

API endpoints OpenCode exposes for external CLIs
Steps to embed OpenCode functionality into bash or zsh
Using OpenCode MCP with custom terminal applications
How to call OpenCode features via HTTP from another CLI tool

OpenCode can run as a standalone HTTP server and exposes a small, well-defined API that other CLIs can call; you can also layer MCP on top for richer tool integration.[27][28][29]

Run as a standalone server

  • Start a headless server (no TUI) with:
opencode serve --port 4096 --hostname 127.0.0.1

Optional flags: --mdns for discovery, --cors for extra origins.[28][27]

  • The server exposes an OpenAPI 3.1 spec endpoint and is designed for multiple clients (TUI, web UI, custom CLIs) to connect concurrently.[30][31][27]

HTTP API endpoints for external CLIs

From the server docs (abbreviated to the most useful pieces):[27][30]

  • OpenAPI spec
    • GET /openapi.json – OpenAPI 3.1 document; used to generate SDKs (TypeScript, Go, etc.).
  • Auth configuration
    • PUT /auth/:id – Set provider credentials (e.g., openai, anthropic). Body must match the provider schema; returns a boolean.
  • Sessions / chat (typical pattern)
    • Endpoints (simplified; varies slightly by version) include:
      • POST /session – Create a new session (model, repo path, config).
      • GET /session/:id – Get session state.
      • POST /session/:id/message – Send a message / command; stream or poll for responses.
  • Events (SSE)
    • GET /event – Server-sent events stream; the first event is server.connected, followed by bus events for messages, edits, etc.[30][27]

Design-wise, the intended flow is: create session → send message → consume events/stream for output and tool activity.[27]
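That flow can be sketched with plain curl. The payload field names below are assumptions to verify against your build's GET /openapi.json, and the curl calls themselves are commented out because they need a running server:

```shell
# Illustrative session flow against a local `opencode serve` on port 4096.
# Field names are assumptions; confirm them against GET /openapi.json.
OC=http://127.0.0.1:4096

# 1. Create a session rooted at the current repo.
session_payload=$(printf '{"cwd": "%s"}' "$PWD")
# sid=$(curl -s -X POST "$OC/session" \
#   -H 'Content-Type: application/json' -d "$session_payload" | jq -r '.id')

# 2. Send a message into the session.
message_payload='{"text": "Summarize the failing tests and propose a fix."}'
# curl -s -X POST "$OC/session/$sid/message" \
#   -H 'Content-Type: application/json' -d "$message_payload"

# 3. Consume the SSE stream; the first event is server.connected.
# curl -N "$OC/event"
```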

Embed into bash/zsh

Minimal pattern: treat OpenCode as a local HTTP agent and wrap it with shell functions.

    1. Start server in the background (e.g., in .bashrc or a tmux hook):
opencode serve --port 4096 --hostname 127.0.0.1 >/tmp/opencode.log 2>&1 &

[28]

    2. Shell helper using curl (bash/zsh):
oc_session() {
  # Create a session rooted at the current directory; jq -n --arg builds
  # properly escaped JSON instead of interpolating raw shell strings.
  curl -s -X POST http://127.0.0.1:4096/session \
    -H 'Content-Type: application/json' \
    -d "$(jq -n --arg cwd "$PWD" '{cwd: $cwd, model: "anthropic:claude-3-7-sonnet"}')" |
    jq -r '.id'
}

oc_ask() {
  local sid msg
  sid="${1:?session id required}"
  msg="${2:?message required}"
  # jq escapes quotes and newlines in the message so the JSON stays valid.
  curl -s -X POST "http://127.0.0.1:4096/session/${sid}/message" \
    -H 'Content-Type: application/json' \
    -d "$(jq -n --arg text "$msg" '{text: $text}')"
}
    3. Convenience function that includes context (e.g., git status or file list) before asking the agent:
oc_here() {
  local sid ctx
  sid="$(oc_session)"
  ctx="$(git status --short 2>/dev/null || ls)"
  oc_ask "$sid" "Here is the context:"$'\n'"$ctx"$'\n\n'"Now: $*"
}

This lets you stay in bash/zsh and treat OpenCode as an “AI coprocessor” without touching its TUI.[31][27]

Using MCP with custom terminal apps

  • OpenCode supports MCP servers (local/remote) defined in its config under mcp, with support for type: "local" and commands that start MCP processes.[29]
  • Example MCP configuration (e.g., for a local tool server):
{
  "mcp": {
    "my_tools": {
      "enabled": true,
      "type": "local",
      "command": "npx",
      "args": ["@modelcontextprotocol/server-everything"],
      "env": {
        "MY_ENV_VAR": "value"
      }
    }
  }
}
  • Your custom terminal app does not need to speak MCP itself; it just talks to OpenCode’s HTTP API, and OpenCode handles MCP tool discovery and invocation based on prompts like “use the my_tools tool to …”.[32][33][29]

Calling OpenCode via HTTP from another CLI

For a custom CLI (Go, Rust, TS, etc.), treat OpenCode as a local service:

    1. Generate an SDK from GET /openapi.json or use a generic OpenAPI client generator.[31][27]
    2. Implement a small adapter in your CLI. High-level flow:
    • On mycli ai ...:
      • Ensure the server is running (spawn it if not).
      • POST /session with { cwd, model, config }.
      • POST /session/:id/message with { text, files?, commands? }.
      • Subscribe to GET /event (or a per-session stream) and map events to your CLI UX (progress, edits, suggestions).[30][27]
    3. For scripting languages, you can skip SDKs and just do raw HTTP, like:
curl -s http://127.0.0.1:4096/openapi.json | jq '.info'

or in Node:

const res = await fetch('http://127.0.0.1:4096/session', {
  method: 'POST',
  headers: { 'content-type': 'application/json' },
  body: JSON.stringify({ cwd: process.cwd(), model: 'openai:gpt-4.1' }),
});

[28][27]


Focused next prompt you could use:

“Design a concrete HTTP session/message schema and Go client for treating OpenCode as a multi-tenant coding agent microservice behind my own CLI.” [34][35][36][37][38][39][40][41]

Footnotes

  1. https://opencode.ai
  2. https://github.com/opencode-ai/opencode
  3. https://opencode.ai/docs/
  4. https://dev.to/apilover/opencode-the-open-source-claude-code-alternative-how-it-elevates-your-terminal-workflow-2apl
  5. https://www.reddit.com/r/AISEOInsider/comments/1qh838s/opencode_ai_coding_tool_is_changing_how/
  6. https://www.docker.com/blog/opencode-docker-model-runner-private-ai-coding/
  7. https://docs.together.ai/docs/how-to-use-opencode
  8. https://www.youtube.com/watch?v=c2Mf4U8s8f8
  9. https://docs.z.ai/devpack/tool/opencode
  10. https://www.youtube.com/watch?v=WXffHkvfRpM
  11. https://opencode.ai
  12. https://opencode.ai/docs/cli/
  13. https://cefboud.com/posts/coding-agents-internals-opencode-deepdive/
  14. https://github.com/opencode-ai/opencode
  15. https://github.com/charmbracelet/crush
  16. https://www.freecodecamp.org/news/integrate-ai-into-your-terminal-using-opencode/
  17. https://www.linkedin.com/posts/lunguflorin_how-to-integrate-ai-into-your-terminal-using-activity-7382988737565069312-0eLI
  18. https://atalupadhyay.wordpress.com/2025/08/12/crush-cli-the-next-generation-ai-coding-agent/
  19. https://www.reddit.com/r/ClaudeAI/comments/1mda0yw/your_cli_but_smarter_crush_your_ai_bestie_for_the/
  20. https://www.reddit.com/r/ClaudeCode/comments/1mkku6v/after_2_days_of_using_opencode_cli_i_have_no/
  21. https://charm.land
  22. https://opencode.ai
  23. https://news.ycombinator.com/item?id=44736176
  24. https://www.youtube.com/watch?v=3crlgrd86r0
  25. https://www.youtube.com/watch?v=KWo3lo5VipI
  26. https://www.youtube.com/watch?v=ipY_e9sldFM
  27. https://opencode.ai/docs/server/
  28. https://open-code.ai/en/docs/server
  29. https://opencode.ai/docs/mcp-servers/
  30. https://www.opencodecn.com/docs/server
  31. https://github.com/chris-tse/opencode-web
  32. https://brightdata.com/blog/ai/crush-cli-with-web-mcp
  33. https://composio.dev/blog/mcp-with-opencode
  34. https://www.pulsemcp.com/clients/sst-opencode
  35. https://docs.z.ai/devpack/tool/opencode
  36. https://github.com/charmbracelet/crush
  37. https://docs.z.ai/devpack/tool/crush
  38. https://learn.netdata.cloud/docs/netdata-ai/mcp/mcp-clients/opencode
  39. https://www.reddit.com/r/ClaudeAI/comments/1mda0yw/your_cli_but_smarter_crush_your_ai_bestie_for_the/
  40. https://opencode.ai/docs/providers/
  41. https://github.com/sst/opencode