Multi-AI CLI Generalization Strategy

Analysis Date: 2025-10-14
Scope: Generalizing HumanLayer for Claude Code, Gemini CLI, Grok CLI, OpenAI CLI

Current Limitations

HumanLayer is currently Claude Code-specific, with hardcoded assumptions in several places:

  • MCP protocol integration (hlyr/src/mcp.ts)
  • Binary paths and validation
  • Event stream format assumptions
  • Fixed tool name request_permission

AI CLI Comparison

| Feature  | Claude Code | Gemini CLI       | Grok CLI        | OpenAI CLI       |
|----------|-------------|------------------|-----------------|------------------|
| Protocol | MCP         | MCP + Extensions | OpenAI API      | OpenAI API       |
| Tools    | MCP stdio   | Built-in + MCP   | Function rounds | Function calling |
| Config   | JSON        | GEMINI.md        | Config files    | API keys         |
| Sessions | Native      | Checkpoints      | Context windows | Completions      |

Proposed Architecture

Core Abstraction Layer

```go
type AIProvider interface {
	GetInfo() ProviderInfo
	DiscoverBinary() (string, error)
	CreateSession(config SessionConfig) (AISession, error)
	CreateToolServer(config ToolServerConfig) (ToolServer, error)
}

type AISession interface {
	Start() error
	Stop() error
	GetEventStream() <-chan SessionEvent
	SendCommand(cmd Command) error
}

type ToolServer interface {
	Start() error
	RegisterTool(tool Tool) error
	SetApprovalHandler(handler ApprovalHandler) error
}
```
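Phase 1 of the implementation plan calls for a provider registry built on these interfaces. A minimal, thread-safe sketch follows; the `ProviderInfo` fields and the `mockProvider` stand-in are illustrative assumptions, not part of the real API:

```go
package main

import (
	"fmt"
	"sync"
)

// ProviderInfo is sketched with only the field the registry needs.
type ProviderInfo struct {
	Name string
}

// AIProvider is reduced here to the one method registration uses.
type AIProvider interface {
	GetInfo() ProviderInfo
}

// Registry maps provider names ("claude", "gemini", ...) to implementations.
type Registry struct {
	mu        sync.RWMutex
	providers map[string]AIProvider
}

func NewRegistry() *Registry {
	return &Registry{providers: make(map[string]AIProvider)}
}

// Register stores a provider under the name it reports for itself.
func (r *Registry) Register(p AIProvider) {
	r.mu.Lock()
	defer r.mu.Unlock()
	r.providers[p.GetInfo().Name] = p
}

// Get returns the named provider, or an error if none is registered.
func (r *Registry) Get(name string) (AIProvider, error) {
	r.mu.RLock()
	defer r.mu.RUnlock()
	p, ok := r.providers[name]
	if !ok {
		return nil, fmt.Errorf("unknown provider %q", name)
	}
	return p, nil
}

// mockProvider is a stand-in used only to illustrate registration.
type mockProvider struct{ info ProviderInfo }

func (m mockProvider) GetInfo() ProviderInfo { return m.info }
```

A caller would resolve the `provider` field from a session request through `Registry.Get` before constructing the session.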

Provider Implementations

Claude Provider

```go
func (cp *ClaudeProvider) CreateSession(config SessionConfig) (AISession, error) {
	claudeConfig := &claudecode.Config{
		Model:     config.Model,
		Directory: config.WorkingDir,
		MCPServers: map[string]claudecode.MCPServer{
			"humanlayer": {
				Command: "hlyr",
				Args:    []string{"mcp", "claude_approvals"},
			},
		},
	}
	return NewClaudeSession(claudeConfig), nil
}
```

Gemini Provider

```go
func (gp *GeminiProvider) CreateSession(config SessionConfig) (AISession, error) {
	// Create a GEMINI.md context file carrying the approval instructions.
	contextFile := filepath.Join(config.WorkingDir, "GEMINI.md")
	contextContent := generateApprovalContext(config)
	if err := os.WriteFile(contextFile, []byte(contextContent), 0o644); err != nil {
		return nil, fmt.Errorf("writing GEMINI.md: %w", err)
	}
	return NewGeminiSession(config), nil
}
```

Grok Provider

```go
func (gp *GrokProvider) CreateSession(config SessionConfig) (AISession, error) {
	grokConfig := GrokSessionConfig{
		Model:         config.Model,
		MaxToolRounds: 400,
		APIEndpoint:   "https://api.x.ai/v1/chat/completions",
	}
	return NewGrokSession(grokConfig), nil
}
```

Multi-Protocol Tool Server

```go
type MultiProtocolToolServer struct {
	provider     AIProvider
	mcpServer    *MCPToolServer
	openaiServer *OpenAIToolServer
	router       *ToolRouter
}

func (mpts *MultiProtocolToolServer) RegisterTool(tool Tool) error {
	// Register with the protocol servers the provider actually supports.
	caps := mpts.provider.GetInfo().Capabilities
	if caps.HasMCP() {
		if err := mpts.mcpServer.RegisterTool(tool); err != nil {
			return err
		}
	}
	if caps.HasOpenAI() {
		if err := mpts.openaiServer.RegisterTool(tool); err != nil {
			return err
		}
	}
	return nil
}
```

Configuration Schema

```yaml
providers:
  claude:
    binary_path: "/usr/local/bin/claude-code"
    protocol: "mcp"
    approval_mode: "required"

  gemini:
    binary_path: "/usr/local/bin/gemini"
    protocol: "extensions"
    context_file: "GEMINI.md"

  grok:
    binary_path: "/usr/local/bin/grok-cli"
    protocol: "openai"
    max_rounds: 400

  openai:
    binary_path: "/usr/local/bin/codex"
    protocol: "openai"
    auto_approve: ["read_file"]
```

Implementation Strategy

Phase 1: Core Abstraction (4-6 weeks)

  1. Define AIProvider, AISession, ToolServer interfaces
  2. Implement provider registry system
  3. Refactor session manager for provider abstraction
  4. Create multi-provider configuration system

Phase 2: Provider Migration (2-3 weeks)

  1. Wrap existing Claude Code logic in provider interface
  2. Maintain backward compatibility
  3. Test existing workflows continue working

Phase 3: New Providers (6-8 weeks each)

  1. Gemini: Extensions-based approval system
  2. Grok: OpenAI function calling integration
  3. OpenAI: Direct function calling with optional approval

Phase 4: Advanced Features (4-6 weeks)

  1. Multi-protocol tool servers
  2. Provider auto-discovery
  3. Plugin architecture for third-party providers

Integration Points

REST API Extensions

```
POST /api/v1/sessions
{
  "provider": "gemini|claude|grok|openai",
  "model": "gemini-2.5-pro",
  "directory": "./project",
  "approval_mode": "required"
}

GET /api/v1/providers
# Returns available providers and capabilities
```

CLI Commands

```
hlyr launch --provider gemini --model gemini-2.5-pro --directory ./project
hlyr providers list
hlyr providers install gemini-cli
hlyr tools start --protocols mcp,openai
```

Key Benefits

For Users

  • Provider Choice: Use any AI CLI with same approval workflow
  • Unified Experience: Same UI across different providers
  • Cost Optimization: Choose optimal provider per task

For Developers

  • Extensibility: Easy addition of new providers
  • Protocol Agnostic: Support MCP, OpenAI, custom protocols
  • Clear Separation: Reduced complexity through abstraction

For Enterprise

  • Vendor Independence: No AI provider lock-in
  • Compliance: Choose providers by requirements
  • Cost Control: Optimize across multiple providers

Migration Path

  1. Backward Compatibility: Existing Claude workflows unchanged
  2. Gradual Migration: Add providers incrementally
  3. Configuration Migration: Tools to convert existing configs
  4. Documentation: Provider-specific setup guides

Conclusion

The proposed abstraction enables HumanLayer to support multiple AI CLI tools while preserving the core approval workflow. The phased implementation ensures smooth migration from Claude-specific to multi-provider architecture, providing users with flexibility and preventing vendor lock-in.