Multi-AI CLI Generalization Strategy
Analysis Date: 2025-10-14
Scope: Generalizing HumanLayer for Claude Code, Gemini CLI, Grok CLI, OpenAI CLI
Current Limitations

HumanLayer is Claude Code-specific with hardcoded:

- MCP protocol integration (`hlyr/src/mcp.ts`)
- Binary paths and validation
- Event stream format assumptions
- Fixed tool name `request_permission`
AI CLI Comparison
| Feature | Claude Code | Gemini CLI | Grok CLI | OpenAI CLI |
|---|---|---|---|---|
| Protocol | MCP | MCP + Extensions | OpenAI API | OpenAI API |
| Tools | MCP stdio | Built-in + MCP | Function rounds | Function calling |
| Config | JSON | GEMINI.md | Config files | API keys |
| Sessions | Native | Checkpoints | Context windows | Completions |
Proposed Architecture

Core Abstraction Layer

```go
type AIProvider interface {
	GetInfo() ProviderInfo
	DiscoverBinary() (string, error)
	CreateSession(config SessionConfig) (AISession, error)
	CreateToolServer(config ToolServerConfig) (ToolServer, error)
}

type AISession interface {
	Start() error
	Stop() error
	GetEventStream() <-chan SessionEvent
	SendCommand(cmd Command) error
}

type ToolServer interface {
	Start() error
	RegisterTool(tool Tool) error
	SetApprovalHandler(handler ApprovalHandler) error
}
```
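Phase 1 below calls for a provider registry behind these interfaces. A minimal sketch of what that registry could look like, assuming a factory-per-provider design; all type and function names here are illustrative, not existing HumanLayer APIs:

```go
package main

import (
	"fmt"
	"sort"
)

// AIProvider is a minimal stand-in for the interface above,
// reduced to what this sketch needs.
type AIProvider interface {
	Name() string
}

type stubProvider struct{ name string }

func (s stubProvider) Name() string { return s.name }

// Registry maps provider names to constructors, so new providers
// can be added without touching the session manager.
type Registry struct {
	factories map[string]func() AIProvider
}

func NewRegistry() *Registry {
	return &Registry{factories: map[string]func() AIProvider{}}
}

func (r *Registry) Register(name string, f func() AIProvider) {
	r.factories[name] = f
}

// Get constructs a provider by name, erroring on unknown names
// rather than panicking at session-start time.
func (r *Registry) Get(name string) (AIProvider, error) {
	f, ok := r.factories[name]
	if !ok {
		return nil, fmt.Errorf("unknown provider %q", name)
	}
	return f(), nil
}

// List returns registered provider names in stable order,
// e.g. to back a `providers list` command.
func (r *Registry) List() []string {
	names := make([]string, 0, len(r.factories))
	for n := range r.factories {
		names = append(names, n)
	}
	sort.Strings(names)
	return names
}

func main() {
	reg := NewRegistry()
	reg.Register("claude", func() AIProvider { return stubProvider{"claude"} })
	reg.Register("gemini", func() AIProvider { return stubProvider{"gemini"} })
	fmt.Println(reg.List()) // → [claude gemini]
	p, err := reg.Get("gemini")
	fmt.Println(p.Name(), err)
}
```

Keeping construction behind factories means a provider binary is only probed when a session actually requests it.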
Provider Implementations

Claude Provider

```go
func (cp *ClaudeProvider) CreateSession(config SessionConfig) (AISession, error) {
	claudeConfig := &claudecode.Config{
		Model:     config.Model,
		Directory: config.WorkingDir,
		MCPServers: map[string]claudecode.MCPServer{
			"humanlayer": {
				Command: "hlyr",
				Args:    []string{"mcp", "claude_approvals"},
			},
		},
	}
	return NewClaudeSession(claudeConfig), nil
}
```
Gemini Provider

```go
func (gp *GeminiProvider) CreateSession(config SessionConfig) (AISession, error) {
	// Create the GEMINI.md context file carrying the approval instructions
	contextFile := filepath.Join(config.WorkingDir, "GEMINI.md")
	contextContent := generateApprovalContext(config)
	if err := os.WriteFile(contextFile, []byte(contextContent), 0644); err != nil {
		return nil, fmt.Errorf("writing GEMINI.md: %w", err)
	}
	return NewGeminiSession(config), nil
}
```
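The `generateApprovalContext` helper above is where the approval policy gets expressed as instructions rather than protocol. One plausible shape for it, shown as a hedged sketch (the wording of the generated GEMINI.md and the `SessionConfig` fields used are assumptions, not the actual HumanLayer implementation):

```go
package main

import "fmt"

// SessionConfig mirrors only the fields this sketch needs.
type SessionConfig struct {
	Model      string
	WorkingDir string
}

// generateApprovalContext is hypothetical: one plausible set of
// GEMINI.md instructions that route tool calls through HumanLayer's
// request_permission approval flow.
func generateApprovalContext(config SessionConfig) string {
	return fmt.Sprintf("# HumanLayer Approval Rules\n\n"+
		"Model: %s\n\n"+
		"Before running any file-modifying or shell tool, call the\n"+
		"`request_permission` tool and wait for an approval decision.\n"+
		"Read-only tools may run without approval.\n", config.Model)
}

func main() {
	fmt.Print(generateApprovalContext(SessionConfig{Model: "gemini-2.5-pro"}))
}
```

Because Gemini's behavior is steered by context rather than enforced by a tool server, this path is advisory; hard enforcement would still need the tool-server side.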
Grok Provider

```go
func (gp *GrokProvider) CreateSession(config SessionConfig) (AISession, error) {
	grokConfig := GrokSessionConfig{
		Model:         config.Model,
		MaxToolRounds: 400,
		APIEndpoint:   "https://api.x.ai/v1/chat/completions",
	}
	return NewGrokSession(grokConfig), nil
}
```
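For OpenAI-protocol providers like Grok, the approval workflow has to sit inside the function-calling loop: each tool call the model emits is gated before dispatch. A minimal sketch of one such round, assuming illustrative `ToolCall` and `ApprovalHandler` shapes (in HumanLayer the handler would block on a human decision rather than a local policy):

```go
package main

import "fmt"

// ToolCall is a minimal stand-in for an OpenAI-style function call
// returned by the model; field names are assumptions.
type ToolCall struct {
	Name string
	Args string // raw JSON arguments
}

// ApprovalHandler decides whether a tool call may run.
type ApprovalHandler func(call ToolCall) bool

// runToolRound executes one function-calling round, gating each
// call through the approval handler before dispatch. Denied calls
// produce a result the model can see on the next round.
func runToolRound(calls []ToolCall, approve ApprovalHandler) []string {
	var results []string
	for _, c := range calls {
		if !approve(c) {
			results = append(results, c.Name+": denied")
			continue
		}
		// Real dispatch would execute the tool here.
		results = append(results, c.Name+": executed")
	}
	return results
}

func main() {
	// Policy standing in for a human approver: only read_file
	// is auto-approved, mirroring the auto_approve config below.
	autoApprove := map[string]bool{"read_file": true}
	policy := func(c ToolCall) bool { return autoApprove[c.Name] }
	calls := []ToolCall{{Name: "read_file"}, {Name: "run_shell"}}
	fmt.Println(runToolRound(calls, policy)) // → [read_file: executed run_shell: denied]
}
```

This is also where `MaxToolRounds` applies: the session loops rounds like this until the model stops calling tools or the cap is hit.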
Multi-Protocol Tool Server

```go
type MultiProtocolToolServer struct {
	provider     AIProvider
	mcpServer    *MCPToolServer
	openaiServer *OpenAIToolServer
	router       *ToolRouter
}

func (mpts *MultiProtocolToolServer) RegisterTool(tool Tool) error {
	// Register with the protocol servers the provider actually supports
	caps := mpts.provider.GetInfo().Capabilities
	if caps.HasMCP() {
		if err := mpts.mcpServer.RegisterTool(tool); err != nil {
			return err
		}
	}
	if caps.HasOpenAI() {
		if err := mpts.openaiServer.RegisterTool(tool); err != nil {
			return err
		}
	}
	return nil
}
```
Configuration Schema

```yaml
providers:
  claude:
    binary_path: "/usr/local/bin/claude-code"
    protocol: "mcp"
    approval_mode: "required"
  gemini:
    binary_path: "/usr/local/bin/gemini"
    protocol: "extensions"
    context_file: "GEMINI.md"
  grok:
    binary_path: "/usr/local/bin/grok-cli"
    protocol: "openai"
    max_rounds: 400
  openai:
    binary_path: "/usr/local/bin/codex"
    protocol: "openai"
    auto_approve: ["read_file"]
```
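On the Go side, each entry in this schema could map onto a struct with basic validation before any provider is launched. The struct and its field names below are an assumed shape, not an existing HumanLayer type; actual YAML decoding would use a library such as `gopkg.in/yaml.v3`:

```go
package main

import "fmt"

// ProviderConfig mirrors one entry in the YAML schema above.
type ProviderConfig struct {
	BinaryPath   string   `yaml:"binary_path"`
	Protocol     string   `yaml:"protocol"`
	ApprovalMode string   `yaml:"approval_mode"`
	ContextFile  string   `yaml:"context_file"`
	MaxRounds    int      `yaml:"max_rounds"`
	AutoApprove  []string `yaml:"auto_approve"`
}

// validProtocols lists the protocols a tool server can speak.
var validProtocols = map[string]bool{
	"mcp":        true,
	"extensions": true,
	"openai":     true,
}

// Validate rejects configs that would fail at session start:
// a missing binary path or an unsupported protocol.
func (c ProviderConfig) Validate() error {
	if c.BinaryPath == "" {
		return fmt.Errorf("binary_path is required")
	}
	if !validProtocols[c.Protocol] {
		return fmt.Errorf("unsupported protocol %q", c.Protocol)
	}
	return nil
}

func main() {
	grok := ProviderConfig{
		BinaryPath: "/usr/local/bin/grok-cli",
		Protocol:   "openai",
		MaxRounds:  400,
	}
	fmt.Println(grok.Validate()) // → <nil>
	fmt.Println(ProviderConfig{Protocol: "grpc"}.Validate())
}
```

Validating at load time keeps protocol mismatches out of the session manager and gives `hlyr providers list` something concrete to report.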
Implementation Strategy

Phase 1: Core Abstraction (4-6 weeks)

- Define `AIProvider`, `AISession`, `ToolServer` interfaces
- Implement provider registry system
- Refactor session manager for provider abstraction
- Create multi-provider configuration system
Phase 2: Provider Migration (2-3 weeks)
- Wrap existing Claude Code logic in provider interface
- Maintain backward compatibility
- Test existing workflows continue working
Phase 3: New Providers (6-8 weeks each)
- Gemini: Extensions-based approval system
- Grok: OpenAI function calling integration
- OpenAI: Direct function calling with optional approval
Phase 4: Advanced Features (4-6 weeks)
- Multi-protocol tool servers
- Provider auto-discovery
- Plugin architecture for third-party providers
Integration Points

REST API Extensions

```
POST /api/v1/sessions
{
  "provider": "gemini|claude|grok|openai",
  "model": "gemini-2.5-pro",
  "directory": "./project",
  "approval_mode": "required"
}

GET /api/v1/providers
# Returns available providers and capabilities
```
CLI Commands

```
hlyr launch --provider gemini --model gemini-2.5-pro --directory ./project
hlyr providers list
hlyr providers install gemini-cli
hlyr tools start --protocols mcp,openai
```
Key Benefits

For Users
- Provider Choice: Use any AI CLI with same approval workflow
- Unified Experience: Same UI across different providers
- Cost Optimization: Choose optimal provider per task
For Developers
- Extensibility: Easy addition of new providers
- Protocol Agnostic: Support MCP, OpenAI, custom protocols
- Clear Separation: Reduced complexity through abstraction
For Enterprise
- Vendor Independence: No AI provider lock-in
- Compliance: Choose providers by requirements
- Cost Control: Optimize across multiple providers
Migration Path
- Backward Compatibility: Existing Claude workflows unchanged
- Gradual Migration: Add providers incrementally
- Configuration Migration: Tools to convert existing configs
- Documentation: Provider-specific setup guides
Conclusion
The proposed abstraction enables HumanLayer to support multiple AI CLI tools while preserving the core approval workflow. The phased implementation ensures smooth migration from Claude-specific to multi-provider architecture, providing users with flexibility and preventing vendor lock-in.