Technical Design Document (TDD)

AZ1.AI LLM IDE - Technical Implementation Specification

Version: 0.1.0 Date: 2025-10-06 Status: Active Development


Table of Contents​

  1. Technical Overview
  2. System Architecture
  3. API Specifications
  4. Database Schema
  5. Component Specifications
  6. Integration Points
  7. Error Handling
  8. Testing Strategy

1. Technical Overview​

1.1 Technology Stack​

1.2 System Requirements​

Component   Minimum                    Recommended
Browser     Chrome 90+, Firefox 88+    Chrome 120+, Firefox 120+
RAM         4GB                        8GB+
CPU         2 cores                    4 cores+
Storage     500MB                      2GB+
Network     Localhost only             Localhost only

2. System Architecture​

2.1 Component Hierarchy​

2.2 State Management Architecture​

// Zustand Store Structure
interface AppState {
  // Session management
  sessionStore: {
    sessions: Session[];
    activeSessionId: string;
    createSession: () => void;
    closeSession: (id: string) => void;
    switchSession: (id: string) => void;
  };

  // Editor state per session
  editorStore: {
    tabs: Map<string, editorTab[]>;     // keyed by session ID
    activeTabId: Map<string, string>;   // keyed by session ID
    // ... editor methods
  };

  // LLM state per session
  llmStore: {
    messages: Map<string, Message[]>;   // keyed by session ID
    models: llmModel[];
    configs: Map<string, llmConfig>;    // keyed by session ID
    // ... llm methods
  };

  // Global state
  fileStore: FileNode[];
  terminalStore: terminalState;
  themeStore: ThemeConfig;
}
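The session slice can be sketched as pure state transitions, which is how the Zustand actions would reduce state inside `create()`. This is a hedged illustration, not the shipped implementation: the ID scheme and the fall-back-to-last-session behavior on close are assumptions.

```typescript
interface Session {
  id: string;
  name: string;
  isDirty: boolean;
  createdAt: Date;
}

interface SessionState {
  sessions: Session[];
  activeSessionId: string;
}

// Placeholder ID scheme for illustration only
let nextId = 0;
const newSession = (): Session => ({
  id: `session-${++nextId}`,
  name: `Session ${nextId}`,
  isDirty: false,
  createdAt: new Date(),
});

// Append a new session and make it active
function createSession(state: SessionState): SessionState {
  const session = newSession();
  return { sessions: [...state.sessions, session], activeSessionId: session.id };
}

// Remove a session; if it was active, fall back to the last remaining one
function closeSession(state: SessionState, id: string): SessionState {
  const sessions = state.sessions.filter(s => s.id !== id);
  const activeSessionId =
    state.activeSessionId === id
      ? (sessions[sessions.length - 1]?.id ?? '')
      : state.activeSessionId;
  return { sessions, activeSessionId };
}
```

Writing the transitions as pure functions keeps them unit-testable without mounting a store; inside `create()` they would be wrapped in `set(state => …)` calls.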

3. API Specifications​

3.1 LM Studio API Integration​

Base URL: http://host.docker.internal:1234/v1

3.1.1 List Models​

GET /models

Response:
{
  "data": [
    {
      "id": "qwen/qwq-32b",
      "object": "model",
      "owned_by": "organization_owner"
    }
  ],
  "object": "list"
}
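A minimal client for this endpoint might look as follows. `BASE_URL` mirrors the base URL above; the response-parsing helper is split out so it can be exercised without a running LM Studio instance.

```typescript
const BASE_URL = 'http://host.docker.internal:1234/v1';

interface ModelInfo {
  id: string;
  object: string;
  owned_by: string;
}

// Extract model IDs from a /models response body
function parseModelList(body: { data: ModelInfo[] }): string[] {
  return body.data.map(m => m.id);
}

// Fetch and parse the model list; throws on a non-2xx response
async function listModels(): Promise<string[]> {
  const res = await fetch(`${BASE_URL}/models`);
  if (!res.ok) throw new Error(`LM Studio returned HTTP ${res.status}`);
  return parseModelList(await res.json());
}
```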

3.1.2 Chat Completion​

POST /chat/completions

Request:
{
  "model": "qwen/qwq-32b",
  "messages": [
    { "role": "user", "content": "Hello" }
  ],
  "temperature": 0.7,
  "max_tokens": 1000,
  "stream": true
}

Response (streaming):
data: {"choices":[{"delta":{"content":"Hello"},"index":0}]}
data: {"choices":[{"delta":{"content":" there"},"index":0}]}
data: [DONE]
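The streaming format above is server-sent events: each line carries `data: <json>` and the stream terminates with `data: [DONE]`. A sketch of extracting the concatenated delta text from a received chunk:

```typescript
// Accumulate the assistant text from an SSE chunk in the format above.
// Stops at the [DONE] sentinel; non-data lines are ignored.
function parseSSEChunk(chunk: string): string {
  let text = '';
  for (const line of chunk.split('\n')) {
    const trimmed = line.trim();
    if (!trimmed.startsWith('data:')) continue;
    const payload = trimmed.slice(5).trim();
    if (payload === '[DONE]') break;
    const parsed = JSON.parse(payload);
    text += parsed.choices?.[0]?.delta?.content ?? '';
  }
  return text;
}
```

In production this would sit behind a `ReadableStream` reader that buffers partial lines between chunks; this sketch assumes each chunk contains whole lines.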

3.2 MCP Server API​

Location: /workspace/PROJECTS/t2/mcp-lmstudio/

Tools Available:​

interface MCPTools {
  lmstudio_list_models: {
    input: {};
    output: { models: llmModel[] };
  };

  lmstudio_chat: {
    input: {
      model: string;
      messages: Message[];
      temperature?: number;
      max_tokens?: number;
    };
    output: { content: string };
  };

  lmstudio_completion: {
    input: {
      model: string;
      prompt: string;
      temperature?: number;
      max_tokens?: number;
    };
    output: { content: string };
  };
}
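MCP transports tool invocations as JSON-RPC 2.0 `tools/call` requests. A request builder for the tools above might look like this; the incrementing request-ID scheme is an assumption for illustration.

```typescript
// Shape of a JSON-RPC 2.0 tools/call request as used by MCP
interface MCPToolCall {
  jsonrpc: '2.0';
  id: number;
  method: 'tools/call';
  params: { name: string; arguments: Record<string, unknown> };
}

let requestId = 0;

// Build the payload for one tool invocation, e.g. lmstudio_chat
function buildToolCall(name: string, args: Record<string, unknown>): MCPToolCall {
  return {
    jsonrpc: '2.0',
    id: ++requestId,
    method: 'tools/call',
    params: { name, arguments: args },
  };
}
```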

3.3 File System API (OPFS)​

class FileSystemService {
  private async getRoot(): Promise<FileSystemDirectoryHandle>;

  async createFile(path: string, content: string): Promise<void>;
  async readFile(path: string): Promise<string>;
  async updateFile(path: string, content: string): Promise<void>;
  async deleteFile(path: string): Promise<void>;
  async listDirectory(path: string): Promise<FileNode[]>;
  async createDirectory(path: string): Promise<void>;
}
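A hedged sketch of how these methods could resolve slash-separated paths over OPFS. The handle types are declared structurally so the sketch is self-contained; in the browser they correspond to `FileSystemDirectoryHandle` and `FileSystemFileHandle` obtained from `navigator.storage.getDirectory()`.

```typescript
// Minimal structural stand-ins for the OPFS handle types
interface FileHandle {
  getFile(): Promise<{ text(): Promise<string> }>;
}
interface DirHandle {
  getDirectoryHandle(name: string, opts?: { create?: boolean }): Promise<DirHandle>;
  getFileHandle(name: string, opts?: { create?: boolean }): Promise<FileHandle>;
}

// Split "/a/b/c.txt" into ["a", "b", "c.txt"], dropping empty segments
function splitPath(path: string): string[] {
  return path.split('/').filter(Boolean);
}

// Walk directory handles down to a target directory, segment by segment
async function resolveDir(
  root: DirHandle,
  segments: string[],
  create = false
): Promise<DirHandle> {
  let dir = root;
  for (const segment of segments) {
    dir = await dir.getDirectoryHandle(segment, { create });
  }
  return dir;
}

// readFile: resolve the parent directory, then read the leaf file as text
async function readFile(root: DirHandle, path: string): Promise<string> {
  const segments = splitPath(path);
  const dir = await resolveDir(root, segments.slice(0, -1));
  const handle = await dir.getFileHandle(segments[segments.length - 1]);
  return (await handle.getFile()).text();
}
```

The write-side methods would follow the same pattern, ending in `createWritable()` / `write()` / `close()` on the file handle.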

4. Database Schema​

4.1 IndexedDB Schema (Browser Storage)​

interface IDBSchema {
  sessions: {
    key: string; // session ID
    value: {
      id: string;
      name: string;
      createdAt: Date;
      updatedAt: Date;
      editorTabs: editorTab[];
      llmMessages: Message[];
      llmConfig: llmConfig;
    };
  };

  files: {
    key: string; // file path
    value: {
      path: string;
      content: string;
      language: string;
      updatedAt: Date;
    };
  };

  settings: {
    key: string; // setting key
    value: any;
  };
}
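Opening the database and creating the three object stores on first run could be sketched as below. The database name and version are assumptions; `indexedDB` is a browser global, declared loosely here so the sketch stays self-contained.

```typescript
// Browser-only global, declared loosely for this sketch
declare const indexedDB: any;

const DB_NAME = 'az1-ide';
const DB_VERSION = 1;
const STORE_NAMES = ['sessions', 'files', 'settings'];

function openDatabase(): Promise<any> {
  return new Promise((resolve, reject) => {
    const request = indexedDB.open(DB_NAME, DB_VERSION);
    // Runs only when the version is new: create missing object stores.
    request.onupgradeneeded = () => {
      const db = request.result;
      for (const name of STORE_NAMES) {
        // Keys (session ID, file path, setting key) are supplied explicitly
        // per the schema above, so no keyPath is configured.
        if (!db.objectStoreNames.contains(name)) db.createObjectStore(name);
      }
    };
    request.onsuccess = () => resolve(request.result);
    request.onerror = () => reject(request.error);
  });
}
```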

4.2 Data Persistence Flow​


5. Component Specifications​

5.1 Session Tabs Component

// Session Tabs Component
interface SessionTabsProps {
  sessions: Session[];
  activeSessionId: string;
  onSessionChange: (id: string) => void;
  onSessionClose: (id: string) => void;
  onSessionCreate: () => void;
}

interface Session {
  id: string;
  name: string;
  icon?: string;
  isDirty: boolean; // unsaved changes
  createdAt: Date;
}

// Implementation
const SessionTabs: React.FC<SessionTabsProps> = ({
  sessions,
  activeSessionId,
  onSessionChange,
  onSessionClose,
  onSessionCreate
}) => {
  return (
    <Tabs index={sessions.findIndex(s => s.id === activeSessionId)}>
      <TabList>
        {sessions.map(session => (
          <Tab key={session.id} onClick={() => onSessionChange(session.id)}>
            {session.icon} {session.name}
            {session.isDirty && <Badge>•</Badge>}
            <IconButton onClick={() => onSessionClose(session.id)} />
          </Tab>
        ))}
        <IconButton icon={<FiPlus />} onClick={onSessionCreate} />
      </TabList>
    </Tabs>
  );
};

5.2 Monaco Editor Integration

interface editorConfig {
  theme: 'vs-dark' | 'vs-light';
  fontSize: number;
  fontFamily: string;
  minimap: { enabled: boolean };
  lineNumbers: 'on' | 'off' | 'relative';
  wordWrap: 'on' | 'off';
  autoSave: boolean;
  autoSaveDelay: number; // ms
}

class editorService {
  private editor: monaco.editor.IStandaloneCodeEditor;

  initialize(container: HTMLElement, config: editorConfig): void;
  setContent(content: string, language: string): void;
  getContent(): string;
  setLanguage(language: string): void;
  dispose(): void;

  // Event handlers
  onContentChange(callback: (content: string) => void): void;
  onCursorChange(callback: (position: Position) => void): void;
}
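Most `editorConfig` fields map one-to-one onto the options accepted by `monaco.editor.create()`; `autoSave` and `autoSaveDelay` are app-level concerns (a debounced save on content change) rather than Monaco options. A sketch of the translation, with the interface redeclared for self-containment:

```typescript
interface editorConfig {
  theme: 'vs-dark' | 'vs-light';
  fontSize: number;
  fontFamily: string;
  minimap: { enabled: boolean };
  lineNumbers: 'on' | 'off' | 'relative';
  wordWrap: 'on' | 'off';
  autoSave: boolean;
  autoSaveDelay: number; // ms
}

// Strip app-level fields; the rest passes straight through to Monaco.
function toMonacoOptions(config: editorConfig): Record<string, unknown> {
  const { autoSave, autoSaveDelay, ...monacoOptions } = config;
  return monacoOptions;
}
```

`initialize()` would then call `monaco.editor.create(container, toMonacoOptions(config))` and wire the auto-save debounce separately.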

5.3 Terminal Integration

interface terminalConfig {
  rows: number;
  cols: number;
  fontSize: number;
  fontFamily: string;
  theme: terminalTheme;
  cursorBlink: boolean;
}

class terminalService {
  private terminal: Terminal; // xterm.js Terminal instance
  private fitAddon: FitAddon;

  initialize(container: HTMLElement, config: terminalConfig): void;
  write(data: string): void;
  writeln(data: string): void;
  clear(): void;
  fit(): void;

  // Event handlers
  onData(callback: (data: string) => void): void;
  onResize(callback: (dimensions: { cols: number; rows: number }) => void): void;
}
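The `fit()` call delegates to the xterm.js FitAddon, whose core arithmetic is deriving cols/rows from the container's pixel size and the rendered character cell size. An illustrative sketch (the real addon also accounts for scrollbar width and padding):

```typescript
// Derive terminal dimensions from container size and cell size.
// Floors keep the grid inside the container; minimums avoid a 0-size grid.
function computeFit(
  containerWidth: number,
  containerHeight: number,
  cellWidth: number,
  cellHeight: number
): { cols: number; rows: number } {
  return {
    cols: Math.max(2, Math.floor(containerWidth / cellWidth)),
    rows: Math.max(1, Math.floor(containerHeight / cellHeight)),
  };
}
```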

5.4 LLM Workflow Engine

class llmWorkflowEngine {
  async executeWorkflow(
    mode: WorkflowMode,
    prompt: string,
    config: llmConfig
  ): Promise<WorkflowResult> {
    switch (mode) {
      case 'single':
        return this.executeSingle(prompt, config);
      case 'parallel':
        return this.executeParallel(prompt, config);
      case 'sequential':
        return this.executeSequential(prompt, config);
      case 'consensus':
        return this.executeConsensus(prompt, config);
    }
  }

  private async executeSingle(
    prompt: string,
    config: llmConfig
  ): Promise<SingleResult> {
    const response = await llmService.chatCompletion(
      config.primaryModel,
      [{ role: 'user', content: prompt }],
      config.temperature,
      config.maxTokens
    );
    return { type: 'single', response };
  }

  private async executeParallel(
    prompt: string,
    config: llmConfig
  ): Promise<ParallelResult> {
    const [primary, secondary] = await Promise.all([
      llmService.chatCompletion(config.primaryModel, ...),
      llmService.chatCompletion(config.secondaryModel, ...)
    ]);
    return { type: 'parallel', primary, secondary };
  }

  // ... sequential and consensus implementations
}
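The elided sequential mode could be sketched as follows: the primary model drafts an answer, which is handed to the secondary model for refinement. The `ChatFn` signature stands in for `llmService.chatCompletion`, and the refinement prompt wording is an assumption.

```typescript
type ChatFn = (
  model: string,
  messages: { role: string; content: string }[]
) => Promise<string>;

// Sequential workflow sketch: primary drafts, secondary refines the draft.
async function executeSequential(
  prompt: string,
  primaryModel: string,
  secondaryModel: string,
  chat: ChatFn
): Promise<{ type: 'sequential'; draft: string; refined: string }> {
  const draft = await chat(primaryModel, [{ role: 'user', content: prompt }]);
  const refined = await chat(secondaryModel, [
    { role: 'user', content: prompt },
    { role: 'assistant', content: draft },
    { role: 'user', content: 'Review and improve the answer above.' },
  ]);
  return { type: 'sequential', draft, refined };
}
```

Injecting the chat function keeps the workflow testable with a stub in place of a live model.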

6. Integration Points​

6.1 LM Studio Integration​

6.2 Claude Code Integration​


7. Error Handling​

7.1 Error Types​

enum ErrorType {
  NETWORK_ERROR = 'NETWORK_ERROR',
  API_ERROR = 'API_ERROR',
  VALIDATION_ERROR = 'VALIDATION_ERROR',
  FILE_SYSTEM_ERROR = 'FILE_SYSTEM_ERROR',
  EDITOR_ERROR = 'EDITOR_ERROR',
  TERMINAL_ERROR = 'TERMINAL_ERROR',
}

interface AppError {
  type: ErrorType;
  message: string;
  code?: string;
  details?: any;
  timestamp: Date;
}
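Thrown values can be normalized into the `AppError` shape with a helper like the one below. The classification heuristic (a `TypeError` from `fetch` indicates a network failure) is an assumption, and only the relevant subset of the enum is redeclared for self-containment.

```typescript
// Subset of the ErrorType enum, redeclared so this sketch is self-contained
enum ErrorType {
  NETWORK_ERROR = 'NETWORK_ERROR',
  API_ERROR = 'API_ERROR',
}

interface AppError {
  type: ErrorType;
  message: string;
  code?: string;
  details?: unknown;
  timestamp: Date;
}

// Normalize any thrown value into an AppError.
// fetch() rejects with a TypeError on network failure, so that maps to
// NETWORK_ERROR; everything else defaults to API_ERROR here.
function toAppError(err: unknown): AppError {
  const type = err instanceof TypeError ? ErrorType.NETWORK_ERROR : ErrorType.API_ERROR;
  const message = err instanceof Error ? err.message : String(err);
  return { type, message, details: err, timestamp: new Date() };
}
```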

7.2 Error Handling Strategy​


8. Testing Strategy​

8.1 Test Pyramid​

8.2 Test Coverage Goals​

Component          Unit   Integration   E2E
UI Components      80%    60%           Critical paths
State Management   90%    70%           -
Services           85%    75%           -
API Integration    -      80%           Key workflows

8.3 Testing Examples​

// Unit Test Example
describe('SessionStore', () => {
  it('should create new session', () => {
    const initialCount = useSessionStore.getState().sessions.length;

    useSessionStore.getState().createSession();

    // Re-read the state: getState() snapshots are immutable in Zustand
    const { sessions, activeSessionId } = useSessionStore.getState();
    expect(sessions.length).toBe(initialCount + 1);
    expect(activeSessionId).toBe(sessions[initialCount].id);
  });
});

// Integration Test Example
describe('ChatPanel', () => {
  it('should send message and receive response', async () => {
    render(<ChatPanel />);

    const input = screen.getByPlaceholderText('Ask anything...');
    fireEvent.change(input, { target: { value: 'Hello' } });
    fireEvent.click(screen.getByText('Send'));

    await waitFor(() => {
      expect(screen.getByText(/response/i)).toBeInTheDocument();
    });
  });
});

// E2E Test Example
test('full llm workflow', async ({ page }) => {
  await page.goto('http://localhost:5173');

  // Select model
  await page.click('[data-testid="model-selector"]');
  await page.click('text=qwen/qwq-32b');

  // Send message
  await page.fill('[data-testid="chat-input"]', 'Hello');
  await page.click('[data-testid="send-button"]');

  // Verify response
  await expect(page.locator('[data-testid="message"]')).toContainText('response');
});

Appendix A: Performance Optimization​

Code Splitting Strategy​

// Route-based splitting
const IDElayout = lazy(() => import('./components/layout/IDElayout'));
const Settings = lazy(() => import('./components/Settings/Settings'));

// Component-based splitting (components must be capitalized for JSX)
const Monaco = lazy(() => import('@monaco-editor/react'));
const Xterminal = lazy(() => import('./components/terminal/Xterminal'));

Memoization Strategy​

// Expensive calculations
const processedMessages = useMemo(() => {
  return messages.map(formatMessage);
}, [messages]);

// Callback optimization
const handleSend = useCallback((prompt: string) => {
  llmService.sendPrompt(prompt);
}, []);

// Component memoization
export default React.memo(ChatPanel, (prev, next) => {
  return prev.messages.length === next.messages.length;
});

Document Version: 1.0 Last Updated: 2025-10-06 Next Review: 2025-11-06