14. Technical Implementation
Overview
Dashboard 2.0 is implemented as a database-driven single-page application with a SQLite backend, a pure JavaScript frontend (no framework initially), and a RESTful API for data access. This document provides the complete technical specification for the implementation.
Technology Stack:
- Backend: Python 3.10+ with SQLite knowledge.db
- Frontend: Pure JavaScript (ES6+), HTML5, CSS3
- Charts: Chart.js or D3.js
- Search: Fuse.js for client-side fuzzy search
- Server: Python http.server or Flask for development
System Architecture Diagram
Data Flow:
- Markdown Files → Python Parsers → SQLite Database (one-time import)
- Frontend → Fetch data via REST API → Backend → Query database
- Frontend → Update task status → Backend → Update database + log history
14.1 Database Architecture
Existing Schema (knowledge.db)
Current tables (from v1.5):
CREATE TABLE IF NOT EXISTS messages (
id INTEGER PRIMARY KEY AUTOINCREMENT,
hash TEXT UNIQUE NOT NULL, -- SHA-256 message hash
role TEXT NOT NULL, -- 'user' or 'assistant'
content TEXT NOT NULL,
timestamp TEXT NOT NULL, -- ISO 8601 format
checkpoint_id INTEGER,
FOREIGN KEY (checkpoint_id) REFERENCES checkpoints(id)
);
CREATE TABLE IF NOT EXISTS checkpoints (
id INTEGER PRIMARY KEY AUTOINCREMENT,
title TEXT NOT NULL,
description TEXT,
created_at TEXT NOT NULL,
message_count INTEGER DEFAULT 0,
file_path TEXT -- Path to checkpoint markdown file
);
CREATE TABLE IF NOT EXISTS git_commits (
id INTEGER PRIMARY KEY AUTOINCREMENT,
hash TEXT UNIQUE NOT NULL,
repository TEXT NOT NULL,
author_name TEXT,
author_email TEXT,
date TEXT NOT NULL,
message TEXT NOT NULL,
files_changed TEXT -- JSON array
);
New Tables (Dashboard 2.0)
6 new tables for project-plan.md and tasklist.md integration:
-- Projects table (from project-plan.md)
CREATE TABLE IF NOT EXISTS projects (
id INTEGER PRIMARY KEY AUTOINCREMENT,
name TEXT UNIQUE NOT NULL, -- e.g., "coditect-cloud-backend"
title TEXT NOT NULL, -- Display name
description TEXT,
status TEXT DEFAULT 'active', -- active, in_progress, scheduled, completed, on_hold
phase TEXT, -- Phase 0, Phase 1, Phase 2, etc.
priority TEXT, -- P0, P1, P2, P3
start_date TEXT, -- ISO 8601
end_date TEXT,
budget REAL, -- USD
completion_percentage REAL DEFAULT 0.0,
created_at TEXT NOT NULL,
updated_at TEXT
);
-- Tasks table (from tasklist.md)
CREATE TABLE IF NOT EXISTS tasks (
id INTEGER PRIMARY KEY AUTOINCREMENT,
project_id INTEGER NOT NULL,
content TEXT NOT NULL, -- Task description
active_form TEXT NOT NULL, -- Present continuous form
status TEXT DEFAULT 'pending', -- pending, in_progress, completed
phase TEXT, -- Phase 0, Phase 1, Dashboard 2.0, etc.
priority TEXT, -- P0, P1, P2, P3
effort_hours REAL, -- Estimated effort
assignee TEXT,
tags TEXT, -- JSON array: ["backend", "database"]
checkpoint_id INTEGER, -- Associated checkpoint
created_at TEXT NOT NULL,
completed_at TEXT,
FOREIGN KEY (project_id) REFERENCES projects(id),
FOREIGN KEY (checkpoint_id) REFERENCES checkpoints(id)
);
-- Task dependencies
CREATE TABLE IF NOT EXISTS task_dependencies (
id INTEGER PRIMARY KEY AUTOINCREMENT,
task_id INTEGER NOT NULL,
depends_on_task_id INTEGER NOT NULL,
dependency_type TEXT DEFAULT 'finish_to_start', -- finish_to_start, start_to_start
FOREIGN KEY (task_id) REFERENCES tasks(id),
FOREIGN KEY (depends_on_task_id) REFERENCES tasks(id),
UNIQUE(task_id, depends_on_task_id)
);
-- Milestones (from project-plan.md phases)
CREATE TABLE IF NOT EXISTS milestones (
id INTEGER PRIMARY KEY AUTOINCREMENT,
project_id INTEGER NOT NULL,
name TEXT NOT NULL,
description TEXT,
target_date TEXT NOT NULL,
status TEXT DEFAULT 'pending', -- pending, achieved, missed
criteria TEXT, -- Success criteria
FOREIGN KEY (project_id) REFERENCES projects(id)
);
-- Task status history (audit trail)
CREATE TABLE IF NOT EXISTS task_status_history (
id INTEGER PRIMARY KEY AUTOINCREMENT,
task_id INTEGER NOT NULL,
old_status TEXT,
new_status TEXT NOT NULL,
changed_at TEXT NOT NULL,
changed_by TEXT, -- User or system
checkpoint_id INTEGER, -- Checkpoint when changed
FOREIGN KEY (task_id) REFERENCES tasks(id),
FOREIGN KEY (checkpoint_id) REFERENCES checkpoints(id)
);
-- Project metrics (calculated KPIs)
CREATE TABLE IF NOT EXISTS project_metrics (
id INTEGER PRIMARY KEY AUTOINCREMENT,
project_id INTEGER NOT NULL,
metric_name TEXT NOT NULL, -- completion_rate, velocity, etc.
value REAL NOT NULL,
calculated_at TEXT NOT NULL,
FOREIGN KEY (project_id) REFERENCES projects(id)
);
-- Indexes for performance
CREATE INDEX IF NOT EXISTS idx_tasks_project ON tasks(project_id);
CREATE INDEX IF NOT EXISTS idx_tasks_status ON tasks(status);
CREATE INDEX IF NOT EXISTS idx_tasks_phase ON tasks(phase);
CREATE INDEX IF NOT EXISTS idx_messages_checkpoint ON messages(checkpoint_id);
CREATE INDEX IF NOT EXISTS idx_commits_repo ON git_commits(repository);
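Nothing in the task_dependencies schema prevents a cycle (task A depends on B, B depends on A), which would deadlock any scheduling logic built on it. A cycle check can be run before inserting an edge using a recursive CTE. A minimal sketch (the helper name `would_create_cycle` is illustrative, not part of the spec):

```python
import sqlite3

def would_create_cycle(conn, task_id, depends_on_task_id):
    """Return True if adding the edge task_id -> depends_on_task_id would
    close a cycle, i.e. task_id is already reachable from depends_on_task_id."""
    row = conn.execute("""
        WITH RECURSIVE reachable(id) AS (
            SELECT depends_on_task_id FROM task_dependencies WHERE task_id = ?
            UNION
            SELECT td.depends_on_task_id
            FROM task_dependencies td JOIN reachable r ON td.task_id = r.id
        )
        SELECT 1 FROM reachable WHERE id = ? LIMIT 1
    """, (depends_on_task_id, task_id)).fetchone()
    return row is not None

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE task_dependencies (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    task_id INTEGER NOT NULL,
    depends_on_task_id INTEGER NOT NULL,
    UNIQUE(task_id, depends_on_task_id))""")
# Chain: task 1 depends on 2, task 2 depends on 3
conn.executemany(
    "INSERT INTO task_dependencies (task_id, depends_on_task_id) VALUES (?, ?)",
    [(1, 2), (2, 3)])
print(would_create_cycle(conn, 3, 1))  # True: 3 -> 1 would close the loop
print(would_create_cycle(conn, 1, 3))  # False: 1 -> 3 is a shortcut, not a cycle
```

The API layer could call this before the INSERT and reject the request with a 409 if it returns True.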
14.2 Data Migration & Parsing
tasklist.md Parser
Parse tasklist.md into tasks table:
#!/usr/bin/env python3
"""
Parse tasklist.md and populate tasks table
"""
import re
import sqlite3
from datetime import datetime
from pathlib import Path
def parse_tasklist(file_path):
"""
Parse tasklist.md and extract all tasks with metadata.
Returns:
List of task dicts with content, status, phase, priority, etc.
"""
with open(file_path, 'r', encoding='utf-8') as f:
content = f.read()
tasks = []
current_project = None
current_phase = None
# Regex patterns
project_pattern = r'^## (.+)$'
phase_pattern = r'^### (.+)$'
task_pattern = r'^\s*- \[([ x~])\] \*\*(.+?)\*\*'
for line in content.split('\n'):
# Match project header
if match := re.match(project_pattern, line):
current_project = match.group(1).strip()
# Match phase header
elif match := re.match(phase_pattern, line):
current_phase = match.group(1).strip()
# Match task checkbox
elif match := re.match(task_pattern, line):
checkbox = match.group(1)
task_content = match.group(2)
# Map checkbox to status
status_map = {
' ': 'pending',
'x': 'completed',
'~': 'in_progress'
}
status = status_map.get(checkbox, 'pending')
# Extract priority (P0, P1, P2, P3)
priority = None
if pri_match := re.search(r'P[0-3]', task_content):
priority = pri_match.group(0)
# Extract effort (e.g., "2h", "30m")
effort_hours = None
if effort_match := re.search(r'(\d+)h', task_content):
effort_hours = float(effort_match.group(1))
elif effort_match := re.search(r'(\d+)m', task_content):
effort_hours = float(effort_match.group(1)) / 60
tasks.append({
'project': current_project,
'phase': current_phase,
'content': task_content,
'status': status,
'priority': priority,
'effort_hours': effort_hours,
'created_at': datetime.utcnow().isoformat() + 'Z'
})
return tasks
def insert_tasks_to_db(tasks, db_path):
"""Insert parsed tasks into database."""
conn = sqlite3.connect(db_path)
cursor = conn.cursor()
for task in tasks:
# Get or create project
cursor.execute(
"SELECT id FROM projects WHERE name = ?",
(task['project'],)
)
project_row = cursor.fetchone()
if project_row:
project_id = project_row[0]
else:
cursor.execute("""
INSERT INTO projects (name, title, status, created_at)
VALUES (?, ?, 'active', ?)
""", (task['project'], task['project'], task['created_at']))
project_id = cursor.lastrowid
# Insert task
cursor.execute("""
INSERT INTO tasks (
project_id, content, active_form, status, phase,
priority, effort_hours, created_at
) VALUES (?, ?, ?, ?, ?, ?, ?, ?)
""", (
project_id,
task['content'],
f"Working on {task['content']}", # Generate active_form
task['status'],
task['phase'],
task['priority'],
task['effort_hours'],
task['created_at']
))
conn.commit()
conn.close()
print(f"✅ Inserted {len(tasks)} tasks into database")
if __name__ == '__main__':
tasklist_path = Path('docs/project-management/tasklist.md')
db_path = Path('MEMORY-CONTEXT/knowledge.db')
tasks = parse_tasklist(tasklist_path)
print(f"📋 Parsed {len(tasks)} tasks from tasklist.md")
insert_tasks_to_db(tasks, db_path)
project-plan.md Parser
Parse project-plan.md into projects and milestones tables:
#!/usr/bin/env python3
"""
Parse project-plan.md and populate projects/milestones tables
"""
import re
import sqlite3
from datetime import datetime
from pathlib import Path
def parse_project_plan(file_path):
"""Parse project-plan.md and extract project metadata."""
with open(file_path, 'r', encoding='utf-8') as f:
content = f.read()
projects = []
milestones = []
# Extract project sections (e.g., "## Dashboard 2.0: ...")
project_pattern = r'## (.+?)\n\n### Overview\n\*\*Status:\*\* (.+?) \| \*\*Priority:\*\* (.+?) \| \*\*Timeline:\*\* (.+?)\n\*\*Budget:\*\* \$(.+?) \|'
for match in re.finditer(project_pattern, content):
name = match.group(1).strip()
status_emoji = match.group(2).strip()
priority = match.group(3).strip()
timeline = match.group(4).strip()
budget = float(match.group(5).replace(',', ''))
# Map emoji to status
status_map = {
'✅': 'completed',
'🚧': 'in_progress',
'📅': 'scheduled',
'⏸️': 'on_hold'
}
status = status_map.get(status_emoji, 'active')
projects.append({
'name': name,
'status': status,
'priority': priority,
'timeline': timeline,
'budget': budget,
'created_at': datetime.utcnow().isoformat() + 'Z'
})
return projects, milestones
def insert_projects_to_db(projects, db_path):
"""Insert parsed projects into database."""
conn = sqlite3.connect(db_path)
cursor = conn.cursor()
for project in projects:
cursor.execute("""
INSERT INTO projects (
name, title, status, priority, budget, created_at
) VALUES (?, ?, ?, ?, ?, ?)
ON CONFLICT(name) DO UPDATE SET
status = excluded.status, priority = excluded.priority, budget = excluded.budget
""", (
project['name'],
project['name'],
project['status'],
project['priority'],
project['budget'],
project['created_at']
))
conn.commit()
conn.close()
print(f"✅ Inserted {len(projects)} projects into database")
if __name__ == '__main__':
plan_path = Path('docs/project-management/project-plan.md')
db_path = Path('MEMORY-CONTEXT/knowledge.db')
projects, milestones = parse_project_plan(plan_path)
print(f"📋 Parsed {len(projects)} projects from project-plan.md")
insert_projects_to_db(projects, db_path)
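Re-importing project-plan.md will hit the UNIQUE(name) constraint on projects. A plain INSERT OR REPLACE resolves the conflict by deleting and re-inserting the row, which assigns a new id and orphans any tasks.project_id rows pointing at the old one; an ON CONFLICT upsert (SQLite 3.24+) updates the existing row in place and preserves the id. A minimal sketch of the difference:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE projects (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    name TEXT UNIQUE NOT NULL,
    status TEXT DEFAULT 'active')""")

conn.execute("INSERT INTO projects (name, status) VALUES ('demo', 'active')")
old_id = conn.execute("SELECT id FROM projects WHERE name = 'demo'").fetchone()[0]

# INSERT OR REPLACE deletes the conflicting row and re-inserts, so the id changes:
conn.execute("INSERT OR REPLACE INTO projects (name, status) VALUES ('demo', 'completed')")
replaced_id = conn.execute("SELECT id FROM projects WHERE name = 'demo'").fetchone()[0]

# An upsert keeps the existing row (and its id) and only updates columns:
conn.execute("""INSERT INTO projects (name, status) VALUES ('demo', 'on_hold')
                ON CONFLICT(name) DO UPDATE SET status = excluded.status""")
upsert_id = conn.execute("SELECT id FROM projects WHERE name = 'demo'").fetchone()[0]

print(old_id, replaced_id, upsert_id)  # 1 2 2 -- REPLACE assigned a new id; the upsert kept it
```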
14.3 REST API Endpoints
Backend Flask Server
Simple Flask API for dashboard data:
from flask import Flask, jsonify, request
from flask_cors import CORS
import sqlite3
from pathlib import Path
app = Flask(__name__)
CORS(app) # Allow frontend to access
DB_PATH = Path(__file__).parent / 'MEMORY-CONTEXT/knowledge.db'
def query_db(query, args=(), one=False):
"""Query database and return results."""
conn = sqlite3.connect(DB_PATH)
conn.row_factory = sqlite3.Row
cursor = conn.execute(query, args)
rows = cursor.fetchall()
conn.close()
return (rows[0] if rows else None) if one else rows
@app.route('/api/projects', methods=['GET'])
def get_projects():
"""Get all projects with stats."""
projects = query_db("""
SELECT
p.id, p.name, p.title, p.status, p.phase, p.priority,
p.start_date, p.end_date, p.budget, p.completion_percentage,
COUNT(t.id) as task_count,
SUM(CASE WHEN t.status = 'completed' THEN 1 ELSE 0 END) as completed_tasks
FROM projects p
LEFT JOIN tasks t ON p.id = t.project_id
GROUP BY p.id
ORDER BY p.priority, p.name
""")
return jsonify([dict(row) for row in projects])
@app.route('/api/projects/<int:project_id>/tasks', methods=['GET'])
def get_project_tasks(project_id):
"""Get all tasks for a project."""
status = request.args.get('status') # Optional filter
query = """
SELECT * FROM tasks
WHERE project_id = ?
"""
args = [project_id]
if status:
query += " AND status = ?"
args.append(status)
query += " ORDER BY priority, created_at"
tasks = query_db(query, args)
return jsonify([dict(row) for row in tasks])
@app.route('/api/tasks/<int:task_id>', methods=['GET'])
def get_task(task_id):
"""Get single task with details."""
task = query_db("SELECT * FROM tasks WHERE id = ?", [task_id], one=True)
if not task:
return jsonify({'error': 'Task not found'}), 404
# Get dependencies
deps = query_db("""
SELECT t.* FROM tasks t
JOIN task_dependencies td ON t.id = td.depends_on_task_id
WHERE td.task_id = ?
""", [task_id])
# Get history
history = query_db("""
SELECT * FROM task_status_history
WHERE task_id = ?
ORDER BY changed_at DESC
""", [task_id])
return jsonify({
'task': dict(task),
'dependencies': [dict(d) for d in deps],
'history': [dict(h) for h in history]
})
@app.route('/api/tasks/<int:task_id>', methods=['PATCH'])
def update_task(task_id):
"""Update task status."""
data = request.json
new_status = data.get('status')
if not new_status:
return jsonify({'error': 'Status required'}), 400
conn = sqlite3.connect(DB_PATH)
cursor = conn.cursor()
# Get current status
cursor.execute("SELECT status FROM tasks WHERE id = ?", [task_id])
row = cursor.fetchone()
if not row:
conn.close()
return jsonify({'error': 'Task not found'}), 404
old_status = row[0]
# Update task (tasks has no updated_at column; stamp completed_at on completion)
cursor.execute("""
UPDATE tasks
SET status = ?,
completed_at = CASE WHEN ? = 'completed' THEN datetime('now') ELSE completed_at END
WHERE id = ?
""", [new_status, new_status, task_id])
# Log history
cursor.execute("""
INSERT INTO task_status_history (
task_id, old_status, new_status, changed_at, changed_by
) VALUES (?, ?, ?, datetime('now'), 'dashboard_user')
""", [task_id, old_status, new_status])
conn.commit()
conn.close()
return jsonify({'success': True})
@app.route('/api/analytics/completion', methods=['GET'])
def get_completion_rate():
"""Get overall completion rate."""
stats = query_db("""
SELECT
COUNT(*) as total_tasks,
SUM(CASE WHEN status = 'completed' THEN 1 ELSE 0 END) as completed,
SUM(CASE WHEN status = 'in_progress' THEN 1 ELSE 0 END) as in_progress,
SUM(CASE WHEN status = 'pending' THEN 1 ELSE 0 END) as pending
FROM tasks
""", one=True)
return jsonify(dict(stats))
@app.route('/api/search', methods=['GET'])
def search():
"""Search tasks, projects, checkpoints."""
query = request.args.get('q', '')
if len(query) < 2:
return jsonify([])
# Simple LIKE search (Fuse.js on frontend for fuzzy)
results = query_db("""
SELECT 'task' as type, id, content as title, project_id
FROM tasks
WHERE content LIKE ?
LIMIT 20
""", [f'%{query}%'])
return jsonify([dict(r) for r in results])
if __name__ == '__main__':
app.run(debug=True, port=5000)
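The PATCH handler above pairs the status UPDATE with a history INSERT; those two writes should commit or fail together. The same read-update-log sequence, sketched as a standalone stdlib-only function (using the connection's transaction context manager so a failure rolls back both writes; the function name is illustrative):

```python
import sqlite3
from datetime import datetime, timezone

def set_task_status(conn, task_id, new_status, changed_by="dashboard_user"):
    """Update a task's status and append a task_status_history row atomically."""
    now = datetime.now(timezone.utc).isoformat()
    with conn:  # commits on success, rolls back on exception
        row = conn.execute("SELECT status FROM tasks WHERE id = ?", (task_id,)).fetchone()
        if row is None:
            raise ValueError(f"task {task_id} not found")
        old_status = row[0]
        conn.execute("UPDATE tasks SET status = ? WHERE id = ?", (new_status, task_id))
        conn.execute("""INSERT INTO task_status_history
            (task_id, old_status, new_status, changed_at, changed_by)
            VALUES (?, ?, ?, ?, ?)""", (task_id, old_status, new_status, now, changed_by))
    return old_status, new_status

# Demo against a minimal in-memory copy of the relevant tables
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tasks (id INTEGER PRIMARY KEY, status TEXT)")
conn.execute("""CREATE TABLE task_status_history (
    id INTEGER PRIMARY KEY AUTOINCREMENT, task_id INTEGER, old_status TEXT,
    new_status TEXT, changed_at TEXT, changed_by TEXT)""")
conn.execute("INSERT INTO tasks (id, status) VALUES (1, 'pending')")
print(set_task_status(conn, 1, 'in_progress'))  # ('pending', 'in_progress')
```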
14.4 Frontend Architecture
File Structure
dashboard/
├── index.html
├── css/
│ ├── main.css
│ ├── kanban.css
│ ├── modals.css
│ └── responsive.css
├── js/
│ ├── main.js
│ ├── api.js # API calls
│ ├── kanban.js # Kanban board
│ ├── search.js # Global search
│ ├── filters.js # Filter modal
│ ├── charts.js # Analytics charts
│ └── utils.js # Helper functions
└── assets/
└── icons/
Main HTML (index.html)
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Dashboard 2.0 - CODITECT</title>
<link rel="stylesheet" href="css/main.css">
<link rel="stylesheet" href="css/kanban.css">
<link rel="stylesheet" href="css/modals.css">
<link rel="stylesheet" href="css/responsive.css">
</head>
<body>
<!-- Skip Links -->
<a href="#main-content" class="skip-link">Skip to main content</a>
<div class="dashboard-layout">
<!-- Sidebar -->
<aside class="sidebar" id="sidebar">
<!-- Navigation from 04-navigation.md -->
</aside>
<!-- Top Bar -->
<header class="topbar">
<!-- Search and controls from 11-global-search.md -->
</header>
<!-- Main Content -->
<main class="main-content" id="main-content">
<!-- Views: Portfolio, Kanban, Timeline, Analytics -->
</main>
</div>
<!-- Modals -->
<div id="filter-modal" class="modal" style="display: none;">
<!-- Filter modal from 09-filter-modal.md -->
</div>
<div id="task-detail-modal" class="modal" style="display: none;">
<!-- Task detail from 10-task-detail-modal.md -->
</div>
<!-- Scripts -->
<script src="https://cdn.jsdelivr.net/npm/chart.js"></script>
<script src="https://cdn.jsdelivr.net/npm/fuse.js/dist/fuse.js"></script>
<script src="js/utils.js"></script>
<script src="js/api.js"></script>
<script src="js/kanban.js"></script>
<script src="js/search.js"></script>
<script src="js/filters.js"></script>
<script src="js/charts.js"></script>
<script src="js/main.js"></script>
</body>
</html>
API Client (js/api.js)
/**
* API client for dashboard backend
*/
const API_BASE = 'http://localhost:5000/api';
class DashboardAPI {
async fetchProjects() {
const response = await fetch(`${API_BASE}/projects`);
if (!response.ok) throw new Error('Failed to fetch projects');
return response.json();
}
async fetchProjectTasks(projectId, status = null) {
let url = `${API_BASE}/projects/${projectId}/tasks`;
if (status) url += `?status=${status}`;
const response = await fetch(url);
if (!response.ok) throw new Error('Failed to fetch tasks');
return response.json();
}
async fetchTask(taskId) {
const response = await fetch(`${API_BASE}/tasks/${taskId}`);
if (!response.ok) throw new Error('Failed to fetch task');
return response.json();
}
async updateTask(taskId, updates) {
const response = await fetch(`${API_BASE}/tasks/${taskId}`, {
method: 'PATCH',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify(updates)
});
if (!response.ok) throw new Error('Failed to update task');
return response.json();
}
async search(query) {
const response = await fetch(`${API_BASE}/search?q=${encodeURIComponent(query)}`);
if (!response.ok) throw new Error('Search failed');
return response.json();
}
async fetchAnalytics() {
const response = await fetch(`${API_BASE}/analytics/completion`);
if (!response.ok) throw new Error('Failed to fetch analytics');
return response.json();
}
}
const api = new DashboardAPI();
14.5 Performance Optimization
Database Indexing
Already included in schema (see 14.1):
- idx_tasks_project: fast project-based queries
- idx_tasks_status: quick status filtering
- idx_tasks_phase: phase-based queries
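Whether a given query actually uses one of these indexes can be verified with EXPLAIN QUERY PLAN. A minimal sketch against an in-memory copy of the tasks table (exact plan wording varies by SQLite version):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE tasks (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    project_id INTEGER NOT NULL,
    status TEXT DEFAULT 'pending')""")
conn.execute("CREATE INDEX idx_tasks_status ON tasks(status)")

# The last column of each plan row is the human-readable detail string
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM tasks WHERE status = ?", ("pending",)
).fetchall()
print(plan[0][-1])  # e.g. "SEARCH tasks USING INDEX idx_tasks_status (status=?)"
```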
Frontend Caching
Cache API responses:
class CachedAPI extends DashboardAPI {
constructor() {
super();
this.cache = new Map();
this.cacheDuration = 60000; // 60 seconds
}
async fetchProjects() {
const cacheKey = 'projects';
if (this.cache.has(cacheKey)) {
const cached = this.cache.get(cacheKey);
if (Date.now() - cached.timestamp < this.cacheDuration) {
return cached.data;
}
}
const data = await super.fetchProjects();
this.cache.set(cacheKey, { data, timestamp: Date.now() });
return data;
}
invalidateCache(key = null) {
if (key) {
this.cache.delete(key);
} else {
this.cache.clear();
}
}
}
Lazy Loading
Load content on demand:
// Intersection Observer for lazy image loading
const imageObserver = new IntersectionObserver((entries) => {
entries.forEach(entry => {
if (entry.isIntersecting) {
const img = entry.target;
img.src = img.dataset.src;
imageObserver.unobserve(img);
}
});
});
document.querySelectorAll('img[data-src]').forEach(img => {
imageObserver.observe(img);
});
14.6 Deployment
Development Server
# Backend
cd backend
python -m venv venv
source venv/bin/activate
pip install flask flask-cors
python server.py
# Frontend
cd dashboard
python -m http.server 8000
Production Deployment
Option 1: Docker
# Dockerfile (requirements.txt must include flask, flask-cors, and gunicorn)
FROM python:3.10-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .
EXPOSE 5000
CMD ["gunicorn", "-w", "4", "-b", "0.0.0.0:5000", "server:app"]
Option 2: App Engine (GCP; for Cloud Run, deploy the container image from Option 1 instead)
# app.yaml (App Engine)
runtime: python310
handlers:
- url: /static
static_dir: dashboard
- url: /.*
script: auto