
/session-log - Append to Daily Session Log

Append entries to the daily session log with ISO 8601 timestamps (UTC). Supports project-scoped session logs (ADR-155) with automatic project detection. Automatically runs /classify after each update to maintain frontmatter metadata.

Project-Scoped Session Logs (ADR-155)

Session logs are now stored per-project with machine-specific isolation:

Path Structure:

~/.coditect-data/session-logs/
├── projects/                      # Project-scoped logs (ADR-155)
│   ├── {project-id}/              # Auto-detected from working directory
│   │   └── {machine-uuid}/        # Machine-specific isolation
│   │       └── SESSION-LOG-YYYY-MM-DD.md
│   └── PILOT/
│       └── d3d3a316-.../
│           └── SESSION-LOG-2026-02-04.md
└── SESSION-LOG-*.md               # Legacy (flat) structure

Project Detection:

# Auto-detect project from working directory
cd /path/to/my-project
/session-log "Work done" # → logs to projects/my-project/{machine}/

# Explicit project override
/session-log "Work done" --project PILOT

# Force legacy (flat) location
/session-log "Work done" --no-project
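The path layout above can be expressed as a small helper. This is an illustrative sketch only; `session_log_path` is a hypothetical name, not part of the shipped command:

```python
from pathlib import Path
from typing import Optional

def session_log_path(base: Path, date: str, project_id: Optional[str],
                     machine_uuid: Optional[str]) -> Path:
    """Compose the daily log path; project-scoped when both IDs are known."""
    if project_id and machine_uuid:
        # ADR-155 layout: projects/{project-id}/{machine-uuid}/SESSION-LOG-{date}.md
        return base / "projects" / project_id / machine_uuid / f"SESSION-LOG-{date}.md"
    # Fallback: legacy flat layout
    return base / f"SESSION-LOG-{date}.md"
```

Passing `None` for either identifier falls back to the legacy flat location, mirroring the `--no-project` behavior.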

Usage

# Quick entry
/session-log "Fixed cart API response mismatch"

# Detailed entry
/session-log "Cart API Fix" \
--issue "getCart() returned wrapped response" \
--fix "Extract .cart from response" \
--deployed "v1.0.2-cart-fix"

# Start new session
/session-log --new-session "Commerce API Fixes"

# View today's log
/session-log --view

# View specific date
/session-log --view --date 2025-12-30

# Include files modified
/session-log "Type alignment" --files "types/commerce.ts,useCart.ts"

# Include root cause
/session-log "Cart crash" \
--issue "items.length undefined" \
--root-cause "Backend wraps response, frontend didn't unwrap" \
--fix "Modified commerce.service.ts"

# Include task IDs (ADR-054 Track Nomenclature)
/session-log "Track Nomenclature Standard" \
--tasks "F.4.1.1,F.4.2.1,F.4.3.1,F.4.4.1" \
--fix "Created ADR-054 and full standard documentation"

# Range of tasks
/session-log "Cloud Context Sync" --tasks "A.9.1.1-A.9.4.5"

# Include bus activity since last entry (ADR-173)
/session-log "Session coordination work" --bus-activity

# Bus activity dump only (no work description)
/session-log --bus-activity --only

# Tag entry with LLM model (ADR-201 multi-provider)
/session-log "Crypto foundation work" --model kimi-k2.5 --tasks "C.4.1.1"

# Useful when running via claude-kimi, claude-deepseek, etc.
/session-log "API scaffold" --model deepseek-reasoner
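The --tasks values above use the four-level ADR-054 form (track letter plus three numeric levels). A hedged sketch of how such a value might be split and validated; `parse_tasks` is a hypothetical helper, and actual range expansion depends on the project plan, so a range is kept as its two endpoints:

```python
import re

# Four-level ADR-054 task ID, e.g. "A.9.1.1" (assumed form, per the examples above)
TASK_ID = re.compile(r"^[A-Z]\.\d+\.\d+\.\d+$")

def parse_tasks(spec: str):
    """Split a --tasks value into IDs; a 'start-end' range becomes an endpoint pair."""
    out = []
    for part in spec.split(","):
        part = part.strip()
        ids = part.split("-") if "-" in part else [part]
        for tid in ids:
            if not TASK_ID.match(tid):
                raise ValueError(f"not an ADR-054 task ID: {tid}")
        out.append(tuple(ids) if len(ids) > 1 else ids[0])
    return out
```

For example, `parse_tasks("A.9.1.1-A.9.4.5")` yields one `("A.9.1.1", "A.9.4.5")` endpoint pair.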

Options

| Option | Description |
|--------|-------------|
| message | Brief description of the task/fix (required for append) |
| --tasks IDS | Task IDs per ADR-054 (e.g., "A.9.1.1" or "F.4.1.1-F.4.4.1") |
| --issue TEXT | Description of the problem |
| --root-cause TEXT | Root cause analysis |
| --fix TEXT | What was done to fix it |
| --files LIST | Comma-separated list of modified files |
| --deployed VERSION | Deployment version tag |
| --new-session FOCUS | Start a new session with focus description |
| --view | View the log instead of appending |
| --date YYYY-MM-DD | Specify date (default: today UTC) |
| --no-classify | Skip auto-classification after update |
| --project ID | Explicit project ID (default: auto-detect from working directory) |
| --bus-activity | Include inter-session bus activity since last entry (ADR-173). Uses structured session_messages columns. |
| --model NAME | LLM model identifier for the Author line (e.g., kimi-k2.5, deepseek-reasoner). Overrides default Claude (Opus 4.6). Use when running via claude-kimi, claude-deepseek, etc. (ADR-201) |
| --no-project | DEPRECATED. Use legacy flat structure (prints deprecation warning) |

System Prompt

EXECUTION DIRECTIVE: When the user invokes /session-log, you MUST:

  1. Detect project context (ADR-155) - see Project Detection below
  2. Determine log file path based on project context
  3. Get current UTC date using date -u '+%Y-%m-%d'
  4. Check if log file exists for current UTC date
  5. If new date: CREATE new log file before appending (see Date Rollover below)
  6. Generate session UUID if starting a new session
  7. Check for errors since last run (see Error Checking below)
  7b. Check bus activity since last entry if --bus-activity specified (see Bus Activity below)
  8. Determine Author line — if --model specified, use Claude via {model} (e.g., Claude via kimi-k2.5); otherwise use Claude (Opus 4.6) or appropriate model identifier
  9. Append the entry with ISO 8601 timestamp (UTC), Author line, including any errors and bus activity found
  10. Run /classify on the log file to update frontmatter
  11. Dashboard Phase 1 refresh (ADR-170) — if project is dashboard-enabled, run Node.js metrics generator (see Dashboard Refresh below)
  12. Dashboard Phase 2 refresh (ADR-170) — if --tasks provided AND tasks mark completions, run AI narrative refresh (see Dashboard Refresh below)
  13. Confirm the entry was added with project context, model (if specified), and dashboard refresh status

PROJECT DETECTION (Step 1 - ADR-155):

# Step 1: Get machine UUID
MACHINE_UUID=$(cat ~/.coditect-data/machine-id.json 2>/dev/null | python3 -c "import sys,json; print(json.load(sys.stdin).get('machine_uuid',''))" 2>/dev/null)
if [ -z "$MACHINE_UUID" ]; then
  MACHINE_UUID=$(cat ~/.coditect/machine-id.json 2>/dev/null | python3 -c "import sys,json; print(json.load(sys.stdin).get('machine_uuid',''))" 2>/dev/null)
fi

# Step 2: Detect project (unless --no-project specified)
# J.27.4.2: DEPRECATION WARNING for --no-project
if [ "$NO_PROJECT" = "true" ]; then
  echo "⚠️ DEPRECATED: --no-project is deprecated. All session logs should use project-scoped paths (ADR-155)."
  echo "   Use --project <ID> to specify a project, or let auto-detection handle it."
  echo "   Legacy flat path support will be removed in v3.0.0."
fi
if [ "$NO_PROJECT" != "true" ]; then
  # Use explicit --project if provided, otherwise auto-detect
  if [ -n "$EXPLICIT_PROJECT" ]; then
    PROJECT_ID="$EXPLICIT_PROJECT"
  else
    # Auto-detect from working directory
    cd ~/.coditect && source .venv/bin/activate 2>/dev/null
    PROJECT_ID=$(python3 -c "from scripts.core.paths import discover_project; print(discover_project() or '')" 2>/dev/null)
  fi
fi

# Step 3: Determine log directory
SESSION_LOGS_BASE="$HOME/PROJECTS/.coditect-data/session-logs"
if [ -n "$PROJECT_ID" ] && [ -n "$MACHINE_UUID" ]; then
  # Project-scoped path (ADR-155)
  LOG_DIR="$SESSION_LOGS_BASE/projects/$PROJECT_ID/$MACHINE_UUID"
  mkdir -p "$LOG_DIR"
else
  # Legacy flat path (fallback)
  LOG_DIR="$SESSION_LOGS_BASE"
fi

# Step 4: Set log file path
CURRENT_UTC_DATE=$(date -u '+%Y-%m-%d')
LOG_FILE="$LOG_DIR/SESSION-LOG-${CURRENT_UTC_DATE}.md"

ERROR CHECKING (Step 7):

Before appending, check logs for errors since the last session log entry. Use the timestamp of the last ### entry in today's log file, or 24 hours ago if no entries exist:

# Get last entry timestamp from today's log (or use 24 hours ago)
if [ -f "$LOG_FILE" ]; then
  LAST_ENTRY=$(grep -E '^### [0-9]{4}-[0-9]{2}-[0-9]{2}T' "$LOG_FILE" | tail -1 | sed 's/### \([^ ]*\).*/\1/')
fi

# If no last entry, use 24 hours ago
if [ -z "$LAST_ENTRY" ]; then
  LAST_ENTRY=$(date -u -v-24H '+%Y-%m-%dT%H:%M:%SZ' 2>/dev/null || date -u -d '24 hours ago' '+%Y-%m-%dT%H:%M:%SZ')
fi
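The same lookup can be written portably in Python, sidestepping the BSD (`-v-24H`) vs GNU (`-d '24 hours ago'`) `date` split. A sketch; `last_entry_timestamp` is a hypothetical helper name:

```python
import re
from datetime import datetime, timedelta, timezone
from pathlib import Path

# Matches the '### <ISO 8601 UTC timestamp>' entry headers used in the log
HEADER = re.compile(r"^### (\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}Z)")

def last_entry_timestamp(log_file: Path) -> str:
    """Timestamp of the last entry header, or 24 hours ago if none exists."""
    last = None
    if log_file.exists():
        for line in log_file.read_text().splitlines():
            m = HEADER.match(line)
            if m:
                last = m.group(1)  # later matches overwrite earlier ones
    if last:
        return last
    cutoff = datetime.now(timezone.utc) - timedelta(hours=24)
    return cutoff.strftime("%Y-%m-%dT%H:%M:%SZ")
```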

Logs to scan for errors:

| Log File | Error Patterns | Category |
|----------|----------------|----------|
| hooks.log | "success": false, "level": "ERROR" | Hooks |
| context-watcher.log | ERROR, FATAL, panic | Context |
| context-watcher.error.log | Any new lines since last run | Context |
| context-indexer.log | ERROR, FATAL, failed | Context |
| backup-scheduled.log | ERROR, FATAL, failed | Backup |
| backup-scheduled-error.log | Any new lines since last run | Backup |
| token-limit-hook.log | ERROR, error, failed | Hooks |
| mcp-call-graph.log | ERROR, error, Exception | MCP |
| mcp-impact-analysis.log | ERROR, error, Exception | MCP |
| mcp-semantic-search.log | ERROR, error, Exception | MCP |
| classification-log.jsonl | "approval_type": "HUMAN_REVIEW_REQUIRED" | Classification |
| indexing-skipped.log | Any new lines since last run | Component Indexing |
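Scanning one log against its patterns might look like the following sketch; the `ERROR_PATTERNS` map (shown for two rows of the table) and `count_error_lines` are illustrative, not part of the command:

```python
import re

# Hypothetical per-log pattern map mirroring two rows of the table above
ERROR_PATTERNS = {
    "hooks.log": [r'"success": false', r'"level": "ERROR"'],
    "context-watcher.log": [r"\bERROR\b", r"\bFATAL\b", r"\bpanic\b"],
}

def count_error_lines(lines, patterns):
    """Count lines that match at least one of the log's error patterns."""
    compiled = [re.compile(p) for p in patterns]
    return sum(1 for line in lines if any(p.search(line) for p in compiled))
```

A line matching several patterns is counted once, so the result is a per-log error line count suitable for the summary table in the entry.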

Alerts to check:

| Alert File | Purpose | Action |
|------------|---------|--------|
| ~/PROJECTS/.coditect-data/alerts/indexing-skipped | Component indexing was skipped due to git lock | Run python3 ~/.coditect/scripts/component-indexer.py |

Legacy logs (not actively monitored):

| Log File | Status | Reason |
|----------|--------|--------|
| backup.log | Deprecated | Replaced by backup-scheduled.log |
| backup-stdout.log | Deprecated | Replaced by backup-scheduled.log |

BUS ACTIVITY CHECK (Step 7b - ADR-173):

When --bus-activity is specified, query the inter-session message bus for recent activity since the last log entry. Uses structured session_messages columns (not JSON payload parsing).

import sys
sys.path.insert(0, 'submodules/core/coditect-core')
from scripts.core.session_message_bus import get_session_message_bus

bus = get_session_message_bus()

# Query each channel for messages since last entry
# LAST_ENTRY is the timestamp from Step 7 error checking
channels = ["session_lifecycle", "task_broadcast", "operator_alert", "direct"]
bus_messages = []
for channel in channels:
    msgs = bus.poll(channel, since_id=0, limit=100)
    # Filter by created_at > LAST_ENTRY timestamp
    bus_messages.extend([m for m in msgs if m.created_at > LAST_ENTRY])

# Also get bus stats summary
stats = bus.stats()
Bus activity uses promoted columns only — no JSON payload parsing:

| Column | Source | Description |
|--------|--------|-------------|
| created_at | session_messages.created_at | Message timestamp |
| sender_id | session_messages.sender_id | Session that sent the message |
| channel | session_messages.channel | Channel name (lifecycle, task, alert, direct) |
| message_type | session_messages.message_type | Event type (started, ended, conflict, etc.) |
| task_id | session_messages.task_id | Task ID if applicable |
| priority | session_messages.priority | 0=routine, 1=normal, 2=high, 3=critical |
| status | session_messages.status | pending, delivered, or read |
| project_id | session_messages.project_id | Project context |

Why structured columns? ADR-173 promoted these fields from the JSON payload to dedicated columns for exactly this use case — querying and displaying without JSON parsing.
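Rendering polled messages into the entry table (shown later under Entry Format) can be sketched from the promoted columns alone; `BusMessage` and `render_bus_table` below are hypothetical stand-ins, not the real bus classes:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class BusMessage:
    # Stand-in for one session_messages row (promoted columns only)
    created_at: str
    sender_id: str
    channel: str
    message_type: str
    task_id: Optional[str]
    priority: int
    status: str

def render_bus_table(messages):
    """Format bus activity as the markdown table used in log entries."""
    rows = [
        "| Time | Sender | Channel | Type | Task | Priority | Status |",
        "|------|--------|---------|------|------|----------|--------|",
    ]
    for m in messages:
        rows.append(
            f"| {m.created_at} | {m.sender_id} | {m.channel} | {m.message_type} "
            f"| {m.task_id or '-'} | {m.priority} | {m.status} |"
        )
    return "\n".join(rows)
```

No JSON payload is touched: every cell comes from a dedicated column, which is exactly the query pattern ADR-173 was designed for.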

DASHBOARD REFRESH (Steps 11-12 - ADR-170):

After appending the session log entry, check if the current project has dashboard refresh enabled and trigger metrics/narrative regeneration.

import sys
sys.path.insert(0, 'submodules/core/coditect-core')
from scripts.core.dashboard_registry import is_dashboard_enabled, get_dashboard_config

# Step 11: Phase 1 — Always run metrics refresh for dashboard-enabled projects
if PROJECT_ID and is_dashboard_enabled(PROJECT_ID):
    config = get_dashboard_config(PROJECT_ID)
    if config:
        generator_script = config.get("generator_script", "")
        project_root = config.get("project_root", "")
        # Run Node.js metrics generator (0 AI tokens, <1s)
        # node <project_root>/<generator_script> --project <PROJECT_ID>
        # This updates project-dashboard-data.json with fresh metrics

# Step 12: Phase 2 — Conditional AI narrative refresh
# Only trigger when --tasks provided (indicates task completion milestone)
if TASKS_PROVIDED and PROJECT_ID and is_dashboard_enabled(PROJECT_ID):
    # Run /project-status --update equivalent
    # This generates executive summary, risks, recommendations (~12K tokens)
    # Trigger: session log entry with --tasks flag = work milestone
    pass  # Implemented via /project-status --update invocation

Phase 1 triggers on EVERY session log append (cheap: 0 AI tokens, <1s). Phase 2 triggers ONLY when --tasks is provided (expensive: ~12K tokens).

Rationale (ADR-170 D6): Session log entries without --tasks are status updates (cheap metrics refresh is sufficient). Entries WITH --tasks indicate completed work milestones that warrant narrative regeneration.
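The two-phase decision reduces to a small function. A sketch of the policy, with `refresh_phases` as a hypothetical name:

```python
def refresh_phases(dashboard_enabled: bool, tasks_provided: bool) -> list:
    """ADR-170 two-phase policy: metrics on every append, narrative on milestones."""
    if not dashboard_enabled:
        return []
    phases = ["metrics"]            # Phase 1: 0 AI tokens, <1s
    if tasks_provided:
        phases.append("narrative")  # Phase 2: ~12K tokens
    return phases
```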

CRITICAL: Date Rollover Logic

Before ANY append operation, check if a new log file is needed:

# LOG_FILE is already set by Project Detection step above
# It will be either:
# - Project-scoped: ~/.coditect-data/session-logs/projects/{project}/{machine}/SESSION-LOG-YYYY-MM-DD.md
# - Legacy: ~/.coditect-data/session-logs/SESSION-LOG-YYYY-MM-DD.md

# If file doesn't exist, CREATE IT before appending
if [ ! -f "$LOG_FILE" ]; then
  # Create new log file with frontmatter template
  # See "New Log File Template" section below
  :  # no-op placeholder; the template-creation logic goes here
fi

# Now append the entry to the correct file

Why this matters: Entries must ALWAYS go into the log file matching the UTC date of the entry timestamp. An entry at 2026-01-05T02:30:00Z MUST go into SESSION-LOG-2026-01-05.md, never into SESSION-LOG-2026-01-04.md.
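The rollover rule can be captured in one helper; `log_filename_for` is illustrative only:

```python
def log_filename_for(timestamp: str) -> str:
    """Map an ISO 8601 UTC timestamp to the daily log file it belongs in."""
    if not timestamp.endswith("Z"):
        raise ValueError("session log timestamps must be UTC ('Z' suffix)")
    return f"SESSION-LOG-{timestamp.split('T')[0]}.md"
```

For the example above, `log_filename_for("2026-01-05T02:30:00Z")` yields `SESSION-LOG-2026-01-05.md`, never the previous day's file.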

Log file locations (ADR-155):

| Mode | Path |
|------|------|
| Project-scoped (default) | ~/.coditect-data/session-logs/projects/{project-id}/{machine-uuid}/SESSION-LOG-YYYY-MM-DD.md |
| Legacy (--no-project) | ~/.coditect-data/session-logs/SESSION-LOG-YYYY-MM-DD.md |

Execution:

python3 "$CODITECT_CORE/scripts/session_log_manager.py" <action> [args] \
--log-dir "$LOG_DIR" \
--project "$PROJECT_ID"

Where:

  • $CODITECT_CORE = coditect-core directory (for scripts)
  • $LOG_DIR = determined by project detection (see above)
  • $PROJECT_ID = auto-detected or explicitly provided

Entry Format

Task Entry with Task IDs (Preferred - ADR-054)

### YYYY-MM-DDTHH:MM:SSZ - [A.9.1.1-A.9.1.5] Brief Title

**Author:** Claude (Opus 4.6)

**Tasks Completed:**
| Task ID | Description | Status |
|---------|-------------|--------|
| A.9.1.1 | Task description | |
| A.9.1.2 | Task description | |

- **Issue:** Problem description
- **Fix:** What was done
- **Files Modified:**
  - `path/to/file1.ts`

Single Task Entry

### YYYY-MM-DDTHH:MM:SSZ - [F.2.3.1] Brief Title

**Author:** Claude (Opus 4.6)

- **Task:** F.2.3.1 - Implement OAuth2 token validation
- **Fix:** What was done
- **Files Modified:**
  - `path/to/file.ts`

Entry with --model (ADR-201 Multi-Provider)

### YYYY-MM-DDTHH:MM:SSZ - [C.4.1.1] Brief Title

**Author:** Claude via kimi-k2.5

- **Task:** C.4.1.1 - E-Signature crypto foundation
- **Fix:** What was done

When --model is provided, the Author line changes from Claude (Opus 4.6) to Claude via {model}. This makes it immediately visible which LLM backend produced the work.
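A minimal sketch of the Author-line rule, using a hypothetical `author_line` helper:

```python
from typing import Optional

def author_line(model: Optional[str] = None) -> str:
    """Build the Author value; --model overrides the default identifier (ADR-201)."""
    return f"Claude via {model}" if model else "Claude (Opus 4.6)"
```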

Legacy Entry (No Task IDs)

### YYYY-MM-DDTHH:MM:SSZ - Brief Title

**Author:** Claude (Opus 4.6)

- **Issue:** Problem description
- **Root Cause:** Why it happened (optional)
- **Fix:** What was done
- **Files Modified:**
  - `path/to/file1.ts`
  - `path/to/file2.ts`
- **Deployed:** `version-tag`

Simple Entry

### YYYY-MM-DDTHH:MM:SSZ - Brief Title

**Author:** Claude (Opus 4.6)

Description of what was done.

Entry with Log Errors

### YYYY-MM-DDTHH:MM:SSZ - Brief Title

**Log Errors Since Last Entry:**
| Category | Log | Errors | Summary |
|----------|-----|--------|---------|
| Hooks | hooks.log | 3 | task-id-validator blocked (2), classify-document timeout (1) |
| Hooks | token-limit-hook.log | 0 | - |
| Context | context-watcher.error.log | 0 | - |
| Context | context-indexer.log | 0 | - |
| Backup | backup-scheduled-error.log | 1 | Variable naming issue |
| MCP | mcp-*.log | 0 | - |
| Classification | classification-log.jsonl | 2 | HUMAN_REVIEW_REQUIRED |

**Work Done:**
- Description of what was done

Entry with Bus Activity (ADR-173)

### YYYY-MM-DDTHH:MM:SSZ - Brief Title

**Inter-Session Bus Activity Since Last Entry:**
| Time | Sender | Channel | Type | Task | Priority | Status |
|------|--------|---------|------|------|----------|--------|
| 2026-02-11T06:15:22Z | claude-29583 | session_lifecycle | started | - | 1 | delivered |
| 2026-02-11T06:16:00Z | claude-29583 | task_broadcast | started | H.13.8 | 0 | pending |
| 2026-02-11T06:31:10Z | codex-41200 | operator_alert | project_conflict | - | 3 | pending |

**Bus Summary:** 2 sessions active, 3 messages since last entry, 1 alert pending

**Work Done:**
- Description of what was done

Session Header

## YYYY-MM-DDTHH:MM:SSZ - Session Start

**Session ID:** `uuid-here`
**Focus:** Session focus description
**Track Focus:** A (Backend), F (Docs)

Auto-Classification

After each append, the system automatically:

  1. Runs /classify on the log file
  2. Updates updated timestamp in frontmatter
  3. Refreshes moe_classified date
  4. Maintains moe_confidence score
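Steps 2 and 3 amount to rewriting two frontmatter fields. A minimal sketch; `touch_frontmatter` is hypothetical, and the real update is performed by /classify:

```python
import re
from datetime import datetime, timezone

def touch_frontmatter(text: str, now: datetime) -> str:
    """Refresh the updated/moe_classified frontmatter fields after an append."""
    stamp = now.strftime("%Y-%m-%dT%H:%M:%SZ")
    text = re.sub(r"(?m)^updated: .*$", f"updated: '{stamp}'", text)
    return re.sub(r"(?m)^moe_classified: .*$",
                  f"moe_classified: {now:%Y-%m-%d}", text)
```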

To skip classification:

/session-log "Quick note" --no-classify

Log File Structure

Project-Scoped (ADR-155 - Default):

~/.coditect-data/session-logs/
├── projects/
│   ├── PILOT/                          # Project ID
│   │   └── d3d3a316-09c6-8f41-.../     # Machine UUID
│   │       ├── SESSION-LOG-2026-02-03.md
│   │       └── SESSION-LOG-2026-02-04.md
│   └── CUST-avivatec-fpa/
│       └── d3d3a316-09c6-8f41-.../
│           └── SESSION-LOG-2026-02-04.md
└── SESSION-LOG-*.md                    # Legacy (flat) logs

Full path examples:

  • Project-scoped: ~/.coditect-data/session-logs/projects/PILOT/d3d3a316-09c6-8f41-4a3f-d93e422d199c/SESSION-LOG-2026-02-04.md
  • Legacy (flat): ~/.coditect-data/session-logs/SESSION-LOG-2026-02-04.md

Each file contains:

  • YAML frontmatter with metadata and session list
  • Session headers with UUIDs
  • Chronological task entries with timestamps

New Log File Template (Date Rollover)

When creating a new log file for a new UTC date, use this template:

---
title: Session Log YYYY-MM-DD
type: reference
component_type: session-log
version: 1.0.0
audience: contributor
status: active
summary: Development session log for YYYY-MM-DD
keywords:
- session-log
- development-history
tokens: ~500
created: 'YYYY-MM-DDTHH:MM:SSZ'
updated: 'YYYY-MM-DDTHH:MM:SSZ'
tags:
- session-log
sessions:
  - id: 'SESSION_UUID'
    start: 'YYYY-MM-DDTHH:MM:SSZ'
    focus: 'SESSION_FOCUS'
moe_confidence: 0.900
moe_classified: YYYY-MM-DD
---

## Session Log - YYYY-MM-DD

Consolidated development session log. All timestamps are ISO 8601 UTC.

---

## YYYY-MM-DDTHH:MM:SSZ - Session Start

**Session ID:** `SESSION_UUID`
**Focus:** SESSION_FOCUS

**Carried Forward Context:**
- [Brief context from previous day if continuing work]

---

Template Variables:

| Variable | Value |
|----------|-------|
| YYYY-MM-DD | Current UTC date (e.g., 2026-01-05) |
| HH:MM:SS | Current UTC time (e.g., 02:30:00) |
| SESSION_UUID | Generate with uuidgen or python3 -c "import uuid; print(uuid.uuid4())" |
| SESSION_FOCUS | Describe current work focus |

Automatic Rollover Behavior:

  1. When UTC date changes (midnight UTC), next /session-log call creates new file
  2. Previous day's log remains unchanged
  3. New log starts with "Session Start" entry
  4. Optional: Include "Carried Forward Context" for continuity

Examples

Morning Session Start

/session-log --new-session "Backend API Development - Track A.3"

Quick Fix Log

/session-log "Fixed 403 error on manifest" --fix "chmod 644" --deployed "v1.0.3"

Comprehensive Entry

/session-log "Cart Type Mismatch Resolution" \
--issue "TypeError: Cannot read .name of undefined" \
--root-cause "Frontend CartItem expected nested product object, backend sends flat structure" \
--fix "Rewrote CartItem type and all consumers to use flat fields" \
--files "types/commerce.ts,hooks/useCart.ts,components/cart/CartItem.tsx" \
--deployed "v1.0.5-cart-types-fix"

View Today's Progress

/session-log --view

Success Output

When entry is appended successfully:

✅ COMMAND COMPLETE: /session-log
Project: PILOT (auto-detected) # ADR-155
Machine: d3d3a316-09c6-8f41-4a3f-d93e422d199c
Entry added: YYYY-MM-DDTHH:MM:SSZ
Log file: projects/PILOT/.../SESSION-LOG-YYYY-MM-DD.md
Classification: Updated
Dashboard Phase 1: Refreshed (0.8s) # ADR-170 (if dashboard-enabled)
Dashboard Phase 2: Skipped (no --tasks) # ADR-170 (only with --tasks)

Legacy mode output:

✅ COMMAND COMPLETE: /session-log
Project: (none - legacy mode)
Entry added: YYYY-MM-DDTHH:MM:SSZ
Log file: SESSION-LOG-YYYY-MM-DD.md
Classification: Updated

Completion Checklist

Before marking complete:

  • UTC date verified - Entry goes to log matching timestamp date
  • New log file created if UTC date differs from existing log
  • Entry appended with ISO timestamp (UTC)
  • Task IDs included in header (if --tasks provided)
  • Author line includes model (if --model provided)
  • Log file exists at correct location
  • Bus activity included (if --bus-activity specified)
  • Classification updated (unless --no-classify)
  • Dashboard Phase 1 refreshed (if dashboard-enabled project)
  • Dashboard Phase 2 refreshed (if --tasks provided and dashboard-enabled)
  • Confirmation displayed

Failure Indicators

This command has FAILED if:

  • ❌ Log directory doesn't exist
  • ❌ Write permission denied
  • ❌ Invalid date format provided
  • ❌ Classification script failed
  • ❌ Entry written to wrong date's log file (timestamp vs filename mismatch)
  • ❌ New log file not created when UTC date changed

When NOT to Use

Do NOT use when:

  • Not in a CODITECT project (no log directory)
  • Entry is trivial (not worth logging)
  • Session not started (use --new-session first)

Anti-Patterns (Avoid)

| Anti-Pattern | Problem | Solution |
|--------------|---------|----------|
| Skip --new-session | Entries without context | Start sessions properly |
| Generic messages | Low-value logs | Be specific in entries |
| Skip classification | Outdated frontmatter | Omit --no-classify so auto-classification runs |
| Omit task IDs | No traceability to PILOT plan | Use --tasks with ADR-054 format |

Principles

This command embodies:

  • #9 Based on Facts - Records actual work done
  • #6 Clear, Understandable - ISO timestamps, structured format
  • #3 Complete Execution - Auto-classifies after append

Full Standard: CODITECT-STANDARD-AUTOMATION.md


**Command Version:** 2.4.0
**Created:** 2025-12-31T23:45:00Z
**Updated:** 2026-02-16
**Author:** CODITECT Core Team
**ADR:** ADR-155, ADR-170, ADR-173, ADR-201

Changelog:

  • v2.4.0 - J.17.6: ADR-170 dashboard auto-refresh. Step 11 runs Phase 1 (Node.js metrics, 0 tokens) after every append for dashboard-enabled projects. Step 12 runs Phase 2 (AI narrative, ~12K tokens) only when --tasks provided. Uses dashboard_registry.py for project lookup.
  • v2.3.0 - H.3.10: ADR-201 --model flag. Tag session log entries with LLM model identifier (e.g., --model kimi-k2.5). Author line changes to Claude via {model}. Explicit **Author:** line added to all entry format examples. Step 8 determines Author line from --model flag.
  • v2.2.0 - H.13.9: ADR-173 bus activity dump. --bus-activity flag queries session_messages using structured columns (not JSON parsing). Step 7b checks lifecycle, task, alert, and direct channels since last entry. Entry format includes bus activity table and summary.
  • v2.1.0 - J.27.4: Deprecated --no-project flag with warning. All logs should use project-scoped paths. Added migration script reference.
  • v2.0.0 - Project-scoped session logs (ADR-155): Auto-detects project from working directory, stores logs per-project with machine isolation. Added --project and --no-project options.
  • v1.5.0 - Removed run logging (unnecessary overhead); Fixed paths to use .coditect-data/ per Protected Installation Directive
  • v1.4.0 - Expanded error checking to all 11 log files (hooks, context, backup, MCP, classification)
  • v1.3.0 - Added error checking: scans logs for errors since last run
  • v1.2.0 - Added automatic date rollover: creates new log file when UTC date changes
  • v1.1.0 - Added --tasks option for ADR-054 track nomenclature traceability
  • v1.0.0 - Initial release with ISO timestamps and auto-classification