/moe-calibrate

Calibrate and manage MoE (Mixture-of-Experts) classification thresholds and confidence scores.

Usage

# Show current calibration status
/moe-calibrate --status

# Show threshold statistics
/moe-calibrate --thresholds

# Show calibration curve
/moe-calibrate --curve

# Reset thresholds to defaults
/moe-calibrate --reset

# Bootstrap learning from existing documents
/moe-calibrate --bootstrap <path>

# Show analyst performance
/moe-calibrate --analysts

Arguments

Argument                   Description
--status                   Show overall calibration and learning status
--thresholds               Show current adaptive threshold values
--curve                    Display confidence calibration curve
--reset                    Reset thresholds to default values
--bootstrap <path>         Bootstrap learning from documents with frontmatter
--analysts                 Show analyst accuracy and weights
--confirm <path> <type>    Confirm a classification for learning

System Prompt

You are the MoE Calibration Manager. Execute the requested calibration operation:

  1. For --status: Load and display stats from all enhancement modules (embeddings, learning, memory, thresholds, calibration)

  2. For --thresholds: Show current adaptive threshold values and recent adjustments

  3. For --curve: Display the confidence calibration curve showing predicted vs actual accuracy

  4. For --reset: Reset adaptive thresholds to default values (0.90, 0.85, 0.60)

  5. For --bootstrap: Scan documents with frontmatter type declarations and use them as ground truth for learning

  6. For --analysts: Show accuracy and dynamic weights for each analyst

  7. For --confirm: Record a confirmed classification for learning feedback
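
The default thresholds in step 4 (0.90, 0.85, 0.60) can be read as confidence bands. A minimal sketch of that banding logic; the band names and decision labels here are illustrative assumptions, not the threshold module's actual API:

```python
# Default values restored by --reset; the band semantics are assumed.
DEFAULT_THRESHOLDS = {"high": 0.90, "medium": 0.85, "low": 0.60}

def band_for_confidence(confidence: float, thresholds=DEFAULT_THRESHOLDS) -> str:
    """Map a classifier confidence score to a decision band."""
    if confidence >= thresholds["high"]:
        return "auto-accept"      # confident enough to accept without review
    if confidence >= thresholds["medium"]:
        return "review"           # plausible, but worth a human check
    if confidence >= thresholds["low"]:
        return "low-confidence"   # keep, but flag for learning feedback
    return "reject"               # below the floor; do not classify

print(band_for_confidence(0.95))  # auto-accept
print(band_for_confidence(0.70))  # low-confidence
```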

Execute by running:

import sys
sys.path.insert(0, 'scripts/moe_classifier')

from core.enhanced_orchestrator import get_enhanced_orchestrator
from core.learning import get_learner                       # --bootstrap / --confirm
from core.adaptive_thresholds import get_threshold_manager  # --thresholds / --reset
from core.calibration import get_calibrator                 # --curve

orchestrator = get_enhanced_orchestrator()
stats = orchestrator.get_stats()
# Display formatted stats for the requested operation
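
The calibration curve from --curve compares predicted confidence against observed accuracy in bins. A self-contained sketch of that computation, independent of whatever `core.calibration` actually implements:

```python
from collections import defaultdict

def calibration_curve(records, n_bins=5):
    """records: list of (predicted_confidence, was_correct) pairs.

    Returns {bin_index: (mean_confidence, observed_accuracy, count)}.
    A well-calibrated classifier has mean_confidence close to
    observed_accuracy in every bin.
    """
    bins = defaultdict(list)
    for conf, correct in records:
        idx = min(int(conf * n_bins), n_bins - 1)  # clamp conf == 1.0 into last bin
        bins[idx].append((conf, correct))
    curve = {}
    for idx, items in sorted(bins.items()):
        mean_conf = sum(c for c, _ in items) / len(items)
        accuracy = sum(1 for _, ok in items if ok) / len(items)
        curve[idx] = (round(mean_conf, 3), round(accuracy, 3), len(items))
    return curve

records = [(0.9, True), (0.9, True), (0.9, False), (0.5, True), (0.5, False)]
print(calibration_curve(records))
```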

Examples

# Check calibration health
/moe-calibrate --status

# See which analysts are most accurate
/moe-calibrate --analysts

# Confirm a classification was correct
/moe-calibrate --confirm docs/guides/SETUP.md guide

# Bootstrap from existing typed documents
/moe-calibrate --bootstrap docs/

Related Commands
  • /moe-workflow - Full classification pipeline
  • /moe-learn - Train from confirmed classifications
  • /classify - Classify documents

Success Output

When moe-calibrate completes:

✅ COMMAND COMPLETE: /moe-calibrate
Action: <status|thresholds|curve|reset|bootstrap|analysts>
Thresholds: <high>/<medium>/<low>
Analysts: N tracked
Accuracy: N%

Completion Checklist

Before marking complete:

  • Operation identified
  • Data loaded
  • Results displayed
  • Status reported

Failure Indicators

This command has FAILED if:

  • ❌ No operation specified
  • ❌ Data files not found
  • ❌ Invalid threshold values
  • ❌ Bootstrap path not found

When NOT to Use

Do NOT use when:

  • Just classifying documents (use /classify)
  • Running full pipeline (use /moe-workflow)
  • Need learning feedback (use /moe-learn)

Anti-Patterns (Avoid)

Anti-Pattern           Problem            Solution
Skip bootstrap         No training data   Bootstrap first
Reset without backup   Lost calibration   Export before reset
Ignore analysts        Poor accuracy      Review analyst stats
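
The "export before reset" safeguard can be as simple as dumping the current threshold values to JSON before running --reset. A minimal sketch; the file name and dict shape are illustrative assumptions:

```python
import json

def export_thresholds(thresholds: dict, path: str) -> None:
    """Snapshot current threshold values so a --reset can be undone."""
    with open(path, "w") as fh:
        json.dump(thresholds, fh, indent=2)

def restore_thresholds(path: str) -> dict:
    """Reload a previously exported snapshot."""
    with open(path) as fh:
        return json.load(fh)

current = {"high": 0.92, "medium": 0.83, "low": 0.58}  # example values
export_thresholds(current, "thresholds_backup.json")
assert restore_thresholds("thresholds_backup.json") == current
```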

Principles

This command embodies:

  • #9 Based on Facts - Data-driven thresholds
  • #6 Clear, Understandable - Clear calibration stats
  • #3 Complete Execution - Full calibration workflow

Full Standard: CODITECT-STANDARD-AUTOMATION.md