# /moe-calibrate

Calibrate and manage MoE classification thresholds and confidence scores.
## Usage

```bash
# Show current calibration status
/moe-calibrate --status

# Show threshold statistics
/moe-calibrate --thresholds

# Show calibration curve
/moe-calibrate --curve

# Reset thresholds to defaults
/moe-calibrate --reset

# Bootstrap learning from existing documents
/moe-calibrate --bootstrap <path>

# Show analyst performance
/moe-calibrate --analysts
```
## Arguments

| Argument | Description |
|---|---|
| `--status` | Show overall calibration and learning status |
| `--thresholds` | Show current adaptive threshold values |
| `--curve` | Display the confidence calibration curve |
| `--reset` | Reset thresholds to default values |
| `--bootstrap <path>` | Bootstrap learning from documents with frontmatter |
| `--analysts` | Show analyst accuracy and weights |
| `--confirm <path> <type>` | Confirm a classification for learning |
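To make the argument surface concrete, the flags above could be parsed with a mutually exclusive group, since exactly one operation is expected per invocation. This is an illustrative sketch, not the command's actual runner; `build_parser` is a hypothetical helper.

```python
# Hypothetical sketch of flag parsing for /moe-calibrate; the real
# dispatch logic is not shown in this document.
import argparse

def build_parser():
    parser = argparse.ArgumentParser(prog="/moe-calibrate")
    group = parser.add_mutually_exclusive_group(required=True)
    group.add_argument("--status", action="store_true")
    group.add_argument("--thresholds", action="store_true")
    group.add_argument("--curve", action="store_true")
    group.add_argument("--reset", action="store_true")
    group.add_argument("--bootstrap", metavar="PATH")
    group.add_argument("--analysts", action="store_true")
    group.add_argument("--confirm", nargs=2, metavar=("PATH", "TYPE"))
    return parser

# Example: parse a --confirm invocation (nargs=2 yields a two-item list)
args = build_parser().parse_args(["--confirm", "docs/guides/SETUP.md", "guide"])
print(args.confirm)  # ['docs/guides/SETUP.md', 'guide']
```

The `required=True` group means invoking the command with no operation fails fast, matching the "No operation specified" failure indicator below.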
## System Prompt

You are the MoE Calibration Manager. Execute the requested calibration operation:

- For `--status`: Load and display stats from all enhancement modules (embeddings, learning, memory, thresholds, calibration)
- For `--thresholds`: Show current adaptive threshold values and recent adjustments
- For `--curve`: Display the confidence calibration curve showing predicted vs. actual accuracy
- For `--reset`: Reset adaptive thresholds to default values (0.90, 0.85, 0.60)
- For `--bootstrap`: Scan documents with frontmatter type declarations and use them as ground truth for learning
- For `--analysts`: Show accuracy and dynamic weights for each analyst
- For `--confirm`: Record a confirmed classification for learning feedback
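As a minimal sketch of what the `--bootstrap` step implies, ground-truth labels can be harvested by scanning markdown files for a YAML-style frontmatter block and reading its `type:` key. The `extract_type` and `bootstrap_labels` helpers below are illustrative assumptions, not the project's actual implementation:

```python
# Illustrative sketch of frontmatter-based bootstrapping (assumed format:
# a leading `---` block containing a `type: <value>` line).
import re
from pathlib import Path

FRONTMATTER = re.compile(r"\A---\n(.*?)\n---", re.DOTALL)

def extract_type(text):
    """Return the declared `type:` from a document's frontmatter, or None."""
    match = FRONTMATTER.match(text)
    if not match:
        return None
    for line in match.group(1).splitlines():
        key, _, value = line.partition(":")
        if key.strip() == "type":
            return value.strip() or None
    return None

def bootstrap_labels(root):
    """Map each markdown file under `root` to its declared type."""
    labels = {}
    for path in Path(root).rglob("*.md"):
        doc_type = extract_type(path.read_text(encoding="utf-8"))
        if doc_type:
            labels[str(path)] = doc_type
    return labels
```

Documents without a `type:` declaration are simply skipped, so only explicitly typed documents become training signal.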
Execute by running:

```python
import sys
sys.path.insert(0, 'scripts/moe_classifier')

from core.enhanced_orchestrator import get_enhanced_orchestrator
from core.learning import get_learner
from core.adaptive_thresholds import get_threshold_manager
from core.calibration import get_calibrator

# The orchestrator aggregates stats from every enhancement module
orchestrator = get_enhanced_orchestrator()
stats = orchestrator.get_stats()

# Display formatted stats
```
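For context on what `--curve` reports: a confidence calibration curve buckets past predictions by confidence and compares each bucket's mean confidence with its observed accuracy. The sketch below illustrates the idea only; it is not the `core.calibration` implementation.

```python
# Illustrative calibration-curve computation (not the project's code).
def calibration_curve(records, bins=10):
    """records: list of (confidence, was_correct) pairs.

    Returns (mean_confidence, observed_accuracy) per non-empty bin.
    """
    buckets = [[] for _ in range(bins)]
    for confidence, correct in records:
        index = min(int(confidence * bins), bins - 1)
        buckets[index].append((confidence, correct))
    curve = []
    for bucket in buckets:
        if bucket:
            mean_conf = sum(c for c, _ in bucket) / len(bucket)
            accuracy = sum(1 for _, ok in bucket if ok) / len(bucket)
            curve.append((round(mean_conf, 3), round(accuracy, 3)))
    return curve
```

A well-calibrated classifier shows mean confidence close to observed accuracy in every bucket; large gaps indicate over- or under-confidence that threshold adjustment should correct.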
## Examples

```bash
# Check calibration health
/moe-calibrate --status

# See which analysts are most accurate
/moe-calibrate --analysts

# Confirm a classification was correct
/moe-calibrate --confirm docs/guides/SETUP.md guide

# Bootstrap from existing typed documents
/moe-calibrate --bootstrap docs/
```
## Related Commands

- `/moe-workflow` - Full classification pipeline
- `/moe-learn` - Train from confirmed classifications
- `/classify` - Classify documents
## Success Output

When `/moe-calibrate` completes:

```text
✅ COMMAND COMPLETE: /moe-calibrate
Action: <status|thresholds|curve|reset|bootstrap|analysts>
Thresholds: <high>/<medium>/<low>
Analysts: N tracked
Accuracy: N%
```
## Completion Checklist
Before marking complete:
- Operation identified
- Data loaded
- Results displayed
- Status reported
## Failure Indicators
This command has FAILED if:
- ❌ No operation specified
- ❌ Data files not found
- ❌ Invalid threshold values
- ❌ Bootstrap path not found
## When NOT to Use
Do NOT use when:

- Just classifying documents (use `/classify`)
- Running the full pipeline (use `/moe-workflow`)
- Recording learning feedback (use `/moe-learn`)
## Anti-Patterns (Avoid)
| Anti-Pattern | Problem | Solution |
|---|---|---|
| Skip bootstrap | No training data | Bootstrap first |
| Reset without backup | Lost calibration | Export before reset |
| Ignore analysts | Poor accuracy | Review analyst stats |
## Principles
This command embodies:
- #9 Based on Facts - Data-driven thresholds
- #6 Clear, Understandable - Clear calibration stats
- #3 Complete Execution - Full calibration workflow
Full Standard: `CODITECT-STANDARD-AUTOMATION.md`