# How to Create a New Agent: Step-by-Step Guide

**Time Required:** 30-60 minutes | **Difficulty:** Intermediate | **Prerequisites:** Understanding of agent purpose, basic YAML knowledge | **Output:** Production-ready agent compliant with CODITECT standards
## Overview
This guide walks you through creating a new agent from scratch following CODITECT standards. You'll create an agent that's immediately usable, properly documented, and ready for team collaboration.
What You'll Build: A complete agent with:
- ✅ Proper YAML frontmatter
- ✅ Clear responsibilities and capabilities
- ✅ Working invocation examples
- ✅ Context awareness (optional but recommended)
- ✅ Integration with existing components
## Step 1: Define Your Agent's Purpose (5 minutes)
Before writing any code, answer these questions:
### 1.1 What does this agent do?
Write one sentence describing the agent's primary function.
Example: "Analyzes database schemas and generates migration scripts"
### 1.2 Who needs this agent?
Identify the user persona:
- Backend developers?
- DevOps engineers?
- Data analysts?
- Documentation writers?
### 1.3 What makes this agent unique?
How is it different from existing agents?
Example: "Unlike the general database-architect, this agent specializes in zero-downtime migrations for production systems"
### 1.4 When should someone use this agent?
List 3-5 specific scenarios:
- Scenario 1: Adding new column to production table
- Scenario 2: Changing data types without downtime
- Scenario 3: Creating indexes on large tables
- etc.
💡 Pro Tip: If you can't clearly answer these questions, your agent might be too vague or overlapping with existing agents. Refine the scope before proceeding.
## Step 2: Choose Agent Name and Create File (5 minutes)
### 2.1 Name Selection Rules

**Format:** `{function}-{specialty}.md`
Naming Guidelines:
- Use lowercase letters only
- Separate words with hyphens (kebab-case)
- Keep under 64 characters
- Be specific (avoid generic names like "helper" or "utility")
- Use domain terminology
**Good Examples:**
- `database-migration-specialist.md`
- `rust-performance-optimizer.md`
- `api-documentation-generator.md`
**Bad Examples:**
- `DatabaseMigration.md` (uppercase)
- `database_migration.md` (underscore)
- `helper.md` (too generic)
- `the-agent-that-does-database-migrations-safely.md` (too long)
### 2.2 Create the File

```bash
cd /path/to/coditect-core
touch .coditect/agents/database-migration-specialist.md
```
### 2.3 Open in Editor

```bash
# Use your preferred editor
code .coditect/agents/database-migration-specialist.md
# or
vim .coditect/agents/database-migration-specialist.md
```
## Step 3: Write YAML Frontmatter (10 minutes)

### 3.1 Start with Required Fields

Copy this template and customize:

```yaml
---
name: database-migration-specialist
description: Zero-downtime database migration specialist for production systems. Analyzes schema changes, generates safe migration scripts, and validates rollback procedures for PostgreSQL, MySQL, and SQL Server.
tools: Read, Write, Edit, Bash, Grep, Glob, TodoWrite
model: sonnet
---
```
**Field Customization:**

**`name`:** Must match the filename (without `.md`)

**`description`:**
- Summarize what, how, and for whom
- Include key technologies/frameworks
- Keep under 1024 characters
- No XML tags
- Make it findable (think: what keywords would someone search for?)
**`tools`:** Choose from the available tools based on what your agent needs:
- `Read` - Reading files (almost always needed)
- `Write` - Creating new files
- `Edit` - Modifying existing files
- `Bash` - Running shell commands
- `Grep` - Searching file contents
- `Glob` - Finding files by pattern
- `LS` - Listing directories
- `TodoWrite` - Task management (recommended for complex workflows)
- `WebSearch` - Web search (for research agents)
- `WebFetch` - Fetching web content (for research agents)
**`model`:**
- `sonnet` - Default, best balance (recommended for most agents)
- `opus` - More powerful, higher cost (for complex analysis)
- `haiku` - Faster, lower cost (for simple, repetitive tasks)
### 3.2 Add Context Awareness (Optional but Recommended)
This helps Claude automatically detect when to use your agent:
```yaml
---
name: database-migration-specialist
description: Zero-downtime database migration specialist for production systems.
tools: Read, Write, Edit, Bash, Grep, Glob, TodoWrite
model: sonnet

# Context Awareness DNA
context_awareness:
  auto_scope_keywords:
    database_operations: ["migration", "schema", "alter table", "add column", "modify column"]
    zero_downtime: ["production", "live database", "no downtime", "online migration"]
    database_systems: ["postgresql", "mysql", "sql server", "postgres", "mariadb"]
  entity_detection:
    sql_files: ["*.sql", "*migration*.sql", "*schema*.sql"]
    database_configs: ["database.yml", "db.config", ".env"]
  progress_checkpoints:
    - 25%: "Schema analysis complete - migration strategy identified"
    - 50%: "Migration script generated - rollback plan ready"
    - 75%: "Safety checks passed - ready for execution"
    - 100%: "Migration complete - database verified"
---
```
Benefits:
- Claude can auto-suggest this agent when it sees relevant keywords
- Progress tracking gives users visibility into long-running tasks
- Entity detection helps agent identify relevant files
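The `auto_scope_keywords` block only pays off if something scans incoming prompts for those keywords. A minimal sketch of such matching — illustrative only, since the actual CODITECT detection logic is not shown in this guide:

```python
# Sketch of keyword-based scope detection (illustrative only; the actual
# CODITECT auto-detection logic is not documented in this guide)
AUTO_SCOPE = {
    "database_operations": ["migration", "schema", "alter table", "add column"],
    "zero_downtime": ["production", "live database", "no downtime"],
}

def detect_scopes(prompt: str) -> set[str]:
    """Return the scope names whose keywords appear in the prompt."""
    text = prompt.lower()
    return {scope for scope, words in AUTO_SCOPE.items()
            if any(word in text for word in words)}

print(detect_scopes("Run a schema migration on the production table"))
```

Matching is case-insensitive substring search here; a production implementation would likely use word boundaries or fuzzy matching.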
## Step 4: Write Agent Role Statement (5 minutes)
After the YAML frontmatter, write 1-2 paragraphs in second person explaining who the agent is:
```markdown
---
[YAML frontmatter above]
---

You are a database migration specialist focused on zero-downtime schema changes in production environments. Your expertise covers PostgreSQL, MySQL, and SQL Server, with deep knowledge of online DDL operations, index creation strategies, and safe column modifications.

Your primary responsibility is to analyze proposed schema changes, generate migration scripts that maintain system availability, and create comprehensive rollback procedures. You prioritize data integrity and system uptime above migration speed.
```
Writing Tips:
- Use second person ("You are...", "Your role is...")
- Be specific about expertise domain
- Mention key constraints or priorities
- Keep each paragraph to 2-3 sentences
## Step 5: Define Core Responsibilities (10 minutes)
List 3-7 specific responsibilities. Each should be an action the agent performs:
## Core Responsibilities

1. **Analyze Schema Changes**
   - Parse SQL DDL statements to identify risks
   - Detect operations that require downtime
   - Flag blocking operations (table locks, index builds on large tables)
   - Recommend alternative approaches for high-risk changes

2. **Generate Migration Scripts**
   - Create safe, incremental migration steps
   - Implement online DDL operations where possible
   - Add validation checkpoints between steps
   - Generate timing estimates for production execution

3. **Design Rollback Procedures**
   - Create automated rollback scripts for each migration
   - Document manual rollback steps for complex changes
   - Test rollback procedures in development environments
   - Establish rollback decision criteria

4. **Validate Production Readiness**
   - Check database version compatibility
   - Verify sufficient disk space for operations
   - Confirm backup procedures are in place
   - Review replication impact (if applicable)

5. **Execute with Safety Checks**
   - Run pre-migration validation queries
   - Monitor table lock duration
   - Track migration progress
   - Alert on anomalies during execution
Guidelines:
- Start each with an action verb (Analyze, Generate, Design, Validate, Execute)
- Include 2-4 sub-bullets explaining specifics
- Focus on what the agent DOES, not how it's used
- Be measurable and specific
## Step 6: Document Capabilities (15 minutes)
Break down capabilities into H3 subsections with clear examples:
## Capabilities
### 1. Zero-Downtime Column Addition
Adds new columns to tables without blocking writes.
**Process:**
1. Add column as NULL with default value
2. Backfill data in small batches (if needed)
3. Add NOT NULL constraint (if required)
4. Create indexes concurrently (PostgreSQL) or online (MySQL)
**Example:**

```sql
-- PostgreSQL example
ALTER TABLE users ADD COLUMN email_verified BOOLEAN DEFAULT FALSE;

-- Backfill (optional)
UPDATE users SET email_verified = TRUE WHERE email IS NOT NULL;

-- Add constraint
ALTER TABLE users ALTER COLUMN email_verified SET NOT NULL;
```

**Estimated Downtime:** 0 seconds (table remains accessible)
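Step 2 of the process (backfill in small batches) is usually driven from a script so that each batch commits its own short transaction. A sketch, where `run_sql` is an assumed helper that executes one statement and returns the affected row count:

```python
# Sketch of a batched backfill loop. run_sql is an assumed helper that runs
# one statement in its own transaction and returns the affected row count.
BATCH_SQL = """
UPDATE users SET email_verified = TRUE
WHERE id IN (
    SELECT id FROM users
    WHERE email_verified IS FALSE AND email IS NOT NULL
    LIMIT 1000
)
"""

def backfill(run_sql) -> int:
    """Repeat small UPDATEs until no rows remain; returns total rows updated."""
    total = 0
    while True:
        updated = run_sql(BATCH_SQL)  # short transaction: at most 1000 row locks
        if updated == 0:
            return total
        total += updated
```

Keeping batches small bounds lock duration and replication lag; the batch size of 1000 is a starting point to tune per table.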
### 2. Safe Index Creation

Creates indexes without blocking table writes.

**PostgreSQL:**

```sql
CREATE INDEX CONCURRENTLY idx_users_email ON users(email);
```

**MySQL 5.7+:**

```sql
CREATE INDEX idx_users_email ON users(email) ALGORITHM=INPLACE LOCK=NONE;
```

**Safety Measures:**
- Monitor disk space (indexes can be 50-100% of table size)
- Run during low-traffic periods
- Set a statement timeout to prevent indefinite locks
### 3. Data Type Changes

Safely modifies column data types with minimal disruption.

**Strategy:**
1. Create new column with target type
2. Dual-write to both columns
3. Backfill new column from old column
4. Switch application to read new column
5. Drop old column

**Example Migration:**

```sql
-- Step 1: Add new column
ALTER TABLE orders ADD COLUMN total_cents BIGINT;

-- Step 2: Dual-write (application code)
-- INSERT INTO orders (total_cents, total_dollars) VALUES (..., ...)

-- Step 3: Backfill
UPDATE orders SET total_cents = total_dollars * 100 WHERE total_cents IS NULL;

-- Step 4: Application switch to total_cents

-- Step 5: Drop old column
ALTER TABLE orders DROP COLUMN total_dollars;
```
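The dual-write step (Step 2 above) lives in application code rather than SQL. A sketch of what it might look like, where `db` is an assumed DB-API style handle — names and schema here mirror the example, not a real codebase:

```python
# Sketch of the dual-write step: during the transition window the application
# writes both the old and new column. `db` is an assumed DB-API style handle.
def create_order(db, total_dollars: float) -> None:
    total_cents = int(round(total_dollars * 100))  # derive new column from old
    db.execute(
        "INSERT INTO orders (total_dollars, total_cents) VALUES (%s, %s)",
        (total_dollars, total_cents),
    )
```

Once the backfill completes and reads have switched to `total_cents`, this reverts to a single-column write and the old column can be dropped.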
**Guidelines:**
- Use H3 headings for each capability
- Include concrete code examples
- Explain the "why" not just the "how"
- Provide timing/performance estimates
- Note platform-specific considerations
---
## Step 7: Add Invocation Examples (5 minutes)
Show exactly how to use the agent:
## Invocation

### Direct Agent Call

```python
Task(subagent_type="database-migration-specialist",
     description="Generate zero-downtime migration for email verification",
     prompt="""
I need to add an email_verified column to the users table.

Context:
- Database: PostgreSQL 14
- Table size: 5 million rows
- Current schema: users(id, email, created_at)
- Requirement: Cannot block writes during migration

Generate:
1. Migration SQL script
2. Rollback script
3. Execution timeline estimate
""")
```
### Via Slash Command

```bash
# If you created a companion command
/migrate-schema --table users --change "add email_verified BOOLEAN" --db postgresql
```
### Integration with Other Agents

```python
# Use with codebase-analyzer to understand current schema
Task(subagent_type="codebase-analyzer",
     prompt="Analyze database schema in src/models/")

# Then use migration specialist
Task(subagent_type="database-migration-specialist",
     prompt="Generate migration based on schema analysis above")
```
**Key Points:**
- Always include Task() invocation pattern
- Show realistic, complete examples
- Include context that agent needs
- Demonstrate integration with other agents if applicable
---
## Step 8: Define Limitations (5 minutes)
Explicitly state what the agent does NOT do:
## Limitations
This agent is designed for schema migrations and does NOT:
- **Execute migrations directly:** Generates scripts for human review and execution
- **Handle data migrations:** Focuses on schema (DDL), not bulk data transformations (ETL)
- **Optimize existing queries:** For query performance, use `database-performance-optimizer` agent
- **Manage database infrastructure:** For provisioning/scaling, use `cloud-architect` agent
- **Design new schemas:** For initial schema design, use `database-architect` agent
**For these use cases, use these agents instead:**
- Data transformations → `data-pipeline-engineer`
- Query optimization → `database-performance-optimizer`
- Infrastructure → `cloud-architect`
- Schema design → `database-architect`
Guidelines:
- State clearly what's out of scope
- Recommend alternative agents for related tasks
- Prevent scope creep
- Set clear expectations
## Step 9: Document Integration (5 minutes)
Show how this agent fits into the broader ecosystem:
## Integration
### Related Components
**Agents:**
- `database-architect` - Use BEFORE this agent for initial schema design
- `database-performance-optimizer` - Use AFTER migrations to optimize queries
- `codi-devops-engineer` - Coordinates deployment of migrations to production
**Skills:**
- `database-migration-patterns` - Reusable migration templates
- `sql-validation` - Validates generated SQL syntax
**Commands:**
- `/analyze-schema` - Quick schema analysis
- `/generate-migration` - Interactive migration generator
**Scripts:**
- `scripts/validate-migration.py` - Pre-execution validation
- `scripts/estimate-migration-time.py` - Duration estimation
### Workflow Example
```bash
# 1. Analyze current schema
/analyze-schema --database production --table users

# 2. Generate migration
Task(subagent_type="database-migration-specialist", ...)

# 3. Validate migration
python3 scripts/validate-migration.py migration.sql

# 4. Review and approve
# Human review of generated scripts

# 5. Execute in staging
# Test migration in staging environment

# 6. Execute in production
# Run migration during maintenance window
```
---
## Step 10: Validate and Test (10 minutes)
### 10.1 Run Automated Validation
```bash
cd /path/to/coditect-core
python3 .coditect/scripts/validate-agent.py agents/database-migration-specialist.md
```

**Expected Output:**

```
✅ YAML frontmatter valid
✅ Required fields present
✅ Name matches filename
✅ Description under 1024 chars
✅ Tools are valid
✅ Model is valid
✅ Sections present: Core Responsibilities, Capabilities, Invocation, Limitations, Integration
✅ Invocation examples include Task() pattern

Score: 95/100 (Grade A)
```
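The score line can be read as deductions per failed check. A hypothetical sketch — the validator's real scoring formula is not documented in this guide, so treat this only as a mental model:

```python
# Hypothetical scoring model: each failed check deducts points, and the
# letter grade follows the score. The real validator's formula may differ.
def grade(failed_checks: int, penalty: int = 5) -> tuple[int, str]:
    score = max(0, 100 - failed_checks * penalty)
    letter = "A" if score >= 90 else "B" if score >= 80 else "C" if score >= 70 else "F"
    return score, letter

print(grade(1))  # one failed check under this model yields 95, grade A
```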
### 10.2 Manual Quality Checks

**Readability:**
- Can someone unfamiliar with the domain understand the purpose?
- Are examples concrete and realistic?
- Is the language clear and jargon-free (or jargon explained)?

**Completeness:**
- Does it answer "what", "when", "how", and "why"?
- Are there at least 2 invocation examples?
- Are limitations clearly stated?
- Are related components linked?

**Accuracy:**
- Are code examples syntactically correct?
- Are platform-specific details accurate?
- Are tool names spelled correctly?
### 10.3 Test Invocation

Actually try using the agent:

```python
Task(subagent_type="database-migration-specialist",
     description="Test agent with simple migration",
     prompt="Generate a migration to add a created_at timestamp column to the products table in PostgreSQL 14")
```
Verify:
- Agent loads successfully
- Instructions are clear enough for Claude to follow
- Output matches expected format
## Step 11: Commit to Repository (5 minutes)

### 11.1 Git Add

```bash
cd /path/to/coditect-core
git add .coditect/agents/database-migration-specialist.md
```
### 11.2 Commit with Conventional Format

```bash
git commit -m "feat(agents): Add database-migration-specialist agent

- Zero-downtime schema migration specialist
- Supports PostgreSQL, MySQL, SQL Server
- Generates safe migration + rollback scripts
- Includes validation and timing estimates
- Grade A (95/100) compliance with CODITECT standards

Related: #123 (if there's an issue)"
```
### 11.3 Push to Remote

```bash
git push origin feature/database-migration-specialist
```
### 11.4 Create Pull Request

**PR Title:** `feat(agents): Add database-migration-specialist`

**PR Description:**

```markdown
## Summary

Adds new agent for zero-downtime database schema migrations in production environments.

## Agent Details

- **Name:** database-migration-specialist
- **Purpose:** Generate safe migration scripts for PostgreSQL, MySQL, SQL Server
- **Capabilities:** Column addition, index creation, data type changes, rollback procedures
- **Quality Score:** 95/100 (Grade A)

## Testing

- [x] Automated validation passed
- [x] Manual invocation tested
- [x] Integration with database-architect verified
- [x] Documentation reviewed

## Checklist

- [x] YAML frontmatter complete
- [x] All required sections present
- [x] Examples are concrete and accurate
- [x] Related components documented
- [x] Limitations clearly stated
```
## Step 12: Activate Agent (Optional)

### 12.1 Decide on Activation

**Activate immediately if:**
- Agent is production-ready
- Team needs it now
- Agent has been tested

**Wait to activate if:**
- Agent is experimental
- Team wants to review first
- Dependencies aren't ready
### 12.2 Activation Command

```bash
python3 .coditect/scripts/update-component-activation.py activate agent database-migration-specialist \
  --reason "Production-ready zero-downtime migration specialist for database teams"
```
### 12.3 Commit Activation

```bash
git add .coditect/component-activation-status.json
git commit -m "chore: Activate database-migration-specialist agent

Agent is production-ready and tested. Enables zero-downtime schema migrations
for database engineering team."
git push origin main
```
## Step 13: Document Usage (5 minutes)

### 13.1 Update AGENT-INDEX.md

Add your agent to the catalog:
```markdown
### Database & Infrastructure

- **database-migration-specialist** - Zero-downtime schema migrations for PostgreSQL, MySQL, SQL Server
  - Use when: Modifying production database schemas without downtime
  - Capabilities: Column addition, index creation, data type changes, rollback procedures
  - Output: Safe migration scripts + rollback scripts + timing estimates
```
### 13.2 Update Related Documentation

If your agent relates to existing workflows, update those docs:

```bash
# Example: Update database workflow guide
vim docs/workflows/database-management.md
# Add section about migration agent
```
### 13.3 Create Example Usage (Optional)

Create a working example in `examples/`:

```bash
mkdir -p examples/database-migration-specialist
cat > examples/database-migration-specialist/add-email-verification.md << 'EOF'
# Example: Adding Email Verification Column

## Scenario

Add email_verified column to users table (5M rows) in production PostgreSQL without downtime.

## Agent Invocation

    Task(subagent_type="database-migration-specialist",
         description="Zero-downtime email verification column migration",
         prompt="""Generate migration for adding email_verified column to users table.

    Database: PostgreSQL 14
    Table: users (5 million rows)
    Current schema: id, email, name, created_at
    Requirement: No downtime during migration
    """)

## Generated Output

[Include actual output from agent]

## Execution Results

- Migration time: 45 minutes
- Downtime: 0 seconds
- Rows affected: 5,000,000
- Success: ✅
EOF
```
---
## Troubleshooting
### Issue 1: Validation Fails on YAML
**Error:** "Invalid YAML frontmatter"
**Cause:** Syntax error in YAML (missing colon, improper indentation)
**Fix:**
- Check YAML syntax: https://www.yamllint.com/
- Ensure proper indentation (2 spaces, not tabs)
- Verify all required fields present
### Issue 2: Agent Name Mismatch
**Error:** "Agent name 'database-migration' does not match filename 'database-migration-specialist.md'"
**Fix:**
```yaml
# Change name field to match filename (without .md)
---
name: database-migration-specialist
---
```
### Issue 3: Description Too Long

**Error:** "Description exceeds 1024 character limit (current: 1400)"

**Fix:**
- Shorten description to core value proposition
- Move details to agent body
- Keep under 1024 characters
### Issue 4: Agent Doesn't Load

**Symptom:** `Task(subagent_type="database-migration-specialist", ...)` fails

**Checks:**
- Verify file is in the `.coditect/agents/` directory
- Check YAML frontmatter is valid
- Ensure `name` field matches filename
- Verify agent is activated (if using activation system)
## Best Practices Summary

**Do:**
- ✅ Start with clear purpose and scope
- ✅ Use descriptive, specific names
- ✅ Provide concrete, realistic examples
- ✅ Test agent invocation before committing
- ✅ Document integrations with related components
- ✅ State limitations explicitly
- ✅ Follow CODITECT standards
**Don't:**
- ❌ Use vague or generic names
- ❌ Create overlapping agents (check existing agents first)
- ❌ Skip validation steps
- ❌ Forget to test actual invocation
- ❌ Leave limitations implicit
- ❌ Use Markdown headers instead of YAML frontmatter
- ❌ Exceed token limits in descriptions
## Quick Reference Checklist
Before submitting your agent:
**File Structure:**
- [ ] File in `.coditect/agents/{agent-name}.md`
- [ ] Filename is lowercase with hyphens
- [ ] Filename matches YAML `name` field

**YAML Frontmatter:**
- [ ] Starts with `---`
- [ ] Contains `name`, `description`, `tools`, `model`
- [ ] Name matches filename (without `.md`)
- [ ] Description under 1024 characters
- [ ] Tools are valid tool names
- [ ] Model is `sonnet`, `opus`, or `haiku`

**Content:**
- [ ] Agent role statement (1-2 paragraphs)
- [ ] Core Responsibilities section (3-7 items)
- [ ] Capabilities section with examples
- [ ] Invocation section with Task() pattern
- [ ] Limitations section
- [ ] Integration section

**Quality:**
- [ ] Automated validation passed (Grade B+ minimum)
- [ ] Manual invocation tested
- [ ] Examples are accurate and complete
- [ ] Documentation clear and concise

**Git:**
- [ ] Committed with conventional commit message
- [ ] Pushed to feature branch
- [ ] Pull request created with description
## Next Steps
After creating your agent:

1. **Create related components** (if needed):
   - Skill: See HOW-TO-CREATE-NEW-SKILL.md
   - Command: See HOW-TO-CREATE-NEW-COMMAND.md
   - Script: See HOW-TO-CREATE-NEW-SCRIPT.md

2. **Add to workflows:**
   - Update workflow documentation
   - Create example use cases
   - Train team on new capabilities

3. **Monitor usage:**
   - Track invocations in analytics
   - Gather user feedback
   - Iterate based on real-world use
## Need Help?
- Review CODITECT-STANDARD-AGENTS.md
- Check existing agents for examples
- Ask in #coditect-development Slack channel
- Review Agent SDK documentation
---

**Document Version:** 1.0.0 | **Last Updated:** December 3, 2025 | **Maintainer:** CODITECT Core Team