# Production Folder Structure Skill
Expert skill for validating, detecting, and organizing project folder structures to CODITECT production standards.

## How to Use This Skill

- Review the patterns and examples below
- Apply the relevant patterns to your implementation
- Follow the best practices outlined in this skill
## When to Use

✅ Use this skill when:
- Validating project organization against production standards
- Detecting project type (frontend/backend/monorepo)
- Scoring production readiness (0-100 scale)
- Identifying misplaced files and directories
- Generating organization migration plans
- Analyzing root directory cleanliness
- Checking for required documentation files

Benefits:

- Time savings: roughly 75% faster than manual review (2 h → 30 min)
- Proven: raised Component Viewer readiness from 65% to 95%

❌ Don't use this skill when:
- Organizing file content (not folder structure)
- Refactoring code (code organization, not folders)
- Simple file moves (use basic mv instead)
- Project-specific customization without standards
## Core Methods

### 1. `detect_project_type(path) → ProjectType`

**Purpose:** Automatically detect the project type so the correct standards are applied.

**Detection logic:**
```python
import os


def detect_project_type(path: str) -> ProjectType:
    """
    Detects: frontend | backend | monorepo | unknown

    Priority order:
    1. Monorepo (if it has a workspace config)
    2. Frontend (if it has components/, public/)
    3. Backend (if it has routes/, models/, migrations/)
    4. Unknown (fall back to the universal standard only)
    """
    if has_workspace_config(path):
        return "monorepo"
    if has_frontend_structure(path):
        return "frontend"
    if has_backend_structure(path):
        return "backend"
    return "unknown"


def has_workspace_config(path: str) -> bool:
    """Check for monorepo workspace configuration."""
    indicators = [
        "pnpm-workspace.yaml",
        "nx.json",
        "turbo.json",
        "lerna.json",
    ]
    return any(os.path.exists(os.path.join(path, f)) for f in indicators)


def has_frontend_structure(path: str) -> bool:
    """Check for frontend project indicators."""
    indicators = [
        "src/components/",
        "src/pages/",
        "app/",  # Next.js
        "public/",
        "index.html",
    ]
    # Check package.json for frontend dependencies
    pkg_json = os.path.join(path, "package.json")
    if os.path.exists(pkg_json):
        with open(pkg_json) as f:
            content = f.read()
        if any(dep in content for dep in ["react", "vue", "svelte", "angular"]):
            return True
    return any(os.path.exists(os.path.join(path, f)) for f in indicators)


def has_backend_structure(path: str) -> bool:
    """Check for backend project indicators."""
    indicators = [
        "src/routes/",
        "src/api/",
        "src/handlers/",
        "src/models/",
        "src/controllers/",
        "migrations/",
        "alembic/",
        "main.py",
        "src/main.rs",
    ]
    # Check requirements.txt for backend dependencies
    req_txt = os.path.join(path, "requirements.txt")
    if os.path.exists(req_txt):
        with open(req_txt) as f:
            content = f.read()
        if any(dep in content for dep in ["fastapi", "django", "flask"]):
            return True
    return any(os.path.exists(os.path.join(path, f)) for f in indicators)
```
**Returns:**

```json
{
  "project_type": "frontend",
  "confidence": 0.95,
  "indicators_found": [
    "src/components/",
    "package.json with react",
    "public/ directory"
  ],
  "framework": "React + Vite",
  "applicable_standards": [
    "CODITECT-STANDARD-PRODUCTION-FOLDERS-UNIVERSAL",
    "CODITECT-STANDARD-PRODUCTION-FOLDERS-FRONTEND"
  ]
}
```
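The priority order above can be exercised with a small, self-contained sketch. The indicator lists here are trimmed to one representative check per type, and the function is renamed `detect_type_sketch` so it is clearly an illustration, not the full implementation:

```python
import os
import tempfile


def detect_type_sketch(path: str) -> str:
    """Simplified detection: workspace config > frontend dirs > backend dirs."""
    workspace = ["pnpm-workspace.yaml", "nx.json", "turbo.json", "lerna.json"]
    if any(os.path.exists(os.path.join(path, f)) for f in workspace):
        return "monorepo"
    if os.path.isdir(os.path.join(path, "src", "components")):
        return "frontend"
    if os.path.isdir(os.path.join(path, "migrations")):
        return "backend"
    return "unknown"


with tempfile.TemporaryDirectory() as root:
    # Workspace config wins even when frontend directories also exist
    os.makedirs(os.path.join(root, "src", "components"))
    open(os.path.join(root, "pnpm-workspace.yaml"), "w").close()
    print(detect_type_sketch(root))  # → monorepo
```

Because monorepos routinely contain frontend and backend subtrees, the workspace-config check must come first; checking frontend indicators first would misclassify most monorepos.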
### 2. `validate_structure(path, project_type) → ValidationResult`

**Purpose:** Score production readiness and identify violations.

**Validation categories:**
```python
import os


def validate_structure(path: str, project_type: str) -> ValidationResult:
    """
    Validates against all applicable standards.
    Returns a score (0-100) and detailed violations.
    """
    score = 0.0
    violations = []

    # Universal standard (applies to all projects, 60% weight)
    universal_score, universal_violations = validate_universal(path)
    score += universal_score * 0.6
    violations.extend(universal_violations)

    # Project-specific standard (40% weight)
    if project_type == "frontend":
        specific_score, specific_violations = validate_frontend(path)
    elif project_type == "backend":
        specific_score, specific_violations = validate_backend(path)
    elif project_type == "monorepo":
        specific_score, specific_violations = validate_monorepo(path)
    else:
        specific_score, specific_violations = 100, []
    score += specific_score * 0.4
    violations.extend(specific_violations)

    return ValidationResult(
        score=int(score),
        status=(
            "production_ready" if score >= 90
            else "needs_improvement" if score >= 70
            else "not_ready"
        ),
        violations=violations,
    )


def validate_universal(path: str) -> tuple[float, list]:
    """Validate against the universal standard (out of 100 points)."""
    score = 0.0
    violations = []

    # Required documentation (30 points: 5 per file)
    required_docs = ["README.md", "CLAUDE.md", "CONTRIBUTING.md",
                     "CHANGELOG.md", "LICENSE", "SECURITY.md"]
    for doc in required_docs:
        if os.path.exists(os.path.join(path, doc)):
            score += 5
        else:
            violations.append(f"Missing required file: {doc}")

    # docs/ hierarchy (30 points: 6 per directory)
    required_doc_dirs = [
        "docs/architecture/",
        "docs/architecture/adrs/",
        "docs/deployment/",
        "docs/project-management/",
        "docs/testing/",
    ]
    for doc_dir in required_doc_dirs:
        if os.path.exists(os.path.join(path, doc_dir)):
            score += 6
        else:
            violations.append(f"Missing directory: {doc_dir}")

    # scripts/ (20 points: 10 for the directory, ~3.33 per required script)
    if os.path.exists(os.path.join(path, "scripts/")):
        score += 10
        required_scripts = ["build.sh", "test-all.sh", "start-dev.sh"]
        for script in required_scripts:
            if os.path.exists(os.path.join(path, "scripts", script)):
                score += 3.33
            else:
                violations.append(f"Missing script: scripts/{script}")
    else:
        violations.append("Missing directory: scripts/")

    # .gitignore (20 points: 10 for the file, 2 per essential entry)
    gitignore_path = os.path.join(path, ".gitignore")
    if os.path.exists(gitignore_path):
        score += 10
        with open(gitignore_path) as f:
            gitignore = f.read()
        essential = ["node_modules", "dist", "build", ".env", "__pycache__"]
        for item in essential:
            if item in gitignore:
                score += 2
            else:
                violations.append(f".gitignore missing: {item}")
    else:
        violations.append("Missing file: .gitignore")

    return score, violations
```
**Returns:**

```json
{
  "score": 85,
  "status": "needs_improvement",
  "violations": [
    "Missing required file: SECURITY.md",
    "Missing directory: docs/testing/",
    "Missing script: scripts/deploy.sh"
  ],
  "passed_checks": 17,
  "total_checks": 20,
  "breakdown": {
    "documentation": {"score": 25, "max": 30},
    "hierarchy": {"score": 24, "max": 30},
    "scripts": {"score": 16, "max": 20},
    "configuration": {"score": 20, "max": 20}
  }
}
```
### 3. `find_misplaced_files(path) → list[MisplacedFile]`

**Purpose:** Identify files in the wrong locations.

**Categorization rules:**
```python
import os
import re


def find_misplaced_files(path: str) -> list[MisplacedFile]:
    """
    Scans the root directory and categorizes misplaced files.
    Returns the list of files that should be moved.
    """
    misplaced = []
    # All files (not directories) in the project root
    root_files = [f for f in os.listdir(path)
                  if os.path.isfile(os.path.join(path, f))]
    for file in root_files:
        # Skip files that are allowed in the root
        if is_allowed_in_root(file):
            continue
        # Categorize and determine the target location
        category = categorize_file(file)
        target = get_target_location(file, category)
        misplaced.append(MisplacedFile(
            current_path=file,
            target_path=target,
            category=category,
            reason=get_reason(file, category),
        ))
    return misplaced


def is_allowed_in_root(filename: str) -> bool:
    """Check whether a file is allowed in the project root."""
    allowed_patterns = [
        # Documentation
        r"README\.md", r"CLAUDE\.md", r"CONTRIBUTING\.md",
        r"CHANGELOG\.md", r"LICENSE", r"SECURITY\.md",
        # Configuration
        r"package\.json", r"tsconfig\.json", r"\.gitignore",
        r"\.eslintrc\.", r"\.prettierrc", r"\.editorconfig",
        r"\.env\.example", r"requirements\.txt", r"Cargo\.toml",
        # Build config
        r"vite\.config\.", r"webpack\.config\.", r"next\.config\.",
        r"Dockerfile", r"docker-compose\.yml", r"Makefile",
        # Entry points
        r"index\.html", r"main\.py", r"server\.ts",
    ]
    return any(re.match(pattern, filename) for pattern in allowed_patterns)


def categorize_file(filename: str) -> str:
    """Categorize a file by its purpose (first matching pattern wins)."""
    # Note: re.match anchors at the start of the string, so substring
    # categories (research, session_export) need leading .* wildcards.
    patterns = {
        "session_export": r".*EXPORT.*\.txt|.*SESSION.*\.txt",
        "research": r".*RESEARCH.*\.md|.*ANALYSIS.*\.md",
        "status_report": r"STATUS-.*\.md|DEPLOYMENT-STATUS-.*\.md",
        "implementation_plan": r"IMPLEMENTATION-.*\.md|.*-PLAN-.*\.md|CHECKPOINT-.*\.md",
        "guide": r".*-GUIDE\.md|DEVELOPMENT-.*\.md",
        "reference": r".*-PATTERNS\.md|.*-CHECKLIST\.md",
        "checkpoint": r"\d{4}-\d{2}-\d{2}-.*\.md",
    }
    for category, pattern in patterns.items():
        if re.match(pattern, filename):
            return category
    return "unknown"


def get_target_location(filename: str, category: str) -> str:
    """Map a category to its target directory."""
    targets = {
        "session_export": "docs/project-management/checkpoints/",
        "research": "docs/architecture/",
        "status_report": "docs/project-management/",
        "implementation_plan": "docs/project-management/",
        "guide": "docs/user-guides/",
        "reference": "docs/",
        "checkpoint": "docs/project-management/checkpoints/",
    }
    return targets.get(category, "docs/")
**Returns:**

```json
{
  "misplaced_files": [
    {
      "current_path": "EXPORT-SESSION-2025-12-04.txt",
      "target_path": "docs/project-management/checkpoints/EXPORT-SESSION-2025-12-04.txt",
      "category": "session_export",
      "reason": "Session export artifacts belong in checkpoints/"
    },
    {
      "current_path": "WEBSOCKET-RESEARCH.md",
      "target_path": "docs/architecture/WEBSOCKET-RESEARCH.md",
      "category": "research",
      "reason": "Research documents belong in docs/architecture/"
    }
  ],
  "total_misplaced": 2,
  "root_item_count": 18,
  "target_root_count": 16
}
```
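Pattern order matters in the categorization table: the dict is checked top to bottom, so a file like `CHECKPOINT-AUTH.md` is claimed by the `implementation_plan` patterns before the date-prefixed `checkpoint` pattern is ever tried. A trimmed, runnable extract (three of the seven categories) demonstrates this:

```python
import re

# Trimmed copy of the categorization table; dict insertion order
# (guaranteed in Python 3.7+) determines pattern priority.
PATTERNS = {
    "session_export": r".*EXPORT.*\.txt|.*SESSION.*\.txt",
    "implementation_plan": r"IMPLEMENTATION-.*\.md|.*-PLAN-.*\.md|CHECKPOINT-.*\.md",
    "checkpoint": r"\d{4}-\d{2}-\d{2}-.*\.md",
}


def categorize(filename: str) -> str:
    """Return the first category whose pattern matches, else 'unknown'."""
    for category, pattern in PATTERNS.items():
        if re.match(pattern, filename):
            return category
    return "unknown"


print(categorize("EXPORT-SESSION-2025-12-04.txt"))  # → session_export
print(categorize("2025-12-04-sprint-notes.md"))     # → checkpoint
print(categorize("CHECKPOINT-AUTH.md"))             # → implementation_plan
print(categorize("notes.md"))                       # → unknown
```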
### 4. `check_required_files(path, project_type) → list[MissingFile]`

**Purpose:** Identify missing required files.

**Implementation:**
```python
import os


def check_required_files(path: str, project_type: str) -> list[MissingFile]:
    """
    Checks for required files based on project type.
    Returns the list of missing files.
    """
    missing = []

    # Universal required files
    universal_required = [
        ("README.md", "User documentation"),
        ("CLAUDE.md", "AI agent context"),
        ("CONTRIBUTING.md", "Contribution guidelines"),
        ("CHANGELOG.md", "Version history"),
        ("LICENSE", "License file"),
        ("SECURITY.md", "Security policy"),
        (".gitignore", "Git ignore rules"),
    ]
    for filename, description in universal_required:
        if not os.path.exists(os.path.join(path, filename)):
            missing.append(MissingFile(
                filename=filename,
                location=path,
                description=description,
                priority="high",
            ))

    # Project-specific required files
    if project_type == "frontend":
        frontend_required = [
            ("src/", "Source code directory"),
            ("public/", "Static assets directory"),
            ("package.json", "Node.js dependencies"),
        ]
        missing.extend(check_specific_files(path, frontend_required))
    elif project_type == "backend":
        backend_required = [
            ("src/", "Source code directory"),
            ("docs/api/", "API documentation"),
        ]
        missing.extend(check_specific_files(path, backend_required))

    return missing
```
**Returns:**

```json
{
  "missing_files": [
    {
      "filename": "SECURITY.md",
      "location": "/",
      "description": "Security policy",
      "priority": "high"
    },
    {
      "filename": "docs/testing/",
      "location": "/docs/",
      "description": "Testing documentation",
      "priority": "medium"
    }
  ],
  "total_missing": 2,
  "critical_count": 0,
  "high_count": 1,
  "medium_count": 1
}
```
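The universal portion of this check reduces to a simple existence scan. A self-contained sketch run against a scratch directory:

```python
import os
import tempfile

UNIVERSAL_REQUIRED = ["README.md", "CLAUDE.md", "CONTRIBUTING.md",
                      "CHANGELOG.md", "LICENSE", "SECURITY.md", ".gitignore"]


def missing_universal(path: str) -> list[str]:
    """Return the universal required files absent from `path`."""
    return [f for f in UNIVERSAL_REQUIRED
            if not os.path.exists(os.path.join(path, f))]


with tempfile.TemporaryDirectory() as root:
    for name in ("README.md", "LICENSE"):
        open(os.path.join(root, name), "w").close()
    print(missing_universal(root))
    # → ['CLAUDE.md', 'CONTRIBUTING.md', 'CHANGELOG.md', 'SECURITY.md', '.gitignore']
```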
### 5. `generate_migration_plan(path) → OrganizationPlan`

**Purpose:** Create an actionable organization plan.

**Plan generation:**
```python
def generate_migration_plan(path: str) -> OrganizationPlan:
    """
    Generates a complete organization plan with:
    - Project type detection
    - Production readiness score
    - Misplaced files to move
    - Missing files to create
    - Git commands for the moves
    - Verification steps
    """
    # Step 1: Detect project type
    project_type = detect_project_type(path)

    # Step 2: Validate current structure
    validation = validate_structure(path, project_type)

    # Step 3: Find misplaced files
    misplaced = find_misplaced_files(path)

    # Step 4: Check for missing files
    missing = check_required_files(path, project_type)

    # Step 5: Generate move commands
    move_commands = generate_move_commands(misplaced)

    # Step 6: Generate create commands
    create_commands = generate_create_commands(missing)

    return OrganizationPlan(
        project_type=project_type,
        current_score=validation.score,
        target_score=95,
        misplaced_files=misplaced,
        missing_files=missing,
        move_commands=move_commands,
        create_commands=create_commands,
        estimated_time="15-30 minutes",
        verification_steps=[
            "Run tests to ensure nothing breaks",
            "Check git status for untracked files",
            "Verify all imports still work",
            "Re-run validation",
        ],
    )
```
**Returns:**

```json
{
  "project_type": "frontend",
  "current_score": 65,
  "target_score": 95,
  "improvement": "+30 points",
  "misplaced_files": [
    {
      "file": "EXPORT-SESSION.txt",
      "target": "docs/project-management/checkpoints/"
    }
  ],
  "missing_files": [
    "SECURITY.md",
    "docs/testing/"
  ],
  "move_commands": [
    "mkdir -p docs/project-management/checkpoints",
    "git mv EXPORT-SESSION.txt docs/project-management/checkpoints/"
  ],
  "create_commands": [
    "Use CODITECT-CORE-STANDARDS/TEMPLATES/SECURITY-TEMPLATE.md",
    "mkdir -p docs/testing"
  ],
  "estimated_time": "15-30 minutes",
  "verification_steps": [
    "Run npm test",
    "Check git status",
    "Verify imports",
    "Re-run /organize-production --validate"
  ]
}
```
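The `move_commands` list above pairs each `git mv` with the `mkdir -p` that creates its target directory. `generate_move_commands` itself is not shown in this skill; a plausible sketch follows (the tuple-based signature is an assumption for illustration, where the real implementation would take `MisplacedFile` objects):

```python
import os


def generate_move_commands(misplaced: list[tuple[str, str]]) -> list[str]:
    """Emit mkdir -p / git mv pairs for (current_path, target_dir) entries.

    git mv (rather than plain mv) records the rename in Git's index so
    file history is preserved.
    """
    commands = []
    seen_dirs = set()
    for current, target_dir in misplaced:
        if target_dir not in seen_dirs:
            commands.append(f"mkdir -p {target_dir.rstrip('/')}")
            seen_dirs.add(target_dir)
        commands.append(f"git mv {current} {os.path.join(target_dir, current)}")
    return commands


cmds = generate_move_commands(
    [("EXPORT-SESSION.txt", "docs/project-management/checkpoints/")]
)
print(cmds)
# → ['mkdir -p docs/project-management/checkpoints',
#    'git mv EXPORT-SESSION.txt docs/project-management/checkpoints/EXPORT-SESSION.txt']
```

Deduplicating the `mkdir -p` commands keeps the generated script readable when many files share a target directory.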
## Usage Examples

### Example 1: Validate Component Viewer

```python
# Detect project type
project_info = detect_project_type("/path/to/component-viewer")
# Returns: { "project_type": "frontend", "framework": "React + Vite" }

# Validate structure
validation = validate_structure("/path/to/component-viewer", "frontend")
# Returns: { "score": 85, "status": "needs_improvement" }

# Find misplaced files
misplaced = find_misplaced_files("/path/to/component-viewer")
# Returns: list of files in root that should be in docs/

# Check missing files
missing = check_required_files("/path/to/component-viewer", "frontend")
# Returns: ["SECURITY.md", "docs/testing/"]

# Generate migration plan
plan = generate_migration_plan("/path/to/component-viewer")
# Returns: complete organization plan with commands
```
### Example 2: Organize Backend API

```python
# Detect project type
project_info = detect_project_type("/path/to/api")
# Returns: { "project_type": "backend", "framework": "FastAPI" }

# Validate structure
validation = validate_structure("/path/to/api", "backend")
# Returns: { "score": 72, "status": "needs_improvement" }

# Generate migration plan
plan = generate_migration_plan("/path/to/api")
# Returns: plan to organize src/, docs/, scripts/
```
## Integration Points

### Used By

- `project-organizer` agent - calls this skill's methods for validation and organization
- `/organize-production` command - user-facing interface
- `pre-commit-folder-check.sh` hook - Git hook validation
- `organize_production.py` script - CLI tool

### Uses

- CODITECT-STANDARD-PRODUCTION-FOLDERS-FRONTEND.md
- CODITECT-STANDARD-PRODUCTION-FOLDERS-BACKEND.md
- CODITECT-STANDARD-PRODUCTION-FOLDERS-MONOREPO.md
- CODITECT-STANDARD-PRODUCTION-FOLDERS-UNIVERSAL.md
## Token Efficiency

**Manual review:**

- Read standards documents: 30 min
- Scan project manually: 45 min
- Identify violations: 30 min
- Create move plan: 15 min
- Total: 2 hours

**With this skill:**

- Automated detection: instant
- Automated validation: ~5 seconds
- Automated plan generation: ~10 seconds
- Review and approve: 30 min
- Total: 30 minutes

**Savings: 75% (1.5 hours saved)**
## Quality Gates

Before marking organization complete:
- Production readiness score ≥ 90
- All required files present
- No misplaced files in root
- Root directory ≤ 25 items
- All tests passing after moves
- Git history preserved (git mv)
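The gates above can be collapsed into a single boolean check. A minimal sketch (the function name and signature are illustrative, not the skill's actual API):

```python
def gates_pass(score: int, root_items: int, missing_files: int,
               misplaced_files: int, tests_green: bool) -> bool:
    """Return True only when every quality gate is satisfied."""
    return (score >= 90            # production readiness score
            and root_items <= 25   # root directory cleanliness
            and missing_files == 0
            and misplaced_files == 0
            and tests_green)       # tests still pass after moves


print(gates_pass(95, 18, 0, 0, True))   # → True
print(gates_pass(85, 18, 0, 0, True))   # → False (score below 90)
```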
## Limitations
- Does not validate code quality (only folder structure)
- Does not fix code imports after moves
- Does not handle circular dependencies
- Requires manual approval for destructive operations
- Cannot detect project-specific customizations
## References
- CODITECT Component Viewer - Reference implementation (65% → 95%)
- Production Folder Standards - All standards documents
- project-organizer agent - Primary consumer
## Success Output

When successful, this skill MUST output:

```text
✅ SKILL COMPLETE: production-folder-structure

Completed:
- [x] Project type detected (frontend/backend/monorepo)
- [x] Production readiness scored (0-100)
- [x] Misplaced files identified and categorized
- [x] Missing required files/directories listed
- [x] Migration plan generated with git commands
- [x] Verification steps documented

Outputs:
- Project type detection report
- Production readiness score (target: ≥90)
- List of misplaced files with target locations
- List of missing required files
- Executable migration plan (bash script)
- Verification checklist

Verification:
- Production score ≥ 90
- Root directory ≤ 25 items
- All required docs/ subdirectories exist
- All required root files present
- Git mv commands preserve history
```
## Completion Checklist

Before marking this skill as complete, verify:
- Project type correctly detected (95%+ confidence)
- Production readiness score calculated
- All misplaced files identified with target locations
- All missing required files listed
- Git mv commands generated (preserves history)
- Create commands generated for missing directories
- Root directory item count reduced
- All required documentation files present
- All tests still pass after reorganization
- Verification steps executed successfully
## Failure Indicators

This skill has FAILED if:
- ❌ Project type detection confidence <80%
- ❌ Production score calculation errors
- ❌ Misplaced file categorization incorrect
- ❌ Migration plan breaks git history
- ❌ Missing files list incomplete
- ❌ Tests fail after reorganization
- ❌ Import paths broken after moves
- ❌ Circular dependency in migration plan
- ❌ Root directory still cluttered (>30 items)
- ❌ Violated project-specific standards
## When NOT to Use

Do NOT use this skill when:
- Project in active development (wait for stable state)
- Custom folder structure is intentional (monorepo edge cases)
- Legacy project with complex dependencies
- Reorganization would break critical workflows
- Team has not agreed on standards

In those cases, fall back to project-specific organization patterns.

Use alternative approaches for:
- Code refactoring → code-organization-patterns skill
- File content organization → documentation-structuring skill
- Build output organization → build-configuration skill
- Source control cleanup → git-repository-cleanup skill
## Anti-Patterns (Avoid)

| Anti-Pattern | Problem | Solution |
|---|---|---|
| `mv` instead of `git mv` | Loses file history | Always use `git mv` for tracked files |
| Organizing during active development | Merge conflicts, disruption | Wait for stable milestone |
| One-size-fits-all standards | Ignores project-specific needs | Check for custom .coditect/config/tracks.json |
| No verification after moves | Broken imports, tests | Always run tests after reorganization |
| Deleting files instead of moving | Permanent data loss | Use git mv, archive don't delete |
| Root directory obsession | Move everything, lose accessibility | Keep essential files in root |
| Ignoring build artifacts | Clutter returns | Update .gitignore properly |
| No team communication | Surprise reorganization | Get team buy-in first |
| Automated execution without review | Risky, may break project | Generate plan, human approves, then execute |
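To see why the first row of the table matters, the history-preserving behavior of `git mv` can be verified in a throwaway repository (this sketch assumes `git` is on PATH; the file names are illustrative):

```shell
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
echo "notes" > STATUS-REPORT.md
git add STATUS-REPORT.md
git -c user.email=a@example.com -c user.name=tmp commit -qm "add status report"
mkdir -p docs/project-management
git mv STATUS-REPORT.md docs/project-management/STATUS-REPORT.md
git -c user.email=a@example.com -c user.name=tmp commit -qm "organize: move status report"
# --follow traces the file's history back across the rename,
# so both commits appear even though the path changed
git log --follow --oneline -- docs/project-management/STATUS-REPORT.md
```

Without `--follow`, the log at the new path shows only the move commit; with it, the original "add" commit is visible too, which is exactly the provenance this skill's migration plans are meant to preserve.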
## Principles

This skill embodies these CODITECT principles:
- #5 Eliminate Ambiguity - Clear categorization of file purposes
- #6 Clear, Understandable, Explainable - Self-documenting folder structure
- #8 No Assumptions - Verify project type before applying standards
- #10 Iterative Refinement - Improve organization incrementally
- Standards Compliance - Enforce production-grade organization
- Git History Preservation - Never lose file provenance
Related Standards:
- CODITECT-STANDARD-PRODUCTION-FOLDERS-UNIVERSAL.md
- CODITECT-STANDARD-PRODUCTION-FOLDERS-FRONTEND.md
- CODITECT-STANDARD-PRODUCTION-FOLDERS-BACKEND.md
- CODITECT-STANDARD-AUTOMATION.md
---

**Status:** Production Ready ✅ | **Version:** 1.0.0 | **Last Updated:** 2025-12-04 | **Proven:** Component Viewer production organization