Skip to main content

Research: Backend-Frontend Synchronization Issues and Deployment Requirements

Date: 2025-10-14T14:20:53+00:00 Git Commit: 04ef4b44bff48f6f026a05b0ede16afdbfdaa22a Branch: main Repository: t2 (Coditect AI IDE)


Research Question

Primary: "Given the database changes we made, won't we need to rebuild the backend and redeploy? What do we need to analyze to make sure that the frontend and backend sync and the WebSocket integrates the IDE terminal with the container pod and sync session with FoundationDB for persistence of user tenant session?"

Secondary Questions:

  1. What is the current backend deployment status vs. local code?
  2. Are there mismatches between frontend API contracts and backend implementation?
  3. What is the correct Docker build and deployment process?
  4. Are there model naming inconsistencies between handlers and repositories?
  5. What is the JWT Claims structure and is it properly implemented?

Executive Summary

Status: 🔴 CRITICAL ISSUES FOUND - Backend Rebuild Required

Key Findings

  1. Deployment Process Verified: Backend uses Google Artifact Registry (not GCR) with Cloud Build
  2. 🔴 Auth Handler Broken: JWT Claims missing session_id and token_family fields - will not compile
  3. 🔴 Model Name Mismatch: Handlers reference Session but actual model is workspaceSession
  4. ⚠️ Frontend API Contract Mismatch: Multiple endpoint and response format mismatches
  5. Docker Build Process Documented: Multi-stage build with dependency caching

Impact

  • Backend will not compile due to missing Claims fields and incorrect model imports
  • Frontend cannot create sessions due to endpoint path mismatch (/sessions/create vs /sessions)
  • Token rotation will fail without AuthSession creation in login/register
  • Session management broken due to non-existent repository references

Required Actions

  1. Fix auth handler Claims creation (add AuthSession)
  2. Rename all SessionworkspaceSession in handlers
  3. Fix frontend endpoint paths and response access
  4. Rebuild backend Docker image
  5. Deploy to GKE with new image tag

Research Methodology

Approach: Parallel specialized agent research using 5 concurrent investigations:

  1. codebase-analyzer (Docker config) - Analyzed build process, Dockerfile, Cloud Build
  2. codebase-analyzer (Auth handlers) - Analyzed JWT Claims structure and implementation
  3. codebase-analyzer (Frontend service) - Analyzed API contracts and endpoint paths
  4. codebase-locator (Model definitions) - Located session models and repositories
  5. codebase-locator (Deployment configs) - Located K8s manifests and registry references

Tools Used: Read, Grep, Glob, file analysis

Time Investment: ~5 minutes of parallel research


Detailed Findings

1. Docker Build Configuration and Deployment Process

Research Focus: How is the backend currently built and deployed?

1.1 Multi-Stage Docker Build

File: /home/hal/v4/PROJECTS/t2/backend/Dockerfile (66 lines)

Build Stage (Lines 2-31):

  • Base: rust:1.90
  • Installs FoundationDB 7.1.27 client libraries
  • Installs clang and libclang-dev for bindgen
  • Dependency Caching Strategy:
    • Creates dummy main.rs
    • Builds dependencies only (cargo build --release)
    • Deletes dummy source
    • Copies real source
    • Result: Dependencies cached in Docker layer (~5-10 min savings per build)

Runtime Stage (Lines 34-66):

  • Base: debian:bookworm-slim
  • Copies compiled binary: /app/api-server
  • Copies FDB cluster file: /app/fdb.cluster
  • Environment:
    • HOST=0.0.0.0
    • PORT=8080
    • RUST_LOG=info
    • FDB_CLUSTER_FILE=/app/fdb.cluster

1.2 Container Registry (Artifact Registry, NOT GCR)

Registry Path: us-central1-docker.pkg.dev/serene-voltage-464305-n2/coditect

Image Tags:

  • Backend: coditect-v5-api:latest and coditect-v5-api:${SHORT_SHA}
  • Combined: coditect-combined:latest and coditect-combined:${BUILD_ID}

Important: Old gcr.io references in documentation are obsolete - project migrated to Artifact Registry.

1.3 Cloud Build Process

File: /home/hal/v4/PROJECTS/t2/backend/cloudbuild.yaml (93 lines)

Build Steps:

  1. Build Docker image with dual tags (SHA + latest)
  2. Push SHA-tagged image
  3. Push latest tag
  4. Deploy to Cloud Run (optional)

Configuration:

  • Build machine: N1_HIGHCPU_8 (8 vCPUs)
  • Timeout: 1800s (30 minutes)
  • Disk: 100GB

1.4 Deployment Configuration

File: /home/hal/v4/PROJECTS/t2/backend/k8s-deployment.yaml

Image Reference:

image: us-central1-docker.pkg.dev/serene-voltage-464305-n2/coditect/coditect-v5-api:latest

Deployment Command:

gcloud builds submit --config cloudbuild.yaml --project=serene-voltage-464305-n2

GKE Cluster:

  • Name: codi-poc-e2-cluster
  • Namespace: coditect-app
  • Deployment: coditect-api-v5

2. JWT Claims Structure Issues

Research Focus: Are JWT Claims properly implemented in auth handlers?

2.1 Claims Struct Definition

File: /home/hal/v4/PROJECTS/t2/backend/src/handlers/auth.rs:49-59

pub struct Claims {
pub sub: String, // User ID
pub tenant_id: String, // Tenant ID
pub email: String, // User email
pub session_id: String, // ❌ Required but NOT set in login/register
pub token_family: String, // ❌ Required but NOT set in login/register
pub exp: usize, // Expiration
pub iat: usize, // Issued at
}

2.2 Login Handler Implementation

File: /home/hal/v4/PROJECTS/t2/backend/src/handlers/auth.rs:128-134

Current Code:

let claims = Claims {
sub: user.user_id.to_string(),
tenant_id: user.primary_tenant_id.to_string(), // ⚠️ Field doesn't exist
email: user.email.clone(),
// ❌ Missing: session_id
// ❌ Missing: token_family
exp: (Utc::now() + Duration::hours(24)).timestamp() as usize,
iat: Utc::now().timestamp() as usize,
};

Problems:

  1. Will not compile: Missing required fields session_id and token_family
  2. Field error: User model has tenant_id, not primary_tenant_id
  3. No AuthSession created: No call to AuthSession::new() or AuthSessionRepository::create()
  4. No metadata extraction: No IP address or user agent capture

2.3 Register Handler Implementation

File: /home/hal/v4/PROJECTS/t2/backend/src/handlers/auth.rs:232-237

Same issues as login handler:

  • ❌ Missing session_id and token_family
  • ❌ References non-existent primary_tenant_id
  • ❌ No AuthSession creation

2.4 Refresh Handler (CORRECT Implementation)

File: /home/hal/v4/PROJECTS/t2/backend/src/handlers/auth.rs:341-349

let access_claims = Claims {
sub: claims.sub.clone(),
tenant_id: new_session.tenant_id.to_string(),
email: claims.email.clone(),
session_id: new_session.session_id.to_string(), // ✅ Correctly populated
token_family: new_session.token_family.to_string(), // ✅ Correctly populated
exp: (Utc::now() + Duration::hours(1)).timestamp() as usize,
iat: Utc::now().timestamp() as usize,
};

This shows the correct pattern: Create AuthSession first, then populate Claims from it.

2.5 Required Fix

Login handler needs (lines 114-134):

// After password verification, add:
use crate::db::models::AuthSession;
use crate::db::repositories::AuthSessionRepository;

let ip_address = req
.connection_info()
.realip_remote_addr()
.unwrap_or("unknown")
.to_string();

let user_agent = req
.headers()
.get("User-Agent")
.and_then(|v| v.to_str().ok())
.unwrap_or("unknown")
.to_string();

let auth_session = AuthSession::new(
user.user_id,
user.tenant_id, // Not primary_tenant_id!
ip_address,
user_agent,
);

let session_repo = AuthSessionRepository::new(app_state.db.clone());
session_repo.create(&auth_session).await?;

let claims = Claims {
sub: user.user_id.to_string(),
tenant_id: user.tenant_id.to_string(), // Fixed field name
email: user.email.clone(),
session_id: auth_session.session_id.to_string(),
token_family: auth_session.token_family.to_string(),
exp: (Utc::now() + Duration::hours(24)).timestamp() as usize,
iat: Utc::now().timestamp() as usize,
};

Same fix needed in register handler (lines 232-237).


3. Model Naming Mismatch

Research Focus: Do session handlers reference the correct model names?

3.1 What Exists in Codebase

File: /home/hal/v4/PROJECTS/t2/backend/src/db/models.rs

Line 344: pub struct workspaceSession { ... } Line 376: impl workspaceSession { ... }

File: /home/hal/v4/PROJECTS/t2/backend/src/db/repositories.rs

Line 640: pub struct workspaceSessionRepository { ... } Line 644: impl workspaceSessionRepository { ... }

3.2 What Handlers Reference

File: /home/hal/v4/PROJECTS/t2/backend/src/handlers/sessions.rs

Lines 40-41:

use crate::db::models::Session;           // ❌ DOES NOT EXIST
use crate::db::repositories::SessionRepository; // ❌ DOES NOT EXIST

Usage in handlers:

  • Line 59: Session::new()
  • Line 63: SessionRepository::new()
  • Line 101: SessionRepository::new()
  • Line 154: SessionRepository::new()
  • Line 200: SessionRepository::new()

Result: 🔴 Code will not compile - Session and SessionRepository do not exist.

3.3 Repository Methods Available

workspaceSessionRepository methods (lines 640-827):

  • create(&self, session: &workspaceSession) (line 658)
  • get(&self, session_id: Uuid) (line 707)
  • list_by_tenant(&self, tenant_id: Uuid) (line 731)
  • list_by_user(&self, user_id: Uuid) (line 770)
  • update(&self, session: &workspaceSession) (line 827)

All methods expect workspaceSession, not Session.

3.4 Required Fix

Option 1: Rename Model (Recommended):

// In models.rs (line 344):
pub struct Session { ... } // Rename from workspaceSession

// In repositories.rs (line 640):
pub struct SessionRepository { ... } // Rename from workspaceSessionRepository

Option 2: Fix Handlers:

// In handlers/sessions.rs (lines 40-41):
use crate::db::models::workspaceSession;
use crate::db::repositories::workspaceSessionRepository;

// Update all Session → workspaceSession references

Recommendation: Option 1 (rename model) is simpler since handlers already use the correct names.


4. Frontend API Contract Mismatches

Research Focus: Does frontend session-service match backend API?

4.1 Base URL Configuration

File: /home/hal/v4/PROJECTS/t2/src/services/session-service.ts:29

constructor(baseUrl: string = 'http://34.46.212.40/api/v5') {

Issues:

  • ⚠️ Hardcoded production IP: Should use relative URL /api/v5
  • ⚠️ No environment variable: Cannot switch between dev/staging/prod

Fix:

constructor(baseUrl: string = '/api/v5') {

4.2 Endpoint Path Mismatch

Frontend (line 98):

const response = await fetch(`${this.baseUrl}/sessions/create`, {

Backend (main.rs:100):

.service(handlers::sessions::create_session)  // POST /sessions

Issue: ❌ Frontend calls /sessions/create, backend expects /sessions

Fix (line 98):

const response = await fetch(`${this.baseUrl}/sessions`, {

4.3 Response Format Mismatch

Frontend Expectation (line 64):

return data.sessions || []  // Expects { sessions: Session[] }

Backend Response (sessions.rs:174):

Ok(HttpResponse::Ok().json(ApiResponse::success(responses)))
// Returns: { success: true, data: Vec<SessionResponse> }

Issue: ⚠️ Frontend accesses data.sessions, but backend returns data.data

Fix (line 64):

return data.data || []  // Access the correct field

Same issue in lines: 86, 109, 132

4.4 Session Interface Mismatch

Frontend Interface (lines 9-17):

interface Session {
id: string
tenant_id: string
name: string
type: 'workspace' | 'ai-studio' | 'theia' // ❌ Backend doesn't have this
settings?: Record<string, any>
created_at: string
updated_at: string
}

Backend Response (sessions.rs:16-32):

pub struct SessionResponse {
pub id: String,
pub name: String,
pub tenant_id: String,
pub user_id: String, // ✅ Backend has this
pub workspace_path: Option<String>, // ✅ Backend has this
pub created_at: String,
pub updated_at: String,
pub last_accessed_at: String, // ✅ Backend has this
// ❌ No `type` field
// ❌ No `settings` field
}

Impact: Frontend expects fields that backend doesn't provide.


5. Deployment Process Documentation

Research Focus: Where are deployment configurations and scripts?

5.1 Active Kubernetes Manifests

Combined Deployment (current production):

  • File: /home/hal/v4/PROJECTS/t2/k8s-combined-deployment.yaml
  • Image: us-central1-docker.pkg.dev/serene-voltage-464305-n2/coditect/coditect-combined:latest
  • Purpose: Single-container (V5 + theia + NGINX)

Backend API Deployment:

  • File: /home/hal/v4/PROJECTS/t2/backend/k8s-deployment.yaml
  • Image: us-central1-docker.pkg.dev/serene-voltage-464305-n2/coditect/coditect-v5-api:latest
  • Purpose: Rust/Actix-web API server

5.2 Cloud Build Configurations

Active Builds:

  • /home/hal/v4/PROJECTS/t2/cloudbuild-combined.yaml - Combined deployment
  • /home/hal/v4/PROJECTS/t2/backend/cloudbuild.yaml - Backend API
  • /home/hal/v4/PROJECTS/t2/backend/cloudbuild-simple.yaml - Minimal build
  • /home/hal/v4/PROJECTS/t2/backend/cloudbuild-minimal.yaml - Test build

5.3 Deployment Scripts

Pre-Deployment Testing:

  • File: /home/hal/v4/PROJECTS/t2/scripts/test-deployment.sh (403 lines)
  • Purpose: Validate config before deployment
  • Usage: Run before Cloud Build submission

Endpoint Testing:

  • File: /home/hal/v4/PROJECTS/t2/scripts/test-endpoints.sh (122 lines)
  • Purpose: Test live deployed endpoints

Makefile:

  • File: /home/hal/v4/PROJECTS/t2/Makefile
  • Line 63-65: make deploy command
  • Command: gcloud builds submit --config cloudbuild-combined.yaml

5.4 Obsolete/Archived Files

Archived:

  • /home/hal/v4/PROJECTS/t2/docs/99-archive/deployment-obsolete/ - Old deployments
  • /home/hal/v4/PROJECTS/t2/archive/v4-reference/ - V4 reference materials

Note: All old gcr.io references are obsolete - project uses Artifact Registry now.


Code References

Auth Handler Issues:

  • backend/src/handlers/auth.rs:49-59 - Claims struct definition
  • backend/src/handlers/auth.rs:128-134 - Login Claims creation (broken)
  • backend/src/handlers/auth.rs:232-237 - Register Claims creation (broken)
  • backend/src/handlers/auth.rs:341-349 - Refresh Claims creation (correct)

Model Naming Issues:

  • backend/src/db/models.rs:344 - workspaceSession definition
  • backend/src/db/repositories.rs:640 - workspaceSessionRepository definition
  • backend/src/handlers/sessions.rs:40-41 - Incorrect imports

Frontend Contract Issues:

  • src/services/session-service.ts:29 - Hardcoded base URL
  • src/services/session-service.ts:98 - Wrong endpoint path
  • src/services/session-service.ts:64, 86, 109, 132 - Wrong response access

Deployment Configuration:

  • backend/Dockerfile - Multi-stage build
  • backend/cloudbuild.yaml - Build process
  • backend/k8s-deployment.yaml - K8s deployment
  • k8s-combined-deployment.yaml - Combined deployment

Architecture Documentation

Current Implementation Patterns

1. JWT Token Family Pattern:

  • Each login creates new token_family UUID
  • Refresh tokens share same family
  • Logout invalidates entire family
  • Status: Defined but not fully implemented in login/register

2. Multi-Tenant Isolation:

  • All FDB keys prefixed: /{tenant_id}/...
  • User model has tenant_id field
  • Claims include tenant_id for authorization
  • Status: Implemented correctly

3. Session Architecture:

  • AuthSession: JWT authentication sessions
  • workspaceSession: IDE workspace sessions
  • Separation: Auth sessions track login, workspace sessions track IDE state
  • Status: Models defined, handlers have naming mismatch

4. Docker Build Strategy:

  • Multi-stage: Builder stage (Rust) + Runtime stage (Debian)
  • Dependency caching: Dummy main.rs technique
  • Dual tags: SHA-tagged (immutable) + latest (mutable)
  • Status: Working correctly

5. Deployment Pipeline:

  • Local: Docker build → Test locally
  • Cloud: Cloud Build → Artifact Registry → GKE
  • Registry: Google Artifact Registry (not GCR)
  • Status: Process documented and working

Summary of Issues

Critical (Prevent Compilation)

  1. Auth Handler Claims - Missing session_id and token_family fields

    • Impact: Code will not compile
    • Location: backend/src/handlers/auth.rs:128-134, 232-237
    • Fix: Create AuthSession before JWT generation
  2. Model Name Mismatch - Handlers import non-existent Session and SessionRepository

    • Impact: Code will not compile
    • Location: backend/src/handlers/sessions.rs:40-41
    • Fix: Use workspaceSession and workspaceSessionRepository

High (Runtime Failures)

  1. Frontend Endpoint Mismatch - Calls /sessions/create but backend has /sessions

    • Impact: Session creation will fail with 404
    • Location: src/services/session-service.ts:98
    • Fix: Change to /sessions
  2. Frontend Response Access - Accesses data.sessions but backend returns data.data

    • Impact: Frontend receives undefined instead of session list
    • Location: src/services/session-service.ts:64, 86, 109, 132
    • Fix: Access data.data

Medium (Configuration)

  1. Hardcoded Base URL - Frontend uses hardcoded production IP

    • Impact: Cannot switch environments
    • Location: src/services/session-service.ts:29
    • Fix: Use relative URL /api/v5
  2. User Model Field - Handlers reference primary_tenant_id but model has tenant_id

    • Impact: Code will not compile
    • Location: backend/src/handlers/auth.rs:130, 234
    • Fix: Use tenant_id

1. Fix Backend Code (30 minutes)

File: backend/src/handlers/auth.rs

  • Add AuthSession creation in login() (after line 114)
  • Add AuthSession creation in register() (after line 218)
  • Fix primary_tenant_idtenant_id (lines 130, 234)

File: backend/src/handlers/sessions.rs

  • Change imports: SessionworkspaceSession (line 40)
  • Change imports: SessionRepositoryworkspaceSessionRepository (line 41)
  • Update all references throughout file

2. Fix Frontend Code (10 minutes)

File: src/services/session-service.ts

  • Change baseUrl to /api/v5 (line 29)
  • Change /sessions/create/sessions (line 98)
  • Change data.sessionsdata.data (line 64)
  • Change data.sessiondata.data (lines 86, 109, 132)

3. Rebuild and Deploy (20 minutes)

# Build new image
cd /home/hal/v4/PROJECTS/t2/backend
docker build -t us-central1-docker.pkg.dev/serene-voltage-464305-n2/coditect/coditect-v5-api:oct14-v2 .

# Or use Cloud Build
gcloud builds submit --config cloudbuild.yaml --project=serene-voltage-464305-n2

# Deploy to GKE
kubectl set image deployment/coditect-api-v5 \
coditect-api-v5=us-central1-docker.pkg.dev/serene-voltage-464305-n2/coditect/coditect-v5-api:oct14-v2 \
-n coditect-app

# Verify
kubectl rollout status deployment/coditect-api-v5 -n coditect-app

4. Test End-to-End (30 minutes)

# Test registration
curl -X POST https://coditect.ai/api/v5/auth/register \
-H "Content-Type: application/json" \
-d '{"email":"test@example.com","password":"test123","firstName":"Test","lastName":"User"}'

# Test login
curl -X POST https://coditect.ai/api/v5/auth/login \
-H "Content-Type: application/json" \
-d '{"email":"test@example.com","password":"test123"}'

# Save JWT token and test session creation
curl -X POST https://coditect.ai/api/v5/sessions \
-H "Authorization: Bearer <TOKEN>" \
-H "Content-Type: application/json" \
-d '{"name":"Test Session","workspacePath":"/workspace/test"}'

Total Time: ~1.5 hours


Backend Implementation:

  • /home/hal/v4/PROJECTS/t2/backend/src/handlers/auth.rs - Auth handlers
  • /home/hal/v4/PROJECTS/t2/backend/src/handlers/sessions.rs - Session handlers
  • /home/hal/v4/PROJECTS/t2/backend/src/db/models.rs - Data models
  • /home/hal/v4/PROJECTS/t2/backend/src/db/repositories.rs - FDB repositories

Frontend Implementation:

  • /home/hal/v4/PROJECTS/t2/src/services/session-service.ts - Session API client
  • /home/hal/v4/PROJECTS/t2/src/stores/auth-store.ts - Authentication state

Deployment:

  • /home/hal/v4/PROJECTS/t2/backend/Dockerfile - Docker build
  • /home/hal/v4/PROJECTS/t2/backend/cloudbuild.yaml - Cloud Build config
  • /home/hal/v4/PROJECTS/t2/backend/k8s-deployment.yaml - K8s deployment
  • /home/hal/v4/PROJECTS/t2/Makefile - Build commands

Analysis Documents:

  • /home/hal/v4/PROJECTS/t2/BACKEND-FRONTEND-SYNC-analysis.md - Previous analysis (50+ sections)
  • /home/hal/v4/PROJECTS/t2/deployment-status-2025-10-14.md - Current deployment status
  • /home/hal/v4/PROJECTS/t2/mvp-critical-path-2025-10-14.md - Overall roadmap

Conclusion

The research confirms that backend rebuild is required before deployment. The current deployed code (Oct 9) does not include:

  • 2,860 lines of FDB models/repositories (Oct 14)
  • Auth and session handlers (Oct 14)
  • JWT token family support

Additionally, the code will not compile due to:

  • Missing JWT Claims fields
  • Non-existent model/repository imports
  • Incorrect field references

The frontend also has API contract mismatches that will cause runtime failures:

  • Wrong endpoint paths
  • Incorrect response data access
  • Hardcoded production IP

All issues have been documented with exact file paths and line numbers for rapid resolution. The fixes are straightforward and can be completed in ~1.5 hours total.


Generated: 2025-10-14T14:20:53+00:00 Research Duration: 5 minutes (parallel agent research) Next Steps: Fix code issues → Rebuild → Deploy → Test