# AZ1.AI Theia IDE - Deployment Guide

## Quick Start (Local Development)

The Theia IDE is currently running on port 3000.

Access the IDE:

- From the Docker container: http://localhost:3000
- From the Windows host: http://localhost:3000 (if the Docker port is mapped)
- From the network: http://&lt;host-ip&gt;:3000
## Deployment Options

### 1. Local Development (Current Setup)

```bash
# Start Theia directly
cd theia-app
npm start

# Access at http://localhost:3000
```

Status: ✅ Currently Running
### 2. Docker Deployment (Single Container)

```bash
# Build the Docker image
docker build -t az1ai-theia-ide:latest .

# Run the container
docker run -d \
  --name az1ai-theia \
  -p 3000:3000 \
  -e LM_STUDIO_HOST=host.docker.internal \
  -e LM_STUDIO_PORT=1234 \
  -v theia-workspace:/home/theia/workspace \
  az1ai-theia-ide:latest

# Access at http://localhost:3000
```
### 3. Docker Compose (Recommended for Development)

```bash
# Start all services
docker-compose up -d

# View logs
docker-compose logs -f theia

# Stop services
docker-compose down

# Rebuild and restart
docker-compose up -d --build
```

Features:

- Persistent workspace volume
- Automatic restart on failure
- Health checks
- LM Studio integration configured
- Optional NGINX load balancer
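The actual docker-compose.yml is not reproduced in this guide; the features above can be sketched roughly as follows. Service, image, and volume names are assumptions taken from the commands elsewhere in this guide, and the health check assumes curl is present in the image — adapt to the real file.

```yaml
# Minimal docker-compose.yml sketch (assumed names, not the real file)
services:
  theia:
    build: .
    image: az1ai-theia-ide:latest
    ports:
      - "3000:3000"
    environment:
      - LM_STUDIO_HOST=host.docker.internal    # LM Studio integration
      - LM_STUDIO_PORT=1234
    volumes:
      - theia-workspace:/home/theia/workspace  # persistent workspace
    restart: unless-stopped                    # automatic restart on failure
    healthcheck:
      test: ["CMD", "curl", "-fsS", "http://localhost:3000/"]
      interval: 30s
      timeout: 5s
      retries: 3

volumes:
  theia-workspace:
```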
### 4. Production Deployment (GCP Cloud Run)

Based on ADR-020: GCP Cloud Run Deployment.

#### Prerequisites

```bash
# Install the Google Cloud SDK
curl https://sdk.cloud.google.com | bash
exec -l $SHELL

# Authenticate
gcloud auth login
gcloud config set project YOUR_PROJECT_ID
```
#### Deploy to Cloud Run

```bash
# Build and push to Google Container Registry
gcloud builds submit --tag gcr.io/YOUR_PROJECT_ID/az1ai-theia-ide

# Deploy to Cloud Run
gcloud run deploy az1ai-theia \
  --image gcr.io/YOUR_PROJECT_ID/az1ai-theia-ide \
  --platform managed \
  --region us-central1 \
  --allow-unauthenticated \
  --memory 4Gi \
  --cpu 2 \
  --timeout 3600 \
  --concurrency 80 \
  --port 3000

# Get the service URL
gcloud run services describe az1ai-theia \
  --platform managed \
  --region us-central1 \
  --format 'value(status.url)'
```
#### Terraform Deployment (Infrastructure as Code)

See docs/adr/adr-020-gcp-cloud-run-deployment.md for the complete Terraform configuration.

```bash
# Initialize Terraform
cd terraform/
terraform init

# Plan the deployment
terraform plan -out=plan.tfplan

# Apply the deployment
terraform apply plan.tfplan
```
### 5. Kubernetes Deployment (Enterprise)

```yaml
# kubernetes/deployment.yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: az1ai-theia
spec:
  replicas: 3
  selector:
    matchLabels:
      app: az1ai-theia
  template:
    metadata:
      labels:
        app: az1ai-theia
    spec:
      containers:
        - name: theia
          image: az1ai-theia-ide:latest
          ports:
            - containerPort: 3000
          env:
            - name: NODE_ENV
              value: "production"
          resources:
            requests:
              memory: "2Gi"
              cpu: "1000m"
            limits:
              memory: "4Gi"
              cpu: "2000m"
          livenessProbe:
            httpGet:
              path: /
              port: 3000
            initialDelaySeconds: 40
            periodSeconds: 30
---
apiVersion: v1
kind: Service
metadata:
  name: az1ai-theia-service
spec:
  type: LoadBalancer
  selector:
    app: az1ai-theia
  ports:
    - port: 80
      targetPort: 3000
```

Deploy:

```bash
kubectl apply -f kubernetes/deployment.yaml
kubectl apply -f kubernetes/service.yaml
```
## Environment Variables

| Variable | Description | Default |
|---|---|---|
| `NODE_ENV` | Environment (development/production) | `development` |
| `THEIA_DEFAULT_PLUGINS` | Plugin directory | `local-dir:/home/theia/app/plugins` |
| `LM_STUDIO_HOST` | LM Studio hostname | `host.docker.internal` |
| `LM_STUDIO_PORT` | LM Studio port | `1234` |
| `LM_STUDIO_API_BASE` | LM Studio API base URL | `http://host.docker.internal:1234/v1` |
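The defaults above compose: when `LM_STUDIO_API_BASE` is not set explicitly, it can be derived from the host and port. A minimal shell sketch (the `lm_studio_api_base` helper is hypothetical, not part of the project):

```bash
# Hypothetical helper: derive the effective API base URL from the
# documented defaults when LM_STUDIO_API_BASE is not set explicitly.
lm_studio_api_base() {
  host="${LM_STUDIO_HOST:-host.docker.internal}"
  port="${LM_STUDIO_PORT:-1234}"
  printf '%s\n' "${LM_STUDIO_API_BASE:-http://${host}:${port}/v1}"
}

lm_studio_api_base
```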
## Persistent Storage

### Docker Volumes

```bash
# List volumes
docker volume ls | grep theia

# Inspect the workspace volume
docker volume inspect az1ai_theia-workspace

# Back up the workspace
docker run --rm -v az1ai_theia-workspace:/data -v $(pwd):/backup \
  alpine tar czf /backup/workspace-backup.tar.gz -C /data .

# Restore the workspace
docker run --rm -v az1ai_theia-workspace:/data -v $(pwd):/backup \
  alpine sh -c "cd /data && tar xzf /backup/workspace-backup.tar.gz"
```
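The backup command above always writes to the same filename, so each run overwrites the previous archive. A hypothetical wrapper (not part of the project) that produces timestamped archives instead:

```bash
# Hypothetical wrapper: write a timestamped archive so successive
# backups do not overwrite each other. Prints the archive path.
backup_workspace() {
  src="$1"   # directory to back up
  dest="$2"  # directory to write the archive into
  stamp="$(date +%Y%m%d-%H%M%S)"
  archive="${dest}/workspace-${stamp}.tar.gz"
  tar czf "$archive" -C "$src" . && printf '%s\n' "$archive"
}
```

Usage: `backup_workspace /home/theia/workspace /backup` (paths are examples).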
### Cloud Storage (GCP)

```bash
# Create a GCS bucket
gcloud storage buckets create gs://az1ai-workspaces \
  --location=us-central1 \
  --uniform-bucket-level-access

# Back up to GCS
gsutil -m rsync -r /home/theia/workspace gs://az1ai-workspaces/
```
## NGINX Load Balancer (Production)

Based on ADR-016: Use NGINX Load Balancer.

### Enable NGINX in Docker Compose

```bash
# Edit docker-compose.yml and uncomment the nginx service
# Ensure SSL certificates are in the ./ssl/ directory
docker-compose up -d nginx
```
### SSL Certificate Setup

```bash
# Using Let's Encrypt (certbot)
certbot certonly --standalone -d your-domain.com

# Copy the certificates
mkdir -p ssl/
cp /etc/letsencrypt/live/your-domain.com/fullchain.pem ssl/cert.pem
cp /etc/letsencrypt/live/your-domain.com/privkey.pem ssl/key.pem

# Update nginx.conf to enable the HTTPS server block
```
## Horizontal Scaling

To add more Theia instances:

```yaml
# docker-compose.yml
services:
  theia-1:
    # ... (same config as theia)
    ports:
      - "3001:3000"
  theia-2:
    # ... (same config as theia)
    ports:
      - "3002:3000"
```

Update nginx.conf:

```nginx
upstream theia_backend {
    least_conn;
    server theia-1:3000 max_fails=3 fail_timeout=30s;
    server theia-2:3000 max_fails=3 fail_timeout=30s;
}
```
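The upstream alone does nothing until a server block forwards traffic to it. A sketch of such a block, including the WebSocket upgrade headers Theia's backend requires (per ADR-017); the domain name is a placeholder and the header set is a minimal assumption, not the project's full nginx.conf:

```nginx
# Sketch: server block forwarding to the upstream above, with the
# WebSocket upgrade headers required by Theia's backend connection.
server {
    listen 80;
    server_name your-domain.com;

    location / {
        proxy_pass http://theia_backend;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;
        proxy_read_timeout 3600s;  # keep long-lived IDE sessions open
    }
}
```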
## Monitoring & Health Checks

### Docker Health Check

```bash
# Check container health
docker ps --filter "name=az1ai-theia" --format "table {{.Names}}\t{{.Status}}"

# View health check logs
docker inspect az1ai-theia --format='{{json .State.Health}}' | jq
```
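The `.State.Health` data above is only populated if the image or compose file defines a health check. If the image does not define one, a sketch of a Dockerfile `HEALTHCHECK` that would produce it (assumes curl is installed in the image):

```dockerfile
# Sketch: HEALTHCHECK instruction feeding the .State.Health data above
# (assumes curl is present in the image; adjust intervals as needed)
HEALTHCHECK --interval=30s --timeout=5s --start-period=40s --retries=3 \
  CMD curl -fsS http://localhost:3000/ || exit 1
```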
### Manual Health Check

```bash
# Check whether Theia is responding
curl -I http://localhost:3000

# Expected output:
# HTTP/1.1 200 OK
# X-Powered-By: Express
```
### Logs

```bash
# Docker container logs
docker logs -f az1ai-theia

# Docker Compose logs
docker-compose logs -f theia

# Application logs (inside the container)
docker exec -it az1ai-theia tail -f /home/theia/app/theia-app/logs/theia.log
```
## Troubleshooting

### Port Already in Use

```bash
# Find the process using port 3000
lsof -i :3000
# or
ss -tlnp | grep 3000

# Kill the process
kill -9 <PID>
```
### Container Won't Start

```bash
# Check the logs
docker logs az1ai-theia

# Rebuild without cache
docker-compose build --no-cache
docker-compose up -d
```
### WebSocket Connection Failed

1. Check the NGINX configuration for WebSocket support:

   ```nginx
   proxy_set_header Upgrade $http_upgrade;
   proxy_set_header Connection "upgrade";
   ```

2. Verify the timeout settings:

   ```nginx
   proxy_read_timeout 3600s;
   ```

3. Check the firewall rules (Cloud Run):

   ```bash
   gcloud compute firewall-rules list
   ```
### LM Studio Connection Failed

```bash
# Test LM Studio from the container
docker exec -it az1ai-theia curl http://host.docker.internal:1234/v1/models

# Verify the environment variables
docker exec -it az1ai-theia env | grep LM_STUDIO
```
## Performance Optimization

### Resource Limits

Docker:

```yaml
# docker-compose.yml
services:
  theia:
    deploy:
      resources:
        limits:
          cpus: '2.0'
          memory: 4G
        reservations:
          cpus: '1.0'
          memory: 2G
```

Cloud Run:

```bash
gcloud run services update az1ai-theia \
  --memory 8Gi \
  --cpu 4 \
  --concurrency 100
```
### Node.js Options

```yaml
# docker-compose.yml
environment:
  - NODE_OPTIONS=--max-old-space-size=4096
```
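To confirm the option actually took effect, V8's reported heap limit can be inspected from inside the container (assumes the `node` binary is on the PATH there):

```bash
# Verify the heap limit increased: v8 reports heap_size_limit in bytes,
# so with --max-old-space-size=4096 it should be a little over 4 GiB.
NODE_OPTIONS=--max-old-space-size=4096 \
  node -e 'console.log(require("v8").getHeapStatistics().heap_size_limit)'
```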
## Security

### Enable Authentication (Cloud Run)

```bash
# Deploy with authentication required
gcloud run deploy az1ai-theia \
  --no-allow-unauthenticated

# Add an IAM policy binding for specific users
gcloud run services add-iam-policy-binding az1ai-theia \
  --member="user:user@example.com" \
  --role="roles/run.invoker"
```
### Environment Secrets

```bash
# Create a secret in Google Secret Manager
echo -n "your-secret-value" | \
  gcloud secrets create lm-studio-api-key --data-file=-

# Grant access to the Cloud Run service account
gcloud secrets add-iam-policy-binding lm-studio-api-key \
  --member="serviceAccount:YOUR_SERVICE_ACCOUNT@YOUR_PROJECT_ID.iam.gserviceaccount.com" \
  --role="roles/secretmanager.secretAccessor"

# Update Cloud Run to use the secret
gcloud run services update az1ai-theia \
  --update-secrets=LM_STUDIO_API_KEY=lm-studio-api-key:latest
```
## Next Steps

- Test MVP Locally ✅ (currently running on port 3000)
- Build Docker Image: `docker build -t az1ai-theia-ide:latest .`
- Test Docker Deployment: `docker-compose up -d`
- Deploy to Production: follow the GCP Cloud Run steps above
- Configure Monitoring: set up Cloud Logging and Cloud Monitoring
- Enable HTTPS: configure SSL certificates for production
- Scale Horizontally: add more Theia instances behind NGINX
## References

- ADR-016: Use NGINX Load Balancer
- ADR-017: WebSocket Backend Architecture
- ADR-020: GCP Cloud Run Deployment
- sdd.md: Software Design Document
- tdd.md: Technical Design Document