# Warbler CDA Docker Build Performance

## Build Configuration
- Dockerfile: Minimal FractalStat testing setup
- Base Image: python:3.11-slim
- Build Context Optimization: .dockerignore excludes cache files and large directories
- Dependency Strategy: Minimal ML dependencies for FractalStat testing
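The `.dockerignore` mentioned above is what shrinks the build context to a few hundred bytes. The actual file is not reproduced in this report; the patterns below are an illustrative sketch of the kind of exclusions it would need:

```
# Illustrative .dockerignore patterns -- the actual file is not shown here
__pycache__/
*.pyc
.pytest_cache/
.git/
# Large directories (hypothetical names) that must not enter the build context
data/
models/
```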
## Performance Measurements

### Optimized Build Results (Windows with WSL)
```
FINAL OPTIMIZED BUILD: 38.4 seconds (~40 seconds)
├── Base Image Pull: 3.7 seconds
├── System Dependencies: 20.5 seconds (git install)
├── Dependencies (pip install): 5.8 seconds
│   ├── pydantic>=2.0.0 (only needed library!)
│   └── pytest>=7.0.0 (testing framework)
├── Code Copy: 0.2 seconds
├── Layer Export: 6.4 seconds
└── Image Unpack: 1.7 seconds
```
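The stage timings above can be recomputed to show where the remaining time goes: the system-dependency layer (the git install) accounts for over half of the wall time. A quick sketch, using the figures copied from the breakdown:

```python
# Stage timings from the build breakdown above, in seconds.
stages = {
    "Base Image Pull": 3.7,
    "System Dependencies": 20.5,
    "Dependencies (pip install)": 5.8,
    "Code Copy": 0.2,
    "Layer Export": 6.4,
    "Image Unpack": 1.7,
}

total = sum(stages.values())  # 38.3s, matching the 38.4s wall time within rounding
for name, secs in sorted(stages.items(), key=lambda kv: -kv[1]):
    print(f"{name:28s} {secs:5.1f}s  ({secs / total:5.1%})")
```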
## Performance Improvement Achieved

**Optimization Results:**
- Build Time Reduction: 94% faster (601.6s → 38.4s)
- Pip Install Reduction: 98% faster (295.6s → 5.8s)
- Context Size: 556 B (after final .dockerignore tuning)
- Expected Image Size: ~250MB (vs. the 12.29GB bloated image)
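The reduction percentages quoted above follow directly from the raw timings; a quick arithmetic check:

```python
# Raw timings reported above, in seconds: original vs. optimized.
orig_build, new_build = 601.6, 38.4
orig_pip, new_pip = 295.6, 5.8

# Fraction of time removed, expressed as a percentage
build_cut = (orig_build - new_build) / orig_build * 100
pip_cut = (orig_pip - new_pip) / orig_pip * 100

print(f"Build time reduction: {build_cut:.0f}%")   # -> 94%
print(f"Pip install reduction: {pip_cut:.0f}%")    # -> 98%
```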
**Bottleneck Eliminated:**
- Removed the PyTorch/Transformers dependency chain, which caused 98% of the bloat
- FractalStat modules require zero ML libraries
- Pure Python: dataclasses, enums, typing, json
**Root Cause Identified:**
The original bloat was caused by `transformers[torch]` pulling in:
- PyTorch CPU (~1GB)
- 100+ optional dependencies (~11GB)
- All unnecessary for FractalStat core functionality
## Recommendations for Faster Builds

### For Development Builds
- Use cached layers - Base image and system dependencies rarely change
- Separate dependency layers - Cache pip installs when code changes frequently
- Minimal dependencies - Only install what's needed for testing FractalStat specifically
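The three recommendations above combine naturally in one Dockerfile. A sketch, assuming a `requirements.txt` holding just pydantic and pytest (file names are illustrative, not taken from the actual build):

```dockerfile
# Sketch: cache-friendly layer ordering (file names are illustrative)
FROM python:3.11-slim

WORKDIR /app

# System dependencies change rarely -- keep them in an early, cached layer
RUN apt-get update && apt-get install -y --no-install-recommends git \
    && rm -rf /var/lib/apt/lists/*

# Copy only the dependency manifest first, so code edits do not
# invalidate the pip layer
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Code changes frequently -- copy it last
COPY . .
```

With this ordering, an edit to the FractalStat modules re-runs only the final `COPY`, so rebuilds skip the 20.5s git install and the 5.8s pip install entirely.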
### For Production Builds
- Multi-stage builds - Separate testing and runtime images
- Dependency optimization - Use Docker layer caching more effectively
- Alternative base images - Consider smaller Python images or compiled binaries
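A multi-stage split along the lines suggested above might look like the following sketch; the stage layout and the final `CMD` are assumptions for illustration, not taken from the actual build:

```dockerfile
# Sketch: shared base, with separate test and runtime stages
FROM python:3.11-slim AS base
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .

# Test stage: adds pytest on top of the base image and runs the suite
FROM base AS test
RUN pip install --no-cache-dir "pytest>=7.0.0"
RUN python -m pytest

# Runtime stage: ships the base image without any test tooling
FROM base AS runtime
CMD ["python"]  # placeholder entrypoint
```

`docker build --target test .` gates the image on the test suite, while `docker build --target runtime .` produces the slim production image.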
## Testing Results
- All 70 FractalStat entity tests pass
- FractalStat coordinates and entities work correctly
- RAG bridge integration functions properly
- Container startup and imports work as expected
## Performance Notes
- First-time build of the original ML-heavy image: ~10 minutes (601.6s); the optimized image builds in ~40 seconds
- Subsequent builds: Should be faster with Docker layer caching
- Network dependency: Download times vary by internet connection
- WSL overhead: Minimal impact on overall build time