# Laddr
**A transparent, Docker-native, observable, distributed agent framework.**
Laddr is an alternative to CrewAI that strips away excessive abstractions and introduces a real distributed runtime, local observability, and explicit agent communication.
## Philosophy
CrewAI is too abstract, making it nearly impossible to understand or debug what's happening under the hood.
Laddr fixes this by being:
- **Transparent** – All logic (task flow, prompts, tool calls) is visible and traceable
- **Pluggable** – Configure your own queues, databases, models, or tools
- **Observable** – Every agent action is recorded via OpenTelemetry
- **Containerized** – Everything runs inside Docker for predictable behavior
> **In short:** Laddr = CrewAI with explicit communication, Docker-native execution, local observability, and zero hidden magic.
## Architecture
### Communication Model
Unlike CrewAI's internal synchronous calls, Laddr uses **Redis Streams** for explicit message passing:
```
Controller → Redis Queue → Agent Worker → Redis Response Stream
```
Each agent runs in its own container and consumes tasks from a dedicated Redis stream.
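The flow above can be made concrete with a small in-process simulation. This is an illustrative sketch that stands in `queue.Queue` for the Redis Streams, not Laddr's actual implementation:

```python
import json
import queue
import threading
import uuid

# Stand-ins for the Redis task stream and response stream (illustrative only).
task_stream: queue.Queue = queue.Queue()
response_stream: queue.Queue = queue.Queue()

def worker() -> None:
    """Agent worker: consume one task, process it, publish a response."""
    task = json.loads(task_stream.get())
    result = f"summary of: {task['payload']['description']}"
    response_stream.put(json.dumps({
        "task_id": task["task_id"],
        "status": "completed",
        "result": result,
    }))

# Controller publishes a task message to the agent's stream.
task_id = str(uuid.uuid4())
task_stream.put(json.dumps({
    "task_id": task_id,
    "target_agent": "summarizer",
    "payload": {"description": "Summarize this document"},
}))

t = threading.Thread(target=worker)
t.start()
t.join()

response = json.loads(response_stream.get())
print(response["status"])  # completed
```

Because the queue is explicit, every hop (publish, consume, respond) is a visible, inspectable event, which is the property Laddr's Redis-based bus provides across containers.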
### Services
- **PostgreSQL** (with pgvector) – Stores traces, job history, agent metadata
- **Redis** – Message bus for task distribution
- **MinIO** – S3-compatible storage for artifacts and large payloads
- **Jaeger** – OpenTelemetry trace collection and visualization
- **Prometheus** – Metrics collection and monitoring
- **API Server** – FastAPI server for job submission and queries
- **Worker Containers** – One per agent; each consumes and processes tasks
- **Dashboard** – Real-time monitoring and agent interaction
## Quick Start
### Installation
```bash
# Clone the repository
git clone https://github.com/laddr/laddr.git
cd laddr/lib/laddr
# Install locally (for now)
pip install -e .
```
### Create a Project
```bash
# Initialize a new project
laddr init my_project
# Navigate to project
cd my_project
# Configure API keys in .env
# Edit .env and add your GEMINI_API_KEY and SERPER_API_KEY
# Start the environment (includes default researcher agent)
laddr run dev
```
This will start all services with a working researcher agent and web_search tool ready to use.
**What's included out-of-the-box:**
- Default `researcher` agent with Gemini 2.5 Flash
- `web_search` tool powered by Serper.dev
- Sample `research_pipeline.yml`
- Full observability stack (Jaeger, Prometheus, Dashboard)
Access the dashboard at `http://localhost:5173` to interact with your agents.
## Project Structure
```
my_project/
├── laddr.yml              # Project configuration
├── docker-compose.yml     # Docker services (auto-generated)
├── Dockerfile             # Container definition
├── .env                   # Environment variables
├── agents/                # Agent configurations
│   ├── summarizer/
│   │   └── agent.yml
│   └── analyzer/
│       └── agent.yml
├── tools/                 # Custom tools
│   └── my_tool.py
└── pipelines/             # Pipeline definitions
    └── analysis_pipeline.yml
```
## Creating Agents
### Add an Agent
```bash
laddr add agent researcher
```
This will:
1. Create `agents/researcher/agent.yml`
2. Add worker service to `docker-compose.yml`
3. Register agent in `laddr.yml`
**Note**: A default `researcher` agent with `web_search` tool is created automatically when you run `laddr init`.
### Agent Configuration
`agents/researcher/agent.yml`:
```yaml
name: researcher
role: Research Agent
goal: Research topics on the web and summarize findings concisely
backstory: A helpful researcher that gathers and condenses information from reliable web sources
llm:
provider: gemini
model: gemini-2.5-flash
api_key: ${GEMINI_API_KEY}
temperature: 0.7
max_tokens: 2048
tools:
- web_search
max_iterations: 15
allow_delegation: false
verbose: true
```
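The `${GEMINI_API_KEY}` placeholder is resolved from the environment when the config is loaded. A minimal sketch of that substitution step (the exact mechanism inside Laddr may differ; `expand_env` is a hypothetical helper):

```python
import os
import re

def expand_env(text: str) -> str:
    """Replace ${VAR} placeholders with values from the environment."""
    return re.sub(r"\$\{(\w+)\}", lambda m: os.environ.get(m.group(1), ""), text)

os.environ["GEMINI_API_KEY"] = "demo-key"
line = "api_key: ${GEMINI_API_KEY}"
print(expand_env(line))  # api_key: demo-key
```

Unset variables expand to an empty string here; a real loader would more likely raise an error so a missing key fails fast.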
### LLM Providers
Laddr supports multiple LLM providers:
- **Gemini** (default) - Google's Gemini models
- **OpenAI** - GPT-4, GPT-3.5, etc.
- **Anthropic** - Claude models
- **Groq** - Fast inference
- **Ollama** - Local models
- **llama.cpp** - Local C++ inference
Set your API keys in `.env`:
```bash
GEMINI_API_KEY=your_key_here
OPENAI_API_KEY=your_key_here
ANTHROPIC_API_KEY=your_key_here
GROQ_API_KEY=your_key_here
```
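Selecting a provider amounts to mapping the `llm.provider` field in `agent.yml` to the matching key from `.env`. A hedged sketch of that lookup (the `resolve_api_key` helper is hypothetical, not part of Laddr's API; env variable names are from the list above):

```python
import os
from typing import Optional

# Provider name (as used in agent.yml) -> environment variable holding its key.
PROVIDER_KEYS = {
    "gemini": "GEMINI_API_KEY",
    "openai": "OPENAI_API_KEY",
    "anthropic": "ANTHROPIC_API_KEY",
    "groq": "GROQ_API_KEY",
}

def resolve_api_key(provider: str) -> Optional[str]:
    """Return the configured key for a provider; local providers (ollama, llama.cpp) need none."""
    env_var = PROVIDER_KEYS.get(provider)
    return os.environ.get(env_var) if env_var else None

os.environ["GEMINI_API_KEY"] = "demo"
print(resolve_api_key("gemini"))  # demo
print(resolve_api_key("ollama"))  # None
```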
## Custom Tools
### Default Tool: web_search
A `web_search` tool using Serper.dev is included by default:
```python
# tools/web_search.py
def web_search(query: str, max_results: int = 5) -> str:
"""Search the web using Serper.dev API."""
# Uses SERPER_API_KEY from .env
# Get your free API key at https://serper.dev
```
**Setup**: Add your Serper.dev API key to `.env`:
```bash
SERPER_API_KEY=your_serper_key_here
```
### Add More Tools
```bash
laddr add tool my_custom_tool
```
Edit `tools/my_custom_tool.py`:
```python
def my_custom_tool(param: str) -> str:
    """Your custom tool logic."""
    result = f"processed: {param}"  # replace with your own logic
    return result
```
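Because tools are plain Python functions, a framework can derive an LLM-facing description from a tool's signature and docstring. An illustrative sketch of that idea (not Laddr's actual registration code; `describe_tool` is hypothetical):

```python
import inspect

def my_custom_tool(param: str) -> str:
    """Your custom tool logic."""
    return f"processed: {param}"

def describe_tool(fn) -> dict:
    """Build a simple tool descriptor from a function's signature and docstring."""
    params = {}
    for name, p in inspect.signature(fn).parameters.items():
        if p.annotation is inspect.Parameter.empty:
            params[name] = "any"
        else:
            params[name] = p.annotation.__name__
    return {
        "name": fn.__name__,
        "description": inspect.getdoc(fn),
        "parameters": params,
    }

print(describe_tool(my_custom_tool))
```

A descriptor like this is what gets handed to the LLM so it knows the tool's name, purpose, and expected arguments.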
## Pipelines
A sample pipeline (`research_pipeline.yml`) is created automatically on init.
### Example Pipeline
`pipelines/research_pipeline.yml`:
```yaml
name: research_pipeline
description: Example research pipeline using the researcher agent
tasks:
- name: search_topic
description: "Search the web for information about: {topic}"
agent: researcher
expected_output: A comprehensive summary of web search results
tools:
- web_search
async_execution: false
- name: analyze_results
description: Analyze the search results and extract key insights
agent: researcher
expected_output: Key insights and recommendations based on the research
context:
- search_topic
async_execution: false
```
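The `context` field makes task dependencies explicit: `analyze_results` consumes the output of `search_topic`. A runner can derive execution order from those references with a topological sort; a minimal sketch of that idea (not Laddr's actual scheduler):

```python
from graphlib import TopologicalSorter

# Task name -> task names listed in its `context` field (from the pipeline above).
deps = {
    "search_topic": [],
    "analyze_results": ["search_topic"],
}

# static_order() yields tasks with all their context dependencies first.
order = list(TopologicalSorter(deps).static_order())
print(order)  # ['search_topic', 'analyze_results']
```

`TopologicalSorter` also raises `CycleError` on circular `context` references, which is a useful validation to run before submitting a pipeline.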
### Run a Pipeline
```bash
laddr run pipeline pipelines/research_pipeline.yml
```
Note: Pipeline inputs are defined in the YAML file or can be passed via API.
## Observability
### View Traces
Navigate to Jaeger at `http://localhost:16686` to see:
- Task execution traces
- LLM API calls
- Tool invocations
- Error spans
### View Metrics
Navigate to Prometheus at `http://localhost:9090` to query:
- `laddr_agent_task_duration_seconds` – Task execution time
- `laddr_queue_depth` – Pending tasks per agent
- `laddr_tokens_total` – Token usage
- `laddr_errors_total` – Error counts
### Agent Logs
```bash
# View logs for an agent
laddr logs summarizer
# Follow logs in real-time
laddr logs summarizer -f
```
## API Reference
### Submit Job
```bash
curl -X POST http://localhost:8000/jobs \
-H "Content-Type: application/json" \
-d '{
"pipeline_name": "analysis",
"inputs": {"document": "report.pdf"}
}'
```
### Get Job Status
```bash
curl http://localhost:8000/jobs/{job_id}
```
### List Agents
```bash
curl http://localhost:8000/agents
```
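The same calls from Python, using only the standard library. This is a sketch against the endpoints above; it assumes the API server on its default port and that the submit response carries a `job_id` field:

```python
import json
from urllib import request

API_BASE = "http://localhost:8000"  # default API server address

def submit_job(pipeline_name: str, inputs: dict) -> str:
    """POST /jobs and return the new job's id (field name assumed)."""
    body = json.dumps({"pipeline_name": pipeline_name, "inputs": inputs}).encode()
    req = request.Request(
        f"{API_BASE}/jobs",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["job_id"]

def job_status(job_id: str) -> dict:
    """GET /jobs/{job_id} and return the decoded JSON body."""
    with request.urlopen(f"{API_BASE}/jobs/{job_id}") as resp:
        return json.loads(resp.read())
```

Usage would be `job_id = submit_job("analysis", {"document": "report.pdf"})` followed by polling `job_status(job_id)` until the status is terminal.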
## Dashboard
Access the dashboard at `http://localhost:5173` to:
- View all active agents
- Monitor real-time logs
- Inspect OpenTelemetry traces
- Interact with individual agents
- Visualize job workflows
- Check system health metrics
## Docker Commands
```bash
# Start all services
laddr run dev
# View logs
laddr logs <agent_name>
# Stop all services
laddr stop
# Rebuild containers
docker compose up -d --build
```
## Configuration
### Environment Variables
Edit `.env` to customize:
```bash
DATABASE_URL=postgresql://postgres:postgres@postgres:5432/laddr
REDIS_URL=redis://redis:6379
MINIO_ENDPOINT=minio:9000
OTEL_EXPORTER_OTLP_ENDPOINT=http://jaeger:4318
API_HOST=0.0.0.0
API_PORT=8000
```
### Project Configuration
Edit `laddr.yml`:
```yaml
project:
name: my_project
broker: redis
database: postgres
storage: minio
tracing: true
metrics: true
agents:
- summarizer
- analyzer
```
## Message Format
### Task Message
```json
{
"task_id": "uuid",
"job_id": "uuid",
"source_agent": "controller",
"target_agent": "summarizer",
"payload": {
"description": "Summarize this document",
"context": "...",
"expected_output": "..."
},
"trace_parent": "trace-id",
"created_at": "timestamp"
}
```
### Response Message
```json
{
"task_id": "uuid",
"job_id": "uuid",
"agent_name": "summarizer",
"status": "completed",
"result": {"output": "..."},
"metrics": {
"tokens": 2200,
"latency_ms": 5200
},
"trace_parent": "trace-id",
"completed_at": "timestamp"
}
```
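Both messages are plain JSON, so producing one is a dictionary and a `json.dumps` away. A sketch of building a task message matching the schema above (the `make_task_message` helper is hypothetical; publishing to the Redis stream is omitted):

```python
import json
import uuid
from datetime import datetime, timezone

def make_task_message(source: str, target: str, description: str, job_id: str) -> str:
    """Serialize a task message in the format shown above."""
    message = {
        "task_id": str(uuid.uuid4()),
        "job_id": job_id,
        "source_agent": source,
        "target_agent": target,
        "payload": {"description": description, "context": "", "expected_output": ""},
        "trace_parent": "",  # filled in by the tracing layer
        "created_at": datetime.now(timezone.utc).isoformat(),
    }
    return json.dumps(message)

raw = make_task_message("controller", "summarizer", "Summarize this document", str(uuid.uuid4()))
decoded = json.loads(raw)
print(decoded["target_agent"])  # summarizer
```

Keeping the wire format this simple is what lets any service on the bus (dashboard, tracer, worker) decode and inspect traffic without framework-specific deserializers.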
## Development
### Prerequisites
- Python 3.10+
- Docker & Docker Compose
- Git
### Setup
```bash
# Clone repository
git clone https://github.com/laddr/laddr.git
cd laddr
# Install dependencies
cd lib/laddr
pip install -e .[dev]
# Run tests
pytest
```
## CLI Reference
```bash
laddr init [project_name] # Initialize new project
laddr add agent <name> # Add new agent
laddr add tool <name> # Add custom tool
laddr run dev # Start development environment
laddr run agent <agent> # Run single agent locally
laddr run pipeline <file.yml> # Run a pipeline
laddr logs <agent> # View agent logs
laddr stop # Stop all services
```
## Laddr vs CrewAI
| Feature | CrewAI | Laddr |
|---------|--------|---------|
| **Communication** | Hidden internal calls | Explicit Redis message bus |
| **Runtime** | In-memory Python | Docker containers per agent |
| **Observability** | Limited logging | Full OpenTelemetry + Prometheus |
| **Scalability** | Single process | Distributed workers |
| **Transparency** | Opaque orchestration | Visible task flow |
| **Storage** | In-memory | MinIO/S3 for artifacts |
| **Monitoring** | None | Dashboard + Jaeger + Prometheus |
| **Configuration** | Code-based | YAML + Docker Compose |
## Contributing
Contributions are welcome! Please see [CONTRIBUTING.md](CONTRIBUTING.md) for guidelines.
## License
MIT License - see [LICENSE](LICENSE) for details.
## Links
- **Documentation**: Coming soon
- **GitHub**: https://github.com/laddr/laddr
- **Issues**: https://github.com/laddr/laddr/issues
---
**Built with transparency in mind. No hidden magic. Just distributed agents.**