---
title: opencode-api
emoji: 🤖
colorFrom: blue
colorTo: purple
sdk: docker
app_port: 7860
pinned: false
license: mit
---
# OpenCode API
An LLM agent API server, ported from the TypeScript [opencode](https://github.com/anomalyco/opencode) project to Python.
## Features
- **Multi-provider LLM support**: Anthropic (Claude), OpenAI (GPT-4)
- **Tool system**: Web search, web fetch, todo management
- **Session management**: Persistent conversations with history
- **SSE streaming**: Real-time streaming responses
- **REST API**: FastAPI with automatic OpenAPI docs
## API Endpoints
### Sessions
- `GET /session` - List all sessions
- `POST /session` - Create a new session
- `GET /session/{id}` - Get session details
- `DELETE /session/{id}` - Delete a session
- `POST /session/{id}/message` - Send a message (SSE streaming response)
- `POST /session/{id}/abort` - Cancel ongoing generation
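
A quick way to exercise the session endpoints from the shell is `curl`. The sketch below assumes the server is running locally on port 7860, that the message body takes a `content` field, and that the session response includes an `id` field (as in the Python example further down); it also uses `jq` to extract the id, which is an extra dependency for this sketch.

```bash
# Create a session and capture its id (jq is assumed to be installed)
SESSION_ID=$(curl -s -X POST http://localhost:7860/session | jq -r '.id')

# Send a message and stream the SSE response (-N disables output buffering)
curl -N -X POST "http://localhost:7860/session/$SESSION_ID/message" \
  -H "Content-Type: application/json" \
  -d '{"content": "Hello"}'

# Delete the session when done
curl -X DELETE "http://localhost:7860/session/$SESSION_ID"
```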
### Providers
- `GET /provider` - List available LLM providers
- `GET /provider/{id}` - Get provider details
- `GET /provider/{id}/model` - List provider models
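
The provider endpoints can be browsed the same way. The `anthropic` id below is only an assumption for illustration; use whichever ids `GET /provider` actually returns.

```bash
# List configured providers
curl http://localhost:7860/provider

# Inspect one provider and its models (provider id is assumed here)
curl http://localhost:7860/provider/anthropic
curl http://localhost:7860/provider/anthropic/model
```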
### Events
- `GET /event` - Subscribe to real-time events (SSE)
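
Because this is an SSE endpoint, keep the connection open and read events as they arrive. A minimal sketch:

```bash
# -N turns off buffering so events print as they are emitted
curl -N -H "Accept: text/event-stream" http://localhost:7860/event
```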
## Environment Variables
Set these as Hugging Face Space secrets:
| Variable | Description |
| -------------------------- | ----------------------------------- |
| `ANTHROPIC_API_KEY` | Anthropic API key for Claude models |
| `OPENAI_API_KEY` | OpenAI API key for GPT models |
| `BLABLADOR_API_KEY` | Blablador API key |
| `TOKEN` | Authentication token for API access |
| `OPENCODE_SERVER_PASSWORD` | Optional: Basic auth password |
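
For local development you can export the same variables in your shell before starting the server. The values below are placeholders; set only the keys you actually need.

```bash
# Placeholder values - substitute your real keys
export ANTHROPIC_API_KEY="sk-ant-..."
export OPENAI_API_KEY="sk-..."
export TOKEN="choose-a-random-token"
```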
## Local Development
```bash
# Install dependencies
pip install -r requirements.txt
# Run server
python app.py
# Or with uvicorn
uvicorn app:app --host 0.0.0.0 --port 7860 --reload
```
## API Documentation
Once running, visit:
- Swagger UI: `http://localhost:7860/docs`
- ReDoc: `http://localhost:7860/redoc`
## Example Usage
```python
import httpx

# Create a session
response = httpx.post("http://localhost:7860/session")
session = response.json()
session_id = session["id"]

# Send a message (with SSE streaming)
with httpx.stream(
    "POST",
    f"http://localhost:7860/session/{session_id}/message",
    json={"content": "Hello, what can you help me with?"},
) as response:
    for line in response.iter_lines():
        if line.startswith("data: "):
            print(line[6:])
```
## License
MIT