---
title: First Agent Template
emoji: 
colorFrom: pink
colorTo: yellow
sdk: gradio
sdk_version: 5.49.1
app_file: app.py
pinned: false
tags:
- smolagents
- agent
- smolagent
- tool
- agent-course
---

Check out the configuration reference at <https://huggingface.co/docs/hub/spaces-config-reference>

## Clone repository

```shell
git clone https://huggingface.co/spaces/2stacks/First_agent_template
cd First_agent_template
```

## Create and activate Python environment

```shell
python -m venv env
source env/bin/activate
```

On Windows, activate the environment with `env\Scripts\activate` instead.

## Configuration (Optional)

The application uses environment variables for model configuration. Create a `.env` file in the project root to customize settings:

```shell
# Ollama configuration (for local models)
OLLAMA_BASE_URL=http://localhost:11434
OLLAMA_MODEL_ID=qwen2.5-coder:32b

# HuggingFace configuration (fallback when Ollama is unavailable)
HF_MODEL_ID=Qwen/Qwen2.5-Coder-32B-Instruct
```

**Environment Variables:**

- `OLLAMA_BASE_URL`: URL for your Ollama service (default: `http://localhost:11434`)
- `OLLAMA_MODEL_ID`: Model name in Ollama (default: `qwen2.5-coder:32b`)
- `HF_MODEL_ID`: HuggingFace model to use as fallback (default: `Qwen/Qwen2.5-Coder-32B-Instruct`)

The app automatically checks whether the specified model is available on the Ollama service; if it is not, the app falls back to the HuggingFace model.
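The fallback behavior described above can be sketched in Python. This is a minimal illustration, not the actual code in `app.py`: the helper names `ollama_available` and `resolve_model` are hypothetical, and the probe against Ollama's `/api/tags` endpoint is one common way to check that the server is up.

```python
import os
import urllib.request


def ollama_available(base_url: str, timeout: float = 2.0) -> bool:
    """Return True if an Ollama server responds at base_url.

    Hypothetical helper: probes the /api/tags endpoint and treats
    any connection error or timeout as "unavailable".
    """
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=timeout):
            return True
    except OSError:
        return False


def resolve_model() -> str:
    """Pick the Ollama model when the local service is reachable,
    otherwise fall back to the HuggingFace model.

    Defaults mirror the environment variables documented above.
    """
    base_url = os.getenv("OLLAMA_BASE_URL", "http://localhost:11434")
    if ollama_available(base_url):
        return os.getenv("OLLAMA_MODEL_ID", "qwen2.5-coder:32b")
    return os.getenv("HF_MODEL_ID", "Qwen/Qwen2.5-Coder-32B-Instruct")
```

With no Ollama server running, `resolve_model()` returns the HuggingFace model ID.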

## Install dependencies and run

```shell
pip install -r requirements.txt
python app.py
```

## Run with Docker

```shell
docker run -it -p 7860:7860 \
    --platform=linux/amd64 \
    -e HF_TOKEN="YOUR_VALUE_HERE" \
    -e OLLAMA_BASE_URL="http://localhost:11434" \
    -e OLLAMA_MODEL_ID="qwen2.5-coder:32b" \
    -e HF_MODEL_ID="Qwen/Qwen2.5-Coder-32B-Instruct" \
    registry.hf.space/2stacks-first-agent-template:latest python app.py
```

Note: inside the container, `localhost` refers to the container itself. To reach an Ollama service running on the host machine, set `OLLAMA_BASE_URL` to `http://host.docker.internal:11434` (Docker Desktop) or to the host's IP address.