---
library_name: transformers
tags:
- lora
- sequence-classification
- end-of-utterance
- multilingual
- english
- spanish
license: apache-2.0
datasets:
- marc-es/orga-dynamic-dataset
model_type: llama
language:
- es
- en
base_model:
- HuggingFaceTB/SmolLM2-135M-Instruct
metrics:
- accuracy
---
# Orga Dynamic (1) — Bilingual End-of-Utterance Classifier
**Orga Dynamic (1)** is a LoRA (Low-Rank Adaptation) adapter trained to automatically detect the **end of turn** (End of Utterance, EOU) in conversations.
- **Base model:** `HuggingFaceTB/SmolLM2-135M-Instruct`
- **Method:** LoRA r=16 / α=32 on `q_proj`, `k_proj`, `v_proj`, `o_proj` (see the config sketch after the metrics table)
- **Training data:** 4,000 conversational turns
- **Metrics (20 % test split)**
| Metric | EN + ES |
|--------|---------|
| Accuracy | **0.951** |
| F1 | **0.948** |
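For reference, a minimal `peft` config sketch matching the hyperparameters listed above. The rank, alpha, and target modules come from this card; the dropout value and task type are assumptions, since they are not stated here.
```python
from peft import LoraConfig

# Sketch of the adapter config implied by the card: r=16, alpha=32,
# attention projections as targets. Dropout is an assumption.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    lora_dropout=0.05,    # assumption: common default, not given on the card
    task_type="SEQ_CLS",  # sequence classification (EOU vs. NO_EOU)
)
```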
---
## Model Details
| | |
|---|---|
| **Languages** | English (en), Spanish (es) |
| **Labels** | `0 = NO_EOU`, `1 = EOU` |
| **Precision** | fp16 (LoRA weights ≈ 5 MB) |
| **License** | Apache 2.0 |
| **Author** | @marc-es |
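The label mapping and precision above can be made explicit at load time. A hedged sketch: the `id2label` names are taken from the table, everything else is standard `transformers` usage rather than anything this checkpoint requires.
```python
import torch
from transformers import AutoModelForSequenceClassification

# Load the base model in fp16 with the card's label names attached.
base = AutoModelForSequenceClassification.from_pretrained(
    "HuggingFaceTB/SmolLM2-135M-Instruct",
    num_labels=2,
    id2label={0: "NO_EOU", 1: "EOU"},
    label2id={"NO_EOU": 0, "EOU": 1},
    torch_dtype=torch.float16,
)
```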
---
## Quick Start
```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification
from peft import PeftModel

# Load the base model with a 2-label head, then attach the LoRA adapter
base = AutoModelForSequenceClassification.from_pretrained(
    "HuggingFaceTB/SmolLM2-135M-Instruct", num_labels=2)
model = PeftModel.from_pretrained(base, "marc-es/orga-dynamic-1")
tok = AutoTokenizer.from_pretrained("marc-es/orga-dynamic-1")
model.eval()

def is_end(text: str) -> bool:
    """Return True if the model predicts the utterance is complete (EOU)."""
    with torch.no_grad():
        logits = model(**tok(text, return_tensors="pt")).logits
    return logits.argmax(-1).item() == 1  # 1 = EOU
```
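A quick bilingual check (the comments show the expected behavior of an EOU classifier, not guaranteed outputs):
```python
print(is_end("¿Puedes ayudarme con esto?"))   # complete question -> expected True
print(is_end("So what I wanted to say was"))  # trailing utterance -> expected False
```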