Cecilia: The Cuban Language Model

Cecilia is a family of language models continually pretrained specifically on Cuban written text, capturing the linguistic, cultural, and social nuances of Cuban Spanish. These models are designed to support natural language processing tasks with a focus on Cuban language varieties and cultural context.

About Cecilia FT MS v1

This model is a fine-tuned version of Cecilia 2B v0.1, which is itself a continual pre-training of Salamandra 2B. It belongs to the Cecilia collection.

Model Formats

This repository is a Hybrid Release containing:

  • Safetensors (BF16): For use with Hugging Face transformers.
  • GGUF (FP16): For use with llama.cpp, vLLM, or other local inference tools (see the sketch after this list).
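
As a minimal sketch of local inference with the GGUF file, the snippet below uses the llama-cpp-python bindings. The filename glob and context size are assumptions, not confirmed details of this release; check the repository's file listing for the actual .gguf name.

# Minimal local-inference sketch with llama-cpp-python
# (pip install llama-cpp-python huggingface_hub).
# NOTE: the filename glob below is an assumption; verify the actual GGUF
# filename in the repository before relying on it.
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="gia-uh/cecilia_ft_ms_v1",
    filename="*.gguf",  # assumed: matches the single FP16 GGUF in this repo
    n_ctx=2048,         # assumed context window for this sketch
)

out = llm("Hola, ¿qué bolá?", max_tokens=64)
print(out["choices"][0]["text"])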

Quantizations

Official quantized GGUF versions (Q8_0, Q6_K, Q4_K_M) are available in the companion repository gia-uh/cecilia-2b-instruct-v1-GGUF.
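
As a hedged sketch, a specific quantization can be fetched with huggingface_hub and then passed to any GGUF-compatible runtime; the filename below is hypothetical and should be matched against the actual files in that repository.

# Sketch: download one quantized GGUF from the companion repository.
# The filename is hypothetical; list the repo files to find the real one.
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="gia-uh/cecilia-2b-instruct-v1-GGUF",
    filename="cecilia-2b-instruct-v1.Q4_K_M.gguf",  # hypothetical filename
)
print(path)  # local cache path, usable as llama.cpp's --model argument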

Quickstart (Transformers)

from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "gia-uh/cecilia_ft_ms_v1"

# Load model and tokenizer
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id, trust_remote_code=False)

# Simple inference: generate a short continuation of a Cuban Spanish prompt
inputs = tokenizer("Hola, ¿qué bolá?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
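
Since this is a fine-tuned model, chat-style prompting may work better than raw continuation. The sketch below assumes the tokenizer ships a chat template (check tokenizer_config.json to confirm) and reuses the model and tokenizer loaded above.

# Hedged chat-style sketch; assumes the tokenizer defines a chat template.
messages = [
    {"role": "user", "content": "¿Qué lugares recomiendas visitar en La Habana?"}
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)
output_ids = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))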