ViT5-Base Finetuned on vietnews for Abstractive Summarization (no task prefix needed)

State-of-the-art pretrained Transformer-based encoder-decoder model for Vietnamese.

How to use

For more details, see our GitHub repo and evaluation script.
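For a quick one-off summary, the checkpoint can also be driven through the high-level pipeline API. This is a minimal sketch, assuming the checkpoint's config resolves to the summarization task (as is typical for T5-style seq2seq models); the snippet after it gives full control over tokenization and generation.

from transformers import pipeline

# Minimal sketch: wrap the checkpoint in a summarization pipeline.
# device=-1 runs on CPU; use device=0 to select the first GPU.
summarizer = pipeline("summarization", model="ViFortune-AI/ViT5Summer", device=-1)

sentence = "Bạn đã thanh toán cho cà phê không?>> Hmm... tôi nghĩ không phải là vậy, nhưng nó cũng không sao, tôi sẽ thanh toán anh ta mai nhé."
result = summarizer(sentence, max_length=256, min_length=10, num_beams=4)
print(result[0]["summary_text"])

For full control over tokenization and generation, load the model and tokenizer directly: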

import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Load the model and tokenizer
model_name = "ViFortune-AI/ViT5Summer"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# Use a GPU when available, fall back to CPU otherwise
device = "cuda" if torch.cuda.is_available() else "cpu"
model.to(device)

# YOUR INPUT: the raw dialogue text, exactly as it appears in the dataset
sentence = "Bạn đã thanh toán cho cà phê không?>> Hmm... tôi nghĩ không phải là vậy, nhưng nó cũng không sao, tôi sẽ thanh toán anh ta mai nhé."

# NOTE: do NOT prepend "summarize:" and do NOT append "</s>"
encoding = tokenizer(
    sentence,
    return_tensors="pt",
    max_length=512,
    truncation=True,
    padding=False  # or "max_length" for fixed-length inputs
)

input_ids = encoding["input_ids"].to(device)
attention_mask = encoding["attention_mask"].to(device)

# Generate the summary with beam search
outputs = model.generate(
    input_ids=input_ids,
    attention_mask=attention_mask,
    max_length=256,           # upper bound on summary length (in tokens)
    min_length=10,            # avoid degenerate, very short summaries
    num_beams=4,              # beam search width
    early_stopping=True,      # stop once all beams are finished
    no_repeat_ngram_size=2,   # block repeated bigrams in the output
    length_penalty=1.0        # neutral preference for output length
)

# Decode the generated token IDs back into text
for output in outputs:
    summary = tokenizer.decode(output, skip_special_tokens=True, clean_up_tokenization_spaces=True)
    print("Summary:", summary)

Citation

@inproceedings{phan-etal-2022-vit5,
    title = "{V}i{T}5: Pretrained Text-to-Text Transformer for {V}ietnamese Language Generation",
    author = "Phan, Long and Tran, Hieu and Nguyen, Hieu and Trinh, Trieu H.",
    booktitle = "Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies: Student Research Workshop",
    year = "2022",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2022.naacl-srw.18",
    pages = "136--142",
}