
Biblo XLM-RoBERTa Model

This repository contains a fine-tuned XLM-RoBERTa model.

Model Information

This model is based on the XLM-RoBERTa architecture and has been fine-tuned for a specific downstream task.
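
If you want to check the architecture details before downloading the full weights, a minimal sketch (assuming the repository ID used in the usage example below) is to inspect the configuration:

from transformers import AutoConfig

# Assumes the repo ID from the usage section; adjust if you use a different ID
config = AutoConfig.from_pretrained("vanguard-huggingface/biblo-model")
print(config.model_type)          # expected: "xlm-roberta"
print(config.hidden_size)         # embedding dimension
print(config.num_hidden_layers)   # number of transformer layers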

Usage

You can load the model and tokenizer with the following code:

from transformers import XLMRobertaModel, XLMRobertaTokenizer

# Load the model and tokenizer
model_name = "vanguard-huggingface/biblo-model"  # replace with your own model ID if needed
model = XLMRobertaModel.from_pretrained(model_name)
tokenizer = XLMRobertaTokenizer.from_pretrained(model_name)

# Example usage
text = "Put your example text here."
inputs = tokenizer(text, return_tensors="pt")
outputs = model(**inputs)
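
The card does not specify the fine-tuning task, so how you use the outputs depends on your application. As one hedged example, continuing from the snippet above, you can mean-pool the last hidden state into a single sentence vector:

# outputs.last_hidden_state has shape (batch_size, seq_len, hidden_size).
# Mean-pool over tokens, masking out padding, to get one vector per input.
mask = inputs["attention_mask"].unsqueeze(-1).float()
sentence_embedding = (outputs.last_hidden_state * mask).sum(dim=1) / mask.sum(dim=1)
print(sentence_embedding.shape)  # torch.Size([1, hidden_size])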

Requirements

The following packages are required to use this model:

  • transformers
  • torch
  • sentencepiece

You can install them with the following command:

pip install transformers torch sentencepiece
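
To confirm the packages are available (no minimum versions are stated on this card), a quick sanity check is:

import transformers
import torch
import sentencepiece

# Print the installed versions of the required packages
print("transformers:", transformers.__version__)
print("torch:", torch.__version__)
print("sentencepiece:", sentencepiece.__version__)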

License

This model may be used for personal and research purposes.
