WesScivetti/SNACS_Japanese
Tags: Token Classification · Transformers · Safetensors · xlm-roberta · arxiv:1910.09700
Files and versions
Branch: main · Repository size: 2.26 GB · 1 contributor · History: 3 commits
Latest commit: Upload tokenizer (96e1cc0, verified) by WesScivetti, 12 months ago
File                     Size       Last commit message                       Updated
.gitattributes           1.57 kB    Upload tokenizer                          12 months ago
README.md                5.17 kB    Upload XLMRobertaForTokenClassification  12 months ago
config.json              21 kB      Upload XLMRobertaForTokenClassification  12 months ago
model.safetensors        2.24 GB    Upload XLMRobertaForTokenClassification  12 months ago
sentencepiece.bpe.model  5.07 MB    Upload tokenizer                          12 months ago
special_tokens_map.json  964 Bytes  Upload tokenizer                          12 months ago
tokenizer.json           17.1 MB    Upload tokenizer                          12 months ago
tokenizer_config.json    1.18 kB    Upload tokenizer                          12 months ago
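
The listing above contains a complete Transformers checkpoint (config.json, model.safetensors, and the XLM-RoBERTa SentencePiece tokenizer files), so it can be loaded directly from the Hub. Below is a minimal sketch using the transformers library; the repo id WesScivetti/SNACS_Japanese is taken from this page, while the example sentence and the printed fields are illustrative assumptions.

```python
# Minimal sketch: load the SNACS_Japanese token-classification checkpoint from the Hub.
# Assumes the `transformers` library is installed and the repo id below is reachable.
from transformers import AutoModelForTokenClassification, AutoTokenizer, pipeline

repo_id = "WesScivetti/SNACS_Japanese"

# The repo ships an XLM-RoBERTa checkpoint (config.json, model.safetensors)
# plus its SentencePiece tokenizer files, so the Auto* classes resolve both.
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForTokenClassification.from_pretrained(repo_id)

# Wrap everything in a token-classification pipeline.
# The Japanese sentence below is only an illustrative input.
tagger = pipeline("token-classification", model=model, tokenizer=tokenizer)
for prediction in tagger("彼は東京で働いています。"):
    print(prediction["word"], prediction["entity"], round(prediction["score"], 3))
```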