# Markov 5-gram LM (500M tokens)
Classical 5-gram language model with Modified Kneser-Ney smoothing.
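For reference, interpolated Kneser-Ney discounts the highest-order count and redistributes that mass to the lower-order estimate (shown here in its standard textbook form; the modified variant applies separate discounts $D_1, D_2, D_{3+}$ depending on the n-gram count):

$$
P_{\mathrm{KN}}(w_i \mid w_{i-n+1}^{i-1}) \;=\; \frac{\max\!\big(c(w_{i-n+1}^{i}) - D,\, 0\big)}{c(w_{i-n+1}^{i-1})} \;+\; \lambda(w_{i-n+1}^{i-1})\, P_{\mathrm{KN}}(w_i \mid w_{i-n+2}^{i-1})
$$

where $\lambda(\cdot)$ is the normalizing backoff weight.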
- Architecture: N-gram with GPU hash tables (sorted int64 keys + searchsorted); see the lookup sketch below
- Training data: 500M tokens from OpenTransformer web crawl datasets
- Tokenizer: GPT-2 (50257 vocab)
- Model size: 61.6M n-gram entries, 1.83GB GPU memory
- Eval (Pile): Perplexity 46047, Top-1 accuracy 15.14%
- Inference: 176K tok/s eval throughput on RTX 3060

Trained by OpenTransformers Ltd. Part of AGILLM research.
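The following is a minimal sketch of the lookup scheme described above, assuming only what the card states: n-gram contexts are reduced to int64 keys, the key table is kept sorted on the GPU, and each query is a batched binary search via `torch.searchsorted`. The hash constant, function names, and payload layout are illustrative, not the model's actual code (an exact base-50257 packing of a 5-gram does not fit in 64 bits, so some hashing scheme is assumed).

```python
import torch

MIX = 6364136223846793005  # arbitrary odd 64-bit mixing constant (assumption)

def ngram_keys(windows: torch.Tensor) -> torch.Tensor:
    """Hash [B, n] windows of GPT-2 token ids into int64 keys."""
    keys = torch.zeros(windows.shape[0], dtype=torch.int64, device=windows.device)
    for i in range(windows.shape[1]):
        keys = (keys * MIX) ^ windows[:, i].to(torch.int64)
    return keys

def build_table(keys: torch.Tensor, values: torch.Tensor):
    """Sort keys once at build time; reorder the payload (e.g. log-probs) to match."""
    sorted_keys, order = torch.sort(keys)
    return sorted_keys, values[order]

def lookup(sorted_keys, sorted_values, query_keys, default=float("-inf")):
    """Batched binary search; searchsorted returns an insertion point,
    so verify the key actually matches before taking its value."""
    idx = torch.searchsorted(sorted_keys, query_keys).clamp(max=sorted_keys.numel() - 1)
    hit = sorted_keys[idx] == query_keys
    out = torch.full_like(query_keys, default, dtype=sorted_values.dtype)
    out[hit] = sorted_values[idx[hit]]
    return out
```

Keeping the keys in a sorted array rather than an open-addressing hash table makes every query one `searchsorted` call per n-gram order, which maps well onto the GPU; the cost is a one-off sort when the table is built.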