|
|
--- |
|
|
license: apache-2.0 |
|
|
datasets: |
|
|
- tomg-group-umd/wikipedia-en-2k-samples |
|
|
- BASF-AI/WikipediaEasy10Classification |
|
|
language: |
|
|
- en |
|
|
--- |
|
|
# SLiNeP |
|
|
|
|
|
*Super Low Parameter Wikipedia-based Neural Predictor* |
|
|
A super-small GPT-2-based model trained on 11.6 MB of Wikipedia data.
|
|
|
|
|
Each topic in the training data is formatted as `Topic Name : Topic Text` and ends with ` <|endmsg|> `.
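
For illustration, here is a minimal sketch (plain Python, no model calls) of how text in that layout can be built and split back apart; the helper names are made up for the example and are not part of this repo:

```python
# Sketch of the data layout described above: "Topic Name : Topic Text <|endmsg|> ".
# Helper names are illustrative, not part of the repo.
END_TOKEN = "<|endmsg|>"

def format_topic(name: str, text: str) -> str:
    """Build one example in the layout the model was trained on."""
    return f"{name} : {text} {END_TOKEN} "

def split_topics(corpus: str) -> list[str]:
    """Recover individual topics from a concatenated corpus."""
    return [chunk.strip() for chunk in corpus.split(END_TOKEN) if chunk.strip()]

print(format_topic("Albert Einstein", "Albert Einstein was a German-born theoretical physicist."))
```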
|
|
This repository hosts the medium version of the model.
|
|
Since the model is small and trained on minimal data, it doesn't know many facts, but it has learned some English.
|
|
It is recommended to fine-tune it on your own data; a rough sketch follows below.
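
This is only a sketch using the Hugging Face `transformers` Trainer; the repo id `user/slinep-medium` and the corpus file `my_corpus.txt` are placeholders, not the real names:

```python
# Rough fine-tuning sketch with Hugging Face transformers.
# "user/slinep-medium" and "my_corpus.txt" are placeholders, not real names.
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

model_id = "user/slinep-medium"  # placeholder repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# GPT-2-style tokenizers often lack a pad token; reuse EOS if so (assumption).
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token

# One document per line, ideally formatted as "Topic Name : Topic Text <|endmsg|> ".
raw = load_dataset("text", data_files={"train": "my_corpus.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = raw["train"].map(tokenize, batched=True, remove_columns=["text"])
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

args = TrainingArguments(
    output_dir="slinep-finetuned",
    per_device_train_batch_size=8,
    num_train_epochs=3,
    learning_rate=5e-5,
)

Trainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=collator,
).train()
```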
|
|
I haven't tested it yet, but it probably needs a low sampling temperature to keep it from generating nonsense.
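
Until it is tested more, a low-temperature sampling setup like the sketch below is a reasonable starting point; the repo id is again a placeholder and 0.3 is an untuned guess:

```python
# Generation sketch with a low sampling temperature to keep output more coherent.
# "user/slinep-medium" is a placeholder repo id; temperature=0.3 is an untested guess.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "user/slinep-medium"  # placeholder
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "Albert Einstein : "  # prompt in the "Topic Name : " layout
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=100,
    do_sample=True,
    temperature=0.3,
    top_p=0.9,
)

text = tokenizer.decode(outputs[0], skip_special_tokens=False)
# Cut the continuation at the end-of-topic marker used during training.
print(text.split("<|endmsg|>")[0])
```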