---
license: mit
datasets:
- EleutherAI/pile
language:
- en
---
These sparse autoencoders (SAEs) were trained on the outputs of each of the MLPs in [EleutherAI/pythia-70m](https://huggingface.co/EleutherAI/pythia-70m). They were trained on 8.2 billion tokens from the Pile training set at a context length of 2049, and each SAE has 32,768 latents.
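
As a usage sketch, the snippet below collects the activations these SAEs reconstruct: the output of each MLP in pythia-70m, captured with standard PyTorch forward hooks through the `transformers` API. The SAE loading and encoding step is intentionally omitted, since this card does not specify a weight format or loader; the prompt and variable names are illustrative only.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("EleutherAI/pythia-70m")
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/pythia-70m")
model.eval()

mlp_outputs = {}

def make_hook(layer_idx):
    # Record this layer's MLP output; shape: (batch, seq_len, d_model)
    def hook(module, inputs, output):
        mlp_outputs[layer_idx] = output.detach()
    return hook

# Pythia is a GPT-NeoX model: each transformer block exposes its MLP at .mlp
handles = [
    layer.mlp.register_forward_hook(make_hook(i))
    for i, layer in enumerate(model.gpt_neox.layers)
]

inputs = tokenizer("The quick brown fox", return_tensors="pt")
with torch.no_grad():
    model(**inputs)

for handle in handles:
    handle.remove()

# One activation tensor per layer; each would be fed to that layer's SAE
print({i: acts.shape for i, acts in mlp_outputs.items()})
```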