---
dataset_info:
  features:
    - name: text
      dtype: string
  splits:
    - name: train
      num_examples: 7637
    - name: validation
      num_examples: 849
language:
  - en
license: mit
tags:
  - character-level
  - philosophy
  - mathematics
  - julia
  - microgpt
size_categories:
  - 1K<n<10K
---

# MicroJulia Training Data

Character-level training corpus for MicroJulia, a minimal GPT built in pure Julia with scalar autograd.

## Sources (ordered, not shuffled)

  1. Aristotle - Rhetoric (5,478 chunks) — MIT Classics
  2. Euclid - The Elements (3,009 chunks) — Project Gutenberg

## Vocabulary

48 characters: a-z, 0-9, space, and the punctuation marks `. , ; : ! ? ' " - ( )`, plus a BOS token, for a total vocabulary of 49.
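
Below is a minimal sketch of how this vocabulary could be built in Julia. The character ordering and the choice to give BOS the last id are assumptions for illustration, not taken from the MicroJulia source.

```julia
# Build the 48-character vocabulary plus a BOS token (assumed layout; the
# actual MicroJulia code may order characters or place BOS differently).
const CHARS = vcat('a':'z', '0':'9',
                   [' ', '.', ',', ';', ':', '!', '?', '\'', '"', '-', '(', ')'])
const BOS = length(CHARS) + 1   # BOS takes the last id, so vocab size = 49

char_to_id = Dict(c => i for (i, c) in enumerate(CHARS))
id_to_char = Dict(i => c for (i, c) in enumerate(CHARS))

encode(s::AbstractString) = [char_to_id[c] for c in s]
decode(ids) = join(id_to_char[i] for i in ids)
```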

## Format

One chunk per line, at most 256 characters each, split at sentence boundaries.
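
For illustration, one way to reproduce this kind of sentence-boundary chunking in Julia is sketched below. The regex and the greedy packing strategy are assumptions; the actual preprocessing script is not part of this dataset.

```julia
# Split text into chunks of at most `maxlen` characters, breaking only at
# sentence boundaries (assumed approach, not the exact MicroJulia script).
function chunk_text(text::AbstractString; maxlen::Int = 256)
    sentences = split(text, r"(?<=[.!?])\s+")
    chunks, current = String[], ""
    for s in sentences
        candidate = isempty(current) ? String(s) : current * " " * s
        if length(candidate) <= maxlen
            current = candidate
        else
            isempty(current) || push!(chunks, current)
            # Oversized sentences are truncated here; the real pipeline may differ.
            current = length(s) <= maxlen ? String(s) : String(first(s, maxlen))
        end
    end
    isempty(current) || push!(chunks, current)
    return chunks
end
```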

## Usage
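
A sketch of loading and encoding the corpus in Julia, assuming the splits are exported as plain-text files named `train.txt` and `validation.txt` (hypothetical names) with one chunk per line, and reusing `encode` and `BOS` from the vocabulary sketch above.

```julia
# Read one chunk per line; skip blank lines. File names are assumptions,
# not part of this dataset card.
read_chunks(path) = [rstrip(line) for line in eachline(path) if !isempty(rstrip(line))]

train_chunks = read_chunks("train.txt")        # hypothetical file name
val_chunks   = read_chunks("validation.txt")   # hypothetical file name

# Each training example: [BOS, id1, id2, ...] over the 49-token vocabulary.
train_ids = [vcat(BOS, encode(chunk)) for chunk in train_chunks]
```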