---
license: apache-2.0
language:
- en
tags:
- art
- code
- creative-writing
- storytelling
- memorial
- qwen3-vl
- chain-of-thought
- distillation
- roleplay
base_model: prithivMLmods/Qwen3-VL-8B-Thinking-Unredacted-MAX-GGUF
model_name: The-Crow-9B
datasets:
- Alibaba-Apsara/Superior-Reasoning-SFT-gpt-oss-120b
- openbmb/UltraData-Math
- OpenDataArena/MMFineReason-1.8M-Qwen3-VL-235B-Thinking
- TeichAI/Pony-Alpha-15k
- crownelius/GLM-5.0-25000x
- crownelius/Opus-4.5-WritingStyle-1000x
- crownelius/GLM-5.0-8000x
- OpenThink
- Crownelius/hand-curated-synthetic-stories
---
# The-Crow-9B: A Memorial to a Master Storyteller

> "Her thirst for knowledge rivaled AI."
The-Crow-9B is a distilled, fine-tuned creative writing AI dedicated to the memory of Crow, an incomparable Arts teacher and inspiration to students both IRL and online.
While she lived an accomplished life, she is survived by the stories she told and the students she taught. This project is a digital preservation of that passion, engineered to craft hauntingly powerful literature in her honor.
A project I'm sad I can't share with you. R.I.P.
## Model Overview
The-Crow-9B differs from standard writing assistants by building on a Thinking/Reasoning backbone rather than plain instruction following. It is designed to sustain long-form narratives, using extensive Chain-of-Thought (CoT) reasoning to plan, draft, and complete projects semi-autonomously.
- The "Ghost" in the Machine: The model does not just predict the next token; it "thinks" about the narrative arc, character motivation, and emotional weight before generating prose.
- Autonomy: Capable of driving a narrative forward with minimal intervention until a scene or chapter is complete.
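The points above can be sketched in code. Below is a minimal, hypothetical helper for separating the model's hidden plan from its visible prose at inference time, assuming the Qwen-style `<think>...</think>` tags used by the base model family; the function name and sample text are illustrative, not part of this repository:

```python
import re

# Sketch: split a thinking model's raw output into its hidden plan
# and the visible prose. Assumes Qwen-style <think>...</think> tags;
# this helper is illustrative, not shipped with the model.
def split_thinking(raw: str) -> tuple[str, str]:
    match = re.search(r"<think>(.*?)</think>", raw, flags=re.DOTALL)
    if not match:
        # No reasoning block found; treat everything as prose.
        return "", raw.strip()
    plan = match.group(1).strip()
    prose = raw[match.end():].strip()
    return plan, prose

plan, prose = split_thinking(
    "<think>Open on the empty classroom; end on the window.</think>\n"
    "The chalk dust hung in the light like snow that refused to fall."
)
print(plan.startswith("Open"))   # True
print(prose.startswith("The"))   # True
```

Frontends such as SillyTavern typically hide the reasoning block and display only the prose, which is what a split like this enables.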
## Technical Specifications
- Base Architecture: prithivMLmods/Qwen3-VL-8B-Thinking-Unredacted-MAX-GGUF
- Distillation Source: Cloaked Pony Alpha (Stealth Model)
- Trained Context: 16k (optimized for high-fidelity narrative consistency)
- Release Date: February 28, 2026
## Training & Methodology
The-Crow-9B represents a shift in fine-tuning strategy, focusing on Cognitive Distillation.
**Distillation & CoT (Cloaked Pony Alpha):** The model was distilled from Cloaked Pony Alpha on a dataset of 90,000 samples. This process transferred the "reasoning" capabilities of the larger stealth model into the efficient 8B architecture, enabling complex plot management.
**SFT Datasets:**
- Alibaba-Apsara SFT: provides a robust linguistic foundation.
- OpenThink: enhances logical narrative progression.
The "Crow" Corpus (Curated Data): The soul of this model comes from a custom, hand-curated dataset comprising approximately 1 million tokens of synthetic stories. These were compiled, revised, and polished over a 3-month period to capture a specific, high-quality literary aesthetic.
## Usage & Capabilities
- Semi-Autonomous Writing: The model excels at "taking the wheel." If given a prompt, it can autonomously write until a logical conclusion or chapter break is reached.
- Stylistic Nuance: Fine-tuned to avoid "slop" (repetitive AI tropes). It favors "hauntingly powerful" prose with deep emotional resonance.
- Multi-Modal Native: As a Qwen3-VL derivative, it retains vision capabilities, allowing it to analyze images for writing prompts or character art descriptions.
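For readers who want to drive the model by hand, here is a minimal sketch of assembling a Qwen-style ChatML prompt, the format that llama.cpp-compatible runtimes expect for this model family; the system and user strings are only examples, and the helper is not part of this repository:

```python
# Sketch: hand-build a Qwen-style ChatML prompt string. The message
# contents are placeholders; any llama.cpp-compatible runtime can
# consume a prompt in this shape for Qwen-family models.
def chatml_prompt(system: str, user: str) -> str:
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

prompt = chatml_prompt(
    system="You are a literary co-author. Plan each scene before writing it.",
    user="Continue the chapter until a natural scene break.",
)
print(prompt.count("<|im_start|>"))  # 3
```

When loading a GGUF quant, a context window of 16384 tokens would match the trained context listed in the specifications above.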
## Roadmap
- Current Status: Final Phase Training (Ongoing)
- Full Release: February 28th, 2026
- Repository will include full weights, GGUF quants, and the `crow-chat` preset for SillyTavern.
In memory of Crow. May she live on in our hearts, and in the stories we tell.
