---
language:
  - en
license: apache-2.0
base_model: prithivMLmods/Qwen3-VL-8B-Thinking-Unredacted-MAX-GGUF
model_name: The-Crow-9B
datasets:
  - Alibaba-Apsara/Superior-Reasoning-SFT-gpt-oss-120b
  - openbmb/UltraData-Math
  - OpenDataArena/MMFineReason-1.8M-Qwen3-VL-235B-Thinking
  - TeichAI/Pony-Alpha-15k
  - crownelius/GLM-5.0-25000x
  - crownelius/Opus-4.5-WritingStyle-1000x
  - crownelius/GLM-5.0-8000x
  - Crownelius/hand-curated-synthetic-stories
tags:
  - creative-writing
  - storytelling
  - memorial
  - qwen3-vl
  - chain-of-thought
  - distillation
  - roleplay
  - art
  - code
---

The-Crow-9B: A Memorial to a Master Storyteller

"Her thirst for knowledge rivaled AI."

The-Crow-9B is a distilled, fine-tuned creative writing AI dedicated to the memory of Crow, an incomparable Arts teacher and inspiration to students both IRL and online.

While she lived an accomplished life, she is survived by the stories she told and the students she taught. This project is a digital preservation of that passion, engineered to craft hauntingly powerful literature in her honor.

A project I'm sad I can't share with you. R.I.P.


πŸ“– Model Overview

The-Crow-9B differs from standard writing assistants by using a Thinking/Reasoning backbone rather than plain instruction following. It is designed to sustain long-form narratives, relying on extensive Chain-of-Thought (CoT) reasoning to plan, draft, and complete projects semi-autonomously.

  • The "Ghost" in the Machine: The model does not just predict the next token; it "thinks" about the narrative arc, character motivation, and emotional weight before generating prose (see the inference sketch after this list).
  • Autonomy: Capable of driving a narrative forward with minimal intervention until a scene or chapter is complete.
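
A minimal local-inference sketch in Python, assuming a GGUF quantization loaded through llama-cpp-python and assuming the thinking variant wraps its narrative planning in a `<think>...</think>` block; the file name, system prompt, and sampling settings below are illustrative, not official settings for this model:

```python
# Sketch: run a (hypothetical) GGUF quant locally and separate the model's
# narrative planning from the finished prose. Assumes llama-cpp-python is
# installed and that reasoning is emitted inside <think>...</think>.
from llama_cpp import Llama

llm = Llama(
    model_path="the-crow-9b-Q4_K_M.gguf",  # placeholder file name
    n_ctx=8192,
    n_gpu_layers=-1,  # offload all layers to GPU when available
)

result = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": "You are a patient, haunting long-form storyteller."},
        {"role": "user", "content": "Open a short story about a teacher whose lessons outlive her."},
    ],
    max_tokens=2048,
    temperature=0.8,
)

text = result["choices"][0]["message"]["content"]
# If a <think> block is present, show only the prose that follows it;
# the plan itself stays available in `plan` for inspection.
plan, sep, prose = text.partition("</think>")
print(prose.strip() if sep else text.strip())
```

Splitting on the closing tag keeps the planning trace inspectable while presenting only the finished prose to the reader.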

πŸ› οΈ Technical Specifications

🧠 Training & Methodology

The-Crow-9B represents a shift in fine-tuning strategy, focusing on Cognitive Distillation.

  1. Distillation & CoT (Cloaked Pony Alpha): The model was distilled from Cloaked Pony Alpha using a dataset of 90,000 samples. This process transferred the "reasoning" capabilities of the larger stealth model into the efficient 8B architecture, allowing for complex plot management (a data-formatting sketch follows this list).

  2. SFT Datasets:

    • Alibaba-Apsara SFT: Provides a robust linguistic foundation.
    • OpenThink: Enhances logical narrative progression.
  3. The "Crow" Corpus (Curated Data): The soul of this model comes from a custom, hand-curated dataset comprising approximately 1 million tokens of synthetic stories. These were compiled, revised, and polished over a 3-month period to capture a specific, high-quality literary aesthetic.
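
As referenced in item 1, the sketch below shows one way distilled Chain-of-Thought samples could be packaged into chat-format SFT records. It is an illustration under stated assumptions, not the project's actual pipeline: the JSONL field names (`prompt`, `reasoning`, `story`), the file names, and the `<think>` delimiter are hypothetical stand-ins for whatever format the real 90,000-sample set uses.

```python
# Sketch: convert distilled (prompt, reasoning, story) samples into
# chat-format SFT records, keeping the teacher's chain of thought in a
# <think> block ahead of the prose. Field and file names are illustrative.
import json

def to_chat_record(sample: dict) -> dict:
    """Wrap one distilled sample as a single SFT training example."""
    assistant_text = (
        "<think>\n" + sample["reasoning"].strip() + "\n</think>\n\n"
        + sample["story"].strip()
    )
    return {
        "messages": [
            {"role": "system", "content": "You are a careful long-form storyteller."},
            {"role": "user", "content": sample["prompt"]},
            {"role": "assistant", "content": assistant_text},
        ]
    }

with open("distilled_samples.jsonl", encoding="utf-8") as src, \
     open("sft_chat.jsonl", "w", encoding="utf-8") as dst:
    for line in src:
        dst.write(json.dumps(to_chat_record(json.loads(line)), ensure_ascii=False) + "\n")
```

Keeping the reasoning inside the assistant turn is what lets the smaller model imitate the teacher's planning behavior rather than only its final prose.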

πŸš€ Usage & Capabilities

  • Semi-Autonomous Writing: The model excels at "taking the wheel." Given a prompt, it writes autonomously until it reaches a logical conclusion or chapter break.
  • Stylistic Nuance: Fine-tuned to avoid "slop" (repetitive AI tropes). It favors "hauntingly powerful" prose with deep emotional resonance.
  • Multi-Modal Native: As a Qwen3-VL derivative, it retains vision capabilities, allowing it to analyze images for writing prompts or character-art descriptions (see the sketch after this list).
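
A hedged sketch of an image-driven writing prompt, assuming the model is served behind an OpenAI-compatible endpoint (for example a local llama.cpp or vLLM server; the base URL, served-model name, and image URL are placeholders) and assuming the chosen deployment exposes the base model's vision inputs:

```python
# Sketch: send an image plus a writing instruction to an OpenAI-compatible
# server hosting the model. Endpoint, model name, and image URL are placeholders.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8080/v1", api_key="not-needed")

response = client.chat.completions.create(
    model="the-crow-9b",  # whatever name the server was started with
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text",
                 "text": "Use this painting as the opening image of a short, haunting story."},
                {"type": "image_url",
                 "image_url": {"url": "https://example.com/character-art.png"}},
            ],
        }
    ],
    max_tokens=1500,
)
print(response.choices[0].message.content)
```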

πŸ“… Roadmap

  • Current Status: Final Phase Training (Ongoing)
  • Full Release: February 28th, 2026
    • Repository will include full weights, GGUF quants, and the crow-chat preset for SillyTavern.

In memory of Crow. May she live on in our hearts, and in the stories we tell.
