---
language:
- cs
- sk
- en
- de
license: apache-2.0
base_model: EuroLLM-9B
quantization: Q8_0
tags:
- gguf
- llama.cpp
- offline
- local-ai
- multilingual
- cli-runtime
pipeline_tag: text-generation
library_name: llama.cpp
---

# Offline AI 2.1 – EuroLLM-9B-Q8_0 (GGUF)

Offline AI 2.1 is a fully local AI runtime built around digital sovereignty, privacy, and system autonomy.

No cloud.
No telemetry.
No tracking.
No external dependencies.

Everything runs locally via llama.cpp.

---

## 🖥️ CLI Preview

Below is the Offline AI runtime interface:

![Offline AI CLI](./offlineai-cli.png)

Offline AI is not just a model launcher — it is a structured local AI workspace with:

- Profile handling
- Runtime status inspection
- Controlled execution flow
- Modular architecture foundation
- Admin mode (locked access for advanced system control)

---
| | |
## 🔧 TECHNICAL INFORMATION

- Base model: EuroLLM-9B
- Quantization: Q8_0 (GGUF)
- Format: llama.cpp compatible
- Runtime: llama.cpp
- Offline AI version: 2.1
- Recommended RAM: 16 GB
- Platforms: macOS, Windows

This repository distributes a quantized GGUF Q8_0 variant of the EuroLLM-9B model for efficient offline inference.
The original model weights are unmodified and have not been fine-tuned as part of this project.
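As a sketch, a GGUF file like this one can be run directly with llama.cpp's CLI. The model filename and prompt below are illustrative assumptions, not the exact artifact name in this repository:

```shell
# Hedged example: substitute the actual .gguf filename distributed here.
# llama-cli is the standard llama.cpp chat/completion binary.
./llama-cli \
  -m EuroLLM-9B-Q8_0.gguf \
  -p "Přelož do angličtiny: Dobré ráno." \
  -n 128    # generate at most 128 new tokens
```

With Q8_0 quantization the 9B weights occupy roughly 9–10 GB on disk, which is why 16 GB of RAM is the recommended minimum for fully in-memory inference.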

---

## 🧠 WHAT'S NEW IN 2.1

- Refined CLI architecture
- Structured command system
- Improved response handling
- More stable execution
- Admin access layer (locked system control mode)
- Cleaner internal logic separation

Offline AI 2.1 transitions from a simple launcher to a structured local runtime environment.

---

## 🔐 PROJECT PHILOSOPHY

Offline AI demonstrates that:

- Modern AI can operate without cloud infrastructure
- Open models can run independently
- AI tools can respect user privacy
- Local-first computing is viable

The project promotes:

- Digital independence
- Transparent system design
- Offline experimentation
- User-controlled AI environments

---

## 📄 MODEL ORIGIN & LICENSE

- Model: EuroLLM-9B
- Original authors: EuroLLM Project consortium
- Funded by: European Union research initiatives
- Base model license: Apache License 2.0

- Quantized distribution: GGUF Q8_0
- Runtime: llama.cpp (MIT License)
- Offline AI interface and wrapper: © David Káninský

All components are used in compliance with their respective licenses.

---

## ⚠️ DISCLAIMER

This project is an educational and experimental implementation.

It is not a commercial AI service and does not replace professional advice.
Outputs are not intended for legal, medical, financial, or critical decision-making use.

Use beyond personal, research, or educational purposes is at your own risk.

---

## 🌍 PROJECT

- Website: https://OfflineAI.online
- Domains: .cz / .sk / .de
- Author: David Káninský