---
title: OnCall.ai - Medical Emergency Assistant
emoji: 🏥
colorFrom: red
colorTo: blue
sdk: gradio
sdk_version: "5.38.0"
app_file: app.py
python_version: "3.11"
pinned: false
license: mit
tags:
  - medical
  - healthcare
  - RAG
  - emergency
  - clinical-guidance
  - gradio
---
# 🏥 OnCall.ai - Medical Emergency Assistant

A RAG-based medical assistant system that provides **evidence-based clinical guidance** for emergency medical situations using real medical guidelines and advanced language models.
## 🎯 What This App Does

OnCall.ai helps healthcare professionals by:

- **Processing medical queries** through a multi-level validation system
- **Retrieving relevant medical guidelines** from curated emergency medicine datasets
- **Generating evidence-based clinical advice** using specialized medical LLMs (Llama3-Med42-70B)
- **Providing transparent, traceable medical guidance** with source attribution
## 🚀 How to Use

1. **Enter your medical query** in the text box (e.g., "Patient with chest pain and shortness of breath")
2. **Click Submit** to process your query through our RAG pipeline
3. **Review the response**, which includes:
   - Clinical guidance based on medical guidelines
   - Evidence sources and reasoning
   - Confidence level and validation status
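If you prefer to query the Space programmatically instead of through the web UI, a minimal sketch using `gradio_client` is shown below. The Space ID and `api_name` are assumptions; check the app's "Use via API" panel for the exact values.

```python
from gradio_client import Client

# Hypothetical Space ID -- replace with the actual owner/space name.
client = Client("your-username/oncall-ai")

# The endpoint name is an assumption; Gradio lists the real one under "Use via API".
result = client.predict(
    "Patient with chest pain and shortness of breath",
    api_name="/predict",
)
print(result)
```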
## ⚙️ Configuration

This app requires a **HuggingFace token** for accessing the medical language models.

### Environment Variables

- `HF_TOKEN`: Your HuggingFace API token (required for LLM access)
- `ONCALL_DEBUG`: Set to 'true' to enable debug mode (optional)
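A minimal sketch of how these variables can be read at startup (illustrative only; the actual `app.py` may wire them differently):

```python
import os

# HF_TOKEN is required for LLM access; ONCALL_DEBUG is an optional flag.
HF_TOKEN = os.getenv("HF_TOKEN")
DEBUG = os.getenv("ONCALL_DEBUG", "false").lower() == "true"

if not HF_TOKEN:
    raise RuntimeError("HF_TOKEN is not set; add it in Settings → Variables and Secrets.")
```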
### To set up your HuggingFace token:

1. Get your token from [HuggingFace Settings](https://huggingface.co/settings/tokens)
2. In this Space, go to **Settings** → **Variables and Secrets**
3. Add `HF_TOKEN` with your token value
## 🏗️ System Architecture

### Multi-Level Query Processing Pipeline

1. **Level 1**: Predefined medical condition mapping (instant response)
2. **Level 2**: LLM-based condition extraction (Llama3-Med42-70B)
3. **Level 3**: Semantic search fallback
4. **Level 4**: Medical query validation (100% rejection of non-medical queries)
5. **Level 5**: Generic medical search for rare conditions
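The sketch below illustrates the cascade idea behind these levels: each stage runs only if the previous one found nothing. The stage functions are simplified stand-ins, not the actual `src/user_prompt.py` implementation.

```python
PREDEFINED_CONDITIONS = {"chest pain": "acute coronary syndrome"}  # toy Level-1 mapping

def level1_predefined(query):
    # Level 1: instant lookup against a predefined condition map
    return next((cond for key, cond in PREDEFINED_CONDITIONS.items()
                 if key in query.lower()), None)

def level2_llm_extraction(query):
    # Level 2: LLM-based condition extraction (stubbed here)
    return None

def level3_semantic_fallback(query):
    # Level 3: semantic search fallback (stubbed here)
    return None

def level4_validation(query):
    # Level 4: reject queries with no medical content at all
    medical_hints = ("patient", "pain", "symptom", "bleeding", "fever")
    if not any(hint in query.lower() for hint in medical_hints):
        raise ValueError("Query rejected: not a medical question.")
    return None

def level5_generic_search(query):
    # Level 5: generic medical search for rare conditions (stubbed here)
    return "generic medical search result"

def process_query(query):
    for stage in (level1_predefined, level2_llm_extraction, level3_semantic_fallback,
                  level4_validation, level5_generic_search):
        result = stage(query)
        if result is not None:
            return result

print(process_query("Patient with chest pain and shortness of breath"))
```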
### Dual-Index Retrieval System

- **Emergency Guidelines Index**: Fast retrieval for critical conditions
- **Treatment Protocols Index**: Comprehensive clinical procedures
- **Semantic Search**: Vector-based similarity matching using sentence transformers
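A rough sketch of this pattern follows, with toy documents, an assumed embedding model, and Annoy standing in for the real indices; it is not the app's actual data or code.

```python
from sentence_transformers import SentenceTransformer
from annoy import AnnoyIndex

model = SentenceTransformer("all-MiniLM-L6-v2")          # assumed embedding model
dim = model.get_sentence_embedding_dimension()

def build_index(docs):
    # Build a small Annoy index over sentence-transformer embeddings.
    index = AnnoyIndex(dim, "angular")
    for i, doc in enumerate(docs):
        index.add_item(i, model.encode(doc).tolist())
    index.build(10)                                      # 10 trees: speed/accuracy trade-off
    return index

emergency_docs = ["Chest pain: obtain ECG within 10 minutes.",
                  "Suspected stroke: assess with FAST criteria."]
treatment_docs = ["STEMI: aspirin loading dose and urgent PCI referral."]
emergency_idx, treatment_idx = build_index(emergency_docs), build_index(treatment_docs)

def retrieve(query, k=2):
    # Query both indices and merge hits by ascending angular distance.
    vec = model.encode(query).tolist()
    hits = []
    for name, idx, docs in (("emergency", emergency_idx, emergency_docs),
                            ("treatment", treatment_idx, treatment_docs)):
        ids, dists = idx.get_nns_by_vector(vec, k, include_distances=True)
        hits.extend((name, docs[i], d) for i, d in zip(ids, dists))
    return sorted(hits, key=lambda hit: hit[2])

print(retrieve("patient with crushing chest pain"))
```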
## 📊 Technical Details

### Key Features

- **Complete RAG Pipeline**: Query → Condition Extraction → Retrieval → Generation
- **Multi-level fallback validation** for robust query processing
- **Evidence-based medical advice** with transparent source attribution
- **Gradio interface** for easy interaction
- **Environment-controlled debug mode** for development
### Models Used

- **Medical LLM**: Llama3-Med42-70B (specialized medical reasoning)
- **Embedding Model**: Sentence Transformers for semantic search
- **Retrieval**: Annoy index for fast approximate nearest neighbor search
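For the generation step, a hedged sketch of calling a hosted Med42 model through `huggingface_hub` is shown below. The repository ID and its availability over the serverless Inference API are assumptions; the app's `src/llm_clients.py` may use a different client or endpoint.

```python
import os
from huggingface_hub import InferenceClient

# Assumed model repo ID; serverless availability is not guaranteed.
client = InferenceClient(model="m42-health/Llama3-Med42-70B", token=os.getenv("HF_TOKEN"))

response = client.chat_completion(
    messages=[{"role": "user", "content": "Initial workup for suspected pulmonary embolism?"}],
    max_tokens=512,
)
print(response.choices[0].message.content)
```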
### Dataset

- Curated medical guidelines and emergency protocols
- Treatment procedures and clinical decision trees
- Evidence-based medical knowledge base
## ⚠️ Important Disclaimers

🚨 **This tool is for educational and research purposes only.**

- **Not a substitute for professional medical advice**
- **Not for use in actual medical emergencies**
- **Always consult qualified healthcare professionals**
- **Verify all information with authoritative medical sources**
## 🔧 Development Information

### Project Structure

```
├── app.py                    # Main Gradio application
├── src/                      # Core modules
│   ├── user_prompt.py        # Query processing
│   ├── retrieval.py          # RAG retrieval system
│   ├── generation.py         # Medical advice generation
│   ├── llm_clients.py        # LLM interface
│   └── medical_conditions.py # Condition mapping
├── models/                   # Pre-trained models and indices
│   ├── embeddings/           # Vector embeddings
│   └── indices/              # Search indices
└── requirements.txt          # Dependencies
```
### Version

- **Current Version**: 0.9.0
- **Last Updated**: 2025-07-31
- **Author**: OnCall.ai Team

---

**📝 For technical details, issues, or contributions, please refer to the project documentation.**