---
library_name: peft
base_model: BAAI/bge-m3
tags:
- medical
- cardiology
- embeddings
- domain-adaptation
- lora
- sentence-transformers
- sentence-similarity
language:
- en
license: apache-2.0
---
# CardioEmbed-BGE-M3

*Domain-specialized cardiology text embeddings using a LoRA-adapted BGE-M3.*
Part of a comparative study of 10 embedding architectures for clinical cardiology.
## Performance
| Metric | Score |
|---|---|
| Separation Score | 0.209 |
## Usage
```python
from transformers import AutoModel, AutoTokenizer
from peft import PeftModel

# Load the base BGE-M3 encoder and its tokenizer
base_model = AutoModel.from_pretrained("BAAI/bge-m3")
tokenizer = AutoTokenizer.from_pretrained("BAAI/bge-m3")

# Apply the cardiology LoRA adapter on top of the base encoder
model = PeftModel.from_pretrained(base_model, "richardyoung/CardioEmbed-BGE-M3")
```
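To get sentence embeddings from the loaded model, BGE-M3's dense-retrieval convention is CLS pooling followed by L2 normalization. A minimal sketch (the example sentences are illustrative only):

```python
import torch
import torch.nn.functional as F

sentences = [
    "The patient presented with atrial fibrillation.",
    "ECG showed an irregularly irregular rhythm.",
]
inputs = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")

model.eval()
with torch.no_grad():
    outputs = model(**inputs)

# CLS pooling + L2 normalization, so dot product = cosine similarity
embeddings = F.normalize(outputs.last_hidden_state[:, 0], p=2, dim=1)
similarity = embeddings[0] @ embeddings[1]
print(f"cosine similarity: {similarity:.4f}")
```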
## Training
- Training Data: 106,535 cardiology text pairs from medical textbooks
- Method: LoRA fine-tuning (r=16, alpha=32)
- Loss: Multiple Negatives Ranking Loss (InfoNCE); see the sketch after this list
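For reference, Multiple Negatives Ranking Loss scores each anchor against every positive in the batch and applies InfoNCE-style cross-entropy, so every other pair's positive serves as an in-batch negative. A minimal sketch of the objective (the standalone function and the `scale` value are illustrative, not the actual training code):

```python
import torch
import torch.nn.functional as F

def multiple_negatives_ranking_loss(anchors, positives, scale=20.0):
    """InfoNCE over in-batch negatives.

    anchors, positives: (batch, dim) L2-normalized embeddings,
    where positives[i] is the matching text for anchors[i].
    """
    # Similarity matrix: diagonal = true pairs, off-diagonal = negatives
    scores = anchors @ positives.T * scale
    labels = torch.arange(scores.size(0), device=scores.device)
    return F.cross_entropy(scores, labels)
```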
## Citation
```bibtex
@article{young2024comparative,
  title={Comparative Analysis of LoRA-Adapted Embedding Models for Clinical Cardiology Text Representation},
  author={Young, Richard J. and Matthews, Alice M.},
  journal={arXiv preprint},
  year={2024}
}
```