Update README.md
The model was evaluated on the test split of the `LLMTrace Classification dataset`:

| Metric          | Value (%) |
|-----------------|-----------|
| Mean Accuracy   | 98.46     |
| TPR @ FPR=0.01  | 97.93     |
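
The TPR @ FPR=0.01 number is the detection rate when the decision threshold is set so that only 1% of human-written texts are flagged. As an illustrative sketch (not the released evaluation code; the `labels` and `scores` arrays below are hypothetical placeholders and scikit-learn is assumed), such a value can be computed from detector scores like this:

```python
# Illustrative sketch only: TPR at a fixed 1% false-positive rate for a binary
# human-vs-AI split. `labels` and `scores` are hypothetical placeholders.
import numpy as np
from sklearn.metrics import roc_curve

labels = np.array([0, 0, 0, 1, 1, 1])                     # 1 = AI-generated (hypothetical)
scores = np.array([0.05, 0.20, 0.35, 0.70, 0.85, 0.95])   # detector scores (hypothetical)

fpr, tpr, _ = roc_curve(labels, scores)
tpr_at_1pct_fpr = np.interp(0.01, fpr, tpr)                # interpolate TPR at FPR = 0.01
print(f"TPR @ FPR=0.01: {tpr_at_1pct_fpr:.4f}")
```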
## Quick start

Requirements:

- python3.11
- [gigacheck](https://github.com/ai-forever/gigacheck)

```bash
pip install git+https://github.com/ai-forever/gigacheck
```

### Inference with transformers (with trust_remote_code=True)

```python
from transformers import AutoModel
import torch

# Load the detector from the Hugging Face Hub; trust_remote_code=True is needed
# because the repository ships its own modeling code.
gigacheck_model = AutoModel.from_pretrained(
    "iitolstykh/GigaCheck-Classifier-Multi",
    trust_remote_code=True,
    device_map="cuda:0",
    torch_dtype=torch.bfloat16,
)

text = """To be, or not to be, that is the question:
Whether ’tis nobler in the mind to suffer
The slings and arrows of outrageous fortune,
Or to take arms against a sea of troubles
And by opposing end them.
"""

# The model is called on a list of texts and returns predicted class ids,
# which config.id2label maps back to human-readable labels.
output = gigacheck_model([text])

print([gigacheck_model.config.id2label[int(c_id)] for c_id in output.pred_label_ids])
```
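
Since the callable is applied to a list, several texts can be scored in one call. A minimal sketch continuing the example above (the second sentence is a made-up input, and handling more than one element per list is assumed from the single-element call shown here):

```python
# Hedged sketch: classifying several texts at once, assuming the callable
# accepts a list with more than one string (the example above passes [text]).
texts = [
    "To be, or not to be, that is the question.",
    "The quarterly report is attached; please review the figures before Friday.",
]

output = gigacheck_model(texts)

# Map each predicted class id back to its label, one per input text.
for sample, c_id in zip(texts, output.pred_label_ids):
    print(gigacheck_model.config.id2label[int(c_id)], "|", sample[:50])
```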
### Inference with gigacheck

```python
import torch
from transformers import AutoConfig
from gigacheck.inference.src.mistral_detector import MistralDetector

model_name = "iitolstykh/GigaCheck-Classifier-Multi"

# Read the detector settings from the repository config, instantiate the
# detector, then load the pretrained weights.
config = AutoConfig.from_pretrained(model_name)
model = MistralDetector(
    max_seq_len=config.max_length,
    with_detr=config.with_detr,
    id2label=config.id2label,
    device="cpu" if not torch.cuda.is_available() else "cuda:0",
).from_pretrained(model_name)

text = """To be, or not to be, that is the question:
Whether ’tis nobler in the mind to suffer
The slings and arrows of outrageous fortune,
Or to take arms against a sea of troubles
And by opposing end them.
"""

output = model.predict(text)
print(output)
```
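
As a usage note, the same detector can be applied document by document; a hedged sketch continuing the example above (the extra string is a made-up input, only `model.predict(text)` and `config.id2label` from the snippet above are assumed, and the exact structure of each prediction depends on the installed gigacheck version):

```python
# Hedged sketch: list the classes the detector distinguishes, then reuse
# model.predict from the example above on a few documents.
print(config.id2label)  # mapping of class ids to label names

documents = {
    "shakespeare": text,
    "meeting_note": "Let's move the sync to 3 pm tomorrow and share the agenda beforehand.",
}

for name, document in documents.items():
    print(name, model.predict(document))
```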
## Citation
If you use this model in your research, please cite our papers: