Columns (36), in the order used by the records below:

| Column | Dtype | Range / distinct values |
|---|---|---|
| eval_name | string | 12–111 chars |
| Precision | string | 3 classes |
| Type | string | 6 classes |
| T | string | 6 classes |
| Weight type | string | 2 classes |
| Architecture | string | 52 classes |
| Model | string | 355–689 chars |
| fullname | string | 4–102 chars |
| Model sha | string | 0–40 chars |
| Average ⬆️ | float64 | 1.03–52 |
| Hub License | string | 26 classes |
| Hub ❤️ | int64 | 0–5.9k |
| #Params (B) | int64 | -1 to 140 |
| Available on the hub | bool | 2 classes |
| MoE | bool | 2 classes |
| Flagged | bool | 2 classes |
| Chat Template | bool | 2 classes |
| CO₂ cost (kg) | float64 | 0.03–107 |
| IFEval Raw | float64 | 0–0.9 |
| IFEval | float64 | 0–90 |
| BBH Raw | float64 | 0.27–0.75 |
| BBH | float64 | 0.81–63.5 |
| MATH Lvl 5 Raw | float64 | 0–0.51 |
| MATH Lvl 5 | float64 | 0–50.7 |
| GPQA Raw | float64 | 0.22–0.44 |
| GPQA | float64 | 0–24.9 |
| MUSR Raw | float64 | 0.29–0.6 |
| MUSR | float64 | 0–38.5 |
| MMLU-PRO Raw | float64 | 0.1–0.73 |
| MMLU-PRO | float64 | 0–70 |
| Merged | bool | 2 classes |
| Official Providers | bool | 2 classes |
| Upload To Hub Date | string | 424 classes |
| Submission Date | string | 169 classes |
| Generation | int64 | 0–10 |
| Base Model | string | 4–102 chars |
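The records that follow keep the column order above, one cell per line as exported by the dataset viewer. A minimal sketch for working with the same table programmatically; it assumes these rows come from the leaderboard's aggregated `open-llm-leaderboard/contents` dataset (the repo id and split name are assumptions; the column names are taken from the schema above):

```python
# Sketch only: load the aggregated leaderboard table and rank the merge models.
# Assumption: the records below come from the "open-llm-leaderboard/contents"
# dataset on the Hugging Face Hub; adjust the repo id / split if they differ.
from datasets import load_dataset

ds = load_dataset("open-llm-leaderboard/contents", split="train")

# Keep bfloat16 merge/moerge submissions and sort by the aggregate score.
merges = ds.filter(
    lambda row: row["Precision"] == "bfloat16"
    and row["Type"] == "🤝 base merges and moerges"
)
for row in sorted(merges, key=lambda r: r["Average ⬆️"], reverse=True)[:5]:
    print(f'{row["fullname"]}: {row["Average ⬆️"]:.2f} (CO₂ {row["CO₂ cost (kg)"]:.2f} kg)')
```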
win10_EVA-Norns-Qwen2.5-v0.1_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Qwen2ForCausalLM
|
[win10/EVA-Norns-Qwen2.5-v0.1](https://huggingface.co/win10/EVA-Norns-Qwen2.5-v0.1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/win10__EVA-Norns-Qwen2.5-v0.1-details)
|
win10/EVA-Norns-Qwen2.5-v0.1
|
90c3ca66e700b4a7d2c509634f9b9748a2e4c3ab
| 24.657872
| 1
| 7
| false
| false
| false
| true
| 0.656661
| 0.621963
| 62.196306
| 0.507241
| 30.060942
| 0.154834
| 15.483384
| 0.285235
| 4.697987
| 0.40451
| 8.563802
| 0.342503
| 26.944814
| false
| false
|
2024-11-17
|
2024-11-18
| 1
|
win10/EVA-Norns-Qwen2.5-v0.1 (Merge)
|
|
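As a sanity check on this first record, the Average ⬆️ value appears to be the plain unweighted mean of the six normalized benchmark columns (IFEval, BBH, MATH Lvl 5, GPQA, MUSR, MMLU-PRO). A small sketch using the win10/EVA-Norns-Qwen2.5-v0.1 scores above:

```python
# Sketch: recompute the aggregate score for win10/EVA-Norns-Qwen2.5-v0.1 as the
# unweighted mean of its six normalized benchmark scores (values from the record above).
scores = {
    "IFEval": 62.196306,
    "BBH": 30.060942,
    "MATH Lvl 5": 15.483384,
    "GPQA": 4.697987,
    "MUSR": 8.563802,
    "MMLU-PRO": 26.944814,
}
average = sum(scores.values()) / len(scores)
print(average)  # ~24.657872, matching the record's stored "Average ⬆️" value
```

The same relation holds for the other records here (for example, the win10/Llama-3.2-3B-Instruct-24-9-29 record averages to 23.929169), so the six normalized columns fully determine Average ⬆️.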
win10_Llama-3.2-3B-Instruct-24-9-29_float16
|
float16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
[win10/Llama-3.2-3B-Instruct-24-9-29](https://huggingface.co/win10/Llama-3.2-3B-Instruct-24-9-29) [📑](https://huggingface.co/datasets/open-llm-leaderboard/win10__Llama-3.2-3B-Instruct-24-9-29-details)
|
win10/Llama-3.2-3B-Instruct-24-9-29
|
4defb10e2415111abb873d695dd40c387c1d6d57
| 23.929169
|
llama3.2
| 0
| 3
| true
| false
| false
| true
| 0.713606
| 0.733221
| 73.322119
| 0.461423
| 24.196426
| 0.166163
| 16.616314
| 0.274329
| 3.243848
| 0.355521
| 1.440104
| 0.322806
| 24.756206
| false
| false
|
2024-09-29
|
2024-10-11
| 2
|
meta-llama/Llama-3.2-3B-Instruct
|
win10_Norns-Qwen2.5-12B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Qwen2ForCausalLM
|
[win10/Norns-Qwen2.5-12B](https://huggingface.co/win10/Norns-Qwen2.5-12B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/win10__Norns-Qwen2.5-12B-details)
|
win10/Norns-Qwen2.5-12B
|
464793295c8633a95e6faedad24dfa8f0fd35663
| 16.386375
| 1
| 12
| false
| false
| false
| true
| 1.622972
| 0.489697
| 48.969734
| 0.461892
| 23.769257
| 0.004532
| 0.453172
| 0.283557
| 4.474273
| 0.35549
| 2.202865
| 0.266041
| 18.448951
| false
| false
|
2024-11-17
|
2024-11-17
| 1
|
win10/Norns-Qwen2.5-12B (Merge)
|
|
win10_Norns-Qwen2.5-7B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Qwen2ForCausalLM
|
[win10/Norns-Qwen2.5-7B](https://huggingface.co/win10/Norns-Qwen2.5-7B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/win10__Norns-Qwen2.5-7B-details)
|
win10/Norns-Qwen2.5-7B
|
148d9156f734a8050812892879cf13d1ca01f137
| 24.593277
| 0
| 7
| false
| false
| false
| true
| 0.649914
| 0.612221
| 61.222113
| 0.507289
| 30.250415
| 0.155589
| 15.558912
| 0.284396
| 4.58613
| 0.408479
| 9.126563
| 0.34134
| 26.815529
| false
| false
|
2024-11-17
|
2024-11-18
| 1
|
win10/Norns-Qwen2.5-7B (Merge)
|
|
win10_llama3-13.45b-Instruct_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
[win10/llama3-13.45b-Instruct](https://huggingface.co/win10/llama3-13.45b-Instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/win10__llama3-13.45b-Instruct-details)
|
win10/llama3-13.45b-Instruct
|
94cc0f415e355c6d3d47168a6ff5239ca586904a
| 17.277282
|
llama3
| 1
| 13
| true
| false
| false
| true
| 2.136535
| 0.414435
| 41.443481
| 0.486542
| 26.67569
| 0.020393
| 2.039275
| 0.258389
| 1.118568
| 0.38476
| 6.328385
| 0.334525
| 26.058289
| true
| false
|
2024-06-09
|
2024-06-26
| 1
|
win10/llama3-13.45b-Instruct (Merge)
|
winglian_Llama-3-8b-64k-PoSE_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
[winglian/Llama-3-8b-64k-PoSE](https://huggingface.co/winglian/Llama-3-8b-64k-PoSE) [📑](https://huggingface.co/datasets/open-llm-leaderboard/winglian__Llama-3-8b-64k-PoSE-details)
|
winglian/Llama-3-8b-64k-PoSE
|
5481d9b74a3ec5a95789673e194c8ff86e2bc2bc
| 11.004738
| 74
| 8
| false
| false
| false
| true
| 0.911021
| 0.285691
| 28.569086
| 0.370218
| 13.307317
| 0.033233
| 3.323263
| 0.260906
| 1.454139
| 0.339552
| 3.077344
| 0.246676
| 16.297281
| false
| false
|
2024-04-24
|
2024-06-26
| 0
|
winglian/Llama-3-8b-64k-PoSE
|
|
winglian_llama-3-8b-256k-PoSE_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
[winglian/llama-3-8b-256k-PoSE](https://huggingface.co/winglian/llama-3-8b-256k-PoSE) [📑](https://huggingface.co/datasets/open-llm-leaderboard/winglian__llama-3-8b-256k-PoSE-details)
|
winglian/llama-3-8b-256k-PoSE
|
93e7b0b6433c96583ffcef3bc47203e6fdcbbe8b
| 6.557715
| 42
| 8
| false
| false
| false
| true
| 1.050723
| 0.290911
| 29.091145
| 0.315658
| 5.502849
| 0.015106
| 1.510574
| 0.25755
| 1.006711
| 0.331552
| 0.94401
| 0.111619
| 1.291002
| false
| false
|
2024-04-26
|
2024-06-26
| 0
|
winglian/llama-3-8b-256k-PoSE
|
|
xMaulana_FinMatcha-3B-Instruct_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
[xMaulana/FinMatcha-3B-Instruct](https://huggingface.co/xMaulana/FinMatcha-3B-Instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/xMaulana__FinMatcha-3B-Instruct-details)
|
xMaulana/FinMatcha-3B-Instruct
|
be2c0c04fc4dc3fb93631e3c663721da92fea8fc
| 24.016243
|
apache-2.0
| 0
| 3
| true
| false
| false
| true
| 6.577035
| 0.754828
| 75.48283
| 0.453555
| 23.191023
| 0.135952
| 13.595166
| 0.269295
| 2.572707
| 0.363333
| 5.016667
| 0.318152
| 24.239066
| false
| false
|
2024-09-29
|
2024-10-22
| 1
|
xMaulana/FinMatcha-3B-Instruct (Merge)
|
xinchen9_Llama3.1_8B_Instruct_CoT_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
[xinchen9/Llama3.1_8B_Instruct_CoT](https://huggingface.co/xinchen9/Llama3.1_8B_Instruct_CoT) [📑](https://huggingface.co/datasets/open-llm-leaderboard/xinchen9__Llama3.1_8B_Instruct_CoT-details)
|
xinchen9/Llama3.1_8B_Instruct_CoT
|
cab1b33ddff08de11c5daea8ae079d126d503d8b
| 16.190743
|
apache-2.0
| 0
| 8
| true
| false
| false
| false
| 1.856552
| 0.297357
| 29.735657
| 0.439821
| 21.142866
| 0.05287
| 5.287009
| 0.302013
| 6.935123
| 0.437062
| 13.166146
| 0.287899
| 20.87766
| false
| false
|
2024-09-16
|
2024-09-19
| 0
|
xinchen9/Llama3.1_8B_Instruct_CoT
|
xinchen9_Llama3.1_CoT_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
[xinchen9/Llama3.1_CoT](https://huggingface.co/xinchen9/Llama3.1_CoT) [📑](https://huggingface.co/datasets/open-llm-leaderboard/xinchen9__Llama3.1_CoT-details)
|
xinchen9/Llama3.1_CoT
|
3cb467f51a59ff163bb942fcde3ef60573c12b79
| 13.351283
|
apache-2.0
| 0
| 8
| true
| false
| false
| true
| 0.950099
| 0.224616
| 22.461624
| 0.434101
| 19.899124
| 0.015106
| 1.510574
| 0.288591
| 5.145414
| 0.430458
| 11.773958
| 0.273853
| 19.317007
| false
| false
|
2024-09-04
|
2024-09-06
| 0
|
xinchen9/Llama3.1_CoT
|
xinchen9_Llama3.1_CoT_V1_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
[xinchen9/Llama3.1_CoT_V1](https://huggingface.co/xinchen9/Llama3.1_CoT_V1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/xinchen9__Llama3.1_CoT_V1-details)
|
xinchen9/Llama3.1_CoT_V1
|
c5ed4b8bfc364ebae1843af14799818551f5251f
| 14.394947
|
apache-2.0
| 0
| 8
| true
| false
| false
| false
| 1.873462
| 0.245299
| 24.529914
| 0.4376
| 20.166003
| 0.01284
| 1.283988
| 0.279362
| 3.914989
| 0.457219
| 16.41901
| 0.280502
| 20.055777
| false
| false
|
2024-09-06
|
2024-09-07
| 0
|
xinchen9/Llama3.1_CoT_V1
|
xinchen9_Mistral-7B-CoT_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
[xinchen9/Mistral-7B-CoT](https://huggingface.co/xinchen9/Mistral-7B-CoT) [📑](https://huggingface.co/datasets/open-llm-leaderboard/xinchen9__Mistral-7B-CoT-details)
|
xinchen9/Mistral-7B-CoT
|
9a3c8103dac20d5497d1b8fc041bb5125ff4dc00
| 11.202955
|
apache-2.0
| 0
| 7
| true
| false
| false
| false
| 1.888689
| 0.279871
| 27.987074
| 0.387268
| 14.806193
| 0.019637
| 1.963746
| 0.249161
| 0
| 0.399427
| 8.195052
| 0.228391
| 14.265662
| false
| false
|
2024-09-09
|
2024-09-23
| 0
|
xinchen9/Mistral-7B-CoT
|
xinchen9_llama3-b8-ft-dis_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
[xinchen9/llama3-b8-ft-dis](https://huggingface.co/xinchen9/llama3-b8-ft-dis) [📑](https://huggingface.co/datasets/open-llm-leaderboard/xinchen9__llama3-b8-ft-dis-details)
|
xinchen9/llama3-b8-ft-dis
|
e4da730f28f79543262de37908943c35f8df81fe
| 13.897963
|
apache-2.0
| 0
| 8
| true
| false
| false
| false
| 1.062327
| 0.154599
| 15.459869
| 0.462579
| 24.727457
| 0.034743
| 3.47432
| 0.312919
| 8.389262
| 0.365375
| 6.405208
| 0.324385
| 24.931664
| false
| false
|
2024-06-28
|
2024-07-11
| 0
|
xinchen9/llama3-b8-ft-dis
|
xkp24_Llama-3-8B-Instruct-SPPO-Iter2_bt_2b-table_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
[xkp24/Llama-3-8B-Instruct-SPPO-Iter2_bt_2b-table](https://huggingface.co/xkp24/Llama-3-8B-Instruct-SPPO-Iter2_bt_2b-table) [📑](https://huggingface.co/datasets/open-llm-leaderboard/xkp24__Llama-3-8B-Instruct-SPPO-Iter2_bt_2b-table-details)
|
xkp24/Llama-3-8B-Instruct-SPPO-Iter2_bt_2b-table
|
c083d6796f54f66b4cec2261657a02801c761093
| 22.421029
| 0
| 8
| false
| false
| false
| true
| 0.624231
| 0.637475
| 63.747523
| 0.491227
| 27.422821
| 0.067976
| 6.797583
| 0.259228
| 1.230425
| 0.382
| 5.483333
| 0.3686
| 29.844489
| false
| false
|
2024-09-30
|
2024-10-01
| 0
|
xkp24/Llama-3-8B-Instruct-SPPO-Iter2_bt_2b-table
|
|
xkp24_Llama-3-8B-Instruct-SPPO-Iter2_bt_8b-table_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
[xkp24/Llama-3-8B-Instruct-SPPO-Iter2_bt_8b-table](https://huggingface.co/xkp24/Llama-3-8B-Instruct-SPPO-Iter2_bt_8b-table) [📑](https://huggingface.co/datasets/open-llm-leaderboard/xkp24__Llama-3-8B-Instruct-SPPO-Iter2_bt_8b-table-details)
|
xkp24/Llama-3-8B-Instruct-SPPO-Iter2_bt_8b-table
|
5416d34b5243559914a377ee9d95ce4830bf8dba
| 24.502405
| 0
| 8
| false
| false
| false
| true
| 0.750264
| 0.727451
| 72.745094
| 0.505686
| 29.398353
| 0.084592
| 8.459215
| 0.260067
| 1.342282
| 0.381906
| 5.104948
| 0.369681
| 29.964539
| false
| false
|
2024-09-30
|
2024-10-01
| 0
|
xkp24/Llama-3-8B-Instruct-SPPO-Iter2_bt_8b-table
|
|
xkp24_Llama-3-8B-Instruct-SPPO-Iter2_gp_2b-table_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
[xkp24/Llama-3-8B-Instruct-SPPO-Iter2_gp_2b-table](https://huggingface.co/xkp24/Llama-3-8B-Instruct-SPPO-Iter2_gp_2b-table) [📑](https://huggingface.co/datasets/open-llm-leaderboard/xkp24__Llama-3-8B-Instruct-SPPO-Iter2_gp_2b-table-details)
|
xkp24/Llama-3-8B-Instruct-SPPO-Iter2_gp_2b-table
|
235204157d7fac0d64fa609d5aee3cebb49ccd11
| 22.236354
| 0
| 8
| false
| false
| false
| true
| 0.671741
| 0.656859
| 65.685936
| 0.495183
| 27.6952
| 0.064955
| 6.495468
| 0.259228
| 1.230425
| 0.359396
| 2.291146
| 0.37018
| 30.019947
| false
| false
|
2024-09-30
|
2024-09-30
| 0
|
xkp24/Llama-3-8B-Instruct-SPPO-Iter2_gp_2b-table
|
|
xkp24_Llama-3-8B-Instruct-SPPO-Iter2_gp_8b-table_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
[xkp24/Llama-3-8B-Instruct-SPPO-Iter2_gp_8b-table](https://huggingface.co/xkp24/Llama-3-8B-Instruct-SPPO-Iter2_gp_8b-table) [📑](https://huggingface.co/datasets/open-llm-leaderboard/xkp24__Llama-3-8B-Instruct-SPPO-Iter2_gp_8b-table-details)
|
xkp24/Llama-3-8B-Instruct-SPPO-Iter2_gp_8b-table
|
9db00cbbba84453b18956fcc76f264f94a205955
| 22.935265
| 0
| 8
| false
| false
| false
| true
| 0.719228
| 0.66208
| 66.207995
| 0.500449
| 28.508587
| 0.077795
| 7.779456
| 0.259228
| 1.230425
| 0.380542
| 5.001042
| 0.359957
| 28.884087
| false
| false
|
2024-09-30
|
2024-09-30
| 0
|
xkp24/Llama-3-8B-Instruct-SPPO-Iter2_gp_8b-table
|
|
xkp24_Llama-3-8B-Instruct-SPPO-score-Iter2_bt_2b-table-0.001_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
[xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_bt_2b-table-0.001](https://huggingface.co/xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_bt_2b-table-0.001) [📑](https://huggingface.co/datasets/open-llm-leaderboard/xkp24__Llama-3-8B-Instruct-SPPO-score-Iter2_bt_2b-table-0.001-details)
|
xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_bt_2b-table-0.001
|
1062757826de031a4ae82277e6e737e19e82e514
| 21.845481
| 0
| 8
| false
| false
| false
| true
| 0.615003
| 0.604228
| 60.422789
| 0.493606
| 27.613714
| 0.064955
| 6.495468
| 0.259228
| 1.230425
| 0.379333
| 5.216667
| 0.370844
| 30.093824
| false
| false
|
2024-09-30
|
2024-10-01
| 0
|
xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_bt_2b-table-0.001
|
|
xkp24_Llama-3-8B-Instruct-SPPO-score-Iter2_bt_8b-table-0.002_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
[xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_bt_8b-table-0.002](https://huggingface.co/xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_bt_8b-table-0.002) [📑](https://huggingface.co/datasets/open-llm-leaderboard/xkp24__Llama-3-8B-Instruct-SPPO-score-Iter2_bt_8b-table-0.002-details)
|
xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_bt_8b-table-0.002
|
e5d2f179b4a7bd851dcf2b7db6358b13001bf1af
| 23.938825
| 0
| 8
| false
| false
| false
| true
| 0.841468
| 0.713188
| 71.318768
| 0.499638
| 28.574879
| 0.069486
| 6.94864
| 0.258389
| 1.118568
| 0.387208
| 6.067708
| 0.366439
| 29.604388
| false
| false
|
2024-09-30
|
2024-10-01
| 0
|
xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_bt_8b-table-0.002
|
|
xkp24_Llama-3-8B-Instruct-SPPO-score-Iter2_gp_2b-table-0.001_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
[xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_gp_2b-table-0.001](https://huggingface.co/xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_gp_2b-table-0.001) [📑](https://huggingface.co/datasets/open-llm-leaderboard/xkp24__Llama-3-8B-Instruct-SPPO-score-Iter2_gp_2b-table-0.001-details)
|
xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_gp_2b-table-0.001
|
0e319ad47ed2b2636b72d07ee9b32657e1e50412
| 21.224624
| 0
| 8
| false
| false
| false
| true
| 0.679841
| 0.594711
| 59.471092
| 0.489922
| 26.943904
| 0.073263
| 7.326284
| 0.259228
| 1.230425
| 0.358094
| 2.328385
| 0.370429
| 30.047651
| false
| false
|
2024-09-30
|
2024-09-30
| 0
|
xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_gp_2b-table-0.001
|
|
xkp24_Llama-3-8B-Instruct-SPPO-score-Iter2_gp_8b-table-0.002_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
[xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_gp_8b-table-0.002](https://huggingface.co/xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_gp_8b-table-0.002) [📑](https://huggingface.co/datasets/open-llm-leaderboard/xkp24__Llama-3-8B-Instruct-SPPO-score-Iter2_gp_8b-table-0.002-details)
|
xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_gp_8b-table-0.002
|
0877f2458ea667edcf9213383df41294c788190f
| 22.69358
| 0
| 8
| false
| false
| false
| true
| 0.769119
| 0.645319
| 64.531887
| 0.495108
| 28.046978
| 0.067976
| 6.797583
| 0.260067
| 1.342282
| 0.393875
| 7.334375
| 0.352975
| 28.108378
| false
| false
|
2024-09-30
|
2024-10-01
| 0
|
xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_gp_8b-table-0.002
|
|
xukp20_Llama-3-8B-Instruct-SPPO-Iter3_bt_2b-table_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
[xukp20/Llama-3-8B-Instruct-SPPO-Iter3_bt_2b-table](https://huggingface.co/xukp20/Llama-3-8B-Instruct-SPPO-Iter3_bt_2b-table) [📑](https://huggingface.co/datasets/open-llm-leaderboard/xukp20__Llama-3-8B-Instruct-SPPO-Iter3_bt_2b-table-details)
|
xukp20/Llama-3-8B-Instruct-SPPO-Iter3_bt_2b-table
|
d2b87100e5ba3215fddbd308bb17b7bf12fe6c9e
| 21.01778
| 0
| 8
| false
| false
| false
| true
| 0.98643
| 0.575602
| 57.560163
| 0.490121
| 26.866404
| 0.079305
| 7.930514
| 0.259228
| 1.230425
| 0.365969
| 2.979427
| 0.365858
| 29.539746
| false
| false
|
2024-09-28
|
2024-09-29
| 0
|
xukp20/Llama-3-8B-Instruct-SPPO-Iter3_bt_2b-table
|
|
xukp20_Llama-3-8B-Instruct-SPPO-Iter3_bt_8b-table_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
[xukp20/Llama-3-8B-Instruct-SPPO-Iter3_bt_8b-table](https://huggingface.co/xukp20/Llama-3-8B-Instruct-SPPO-Iter3_bt_8b-table) [📑](https://huggingface.co/datasets/open-llm-leaderboard/xukp20__Llama-3-8B-Instruct-SPPO-Iter3_bt_8b-table-details)
|
xukp20/Llama-3-8B-Instruct-SPPO-Iter3_bt_8b-table
|
19a48ccf5ea463afbbbc61d650b8fb63ff2d94c7
| 23.969226
| 0
| 8
| false
| false
| false
| true
| 0.590153
| 0.703446
| 70.344575
| 0.509187
| 29.731239
| 0.086858
| 8.685801
| 0.259228
| 1.230425
| 0.373906
| 3.904948
| 0.369265
| 29.918366
| false
| false
|
2024-09-28
|
2024-09-29
| 0
|
xukp20/Llama-3-8B-Instruct-SPPO-Iter3_bt_8b-table
|
|
xukp20_Llama-3-8B-Instruct-SPPO-Iter3_gp_2b-table_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
[xukp20/Llama-3-8B-Instruct-SPPO-Iter3_gp_2b-table](https://huggingface.co/xukp20/Llama-3-8B-Instruct-SPPO-Iter3_gp_2b-table) [📑](https://huggingface.co/datasets/open-llm-leaderboard/xukp20__Llama-3-8B-Instruct-SPPO-Iter3_gp_2b-table-details)
|
xukp20/Llama-3-8B-Instruct-SPPO-Iter3_gp_2b-table
|
0fe230b3432fb2b0f89942d7926291a4dbeb2820
| 21.781466
| 0
| 8
| false
| false
| false
| true
| 0.665521
| 0.602379
| 60.237946
| 0.496953
| 27.892403
| 0.086103
| 8.610272
| 0.259228
| 1.230425
| 0.367365
| 3.18724
| 0.365775
| 29.530511
| false
| false
|
2024-09-28
|
2024-09-29
| 0
|
xukp20/Llama-3-8B-Instruct-SPPO-Iter3_gp_2b-table
|
|
xukp20_Llama-3-8B-Instruct-SPPO-Iter3_gp_8b-table_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
[xukp20/Llama-3-8B-Instruct-SPPO-Iter3_gp_8b-table](https://huggingface.co/xukp20/Llama-3-8B-Instruct-SPPO-Iter3_gp_8b-table) [📑](https://huggingface.co/datasets/open-llm-leaderboard/xukp20__Llama-3-8B-Instruct-SPPO-Iter3_gp_8b-table-details)
|
xukp20/Llama-3-8B-Instruct-SPPO-Iter3_gp_8b-table
|
d1e19da1029f2d4d45de015754bc52dcb1ea5570
| 23.059714
| 0
| 8
| false
| false
| false
| true
| 0.588419
| 0.66203
| 66.203008
| 0.499994
| 28.439824
| 0.083082
| 8.308157
| 0.259228
| 1.230425
| 0.381812
| 5.126562
| 0.361453
| 29.05031
| false
| false
|
2024-09-28
|
2024-09-29
| 0
|
xukp20/Llama-3-8B-Instruct-SPPO-Iter3_gp_8b-table
|
|
xukp20_Llama-3-8B-Instruct-SPPO-score-Iter3_bt_2b-table-0.001_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
[xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_bt_2b-table-0.001](https://huggingface.co/xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_bt_2b-table-0.001) [📑](https://huggingface.co/datasets/open-llm-leaderboard/xukp20__Llama-3-8B-Instruct-SPPO-score-Iter3_bt_2b-table-0.001-details)
|
xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_bt_2b-table-0.001
|
a478aa202c59773eba615ae37feb4cc750757695
| 20.364052
| 0
| 8
| false
| false
| false
| true
| 0.586443
| 0.533636
| 53.363631
| 0.491487
| 27.145374
| 0.06571
| 6.570997
| 0.259228
| 1.230425
| 0.377969
| 4.71276
| 0.36245
| 29.161126
| false
| false
|
2024-09-28
|
2024-09-29
| 0
|
xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_bt_2b-table-0.001
|
|
xukp20_Llama-3-8B-Instruct-SPPO-score-Iter3_bt_8b-table-0.002_float16
|
float16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
[xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_bt_8b-table-0.002](https://huggingface.co/xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_bt_8b-table-0.002) [📑](https://huggingface.co/datasets/open-llm-leaderboard/xukp20__Llama-3-8B-Instruct-SPPO-score-Iter3_bt_8b-table-0.002-details)
|
xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_bt_8b-table-0.002
|
8ef9ef7e2bf522e707a7b090af55f2ec1eafd4b9
| 23.261322
| 0
| 8
| false
| false
| false
| true
| 0.869474
| 0.685161
| 68.516093
| 0.507516
| 29.74055
| 0.054381
| 5.438066
| 0.258389
| 1.118568
| 0.383177
| 5.630469
| 0.362118
| 29.124187
| false
| false
|
2024-09-28
|
2024-09-29
| 0
|
xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_bt_8b-table-0.002
|
|
xukp20_Llama-3-8B-Instruct-SPPO-score-Iter3_gp_2b-table-0.001_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
[xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_gp_2b-table-0.001](https://huggingface.co/xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_gp_2b-table-0.001) [📑](https://huggingface.co/datasets/open-llm-leaderboard/xukp20__Llama-3-8B-Instruct-SPPO-score-Iter3_gp_2b-table-0.001-details)
|
xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_gp_2b-table-0.001
|
86673872245ad902f8d466bdc20edae9c115b965
| 20.032169
| 0
| 8
| false
| false
| false
| true
| 0.675094
| 0.548224
| 54.822427
| 0.488717
| 26.839803
| 0.044562
| 4.456193
| 0.260906
| 1.454139
| 0.363271
| 2.942187
| 0.367104
| 29.678265
| false
| false
|
2024-09-28
|
2024-09-29
| 0
|
xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_gp_2b-table-0.001
|
|
xukp20_llama-3-8b-instruct-sppo-iter1-gp-2b-tau01-table_float16
|
float16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
[xukp20/llama-3-8b-instruct-sppo-iter1-gp-2b-tau01-table](https://huggingface.co/xukp20/llama-3-8b-instruct-sppo-iter1-gp-2b-tau01-table) [📑](https://huggingface.co/datasets/open-llm-leaderboard/xukp20__llama-3-8b-instruct-sppo-iter1-gp-2b-tau01-table-details)
|
xukp20/llama-3-8b-instruct-sppo-iter1-gp-2b-tau01-table
|
abb3afe2b0398b24ed823b0124c8a72d354487bd
| 23.498955
| 0
| 8
| false
| false
| false
| true
| 1.379342
| 0.690931
| 69.093117
| 0.497846
| 28.119887
| 0.0929
| 9.29003
| 0.259228
| 1.230425
| 0.367333
| 3.083333
| 0.371592
| 30.176936
| false
| false
|
2024-09-22
|
2024-09-23
| 0
|
xukp20/llama-3-8b-instruct-sppo-iter1-gp-2b-tau01-table
|
|
xxx777xxxASD_L3.1-ClaudeMaid-4x8B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
MixtralForCausalLM
|
[xxx777xxxASD/L3.1-ClaudeMaid-4x8B](https://huggingface.co/xxx777xxxASD/L3.1-ClaudeMaid-4x8B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/xxx777xxxASD__L3.1-ClaudeMaid-4x8B-details)
|
xxx777xxxASD/L3.1-ClaudeMaid-4x8B
|
2a98d9cb91c7aa775acbf5bfe7bb91beb2faf682
| 26.190883
|
llama3.1
| 7
| 24
| true
| true
| false
| true
| 2.376185
| 0.669649
| 66.964875
| 0.507085
| 29.437348
| 0.128399
| 12.839879
| 0.291107
| 5.480984
| 0.428937
| 13.750521
| 0.358045
| 28.67169
| false
| false
|
2024-07-27
|
2024-07-28
| 0
|
xxx777xxxASD/L3.1-ClaudeMaid-4x8B
|
yam-peleg_Hebrew-Gemma-11B-Instruct_float16
|
float16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
GemmaForCausalLM
|
[yam-peleg/Hebrew-Gemma-11B-Instruct](https://huggingface.co/yam-peleg/Hebrew-Gemma-11B-Instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/yam-peleg__Hebrew-Gemma-11B-Instruct-details)
|
yam-peleg/Hebrew-Gemma-11B-Instruct
|
a40259d1efbcac4829ed44d3b589716f615ed362
| 13.919763
|
other
| 22
| 10
| true
| false
| false
| true
| 1.937267
| 0.302077
| 30.207738
| 0.403578
| 16.862741
| 0.057402
| 5.740181
| 0.276007
| 3.467562
| 0.408854
| 9.973438
| 0.255402
| 17.266918
| false
| false
|
2024-03-06
|
2024-07-31
| 0
|
yam-peleg/Hebrew-Gemma-11B-Instruct
|
yam-peleg_Hebrew-Mistral-7B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
[yam-peleg/Hebrew-Mistral-7B](https://huggingface.co/yam-peleg/Hebrew-Mistral-7B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/yam-peleg__Hebrew-Mistral-7B-details)
|
yam-peleg/Hebrew-Mistral-7B
|
3d32134b5959492fd7efbbf16395352594bc89f7
| 13.302117
|
apache-2.0
| 63
| 7
| true
| false
| false
| false
| 1.399281
| 0.232834
| 23.283443
| 0.433404
| 20.17694
| 0.049849
| 4.984894
| 0.279362
| 3.914989
| 0.397656
| 7.673698
| 0.278009
| 19.778738
| false
| false
|
2024-04-26
|
2024-07-11
| 0
|
yam-peleg/Hebrew-Mistral-7B
|
yam-peleg_Hebrew-Mistral-7B-200K_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
[yam-peleg/Hebrew-Mistral-7B-200K](https://huggingface.co/yam-peleg/Hebrew-Mistral-7B-200K) [📑](https://huggingface.co/datasets/open-llm-leaderboard/yam-peleg__Hebrew-Mistral-7B-200K-details)
|
yam-peleg/Hebrew-Mistral-7B-200K
|
7b51c7b31e3d9e29ea964c579a45233cfad255fe
| 10.644291
|
apache-2.0
| 15
| 7
| true
| false
| false
| false
| 0.735312
| 0.185573
| 18.557317
| 0.414927
| 17.493603
| 0.023414
| 2.34139
| 0.276007
| 3.467562
| 0.376479
| 4.526563
| 0.257314
| 17.479314
| false
| false
|
2024-05-05
|
2024-07-11
| 0
|
yam-peleg/Hebrew-Mistral-7B-200K
|
yam-peleg_Hebrew-Mistral-7B-200K_bfloat16
|
bfloat16
|
🟩 continuously pretrained
|
🟩
|
Original
|
MistralForCausalLM
|
[yam-peleg/Hebrew-Mistral-7B-200K](https://huggingface.co/yam-peleg/Hebrew-Mistral-7B-200K) [📑](https://huggingface.co/datasets/open-llm-leaderboard/yam-peleg__Hebrew-Mistral-7B-200K-details)
|
yam-peleg/Hebrew-Mistral-7B-200K
|
7b51c7b31e3d9e29ea964c579a45233cfad255fe
| 8.235612
|
apache-2.0
| 15
| 7
| true
| false
| false
| true
| 1.684494
| 0.17698
| 17.698041
| 0.34105
| 7.671324
| 0.021903
| 2.190332
| 0.253356
| 0.447427
| 0.374
| 4.416667
| 0.252909
| 16.989879
| false
| false
|
2024-05-05
|
2024-08-06
| 0
|
yam-peleg/Hebrew-Mistral-7B-200K
|
ycros_BagelMIsteryTour-v2-8x7B_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MixtralForCausalLM
|
[ycros/BagelMIsteryTour-v2-8x7B](https://huggingface.co/ycros/BagelMIsteryTour-v2-8x7B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/ycros__BagelMIsteryTour-v2-8x7B-details)
|
ycros/BagelMIsteryTour-v2-8x7B
|
98a8b319707be3dab1659594da69a37ed8f8c148
| 24.258614
|
cc-by-nc-4.0
| 16
| 46
| true
| false
| false
| true
| 3.649132
| 0.599432
| 59.943173
| 0.515924
| 31.699287
| 0.07855
| 7.854985
| 0.30453
| 7.270694
| 0.420292
| 11.303125
| 0.347324
| 27.480423
| true
| false
|
2024-01-19
|
2024-06-28
| 1
|
ycros/BagelMIsteryTour-v2-8x7B (Merge)
|
ycros_BagelMIsteryTour-v2-8x7B_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
MixtralForCausalLM
|
[ycros/BagelMIsteryTour-v2-8x7B](https://huggingface.co/ycros/BagelMIsteryTour-v2-8x7B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/ycros__BagelMIsteryTour-v2-8x7B-details)
|
ycros/BagelMIsteryTour-v2-8x7B
|
98a8b319707be3dab1659594da69a37ed8f8c148
| 24.724802
|
cc-by-nc-4.0
| 16
| 46
| true
| false
| false
| true
| 3.619337
| 0.62621
| 62.620957
| 0.514194
| 31.366123
| 0.087613
| 8.761329
| 0.307886
| 7.718121
| 0.41375
| 10.31875
| 0.348072
| 27.563534
| true
| false
|
2024-01-19
|
2024-08-04
| 1
|
ycros/BagelMIsteryTour-v2-8x7B (Merge)
|
yfzp_Llama-3-8B-Instruct-SPPO-Iter1_bt_2b-table_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
[yfzp/Llama-3-8B-Instruct-SPPO-Iter1_bt_2b-table](https://huggingface.co/yfzp/Llama-3-8B-Instruct-SPPO-Iter1_bt_2b-table) [📑](https://huggingface.co/datasets/open-llm-leaderboard/yfzp__Llama-3-8B-Instruct-SPPO-Iter1_bt_2b-table-details)
|
yfzp/Llama-3-8B-Instruct-SPPO-Iter1_bt_2b-table
|
97b2d0e790a6fcdf39c34a2043f0818368c7dcb3
| 22.974571
| 0
| 8
| false
| false
| false
| true
| 0.618253
| 0.670898
| 67.089766
| 0.498661
| 28.170107
| 0.073263
| 7.326284
| 0.259228
| 1.230425
| 0.372698
| 3.853906
| 0.371592
| 30.176936
| false
| false
|
2024-09-29
|
2024-09-30
| 0
|
yfzp/Llama-3-8B-Instruct-SPPO-Iter1_bt_2b-table
|
|
yfzp_Llama-3-8B-Instruct-SPPO-Iter1_bt_8b-table_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
[yfzp/Llama-3-8B-Instruct-SPPO-Iter1_bt_8b-table](https://huggingface.co/yfzp/Llama-3-8B-Instruct-SPPO-Iter1_bt_8b-table) [📑](https://huggingface.co/datasets/open-llm-leaderboard/yfzp__Llama-3-8B-Instruct-SPPO-Iter1_bt_8b-table-details)
|
yfzp/Llama-3-8B-Instruct-SPPO-Iter1_bt_8b-table
|
e8786291c206d5cd1b01d29466e3b397278f4e2b
| 24.877776
| 0
| 8
| false
| false
| false
| true
| 0.640663
| 0.733271
| 73.327105
| 0.508036
| 29.308128
| 0.097432
| 9.743202
| 0.260067
| 1.342282
| 0.380604
| 5.008854
| 0.374834
| 30.537086
| false
| false
|
2024-09-29
|
2024-09-30
| 0
|
yfzp/Llama-3-8B-Instruct-SPPO-Iter1_bt_8b-table
|
|
yfzp_Llama-3-8B-Instruct-SPPO-Iter1_gp_2b-table_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
[yfzp/Llama-3-8B-Instruct-SPPO-Iter1_gp_2b-table](https://huggingface.co/yfzp/Llama-3-8B-Instruct-SPPO-Iter1_gp_2b-table) [📑](https://huggingface.co/datasets/open-llm-leaderboard/yfzp__Llama-3-8B-Instruct-SPPO-Iter1_gp_2b-table-details)
|
yfzp/Llama-3-8B-Instruct-SPPO-Iter1_gp_2b-table
|
0d9cb29aa87b0c17ed011ffbc83803f3f6dd18e7
| 23.168114
| 0
| 8
| false
| false
| false
| true
| 0.679554
| 0.678466
| 67.846647
| 0.494121
| 27.469588
| 0.095166
| 9.516616
| 0.259228
| 1.230425
| 0.364667
| 2.75
| 0.371759
| 30.195405
| false
| false
|
2024-09-29
|
2024-09-29
| 0
|
yfzp/Llama-3-8B-Instruct-SPPO-Iter1_gp_2b-table
|
|
yfzp_Llama-3-8B-Instruct-SPPO-Iter1_gp_8b-table_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
[yfzp/Llama-3-8B-Instruct-SPPO-Iter1_gp_8b-table](https://huggingface.co/yfzp/Llama-3-8B-Instruct-SPPO-Iter1_gp_8b-table) [📑](https://huggingface.co/datasets/open-llm-leaderboard/yfzp__Llama-3-8B-Instruct-SPPO-Iter1_gp_8b-table-details)
|
yfzp/Llama-3-8B-Instruct-SPPO-Iter1_gp_8b-table
|
7a326a956e6169b287a04ef93cdc0342a0f3311a
| 24.001677
| 0
| 8
| false
| false
| false
| true
| 0.648184
| 0.713188
| 71.318768
| 0.502536
| 28.604424
| 0.093656
| 9.365559
| 0.259228
| 1.230425
| 0.371333
| 3.683333
| 0.368268
| 29.80755
| false
| false
|
2024-09-29
|
2024-09-29
| 0
|
yfzp/Llama-3-8B-Instruct-SPPO-Iter1_gp_8b-table
|
|
yfzp_Llama-3-8B-Instruct-SPPO-score-Iter1_bt_2b-table-0.001_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
[yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_bt_2b-table-0.001](https://huggingface.co/yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_bt_2b-table-0.001) [📑](https://huggingface.co/datasets/open-llm-leaderboard/yfzp__Llama-3-8B-Instruct-SPPO-score-Iter1_bt_2b-table-0.001-details)
|
yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_bt_2b-table-0.001
|
e5c8baadbf6ce17b344596ad42bd3546f66e253e
| 22.364867
| 0
| 8
| false
| false
| false
| true
| 0.582235
| 0.649565
| 64.956538
| 0.497946
| 28.099199
| 0.048338
| 4.833837
| 0.259228
| 1.230425
| 0.377969
| 4.846094
| 0.372008
| 30.223109
| false
| false
|
2024-09-29
|
2024-09-30
| 0
|
yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_bt_2b-table-0.001
|
|
yfzp_Llama-3-8B-Instruct-SPPO-score-Iter1_bt_8b-table-0.002_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
[yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_bt_8b-table-0.002](https://huggingface.co/yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_bt_8b-table-0.002) [📑](https://huggingface.co/datasets/open-llm-leaderboard/yfzp__Llama-3-8B-Instruct-SPPO-score-Iter1_bt_8b-table-0.002-details)
|
yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_bt_8b-table-0.002
|
064e237b850151938caf171a4c8c7e34c93e580e
| 24.319539
| 0
| 8
| false
| false
| false
| true
| 0.606022
| 0.719607
| 71.960731
| 0.504515
| 28.785911
| 0.07855
| 7.854985
| 0.260067
| 1.342282
| 0.383146
| 5.593229
| 0.373421
| 30.380098
| false
| false
|
2024-09-29
|
2024-09-30
| 0
|
yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_bt_8b-table-0.002
|
|
yfzp_Llama-3-8B-Instruct-SPPO-score-Iter1_gp_2b-table-0.001_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
[yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_gp_2b-table-0.001](https://huggingface.co/yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_gp_2b-table-0.001) [📑](https://huggingface.co/datasets/open-llm-leaderboard/yfzp__Llama-3-8B-Instruct-SPPO-score-Iter1_gp_2b-table-0.001-details)
|
yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_gp_2b-table-0.001
|
b685b90063258e05f8b4930fdbce2e565f13f620
| 22.384837
| 0
| 8
| false
| false
| false
| true
| 0.649092
| 0.65044
| 65.043972
| 0.495788
| 27.825253
| 0.073263
| 7.326284
| 0.259228
| 1.230425
| 0.366031
| 2.853906
| 0.370263
| 30.029181
| false
| false
|
2024-09-29
|
2024-09-29
| 0
|
yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_gp_2b-table-0.001
|
|
yfzp_Llama-3-8B-Instruct-SPPO-score-Iter1_gp_8b-table-0.002_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
[yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_gp_8b-table-0.002](https://huggingface.co/yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_gp_8b-table-0.002) [📑](https://huggingface.co/datasets/open-llm-leaderboard/yfzp__Llama-3-8B-Instruct-SPPO-score-Iter1_gp_8b-table-0.002-details)
|
yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_gp_8b-table-0.002
|
5ab3f2cfc96bdda3b5a629ab4a81adf7394ba90a
| 23.522522
| 0
| 8
| false
| false
| false
| true
| 0.60769
| 0.701597
| 70.159732
| 0.499155
| 28.120615
| 0.073263
| 7.326284
| 0.259228
| 1.230425
| 0.377906
| 4.638281
| 0.366938
| 29.659796
| false
| false
|
2024-09-29
|
2024-09-29
| 0
|
yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_gp_8b-table-0.002
|
|
yifAI_Llama-3-8B-Instruct-SPPO-score-Iter3_gp_8b-table-0.002_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
[yifAI/Llama-3-8B-Instruct-SPPO-score-Iter3_gp_8b-table-0.002](https://huggingface.co/yifAI/Llama-3-8B-Instruct-SPPO-score-Iter3_gp_8b-table-0.002) [📑](https://huggingface.co/datasets/open-llm-leaderboard/yifAI__Llama-3-8B-Instruct-SPPO-score-Iter3_gp_8b-table-0.002-details)
|
yifAI/Llama-3-8B-Instruct-SPPO-score-Iter3_gp_8b-table-0.002
|
7a046b74179225d6055dd8aa601b5234f817b1e5
| 22.624782
| 0
| 8
| false
| false
| false
| true
| 0.672016
| 0.648966
| 64.896586
| 0.491452
| 27.281064
| 0.068731
| 6.873112
| 0.261745
| 1.565996
| 0.389875
| 7.134375
| 0.351978
| 27.997562
| false
| false
|
2024-09-30
| 0
|
Removed
|
||
ylalain_ECE-PRYMMAL-YL-1B-SLERP-V8_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Qwen2ForCausalLM
|
[ylalain/ECE-PRYMMAL-YL-1B-SLERP-V8](https://huggingface.co/ylalain/ECE-PRYMMAL-YL-1B-SLERP-V8) [📑](https://huggingface.co/datasets/open-llm-leaderboard/ylalain__ECE-PRYMMAL-YL-1B-SLERP-V8-details)
|
ylalain/ECE-PRYMMAL-YL-1B-SLERP-V8
|
2c00dbc74e55d42fbc8b08f474fb9568f820edb9
| 9.604139
|
apache-2.0
| 0
| 1
| true
| false
| false
| false
| 0.548428
| 0.150527
| 15.052727
| 0.397557
| 15.175392
| 0
| 0
| 0.28943
| 5.257271
| 0.387458
| 6.765625
| 0.238364
| 15.373818
| false
| false
|
2024-11-13
|
2024-11-13
| 0
|
ylalain/ECE-PRYMMAL-YL-1B-SLERP-V8
|
ymcki_gemma-2-2b-ORPO-jpn-it-abliterated-18_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Gemma2ForCausalLM
|
[ymcki/gemma-2-2b-ORPO-jpn-it-abliterated-18](https://huggingface.co/ymcki/gemma-2-2b-ORPO-jpn-it-abliterated-18) [📑](https://huggingface.co/datasets/open-llm-leaderboard/ymcki__gemma-2-2b-ORPO-jpn-it-abliterated-18-details)
|
ymcki/gemma-2-2b-ORPO-jpn-it-abliterated-18
|
aed2a9061ffa21beaec0d617a9605e160136aab4
| 14.633781
|
gemma
| 0
| 2
| true
| false
| false
| true
| 6.200402
| 0.463095
| 46.309459
| 0.40529
| 16.301992
| 0.003776
| 0.377644
| 0.288591
| 5.145414
| 0.375427
| 4.728385
| 0.234458
| 14.93979
| false
| false
|
2024-10-30
|
2024-11-16
| 3
|
google/gemma-2-2b
|
ymcki_gemma-2-2b-ORPO-jpn-it-abliterated-18-merge_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Gemma2ForCausalLM
|
[ymcki/gemma-2-2b-ORPO-jpn-it-abliterated-18-merge](https://huggingface.co/ymcki/gemma-2-2b-ORPO-jpn-it-abliterated-18-merge) [📑](https://huggingface.co/datasets/open-llm-leaderboard/ymcki__gemma-2-2b-ORPO-jpn-it-abliterated-18-merge-details)
|
ymcki/gemma-2-2b-ORPO-jpn-it-abliterated-18-merge
|
b72be0a7879f0d82cb2024cfc1d02c370ce3efe8
| 15.737663
|
gemma
| 0
| 2
| true
| false
| false
| true
| 1.98799
| 0.521821
| 52.182099
| 0.414689
| 17.348337
| 0.008308
| 0.830816
| 0.283557
| 4.474273
| 0.351396
| 3.357813
| 0.246094
| 16.232639
| false
| false
|
2024-10-30
|
2024-11-16
| 3
|
google/gemma-2-2b
|
ymcki_gemma-2-2b-jpn-it-abliterated-17_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Gemma2ForCausalLM
|
[ymcki/gemma-2-2b-jpn-it-abliterated-17](https://huggingface.co/ymcki/gemma-2-2b-jpn-it-abliterated-17) [📑](https://huggingface.co/datasets/open-llm-leaderboard/ymcki__gemma-2-2b-jpn-it-abliterated-17-details)
|
ymcki/gemma-2-2b-jpn-it-abliterated-17
|
e6f82b93dae0b8207aa3252ab4157182e2610787
| 15.002982
|
gemma
| 1
| 2
| true
| false
| false
| true
| 1.104509
| 0.508157
| 50.815724
| 0.407627
| 16.234749
| 0
| 0
| 0.271812
| 2.908277
| 0.370062
| 3.891146
| 0.245512
| 16.167996
| false
| false
|
2024-10-16
|
2024-10-18
| 3
|
google/gemma-2-2b
|
ymcki_gemma-2-2b-jpn-it-abliterated-17-18-24_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Gemma2ForCausalLM
|
[ymcki/gemma-2-2b-jpn-it-abliterated-17-18-24](https://huggingface.co/ymcki/gemma-2-2b-jpn-it-abliterated-17-18-24) [📑](https://huggingface.co/datasets/open-llm-leaderboard/ymcki__gemma-2-2b-jpn-it-abliterated-17-18-24-details)
|
ymcki/gemma-2-2b-jpn-it-abliterated-17-18-24
|
38f56fcb99bd64278a1d90dd23aea527036329a0
| 14.019765
|
gemma
| 0
| 2
| true
| false
| false
| true
| 0.704859
| 0.505484
| 50.548434
| 0.381236
| 13.114728
| 0
| 0
| 0.28104
| 4.138702
| 0.350156
| 2.069531
| 0.228225
| 14.247193
| false
| false
|
2024-11-06
|
2024-11-06
| 3
|
google/gemma-2-2b
|
ymcki_gemma-2-2b-jpn-it-abliterated-17-ORPO_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Gemma2ForCausalLM
|
[ymcki/gemma-2-2b-jpn-it-abliterated-17-ORPO](https://huggingface.co/ymcki/gemma-2-2b-jpn-it-abliterated-17-ORPO) [📑](https://huggingface.co/datasets/open-llm-leaderboard/ymcki__gemma-2-2b-jpn-it-abliterated-17-ORPO-details)
|
ymcki/gemma-2-2b-jpn-it-abliterated-17-ORPO
|
531b2e2043285cb40cd0433f5ad43441f8ac6b6c
| 14.516851
|
gemma
| 1
| 2
| true
| false
| false
| true
| 9.681597
| 0.474785
| 47.478468
| 0.389798
| 14.389413
| 0.042296
| 4.229607
| 0.274329
| 3.243848
| 0.37676
| 4.528385
| 0.219082
| 13.231383
| false
| false
|
2024-10-18
|
2024-10-27
| 3
|
google/gemma-2-2b
|
ymcki_gemma-2-2b-jpn-it-abliterated-17-ORPO-alpaca_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Gemma2ForCausalLM
|
[ymcki/gemma-2-2b-jpn-it-abliterated-17-ORPO-alpaca](https://huggingface.co/ymcki/gemma-2-2b-jpn-it-abliterated-17-ORPO-alpaca) [📑](https://huggingface.co/datasets/open-llm-leaderboard/ymcki__gemma-2-2b-jpn-it-abliterated-17-ORPO-alpaca-details)
|
ymcki/gemma-2-2b-jpn-it-abliterated-17-ORPO-alpaca
|
5503b5e892be463fa4b1d265b8ba9ba4304af012
| 12.001731
|
gemma
| 2
| 2
| true
| false
| false
| true
| 1.184666
| 0.306473
| 30.647349
| 0.40716
| 16.922412
| 0.000755
| 0.075529
| 0.269295
| 2.572707
| 0.396917
| 7.914583
| 0.2249
| 13.877807
| false
| false
|
2024-10-27
|
2024-10-27
| 3
|
google/gemma-2-2b
|
ymcki_gemma-2-2b-jpn-it-abliterated-18_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Gemma2ForCausalLM
|
[ymcki/gemma-2-2b-jpn-it-abliterated-18](https://huggingface.co/ymcki/gemma-2-2b-jpn-it-abliterated-18) [📑](https://huggingface.co/datasets/open-llm-leaderboard/ymcki__gemma-2-2b-jpn-it-abliterated-18-details)
|
ymcki/gemma-2-2b-jpn-it-abliterated-18
|
c50b85f9b60b444f85fe230b8d77fcbc7b18ef91
| 15.503245
|
gemma
| 1
| 2
| true
| false
| false
| true
| 1.052664
| 0.517525
| 51.752461
| 0.413219
| 17.143415
| 0
| 0
| 0.27349
| 3.131991
| 0.374156
| 4.269531
| 0.250499
| 16.722074
| false
| false
|
2024-10-15
|
2024-10-18
| 3
|
google/gemma-2-2b
|
ymcki_gemma-2-2b-jpn-it-abliterated-18-ORPO_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Gemma2ForCausalLM
|
[ymcki/gemma-2-2b-jpn-it-abliterated-18-ORPO](https://huggingface.co/ymcki/gemma-2-2b-jpn-it-abliterated-18-ORPO) [📑](https://huggingface.co/datasets/open-llm-leaderboard/ymcki__gemma-2-2b-jpn-it-abliterated-18-ORPO-details)
|
ymcki/gemma-2-2b-jpn-it-abliterated-18-ORPO
|
b9f41f53827b8a5a600546b41f63023bf84617a3
| 14.943472
|
gemma
| 0
| 2
| true
| false
| false
| true
| 1.610377
| 0.474235
| 47.423503
| 0.403894
| 16.538079
| 0.035498
| 3.549849
| 0.261745
| 1.565996
| 0.395333
| 7.416667
| 0.218501
| 13.166741
| false
| false
|
2024-10-22
|
2024-10-22
| 3
|
google/gemma-2-2b
|
ymcki_gemma-2-2b-jpn-it-abliterated-24_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Gemma2ForCausalLM
|
[ymcki/gemma-2-2b-jpn-it-abliterated-24](https://huggingface.co/ymcki/gemma-2-2b-jpn-it-abliterated-24) [📑](https://huggingface.co/datasets/open-llm-leaderboard/ymcki__gemma-2-2b-jpn-it-abliterated-24-details)
|
ymcki/gemma-2-2b-jpn-it-abliterated-24
|
06c129ba5261ee88e32035c88f90ca11d835175d
| 15.604076
|
gemma
| 0
| 2
| true
| false
| false
| true
| 0.810442
| 0.497866
| 49.786566
| 0.41096
| 16.77259
| 0
| 0
| 0.277685
| 3.691275
| 0.39149
| 7.002865
| 0.24734
| 16.371158
| false
| false
|
2024-10-24
|
2024-10-25
| 3
|
google/gemma-2-2b
|
yuvraj17_Llama3-8B-SuperNova-Spectrum-Hermes-DPO_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
[yuvraj17/Llama3-8B-SuperNova-Spectrum-Hermes-DPO](https://huggingface.co/yuvraj17/Llama3-8B-SuperNova-Spectrum-Hermes-DPO) [📑](https://huggingface.co/datasets/open-llm-leaderboard/yuvraj17__Llama3-8B-SuperNova-Spectrum-Hermes-DPO-details)
|
yuvraj17/Llama3-8B-SuperNova-Spectrum-Hermes-DPO
|
0da9f780f7dd94ed1e10c8d3e082472ff2922177
| 18.075579
|
apache-2.0
| 0
| 8
| true
| false
| false
| true
| 0.97203
| 0.46909
| 46.908979
| 0.439987
| 21.238563
| 0.055891
| 5.589124
| 0.302013
| 6.935123
| 0.401219
| 9.61901
| 0.263464
| 18.162677
| false
| false
|
2024-09-24
|
2024-09-30
| 0
|
yuvraj17/Llama3-8B-SuperNova-Spectrum-Hermes-DPO
|
yuvraj17_Llama3-8B-SuperNova-Spectrum-dare_ties_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
LlamaForCausalLM
|
[yuvraj17/Llama3-8B-SuperNova-Spectrum-dare_ties](https://huggingface.co/yuvraj17/Llama3-8B-SuperNova-Spectrum-dare_ties) [📑](https://huggingface.co/datasets/open-llm-leaderboard/yuvraj17__Llama3-8B-SuperNova-Spectrum-dare_ties-details)
|
yuvraj17/Llama3-8B-SuperNova-Spectrum-dare_ties
|
998d15b32900bc230727c8a7984e005f611723e9
| 19.134801
|
apache-2.0
| 0
| 8
| true
| false
| false
| false
| 0.914144
| 0.401271
| 40.127085
| 0.461579
| 23.492188
| 0.082326
| 8.232628
| 0.275168
| 3.355705
| 0.421094
| 11.003385
| 0.35738
| 28.597813
| true
| false
|
2024-09-22
|
2024-09-23
| 1
|
yuvraj17/Llama3-8B-SuperNova-Spectrum-dare_ties (Merge)
|
yuvraj17_Llama3-8B-abliterated-Spectrum-slerp_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/yuvraj17/Llama3-8B-abliterated-Spectrum-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">yuvraj17/Llama3-8B-abliterated-Spectrum-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/yuvraj17__Llama3-8B-abliterated-Spectrum-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
yuvraj17/Llama3-8B-abliterated-Spectrum-slerp
|
28789950975ecf5aac846c3f2c0a5d6841651ee6
| 17.687552
|
apache-2.0
| 0
| 8
| true
| false
| false
| false
| 0.82666
| 0.288488
| 28.848788
| 0.497791
| 28.54693
| 0.058157
| 5.81571
| 0.301174
| 6.823266
| 0.399823
| 11.011198
| 0.325715
| 25.079418
| true
| false
|
2024-09-22
|
2024-09-23
| 1
|
yuvraj17/Llama3-8B-abliterated-Spectrum-slerp (Merge)
|
zake7749_gemma-2-2b-it-chinese-kyara-dpo_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zake7749/gemma-2-2b-it-chinese-kyara-dpo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zake7749/gemma-2-2b-it-chinese-kyara-dpo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zake7749__gemma-2-2b-it-chinese-kyara-dpo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zake7749/gemma-2-2b-it-chinese-kyara-dpo
|
bbc011dae0416c1664a0287f3a7a0f9563deac91
| 19.334585
|
gemma
| 7
| 2
| true
| false
| false
| false
| 1.279309
| 0.538208
| 53.820751
| 0.425746
| 19.061804
| 0.066465
| 6.646526
| 0.266779
| 2.237136
| 0.457563
| 16.761979
| 0.257314
| 17.479314
| false
| false
|
2024-08-18
|
2024-10-17
| 1
|
zake7749/gemma-2-2b-it-chinese-kyara-dpo (Merge)
|
zelk12_Gemma-2-TM-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/Gemma-2-TM-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/Gemma-2-TM-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__Gemma-2-TM-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/Gemma-2-TM-9B
|
42366d605e6bdad354a5632547e37d34d300ff7a
| 30.151929
| 0
| 10
| false
| false
| false
| true
| 1.967893
| 0.804462
| 80.446216
| 0.598659
| 42.049491
| 0
| 0
| 0.346477
| 12.863535
| 0.41524
| 11.238281
| 0.408826
| 34.314051
| false
| false
|
2024-11-06
|
2024-11-06
| 1
|
zelk12/Gemma-2-TM-9B (Merge)
|
|
zelk12_MT-Gen1-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT-Gen1-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT-Gen1-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT-Gen1-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT-Gen1-gemma-2-9B
|
b78f8883614cbbdf182ebb4acf8a8c124bc782ae
| 33.041356
| 0
| 10
| false
| false
| false
| true
| 3.362746
| 0.788625
| 78.862529
| 0.61
| 44.011247
| 0.133686
| 13.36858
| 0.346477
| 12.863535
| 0.421688
| 11.577604
| 0.438082
| 37.564642
| false
| false
|
2024-10-23
|
2024-10-23
| 1
|
zelk12/MT-Gen1-gemma-2-9B (Merge)
|
|
zelk12_MT-Gen2-GI-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT-Gen2-GI-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT-Gen2-GI-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT-Gen2-GI-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT-Gen2-GI-gemma-2-9B
|
e970fbcbf974f4626dcc6db7d2b02d4f24c72744
| 33.315847
| 1
| 10
| false
| false
| false
| true
| 1.868506
| 0.791398
| 79.139794
| 0.609556
| 44.002591
| 0.133686
| 13.36858
| 0.350671
| 13.422819
| 0.428323
| 12.673698
| 0.435588
| 37.287603
| false
| false
|
2024-11-10
|
2024-11-28
| 1
|
zelk12/MT-Gen2-GI-gemma-2-9B (Merge)
|
|
zelk12_MT-Gen2-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT-Gen2-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT-Gen2-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT-Gen2-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT-Gen2-gemma-2-9B
|
c723f8b9b7334fddd1eb8b6e5230b76fb18139a5
| 33.644495
| 1
| 10
| false
| false
| false
| true
| 1.989448
| 0.790749
| 79.074855
| 0.610049
| 44.107782
| 0.148792
| 14.879154
| 0.346477
| 12.863535
| 0.432292
| 13.303125
| 0.438747
| 37.63852
| false
| false
|
2024-11-10
|
2024-11-10
| 1
|
zelk12/MT-Gen2-gemma-2-9B (Merge)
|
|
zelk12_MT-Gen3-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT-Gen3-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT-Gen3-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT-Gen3-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT-Gen3-gemma-2-9B
|
84627594655776ce67f1e01233113b658333fa71
| 32.936869
| 2
| 10
| false
| false
| false
| true
| 1.813248
| 0.802014
| 80.201421
| 0.609711
| 43.950648
| 0.114048
| 11.404834
| 0.348993
| 13.199105
| 0.421688
| 11.577604
| 0.435588
| 37.287603
| false
| false
|
2024-11-28
|
2024-11-30
| 1
|
zelk12/MT-Gen3-gemma-2-9B (Merge)
|
|
zelk12_MT-Merge-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT-Merge-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT-Merge-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT-Merge-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT-Merge-gemma-2-9B
|
f4c3b001bc8692bcbbd7005b6f8db048e651aa46
| 33.393208
| 3
| 10
| false
| false
| false
| true
| 3.219056
| 0.803538
| 80.353795
| 0.611838
| 44.320842
| 0.13142
| 13.141994
| 0.348154
| 13.087248
| 0.425625
| 12.103125
| 0.43617
| 37.352246
| false
| false
|
2024-10-22
|
2024-10-22
| 1
|
zelk12/MT-Merge-gemma-2-9B (Merge)
|
|
zelk12_MT-Merge1-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT-Merge1-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT-Merge1-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT-Merge1-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT-Merge1-gemma-2-9B
|
71bb4577c877715f3f6646a224b184544639c856
| 33.130536
| 1
| 10
| false
| false
| false
| true
| 4.036662
| 0.788625
| 78.862529
| 0.61
| 44.058246
| 0.126888
| 12.688822
| 0.35151
| 13.534676
| 0.424385
| 12.148177
| 0.437417
| 37.490765
| false
| false
|
2024-11-07
|
2024-11-07
| 1
|
zelk12/MT-Merge1-gemma-2-9B (Merge)
|
|
zelk12_MT-Merge2-MU-gemma-2-MTg2MT1g2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT-Merge2-MU-gemma-2-MTg2MT1g2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT-Merge2-MU-gemma-2-MTg2MT1g2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT-Merge2-MU-gemma-2-MTg2MT1g2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT-Merge2-MU-gemma-2-MTg2MT1g2-9B
|
6d73ec2204800f7978c376567d3c6361c0a072cd
| 33.557528
| 2
| 10
| false
| false
| false
| true
| 1.844885
| 0.795595
| 79.559458
| 0.608389
| 43.8402
| 0.138218
| 13.821752
| 0.350671
| 13.422819
| 0.432229
| 13.228646
| 0.437251
| 37.472296
| false
| false
|
2024-11-25
|
2024-11-28
| 1
|
zelk12/MT-Merge2-MU-gemma-2-MTg2MT1g2-9B (Merge)
|
|
zelk12_MT-Merge2-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT-Merge2-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT-Merge2-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT-Merge2-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT-Merge2-gemma-2-9B
|
a695e722e6fab77852f9fe59bbc4d69fe23c4208
| 33.498975
| 2
| 10
| false
| false
| false
| true
| 1.850791
| 0.787701
| 78.770108
| 0.610668
| 44.157197
| 0.155589
| 15.558912
| 0.350671
| 13.422819
| 0.421688
| 11.510938
| 0.438165
| 37.573877
| false
| false
|
2024-11-25
|
2024-11-25
| 1
|
zelk12/MT-Merge2-gemma-2-9B (Merge)
|
|
zelk12_MT-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT-gemma-2-9B
|
24e1f894517b86dd866c1a5999ced4a5924dcd90
| 30.239612
| 2
| 10
| false
| false
| false
| true
| 3.023399
| 0.796843
| 79.684349
| 0.60636
| 43.324243
| 0.003021
| 0.302115
| 0.345638
| 12.751678
| 0.407115
| 9.55599
| 0.422374
| 35.819297
| false
| false
|
2024-10-11
|
2024-10-11
| 1
|
zelk12/MT-gemma-2-9B (Merge)
|
|
zelk12_MT1-Gen1-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT1-Gen1-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT1-Gen1-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT1-Gen1-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT1-Gen1-gemma-2-9B
|
939ac6c12059a18fc1117cdb3861f46816eff2fb
| 33.232259
| 0
| 10
| false
| false
| false
| true
| 3.362485
| 0.797443
| 79.744301
| 0.611779
| 44.273282
| 0.122356
| 12.23565
| 0.34396
| 12.527964
| 0.430958
| 13.103125
| 0.437583
| 37.509235
| false
| false
|
2024-10-23
|
2024-10-24
| 1
|
zelk12/MT1-Gen1-gemma-2-9B (Merge)
|
|
zelk12_MT1-Gen2-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT1-Gen2-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT1-Gen2-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT1-Gen2-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT1-Gen2-gemma-2-9B
|
aeaca7dc7d50a425a5d3c38d7c4a7daf1c772ad4
| 33.142398
| 2
| 10
| false
| false
| false
| true
| 1.995995
| 0.798367
| 79.836722
| 0.609599
| 43.919191
| 0.113293
| 11.329305
| 0.352349
| 13.646532
| 0.428354
| 12.844271
| 0.435505
| 37.278369
| false
| false
|
2024-11-11
|
2024-11-11
| 1
|
zelk12/MT1-Gen2-gemma-2-9B (Merge)
|
|
zelk12_MT1-Gen3-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT1-Gen3-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT1-Gen3-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT1-Gen3-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT1-Gen3-gemma-2-9B
|
5cc4ee1c70f08a5b1a195d43f044d9bf6fca29f5
| 32.964927
| 0
| 10
| false
| false
| false
| true
| 1.944877
| 0.795969
| 79.596914
| 0.610155
| 43.990306
| 0.117825
| 11.782477
| 0.348993
| 13.199105
| 0.424323
| 12.007031
| 0.434924
| 37.213726
| false
| false
|
2024-12-01
|
2024-12-01
| 1
|
zelk12/MT1-Gen3-gemma-2-9B (Merge)
|
|
zelk12_MT1-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT1-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT1-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT1-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT1-gemma-2-9B
|
3a5e77518ca9c3c8ea2edac4c03bc220ee91f3ed
| 33.633829
| 1
| 10
| false
| false
| false
| true
| 3.345719
| 0.79467
| 79.467036
| 0.610875
| 44.161526
| 0.149547
| 14.954683
| 0.345638
| 12.751678
| 0.432229
| 13.161979
| 0.435755
| 37.306073
| false
| false
|
2024-10-12
|
2024-10-14
| 1
|
zelk12/MT1-gemma-2-9B (Merge)
|
|
zelk12_MT2-Gen1-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT2-Gen1-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT2-Gen1-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT2-Gen1-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT2-Gen1-gemma-2-9B
|
167abf8eb4ea01fecd42dc32ad68160c51a8685a
| 32.460223
| 0
| 10
| false
| false
| false
| true
| 3.38321
| 0.785578
| 78.557782
| 0.61008
| 44.141103
| 0.101208
| 10.120846
| 0.343121
| 12.416107
| 0.424323
| 12.007031
| 0.437666
| 37.518469
| false
| false
|
2024-10-24
|
2024-10-27
| 1
|
zelk12/MT2-Gen1-gemma-2-9B (Merge)
|
|
zelk12_MT2-Gen2-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT2-Gen2-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT2-Gen2-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT2-Gen2-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT2-Gen2-gemma-2-9B
|
24c487499b5833424ffb9932eed838bb254f61b4
| 33.471172
| 3
| 10
| false
| false
| false
| true
| 2.037441
| 0.7889
| 78.890012
| 0.609292
| 44.044503
| 0.148036
| 14.803625
| 0.346477
| 12.863535
| 0.427021
| 12.577604
| 0.43883
| 37.647754
| false
| false
|
2024-11-12
|
2024-11-12
| 1
|
zelk12/MT2-Gen2-gemma-2-9B (Merge)
|
|
zelk12_MT2-Gen3-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT2-Gen3-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT2-Gen3-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT2-Gen3-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT2-Gen3-gemma-2-9B
|
bb750c2b76328c6dbc9adf9ae3d09551f3723758
| 32.967895
| 0
| 10
| false
| false
| false
| true
| 1.924377
| 0.781007
| 78.100662
| 0.610477
| 44.007274
| 0.132931
| 13.293051
| 0.346477
| 12.863535
| 0.423083
| 12.052083
| 0.437417
| 37.490765
| false
| false
|
2024-12-04
|
2024-12-04
| 1
|
zelk12/MT2-Gen3-gemma-2-9B (Merge)
|
|
zelk12_MT2-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT2-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT2-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT2-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT2-gemma-2-9B
|
d20d7169ce0f53d586504c50b4b7dc470bf8a781
| 33.2825
| 1
| 10
| false
| false
| false
| true
| 3.19411
| 0.788575
| 78.857542
| 0.611511
| 44.167481
| 0.147281
| 14.728097
| 0.347315
| 12.975391
| 0.421656
| 11.540365
| 0.436835
| 37.426123
| false
| false
|
2024-10-14
|
2024-10-15
| 1
|
zelk12/MT2-gemma-2-9B (Merge)
|
|
zelk12_MT3-Gen1-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT3-Gen1-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT3-Gen1-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT3-Gen1-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT3-Gen1-gemma-2-9B
|
cd78df9e67e2e710d8d305f5a03a92c01b1b425d
| 31.054845
| 1
| 10
| false
| false
| false
| true
| 3.113666
| 0.783779
| 78.377926
| 0.610676
| 44.119495
| 0.032477
| 3.247734
| 0.346477
| 12.863535
| 0.415115
| 10.75599
| 0.43268
| 36.964391
| false
| false
|
2024-10-24
|
2024-10-28
| 1
|
zelk12/MT3-Gen1-gemma-2-9B (Merge)
|
|
zelk12_MT3-Gen2-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT3-Gen2-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT3-Gen2-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT3-Gen2-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT3-Gen2-gemma-2-9B
|
e4ef057d20751d89934025e9088ba98d89b921b5
| 30.963626
| 1
| 10
| false
| false
| false
| true
| 1.919108
| 0.784329
| 78.432891
| 0.609147
| 43.940226
| 0.020393
| 2.039275
| 0.357383
| 14.317673
| 0.411115
| 10.022656
| 0.433261
| 37.029034
| false
| false
|
2024-11-20
|
2024-11-20
| 1
|
zelk12/MT3-Gen2-gemma-2-9B (Merge)
|
|
zelk12_MT3-Gen3-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT3-Gen3-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT3-Gen3-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT3-Gen3-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT3-Gen3-gemma-2-9B
|
4ad54d6295f6364aa87f7aaa2a7bd112fb92ec00
| 32.359994
| 0
| 10
| false
| false
| false
| true
| 1.904463
| 0.785628
| 78.562769
| 0.608889
| 43.78374
| 0.090634
| 9.063444
| 0.35151
| 13.534676
| 0.42575
| 12.51875
| 0.430269
| 36.696587
| false
| false
|
2024-12-07
|
2024-12-07
| 1
|
zelk12/MT3-Gen3-gemma-2-9B (Merge)
|
|
zelk12_MT3-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT3-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT3-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT3-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT3-gemma-2-9B
|
d501b6ea59896fac3dc0a623501a5493b3573cde
| 32.352524
| 1
| 10
| false
| false
| false
| true
| 3.136653
| 0.778609
| 77.860854
| 0.613078
| 44.248465
| 0.104985
| 10.498489
| 0.344799
| 12.639821
| 0.424292
| 11.903125
| 0.43268
| 36.964391
| false
| false
|
2024-10-15
|
2024-10-16
| 1
|
zelk12/MT3-gemma-2-9B (Merge)
|
|
zelk12_MT4-Gen1-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT4-Gen1-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT4-Gen1-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT4-Gen1-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT4-Gen1-gemma-2-9B
|
6ed2c66246c7f354decfd3579acb534dc4b0b48c
| 33.544994
| 0
| 10
| false
| false
| false
| true
| 2.103561
| 0.7895
| 78.949964
| 0.609383
| 44.009524
| 0.150302
| 15.030211
| 0.34396
| 12.527964
| 0.432229
| 13.095313
| 0.438913
| 37.656989
| false
| false
|
2024-10-25
|
2024-10-29
| 1
|
zelk12/MT4-Gen1-gemma-2-9B (Merge)
|
|
zelk12_MT4-Gen2-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT4-Gen2-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT4-Gen2-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT4-Gen2-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT4-Gen2-gemma-2-9B
|
4d61a5799b11641a24e8b0f3eda0e987ff392089
| 33.794732
| 1
| 10
| false
| false
| false
| true
| 1.977047
| 0.805062
| 80.506168
| 0.610835
| 44.176658
| 0.1571
| 15.70997
| 0.345638
| 12.751678
| 0.425656
| 12.207031
| 0.436752
| 37.416888
| false
| false
|
2024-11-22
|
2024-11-22
| 1
|
zelk12/MT4-Gen2-gemma-2-9B (Merge)
|
|
zelk12_MT4-Gen3-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT4-Gen3-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT4-Gen3-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT4-Gen3-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT4-Gen3-gemma-2-9B
|
f93026d28ca1707e8c21620be8558eed6be43b1c
| 33.239752
| 0
| 10
| false
| false
| false
| true
| 1.958701
| 0.784054
| 78.405409
| 0.608711
| 43.89439
| 0.151057
| 15.10574
| 0.34396
| 12.527964
| 0.424323
| 11.940365
| 0.438082
| 37.564642
| false
| false
|
2024-12-08
|
2024-12-08
| 1
|
zelk12/MT4-Gen3-gemma-2-9B (Merge)
|
|
zelk12_MT4-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT4-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT4-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT4-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT4-gemma-2-9B
|
2167ea02baf9145a697a7d828a17c75b86e5e282
| 33.447349
| 0
| 10
| false
| false
| false
| true
| 3.155259
| 0.776161
| 77.616059
| 0.607314
| 43.553827
| 0.173716
| 17.371601
| 0.338087
| 11.744966
| 0.430927
| 12.999219
| 0.436586
| 37.398419
| false
| false
|
2024-10-16
|
2024-10-20
| 1
|
zelk12/MT4-gemma-2-9B (Merge)
|
|
zelk12_MT5-Gen1-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT5-Gen1-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT5-Gen1-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT5-Gen1-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT5-Gen1-gemma-2-9B
|
0291b776e80f38381788cd8f1fb2c3435ad891b5
| 31.897632
| 0
| 10
| false
| false
| false
| true
| 2.017253
| 0.78313
| 78.312987
| 0.611048
| 44.183335
| 0.068731
| 6.873112
| 0.347315
| 12.975391
| 0.420385
| 11.614844
| 0.436835
| 37.426123
| false
| false
|
2024-10-25
|
2024-10-31
| 1
|
zelk12/MT5-Gen1-gemma-2-9B (Merge)
|
|
zelk12_MT5-Gen2-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT5-Gen2-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT5-Gen2-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT5-Gen2-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT5-Gen2-gemma-2-9B
|
3ee2822fcba6708bd9032b79249a2789e5996b6a
| 32.600392
| 1
| 10
| false
| false
| false
| true
| 1.858381
| 0.796244
| 79.624397
| 0.610541
| 44.113215
| 0.103474
| 10.347432
| 0.35151
| 13.534676
| 0.416292
| 10.436458
| 0.437916
| 37.546173
| false
| false
|
2024-11-23
|
2024-11-23
| 1
|
zelk12/MT5-Gen2-gemma-2-9B (Merge)
|
|
zelk12_MT5-Gen3-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT5-Gen3-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT5-Gen3-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT5-Gen3-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT5-Gen3-gemma-2-9B
|
4b3811c689fec5c9cc483bb1ed696734e5e88fcf
| 32.801838
| 0
| 10
| false
| false
| false
| true
| 1.937333
| 0.78253
| 78.253035
| 0.609049
| 43.885913
| 0.115559
| 11.555891
| 0.35151
| 13.534676
| 0.423052
| 12.08151
| 0.4375
| 37.5
| false
| false
|
2024-12-08
|
2024-12-08
| 1
|
zelk12/MT5-Gen3-gemma-2-9B (Merge)
|
|
zelk12_MT5-gemma-2-9B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/MT5-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT5-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT5-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/MT5-gemma-2-9B
|
b627ae7d796b1ae85b59c55e0e043b8d3ae73d83
| 32.595305
| 0
| 10
| false
| false
| false
| true
| 3.26983
| 0.804787
| 80.478685
| 0.611223
| 44.271257
| 0.095166
| 9.516616
| 0.343121
| 12.416107
| 0.420385
| 11.48151
| 0.436669
| 37.407654
| false
| false
|
2024-10-19
|
2024-10-21
| 1
|
zelk12/MT5-gemma-2-9B (Merge)
|
|
zelk12_recoilme-gemma-2-Ataraxy-9B-v0.1_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-Ataraxy-9B-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1
|
b4208ddf6c741884c16c77b9433d9ead8f216354
| 30.344893
| 2
| 10
| false
| false
| false
| true
| 3.443191
| 0.764895
| 76.489492
| 0.607451
| 43.706516
| 0.013595
| 1.359517
| 0.349832
| 13.310962
| 0.413625
| 10.303125
| 0.432098
| 36.899749
| false
| false
|
2024-10-03
|
2024-10-03
| 1
|
zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1 (Merge)
|
|
zelk12_recoilme-gemma-2-Ataraxy-9B-v0.1-t0.25_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.25" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.25</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-Ataraxy-9B-v0.1-t0.25-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.25
|
e652c9e07265526851dad994f4640aa265b9ab56
| 33.300246
| 1
| 10
| false
| false
| false
| true
| 3.194991
| 0.770665
| 77.066517
| 0.607543
| 43.85035
| 0.155589
| 15.558912
| 0.343121
| 12.416107
| 0.43226
| 13.132552
| 0.439993
| 37.777039
| false
| false
|
2024-10-04
|
2024-10-04
| 1
|
zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.25 (Merge)
|
|
zelk12_recoilme-gemma-2-Ataraxy-9B-v0.1-t0.75_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.75" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.75</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-Ataraxy-9B-v0.1-t0.75-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.75
|
eb0e589291630ba20328db650f74af949d217a97
| 28.421762
| 0
| 10
| false
| false
| false
| true
| 3.751453
| 0.720806
| 72.080635
| 0.59952
| 42.487153
| 0
| 0
| 0.349832
| 13.310962
| 0.395115
| 7.75599
| 0.414063
| 34.895833
| false
| false
|
2024-10-04
|
2024-10-04
| 1
|
zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.75 (Merge)
|
|
zelk12_recoilme-gemma-2-Ataraxy-9B-v0.2_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/recoilme-gemma-2-Ataraxy-9B-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/recoilme-gemma-2-Ataraxy-9B-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-Ataraxy-9B-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/recoilme-gemma-2-Ataraxy-9B-v0.2
|
76f56b25bf6d8704282f8c77bfda28ca384883bc
| 30.113979
| 1
| 10
| false
| false
| false
| true
| 3.413675
| 0.759999
| 75.999902
| 0.606626
| 43.633588
| 0.012085
| 1.208459
| 0.348154
| 13.087248
| 0.410958
| 9.836458
| 0.432264
| 36.918218
| false
| false
|
2024-10-07
|
2024-10-11
| 1
|
zelk12/recoilme-gemma-2-Ataraxy-9B-v0.2 (Merge)
|
|
zelk12_recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1
|
1e3e623e9f0b386bfd967c629dd39c87daef5bed
| 31.626376
| 1
| 10
| false
| false
| false
| true
| 6.461752
| 0.761523
| 76.152276
| 0.609878
| 43.941258
| 0.073263
| 7.326284
| 0.341443
| 12.192394
| 0.431021
| 13.310937
| 0.431516
| 36.835106
| false
| false
|
2024-10-07
|
2024-10-07
| 1
|
zelk12/recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1 (Merge)
|
|
zelk12_recoilme-gemma-2-Ifable-9B-v0.1_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/recoilme-gemma-2-Ifable-9B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/recoilme-gemma-2-Ifable-9B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-Ifable-9B-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/recoilme-gemma-2-Ifable-9B-v0.1
|
8af6620b39c9a36239879b6b2bd88f66e9e9d930
| 32.254423
| 0
| 10
| false
| false
| false
| true
| 6.542869
| 0.794396
| 79.439554
| 0.60644
| 43.39057
| 0.09139
| 9.138973
| 0.35151
| 13.534676
| 0.420229
| 11.095313
| 0.432347
| 36.927453
| false
| false
|
2024-10-07
|
2024-10-07
| 1
|
zelk12/recoilme-gemma-2-Ifable-9B-v0.1 (Merge)
|
|
zelk12_recoilme-gemma-2-psy10k-mental_healt-9B-v0.1_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zelk12/recoilme-gemma-2-psy10k-mental_healt-9B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/recoilme-gemma-2-psy10k-mental_healt-9B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-psy10k-mental_healt-9B-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zelk12/recoilme-gemma-2-psy10k-mental_healt-9B-v0.1
|
ced039b03be6f65ac0f713efcee76c6534e65639
| 32.448061
| 0
| 10
| false
| false
| false
| true
| 3.13222
| 0.744537
| 74.453672
| 0.597759
| 42.132683
| 0.180514
| 18.05136
| 0.34396
| 12.527964
| 0.429469
| 12.183594
| 0.418052
| 35.339096
| false
| false
|
2024-10-07
|
2024-10-07
| 1
|
zelk12/recoilme-gemma-2-psy10k-mental_healt-9B-v0.1 (Merge)
|
|
zetasepic_Qwen2.5-72B-Instruct-abliterated_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/zetasepic/Qwen2.5-72B-Instruct-abliterated" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zetasepic/Qwen2.5-72B-Instruct-abliterated</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zetasepic__Qwen2.5-72B-Instruct-abliterated-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zetasepic/Qwen2.5-72B-Instruct-abliterated
|
af94b3c05c9857dbac73afb1cbce00e4833ec9ef
| 45.293139
|
other
| 11
| 72
| true
| false
| false
| false
| 18.809182
| 0.715261
| 71.526106
| 0.715226
| 59.912976
| 0.46148
| 46.148036
| 0.406879
| 20.917226
| 0.471917
| 19.122917
| 0.587184
| 54.131575
| false
| false
|
2024-10-01
|
2024-11-08
| 2
|
Qwen/Qwen2.5-72B
|
zhengr_MixTAO-7Bx2-MoE-v8.1_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MixtralForCausalLM
|
<a target="_blank" href="https://huggingface.co/zhengr/MixTAO-7Bx2-MoE-v8.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zhengr/MixTAO-7Bx2-MoE-v8.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zhengr__MixTAO-7Bx2-MoE-v8.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
zhengr/MixTAO-7Bx2-MoE-v8.1
|
828e963abf2db0f5af9ed0d4034e538fc1cf5f40
| 17.168311
|
apache-2.0
| 55
| 12
| true
| true
| false
| true
| 0.92739
| 0.418781
| 41.878106
| 0.420194
| 19.176907
| 0.066465
| 6.646526
| 0.298658
| 6.487696
| 0.397625
| 8.303125
| 0.284658
| 20.517509
| false
| false
|
2024-02-26
|
2024-06-27
| 0
|
zhengr/MixTAO-7Bx2-MoE-v8.1
|