| eval_name | Precision | Type | T | Weight type | Architecture | Model | fullname | Model sha | Average ⬆️ | Hub License | Hub ❤️ | #Params (B) | Available on the hub | MoE | Flagged | Chat Template | CO₂ cost (kg) | IFEval Raw | IFEval | BBH Raw | BBH | MATH Lvl 5 Raw | MATH Lvl 5 | GPQA Raw | GPQA | MUSR Raw | MUSR | MMLU-PRO Raw | MMLU-PRO | Merged | Official Providers | Upload To Hub Date | Submission Date | Generation | Base Model |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| DreadPoor_BaeZel-8B-LINEAR_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/DreadPoor/BaeZel-8B-LINEAR" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/BaeZel-8B-LINEAR</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__BaeZel-8B-LINEAR-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | DreadPoor/BaeZel-8B-LINEAR | 1deac3287de191794c50543d69d523f43654a803 | 30.296459 | apache-2.0 | 1 | 8 | true | false | false | true | 0.665069 | 0.737792 | 73.779239 | 0.54638 | 35.535376 | 0.178248 | 17.824773 | 0.321309 | 9.50783 | 0.422708 | 13.338542 | 0.386137 | 31.792996 | true | false | 2024-11-08 | 2024-11-08 | 1 | DreadPoor/BaeZel-8B-LINEAR (Merge) |
| DreadPoor_Condensed_Milk-8B-Model_Stock_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/DreadPoor/Condensed_Milk-8B-Model_Stock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/Condensed_Milk-8B-Model_Stock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__Condensed_Milk-8B-Model_Stock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | DreadPoor/Condensed_Milk-8B-Model_Stock | 6e5b73099b9d5a794c9c744c4c5c158b1feb8916 | 30.070583 | apache-2.0 | 1 | 8 | true | false | false | true | 0.654664 | 0.753629 | 75.362926 | 0.543486 | 35.12062 | 0.173716 | 17.371601 | 0.321309 | 9.50783 | 0.41601 | 11.101302 | 0.387633 | 31.95922 | true | false | 2024-11-27 | 2024-11-27 | 1 | DreadPoor/Condensed_Milk-8B-Model_Stock (Merge) |
| DreadPoor_CoolerCoder-8B-LINEAR_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/DreadPoor/CoolerCoder-8B-LINEAR" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/CoolerCoder-8B-LINEAR</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__CoolerCoder-8B-LINEAR-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | DreadPoor/CoolerCoder-8B-LINEAR | db14b0fa821b0b6b07802111fd19ba722344a32b | 19.148011 | | 0 | 8 | false | false | false | true | 1.444021 | 0.451929 | 45.192866 | 0.47615 | 26.365383 | 0.061934 | 6.193353 | 0.290268 | 5.369128 | 0.396354 | 7.777604 | 0.315908 | 23.989731 | false | false | 2024-11-20 | | 0 | Removed |
| DreadPoor_Damasteel-8B-LINEAR_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/DreadPoor/Damasteel-8B-LINEAR" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/Damasteel-8B-LINEAR</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__Damasteel-8B-LINEAR-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | DreadPoor/Damasteel-8B-LINEAR | cfc389c15e614b14f1d8d16740dcc183047b435a | 28.964891 | | 0 | 8 | false | false | false | true | 0.674659 | 0.738442 | 73.844178 | 0.538814 | 34.106138 | 0.166163 | 16.616314 | 0.298658 | 6.487696 | 0.42125 | 11.85625 | 0.377909 | 30.878768 | false | false | 2024-11-01 | | 0 | Removed |
| DreadPoor_Dearly_Beloved-8B-TIES_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/DreadPoor/Dearly_Beloved-8B-TIES" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/Dearly_Beloved-8B-TIES</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__Dearly_Beloved-8B-TIES-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | DreadPoor/Dearly_Beloved-8B-TIES | af6515ee730d6aa17d77687fe2c06c57fa9533fb | 26.034998 | | 0 | 8 | false | false | false | true | 0.715241 | 0.826669 | 82.666879 | 0.404983 | 16.671813 | 0.196375 | 19.637462 | 0.298658 | 6.487696 | 0.417469 | 10.45026 | 0.282663 | 20.295878 | false | false | 2024-11-22 | | 0 | Removed |
| DreadPoor_Emu_Eggs-9B-Model_Stock_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/DreadPoor/Emu_Eggs-9B-Model_Stock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/Emu_Eggs-9B-Model_Stock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__Emu_Eggs-9B-Model_Stock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | DreadPoor/Emu_Eggs-9B-Model_Stock | 3fb1b2da72f3618f6943aedfd1600df27886792a | 29.611715 | apache-2.0 | 2 | 9 | true | false | false | true | 3.08835 | 0.760698 | 76.069828 | 0.605166 | 42.783674 | 0.02568 | 2.567976 | 0.333054 | 11.073826 | 0.407083 | 9.31875 | 0.422706 | 35.856235 | true | false | 2024-10-18 | 2024-10-18 | 0 | DreadPoor/Emu_Eggs-9B-Model_Stock |
| DreadPoor_Eunoia_Vespera-8B-LINEAR_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/DreadPoor/Eunoia_Vespera-8B-LINEAR" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/Eunoia_Vespera-8B-LINEAR</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__Eunoia_Vespera-8B-LINEAR-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | DreadPoor/Eunoia_Vespera-8B-LINEAR | c674956327af664735cf39b20c7a8276dfa579f9 | 28.931156 | apache-2.0 | 2 | 8 | true | false | false | true | 0.81326 | 0.723529 | 72.352912 | 0.539931 | 34.216103 | 0.152568 | 15.256798 | 0.307047 | 7.606264 | 0.41849 | 12.611198 | 0.383893 | 31.543661 | true | false | 2024-09-22 | 2024-09-22 | 1 | DreadPoor/Eunoia_Vespera-8B-LINEAR (Merge) |
| DreadPoor_Heart_Stolen-8B-Model_Stock_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/DreadPoor/Heart_Stolen-8B-Model_Stock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/Heart_Stolen-8B-Model_Stock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__Heart_Stolen-8B-Model_Stock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | DreadPoor/Heart_Stolen-8B-Model_Stock | 6d77987af7115c7455ddb072c48316815b018999 | 29.24739 | cc-by-nc-4.0 | 5 | 8 | true | false | false | true | 0.749301 | 0.724453 | 72.445334 | 0.539544 | 34.444822 | 0.162387 | 16.238671 | 0.317114 | 8.948546 | 0.416229 | 12.361979 | 0.379405 | 31.044991 | true | false | 2024-09-09 | 2024-09-10 | 1 | DreadPoor/Heart_Stolen-8B-Model_Stock (Merge) |
| DreadPoor_Heart_Stolen-ALT-8B-Model_Stock_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/DreadPoor/Heart_Stolen-ALT-8B-Model_Stock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/Heart_Stolen-ALT-8B-Model_Stock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__Heart_Stolen-ALT-8B-Model_Stock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | DreadPoor/Heart_Stolen-ALT-8B-Model_Stock | 03d1d70cb7eb5a743468b97c9c580028df487564 | 27.754545 | | 0 | 8 | false | false | false | true | 0.735627 | 0.718358 | 71.83584 | 0.526338 | 32.354424 | 0.149547 | 14.954683 | 0.301174 | 6.823266 | 0.4055 | 9.754167 | 0.377244 | 30.804891 | false | false | 2024-09-11 | | 0 | Removed |
| DreadPoor_Irina-8B-model_stock_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/DreadPoor/Irina-8B-model_stock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/Irina-8B-model_stock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__Irina-8B-model_stock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | DreadPoor/Irina-8B-model_stock | b282e3ab449d71a31f48b8c13eb43a4435968728 | 25.32468 | | 0 | 8 | false | false | false | true | 0.745586 | 0.67994 | 67.994034 | 0.523664 | 32.08833 | 0.100453 | 10.045317 | 0.284396 | 4.58613 | 0.400292 | 8.636458 | 0.35738 | 28.597813 | false | false | 2024-08-30 | | 0 | Removed |
| DreadPoor_L3.1-BaeZel-8B-Della_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/DreadPoor/L3.1-BaeZel-8B-Della" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/L3.1-BaeZel-8B-Della</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__L3.1-BaeZel-8B-Della-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | DreadPoor/L3.1-BaeZel-8B-Della | ec61b6f5355a7f3975d80f1afac69e0407e612e5 | 26.167555 | | 0 | 8 | false | false | false | true | 0.658995 | 0.518024 | 51.80244 | 0.544845 | 35.157455 | 0.169184 | 16.918429 | 0.319631 | 9.284116 | 0.419979 | 11.597396 | 0.390209 | 32.245493 | false | false | 2024-11-15 | | 0 | Removed |
| DreadPoor_Matryoshka-8B-LINEAR_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/DreadPoor/Matryoshka-8B-LINEAR" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/Matryoshka-8B-LINEAR</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__Matryoshka-8B-LINEAR-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | DreadPoor/Matryoshka-8B-LINEAR | 20d260e6d881fcd3b4f76071797675d095ba8e98 | 29.825026 | | 0 | 8 | false | false | false | true | 0.662314 | 0.726252 | 72.62519 | 0.544428 | 35.110912 | 0.175227 | 17.522659 | 0.32047 | 9.395973 | 0.42525 | 12.45625 | 0.386553 | 31.83917 | false | false | 2024-12-02 | | 0 | Removed |
| DreadPoor_Mercury_In_Retrograde-8b-Model-Stock_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/DreadPoor/Mercury_In_Retrograde-8b-Model-Stock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/Mercury_In_Retrograde-8b-Model-Stock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__Mercury_In_Retrograde-8b-Model-Stock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | DreadPoor/Mercury_In_Retrograde-8b-Model-Stock | 6c761644a57ab267624987ec2211c4af7a51a16a | 29.236114 | | 0 | 8 | false | false | false | true | 0.673547 | 0.729624 | 72.962406 | 0.539051 | 34.384865 | 0.163142 | 16.314199 | 0.316275 | 8.836689 | 0.419885 | 11.485677 | 0.382896 | 31.432846 | false | false | 2024-12-03 | | 0 | Removed |
| DreadPoor_ONeil-model_stock-8B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/DreadPoor/ONeil-model_stock-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/ONeil-model_stock-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__ONeil-model_stock-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | DreadPoor/ONeil-model_stock-8B | d4b84956211fd57b85122fe0c6f88b2cb9a9c86a | 26.935908 | cc-by-nc-4.0 | 2 | 8 | true | false | false | true | 0.764449 | 0.678566 | 67.85662 | 0.554834 | 36.412613 | 0.101208 | 10.120846 | 0.305369 | 7.38255 | 0.417344 | 10.967969 | 0.359874 | 28.874852 | true | false | 2024-07-06 | 2024-07-15 | 1 | DreadPoor/ONeil-model_stock-8B (Merge) |
| DreadPoor_Promissum_Mane-8B-LINEAR_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/DreadPoor/Promissum_Mane-8B-LINEAR" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/Promissum_Mane-8B-LINEAR</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__Promissum_Mane-8B-LINEAR-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | DreadPoor/Promissum_Mane-8B-LINEAR | ff399e7004040e1807e8d08b4d0967206fc50afa | 29.049297 | | 1 | 8 | false | false | false | true | 0.828647 | 0.715036 | 71.50361 | 0.545768 | 35.25319 | 0.152568 | 15.256798 | 0.30453 | 7.270694 | 0.420042 | 13.338542 | 0.385057 | 31.672946 | false | false | 2024-09-30 | 2024-09-30 | 1 | DreadPoor/Promissum_Mane-8B-LINEAR (Merge) |
| DreadPoor_Promissum_Mane-8B-LINEAR-lorablated_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/DreadPoor/Promissum_Mane-8B-LINEAR-lorablated" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/Promissum_Mane-8B-LINEAR-lorablated</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__Promissum_Mane-8B-LINEAR-lorablated-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | DreadPoor/Promissum_Mane-8B-LINEAR-lorablated | 34c4a30b7462704810e35e033aa5ef33b075a97b | 28.810739 | | 0 | 8 | false | false | false | true | 0.792342 | 0.715636 | 71.563562 | 0.543518 | 34.609107 | 0.152568 | 15.256798 | 0.303691 | 7.158837 | 0.419792 | 13.840625 | 0.37392 | 30.435505 | false | false | 2024-09-30 | | 0 | Removed |
| DreadPoor_Sellen-8B-model_stock_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/DreadPoor/Sellen-8B-model_stock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/Sellen-8B-model_stock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__Sellen-8B-model_stock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | DreadPoor/Sellen-8B-model_stock | accde7145d81a428c782695ea61eebc608efd980 | 26.362467 | | 0 | 8 | false | false | false | true | 0.807471 | 0.711289 | 71.128938 | 0.523168 | 31.360979 | 0.132175 | 13.217523 | 0.274329 | 3.243848 | 0.396042 | 10.671875 | 0.356965 | 28.55164 | false | false | 2024-08-27 | | 0 | Removed |
| DreadPoor_Sweetened_Condensed_Milk-8B-Model_Stock_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/DreadPoor/Sweetened_Condensed_Milk-8B-Model_Stock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/Sweetened_Condensed_Milk-8B-Model_Stock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__Sweetened_Condensed_Milk-8B-Model_Stock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | DreadPoor/Sweetened_Condensed_Milk-8B-Model_Stock | 307d93aeb51160a8b0ce236b8abd13e04873fef1 | 29.523394 | | 0 | 8 | false | false | false | true | 0.665669 | 0.741714 | 74.171421 | 0.540629 | 34.670888 | 0.185045 | 18.504532 | 0.302852 | 7.04698 | 0.410677 | 11.101302 | 0.384807 | 31.645242 | false | false | 2024-11-27 | | 0 | Removed |
| DreadPoor_Trinas_Nectar-8B-model_stock_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/DreadPoor/Trinas_Nectar-8B-model_stock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/Trinas_Nectar-8B-model_stock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__Trinas_Nectar-8B-model_stock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | DreadPoor/Trinas_Nectar-8B-model_stock | cb46b8431872557904d83fc5aa1b90dabeb74acc | 27.535042 | cc-by-nc-4.0 | 3 | 8 | true | false | false | true | 0.866724 | 0.725927 | 72.592721 | 0.525612 | 31.975094 | 0.153323 | 15.332326 | 0.286074 | 4.809843 | 0.406771 | 11.413021 | 0.361785 | 29.087249 | true | false | 2024-08-16 | 2024-08-27 | 1 | DreadPoor/Trinas_Nectar-8B-model_stock (Merge) |
| DreadPoor_WIP-Acacia-8B-Model_Stock_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/DreadPoor/WIP-Acacia-8B-Model_Stock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/WIP-Acacia-8B-Model_Stock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__WIP-Acacia-8B-Model_Stock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | DreadPoor/WIP-Acacia-8B-Model_Stock | ae4f1a21b9de70ec75d02e9a84209ae6360a01e9 | 26.560174 | | 0 | 8 | false | false | false | true | 0.662951 | 0.624636 | 62.463597 | 0.519467 | 31.162353 | 0.1571 | 15.70997 | 0.306208 | 7.494407 | 0.422583 | 12.122917 | 0.37367 | 30.407801 | false | false | 2024-11-28 | | 0 | Removed |
| DreadPoor_WIP_Damascus-8B-TIES_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/DreadPoor/WIP_Damascus-8B-TIES" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/WIP_Damascus-8B-TIES</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__WIP_Damascus-8B-TIES-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | DreadPoor/WIP_Damascus-8B-TIES | c7720a0b0a8d24e62bf71b0e955b1aca8e62f1cb | 24.731381 | | 0 | 8 | false | false | false | true | 0.818112 | 0.477633 | 47.763268 | 0.541067 | 34.522306 | 0.151057 | 15.10574 | 0.307047 | 7.606264 | 0.411854 | 12.715104 | 0.37608 | 30.675606 | false | false | 2024-10-29 | | 0 | Removed |
| DreadPoor_felix_dies-mistral-7B-model_stock_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/DreadPoor/felix_dies-mistral-7B-model_stock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/felix_dies-mistral-7B-model_stock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__felix_dies-mistral-7B-model_stock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | DreadPoor/felix_dies-mistral-7B-model_stock | bb317aa7565625327e18c5158aebd4710aa1d925 | 18.101828 | | 0 | 7 | false | false | false | false | 0.661572 | 0.300779 | 30.07786 | 0.490092 | 28.890798 | 0.05136 | 5.135952 | 0.291946 | 5.592841 | 0.451823 | 15.477865 | 0.310921 | 23.435653 | false | false | 2024-09-30 | | 0 | Removed |
| DreadPoor_remember_to_breathe-8b-Model-Stock_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/DreadPoor/remember_to_breathe-8b-Model-Stock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/remember_to_breathe-8b-Model-Stock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__remember_to_breathe-8b-Model-Stock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | DreadPoor/remember_to_breathe-8b-Model-Stock | fa88f1b06cf9ca7bd0d859c6a4b2240485363ae0 | 28.168407 | | 0 | 8 | false | false | false | true | 0.66354 | 0.710415 | 71.041503 | 0.541165 | 34.678991 | 0.143505 | 14.350453 | 0.301174 | 6.823266 | 0.414458 | 11.440625 | 0.37608 | 30.675606 | false | false | 2024-12-06 | | 0 | Removed |
| EVA-UNIT-01_EVA-Qwen2.5-72B-v0.2_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/EVA-UNIT-01/EVA-Qwen2.5-72B-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EVA-UNIT-01/EVA-Qwen2.5-72B-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EVA-UNIT-01__EVA-Qwen2.5-72B-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | EVA-UNIT-01/EVA-Qwen2.5-72B-v0.2 | 2590214b30391392b9a84e7cbe40fff3a92c6814 | 43.541838 | other | 9 | 72 | true | false | false | true | 22.955098 | 0.687884 | 68.78837 | 0.708801 | 59.066733 | 0.390483 | 39.048338 | 0.408557 | 21.14094 | 0.471979 | 19.730729 | 0.581283 | 53.475916 | false | false | 2024-11-21 | 2024-11-27 | 1 | Qwen/Qwen2.5-72B |
| EleutherAI_gpt-j-6b_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | GPTJForCausalLM | <a target="_blank" href="https://huggingface.co/EleutherAI/gpt-j-6b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EleutherAI/gpt-j-6b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EleutherAI__gpt-j-6b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | EleutherAI/gpt-j-6b | 47e169305d2e8376be1d31e765533382721b2cc1 | 6.557824 | apache-2.0 | 1,450 | 6 | true | false | false | false | 0.767432 | 0.252219 | 25.221856 | 0.319104 | 4.912818 | 0.01284 | 1.283988 | 0.245805 | 0 | 0.36575 | 5.252083 | 0.124086 | 2.676197 | false | true | 2022-03-02 | 2024-08-19 | 0 | EleutherAI/gpt-j-6b |
| EleutherAI_gpt-neo-1.3B_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | GPTNeoForCausalLM | <a target="_blank" href="https://huggingface.co/EleutherAI/gpt-neo-1.3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EleutherAI/gpt-neo-1.3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EleutherAI__gpt-neo-1.3B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | EleutherAI/gpt-neo-1.3B | dbe59a7f4a88d01d1ba9798d78dbe3fe038792c8 | 5.340738 | mit | 269 | 1 | true | false | false | false | 0.359424 | 0.207905 | 20.790503 | 0.303923 | 3.024569 | 0.007553 | 0.755287 | 0.255872 | 0.782998 | 0.381656 | 4.873698 | 0.116356 | 1.817376 | false | true | 2022-03-02 | 2024-06-12 | 0 | EleutherAI/gpt-neo-1.3B |
| EleutherAI_gpt-neo-125m_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | GPTNeoForCausalLM | <a target="_blank" href="https://huggingface.co/EleutherAI/gpt-neo-125m" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EleutherAI/gpt-neo-125m</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EleutherAI__gpt-neo-125m-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | EleutherAI/gpt-neo-125m | 21def0189f5705e2521767faed922f1f15e7d7db | 4.382146 | mit | 184 | 0 | true | false | false | false | 0.202902 | 0.190544 | 19.054442 | 0.311516 | 3.436739 | 0.004532 | 0.453172 | 0.253356 | 0.447427 | 0.359333 | 2.616667 | 0.10256 | 0.284427 | false | true | 2022-03-02 | 2024-08-10 | 0 | EleutherAI/gpt-neo-125m |
| EleutherAI_gpt-neo-2.7B_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | GPTNeoForCausalLM | <a target="_blank" href="https://huggingface.co/EleutherAI/gpt-neo-2.7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EleutherAI/gpt-neo-2.7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EleutherAI__gpt-neo-2.7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | EleutherAI/gpt-neo-2.7B | e24fa291132763e59f4a5422741b424fb5d59056 | 6.355519 | mit | 444 | 2 | true | false | false | false | 0.508381 | 0.258963 | 25.896289 | 0.313952 | 4.178603 | 0.006042 | 0.60423 | 0.26594 | 2.12528 | 0.355365 | 3.520573 | 0.116273 | 1.808141 | false | true | 2022-03-02 | 2024-06-12 | 0 | EleutherAI/gpt-neo-2.7B |
| EleutherAI_gpt-neox-20b_float16 | float16 | 🟢 pretrained | 🟢 | Original | GPTNeoXForCausalLM | <a target="_blank" href="https://huggingface.co/EleutherAI/gpt-neox-20b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EleutherAI/gpt-neox-20b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EleutherAI__gpt-neox-20b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | EleutherAI/gpt-neox-20b | c292233c833e336628618a88a648727eb3dff0a7 | 6.003229 | apache-2.0 | 543 | 20 | true | false | false | false | 3.146736 | 0.258688 | 25.868806 | 0.316504 | 4.929114 | 0.006798 | 0.679758 | 0.243289 | 0 | 0.364667 | 2.816667 | 0.115525 | 1.72503 | false | true | 2022-04-07 | 2024-06-09 | 0 | EleutherAI/gpt-neox-20b |
| EleutherAI_pythia-12b_float16 | float16 | 🟢 pretrained | 🟢 | Original | GPTNeoXForCausalLM | <a target="_blank" href="https://huggingface.co/EleutherAI/pythia-12b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EleutherAI/pythia-12b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EleutherAI__pythia-12b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | EleutherAI/pythia-12b | 35c9d7f32fbb108fb8b5bdd574eb03369d1eed49 | 5.93396 | apache-2.0 | 131 | 12 | true | false | false | false | 1.118007 | 0.247148 | 24.714757 | 0.317965 | 4.987531 | 0.009063 | 0.906344 | 0.246644 | 0 | 0.364698 | 3.78724 | 0.110871 | 1.20789 | false | true | 2023-02-28 | 2024-06-12 | 0 | EleutherAI/pythia-12b |
| EleutherAI_pythia-160m_float16 | float16 | 🟢 pretrained | 🟢 | Original | GPTNeoXForCausalLM | <a target="_blank" href="https://huggingface.co/EleutherAI/pythia-160m" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EleutherAI/pythia-160m</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EleutherAI__pythia-160m-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | EleutherAI/pythia-160m | 50f5173d932e8e61f858120bcb800b97af589f46 | 5.617102 | apache-2.0 | 26 | 0 | true | false | false | false | 0.235339 | 0.181552 | 18.155162 | 0.297044 | 2.198832 | 0.002266 | 0.226586 | 0.258389 | 1.118568 | 0.417938 | 10.675521 | 0.111951 | 1.32794 | false | true | 2023-02-08 | 2024-06-09 | 0 | EleutherAI/pythia-160m |
| EleutherAI_pythia-2.8b_float16 | float16 | 🟢 pretrained | 🟢 | Original | GPTNeoXForCausalLM | <a target="_blank" href="https://huggingface.co/EleutherAI/pythia-2.8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EleutherAI/pythia-2.8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EleutherAI__pythia-2.8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | EleutherAI/pythia-2.8b | 2a259cdd96a4beb1cdf467512e3904197345f6a9 | 5.454241 | apache-2.0 | 29 | 2 | true | false | false | false | 0.753902 | 0.217322 | 21.732226 | 0.322409 | 5.077786 | 0.007553 | 0.755287 | 0.25 | 0 | 0.348573 | 3.638281 | 0.113697 | 1.521868 | false | true | 2023-02-13 | 2024-06-12 | 0 | EleutherAI/pythia-2.8b |
| EleutherAI_pythia-410m_float16 | float16 | 🟢 pretrained | 🟢 | Original | GPTNeoXForCausalLM | <a target="_blank" href="https://huggingface.co/EleutherAI/pythia-410m" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EleutherAI/pythia-410m</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EleutherAI__pythia-410m-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | EleutherAI/pythia-410m | 9879c9b5f8bea9051dcb0e68dff21493d67e9d4f | 5.113779 | apache-2.0 | 22 | 0 | true | false | false | false | 0.377082 | 0.219545 | 21.954525 | 0.302813 | 2.715428 | 0.003021 | 0.302115 | 0.259228 | 1.230425 | 0.357813 | 3.059896 | 0.112783 | 1.420287 | false | true | 2023-02-13 | 2024-06-09 | 0 | EleutherAI/pythia-410m |
| EleutherAI_pythia-6.9b_float16 | float16 | 🟢 pretrained | 🟢 | Original | GPTNeoXForCausalLM | <a target="_blank" href="https://huggingface.co/EleutherAI/pythia-6.9b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EleutherAI/pythia-6.9b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EleutherAI__pythia-6.9b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | EleutherAI/pythia-6.9b | f271943e880e60c0c715fd10e4dc74ec4e31eb44 | 5.865842 | apache-2.0 | 48 | 6 | true | false | false | false | 0.868867 | 0.228114 | 22.811363 | 0.323229 | 5.881632 | 0.008308 | 0.830816 | 0.251678 | 0.223714 | 0.359052 | 3.814844 | 0.114694 | 1.632683 | false | true | 2023-02-14 | 2024-06-12 | 0 | EleutherAI/pythia-6.9b |
| Enno-Ai_EnnoAi-Pro-French-Llama-3-8B-v0.4_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Enno-Ai/EnnoAi-Pro-French-Llama-3-8B-v0.4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Enno-Ai/EnnoAi-Pro-French-Llama-3-8B-v0.4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Enno-Ai__EnnoAi-Pro-French-Llama-3-8B-v0.4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Enno-Ai/EnnoAi-Pro-French-Llama-3-8B-v0.4 | 328722ae96e3a112ec900dbe77d410788a526c5c | 15.180945 | creativeml-openrail-m | 0 | 8 | true | false | false | true | 1.009128 | 0.418881 | 41.888079 | 0.407495 | 16.875928 | 0.006042 | 0.60423 | 0.270973 | 2.796421 | 0.417 | 10.758333 | 0.263464 | 18.162677 | false | false | 2024-06-27 | 2024-06-30 | 0 | Enno-Ai/EnnoAi-Pro-French-Llama-3-8B-v0.4 |
Enno-Ai_EnnoAi-Pro-Llama-3-8B_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/Enno-Ai/EnnoAi-Pro-Llama-3-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Enno-Ai/EnnoAi-Pro-Llama-3-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Enno-Ai__EnnoAi-Pro-Llama-3-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
Enno-Ai/EnnoAi-Pro-Llama-3-8B
|
6a5d745bdd304753244fe601e2a958d37d13cd71
| 12.174667
|
creativeml-openrail-m
| 0
| 8
| true
| false
| false
| true
| 1.184337
| 0.319538
| 31.953772
| 0.415158
| 17.507545
| 0.001511
| 0.151057
| 0.261745
| 1.565996
| 0.407052
| 9.08151
| 0.215093
| 12.788121
| false
| false
|
2024-07-01
|
2024-07-08
| 0
|
Enno-Ai/EnnoAi-Pro-Llama-3-8B
|
Enno-Ai_EnnoAi-Pro-Llama-3-8B-v0.3_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/Enno-Ai/EnnoAi-Pro-Llama-3-8B-v0.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Enno-Ai/EnnoAi-Pro-Llama-3-8B-v0.3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Enno-Ai__EnnoAi-Pro-Llama-3-8B-v0.3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
Enno-Ai/EnnoAi-Pro-Llama-3-8B-v0.3
|
cf29b8b484a909132e3a1f85ce891d28347c0d13
| 17.524058
|
creativeml-openrail-m
| 0
| 8
| true
| false
| false
| true
| 1.470836
| 0.508257
| 50.825698
| 0.410058
| 16.668386
| 0.012085
| 1.208459
| 0.265101
| 2.013423
| 0.423573
| 12.313281
| 0.299036
| 22.1151
| false
| false
|
2024-06-26
|
2024-06-26
| 0
|
Enno-Ai/EnnoAi-Pro-Llama-3-8B-v0.3
|
Enno-Ai_EnnoAi-Pro-Llama-3.1-8B-v0.9_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/Enno-Ai/EnnoAi-Pro-Llama-3.1-8B-v0.9" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Enno-Ai/EnnoAi-Pro-Llama-3.1-8B-v0.9</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Enno-Ai__EnnoAi-Pro-Llama-3.1-8B-v0.9-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
Enno-Ai/EnnoAi-Pro-Llama-3.1-8B-v0.9
|
c740871122fd471a1a225cf2b4368e333752d74c
| 14.945694
|
apache-2.0
| 0
| 8
| true
| false
| false
| true
| 0.932571
| 0.468915
| 46.89147
| 0.416027
| 17.498296
| 0
| 0
| 0.26594
| 2.12528
| 0.383177
| 5.430469
| 0.259558
| 17.72865
| false
| false
|
2024-08-22
|
2024-09-06
| 0
|
Enno-Ai/EnnoAi-Pro-Llama-3.1-8B-v0.9
|
EnnoAi_EnnoAi-Pro-Llama-3.1-8B-v1.0_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/EnnoAi/EnnoAi-Pro-Llama-3.1-8B-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EnnoAi/EnnoAi-Pro-Llama-3.1-8B-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EnnoAi__EnnoAi-Pro-Llama-3.1-8B-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
EnnoAi/EnnoAi-Pro-Llama-3.1-8B-v1.0
|
c740871122fd471a1a225cf2b4368e333752d74c
| 14.97109
|
apache-2.0
| 0
| 8
| true
| false
| false
| true
| 0.945642
| 0.470438
| 47.043844
| 0.416027
| 17.498296
| 0
| 0
| 0.26594
| 2.12528
| 0.383177
| 5.430469
| 0.259558
| 17.72865
| false
| false
|
2024-08-22
|
2024-09-06
| 0
|
EnnoAi/EnnoAi-Pro-Llama-3.1-8B-v1.0
|
Epiculous_Azure_Dusk-v0.2_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/Epiculous/Azure_Dusk-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Epiculous/Azure_Dusk-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Epiculous__Azure_Dusk-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
Epiculous/Azure_Dusk-v0.2
|
ebddf1b2efbe7f9cae066d263b0991ded89c88e8
| 14.050827
|
apache-2.0
| 7
| 12
| true
| false
| false
| true
| 1.991411
| 0.346716
| 34.67156
| 0.411972
| 17.396414
| 0.018127
| 1.812689
| 0.260906
| 1.454139
| 0.383458
| 6.365625
| 0.303441
| 22.604536
| false
| false
|
2024-09-09
|
2024-09-14
| 0
|
Epiculous/Azure_Dusk-v0.2
|
Epiculous_Crimson_Dawn-v0.2_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/Epiculous/Crimson_Dawn-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Epiculous/Crimson_Dawn-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Epiculous__Crimson_Dawn-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
Epiculous/Crimson_Dawn-v0.2
|
4cceb1e25026afef241ad5325097e88eccd8f37a
| 14.884541
|
apache-2.0
| 10
| 12
| true
| false
| false
| true
| 3.492384
| 0.310345
| 31.034544
| 0.448238
| 21.688249
| 0.030967
| 3.096677
| 0.276007
| 3.467562
| 0.415177
| 10.897135
| 0.272108
| 19.123079
| false
| false
|
2024-09-02
|
2024-09-05
| 0
|
Epiculous/Crimson_Dawn-v0.2
|
Epiculous_NovaSpark_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/Epiculous/NovaSpark" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Epiculous/NovaSpark</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Epiculous__NovaSpark-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
Epiculous/NovaSpark
|
a46340895859e470c3e69661f0b894677cf4c5cb
| 25.228562
|
apache-2.0
| 6
| 8
| true
| false
| false
| true
| 0.818185
| 0.640847
| 64.08474
| 0.506396
| 29.526911
| 0.150302
| 15.030211
| 0.297819
| 6.375839
| 0.388198
| 6.92474
| 0.36486
| 29.42893
| false
| false
|
2024-10-13
|
2024-10-20
| 1
|
Epiculous/NovaSpark (Merge)
|
Epiculous_Violet_Twilight-v0.2_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/Epiculous/Violet_Twilight-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Epiculous/Violet_Twilight-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Epiculous__Violet_Twilight-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
Epiculous/Violet_Twilight-v0.2
|
30c8bad3c1f565150afbf2fc90cacf4f45d096f6
| 18.552773
|
apache-2.0
| 17
| 12
| true
| false
| false
| true
| 1.770436
| 0.453178
| 45.317757
| 0.461455
| 23.940537
| 0.028701
| 2.870091
| 0.26594
| 2.12528
| 0.429938
| 13.608854
| 0.311087
| 23.454122
| true
| false
|
2024-09-12
|
2024-09-16
| 0
|
Epiculous/Violet_Twilight-v0.2
|
EpistemeAI_Alpaca-Llama3.1-8B_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/EpistemeAI/Alpaca-Llama3.1-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI/Alpaca-Llama3.1-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Alpaca-Llama3.1-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
EpistemeAI/Alpaca-Llama3.1-8B
|
3152dfa17322dff7c6af6dbf3daceaf5db51e230
| 13.922106
|
apache-2.0
| 0
| 8
| true
| false
| false
| false
| 0.920853
| 0.159869
| 15.986915
| 0.475526
| 25.935227
| 0.046828
| 4.682779
| 0.290268
| 5.369128
| 0.34026
| 6.599219
| 0.324634
| 24.959368
| false
| false
|
2024-09-11
|
2024-08-13
| 2
|
meta-llama/Meta-Llama-3.1-8B
|
EpistemeAI_Athena-gemma-2-2b-it_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/EpistemeAI/Athena-gemma-2-2b-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI/Athena-gemma-2-2b-it</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Athena-gemma-2-2b-it-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
EpistemeAI/Athena-gemma-2-2b-it
|
661c1dc6a1a096222e33416e099bd02b7b970405
| 14.294329
|
apache-2.0
| 2
| 2
| true
| false
| false
| false
| 2.036798
| 0.313417
| 31.341729
| 0.426423
| 19.417818
| 0.033988
| 3.398792
| 0.268456
| 2.46085
| 0.435052
| 13.348177
| 0.242188
| 15.798611
| false
| false
|
2024-08-29
|
2024-09-06
| 4
|
google/gemma-2-9b
|
EpistemeAI_Athena-gemma-2-2b-it-Philos_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/EpistemeAI/Athena-gemma-2-2b-it-Philos" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI/Athena-gemma-2-2b-it-Philos</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Athena-gemma-2-2b-it-Philos-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
EpistemeAI/Athena-gemma-2-2b-it-Philos
|
dea2b35d496bd32ed3c88d42ff3022654153f2e1
| 15.122657
|
apache-2.0
| 0
| 2
| true
| false
| false
| true
| 1.128593
| 0.462095
| 46.209502
| 0.379478
| 13.212088
| 0.004532
| 0.453172
| 0.28104
| 4.138702
| 0.431365
| 12.853906
| 0.224817
| 13.868573
| false
| false
|
2024-09-05
|
2024-09-05
| 1
|
unsloth/gemma-2-2b-it-bnb-4bit
|
EpistemeAI_Athene-codegemma-2-7b-it-alpaca-v1.3_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
GemmaForCausalLM
|
<a target="_blank" href="https://huggingface.co/EpistemeAI/Athene-codegemma-2-7b-it-alpaca-v1.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI/Athene-codegemma-2-7b-it-alpaca-v1.3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Athene-codegemma-2-7b-it-alpaca-v1.3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
EpistemeAI/Athene-codegemma-2-7b-it-alpaca-v1.3
|
9c26e1242a11178b53937bc0e9a744ef6141e05a
| 17.314022
|
apache-2.0
| 0
| 7
| true
| false
| false
| false
| 0.971978
| 0.402994
| 40.299406
| 0.433192
| 20.873795
| 0.061934
| 6.193353
| 0.280201
| 4.026846
| 0.450302
| 14.854427
| 0.258727
| 17.636303
| false
| false
|
2024-09-06
|
2024-09-06
| 2
|
Removed
|
EpistemeAI_FineLlama3.1-8B-Instruct_4bit
|
4bit
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Adapter
|
?
|
<a target="_blank" href="https://huggingface.co/EpistemeAI/FineLlama3.1-8B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI/FineLlama3.1-8B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__FineLlama3.1-8B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
EpistemeAI/FineLlama3.1-8B-Instruct
|
a8b0fc584b10e0110e04f9d21c7f10d24391c1d5
| 11.100787
| 0
| 14
| false
| false
| false
| false
| 2.354961
| 0.08001
| 8.000993
| 0.455736
| 23.506619
| 0.026435
| 2.643505
| 0.280201
| 4.026846
| 0.348167
| 4.954167
| 0.311253
| 23.472592
| false
| false
|
2024-08-10
| 0
|
Removed
|
|
EpistemeAI_Fireball-12B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/EpistemeAI/Fireball-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI/Fireball-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
EpistemeAI/Fireball-12B
|
e2ed12c3244f2502321fb20e76dfc72ad7817d6e
| 15.509355
|
apache-2.0
| 1
| 12
| true
| false
| false
| false
| 1.618521
| 0.18335
| 18.335018
| 0.511089
| 30.666712
| 0.039275
| 3.927492
| 0.261745
| 1.565996
| 0.423635
| 12.521094
| 0.334358
| 26.03982
| false
| false
|
2024-08-20
|
2024-08-21
| 2
|
Removed
|
EpistemeAI_Fireball-12B-v1.13a-philosophers_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/EpistemeAI/Fireball-12B-v1.13a-philosophers" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI/Fireball-12B-v1.13a-philosophers</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-12B-v1.13a-philosophers-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
EpistemeAI/Fireball-12B-v1.13a-philosophers
|
7fa824d4a40abca3f8c75d432ea151dc0d1d67d6
| 14.440865
|
apache-2.0
| 2
| 12
| true
| false
| false
| false
| 1.662663
| 0.087553
| 8.755325
| 0.51027
| 30.336233
| 0.044562
| 4.456193
| 0.301174
| 6.823266
| 0.408073
| 9.975781
| 0.336686
| 26.298389
| false
| false
|
2024-08-28
|
2024-09-03
| 1
|
Removed
|
EpistemeAI_Fireball-Alpaca-Llama-3.1-8B-Philos-DPO-200_float16
|
float16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/EpistemeAI/Fireball-Alpaca-Llama-3.1-8B-Philos-DPO-200" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI/Fireball-Alpaca-Llama-3.1-8B-Philos-DPO-200</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Alpaca-Llama-3.1-8B-Philos-DPO-200-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
EpistemeAI/Fireball-Alpaca-Llama-3.1-8B-Philos-DPO-200
|
27d67626304954db71f21fec9e7fc516421274ec
| 21.066974
|
apache-2.0
| 0
| 8
| true
| false
| false
| false
| 0.922381
| 0.457724
| 45.772439
| 0.48384
| 26.377774
| 0.119335
| 11.933535
| 0.300336
| 6.711409
| 0.394458
| 6.907292
| 0.358295
| 28.699394
| false
| false
|
2024-09-16
|
2024-09-16
| 4
|
meta-llama/Meta-Llama-3.1-8B
|
EpistemeAI_Fireball-Alpaca-Llama3.1.07-8B-Philos-Math-KTO-beta_float16
|
float16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/EpistemeAI/Fireball-Alpaca-Llama3.1.07-8B-Philos-Math-KTO-beta" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI/Fireball-Alpaca-Llama3.1.07-8B-Philos-Math-KTO-beta</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Alpaca-Llama3.1.07-8B-Philos-Math-KTO-beta-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
EpistemeAI/Fireball-Alpaca-Llama3.1.07-8B-Philos-Math-KTO-beta
|
2851384717556dd6ac14c00ed87aac1f267eb263
| 25.179287
|
apache-2.0
| 0
| 8
| true
| false
| false
| true
| 0.885645
| 0.727401
| 72.740107
| 0.486489
| 26.897964
| 0.148792
| 14.879154
| 0.280201
| 4.026846
| 0.361938
| 4.275521
| 0.354305
| 28.256132
| false
| false
|
2024-09-12
|
2024-09-14
| 5
|
meta-llama/Meta-Llama-3.1-8B
|
EpistemeAI_Fireball-Alpaca-Llama3.1.08-8B-Philos-C-R2_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/EpistemeAI/Fireball-Alpaca-Llama3.1.08-8B-Philos-C-R2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI/Fireball-Alpaca-Llama3.1.08-8B-Philos-C-R2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Alpaca-Llama3.1.08-8B-Philos-C-R2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
EpistemeAI/Fireball-Alpaca-Llama3.1.08-8B-Philos-C-R2
|
b19336101aa5f4807d1574f4c11eebc1c1a1c34e
| 22.537889
|
apache-2.0
| 0
| 8
| true
| false
| false
| false
| 0.811743
| 0.467316
| 46.731561
| 0.493203
| 28.247009
| 0.123112
| 12.311178
| 0.286074
| 4.809843
| 0.462365
| 16.995573
| 0.335189
| 26.132166
| false
| false
|
2024-09-14
|
2024-09-14
| 3
|
meta-llama/Meta-Llama-3.1-8B
|
EpistemeAI_Fireball-Meta-Llama-3.1-8B-Instruct-0.001-128K-auto_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-0.001-128K-auto" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-0.001-128K-auto</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Meta-Llama-3.1-8B-Instruct-0.001-128K-auto-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-0.001-128K-auto
|
19b23c434b6c4524e2146926cdbf4f0e927ae3ab
| 21.57995
|
apache-2.0
| 0
| 8
| true
| false
| false
| false
| 0.694994
| 0.443186
| 44.31863
| 0.482364
| 26.832967
| 0.133686
| 13.36858
| 0.312081
| 8.277405
| 0.406646
| 8.730729
| 0.351563
| 27.951389
| false
| false
|
2024-11-14
|
2024-11-15
| 2
|
meta-llama/Meta-Llama-3.1-8B
|
EpistemeAI_Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K
|
b4a88fb5fb27fc5d8a503303cdb7aaeff373fd92
| 20.627168
|
apache-2.0
| 3
| 8
| true
| false
| false
| false
| 0.814786
| 0.445734
| 44.573399
| 0.489732
| 28.025161
| 0.120846
| 12.084592
| 0.294463
| 5.928412
| 0.376229
| 4.895312
| 0.354305
| 28.256132
| false
| false
|
2024-09-26
|
2024-10-05
| 1
|
Removed
|
EpistemeAI_Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code
|
8e8f1569a8a01ed3d6588f2669c730d4993355b5
| 23.89695
|
apache-2.0
| 2
| 8
| true
| false
| false
| false
| 0.854318
| 0.597533
| 59.753343
| 0.490419
| 28.171888
| 0.13142
| 13.141994
| 0.302013
| 6.935123
| 0.401031
| 8.46224
| 0.342254
| 26.91711
| false
| false
|
2024-10-04
|
2024-10-05
| 2
|
Removed
|
EpistemeAI_Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds
|
8b73dd02349f0544c48c581cc73ada5cac6ff946
| 22.993108
|
llama3.1
| 2
| 8
| true
| false
| false
| true
| 1.716734
| 0.669099
| 66.90991
| 0.466807
| 24.462654
| 0.124622
| 12.462236
| 0.272651
| 3.020134
| 0.341781
| 4.55599
| 0.33893
| 26.547725
| false
| false
|
2024-10-14
|
2024-10-15
| 4
|
Removed
|
EpistemeAI_Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto
|
f18598c62a783bcc0d436a35df0c8a335e8ee5d7
| 23.749941
|
apache-2.0
| 6
| 8
| true
| false
| false
| true
| 2.285306
| 0.730498
| 73.049841
| 0.464925
| 24.586737
| 0.139728
| 13.97281
| 0.26594
| 2.12528
| 0.320885
| 1.210677
| 0.347989
| 27.5543
| false
| false
|
2024-10-21
|
2024-10-29
| 1
|
EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto (Merge)
|
EpistemeAI_Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto
|
055e87600d18e58594a8d193f45c0ee9a90e1780
| 23.488818
|
apache-2.0
| 6
| 8
| true
| false
| false
| true
| 0.672068
| 0.720707
| 72.070661
| 0.461009
| 23.544253
| 0.123112
| 12.311178
| 0.270134
| 2.684564
| 0.34324
| 4.171615
| 0.335356
| 26.150635
| false
| false
|
2024-10-21
|
2024-11-27
| 1
|
EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto (Merge)
|
EpistemeAI_Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.004-128K-code-COT_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.004-128K-code-COT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.004-128K-code-COT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.004-128K-code-COT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.004-128K-code-COT
|
bb90c19dc7c4a509e7bd73f4620dca818b58be25
| 20.832251
|
apache-2.0
| 0
| 8
| true
| false
| false
| false
| 0.839037
| 0.457824
| 45.782413
| 0.476052
| 25.820865
| 0.136707
| 13.670695
| 0.293624
| 5.816555
| 0.388135
| 6.45026
| 0.347074
| 27.452719
| false
| false
|
2024-10-11
|
2024-10-11
| 3
|
Removed
|
EpistemeAI_Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.004-128K-code-ds-auto_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.004-128K-code-ds-auto" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.004-128K-code-ds-auto</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.004-128K-code-ds-auto-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.004-128K-code-ds-auto
|
db5ddb161ed26bc16baa814e31892dbe2f22b7a0
| 23.760965
|
apache-2.0
| 0
| 8
| true
| false
| false
| true
| 0.745131
| 0.720482
| 72.048166
| 0.48178
| 26.45206
| 0.136707
| 13.670695
| 0.248322
| 0
| 0.33
| 2.083333
| 0.354804
| 28.31154
| false
| false
|
2024-11-14
|
2024-11-14
| 1
|
EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.004-128K-code-ds-auto (Merge)
|
EpistemeAI_Fireball-Meta-Llama-3.1-8B-Instruct-Math_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Math" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Math</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Meta-Llama-3.1-8B-Instruct-Math-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Math
|
677c97b4f92bfc330d4fae628e9a1df1ef606dcc
| 20.545341
|
apache-2.0
| 0
| 8
| true
| false
| false
| false
| 0.910272
| 0.462296
| 46.22956
| 0.498295
| 28.959344
| 0.107251
| 10.725076
| 0.291107
| 5.480984
| 0.364073
| 5.975781
| 0.333112
| 25.9013
| false
| false
|
2024-09-23
|
2024-09-23
| 3
|
meta-llama/Meta-Llama-3.1-8B
|
EpistemeAI_Fireball-Meta-Llama-3.2-8B-Instruct-agent-003-128k-code-DPO_float16
|
float16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/EpistemeAI/Fireball-Meta-Llama-3.2-8B-Instruct-agent-003-128k-code-DPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI/Fireball-Meta-Llama-3.2-8B-Instruct-agent-003-128k-code-DPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Meta-Llama-3.2-8B-Instruct-agent-003-128k-code-DPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
EpistemeAI/Fireball-Meta-Llama-3.2-8B-Instruct-agent-003-128k-code-DPO
|
b3c0fce7daa359cd8ed5be6595dd1a76ca2cfea2
| 21.205445
|
apache-2.0
| 1
| 8
| true
| false
| false
| false
| 0.833576
| 0.461097
| 46.109656
| 0.480101
| 26.317878
| 0.120091
| 12.009063
| 0.300336
| 6.711409
| 0.399823
| 8.077865
| 0.352061
| 28.006797
| false
| false
|
2024-10-08
|
2024-10-09
| 3
|
Removed
|
EpistemeAI_Fireball-Mistral-Nemo-Base-2407-v1-DPO2_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/EpistemeAI/Fireball-Mistral-Nemo-Base-2407-v1-DPO2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI/Fireball-Mistral-Nemo-Base-2407-v1-DPO2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Mistral-Nemo-Base-2407-v1-DPO2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
EpistemeAI/Fireball-Mistral-Nemo-Base-2407-v1-DPO2
|
2cf732fbffefdf37341b946edd7995f14d3f9487
| 15.2764
|
apache-2.0
| 0
| 12
| true
| false
| false
| false
| 1.771269
| 0.186073
| 18.607295
| 0.496777
| 28.567825
| 0.032477
| 3.247734
| 0.291946
| 5.592841
| 0.40401
| 9.501302
| 0.335273
| 26.141401
| false
| false
|
2024-08-19
|
2024-08-19
| 1
|
Removed
|
EpistemeAI_Llama-3.2-3B-Agent007-Coder_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/EpistemeAI/Llama-3.2-3B-Agent007-Coder" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI/Llama-3.2-3B-Agent007-Coder</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Llama-3.2-3B-Agent007-Coder-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
EpistemeAI/Llama-3.2-3B-Agent007-Coder
|
7ff4e77796b6d308e96d0150e1a01081c0b82e01
| 18.901974
|
apache-2.0
| 0
| 3
| true
| false
| false
| false
| 0.710816
| 0.539956
| 53.995621
| 0.430376
| 19.025809
| 0.110272
| 11.02719
| 0.25755
| 1.006711
| 0.366802
| 7.783594
| 0.285156
| 20.572917
| false
| false
|
2024-10-08
|
2024-10-08
| 2
|
meta-llama/Llama-3.2-3B-Instruct
|
EpistemeAI_Mistral-Nemo-Instruct-12B-Philosophy-Math_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/EpistemeAI/Mistral-Nemo-Instruct-12B-Philosophy-Math" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI/Mistral-Nemo-Instruct-12B-Philosophy-Math</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Mistral-Nemo-Instruct-12B-Philosophy-Math-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
EpistemeAI/Mistral-Nemo-Instruct-12B-Philosophy-Math
|
1ac4205f8da109326b4a5cf173e5491a20087d76
| 16.566232
|
apache-2.0
| 0
| 12
| true
| false
| false
| false
| 1.363607
| 0.069468
| 6.94679
| 0.536493
| 33.835811
| 0.093656
| 9.365559
| 0.331376
| 10.850112
| 0.429219
| 12.885677
| 0.329621
| 25.513446
| false
| false
|
2024-09-15
|
2024-09-26
| 1
|
unsloth/Mistral-Nemo-Instruct-2407-bnb-4bit
|
EpistemeAI2_Athene-codegemma-2-7b-it-alpaca-v1.2_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
GemmaForCausalLM
|
<a target="_blank" href="https://huggingface.co/EpistemeAI2/Athene-codegemma-2-7b-it-alpaca-v1.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI2/Athene-codegemma-2-7b-it-alpaca-v1.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Athene-codegemma-2-7b-it-alpaca-v1.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
EpistemeAI2/Athene-codegemma-2-7b-it-alpaca-v1.2
|
21b31062334a316b50680e8c3a141a72e4c30b61
| 15.693215
|
apache-2.0
| 0
| 7
| true
| false
| false
| false
| 0.969635
| 0.435118
| 43.511771
| 0.417542
| 18.97137
| 0.040785
| 4.07855
| 0.270973
| 2.796421
| 0.416969
| 10.38776
| 0.229721
| 14.413416
| false
| false
|
2024-08-26
|
2024-08-26
| 2
|
Removed
|
EpistemeAI2_Fireball-12B-v1.2_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/EpistemeAI2/Fireball-12B-v1.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI2/Fireball-12B-v1.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-12B-v1.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
EpistemeAI2/Fireball-12B-v1.2
|
57af42edf8232189ee99e9a21e33a0c306e3f561
| 15.162522
|
apache-2.0
| 1
| 12
| true
| false
| false
| false
| 1.872565
| 0.135539
| 13.553926
| 0.501858
| 29.776014
| 0.039275
| 3.927492
| 0.298658
| 6.487696
| 0.417313
| 11.264062
| 0.333693
| 25.965943
| false
| false
|
2024-08-27
|
2024-08-28
| 1
|
Removed
|
EpistemeAI2_Fireball-Alpaca-Llama3.1-8B-Philos_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/EpistemeAI2/Fireball-Alpaca-Llama3.1-8B-Philos" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI2/Fireball-Alpaca-Llama3.1-8B-Philos</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Alpaca-Llama3.1-8B-Philos-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
EpistemeAI2/Fireball-Alpaca-Llama3.1-8B-Philos
|
3dcca4cf9bdd9003c8dc91f5c78cefef1d4ae0d7
| 22.539085
|
apache-2.0
| 1
| 8
| true
| false
| false
| false
| 0.848332
| 0.49864
| 49.864027
| 0.497758
| 29.259226
| 0.117825
| 11.782477
| 0.292785
| 5.704698
| 0.427667
| 11.891667
| 0.340592
| 26.732417
| false
| false
|
2024-08-29
|
2024-08-29
| 3
|
meta-llama/Meta-Llama-3.1-8B
|
EpistemeAI2_Fireball-Alpaca-Llama3.1.01-8B-Philos_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/EpistemeAI2/Fireball-Alpaca-Llama3.1.01-8B-Philos" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI2/Fireball-Alpaca-Llama3.1.01-8B-Philos</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Alpaca-Llama3.1.01-8B-Philos-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
EpistemeAI2/Fireball-Alpaca-Llama3.1.01-8B-Philos
|
f97293ed5cec7fb9482b16600259967c6c923e4b
| 21.567144
|
apache-2.0
| 0
| 8
| true
| false
| false
| false
| 0.870572
| 0.421179
| 42.117914
| 0.495611
| 28.628475
| 0.135952
| 13.595166
| 0.288591
| 5.145414
| 0.437062
| 13.432813
| 0.338348
| 26.483082
| false
| false
|
2024-09-03
|
2024-09-03
| 3
|
meta-llama/Meta-Llama-3.1-8B
|
EpistemeAI2_Fireball-Alpaca-Llama3.1.03-8B-Philos_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/EpistemeAI2/Fireball-Alpaca-Llama3.1.03-8B-Philos" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI2/Fireball-Alpaca-Llama3.1.03-8B-Philos</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Alpaca-Llama3.1.03-8B-Philos-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
EpistemeAI2/Fireball-Alpaca-Llama3.1.03-8B-Philos
|
6e60f783f80f7d126b8e4f2b417e14dea63d2c4f
| 20.29975
|
apache-2.0
| 0
| 8
| true
| false
| false
| false
| 0.797523
| 0.388081
| 38.80814
| 0.495087
| 27.992549
| 0.129909
| 12.990937
| 0.278523
| 3.803132
| 0.42801
| 12.034635
| 0.335522
| 26.169105
| false
| false
|
2024-09-04
|
2024-09-04
| 3
|
meta-llama/Meta-Llama-3.1-8B
|
EpistemeAI2_Fireball-Alpaca-Llama3.1.04-8B-Philos_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/EpistemeAI2/Fireball-Alpaca-Llama3.1.04-8B-Philos" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI2/Fireball-Alpaca-Llama3.1.04-8B-Philos</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Alpaca-Llama3.1.04-8B-Philos-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
EpistemeAI2/Fireball-Alpaca-Llama3.1.04-8B-Philos
|
efd0c251373e1a2fa2bc8cead502c03ff6dc7c8b
| 21.031577
|
apache-2.0
| 0
| 8
| true
| false
| false
| false
| 0.765248
| 0.40844
| 40.843961
| 0.493001
| 27.963798
| 0.116314
| 11.63142
| 0.290268
| 5.369128
| 0.437219
| 13.685677
| 0.340259
| 26.695479
| false
| false
|
2024-09-05
|
2024-09-05
| 3
|
meta-llama/Meta-Llama-3.1-8B
|
EpistemeAI2_Fireball-Alpaca-Llama3.1.06-8B-Philos-dpo_float16
|
float16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/EpistemeAI2/Fireball-Alpaca-Llama3.1.06-8B-Philos-dpo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI2/Fireball-Alpaca-Llama3.1.06-8B-Philos-dpo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Alpaca-Llama3.1.06-8B-Philos-dpo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
EpistemeAI2/Fireball-Alpaca-Llama3.1.06-8B-Philos-dpo
|
3e76f190b505b515479cc25e92f8229c2b05159f
| 21.829867
|
apache-2.0
| 0
| 8
| true
| false
| false
| false
| 0.934774
| 0.486576
| 48.657562
| 0.488077
| 27.207177
| 0.128399
| 12.839879
| 0.297819
| 6.375839
| 0.393188
| 6.848437
| 0.361453
| 29.05031
| false
| false
|
2024-09-09
|
2024-09-09
| 5
|
meta-llama/Meta-Llama-3.1-8B
|
EpistemeAI2_Fireball-Alpaca-Llama3.1.07-8B-Philos-Math_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/EpistemeAI2/Fireball-Alpaca-Llama3.1.07-8B-Philos-Math" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI2/Fireball-Alpaca-Llama3.1.07-8B-Philos-Math</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Alpaca-Llama3.1.07-8B-Philos-Math-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
EpistemeAI2/Fireball-Alpaca-Llama3.1.07-8B-Philos-Math
|
0b2842bddfa6c308f67eb5a20daf04536a4e6d1a
| 21.870165
|
apache-2.0
| 0
| 8
| true
| false
| false
| false
| 0.90203
| 0.507908
| 50.790791
| 0.484702
| 26.901201
| 0.114048
| 11.404834
| 0.296141
| 6.152125
| 0.406302
| 7.854427
| 0.353059
| 28.117612
| false
| false
|
2024-09-10
|
2024-09-10
| 4
|
meta-llama/Meta-Llama-3.1-8B
|
EpistemeAI2_Fireball-Alpaca-Llama3.1.08-8B-C-R1-KTO-Reflection_float16
|
float16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/EpistemeAI2/Fireball-Alpaca-Llama3.1.08-8B-C-R1-KTO-Reflection" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI2/Fireball-Alpaca-Llama3.1.08-8B-C-R1-KTO-Reflection</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Alpaca-Llama3.1.08-8B-C-R1-KTO-Reflection-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
EpistemeAI2/Fireball-Alpaca-Llama3.1.08-8B-C-R1-KTO-Reflection
|
dc900138b4406353b7e84251bc8649d70c16f13f
| 20.882037
|
apache-2.0
| 0
| 8
| true
| false
| false
| false
| 0.883974
| 0.395226
| 39.522578
| 0.495531
| 27.571611
| 0.123867
| 12.386707
| 0.299497
| 6.599553
| 0.404813
| 10.401563
| 0.359292
| 28.81021
| false
| false
|
2024-09-16
|
2024-09-16
| 6
|
meta-llama/Meta-Llama-3.1-8B
|
EpistemeAI2_Fireball-Alpaca-Llama3.1.08-8B-Philos-C-R1_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/EpistemeAI2/Fireball-Alpaca-Llama3.1.08-8B-Philos-C-R1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI2/Fireball-Alpaca-Llama3.1.08-8B-Philos-C-R1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Alpaca-Llama3.1.08-8B-Philos-C-R1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
EpistemeAI2/Fireball-Alpaca-Llama3.1.08-8B-Philos-C-R1
|
c57c786426123635baf6c8b4d30638d2053f4565
| 22.410483
|
apache-2.0
| 0
| 8
| true
| false
| false
| false
| 0.909759
| 0.531638
| 53.163828
| 0.482793
| 26.763685
| 0.117825
| 11.782477
| 0.29698
| 6.263982
| 0.410302
| 8.454427
| 0.352311
| 28.034501
| false
| false
|
2024-09-13
|
2024-09-13
| 4
|
meta-llama/Meta-Llama-3.1-8B
|
EpistemeAI2_Fireball-Llama-3.1-8B-Philos-Reflection_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/EpistemeAI2/Fireball-Llama-3.1-8B-Philos-Reflection" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI2/Fireball-Llama-3.1-8B-Philos-Reflection</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Llama-3.1-8B-Philos-Reflection-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
EpistemeAI2/Fireball-Llama-3.1-8B-Philos-Reflection
|
4b0b75d9235886e8a947c45b94f87c5a65a81467
| 20.389309
|
apache-2.0
| 0
| 8
| true
| false
| false
| false
| 0.894943
| 0.359605
| 35.960474
| 0.489769
| 27.769796
| 0.129154
| 12.915408
| 0.307886
| 7.718121
| 0.395729
| 9.632813
| 0.355053
| 28.339243
| false
| false
|
2024-09-17
|
2024-09-17
| 5
|
meta-llama/Meta-Llama-3.1-8B
|
EpistemeAI2_Fireball-MathMistral-Nemo-Base-2407-v2dpo_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/EpistemeAI2/Fireball-MathMistral-Nemo-Base-2407-v2dpo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI2/Fireball-MathMistral-Nemo-Base-2407-v2dpo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-MathMistral-Nemo-Base-2407-v2dpo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
EpistemeAI2/Fireball-MathMistral-Nemo-Base-2407-v2dpo
|
6b7d851c66359f39d16da6fbcf810b816dc6e4bc
| 11.332218
|
apache-2.0
| 1
| 11
| true
| false
| false
| true
| 1.881426
| 0.30972
| 30.972043
| 0.432764
| 21.145528
| 0.034743
| 3.47432
| 0.263423
| 1.789709
| 0.402958
| 8.969792
| 0.114777
| 1.641918
| false
| false
|
2024-08-21
|
2024-08-24
| 2
|
unsloth/Mistral-Nemo-Base-2407-bnb-4bit
|
EpistemeAI2_Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-math_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/EpistemeAI2/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-math" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI2/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-math</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-math-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
EpistemeAI2/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-math
|
aa21037cf0984cb293facb69c41895e7fccb1340
| 22.677605
|
apache-2.0
| 0
| 8
| true
| false
| false
| false
| 0.791683
| 0.551547
| 55.154656
| 0.480756
| 26.743767
| 0.132175
| 13.217523
| 0.30453
| 7.270694
| 0.36925
| 6.789583
| 0.342005
| 26.889406
| false
| false
|
2024-10-11
|
2024-10-12
| 3
|
Removed
|
EpistemeAI2_Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.005-128K-code-COT_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/EpistemeAI2/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.005-128K-code-COT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI2/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.005-128K-code-COT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.005-128K-code-COT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
EpistemeAI2/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.005-128K-code-COT
|
cf8b99d4aa00c18fdaebfb24fa3c674ee6defa1a
| 20.999994
|
apache-2.0
| 0
| 8
| true
| false
| false
| false
| 0.800818
| 0.46332
| 46.331955
| 0.479083
| 26.400992
| 0.114804
| 11.480363
| 0.312081
| 8.277405
| 0.377438
| 5.013021
| 0.356466
| 28.496232
| false
| false
|
2024-10-11
|
2024-10-11
| 3
|
Removed
|
EpistemeAI2_Fireball-Phi-3-medium-4k-inst-Philos_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/EpistemeAI2/Fireball-Phi-3-medium-4k-inst-Philos" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI2/Fireball-Phi-3-medium-4k-inst-Philos</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Phi-3-medium-4k-inst-Philos-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
EpistemeAI2/Fireball-Phi-3-medium-4k-inst-Philos
|
147715051102034fac98091e2a0cae6cade15ae0
| 29.172842
|
apache-2.0
| 0
| 13
| true
| false
| false
| true
| 0.771814
| 0.531288
| 53.128809
| 0.617784
| 46.208873
| 0.140483
| 14.048338
| 0.332215
| 10.961969
| 0.413906
| 10.704948
| 0.459857
| 39.984116
| false
| false
|
2024-09-19
|
2024-09-20
| 1
|
unsloth/phi-3-medium-4k-instruct-bnb-4bit
|
Eric111_CatunaMayo_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/Eric111/CatunaMayo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Eric111/CatunaMayo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Eric111__CatunaMayo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
Eric111/CatunaMayo
|
23337893381293975cbcc35f75b634954fbcefaf
| 21.299155
|
apache-2.0
| 0
| 7
| true
| false
| false
| false
| 0.550825
| 0.407416
| 40.741566
| 0.524364
| 33.299426
| 0.086103
| 8.610272
| 0.291946
| 5.592841
| 0.45399
| 15.348698
| 0.317819
| 24.202128
| true
| false
|
2024-02-15
|
2024-07-03
| 0
|
Eric111/CatunaMayo
|
Eric111_CatunaMayo-DPO_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/Eric111/CatunaMayo-DPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Eric111/CatunaMayo-DPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Eric111__CatunaMayo-DPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
Eric111/CatunaMayo-DPO
|
6bdbe06c10d57d152dd8a79a71edd8e30135b689
| 21.255121
|
apache-2.0
| 0
| 7
| true
| false
| false
| false
| 0.554023
| 0.421454
| 42.145396
| 0.522399
| 33.089952
| 0.079305
| 7.930514
| 0.291946
| 5.592841
| 0.445031
| 14.66224
| 0.316988
| 24.109781
| true
| false
|
2024-02-21
|
2024-06-27
| 0
|
Eric111/CatunaMayo-DPO
|
Etherll_Chocolatine-3B-Instruct-DPO-Revised-Ties_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Phi3ForCausalLM
|
<a target="_blank" href="https://huggingface.co/Etherll/Chocolatine-3B-Instruct-DPO-Revised-Ties" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Etherll/Chocolatine-3B-Instruct-DPO-Revised-Ties</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Etherll__Chocolatine-3B-Instruct-DPO-Revised-Ties-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
Etherll/Chocolatine-3B-Instruct-DPO-Revised-Ties
|
8a9c3d745e0805e769b544622b3f5c039abc9b07
| 24.402767
| 0
| 3
| false
| false
| false
| false
| 0.635497
| 0.372469
| 37.246949
| 0.541065
| 35.583343
| 0.128399
| 12.839879
| 0.323826
| 9.8434
| 0.464938
| 17.817187
| 0.397773
| 33.085845
| false
| false
|
2024-10-28
| 0
|
Removed
|
||
Etherll_Chocolatine-3B-Instruct-DPO-Revised-Ties-v2_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Phi3ForCausalLM
|
<a target="_blank" href="https://huggingface.co/Etherll/Chocolatine-3B-Instruct-DPO-Revised-Ties-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Etherll/Chocolatine-3B-Instruct-DPO-Revised-Ties-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Etherll__Chocolatine-3B-Instruct-DPO-Revised-Ties-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
Etherll/Chocolatine-3B-Instruct-DPO-Revised-Ties-v2
|
121b0831361743558e1a56fd89ae3d3c03272cc4
| 24.428163
| 0
| 3
| false
| false
| false
| false
| 0.631296
| 0.373993
| 37.399323
| 0.541065
| 35.583343
| 0.128399
| 12.839879
| 0.323826
| 9.8434
| 0.464938
| 17.817187
| 0.397773
| 33.085845
| false
| false
|
2024-10-29
| 0
|
Removed
|
||
Etherll_Herplete-LLM-Llama-3.1-8b_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/Etherll/Herplete-LLM-Llama-3.1-8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Etherll/Herplete-LLM-Llama-3.1-8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Etherll__Herplete-LLM-Llama-3.1-8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
Etherll/Herplete-LLM-Llama-3.1-8b
|
b3829cf437216f099c031a9ab5e4c8ec974766dd
| 19.588708
| 5
| 8
| false
| false
| false
| true
| 0.973685
| 0.467191
| 46.71915
| 0.501343
| 28.952591
| 0.027946
| 2.794562
| 0.286074
| 4.809843
| 0.386
| 6.683333
| 0.348155
| 27.572769
| false
| false
|
2024-08-24
|
2024-08-29
| 1
|
Etherll/Herplete-LLM-Llama-3.1-8b (Merge)
|
|
Etherll_Herplete-LLM-Llama-3.1-8b_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/Etherll/Herplete-LLM-Llama-3.1-8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Etherll/Herplete-LLM-Llama-3.1-8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Etherll__Herplete-LLM-Llama-3.1-8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
Etherll/Herplete-LLM-Llama-3.1-8b
|
d1383d993fad005d515be4d815797019601c679f
| 26.260139
| 5
| 8
| false
| false
| false
| false
| 0.854807
| 0.610598
| 61.059766
| 0.534725
| 33.206608
| 0.154834
| 15.483384
| 0.314597
| 8.612975
| 0.399052
| 8.614844
| 0.375249
| 30.583259
| false
| false
|
2024-08-24
|
2024-10-18
| 1
|
Etherll/Herplete-LLM-Llama-3.1-8b (Merge)
|
|
Etherll_Herplete-LLM-Llama-3.1-8b-Ties_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/Etherll/Herplete-LLM-Llama-3.1-8b-Ties" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Etherll/Herplete-LLM-Llama-3.1-8b-Ties</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Etherll__Herplete-LLM-Llama-3.1-8b-Ties-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
Etherll/Herplete-LLM-Llama-3.1-8b-Ties
| 26.571056
| 0
| 8
| false
| false
| false
| false
| 0.862201
| 0.616368
| 61.63679
| 0.533798
| 33.07089
| 0.162387
| 16.238671
| 0.317114
| 8.948546
| 0.401719
| 8.948177
| 0.375249
| 30.583259
| false
| false
|
2024-10-03
|
2024-10-17
| 1
|
Etherll/Herplete-LLM-Llama-3.1-8b-Ties (Merge)
|
||
Etherll_Qwen2.5-7B-della-test_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/Etherll/Qwen2.5-7B-della-test" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Etherll/Qwen2.5-7B-della-test</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Etherll__Qwen2.5-7B-della-test-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
Etherll/Qwen2.5-7B-della-test
|
c2b2ffc38627e68e7b43a1b596dc16ee93c1c63b
| 27.659468
| 1
| 7
| false
| false
| false
| true
| 1.385742
| 0.762497
| 76.249684
| 0.544733
| 35.546894
| 0
| 0
| 0.308725
| 7.829978
| 0.404698
| 8.98724
| 0.436087
| 37.343011
| false
| false
|
2024-11-01
|
2024-11-14
| 1
|
Etherll/Qwen2.5-7B-della-test (Merge)
|
|
Etherll_Qwen2.5-Coder-7B-Instruct-Ties_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/Etherll/Qwen2.5-Coder-7B-Instruct-Ties" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Etherll/Qwen2.5-Coder-7B-Instruct-Ties</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Etherll__Qwen2.5-Coder-7B-Instruct-Ties-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
Etherll/Qwen2.5-Coder-7B-Instruct-Ties
|
d8c1624a2fa60f05030e34a128af391b5d8be332
| 24.474445
| 0
| 7
| false
| false
| false
| false
| 1.197181
| 0.500539
| 50.053857
| 0.489514
| 28.008294
| 0.169184
| 16.918429
| 0.329698
| 10.626398
| 0.437281
| 13.426823
| 0.350316
| 27.812869
| false
| false
|
2024-09-30
|
2024-10-28
| 1
|
Etherll/Qwen2.5-Coder-7B-Instruct-Ties (Merge)
|
|
Etherll_Replete-LLM-V3-Llama-3.1-8b_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/Etherll/Replete-LLM-V3-Llama-3.1-8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Etherll/Replete-LLM-V3-Llama-3.1-8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Etherll__Replete-LLM-V3-Llama-3.1-8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
Etherll/Replete-LLM-V3-Llama-3.1-8b
|
e79849d72f70ef74677ed81a8885403973b2470c
| 17.927882
| 5
| 8
| false
| false
| false
| true
| 0.789329
| 0.526292
| 52.629246
| 0.454338
| 22.902455
| 0.000755
| 0.075529
| 0.268456
| 2.46085
| 0.351646
| 2.055729
| 0.346991
| 27.443484
| false
| false
|
2024-08-24
|
2024-08-26
| 1
|
Etherll/Replete-LLM-V3-Llama-3.1-8b (Merge)
|
|
Etherll_SuperHermes_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/Etherll/SuperHermes" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Etherll/SuperHermes</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Etherll__SuperHermes-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
Etherll/SuperHermes
|
7edd56cb37722d09b0334826e0532b223d334939
| 26.604602
| 1
| 8
| false
| false
| false
| false
| 0.750015
| 0.545902
| 54.590154
| 0.528953
| 32.840317
| 0.146526
| 14.652568
| 0.323826
| 9.8434
| 0.440042
| 14.938542
| 0.394864
| 32.762633
| false
| false
|
2024-10-27
|
2024-10-27
| 1
|
Etherll/SuperHermes (Merge)
|
|
Eurdem_Defne-llama3.1-8B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/Eurdem/Defne-llama3.1-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Eurdem/Defne-llama3.1-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Eurdem__Defne-llama3.1-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
Eurdem/Defne-llama3.1-8B
|
7832ba3066636bf4dab3e7d658c0b3ded12491ae
| 25.095429
|
llama3.1
| 3
| 8
| true
| false
| false
| false
| 1.7203
| 0.503612
| 50.361153
| 0.532098
| 32.822381
| 0.15861
| 15.861027
| 0.296141
| 6.152125
| 0.433094
| 13.536719
| 0.386553
| 31.83917
| false
| false
|
2024-07-29
|
2024-08-14
| 0
|
Eurdem/Defne-llama3.1-8B
|
FallenMerick_Chewy-Lemon-Cookie-11B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/FallenMerick/Chewy-Lemon-Cookie-11B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FallenMerick/Chewy-Lemon-Cookie-11B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FallenMerick__Chewy-Lemon-Cookie-11B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
FallenMerick/Chewy-Lemon-Cookie-11B
|
0f5d0d6d218b3ef034f58eba32d6fe7ac4c237ae
| 22.018549
|
cc-by-4.0
| 0
| 10
| true
| false
| false
| false
| 0.857274
| 0.487524
| 48.752421
| 0.525112
| 33.0143
| 0.05287
| 5.287009
| 0.279362
| 3.914989
| 0.454552
| 15.952344
| 0.326712
| 25.190233
| true
| false
|
2024-06-06
|
2024-06-27
| 1
|
FallenMerick/Chewy-Lemon-Cookie-11B (Merge)
|
Felladrin_Llama-160M-Chat-v1_float16
|
float16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/Felladrin/Llama-160M-Chat-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Felladrin/Llama-160M-Chat-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Felladrin__Llama-160M-Chat-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
Felladrin/Llama-160M-Chat-v1
|
e7f50665676821867ee7dfad32d0ca9fb68fc6bc
| 4.101061
|
apache-2.0
| 16
| 0
| true
| false
| false
| true
| 0.181581
| 0.157546
| 15.754642
| 0.303608
| 3.166756
| 0
| 0
| 0.25755
| 1.006711
| 0.366125
| 3.165625
| 0.113614
| 1.512633
| false
| false
|
2023-12-20
|
2024-07-23
| 1
|
JackFram/llama-160m
|
Felladrin_Minueza-32M-UltraChat_float16
|
float16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/Felladrin/Minueza-32M-UltraChat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Felladrin/Minueza-32M-UltraChat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Felladrin__Minueza-32M-UltraChat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
Felladrin/Minueza-32M-UltraChat
|
28506b99c5902d2215eb378ec91d4226a7396c49
| 3.848727
|
apache-2.0
| 4
| 0
| true
| false
| false
| true
| 0.168067
| 0.137563
| 13.756278
| 0.294148
| 2.43729
| 0
| 0
| 0.255872
| 0.782998
| 0.374187
| 4.640104
| 0.113281
| 1.475694
| false
| false
|
2024-02-27
|
2024-07-23
| 1
|
Felladrin/Minueza-32M-Base
|
FlofloB_100k_fineweb_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit_float16
|
float16
|
🟩 continuously pretrained
|
🟩
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/FlofloB/100k_fineweb_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/100k_fineweb_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__100k_fineweb_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
FlofloB/100k_fineweb_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit
|
ea6ceae8a6894f1c6ea3fe978846b2a66c3e369c
| 7.871072
|
apache-2.0
| 1
| 0
| true
| false
| false
| true
| 0.483694
| 0.308322
| 30.832192
| 0.332339
| 7.347825
| 0
| 0
| 0.269295
| 2.572707
| 0.330219
| 0.94401
| 0.149767
| 5.529699
| false
| false
|
2024-11-28
|
2024-11-29
| 3
|
Qwen/Qwen2.5-0.5B
|
FlofloB_10k_continued_pretraining_Phi-3-mini-4k-instruct_Unsloth_merged_16bit_float16
|
float16
|
🟩 continuously pretrained
|
🟩
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/FlofloB/10k_continued_pretraining_Phi-3-mini-4k-instruct_Unsloth_merged_16bit" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/10k_continued_pretraining_Phi-3-mini-4k-instruct_Unsloth_merged_16bit</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__10k_continued_pretraining_Phi-3-mini-4k-instruct_Unsloth_merged_16bit-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
FlofloB/10k_continued_pretraining_Phi-3-mini-4k-instruct_Unsloth_merged_16bit
|
a2eb0460779e76bb511339bcc2545b4729c9d78e
| 23.879918
|
apache-2.0
| 1
| 16
| true
| false
| false
| true
| 0.487545
| 0.509731
| 50.973085
| 0.521499
| 32.6078
| 0.087613
| 8.761329
| 0.299497
| 6.599553
| 0.430958
| 13.569792
| 0.376912
| 30.767952
| false
| false
|
2024-11-22
|
2024-11-22
| 1
|
unsloth/phi-3-mini-4k-instruct-bnb-4bit
|
FlofloB_10k_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit_float16
|
float16
|
🟩 continuously pretrained
|
🟩
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/FlofloB/10k_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/10k_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__10k_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
FlofloB/10k_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit
|
2152657b389375f48fc5073413bba17835117bcc
| 7.847811
|
apache-2.0
| 1
| 0
| true
| false
| false
| true
| 0.508365
| 0.281544
| 28.154408
| 0.330552
| 7.530229
| 0
| 0
| 0.279362
| 3.914989
| 0.330219
| 1.477344
| 0.154089
| 6.0099
| false
| false
|
2024-11-25
|
2024-11-25
| 3
|
Qwen/Qwen2.5-0.5B
|
FlofloB_40k_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit_float16
|
float16
|
🟩 continuously pretrained
|
🟩
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/FlofloB/40k_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/40k_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__40k_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
FlofloB/40k_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit
|
64c61d9c777da56597a338afd7586cc4ad07d350
| 7.827703
|
apache-2.0
| 1
| 0
| true
| false
| false
| true
| 0.481567
| 0.301578
| 30.157759
| 0.332461
| 7.53209
| 0
| 0
| 0.267617
| 2.348993
| 0.340823
| 1.536198
| 0.148521
| 5.391179
| false
| false
|
2024-11-25
|
2024-11-25
| 3
|
Qwen/Qwen2.5-0.5B
|