| eval_name<br>stringlengths 12 to 111 | Precision<br>stringclasses 3 values | Type<br>stringclasses 7 values | T<br>stringclasses 7 values | Weight type<br>stringclasses 2 values | Architecture<br>stringclasses 64 values | Model<br>stringlengths 355 to 689 | fullname<br>stringlengths 4 to 102 | Model sha<br>stringlengths 0 to 40 | Average ⬆️<br>float64 0.74 to 52.1 | Hub License<br>stringclasses 27 values | Hub ❤️<br>int64 0 to 6.09k | #Params (B)<br>float64 -1 to 141 | Available on the hub<br>bool 2 classes | MoE<br>bool 2 classes | Flagged<br>bool 2 classes | Chat Template<br>bool 2 classes | CO₂ cost (kg)<br>float64 0.04 to 187 | IFEval Raw<br>float64 0 to 0.9 | IFEval<br>float64 0 to 90 | BBH Raw<br>float64 0.22 to 0.83 | BBH<br>float64 0.25 to 76.7 | MATH Lvl 5 Raw<br>float64 0 to 0.71 | MATH Lvl 5<br>float64 0 to 71.5 | GPQA Raw<br>float64 0.21 to 0.47 | GPQA<br>float64 0 to 29.4 | MUSR Raw<br>float64 0.29 to 0.6 | MUSR<br>float64 0 to 38.7 | MMLU-PRO Raw<br>float64 0.1 to 0.73 | MMLU-PRO<br>float64 0 to 70 | Merged<br>bool 2 classes | Official Providers<br>bool 2 classes | Upload To Hub Date<br>stringclasses 525 values | Submission Date<br>stringclasses 263 values | Generation<br>int64 0 to 10 | Base Model<br>stringlengths 4 to 102 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| stabilityai_stablelm-2-1_6b-chat_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | StableLmForCausalLM | <a target="_blank" href="https://huggingface.co/stabilityai/stablelm-2-1_6b-chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">stabilityai/stablelm-2-1_6b-chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/stabilityai__stablelm-2-1_6b-chat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | stabilityai/stablelm-2-1_6b-chat | f3fe67057c2789ae1bb1fe42b038da99840d4f13 | 8.867361 | other | 33 | 1.645 | true | false | false | true | 0.990853 | 0.305999 | 30.599919 | 0.339017 | 7.493378 | 0.024924 | 2.492447 | 0.247483 | 0 | 0.357969 | 5.71276 | 0.162151 | 6.905659 | false | true | 2024-04-08 | 2024-06-12 | 0 | stabilityai/stablelm-2-1_6b-chat |
| stabilityai_stablelm-2-zephyr-1_6b_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | StableLmForCausalLM | <a target="_blank" href="https://huggingface.co/stabilityai/stablelm-2-zephyr-1_6b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">stabilityai/stablelm-2-zephyr-1_6b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/stabilityai__stablelm-2-zephyr-1_6b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | stabilityai/stablelm-2-zephyr-1_6b | 2f275b1127d59fc31e4f7c7426d528768ada9ea4 | 9.458168 | other | 182 | 1.645 | true | false | false | true | 0.946177 | 0.327931 | 32.7931 | 0.335161 | 6.70871 | 0.033233 | 3.323263 | 0.243289 | 0 | 0.351146 | 5.993229 | 0.171376 | 7.930703 | false | true | 2024-01-19 | 2024-06-12 | 0 | stabilityai/stablelm-2-zephyr-1_6b |
| stabilityai_stablelm-3b-4e1t_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | StableLmForCausalLM | <a target="_blank" href="https://huggingface.co/stabilityai/stablelm-3b-4e1t" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">stabilityai/stablelm-3b-4e1t</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/stabilityai__stablelm-3b-4e1t-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | stabilityai/stablelm-3b-4e1t | fa4a6a92fca83c3b4223a3c9bf792887090ebfba | 7.326191 | cc-by-sa-4.0 | 310 | 2.795 | true | false | false | false | 0.86853 | 0.22032 | 22.031986 | 0.350421 | 9.01307 | 0.010574 | 1.057402 | 0.237416 | 0 | 0.377781 | 4.422656 | 0.166888 | 7.432033 | false | true | 2023-09-29 | 2024-08-10 | 0 | stabilityai/stablelm-3b-4e1t |
| stabilityai_stablelm-zephyr-3b_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | StableLmForCausalLM | <a target="_blank" href="https://huggingface.co/stabilityai/stablelm-zephyr-3b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">stabilityai/stablelm-zephyr-3b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/stabilityai__stablelm-zephyr-3b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | stabilityai/stablelm-zephyr-3b | a14f62d95754d96aea2be6e24c0f6966636797b9 | 12.369207 | other | 253 | 2.795 | true | false | false | true | 0.768047 | 0.368323 | 36.832272 | 0.386636 | 14.759119 | 0.043051 | 4.305136 | 0.239094 | 0 | 0.418302 | 9.78776 | 0.176779 | 8.530954 | false | true | 2023-11-21 | 2024-06-12 | 0 | stabilityai/stablelm-zephyr-3b |
| sthenno_tempesthenno-0120_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/sthenno/tempesthenno-0120" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sthenno/tempesthenno-0120</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sthenno__tempesthenno-0120-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | sthenno/tempesthenno-0120 | 89ddd0c32c5fdc31060cd50b3cbaf52dd4ffae8a | 36.467061 | | 0 | 14.766 | false | false | false | false | 5.683678 | 0.539032 | 53.903199 | 0.637317 | 47.909018 | 0.335347 | 33.534743 | 0.394295 | 19.239374 | 0.463323 | 16.548698 | 0.529006 | 47.667332 | false | false | 2025-01-29 | | 0 | Removed |
| sthenno_tempesthenno-fusion-0309_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/sthenno/tempesthenno-fusion-0309" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sthenno/tempesthenno-fusion-0309</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sthenno__tempesthenno-fusion-0309-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | sthenno/tempesthenno-fusion-0309 | 1b89614a5b377efc9da0c320cbd2dbb8322e6c2a | 42.13889 | apache-2.0 | 2 | 14.766 | true | false | false | true | 1.62178 | 0.769191 | 76.91913 | 0.658088 | 50.979856 | 0.476586 | 47.65861 | 0.369966 | 15.995526 | 0.43251 | 13.963802 | 0.525848 | 47.316415 | true | false | 2025-03-08 | 2025-03-08 | 1 | sthenno/tempesthenno-fusion-0309 (Merge) |
| sthenno_tempesthenno-kto-0205-ckpt80_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/sthenno/tempesthenno-kto-0205-ckpt80" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sthenno/tempesthenno-kto-0205-ckpt80</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sthenno__tempesthenno-kto-0205-ckpt80-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | sthenno/tempesthenno-kto-0205-ckpt80 | 9ed4d0238da8de732203da2d07e342e56c2538dd | 41.790944 | apache-2.0 | 3 | 14.766 | true | false | false | false | 5.002423 | 0.805436 | 80.543624 | 0.654274 | 50.643797 | 0.459215 | 45.92145 | 0.348154 | 13.087248 | 0.42476 | 12.928385 | 0.52859 | 47.621158 | false | false | 2025-02-05 | 2025-02-05 | 1 | sthenno/tempesthenno-kto-0205-ckpt80 (Merge) |
| sthenno_tempesthenno-nuslerp-001_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/sthenno/tempesthenno-nuslerp-001" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sthenno/tempesthenno-nuslerp-001</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sthenno__tempesthenno-nuslerp-001-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | sthenno/tempesthenno-nuslerp-001 | d507c25ccad162616c5d6d8fee3612324ee521f4 | 42.586152 | apache-2.0 | 4 | 14.766 | true | false | false | true | 2.95422 | 0.792647 | 79.264684 | 0.657768 | 51.044911 | 0.475831 | 47.583082 | 0.373322 | 16.442953 | 0.43 | 13.883333 | 0.525682 | 47.297946 | true | false | 2025-01-11 | 2025-01-27 | 1 | sthenno/tempesthenno-nuslerp-001 (Merge) |
| sthenno_tempesthenno-nuslerp-0124_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/sthenno/tempesthenno-nuslerp-0124" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sthenno/tempesthenno-nuslerp-0124</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sthenno__tempesthenno-nuslerp-0124-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | sthenno/tempesthenno-nuslerp-0124 | 9769b900fbc116b28cb618e7f16c92552b78b5ff | 41.287889 | apache-2.0 | 4 | 14.766 | true | false | false | false | 5.399205 | 0.700398 | 70.039828 | 0.646855 | 49.276795 | 0.411631 | 41.163142 | 0.390101 | 18.680089 | 0.485927 | 20.207552 | 0.535239 | 48.359929 | true | false | 2025-01-27 | 2025-01-29 | 1 | sthenno/tempesthenno-nuslerp-0124 (Merge) |
| sthenno_tempesthenno-ppo-ckpt40_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/sthenno/tempesthenno-ppo-ckpt40" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sthenno/tempesthenno-ppo-ckpt40</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sthenno__tempesthenno-ppo-ckpt40-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | sthenno/tempesthenno-ppo-ckpt40 | c7e00f975d12b48394474908d0596e4be2957e05 | 42.73562 | apache-2.0 | 4 | 14.766 | true | false | false | true | 5.832257 | 0.792322 | 79.232215 | 0.65496 | 50.573172 | 0.473565 | 47.356495 | 0.377517 | 17.002237 | 0.435177 | 14.563802 | 0.529172 | 47.685801 | false | false | 2025-01-16 | 2025-01-27 | 1 | sthenno/tempesthenno-ppo-ckpt40 (Merge) |
| sthenno_tempesthenno-sft-0309-ckpt10_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/sthenno/tempesthenno-sft-0309-ckpt10" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sthenno/tempesthenno-sft-0309-ckpt10</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sthenno__tempesthenno-sft-0309-ckpt10-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | sthenno/tempesthenno-sft-0309-ckpt10 | e13c4281c3cccf9fded2ec8c3b2ef6d24c906403 | 42.192397 | apache-2.0 | 2 | 14.766 | true | false | false | true | 1.550085 | 0.774362 | 77.436203 | 0.655165 | 50.600903 | 0.472054 | 47.205438 | 0.371644 | 16.219239 | 0.436417 | 14.385417 | 0.525765 | 47.307181 | false | false | 2025-03-08 | 2025-03-08 | 1 | sthenno/tempesthenno-sft-0309-ckpt10 (Merge) |
| sthenno_tempesthenno-sft-0314-stage1-ckpt50_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/sthenno/tempesthenno-sft-0314-stage1-ckpt50" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sthenno/tempesthenno-sft-0314-stage1-ckpt50</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sthenno__tempesthenno-sft-0314-stage1-ckpt50-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | sthenno/tempesthenno-sft-0314-stage1-ckpt50 | d82f4a9e3272a0776f5461664ec0cca123f21495 | 41.886892 | apache-2.0 | 3 | 14.766 | true | false | false | true | 1.514297 | 0.739366 | 73.936599 | 0.660102 | 51.259313 | 0.468278 | 46.827795 | 0.373322 | 16.442953 | 0.442865 | 15.058073 | 0.53017 | 47.796616 | false | false | 2025-03-13 | 2025-03-13 | 1 | sthenno/tempesthenno-sft-0314-stage1-ckpt50 (Merge) |
| sthenno_tempestissimo-14b-0309_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/sthenno/tempestissimo-14b-0309" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sthenno/tempestissimo-14b-0309</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sthenno__tempestissimo-14b-0309-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | sthenno/tempestissimo-14b-0309 | cff092839c0ce638f754c4ab743c3cd1bdc69f16 | 41.88724 | apache-2.0 | 4 | 14.766 | true | false | false | true | 1.571916 | 0.754878 | 75.487817 | 0.658733 | 50.922767 | 0.479607 | 47.960725 | 0.366611 | 15.548098 | 0.43124 | 13.838281 | 0.528092 | 47.565751 | true | false | 2025-03-08 | 2025-03-08 | 1 | sthenno/tempestissimo-14b-0309 (Merge) |
| sthenno-com_miscii-14b-0130_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/sthenno-com/miscii-14b-0130" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sthenno-com/miscii-14b-0130</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sthenno-com__miscii-14b-0130-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | sthenno-com/miscii-14b-0130 | df4b3c169aeab40831f87751076bc67c32209fe8 | 41.085926 | apache-2.0 | 8 | 14.766 | true | false | false | false | 3.886329 | 0.664703 | 66.470299 | 0.650541 | 49.838839 | 0.432024 | 43.202417 | 0.381711 | 17.561521 | 0.491167 | 20.9625 | 0.53632 | 48.479979 | true | false | 2025-01-30 | 2025-01-30 | 1 | sthenno-com/miscii-14b-0130 (Merge) |
| sthenno-com_miscii-14b-0218_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/sthenno-com/miscii-14b-0218" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sthenno-com/miscii-14b-0218</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sthenno-com__miscii-14b-0218-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | sthenno-com/miscii-14b-0218 | 9fba75f9b793d0e79e1b0174f54c6919bbc66d67 | 42.89726 | apache-2.0 | 20 | 14.766 | true | false | false | true | 1.546942 | 0.765594 | 76.559418 | 0.655871 | 50.644566 | 0.51435 | 51.435045 | 0.383389 | 17.785235 | 0.427271 | 13.208854 | 0.529754 | 47.750443 | true | false | 2025-02-19 | 2025-02-19 | 1 | sthenno-com/miscii-14b-0218 (Merge) |
| sthenno-com_miscii-14b-1028_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/sthenno-com/miscii-14b-1028" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sthenno-com/miscii-14b-1028</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sthenno-com__miscii-14b-1028-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | sthenno-com/miscii-14b-1028 | a60c866621ee35d04e84cf366e972f2466d617b1 | 42.3807 | apache-2.0 | 18 | 14.77 | true | false | false | true | 3.067457 | 0.823671 | 82.367119 | 0.644833 | 49.262668 | 0.503021 | 50.302115 | 0.356544 | 14.205817 | 0.418156 | 12.002865 | 0.515293 | 46.143617 | false | false | 2024-11-12 | 2024-11-17 | 1 | sthenno-com/miscii-14b-1028 (Merge) |
| sthenno-com_miscii-14b-1225_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/sthenno-com/miscii-14b-1225" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sthenno-com/miscii-14b-1225</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sthenno-com__miscii-14b-1225-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | sthenno-com/miscii-14b-1225 | 3d26f676424307cc2496c6b11710bbfa35275685 | 42.349512 | apache-2.0 | 25 | 14.766 | true | false | false | true | 2.896994 | 0.787801 | 78.780081 | 0.657171 | 50.912806 | 0.451662 | 45.166163 | 0.377517 | 17.002237 | 0.436573 | 14.771615 | 0.527178 | 47.46417 | true | false | 2024-12-24 | 2024-12-24 | 1 | sthenno-com/miscii-14b-1225 (Merge) |
| streamerbtw1002_Nexuim-R1-7B-Instruct_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/streamerbtw1002/Nexuim-R1-7B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">streamerbtw1002/Nexuim-R1-7B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/streamerbtw1002__Nexuim-R1-7B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | streamerbtw1002/Nexuim-R1-7B-Instruct | f53f6fa3ec8ec90cd2d62b8ee232ad49695c323a | 30.443047 | apache-2.0 | 0 | 7.616 | true | false | false | true | 0.670074 | 0.693429 | 69.342899 | 0.517517 | 31.444219 | 0.445619 | 44.561934 | 0.259228 | 1.230425 | 0.335552 | 1.210677 | 0.413813 | 34.868129 | false | false | 2025-03-11 | 2025-03-12 | 2 | Qwen/Qwen2.5-7B |
| stupidity-ai_Llama-3-8B-Instruct-MultiMoose_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/stupidity-ai/Llama-3-8B-Instruct-MultiMoose" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">stupidity-ai/Llama-3-8B-Instruct-MultiMoose</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/stupidity-ai__Llama-3-8B-Instruct-MultiMoose-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | stupidity-ai/Llama-3-8B-Instruct-MultiMoose | 2aff10399de6ed9206a59a48c49bd704962cca1a | 4.768702 | llama3 | 0 | 8.03 | true | false | false | true | 0.777553 | 0.23181 | 23.181049 | 0.282297 | 1.207693 | 0 | 0 | 0.253356 | 0.447427 | 0.348542 | 2.734375 | 0.109375 | 1.041667 | true | false | 2025-02-17 | 2025-03-07 | 1 | stupidity-ai/Llama-3-8B-Instruct-MultiMoose (Merge) |
| suayptalha_Clarus-7B-v0.1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/suayptalha/Clarus-7B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">suayptalha/Clarus-7B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/suayptalha__Clarus-7B-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | suayptalha/Clarus-7B-v0.1 | 0d2982fbacb05c10a97af807f0649fcad7a82479 | 36.70526 | mit | 5 | 7.616 | true | false | false | true | 0.646591 | 0.745411 | 74.541106 | 0.549661 | 36.031164 | 0.492447 | 49.244713 | 0.307047 | 7.606264 | 0.442958 | 15.169792 | 0.438747 | 37.63852 | true | false | 2025-02-24 | 2025-02-24 | 1 | suayptalha/Clarus-7B-v0.1 (Merge) |
| suayptalha_Clarus-7B-v0.2_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/suayptalha/Clarus-7B-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">suayptalha/Clarus-7B-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/suayptalha__Clarus-7B-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | suayptalha/Clarus-7B-v0.2 | 63712049fb59216dae1f2e2f5c993e235b21b6c7 | 36.860643 | mit | 4 | 7.613 | true | false | false | true | 0.674242 | 0.767942 | 76.794239 | 0.549006 | 36.018804 | 0.48565 | 48.564955 | 0.302013 | 6.935123 | 0.441656 | 15.073698 | 0.439993 | 37.777039 | true | false | 2025-02-24 | 2025-02-24 | 1 | suayptalha/Clarus-7B-v0.2 (Merge) |
| suayptalha_Clarus-7B-v0.3_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/suayptalha/Clarus-7B-v0.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">suayptalha/Clarus-7B-v0.3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/suayptalha__Clarus-7B-v0.3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | suayptalha/Clarus-7B-v0.3 | e1e28ebc8cb7da944cc22aa9e65a322bff2731ef | 36.776154 | mit | 4 | 7.616 | true | false | false | true | 0.6434 | 0.750906 | 75.090648 | 0.552599 | 36.457869 | 0.487915 | 48.791541 | 0.312081 | 8.277405 | 0.440229 | 14.428646 | 0.438497 | 37.610816 | true | false | 2025-02-28 | 2025-02-28 | 1 | suayptalha/Clarus-7B-v0.3 (Merge) |
| suayptalha_DeepSeek-R1-Distill-Llama-3B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/suayptalha/DeepSeek-R1-Distill-Llama-3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">suayptalha/DeepSeek-R1-Distill-Llama-3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/suayptalha__DeepSeek-R1-Distill-Llama-3B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | suayptalha/DeepSeek-R1-Distill-Llama-3B | 5980166d03fa0d2a63f6dfbf59fe6b5abc7005e0 | 23.273682 | mit | 11 | 3.213 | true | false | false | true | 1.216313 | 0.709266 | 70.926586 | 0.445179 | 21.448756 | 0.209215 | 20.92145 | 0.260906 | 1.454139 | 0.339583 | 2.914583 | 0.297789 | 21.976581 | false | false | 2025-02-23 | 2025-02-24 | 1 | suayptalha/DeepSeek-R1-Distill-Llama-3B (Merge) |
| suayptalha_Falcon3-Jessi-v0.4-7B-Slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/suayptalha/Falcon3-Jessi-v0.4-7B-Slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">suayptalha/Falcon3-Jessi-v0.4-7B-Slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/suayptalha__Falcon3-Jessi-v0.4-7B-Slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | suayptalha/Falcon3-Jessi-v0.4-7B-Slerp | bf21816f8fbfcaab7cfc811c9ffd13b25988514b | 36.077232 | other | 9 | 7.456 | true | false | false | true | 1.455744 | 0.767618 | 76.76177 | 0.559093 | 37.285897 | 0.396526 | 39.652568 | 0.312081 | 8.277405 | 0.481219 | 20.485677 | 0.406001 | 34.000074 | true | false | 2025-01-20 | 2025-01-20 | 1 | suayptalha/Falcon3-Jessi-v0.4-7B-Slerp (Merge) |
| suayptalha_HomerCreativeAnvita-Mix-Qw7B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/suayptalha/HomerCreativeAnvita-Mix-Qw7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">suayptalha/HomerCreativeAnvita-Mix-Qw7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/suayptalha__HomerCreativeAnvita-Mix-Qw7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | suayptalha/HomerCreativeAnvita-Mix-Qw7B | 5be9b48b59652687d3e5b88f9e935b51869756ad | 35.464382 | apache-2.0 | 12 | 7.616 | true | false | false | true | 1.299761 | 0.780782 | 78.078166 | 0.556465 | 36.984168 | 0.361027 | 36.102719 | 0.314597 | 8.612975 | 0.441594 | 14.732552 | 0.444481 | 38.275709 | true | false | 2024-11-22 | 2024-11-24 | 1 | suayptalha/HomerCreativeAnvita-Mix-Qw7B (Merge) |
| suayptalha_Komodo-Llama-3.2-3B-v2-fp16_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/suayptalha/Komodo-Llama-3.2-3B-v2-fp16" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">suayptalha/Komodo-Llama-3.2-3B-v2-fp16</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/suayptalha__Komodo-Llama-3.2-3B-v2-fp16-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | suayptalha/Komodo-Llama-3.2-3B-v2-fp16 | 1ff4b55d952597429c249ca71dc08b823eba17c0 | 20.317373 | apache-2.0 | 6 | 3 | true | false | false | true | 1.19613 | 0.634053 | 63.40532 | 0.4355 | 20.204329 | 0.106495 | 10.649547 | 0.277685 | 3.691275 | 0.340573 | 3.371615 | 0.285239 | 20.582151 | false | false | 2024-11-19 | 2024-11-19 | 1 | suayptalha/Komodo-Llama-3.2-3B-v2-fp16 (Merge) |
| suayptalha_Lamarckvergence-14B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/suayptalha/Lamarckvergence-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">suayptalha/Lamarckvergence-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/suayptalha__Lamarckvergence-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | suayptalha/Lamarckvergence-14B | 7a1463829bbb7f8f7ad4b92e96260a3a27997bbe | 43.320333 | apache-2.0 | 19 | 14.766 | true | false | false | true | 3.161 | 0.765594 | 76.559418 | 0.651699 | 50.329236 | 0.54003 | 54.003021 | 0.363255 | 15.100671 | 0.442156 | 16.336198 | 0.528341 | 47.593454 | true | false | 2025-02-06 | 2025-02-06 | 1 | suayptalha/Lamarckvergence-14B (Merge) |
| suayptalha_Lix-14B-v0.1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/suayptalha/Lix-14B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">suayptalha/Lix-14B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/suayptalha__Lix-14B-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | suayptalha/Lix-14B-v0.1 | 058e2f097fec3761d7383464673e0dda25192f7e | 43.317632 | apache-2.0 | 8 | 14.766 | true | false | false | true | 1.617435 | 0.781331 | 78.133131 | 0.660791 | 51.473725 | 0.529456 | 52.945619 | 0.369966 | 15.995526 | 0.433781 | 13.422656 | 0.531416 | 47.935136 | true | false | 2025-03-06 | 2025-03-06 | 1 | suayptalha/Lix-14B-v0.1 (Merge) |
| suayptalha_Luminis-phi-4_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/suayptalha/Luminis-phi-4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">suayptalha/Luminis-phi-4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/suayptalha__Luminis-phi-4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | suayptalha/Luminis-phi-4 | 8415367af0b7dfa4b2c3aaf0a4fd281b350b011f | 41.757466 | mit | 11 | 14.66 | true | false | false | true | 1.838789 | 0.690007 | 69.000696 | 0.692021 | 55.80283 | 0.463746 | 46.374622 | 0.35151 | 13.534676 | 0.457156 | 16.677865 | 0.542387 | 49.154108 | true | false | 2025-02-06 | 2025-02-06 | 1 | suayptalha/Luminis-phi-4 (Merge) |
| suayptalha_Maestro-10B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/suayptalha/Maestro-10B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">suayptalha/Maestro-10B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/suayptalha__Maestro-10B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | suayptalha/Maestro-10B | d37a2f19d52242ceb836466635982921f33a69b0 | 32.831841 | other | 7 | 10.306 | true | false | false | true | 1.813117 | 0.77676 | 77.676011 | 0.574609 | 39.544981 | 0.191088 | 19.108761 | 0.333054 | 11.073826 | 0.439729 | 13.832813 | 0.421792 | 35.754654 | false | false | 2025-01-31 | 2025-01-31 | 1 | suayptalha/Maestro-10B (Merge) |
| suayptalha_Rombos-2.5-T.E-8.1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/suayptalha/Rombos-2.5-T.E-8.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">suayptalha/Rombos-2.5-T.E-8.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/suayptalha__Rombos-2.5-T.E-8.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | suayptalha/Rombos-2.5-T.E-8.1 | c0ee2950b07377e1d0e01fc013a0f200b0306ea2 | 35.404162 | cc-by-nc-sa-4.0 | 7 | 7.616 | true | false | false | true | 1.372031 | 0.692505 | 69.250478 | 0.551464 | 36.499861 | 0.492447 | 49.244713 | 0.311242 | 8.165548 | 0.416635 | 10.979427 | 0.444564 | 38.284944 | true | false | 2024-11-16 | 2024-11-16 | 1 | suayptalha/Rombos-2.5-T.E-8.1 (Merge) |
| sumink_Qmerft_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/sumink/Qmerft" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sumink/Qmerft</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sumink__Qmerft-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | sumink/Qmerft | dd12d37190a97eaff0c8180a1c679097a9aaa393 | 3.970463 | | 0 | 1.777 | false | false | false | false | 1.220912 | 0.156397 | 15.639725 | 0.293909 | 1.949014 | 0.002266 | 0.226586 | 0.252517 | 0.33557 | 0.36876 | 3.928385 | 0.115691 | 1.743499 | false | false | 2025-01-15 | 2025-01-15 | 1 | sumink/Qmerft (Merge) |
| sumink_Qwenftmodel_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/sumink/Qwenftmodel" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sumink/Qwenftmodel</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sumink__Qwenftmodel-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | sumink/Qwenftmodel | 7fe96b05b36aaa1be229c436b4fe3b476be9e2dd | 10.104914 | other | 0 | 1.544 | true | false | false | false | 2.028829 | 0.172909 | 17.290899 | 0.38227 | 14.041352 | 0.089124 | 8.912387 | 0.256711 | 0.894855 | 0.361719 | 4.614844 | 0.233876 | 14.875148 | false | false | 2024-12-05 | 2024-12-05 | 0 | sumink/Qwenftmodel |
| sumink_Qwenmplus_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/sumink/Qwenmplus" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sumink/Qwenmplus</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sumink__Qwenmplus-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | sumink/Qwenmplus | 2f6d29692e18a32bc179e81d09d4ecdefefb85d8 | 9.390912 | other | 0 | 1.543 | true | false | false | false | 2.191708 | 0.204033 | 20.403308 | 0.367551 | 12.706589 | 0.024924 | 2.492447 | 0.285235 | 4.697987 | 0.382833 | 5.020833 | 0.199219 | 11.024306 | false | false | 2025-01-03 | 2025-01-03 | 0 | sumink/Qwenmplus |
sumink_Qwensci_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/sumink/Qwensci" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sumink/Qwensci</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sumink__Qwensci-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sumink/Qwensci
|
5cfce5a410358536c582e79a8484600ae384991a
| 5.56254
|
other
| 0
| 1.543
| true
| false
| false
| false
| 2.085459
| 0.173983
| 17.398281
| 0.328187
| 6.319843
| 0.020393
| 2.039275
| 0.258389
| 1.118568
| 0.360885
| 3.610677
| 0.125997
| 2.888593
| false
| false
|
2025-01-03
|
2025-01-03
| 0
|
sumink/Qwensci
|
sumink_bbhqwen_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/sumink/bbhqwen" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sumink/bbhqwen</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sumink__bbhqwen-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sumink/bbhqwen
|
0e0815e15549c966e25dcf7e1bbd84d998878ba7
| 7.833827
| 0
| 3.086
| true
| false
| false
| false
| 0.789214
| 0.180852
| 18.085236
| 0.338825
| 6.631749
| 0.010574
| 1.057402
| 0.25755
| 1.006711
| 0.43524
| 13.371615
| 0.161652
| 6.850251
| false
| false
|
2025-02-25
|
2025-02-25
| 0
|
sumink/bbhqwen
|
|
sumink_bbhqwen2_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/sumink/bbhqwen2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sumink/bbhqwen2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sumink__bbhqwen2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sumink/bbhqwen2
|
407c3fba41610ba6a42409221f58ae1bc758626d
| 6.258601
| 0
| 3.086
| true
| false
| false
| false
| 0.792937
| 0.1533
| 15.329991
| 0.306632
| 3.864304
| 0.006042
| 0.60423
| 0.262584
| 1.677852
| 0.443052
| 14.414844
| 0.114943
| 1.660387
| false
| false
|
2025-02-25
|
2025-02-25
| 0
|
sumink/bbhqwen2
|
|
sumink_bbhqwen3_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/sumink/bbhqwen3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sumink/bbhqwen3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sumink__bbhqwen3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sumink/bbhqwen3
|
d8075190856fdab9a14a4aa56f2a962dda6b8436
| 4.947843
| 0
| 3.086
| true
| false
| false
| false
| 0.769589
| 0.194291
| 19.429115
| 0.295084
| 2.18766
| 0
| 0
| 0.25755
| 1.006711
| 0.379615
| 5.21849
| 0.116606
| 1.84508
| false
| false
|
2025-02-25
|
2025-02-25
| 0
|
sumink/bbhqwen3
|
|
sumink_bbhqwen4_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/sumink/bbhqwen4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sumink/bbhqwen4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sumink__bbhqwen4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sumink/bbhqwen4
|
44934bb91bbfec64cef43bc0a987937bdbba960a
| 5.656084
| 0
| 3.086
| true
| false
| false
| false
| 0.788585
| 0.144857
| 14.485676
| 0.31994
| 4.892301
| 0.006042
| 0.60423
| 0.244128
| 0
| 0.402896
| 8.295312
| 0.150931
| 5.658983
| false
| false
|
2025-02-25
|
2025-02-25
| 0
|
sumink/bbhqwen4
|
|
sumink_bbhqwen5_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/sumink/bbhqwen5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sumink/bbhqwen5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sumink__bbhqwen5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sumink/bbhqwen5
|
81acc4a3dcb973997691073e2395e3c43acd5620
| 5.199437
| 0
| 3.086
| true
| false
| false
| false
| 0.763493
| 0.152151
| 15.215074
| 0.29131
| 2.813265
| 0.002266
| 0.226586
| 0.260067
| 1.342282
| 0.401938
| 10.142187
| 0.113115
| 1.457225
| false
| false
|
2025-02-25
|
2025-02-25
| 0
|
sumink/bbhqwen5
|
|
sumink_bbhqwen6_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/sumink/bbhqwen6" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sumink/bbhqwen6</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sumink__bbhqwen6-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sumink/bbhqwen6
|
1b75baaea3ec9cabe8998ea5f3ee2cdb60d3de27
| 4.366129
| 0
| 3.086
| true
| false
| false
| false
| 0.774703
| 0.189296
| 18.929551
| 0.278224
| 2.129704
| 0.000755
| 0.075529
| 0.258389
| 1.118568
| 0.357969
| 2.246094
| 0.115276
| 1.697326
| false
| false
|
2025-02-25
|
2025-02-25
| 0
|
sumink/bbhqwen6
|
|
sumink_flflmillama_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/sumink/flflmillama" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sumink/flflmillama</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sumink__flflmillama-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sumink/flflmillama
|
e6e15070ab0783d5d75f6a67a57b26d86c989079
| 9.043355
| 0
| 3.213
| true
| false
| false
| false
| 1.186744
| 0.167563
| 16.756318
| 0.385113
| 13.745934
| 0.019637
| 1.963746
| 0.291946
| 5.592841
| 0.359115
| 4.022656
| 0.209608
| 12.178635
| false
| false
|
2025-02-05
|
2025-02-05
| 0
|
sumink/flflmillama
|
|
sumink_ftgpt_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
GPT2LMHeadModel
|
<a target="_blank" href="https://huggingface.co/sumink/ftgpt" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sumink/ftgpt</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sumink__ftgpt-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sumink/ftgpt
|
fea7c59fff2443a73a7fd11a78b1d80eb5f0c4e6
| 3.951784
|
mit
| 0
| 0.124
| true
| false
| false
| false
| 0.105635
| 0.07871
| 7.871004
| 0.291909
| 1.931277
| 0
| 0
| 0.264262
| 1.901566
| 0.413844
| 10.097135
| 0.117188
| 1.909722
| false
| false
|
2024-11-06
|
2024-11-20
| 0
|
sumink/ftgpt
|
sumink_llamaft_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/sumink/llamaft" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sumink/llamaft</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sumink__llamaft-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sumink/llamaft
|
99b36b73b173c63decde2e4f8ef49f78d04ea568
| 8.1562
| 0
| 3.213
| true
| false
| false
| false
| 1.191148
| 0.160869
| 16.086872
| 0.376278
| 12.950582
| 0.016616
| 1.661631
| 0.270973
| 2.796421
| 0.349813
| 3.059896
| 0.211436
| 12.381797
| false
| false
|
2025-02-05
|
2025-02-05
| 0
|
sumink/llamaft
|
|
sumink_llamamerge_float16
|
float16
|
🤝 base merges and moerges
|
🤝
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/sumink/llamamerge" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sumink/llamamerge</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sumink__llamamerge-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sumink/llamamerge
|
ab032bb4dd5e7fe4950e00517afb641f3a0c26a6
| 14.736807
|
llama3
| 0
| 13.016
| true
| false
| false
| false
| 1.816641
| 0.267181
| 26.718108
| 0.463162
| 24.376394
| 0.015106
| 1.510574
| 0.298658
| 6.487696
| 0.423979
| 11.664063
| 0.258976
| 17.664007
| false
| false
|
2025-01-14
|
2025-01-14
| 0
|
sumink/llamamerge
|
sumink_llftfl7_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/sumink/llftfl7" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sumink/llftfl7</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sumink__llftfl7-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sumink/llftfl7
|
fba0f95abad2633bd64b4d4cedfd1910716b2025
| 7.811247
| 0
| 3.213
| true
| false
| false
| false
| 1.217381
| 0.171435
| 17.143513
| 0.378643
| 13.272908
| 0.010574
| 1.057402
| 0.28104
| 4.138702
| 0.363208
| 3.001042
| 0.174285
| 8.253915
| false
| false
|
2025-02-05
|
2025-02-05
| 0
|
sumink/llftfl7
|
|
sumink_llmer_float16
|
float16
|
🤝 base merges and moerges
|
🤝
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/sumink/llmer" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sumink/llmer</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sumink__llmer-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sumink/llmer
|
c73859c891e2db1e79e0e32d43fd685418f0c2fc
| 17.98371
| 0
| 8.03
| false
| false
| false
| false
| 1.418529
| 0.319113
| 31.911329
| 0.488459
| 26.830897
| 0.064955
| 6.495468
| 0.297819
| 6.375839
| 0.403917
| 8.189583
| 0.352892
| 28.099143
| false
| false
|
2025-01-15
|
2025-01-15
| 1
|
sumink/llmer (Merge)
|
|
sumink_qwft_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/sumink/qwft" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sumink/qwft</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sumink__qwft-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sumink/qwft
|
75279ae06c78ca9f3957ace3780f11dd95435b2b
| 3.106141
|
other
| 0
| 7.616
| true
| false
| false
| false
| 1.354008
| 0.119653
| 11.965252
| 0.300218
| 2.872788
| 0
| 0
| 0.252517
| 0.33557
| 0.358063
| 2.024479
| 0.112949
| 1.438756
| false
| false
|
2025-01-14
|
2025-01-14
| 0
|
sumink/qwft
|
sumink_qwmer_float16
|
float16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/sumink/qwmer" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sumink/qwmer</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sumink__qwmer-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sumink/qwmer
|
18b1f1a5a4be93ed37fc75c3d832e23324ea4993
| 11.672277
| 0
| 7.616
| false
| false
| false
| false
| 1.353406
| 0.221244
| 22.124408
| 0.42988
| 20.382372
| 0.000755
| 0.075529
| 0.286913
| 4.9217
| 0.403177
| 9.030469
| 0.221493
| 13.499187
| false
| false
|
2025-01-15
|
2025-01-15
| 1
|
sumink/qwmer (Merge)
|
|
sumink_solarmer3_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/sumink/solarmer3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sumink/solarmer3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sumink__solarmer3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sumink/solarmer3
|
cb294a464c44adf0e4c23ecd50ff1a65be3040e0
| 20.502114
| 0
| 10.732
| false
| false
| false
| false
| 1.419555
| 0.374143
| 37.414283
| 0.526599
| 33.442494
| 0.058157
| 5.81571
| 0.291107
| 5.480984
| 0.440135
| 15.05026
| 0.332281
| 25.808954
| false
| false
|
2025-01-23
|
2025-01-23
| 1
|
sumink/solarmer3 (Merge)
|
|
sumink_somer_float16
|
float16
|
🤝 base merges and moerges
|
🤝
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/sumink/somer" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sumink/somer</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sumink__somer-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sumink/somer
|
b9b7857618e91e4f16ceeb02546f129f7cead152
| 19.647029
| 0
| 10.732
| false
| false
| false
| false
| 1.439342
| 0.29903
| 29.902991
| 0.51937
| 31.718258
| 0.041541
| 4.154079
| 0.298658
| 6.487696
| 0.465
| 18.425
| 0.344747
| 27.194149
| false
| false
|
2025-01-15
|
2025-01-15
| 1
|
sumink/somer (Merge)
|
|
sumink_somer2_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/sumink/somer2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sumink/somer2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sumink__somer2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sumink/somer2
|
4c274ca4914b34c26772de9dc2af0c7192529c0b
| 20.037357
| 0
| 10.732
| false
| false
| false
| false
| 1.517533
| 0.313243
| 31.324331
| 0.516679
| 31.375841
| 0.046828
| 4.682779
| 0.303691
| 7.158837
| 0.466302
| 18.654427
| 0.343251
| 27.027926
| false
| false
|
2025-01-15
|
2025-01-15
| 1
|
sumink/somer2 (Merge)
|
|
sumink_somerft_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/sumink/somerft" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sumink/somerft</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sumink__somerft-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sumink/somerft
|
41944985c2aa6f7f704c5680859f6f154d931734
| 4.941854
| 0
| 1.543
| true
| false
| false
| false
| 1.293193
| 0.143058
| 14.30582
| 0.309346
| 3.616795
| 0.01435
| 1.435045
| 0.248322
| 0
| 0.404479
| 8.993229
| 0.111702
| 1.300236
| false
| false
|
2025-01-17
|
2025-01-17
| 0
|
sumink/somerft
|
|
sunbaby_BrainCog-8B-0.1-Instruct_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/sunbaby/BrainCog-8B-0.1-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sunbaby/BrainCog-8B-0.1-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sunbaby__BrainCog-8B-0.1-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sunbaby/BrainCog-8B-0.1-Instruct
|
6c03cb7af723c7f7785df9eee5d5838247619bee
| 18.380633
|
apache-2.0
| 0
| 8.03
| true
| false
| false
| true
| 1.669109
| 0.4253
| 42.530043
| 0.461822
| 24.283468
| 0.096677
| 9.667674
| 0.301174
| 6.823266
| 0.365594
| 6.332552
| 0.285821
| 20.646794
| false
| false
|
2024-07-31
|
2024-08-27
| 1
|
meta-llama/Meta-Llama-3-8B
|
swap-uniba_LLaMAntino-3-ANITA-8B-Inst-DPO-ITA_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/swap-uniba/LLaMAntino-3-ANITA-8B-Inst-DPO-ITA" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">swap-uniba/LLaMAntino-3-ANITA-8B-Inst-DPO-ITA</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/swap-uniba__LLaMAntino-3-ANITA-8B-Inst-DPO-ITA-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
swap-uniba/LLaMAntino-3-ANITA-8B-Inst-DPO-ITA
|
2b6e46e4c9d341dc8bf8350a167492c880116b66
| 21.827553
|
llama3
| 24
| 8.03
| true
| false
| false
| false
| 1.633278
| 0.481505
| 48.150463
| 0.49357
| 27.990828
| 0.048338
| 4.833837
| 0.298658
| 6.487696
| 0.43874
| 13.242448
| 0.37234
| 30.260047
| false
| false
|
2024-04-29
|
2024-10-25
| 1
|
meta-llama/Meta-Llama-3-8B-Instruct
|
synergetic_FrankenQwen2.5-14B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/synergetic/FrankenQwen2.5-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">synergetic/FrankenQwen2.5-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/synergetic__FrankenQwen2.5-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
synergetic/FrankenQwen2.5-14B
|
24e41619569b50aa44698e0afabbbee30af998bd
| 18.126546
| 0
| 16.972
| false
| false
| false
| true
| 4.517963
| 0.186947
| 18.69473
| 0.604775
| 44.273555
| 0
| 0
| 0.270134
| 2.684564
| 0.38426
| 5.532552
| 0.438165
| 37.573877
| false
| false
|
2024-11-30
|
2024-11-30
| 1
|
synergetic/FrankenQwen2.5-14B (Merge)
|
|
talha2001_Beast-Soul-new_float16
|
float16
|
🤝 base merges and moerges
|
🤝
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/talha2001/Beast-Soul-new" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">talha2001/Beast-Soul-new</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/talha2001__Beast-Soul-new-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
talha2001/Beast-Soul-new
|
e6cf8caa60264a3005df2ff4b9d967f684519d4b
| 21.792278
| 0
| 7.242
| false
| false
| false
| false
| 1.285766
| 0.485351
| 48.535109
| 0.522714
| 33.072759
| 0.074018
| 7.401813
| 0.281879
| 4.250559
| 0.445927
| 14.140885
| 0.310173
| 23.352541
| false
| false
|
2024-08-07
|
2024-08-07
| 1
|
talha2001/Beast-Soul-new (Merge)
|
|
tangledgroup_tangled-llama-pints-1.5b-v0.1-instruct_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/tangledgroup/tangled-llama-pints-1.5b-v0.1-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tangledgroup/tangled-llama-pints-1.5b-v0.1-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tangledgroup__tangled-llama-pints-1.5b-v0.1-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
tangledgroup/tangled-llama-pints-1.5b-v0.1-instruct
|
3e1429f20007740877c51e44ed63b870a57a2e17
| 4.366498
|
apache-2.0
| 0
| 1.5
| true
| false
| false
| true
| 0.590868
| 0.150902
| 15.090183
| 0.314344
| 3.842195
| 0.012085
| 1.208459
| 0.239933
| 0
| 0.376135
| 4.85026
| 0.110871
| 1.20789
| false
| false
|
2024-08-27
|
2024-08-29
| 1
|
pints-ai/1.5-Pints-16K-v0.1
|
tangledgroup_tangled-llama-pints-1.5b-v0.2-instruct_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/tangledgroup/tangled-llama-pints-1.5b-v0.2-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tangledgroup/tangled-llama-pints-1.5b-v0.2-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tangledgroup__tangled-llama-pints-1.5b-v0.2-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
tangledgroup/tangled-llama-pints-1.5b-v0.2-instruct
|
5c229e26f3ab3d0f0f613ed242f3f0f57c930155
| 4.745857
|
apache-2.0
| 0
| 1.5
| true
| false
| false
| true
| 0.595623
| 0.172409
| 17.240921
| 0.315835
| 4.080205
| 0.01284
| 1.283988
| 0.241611
| 0
| 0.364292
| 4.569792
| 0.111702
| 1.300236
| false
| false
|
2024-09-14
|
2024-09-15
| 1
|
pints-ai/1.5-Pints-16K-v0.1
|
tanliboy_lambda-gemma-2-9b-dpo_float16
|
float16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/tanliboy/lambda-gemma-2-9b-dpo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tanliboy/lambda-gemma-2-9b-dpo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tanliboy__lambda-gemma-2-9b-dpo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
tanliboy/lambda-gemma-2-9b-dpo
|
b141471308bc41ffe15180a6668c735396c3949b
| 22.910404
|
gemma
| 1
| 9.242
| true
| false
| false
| true
| 4.483174
| 0.45008
| 45.008023
| 0.547172
| 35.554545
| 0.094411
| 9.441088
| 0.313758
| 8.501119
| 0.401656
| 7.940365
| 0.379156
| 31.017287
| false
| false
|
2024-07-24
|
2024-09-18
| 2
|
google/gemma-2-9b
|
tanliboy_lambda-gemma-2-9b-dpo_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/tanliboy/lambda-gemma-2-9b-dpo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tanliboy/lambda-gemma-2-9b-dpo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tanliboy__lambda-gemma-2-9b-dpo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
tanliboy/lambda-gemma-2-9b-dpo
|
b141471308bc41ffe15180a6668c735396c3949b
| 16.970109
|
gemma
| 1
| 9.242
| true
| false
| false
| true
| 2.903576
| 0.182925
| 18.292464
| 0.548791
| 35.739663
| 0
| 0
| 0.310403
| 8.053691
| 0.405625
| 8.569792
| 0.380485
| 31.165041
| false
| false
|
2024-07-24
|
2024-09-18
| 2
|
google/gemma-2-9b
|
tanliboy_lambda-qwen2.5-14b-dpo-test_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/tanliboy/lambda-qwen2.5-14b-dpo-test" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tanliboy/lambda-qwen2.5-14b-dpo-test</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tanliboy__lambda-qwen2.5-14b-dpo-test-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
tanliboy/lambda-qwen2.5-14b-dpo-test
|
96607eea3c67f14f73e576580610dba7530c5dd9
| 42.617401
|
apache-2.0
| 9
| 14.77
| true
| false
| false
| true
| 3.601487
| 0.823122
| 82.312154
| 0.639351
| 48.45444
| 0.546073
| 54.607251
| 0.362416
| 14.988814
| 0.426031
| 12.58724
| 0.484791
| 42.754507
| false
| false
|
2024-09-20
|
2024-09-20
| 2
|
Qwen/Qwen2.5-14B
|
tanliboy_lambda-qwen2.5-32b-dpo-test_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/tanliboy/lambda-qwen2.5-32b-dpo-test" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tanliboy/lambda-qwen2.5-32b-dpo-test</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tanliboy__lambda-qwen2.5-32b-dpo-test-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
tanliboy/lambda-qwen2.5-32b-dpo-test
|
675b60d6e859455a6139e6e284bbe1844b8ddf46
| 45.924593
|
apache-2.0
| 5
| 32.764
| true
| false
| false
| true
| 10.998607
| 0.808384
| 80.838398
| 0.67639
| 54.407961
| 0.610272
| 61.02719
| 0.356544
| 14.205817
| 0.427427
| 13.328385
| 0.565658
| 51.739805
| false
| false
|
2024-09-22
|
2024-09-30
| 2
|
Qwen/Qwen2.5-32B
|
tannedbum_Ellaria-9B_float16
|
float16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/tannedbum/Ellaria-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tannedbum/Ellaria-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tannedbum__Ellaria-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
tannedbum/Ellaria-9B
|
087b263326da56de637912814bc7073b83b8d59a
| 33.049972
| 17
| 10.159
| false
| false
| false
| true
| 3.714227
| 0.78258
| 78.258022
| 0.59421
| 41.721561
| 0.207704
| 20.770393
| 0.333054
| 11.073826
| 0.415146
| 10.859896
| 0.420545
| 35.616135
| false
| false
|
2024-08-04
|
2025-01-07
| 1
|
tannedbum/Ellaria-9B (Merge)
|
|
tannedbum_L3-Nymeria-Maid-8B_float16
|
float16
|
🤝 base merges and moerges
|
🤝
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/tannedbum/L3-Nymeria-Maid-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tannedbum/L3-Nymeria-Maid-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tannedbum__L3-Nymeria-Maid-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
tannedbum/L3-Nymeria-Maid-8B
|
17cf2c77399d63638254353ac86adf5692b79c62
| 26.043176
|
cc-by-nc-4.0
| 12
| 8.03
| true
| false
| false
| true
| 0.911597
| 0.725003
| 72.500299
| 0.514606
| 31.240945
| 0.093656
| 9.365559
| 0.296141
| 6.152125
| 0.375052
| 6.48151
| 0.374668
| 30.518617
| true
| false
|
2024-06-21
|
2025-01-07
| 1
|
tannedbum/L3-Nymeria-Maid-8B (Merge)
|
tannedbum_L3-Nymeria-v2-8B_float16
|
float16
|
🤝 base merges and moerges
|
🤝
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/tannedbum/L3-Nymeria-v2-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tannedbum/L3-Nymeria-v2-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tannedbum__L3-Nymeria-v2-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
tannedbum/L3-Nymeria-v2-8B
|
6f0f2526cc89c9d749b850c3e1c3484db92e5c3b
| 25.709418
|
cc-by-nc-4.0
| 15
| 8.03
| true
| false
| false
| true
| 0.984977
| 0.716835
| 71.683467
| 0.52242
| 32.262544
| 0.092145
| 9.214502
| 0.290268
| 5.369128
| 0.369875
| 5.134375
| 0.375332
| 30.592494
| true
| false
|
2024-06-29
|
2025-01-07
| 1
|
tannedbum/L3-Nymeria-v2-8B (Merge)
|
tannedbum_L3-Rhaenys-8B_float16
|
float16
|
🤝 base merges and moerges
|
🤝
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/tannedbum/L3-Rhaenys-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tannedbum/L3-Rhaenys-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tannedbum__L3-Rhaenys-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
tannedbum/L3-Rhaenys-8B
|
a159e2aabf9d6ef31444dc46c3dce9fdadca77d9
| 26.454823
|
cc-by-nc-4.0
| 5
| 8.03
| true
| false
| false
| true
| 1.095766
| 0.736269
| 73.626866
| 0.529921
| 33.137944
| 0.087613
| 8.761329
| 0.297819
| 6.375839
| 0.372479
| 5.726563
| 0.379904
| 31.100399
| true
| false
|
2024-07-31
|
2025-01-07
| 1
|
tannedbum/L3-Rhaenys-8B (Merge)
|
teknium_CollectiveCognition-v1.1-Mistral-7B_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/teknium/CollectiveCognition-v1.1-Mistral-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">teknium/CollectiveCognition-v1.1-Mistral-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/teknium__CollectiveCognition-v1.1-Mistral-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
| eval_name | Precision | Type | T | Weight type | Architecture | Model | fullname | Model sha | Average ⬆️ | Hub License | Hub ❤️ | #Params (B) | Available on the hub | MoE | Flagged | Chat Template | CO₂ cost (kg) | IFEval Raw | IFEval | BBH Raw | BBH | MATH Lvl 5 Raw | MATH Lvl 5 | GPQA Raw | GPQA | MUSR Raw | MUSR | MMLU-PRO Raw | MMLU-PRO | Merged | Official Providers | Upload To Hub Date | Submission Date | Generation | Base Model |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| | | | | | | | teknium/CollectiveCognition-v1.1-Mistral-7B | 5f57f70ec99450c70da2540e94dd7fd67be4b23c | 14.256397 | apache-2.0 | 79 | 7 | true | false | false | false | 0.858636 | 0.279046 | 27.904626 | 0.449343 | 23.476134 | 0.030967 | 3.096677 | 0.286913 | 4.9217 | 0.386927 | 5.732552 | 0.28366 | 20.406693 | false | true | 2023-10-04 | 2024-06-12 | 1 | mistralai/Mistral-7B-v0.1 |
| teknium_OpenHermes-13B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/teknium/OpenHermes-13B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">teknium/OpenHermes-13B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/teknium__OpenHermes-13B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | teknium/OpenHermes-13B | bcad6fff9f8591e091d2d57356a3f102197e8c5f | 12.182264 | mit | 55 | 13 | true | false | false | false | 62.238233 | 0.266807 | 26.680652 | 0.420644 | 18.213328 | 0.012085 | 1.208459 | 0.272651 | 3.020134 | 0.40426 | 8.532552 | 0.238946 | 15.43846 | false | true | 2023-09-06 | 2024-06-12 | 1 | NousResearch/Llama-2-13b-hf |
| teknium_OpenHermes-2-Mistral-7B_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/teknium/OpenHermes-2-Mistral-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">teknium/OpenHermes-2-Mistral-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/teknium__OpenHermes-2-Mistral-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | teknium/OpenHermes-2-Mistral-7B | 4c6e34123b140ce773a8433cae5410949289102c | 21.440476 | apache-2.0 | 255 | 7 | true | false | false | true | 0.95006 | 0.528615 | 52.861519 | 0.494752 | 29.251839 | 0.045317 | 4.531722 | 0.283557 | 4.474273 | 0.451979 | 16.064062 | 0.293135 | 21.459441 | false | true | 2023-10-12 | 2024-06-12 | 1 | mistralai/Mistral-7B-v0.1 |
| teknium_OpenHermes-2.5-Mistral-7B_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/teknium/OpenHermes-2.5-Mistral-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">teknium/OpenHermes-2.5-Mistral-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/teknium__OpenHermes-2.5-Mistral-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | teknium/OpenHermes-2.5-Mistral-7B | 24c0bea14d53e6f67f1fbe2eca5bfe7cae389b33 | 21.317189 | apache-2.0 | 830 | 7.242 | true | false | false | true | 0.945567 | 0.557142 | 55.714172 | 0.487001 | 27.770026 | 0.050604 | 5.060423 | 0.283557 | 4.474273 | 0.424198 | 12.058073 | 0.305436 | 22.826167 | false | true | 2023-10-29 | 2024-06-12 | 1 | mistralai/Mistral-7B-v0.1 |
| teknium_OpenHermes-7B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/teknium/OpenHermes-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">teknium/OpenHermes-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/teknium__OpenHermes-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | teknium/OpenHermes-7B | 9f55d6eb15f1edd52ee1fd863a220aa682e78a00 | 9.569249 | mit | 13 | 7 | true | false | false | false | 4.96618 | 0.181251 | 18.12513 | 0.362034 | 12.081395 | 0.015861 | 1.586103 | 0.269295 | 2.572707 | 0.432385 | 12.68151 | 0.193318 | 10.368647 | false | true | 2023-09-14 | 2024-06-12 | 1 | NousResearch/Llama-2-7b-hf |
| tensopolis_falcon3-10b-tensopolis-v1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/tensopolis/falcon3-10b-tensopolis-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tensopolis/falcon3-10b-tensopolis-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tensopolis__falcon3-10b-tensopolis-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | tensopolis/falcon3-10b-tensopolis-v1 | 39f61a967cedf5db5de6229284eeffde9b4ede83 | 35.588967 | other | 0 | 10.306 | true | false | false | true | 10.090299 | 0.781656 | 78.165601 | 0.618227 | 45.05928 | 0.274924 | 27.492447 | 0.329698 | 10.626398 | 0.437531 | 14.191406 | 0.441988 | 37.99867 | false | false | 2025-01-29 | 2025-03-08 | 1 | tensopolis/falcon3-10b-tensopolis-v1 (Merge) |
| tensopolis_falcon3-10b-tensopolis-v2_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/tensopolis/falcon3-10b-tensopolis-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tensopolis/falcon3-10b-tensopolis-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tensopolis__falcon3-10b-tensopolis-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | tensopolis/falcon3-10b-tensopolis-v2 | acff1f3390161a0ffe72ba89b59ee47a04622419 | 35.19044 | other | 0 | 10.306 | true | false | false | true | 0.857556 | 0.779208 | 77.920806 | 0.618227 | 45.046927 | 0.266616 | 26.661631 | 0.327181 | 10.290828 | 0.429688 | 13.177604 | 0.442404 | 38.044843 | false | false | 2025-03-08 | 2025-03-08 | 1 | tensopolis/falcon3-10b-tensopolis-v2 (Merge) |
| tensopolis_lamarckvergence-14b-tensopolis-v1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/tensopolis/lamarckvergence-14b-tensopolis-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tensopolis/lamarckvergence-14b-tensopolis-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tensopolis__lamarckvergence-14b-tensopolis-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | tensopolis/lamarckvergence-14b-tensopolis-v1 | 4125c5592b0131d408a994b0c6a13ce857bc8951 | 42.917324 | apache-2.0 | 1 | 14.766 | true | false | false | true | 3.110068 | 0.760374 | 76.037359 | 0.656115 | 50.983495 | 0.516616 | 51.661631 | 0.360738 | 14.765101 | 0.447458 | 16.832292 | 0.525017 | 47.224069 | false | false | 2025-03-09 | 2025-03-10 | 1 | tensopolis/lamarckvergence-14b-tensopolis-v1 (Merge) |
| tensopolis_mistral-small-2501-tensopolis-v1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/tensopolis/mistral-small-2501-tensopolis-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tensopolis/mistral-small-2501-tensopolis-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tensopolis__mistral-small-2501-tensopolis-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | tensopolis/mistral-small-2501-tensopolis-v1 | c5e5dec7262c0174932627029df32225ddfa77f4 | 39.24515 | apache-2.0 | 0 | 23.572 | true | false | false | true | 1.381145 | 0.77621 | 77.621045 | 0.647474 | 48.693238 | 0.444109 | 44.410876 | 0.357383 | 14.317673 | 0.427979 | 11.930729 | 0.446476 | 38.49734 | false | false | 2025-02-08 | 2025-02-08 | 1 | tensopolis/mistral-small-2501-tensopolis-v1 (Merge) |
| tensopolis_mistral-small-r1-tensopolis_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/tensopolis/mistral-small-r1-tensopolis" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tensopolis/mistral-small-r1-tensopolis</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tensopolis__mistral-small-r1-tensopolis-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | tensopolis/mistral-small-r1-tensopolis | d7256a2ea0f80a2294177ef53d8ad596b8cf3d68 | 25.876978 | apache-2.0 | 1 | 23.572 | true | false | false | false | 1.4434 | 0.46222 | 46.222024 | 0.543597 | 34.602283 | 0.290785 | 29.07855 | 0.281879 | 4.250559 | 0.37375 | 7.385417 | 0.403507 | 33.723035 | false | false | 2025-02-08 | 2025-02-24 | 2 | mistralai/Mistral-Small-24B-Instruct-2501 (Merge) |
| tensopolis_phi-4-tensopolis-v1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/tensopolis/phi-4-tensopolis-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tensopolis/phi-4-tensopolis-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tensopolis__phi-4-tensopolis-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | tensopolis/phi-4-tensopolis-v1 | 5890f8b5f7e4a040b8dd5fbcfcc45f4311ac2873 | 40.455333 | mit | 0 | 14.66 | true | false | false | true | 2.706216 | 0.676668 | 67.666791 | 0.687183 | 55.036575 | 0.493958 | 49.39577 | 0.334732 | 11.297539 | 0.414063 | 10.624479 | 0.538398 | 48.710845 | false | false | 2025-02-07 | 2025-03-09 | 1 | tensopolis/phi-4-tensopolis-v1 (Merge) |
| tensopolis_qwen2.5-14b-tensopolis-v1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/tensopolis/qwen2.5-14b-tensopolis-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tensopolis/qwen2.5-14b-tensopolis-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tensopolis__qwen2.5-14b-tensopolis-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | tensopolis/qwen2.5-14b-tensopolis-v1 | 34f20381d4b43bbbd288f9a1f81a26bcbdcbd3c8 | 41.14159 | apache-2.0 | 1 | 14.77 | true | false | false | true | 11.845475 | 0.799017 | 79.901661 | 0.63636 | 47.96505 | 0.529456 | 52.945619 | 0.334732 | 11.297539 | 0.419333 | 11.283333 | 0.491107 | 43.456339 | false | false | 2025-02-01 | 2025-03-09 | 1 | tensopolis/qwen2.5-14b-tensopolis-v1 (Merge) |
| tensopolis_qwen2.5-3b-or1-tensopolis_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/tensopolis/qwen2.5-3b-or1-tensopolis" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tensopolis/qwen2.5-3b-or1-tensopolis</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tensopolis__qwen2.5-3b-or1-tensopolis-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | tensopolis/qwen2.5-3b-or1-tensopolis | 3f7cee86465d2518298e835f93b1b904a5a799d7 | 18.280251 | | 2 | 3.086 | false | false | false | false | 1.519746 | 0.35401 | 35.400958 | 0.44215 | 22.108988 | 0.172961 | 17.296073 | 0.294463 | 5.928412 | 0.374927 | 4.532552 | 0.319731 | 24.414524 | false | false | 2025-02-15 | 2025-02-24 | 3 | Qwen/Qwen2.5-3B |
| tensopolis_qwen2.5-7b-tensopolis-v1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/tensopolis/qwen2.5-7b-tensopolis-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tensopolis/qwen2.5-7b-tensopolis-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tensopolis__qwen2.5-7b-tensopolis-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | tensopolis/qwen2.5-7b-tensopolis-v1 | 2cbc0765ff33ec948d54f305640f4db262221493 | 35.491677 | apache-2.0 | 0 | 7.616 | true | false | false | true | 8.295151 | 0.766094 | 76.609396 | 0.537874 | 34.783528 | 0.456193 | 45.619335 | 0.296141 | 6.152125 | 0.433875 | 13.467708 | 0.426862 | 36.317967 | false | false | 2025-02-24 | 2025-03-07 | 1 | tensopolis/qwen2.5-7b-tensopolis-v1 (Merge) |
| tensopolis_qwen2.5-7b-tensopolis-v2_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/tensopolis/qwen2.5-7b-tensopolis-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tensopolis/qwen2.5-7b-tensopolis-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tensopolis__qwen2.5-7b-tensopolis-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | tensopolis/qwen2.5-7b-tensopolis-v2 | d81d4112a0bebd9d1262e94136a330beac86299b | 35.37815 | apache-2.0 | 0 | 7.616 | true | false | false | true | 4.501445 | 0.752106 | 75.210552 | 0.541462 | 35.22412 | 0.481873 | 48.187311 | 0.290268 | 5.369128 | 0.424635 | 12.246094 | 0.424285 | 36.031693 | false | false | 2025-03-07 | 2025-03-08 | 1 | tensopolis/qwen2.5-7b-tensopolis-v2 (Merge) |
| tensopolis_virtuoso-lite-tensopolis-v1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/tensopolis/virtuoso-lite-tensopolis-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tensopolis/virtuoso-lite-tensopolis-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tensopolis__virtuoso-lite-tensopolis-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | tensopolis/virtuoso-lite-tensopolis-v1 | 84293adf3f10f19f1c6e735633f0cc45b457ac62 | 36.389475 | other | 0 | 10.306 | true | false | false | true | 6.852793 | 0.80691 | 80.691011 | 0.610185 | 43.941335 | 0.254532 | 25.453172 | 0.344799 | 12.639821 | 0.45824 | 17.446615 | 0.443484 | 38.164894 | false | false | 2025-02-05 | 2025-03-08 | 1 | tensopolis/virtuoso-lite-tensopolis-v1 (Merge) |
| tensopolis_virtuoso-lite-tensopolis-v2_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/tensopolis/virtuoso-lite-tensopolis-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tensopolis/virtuoso-lite-tensopolis-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tensopolis__virtuoso-lite-tensopolis-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | tensopolis/virtuoso-lite-tensopolis-v2 | 427b560bfaa1d5c1993bcc743d9e4b8695b1a053 | 36.256179 | other | 0 | 10.306 | true | false | false | true | 6.034671 | 0.802938 | 80.293843 | 0.610019 | 43.93078 | 0.25 | 25 | 0.343121 | 12.416107 | 0.459542 | 17.676042 | 0.443983 | 38.220301 | false | false | 2025-02-06 | 2025-03-09 | 1 | tensopolis/virtuoso-lite-tensopolis-v2 (Merge) |
| tensopolis_virtuoso-small-tensopolis-v1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/tensopolis/virtuoso-small-tensopolis-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tensopolis/virtuoso-small-tensopolis-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tensopolis__virtuoso-small-tensopolis-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | tensopolis/virtuoso-small-tensopolis-v1 | aae1313e07e91002106d55f893978669f9b19389 | 38.41359 | apache-2.0 | 0 | 14.77 | true | false | false | true | 4.667431 | 0.785628 | 78.562769 | 0.64154 | 48.171226 | 0.352719 | 35.271903 | 0.32802 | 10.402685 | 0.432635 | 13.979427 | 0.496842 | 44.093528 | false | false | 2025-01-28 | 2025-01-28 | 1 | tensopolis/virtuoso-small-tensopolis-v1 (Merge) |
| tensopolis_virtuoso-small-tensopolis-v2_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/tensopolis/virtuoso-small-tensopolis-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tensopolis/virtuoso-small-tensopolis-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tensopolis__virtuoso-small-tensopolis-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | tensopolis/virtuoso-small-tensopolis-v2 | 425502ec46cc5752412693ee7eea7f049088f904 | 40.11384 | apache-2.0 | 2 | 14.77 | true | false | false | true | 4.714171 | 0.802014 | 80.201421 | 0.651584 | 50.22972 | 0.387462 | 38.746224 | 0.328859 | 10.514541 | 0.43524 | 14.838281 | 0.515376 | 46.152852 | false | false | 2025-01-29 | 2025-02-17 | 1 | tensopolis/virtuoso-small-tensopolis-v2 (Merge) |
| tensopolis_virtuoso-small-v2-tensopolis-v1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/tensopolis/virtuoso-small-v2-tensopolis-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tensopolis/virtuoso-small-v2-tensopolis-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tensopolis__virtuoso-small-v2-tensopolis-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | tensopolis/virtuoso-small-v2-tensopolis-v1 | 71706588945ec9ad97f9d39e529e00d96d0119b3 | 42.697896 | apache-2.0 | 1 | 14.766 | true | false | false | true | 28.932247 | 0.841906 | 84.190614 | 0.654475 | 50.96603 | 0.452417 | 45.241692 | 0.346477 | 12.863535 | 0.450927 | 16.532552 | 0.517537 | 46.392952 | false | false | 2025-02-01 | 2025-03-09 | 1 | tensopolis/virtuoso-small-v2-tensopolis-v1 (Merge) |
| tensoropera_Fox-1-1.6B_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/tensoropera/Fox-1-1.6B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tensoropera/Fox-1-1.6B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tensoropera__Fox-1-1.6B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | tensoropera/Fox-1-1.6B | 6389dde4d7e52aa1200ad954c565f03c7fdcf8db | 7.764366 | apache-2.0 | 31 | 1.665 | true | false | false | false | 2.685641 | 0.276598 | 27.659831 | 0.330737 | 7.399761 | 0.017372 | 1.73716 | 0.263423 | 1.789709 | 0.35499 | 3.873698 | 0.137134 | 4.126034 | false | false | 2024-06-13 | 2024-06-29 | 0 | tensoropera/Fox-1-1.6B |
| tenyx_Llama3-TenyxChat-70B_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/tenyx/Llama3-TenyxChat-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tenyx/Llama3-TenyxChat-70B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tenyx__Llama3-TenyxChat-70B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | tenyx/Llama3-TenyxChat-70B | a85d31e3af8fcc847cc9169f1144cf02f5351fab | 36.696015 | llama3 | 64 | 70.554 | true | false | false | true | 18.734013 | 0.808709 | 80.870867 | 0.651149 | 49.61562 | 0.23565 | 23.564955 | 0.301174 | 6.823266 | 0.426031 | 12.520573 | 0.521027 | 46.780807 | false | false | 2024-04-26 | 2024-08-04 | 0 | tenyx/Llama3-TenyxChat-70B |
| theo77186_Qwen2.5-Coder-7B-Instruct-20241106_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/theo77186/Qwen2.5-Coder-7B-Instruct-20241106" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theo77186/Qwen2.5-Coder-7B-Instruct-20241106</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theo77186__Qwen2.5-Coder-7B-Instruct-20241106-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | theo77186/Qwen2.5-Coder-7B-Instruct-20241106 | 3e2c48344212ca7a3c71b020bc785dd9f0919a7f | 28.330799 | apache-2.0 | 4 | 7.616 | true | false | false | true | 1.324065 | 0.610148 | 61.014774 | 0.500798 | 28.938504 | 0.388218 | 38.821752 | 0.291946 | 5.592841 | 0.407271 | 9.475521 | 0.335273 | 26.141401 | false | false | 2024-11-08 | 2025-01-19 | 1 | theo77186/Qwen2.5-Coder-7B-Instruct-20241106 (Merge) |
| theprint_Boptruth-Agatha-7B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/theprint/Boptruth-Agatha-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theprint/Boptruth-Agatha-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theprint__Boptruth-Agatha-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | theprint/Boptruth-Agatha-7B | ef7c7570be29a58f4a8358a6d4c75f59a5282191 | 17.512381 | | 0 | 7.242 | false | false | false | false | 0.77621 | 0.312419 | 31.241883 | 0.498394 | 29.286422 | 0.055136 | 5.513595 | 0.299497 | 6.599553 | 0.427667 | 11.758333 | 0.28607 | 20.674498 | false | false | 2024-09-11 | 2024-09-30 | 0 | theprint/Boptruth-Agatha-7B |
| theprint_CleverBoi-7B-v2_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Adapter | ? | <a target="_blank" href="https://huggingface.co/theprint/CleverBoi-7B-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theprint/CleverBoi-7B-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theprint__CleverBoi-7B-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | theprint/CleverBoi-7B-v2 | 1d82629c1e6778cf8568b532a3c09b668805b15a | 15.095915 | apache-2.0 | 0 | 7.736 | true | false | false | false | 3.044795 | 0.216998 | 21.699757 | 0.453173 | 23.444181 | 0.026435 | 2.643505 | 0.288591 | 5.145414 | 0.469531 | 18.658073 | 0.270861 | 18.98456 | false | false | 2024-09-12 | 2024-09-13 | 2 | mistralai/Mistral-7B-v0.3 |
| theprint_CleverBoi-7B-v3_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Adapter | ? | <a target="_blank" href="https://huggingface.co/theprint/CleverBoi-7B-v3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theprint/CleverBoi-7B-v3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theprint__CleverBoi-7B-v3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | theprint/CleverBoi-7B-v3 | 1d82629c1e6778cf8568b532a3c09b668805b15a | 13.690467 | apache-2.0 | 0 | 7.736 | true | false | false | false | 3.20578 | 0.23823 | 23.823012 | 0.441443 | 21.936747 | 0.04003 | 4.003021 | 0.26594 | 2.12528 | 0.407177 | 9.497135 | 0.286818 | 20.757609 | false | false | 2024-09-14 | 2024-09-22 | 2 | mistralai/Mistral-7B-v0.3 |
| theprint_CleverBoi-Llama-3.1-8B-Instruct_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Adapter | ? | <a target="_blank" href="https://huggingface.co/theprint/CleverBoi-Llama-3.1-8B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theprint/CleverBoi-Llama-3.1-8B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theprint__CleverBoi-Llama-3.1-8B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | theprint/CleverBoi-Llama-3.1-8B-Instruct | 3514c510ea4ba4d650522f467d4d0cef7de4a43c | 13.970395 | apache-2.0 | 1 | 16.061 | true | false | false | false | 3.740445 | 0.168163 | 16.81627 | 0.455962 | 24.048603 | 0.049094 | 4.909366 | 0.300336 | 6.711409 | 0.401438 | 8.279688 | 0.307513 | 23.057033 | false | false | 2024-08-27 | 2024-09-13 | 3 | meta-llama/Meta-Llama-3.1-8B |
| theprint_CleverBoi-Llama-3.1-8B-v2_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Adapter | ? | <a target="_blank" href="https://huggingface.co/theprint/CleverBoi-Llama-3.1-8B-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theprint/CleverBoi-Llama-3.1-8B-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theprint__CleverBoi-Llama-3.1-8B-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | theprint/CleverBoi-Llama-3.1-8B-v2 | a8b0fc584b10e0110e04f9d21c7f10d24391c1d5 | 14.145588 | apache-2.0 | 0 | 9.3 | true | false | false | false | 5.042759 | 0.19614 | 19.613958 | 0.466782 | 24.132845 | 0.05287 | 5.287009 | 0.286074 | 4.809843 | 0.373469 | 6.716927 | 0.318816 | 24.312943 | false | false | 2024-09-15 | 2024-09-22 | 2 | meta-llama/Meta-Llama-3.1-8B |
| theprint_CleverBoi-Nemo-12B-v2_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Adapter | ? | <a target="_blank" href="https://huggingface.co/theprint/CleverBoi-Nemo-12B-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theprint/CleverBoi-Nemo-12B-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theprint__CleverBoi-Nemo-12B-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | theprint/CleverBoi-Nemo-12B-v2 | cd1f9ee1c484f857bb0e5ae6aac37dc434911f10 | 17.858393 | apache-2.0 | 4 | 13.933 | true | false | false | false | 7.011027 | 0.204583 | 20.458273 | 0.524109 | 31.652695 | 0.103474 | 10.347432 | 0.313758 | 8.501119 | 0.418677 | 11.434635 | 0.322806 | 24.756206 | false | false | 2024-09-16 | 2024-09-24 | 1 | unsloth/Mistral-Nemo-Instruct-2407-bnb-4bit |
| theprint_Code-Llama-Bagel-8B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/theprint/Code-Llama-Bagel-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theprint/Code-Llama-Bagel-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theprint__Code-Llama-Bagel-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | theprint/Code-Llama-Bagel-8B | 7fa415f3f758ab7930d7e1df27b2d16207513125 | 14.665251 | llama3 | 1 | 8.03 | true | false | false | false | 1.636093 | 0.252968 | 25.296768 | 0.469742 | 25.338155 | 0.061178 | 6.117825 | 0.276007 | 3.467562 | 0.367979 | 7.530729 | 0.282164 | 20.24047 | true | false | 2024-06-21 | 2024-09-13 | 1 | theprint/Code-Llama-Bagel-8B (Merge) |
| theprint_Conversely-Mistral-7B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Adapter | ? | <a target="_blank" href="https://huggingface.co/theprint/Conversely-Mistral-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theprint/Conversely-Mistral-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theprint__Conversely-Mistral-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | theprint/Conversely-Mistral-7B | d8cadc02ac76bd617a919d50b092e59d2d110aff | 15.032656 | apache-2.0 | 0 | 14.496 | true | false | false | false | 2.073959 | 0.260811 | 26.081131 | 0.467235 | 25.706966 | 0.027946 | 2.794562 | 0.285235 | 4.697987 | 0.418896 | 10.628646 | 0.28258 | 20.286643 | false | false | 2024-12-05 | 2024-12-07 | 2 | mistralai/Mistral-7B-v0.3 |
| theprint_Llama-3.2-3B-VanRossum_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Adapter | ? | <a target="_blank" href="https://huggingface.co/theprint/Llama-3.2-3B-VanRossum" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theprint/Llama-3.2-3B-VanRossum</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theprint__Llama-3.2-3B-VanRossum-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | theprint/Llama-3.2-3B-VanRossum | 7048abecd492a1f5d53981cb175431ec01bbced0 | 17.584809 | apache-2.0 | 0 | 3.696 | true | false | false | false | 3.709177 | 0.478282 | 47.828207 | 0.427874 | 19.366362 | 0.097432 | 9.743202 | 0.267617 | 2.348993 | 0.344167 | 6.554167 | 0.277011 | 19.667923 | false | false | 2024-11-14 | 2024-11-14 | 2 | meta-llama/Llama-3.2-3B-Instruct |
| theprint_ReWiz-7B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Adapter | ? | <a target="_blank" href="https://huggingface.co/theprint/ReWiz-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theprint/ReWiz-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theprint__ReWiz-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | theprint/ReWiz-7B | d9f28e67d52181d1478e7788e3edf252f5bf32a8 | 17.787041 | apache-2.0 | 0 | 7.736 | true | false | false | false | 2.890811 | 0.404793 | 40.479262 | 0.456422 | 23.50443 | 0.040785 | 4.07855 | 0.275168 | 3.355705 | 0.461156 | 16.744531 | 0.267038 | 18.559767 | false | false | 2024-10-08 | 2024-10-08 | 3 | mistralai/Mistral-7B-v0.3 |
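Leaderboard views such as "top models under N billion parameters" amount to a filter on `#Params (B)` followed by a sort on `Average ⬆️`. The sketch below illustrates this over two rows copied from the table above; the dict keys (`params_b`, `average`, etc.) are illustrative names chosen here, not the dataset's official column identifiers.

```python
# Sketch: rank leaderboard rows by Average score under a parameter cap,
# mirroring the "top models under N billion parameters" views.
# The two sample rows are copied verbatim from the table above.
rows = [
    {"fullname": "tensopolis/lamarckvergence-14b-tensopolis-v1",
     "params_b": 14.766, "average": 42.917324},
    {"fullname": "tensopolis/qwen2.5-7b-tensopolis-v1",
     "params_b": 7.616, "average": 35.491677},
]

def top_models(rows, max_params_b):
    """Return fullnames of models under the parameter cap, best Average first."""
    eligible = [r for r in rows if r["params_b"] < max_params_b]
    return [r["fullname"]
            for r in sorted(eligible, key=lambda r: r["average"], reverse=True)]

print(top_models(rows, 34))  # both models qualify; the 14B model ranks first
print(top_models(rows, 8))   # only the 7B model fits under the cap
```

The same filter-then-sort shape applies unchanged when the full dataset is loaded instead of these two sample rows.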