| eval_name | Precision | Type | T | Weight type | Architecture | Model | fullname | Model sha | Average ⬆️ | Hub License | Hub ❤️ | #Params (B) | Available on the hub | MoE | Flagged | Chat Template | CO₂ cost (kg) | IFEval Raw | IFEval | BBH Raw | BBH | MATH Lvl 5 Raw | MATH Lvl 5 | GPQA Raw | GPQA | MUSR Raw | MUSR | MMLU-PRO Raw | MMLU-PRO | Merged | Official Providers | Upload To Hub Date | Submission Date | Generation | Base Model |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| sethuiyer_Llamaverse-3.1-8B-Instruct_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | [sethuiyer/Llamaverse-3.1-8B-Instruct](https://huggingface.co/sethuiyer/Llamaverse-3.1-8B-Instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sethuiyer__Llamaverse-3.1-8B-Instruct-details) | sethuiyer/Llamaverse-3.1-8B-Instruct | 6d81e7054eef74a3aa3f26255d57537a9bb15f19 | 26.1921 | llama3.1 | 5 | 8.03 | true | false | false | true | 1.432213 | 0.618541 | 61.854103 | 0.541416 | 34.782118 | 0.185801 | 18.58006 | 0.291107 | 5.480984 | 0.376167 | 8.420833 | 0.352311 | 28.034501 | true | false | 2025-01-14 | 2025-01-14 | 1 | sethuiyer/Llamaverse-3.1-8B-Instruct (Merge) |
| sethuiyer_Llamazing-3.1-8B-Instruct_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [sethuiyer/Llamazing-3.1-8B-Instruct](https://huggingface.co/sethuiyer/Llamazing-3.1-8B-Instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sethuiyer__Llamazing-3.1-8B-Instruct-details) | sethuiyer/Llamazing-3.1-8B-Instruct | 2c9c702cbe3fce894de728399efcc1c36d6a81ac | 23.606046 | | 0 | 8.03 | false | false | false | true | 2.620575 | 0.57113 | 57.113016 | 0.529107 | 32.850609 | 0.054381 | 5.438066 | 0.312081 | 8.277405 | 0.397594 | 8.999219 | 0.360622 | 28.957964 | false | false | | 2025-01-19 | 0 | Removed |
| sethuiyer_Qwen2.5-7B-Anvita_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [sethuiyer/Qwen2.5-7B-Anvita](https://huggingface.co/sethuiyer/Qwen2.5-7B-Anvita) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sethuiyer__Qwen2.5-7B-Anvita-details) | sethuiyer/Qwen2.5-7B-Anvita | dc6f8ca6507cc282938e70b23b02c1a3db7b7ddc | 29.898362 | apache-2.0 | 1 | 7.616 | true | false | false | true | 2.160247 | 0.648042 | 64.804164 | 0.546586 | 35.482448 | 0.201662 | 20.166163 | 0.327181 | 10.290828 | 0.433656 | 13.473698 | 0.416556 | 35.172872 | false | false | 2024-10-11 | 2024-10-27 | 1 | sethuiyer/Qwen2.5-7B-Anvita (Merge) |
| shadowml_BeagSake-7B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | [shadowml/BeagSake-7B](https://huggingface.co/shadowml/BeagSake-7B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/shadowml__BeagSake-7B-details) | shadowml/BeagSake-7B | b7a3b25a188a4608fd05fc4247ddd504c1f529d1 | 19.000757 | cc-by-nc-4.0 | 2 | 7.242 | true | false | false | true | 4.07867 | 0.521596 | 52.159603 | 0.471103 | 25.192945 | 0.050604 | 5.060423 | 0.28104 | 4.138702 | 0.412354 | 9.844271 | 0.258477 | 17.608599 | true | false | 2024-01-31 | 2024-10-29 | 1 | shadowml/BeagSake-7B (Merge) |
| shadowml_Mixolar-4x7b_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | MixtralForCausalLM | [shadowml/Mixolar-4x7b](https://huggingface.co/shadowml/Mixolar-4x7b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/shadowml__Mixolar-4x7b-details) | shadowml/Mixolar-4x7b | bb793526b063765e9861cad8834160fb0945e66d | 20.252697 | apache-2.0 | 3 | 36.099 | true | true | false | false | 4.709455 | 0.38933 | 38.933031 | 0.521595 | 32.728964 | 0.058157 | 5.81571 | 0.292785 | 5.704698 | 0.42575 | 12.71875 | 0.330535 | 25.615027 | true | false | 2023-12-30 | 2024-08-05 | 0 | shadowml/Mixolar-4x7b |
| shastraai_Shastra-LLAMA2-Math-Commonsense-SFT_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | [shastraai/Shastra-LLAMA2-Math-Commonsense-SFT](https://huggingface.co/shastraai/Shastra-LLAMA2-Math-Commonsense-SFT) [📑](https://huggingface.co/datasets/open-llm-leaderboard/shastraai__Shastra-LLAMA2-Math-Commonsense-SFT-details) | shastraai/Shastra-LLAMA2-Math-Commonsense-SFT | 97a578246d4edecb5fde3dae262a64e4ec9f489a | 10.490759 | | 0 | 6.738 | false | false | false | false | 1.528084 | 0.304151 | 30.415076 | 0.384317 | 13.659523 | 0.017372 | 1.73716 | 0.259228 | 1.230425 | 0.360448 | 4.822656 | 0.199717 | 11.079713 | false | false | | 2024-10-27 | 0 | Removed |
| shivam9980_NEPALI-LLM_bfloat16 | bfloat16 | 🟩 continuously pretrained | 🟩 | Original | Gemma2ForCausalLM | [shivam9980/NEPALI-LLM](https://huggingface.co/shivam9980/NEPALI-LLM) [📑](https://huggingface.co/datasets/open-llm-leaderboard/shivam9980__NEPALI-LLM-details) | shivam9980/NEPALI-LLM | 5fe146065b53bfd6d8e242cffbe9176bc245551d | 6.930553 | apache-2.0 | 0 | 10.273 | true | false | false | false | 14.370849 | 0.041666 | 4.166611 | 0.382846 | 13.125244 | 0.009063 | 0.906344 | 0.261745 | 1.565996 | 0.412198 | 9.991406 | 0.206449 | 11.827719 | false | false | 2024-09-17 | 2024-09-24 | 2 | google/gemma-2-9b |
| shivam9980_mistral-7b-news-cnn-merged_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Adapter | ? | [shivam9980/mistral-7b-news-cnn-merged](https://huggingface.co/shivam9980/mistral-7b-news-cnn-merged) [📑](https://huggingface.co/datasets/open-llm-leaderboard/shivam9980__mistral-7b-news-cnn-merged-details) | shivam9980/mistral-7b-news-cnn-merged | a0d7029cb00c122843aef3d7ad61d514de334ea3 | 17.196276 | apache-2.0 | 1 | 7.723 | true | false | false | true | 3.188185 | 0.463419 | 46.341928 | 0.363548 | 11.146536 | 0.018882 | 1.888218 | 0.308725 | 7.829978 | 0.45226 | 15.665885 | 0.282746 | 20.305112 | false | false | 2024-03-18 | 2024-09-12 | 2 | mistralai/mistral-7b-instruct-v0.2 |
| shivank21_mistral_dpo_self_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Adapter | | [shivank21/mistral_dpo_self](https://huggingface.co/shivank21/mistral_dpo_self) [📑](https://huggingface.co/datasets/open-llm-leaderboard/shivank21__mistral_dpo_self-details) | shivank21/mistral_dpo_self | 2dcff66eac0c01dc50e4c41eea959968232187fe | 9.824436 | | 0 | 7.913 | false | false | false | true | 2.263063 | 0.340346 | 34.034584 | 0.321626 | 5.548412 | 0.021903 | 2.190332 | 0.240772 | 0 | 0.324667 | 3.683333 | 0.22141 | 13.489953 | false | false | 2025-02-03 | 2025-02-05 | 0 | shivank21/mistral_dpo_self |
| shuttleai_shuttle-3_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | [shuttleai/shuttle-3](https://huggingface.co/shuttleai/shuttle-3) [📑](https://huggingface.co/datasets/open-llm-leaderboard/shuttleai__shuttle-3-details) | shuttleai/shuttle-3 | b48807a86c65e121f31f0ebdb2d1272bdd253a9a | 46.704607 | other | 36 | 72.706 | true | false | false | true | 47.041323 | 0.815403 | 81.540313 | 0.742033 | 64.053016 | 0.45997 | 45.996979 | 0.411913 | 21.588367 | 0.437688 | 14.644271 | 0.571642 | 52.404699 | false | false | 2024-10-09 | 2024-12-04 | 1 | Qwen/Qwen2.5-72B |
| shyamieee_Padma-v7.0_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | [shyamieee/Padma-v7.0](https://huggingface.co/shyamieee/Padma-v7.0) [📑](https://huggingface.co/datasets/open-llm-leaderboard/shyamieee__Padma-v7.0-details) | shyamieee/Padma-v7.0 | caf70bd6e2f819cc6a18dda8516f2cbdc101fdde | 19.756218 | apache-2.0 | 0 | 7.242 | true | false | false | false | 1.179798 | 0.38411 | 38.410972 | 0.511879 | 31.657521 | 0.070242 | 7.024169 | 0.286074 | 4.809843 | 0.438552 | 14.085677 | 0.302942 | 22.549128 | true | false | 2024-06-26 | 2024-06-26 | 1 | shyamieee/Padma-v7.0 (Merge) |
| silma-ai_SILMA-9B-Instruct-v1.0_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Gemma2ForCausalLM | [silma-ai/SILMA-9B-Instruct-v1.0](https://huggingface.co/silma-ai/SILMA-9B-Instruct-v1.0) [📑](https://huggingface.co/datasets/open-llm-leaderboard/silma-ai__SILMA-9B-Instruct-v1.0-details) | silma-ai/SILMA-9B-Instruct-v1.0 | 25d7b116ab3fb9f97417a297f8df4a7e34e7de68 | 26.308012 | gemma | 69 | 9.242 | true | false | false | true | 2.491998 | 0.584194 | 58.419438 | 0.521902 | 30.713003 | 0.116314 | 11.63142 | 0.305369 | 7.38255 | 0.463698 | 17.26224 | 0.391955 | 32.439421 | false | false | 2024-08-17 | 2024-11-12 | 0 | silma-ai/SILMA-9B-Instruct-v1.0 |
| silma-ai_SILMA-Kashif-2B-Instruct-v1.0_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Gemma2ForCausalLM | [silma-ai/SILMA-Kashif-2B-Instruct-v1.0](https://huggingface.co/silma-ai/SILMA-Kashif-2B-Instruct-v1.0) [📑](https://huggingface.co/datasets/open-llm-leaderboard/silma-ai__SILMA-Kashif-2B-Instruct-v1.0-details) | silma-ai/SILMA-Kashif-2B-Instruct-v1.0 | c13e67581b7d38f79b9bfae90c273f15875d3aef | 8.452456 | gemma | 14 | 2.614 | true | false | false | false | 2.387488 | 0.118078 | 11.807781 | 0.379322 | 12.844188 | 0.011329 | 1.132931 | 0.270134 | 2.684564 | 0.40426 | 8.265885 | 0.225814 | 13.979388 | false | false | 2025-01-26 | 2025-01-27 | 0 | silma-ai/SILMA-Kashif-2B-Instruct-v1.0 |
| siqi00_Mistral-7B-DFT_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | [siqi00/Mistral-7B-DFT](https://huggingface.co/siqi00/Mistral-7B-DFT) [📑](https://huggingface.co/datasets/open-llm-leaderboard/siqi00__Mistral-7B-DFT-details) | siqi00/Mistral-7B-DFT | d0ec860cddca6094253d50d474ee78bfe371df2b | 20.755222 | apache-2.0 | 0 | 7.242 | true | false | false | true | 0.875522 | 0.556867 | 55.686689 | 0.466488 | 25.364499 | 0.037764 | 3.776435 | 0.30453 | 7.270694 | 0.419115 | 10.622656 | 0.296293 | 21.810358 | false | false | 2025-02-05 | 2025-02-05 | 1 | mistralai/Mistral-7B-v0.1 |
| siqi00_Mistral-7B-DFT2_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | [siqi00/Mistral-7B-DFT2](https://huggingface.co/siqi00/Mistral-7B-DFT2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/siqi00__Mistral-7B-DFT2-details) | siqi00/Mistral-7B-DFT2 | 77d3f365b9b65ffcdda6ee028fd303d145b117f4 | 19.875597 | apache-2.0 | 0 | 7.242 | true | false | false | true | 0.401105 | 0.580372 | 58.03723 | 0.396838 | 15.39381 | 0.045317 | 4.531722 | 0.299497 | 6.599553 | 0.440073 | 14.109115 | 0.285239 | 20.582151 | false | false | 2025-02-09 | 2025-02-26 | 1 | mistralai/Mistral-7B-v0.1 |
| skumar9_Llama-medx_v2_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [skumar9/Llama-medx_v2](https://huggingface.co/skumar9/Llama-medx_v2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/skumar9__Llama-medx_v2-details) | skumar9/Llama-medx_v2 | 3c955655894733b2f851de017134c84b0a62f380 | 19.883862 | apache-2.0 | 1 | 8.03 | true | false | false | false | 2.01996 | 0.446234 | 44.623377 | 0.490859 | 27.423042 | 0.09139 | 9.138973 | 0.305369 | 7.38255 | 0.366125 | 3.365625 | 0.346326 | 27.369607 | false | false | 2024-04-29 | 2025-01-29 | 0 | skumar9/Llama-medx_v2 |
| skymizer_Llama2-7b-sft-chat-custom-template-dpo_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [skymizer/Llama2-7b-sft-chat-custom-template-dpo](https://huggingface.co/skymizer/Llama2-7b-sft-chat-custom-template-dpo) [📑](https://huggingface.co/datasets/open-llm-leaderboard/skymizer__Llama2-7b-sft-chat-custom-template-dpo-details) | skymizer/Llama2-7b-sft-chat-custom-template-dpo | 22302ebd8c551a5f302fcb8366cc61fdeedf0e00 | 10.140548 | llama2 | 0 | 6.738 | true | false | false | false | 1.232941 | 0.235282 | 23.528238 | 0.368847 | 11.238865 | 0.01435 | 1.435045 | 0.239094 | 0 | 0.442865 | 14.12474 | 0.194648 | 10.516401 | false | false | 2024-06-11 | 2024-07-01 | 1 | Removed |
| someon98_qwen-CoMa-0.5b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | [someon98/qwen-CoMa-0.5b](https://huggingface.co/someon98/qwen-CoMa-0.5b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/someon98__qwen-CoMa-0.5b-details) | someon98/qwen-CoMa-0.5b | 67336cfb494c0aa1995be0efdeeb9fb0c6a386fe | 5.85806 | | 1 | 0.63 | false | false | false | false | 1.018358 | 0.227664 | 22.766371 | 0.295334 | 2.126794 | 0.004532 | 0.453172 | 0.239933 | 0 | 0.404573 | 8.704948 | 0.109874 | 1.097074 | false | false | 2024-12-29 | 2024-12-29 | 1 | someon98/qwen-CoMa-0.5b (Merge) |
| sometimesanotion_ChocoTrio-14B-v1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [sometimesanotion/ChocoTrio-14B-v1](https://huggingface.co/sometimesanotion/ChocoTrio-14B-v1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__ChocoTrio-14B-v1-details) | sometimesanotion/ChocoTrio-14B-v1 | da10e1b6a7eb22cd4a1736fab5b17e8d026c57e9 | 41.158306 | | 0 | 14.766 | false | false | false | false | 1.845762 | 0.708891 | 70.88913 | 0.650584 | 50.013292 | 0.397281 | 39.728097 | 0.385067 | 18.008949 | 0.482052 | 19.75651 | 0.536985 | 48.553856 | false | false | | 2025-03-01 | 0 | Removed |
| sometimesanotion_IF-reasoning-experiment-40_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [sometimesanotion/IF-reasoning-experiment-40](https://huggingface.co/sometimesanotion/IF-reasoning-experiment-40) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__IF-reasoning-experiment-40-details) | sometimesanotion/IF-reasoning-experiment-40 | 0064fffb67d18b0f946b6e7bf3227ca0c92af3eb | 38.780696 | | 0 | 14 | false | false | false | false | 3.809918 | 0.632979 | 63.297938 | 0.611186 | 44.306408 | 0.371601 | 37.160121 | 0.380034 | 17.337808 | 0.519417 | 25.860417 | 0.502493 | 44.721483 | false | false | | 2024-12-29 | 0 | Removed |
| sometimesanotion_IF-reasoning-experiment-80_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [sometimesanotion/IF-reasoning-experiment-80](https://huggingface.co/sometimesanotion/IF-reasoning-experiment-80) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__IF-reasoning-experiment-80-details) | sometimesanotion/IF-reasoning-experiment-80 | d1441e8bd87f11235fd4c708f6ece69a9973c343 | 22.64532 | | 0 | 7.383 | false | false | false | false | 3.773965 | 0.546276 | 54.62761 | 0.421038 | 17.48234 | 0.098943 | 9.89426 | 0.284396 | 4.58613 | 0.502458 | 22.973958 | 0.336769 | 26.307624 | false | false | | 2024-12-29 | 0 | Removed |
| sometimesanotion_KytheraMix-7B-v0.2_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [sometimesanotion/KytheraMix-7B-v0.2](https://huggingface.co/sometimesanotion/KytheraMix-7B-v0.2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__KytheraMix-7B-v0.2-details) | sometimesanotion/KytheraMix-7B-v0.2 | 2052860a45a71fa30196077b99596264d1002429 | 32.384079 | | 0 | 7.613 | false | false | false | false | 1.381779 | 0.612871 | 61.287052 | 0.56352 | 37.5015 | 0.292296 | 29.229607 | 0.33557 | 11.409396 | 0.459417 | 15.927083 | 0.450549 | 38.949837 | false | false | | 2025-02-05 | 0 | Removed |
| sometimesanotion_Lamarck-14B-v0.1-experimental_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [sometimesanotion/Lamarck-14B-v0.1-experimental](https://huggingface.co/sometimesanotion/Lamarck-14B-v0.1-experimental) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Lamarck-14B-v0.1-experimental-details) | sometimesanotion/Lamarck-14B-v0.1-experimental | b0600e08e8c97b25d1abca543b997d9927245442 | 37.552164 | | 0 | 14.766 | false | false | false | false | 3.789387 | 0.535385 | 53.5385 | 0.658254 | 50.794908 | 0.358006 | 35.800604 | 0.381711 | 17.561521 | 0.472844 | 18.638802 | 0.540808 | 48.97865 | false | false | | 2024-12-09 | 0 | Removed |
| sometimesanotion_Lamarck-14B-v0.3_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [sometimesanotion/Lamarck-14B-v0.3](https://huggingface.co/sometimesanotion/Lamarck-14B-v0.3) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Lamarck-14B-v0.3-details) | sometimesanotion/Lamarck-14B-v0.3 | 781637d1b65766fe933ebde070632e48f91390ab | 36.853034 | apache-2.0 | 2 | 14.766 | true | false | false | false | 9.590061 | 0.503162 | 50.316161 | 0.66114 | 51.274309 | 0.340634 | 34.063444 | 0.388423 | 18.456376 | 0.468813 | 18.001563 | 0.541057 | 49.006353 | true | false | 2024-12-06 | 2024-12-09 | 1 | sometimesanotion/Lamarck-14B-v0.3 (Merge) |
| sometimesanotion_Lamarck-14B-v0.4-Qwenvergence_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [sometimesanotion/Lamarck-14B-v0.4-Qwenvergence](https://huggingface.co/sometimesanotion/Lamarck-14B-v0.4-Qwenvergence) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Lamarck-14B-v0.4-Qwenvergence-details) | sometimesanotion/Lamarck-14B-v0.4-Qwenvergence | add9a151dd5614603bebcf3d3740fa92e5d67632 | 36.620146 | | 0 | 14.766 | false | false | false | false | 3.482527 | 0.490647 | 49.064704 | 0.653514 | 50.208045 | 0.339879 | 33.987915 | 0.378356 | 17.114094 | 0.484688 | 20.385937 | 0.540642 | 48.96018 | false | false | | 2024-12-12 | 0 | Removed |
| sometimesanotion_Lamarck-14B-v0.6_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [sometimesanotion/Lamarck-14B-v0.6](https://huggingface.co/sometimesanotion/Lamarck-14B-v0.6) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Lamarck-14B-v0.6-details) | sometimesanotion/Lamarck-14B-v0.6 | e9c144208c045fe6954ef3f658a3bda38dbd0d82 | 41.167444 | apache-2.0 | 14 | 14.766 | true | false | false | false | 3.844769 | 0.697251 | 69.725107 | 0.646031 | 49.297895 | 0.404079 | 40.407855 | 0.389262 | 18.568233 | 0.484688 | 20.119271 | 0.539977 | 48.886303 | true | false | 2025-01-04 | 2025-01-05 | 1 | sometimesanotion/Lamarck-14B-v0.6 (Merge) |
| sometimesanotion_Lamarck-14B-v0.6-002-model_stock_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [sometimesanotion/Lamarck-14B-v0.6-002-model_stock](https://huggingface.co/sometimesanotion/Lamarck-14B-v0.6-002-model_stock) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Lamarck-14B-v0.6-002-model_stock-details) | sometimesanotion/Lamarck-14B-v0.6-002-model_stock | c2d5adb04b1839aeeca77a3f2a5be08864116da1 | 39.457579 | | 0 | 14 | false | false | false | false | 3.775851 | 0.669224 | 66.922432 | 0.614335 | 45.006584 | 0.377644 | 37.76435 | 0.374161 | 16.55481 | 0.518021 | 25.452604 | 0.505402 | 45.044696 | false | false | | 2025-01-01 | 0 | Removed |
| sometimesanotion_Lamarck-14B-v0.6-model_stock_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [sometimesanotion/Lamarck-14B-v0.6-model_stock](https://huggingface.co/sometimesanotion/Lamarck-14B-v0.6-model_stock) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Lamarck-14B-v0.6-model_stock-details) | sometimesanotion/Lamarck-14B-v0.6-model_stock | 4d4227285a889ffd23618ad32ff7b08d1bcfa5ae | 40.676082 | | 0 | 14 | false | false | false | false | 3.723279 | 0.678966 | 67.896625 | 0.626944 | 46.491326 | 0.424471 | 42.44713 | 0.384228 | 17.897092 | 0.500656 | 22.682031 | 0.519781 | 46.642287 | false | false | | 2024-12-31 | 0 | Removed |
| sometimesanotion_Lamarck-14B-v0.7-Fusion_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [sometimesanotion/Lamarck-14B-v0.7-Fusion](https://huggingface.co/sometimesanotion/Lamarck-14B-v0.7-Fusion) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Lamarck-14B-v0.7-Fusion-details) | sometimesanotion/Lamarck-14B-v0.7-Fusion | f2413f4fa9d9fdc6a29b8c28f541875a7a8061df | 41.681652 | apache-2.0 | 8 | 14.766 | true | false | false | false | 5.515341 | 0.682113 | 68.211346 | 0.654364 | 50.4265 | 0.404079 | 40.407855 | 0.401007 | 20.134228 | 0.499135 | 22.12526 | 0.539063 | 48.784722 | true | false | 2025-02-23 | 2025-02-23 | 1 | sometimesanotion/Lamarck-14B-v0.7-Fusion (Merge) |
| sometimesanotion_Lamarck-14B-v0.7-rc1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [sometimesanotion/Lamarck-14B-v0.7-rc1](https://huggingface.co/sometimesanotion/Lamarck-14B-v0.7-rc1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Lamarck-14B-v0.7-rc1-details) | sometimesanotion/Lamarck-14B-v0.7-rc1 | 7735f8b60b6cf5728ee26b84e4d7fab846657ac4 | 41.141253 | | 0 | 14.766 | false | false | false | false | 3.897561 | 0.730548 | 73.054828 | 0.648603 | 49.508161 | 0.385196 | 38.519637 | 0.389262 | 18.568233 | 0.471479 | 18.134896 | 0.541556 | 49.061761 | false | false | | 2025-01-19 | 0 | Removed |
| sometimesanotion_Lamarck-14B-v0.7-rc4_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [sometimesanotion/Lamarck-14B-v0.7-rc4](https://huggingface.co/sometimesanotion/Lamarck-14B-v0.7-rc4) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Lamarck-14B-v0.7-rc4-details) | sometimesanotion/Lamarck-14B-v0.7-rc4 | 724da952865e5fe0555e7d86bda9168541df0f2e | 41.790135 | apache-2.0 | 39 | 14.766 | true | false | false | false | 3.832462 | 0.721081 | 72.108118 | 0.650965 | 49.85495 | 0.402568 | 40.256798 | 0.389262 | 18.568233 | 0.491198 | 21.066406 | 0.539977 | 48.886303 | true | false | 2025-01-21 | 2025-01-21 | 1 | sometimesanotion/Lamarck-14B-v0.7-rc4 (Merge) |
| sometimesanotion_LamarckInfusion-14B-v1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [sometimesanotion/LamarckInfusion-14B-v1](https://huggingface.co/sometimesanotion/LamarckInfusion-14B-v1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__LamarckInfusion-14B-v1-details) | sometimesanotion/LamarckInfusion-14B-v1 | 39236e060b4aae1f882abeb6e2a3672076169c91 | 42.057673 | | 0 | 14.766 | false | false | false | false | 1.981689 | 0.719832 | 71.983227 | 0.653925 | 50.347643 | 0.416918 | 41.691843 | 0.39094 | 18.791946 | 0.489896 | 20.903646 | 0.53765 | 48.627733 | false | false | | 2025-02-25 | 0 | Removed |
| sometimesanotion_LamarckInfusion-14B-v2_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [sometimesanotion/LamarckInfusion-14B-v2](https://huggingface.co/sometimesanotion/LamarckInfusion-14B-v2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__LamarckInfusion-14B-v2-details) | sometimesanotion/LamarckInfusion-14B-v2 | fb7c7f4ae83dcaab6d9e9e6c21af7fe83f584561 | 42.110943 | | 0 | 14.766 | false | false | false | false | 1.885215 | 0.681189 | 68.118924 | 0.656443 | 50.841491 | 0.438822 | 43.882175 | 0.387584 | 18.344519 | 0.49926 | 22.407552 | 0.541639 | 49.070996 | false | false | | 2025-03-01 | 0 | Removed |
| sometimesanotion_LamarckInfusion-14B-v2-hi_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [sometimesanotion/LamarckInfusion-14B-v2-hi](https://huggingface.co/sometimesanotion/LamarckInfusion-14B-v2-hi) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__LamarckInfusion-14B-v2-hi-details) | sometimesanotion/LamarckInfusion-14B-v2-hi | 291a3e56e35b33172d8eaa574bbb64cdd13e46d3 | 41.509609 | | 0 | 14.766 | false | false | false | false | 1.902566 | 0.685486 | 68.548562 | 0.655503 | 50.658425 | 0.422961 | 42.296073 | 0.388423 | 18.456376 | 0.484719 | 20.15651 | 0.540475 | 48.941711 | false | false | | 2025-03-01 | 0 | Removed |
| sometimesanotion_LamarckInfusion-14B-v2-lo_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [sometimesanotion/LamarckInfusion-14B-v2-lo](https://huggingface.co/sometimesanotion/LamarckInfusion-14B-v2-lo) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__LamarckInfusion-14B-v2-lo-details) | sometimesanotion/LamarckInfusion-14B-v2-lo | 50afe3fe5b5f3ba5455929f301f426a5f4229938 | 41.584078 | | 0 | 14.766 | false | false | false | false | 1.815479 | 0.678791 | 67.879116 | 0.652844 | 50.252992 | 0.423716 | 42.371601 | 0.385906 | 18.120805 | 0.499104 | 22.021354 | 0.539727 | 48.858599 | false | false | | 2025-03-01 | 0 | Removed |
| sometimesanotion_LamarckInfusion-14B-v3_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [sometimesanotion/LamarckInfusion-14B-v3](https://huggingface.co/sometimesanotion/LamarckInfusion-14B-v3) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__LamarckInfusion-14B-v3-details) | sometimesanotion/LamarckInfusion-14B-v3 | f2efbc9345e6d1edb59525901226e06dd38d23bf | 41.583157 | | 0 | 14.766 | false | false | false | false | 1.948833 | 0.713138 | 71.313781 | 0.651767 | 50.091807 | 0.412387 | 41.238671 | 0.386745 | 18.232662 | 0.482021 | 19.652604 | 0.540725 | 48.969415 | false | false | | 2025-03-01 | 0 | Removed |
| sometimesanotion_Qwen-14B-ProseStock-v4_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [sometimesanotion/Qwen-14B-ProseStock-v4](https://huggingface.co/sometimesanotion/Qwen-14B-ProseStock-v4) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwen-14B-ProseStock-v4-details) | sometimesanotion/Qwen-14B-ProseStock-v4 | 7bbd108559500c0efca1f8925180bb1771425559 | 37.37646 | | 0 | 14 | false | false | false | false | 3.688964 | 0.494219 | 49.421867 | 0.649827 | 49.5413 | 0.364048 | 36.404834 | 0.388423 | 18.456376 | 0.493833 | 21.695833 | 0.538647 | 48.738549 | false | false | | 2024-12-24 | 0 | Removed |
| sometimesanotion_Qwen-2.5-14B-Virmarckeoso_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [sometimesanotion/Qwen-2.5-14B-Virmarckeoso](https://huggingface.co/sometimesanotion/Qwen-2.5-14B-Virmarckeoso) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwen-2.5-14B-Virmarckeoso-details) | sometimesanotion/Qwen-2.5-14B-Virmarckeoso | | 36.636164 | | 0 | 14.766 | false | false | false | false | 4.790309 | 0.48133 | 48.132954 | 0.656973 | 50.652295 | 0.356495 | 35.649547 | 0.379195 | 17.225951 | 0.479354 | 19.519271 | 0.537733 | 48.636968 | false | false | | 2024-12-10 | 0 | Removed |
| sometimesanotion_Qwen2.5-14B-Vimarckoso_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [sometimesanotion/Qwen2.5-14B-Vimarckoso](https://huggingface.co/sometimesanotion/Qwen2.5-14B-Vimarckoso) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwen2.5-14B-Vimarckoso-details) | sometimesanotion/Qwen2.5-14B-Vimarckoso | 0865365f6c0b221c08fdd5adf8965f3720645226 | 36.056941 | | 0 | 14.766 | false | false | false | false | 3.135318 | 0.457424 | 45.742408 | 0.644635 | 49.178956 | 0.338369 | 33.836858 | 0.392617 | 19.01566 | 0.485865 | 20.466406 | 0.532912 | 48.101359 | false | false | | 2024-12-11 | 0 | Removed |
| sometimesanotion_Qwen2.5-14B-Vimarckoso-v2_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [sometimesanotion/Qwen2.5-14B-Vimarckoso-v2](https://huggingface.co/sometimesanotion/Qwen2.5-14B-Vimarckoso-v2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwen2.5-14B-Vimarckoso-v2-details) | sometimesanotion/Qwen2.5-14B-Vimarckoso-v2 | 5768a4448e4e3a95a7f459ac2b106abbf8510840 | 36.185823 | | 0 | 14 | false | false | false | false | 3.165081 | 0.45053 | 45.053015 | 0.655034 | 50.419625 | 0.358006 | 35.800604 | 0.38255 | 17.673378 | 0.481896 | 19.503646 | 0.537982 | 48.664672 | false | false | | 2024-12-26 | 0 | Removed |
| sometimesanotion_Qwen2.5-14B-Vimarckoso-v3_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [sometimesanotion/Qwen2.5-14B-Vimarckoso-v3](https://huggingface.co/sometimesanotion/Qwen2.5-14B-Vimarckoso-v3) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwen2.5-14B-Vimarckoso-v3-details) | sometimesanotion/Qwen2.5-14B-Vimarckoso-v3 | e2f4b596010057af0cd8f27ba992bf9d6af48801 | 41.026522 | apache-2.0 | 11 | 14 | true | false | false | false | 3.857249 | 0.725652 | 72.565238 | 0.64146 | 48.581587 | 0.400302 | 40.030211 | 0.380034 | 17.337808 | 0.480688 | 19.385937 | 0.534325 | 48.258348 | true | false | 2024-12-27 | 2024-12-27 | 1 | sometimesanotion/Qwen2.5-14B-Vimarckoso-v3 (Merge) |
| sometimesanotion_Qwen2.5-14B-Vimarckoso-v3-IF-Variant_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [sometimesanotion/Qwen2.5-14B-Vimarckoso-v3-IF-Variant](https://huggingface.co/sometimesanotion/Qwen2.5-14B-Vimarckoso-v3-IF-Variant) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwen2.5-14B-Vimarckoso-v3-IF-Variant-details) | sometimesanotion/Qwen2.5-14B-Vimarckoso-v3-IF-Variant | 246b592926a9351b195650b5bcfe1cba9218a698 | 34.412379 | | 0 | 14 | false | false | false | false | 3.934752 | 0.641297 | 64.129731 | 0.552079 | 35.653097 | 0.254532 | 25.453172 | 0.347315 | 12.975391 | 0.531917 | 28.389583 | 0.45886 | 39.873301 | false | false | | 2024-12-28 | 0 | Removed |
| sometimesanotion_Qwen2.5-14B-Vimarckoso-v3-Prose01_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [sometimesanotion/Qwen2.5-14B-Vimarckoso-v3-Prose01](https://huggingface.co/sometimesanotion/Qwen2.5-14B-Vimarckoso-v3-Prose01) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwen2.5-14B-Vimarckoso-v3-Prose01-details) | sometimesanotion/Qwen2.5-14B-Vimarckoso-v3-Prose01 | 3c65fc0b2ffb89149b4c6e984414d3a13000fd7c | 40.27917 | | 0 | 14 | false | false | false | false | 3.787916 | 0.687234 | 68.723432 | 0.635877 | 47.706625 | 0.399547 | 39.954683 | 0.386745 | 18.232662 | 0.480719 | 19.55651 | 0.52751 | 47.501108 | false | false | | 2024-12-30 | 0 | Removed |
| sometimesanotion_Qwen2.5-14B-Vimarckoso-v3-model_stock_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [sometimesanotion/Qwen2.5-14B-Vimarckoso-v3-model_stock](https://huggingface.co/sometimesanotion/Qwen2.5-14B-Vimarckoso-v3-model_stock) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwen2.5-14B-Vimarckoso-v3-model_stock-details) | sometimesanotion/Qwen2.5-14B-Vimarckoso-v3-model_stock | 06ec138247d03a9308c886c8b326f210c18117e4 | 41.224844 | | 0 | 14 | false | false | false | false | 3.769905 | 0.716185 | 71.618528 | 0.642092 | 48.761006 | 0.424471 | 42.44713 | 0.380034 | 17.337808 | 0.478115 | 19.23099 | 0.531582 | 47.953605 | false | false | | 2024-12-27 | 0 | Removed |
| sometimesanotion_Qwen2.5-7B-Gordion-v0.1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [sometimesanotion/Qwen2.5-7B-Gordion-v0.1](https://huggingface.co/sometimesanotion/Qwen2.5-7B-Gordion-v0.1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwen2.5-7B-Gordion-v0.1-details) | sometimesanotion/Qwen2.5-7B-Gordion-v0.1 | 0f31fa5189c4d5106d374535ced13c9817cb2c8b | 32.156837 | | 0 | 7.613 | false | false | false | true | 1.386424 | 0.748184 | 74.818371 | 0.552381 | 36.011774 | 0.291541 | 29.154079 | 0.307886 | 7.718121 | 0.401625 | 8.569792 | 0.43002 | 36.668883 | false | false | | 2025-02-07 | 0 | Removed |
| sometimesanotion_Qwen2.5-7B-Gordion-v0.1-Prose_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [sometimesanotion/Qwen2.5-7B-Gordion-v0.1-Prose](https://huggingface.co/sometimesanotion/Qwen2.5-7B-Gordion-v0.1-Prose) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwen2.5-7B-Gordion-v0.1-Prose-details) | sometimesanotion/Qwen2.5-7B-Gordion-v0.1-Prose | 12c9682b5d9e5e738e6b818b01ead86a76364ffc | 30.51879 | | 0 | 7.613 | false | false | false | false | 0.646903 | 0.53471 | 53.471012 | 0.559909 | 37.441327 | 0.289275 | 28.927492 | 0.32047 | 9.395973 | 0.450177 | 14.705469 | 0.452543 | 39.171469 | false | false | | 2025-02-08 | 0 | Removed |
| sometimesanotion_Qwen2.5-7B-Gordion-v0.1-Reason_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [sometimesanotion/Qwen2.5-7B-Gordion-v0.1-Reason](https://huggingface.co/sometimesanotion/Qwen2.5-7B-Gordion-v0.1-Reason) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwen2.5-7B-Gordion-v0.1-Reason-details) | sometimesanotion/Qwen2.5-7B-Gordion-v0.1-Reason | 2a1913c4153e05dfab7194910864f91c9dac3e16 | 29.020682 | | 0 | 7.613 | false | false | false | false | 0.670561 | 0.491721 | 49.172086 | 0.549817 | 36.259832 | 0.262085 | 26.208459 | 0.340604 | 12.080537 | 0.443417 | 13.660417 | 0.430685 | 36.74276 | false | false | | 2025-02-08 | 0 | Removed |
| sometimesanotion_Qwentessential-14B-v1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [sometimesanotion/Qwentessential-14B-v1](https://huggingface.co/sometimesanotion/Qwentessential-14B-v1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwentessential-14B-v1-details) | sometimesanotion/Qwentessential-14B-v1 | feea151e26c094b74bd8e76ef99b698854623b78 | 40.278762 | | 0 | 14.766 | false | false | false | false | 1.957839 | 0.627908 | 62.790839 | 0.654517 | 50.365975 | 0.4071 | 40.70997 | 0.387584 | 18.344519 | 0.487292 | 20.778125 | 0.538148 | 48.683141 | false | false | | 2025-02-16 | 0 | Removed |
| sometimesanotion_Qwentinuum-14B-v013_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [sometimesanotion/Qwentinuum-14B-v013](https://huggingface.co/sometimesanotion/Qwentinuum-14B-v013) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwentinuum-14B-v013-details) | sometimesanotion/Qwentinuum-14B-v013 | 2e5ad6d32e76852a803b976078ac0ac2ff0aaaac | 38.636063 | | 0 | 14 | false | false | false | false | 3.847378 | 0.671123 | 67.112262 | 0.608663 | 43.965235 | 0.370846 | 37.084592 | 0.357383 | 14.317673 | 0.515417 | 24.99375 | 0.499086 | 44.342863 | false | false | | 2024-12-26 | 0 | Removed |
| sometimesanotion_Qwentinuum-14B-v1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [sometimesanotion/Qwentinuum-14B-v1](https://huggingface.co/sometimesanotion/Qwentinuum-14B-v1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwentinuum-14B-v1-details) | sometimesanotion/Qwentinuum-14B-v1 | cd71c7c9f4e18deed1fe8000ae4784b96c33281f | 37.151309 | | 0 | 14 | false | false | false | false | 3.836301 | 0.503162 | 50.316161 | 0.657257 | 50.737494 | 0.360272 | 36.02719 | 0.38255 | 17.673378 | 0.478052 | 19.15651 | 0.540974 | 48.997119 | false | false | | 2024-12-21 | 0 | Removed |
| sometimesanotion_Qwentinuum-14B-v2_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [sometimesanotion/Qwentinuum-14B-v2](https://huggingface.co/sometimesanotion/Qwentinuum-14B-v2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwentinuum-14B-v2-details) | sometimesanotion/Qwentinuum-14B-v2 | 70f5b77f646b5f4cc6f7decf7bd3c7b3bd4cebcf | 37.915758 | | 0 | 14 | false | false | false | false | 4.006593 | 0.537833 | 53.783295 | 0.655536 | 50.53548 | 0.375378 | 37.537764 | 0.388423 | 18.456376 | 0.471417 | 18.19375 | 0.540891 | 48.987884 | false | false | | 2024-12-21 | 0 | Removed |
| sometimesanotion_Qwentinuum-14B-v3_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [sometimesanotion/Qwentinuum-14B-v3](https://huggingface.co/sometimesanotion/Qwentinuum-14B-v3) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwentinuum-14B-v3-details) | sometimesanotion/Qwentinuum-14B-v3 | 2331e2c1afe4e224c9c019f4f03c2ad19bd15465 | 39.159304 | | 0 | 14 | false | false | false | false | 3.826016 | 0.615768 | 61.576838 | 0.653865 | 50.037611 | 0.353474 | 35.347432 | 0.387584 | 18.344519 | 0.48599 | 20.615365 | 0.541307 | 49.034057 | false | false | | 2024-12-22 | 0 | Removed |
| sometimesanotion_Qwentinuum-14B-v5_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [sometimesanotion/Qwentinuum-14B-v5](https://huggingface.co/sometimesanotion/Qwentinuum-14B-v5) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwentinuum-14B-v5-details) | sometimesanotion/Qwentinuum-14B-v5 | 8be868ce00f239bf06c859c0c40fcf4c54a9205c | 39.350778 | | 0 | 14 | false | false | false | false | 3.767575 | 0.628558 | 62.855778 | 0.654985 | 50.283974 | 0.344411 | 34.441088 | 0.387584 | 18.344519 | 0.487385 | 21.089844 | 0.541805 | 49.089465 | false | false | | 2024-12-22 | 0 | Removed |
| sometimesanotion_Qwentinuum-14B-v6_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [sometimesanotion/Qwentinuum-14B-v6](https://huggingface.co/sometimesanotion/Qwentinuum-14B-v6) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwentinuum-14B-v6-details) | sometimesanotion/Qwentinuum-14B-v6 | 951576b4056fe63d02cdc31a653585d9706beba9 | 39.599469 | | 0 | 14 | false | false | false | false | 3.807937 | 0.630406 | 63.040621 | 0.654452 | 50.23191 | 0.360272 | 36.02719 | 0.386745 | 18.232662 | 0.489958 | 21.178125 | 0.539977 | 48.886303 | false | false | | 2024-12-22 | 0 | Removed |
| sometimesanotion_Qwentinuum-14B-v6-Prose_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [sometimesanotion/Qwentinuum-14B-v6-Prose](https://huggingface.co/sometimesanotion/Qwentinuum-14B-v6-Prose) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwentinuum-14B-v6-Prose-details) | sometimesanotion/Qwentinuum-14B-v6-Prose | fc6086a7732bc8e87505f4c2bc49561a52ad04a9 | 38.696454 | | 0 | 14 | false | false | false | false | 3.92707 | 0.564286 | 56.428609 | 0.654511 | 50.140601 | 0.370091 | 37.009063 | 0.388423 | 18.456376 | 0.49126 | 21.340885 | 0.539229 | 48.803191 | false | false | | 2024-12-26 | 0 | Removed |
| sometimesanotion_Qwentinuum-14B-v7_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [sometimesanotion/Qwentinuum-14B-v7](https://huggingface.co/sometimesanotion/Qwentinuum-14B-v7) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwentinuum-14B-v7-details) | sometimesanotion/Qwentinuum-14B-v7 | e9505b4931323752ebb0c901494c050835f0e4d8 | 39.150356 | | 0 | 14 | false | false | false | false | 3.795124 | 0.610922 | 61.092235 | 0.655143 | 50.347065 | 0.357251 | 35.725076 | 0.39094 | 18.791946 | 0.48199 | 19.948698 | 0.540974 | 48.997119 | false | false | | 2024-12-22 | 0 | Removed |
| sometimesanotion_Qwentinuum-14B-v8_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [sometimesanotion/Qwentinuum-14B-v8](https://huggingface.co/sometimesanotion/Qwentinuum-14B-v8) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwentinuum-14B-v8-details) | sometimesanotion/Qwentinuum-14B-v8 | a856d3095937fd39f829824d6c6d9950cf56dc1d | 38.48493 | | 0 | 14 | false | false | false | false | 3.955933 | 0.541155 | 54.115525 | 0.653426 | 50.11143 | 0.391239 | 39.123867 | 0.383389 | 17.785235 | 0.487323 | 20.748698 | 0.541223 | 49.024823 | false | false | | 2024-12-22 | 0 | Removed |
| sometimesanotion_Qwentinuum-14B-v9_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [sometimesanotion/Qwentinuum-14B-v9](https://huggingface.co/sometimesanotion/Qwentinuum-14B-v9) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwentinuum-14B-v9-details) | sometimesanotion/Qwentinuum-14B-v9 | 3109d6342d8740336dc83569def5b3d80abfac38 | 37.217442 | | 0 | 14 | false | false | false | false | 3.805382 | 0.51073 | 51.073042 | 0.658026 | 50.801347 | 0.348187 | 34.818731 | 0.385906 | 18.120805 | 0.478115 | 19.364323 | 0.542138 | 49.126404 | false | false | | 2024-12-22 | 0 | Removed |
| sometimesanotion_Qwenvergence-14B-qv256_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [sometimesanotion/Qwenvergence-14B-qv256](https://huggingface.co/sometimesanotion/Qwenvergence-14B-qv256) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwenvergence-14B-qv256-details) | sometimesanotion/Qwenvergence-14B-qv256 | 13e8b600da0b78b23481738858b7ed2d533ee6e5 | 40.120387 | | 0 | 14 | false | false | false | false | 3.927285 | 0.700623 | 70.062324 | 0.631208 | 47.078218 | 0.389728 | 38.97281 | 0.378356 | 17.114094 | 0.492594 | 21.074219 | 0.517786 | 46.420656 | false | false | | 2025-01-01 | 0 | Removed |
| sometimesanotion_Qwenvergence-14B-v0.6-004-model_stock_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [sometimesanotion/Qwenvergence-14B-v0.6-004-model_stock](https://huggingface.co/sometimesanotion/Qwenvergence-14B-v0.6-004-model_stock) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwenvergence-14B-v0.6-004-model_stock-details) | sometimesanotion/Qwenvergence-14B-v0.6-004-model_stock | 1fa94759545d9b591bcbbe93a2c90f2a346f9580 | 40.60376 | | 0 | 14 | false | false | false | false | 3.819502 | 0.685985 | 68.598541 | 0.624934 | 46.366654 | 0.409366 | 40.936556 | 0.383389 | 17.785235 | 0.503323 | 23.348698 | 0.519282 | 46.586879 | false | false | | 2025-01-01 | 0 | Removed |
| sometimesanotion_Qwenvergence-14B-v10_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [sometimesanotion/Qwenvergence-14B-v10](https://huggingface.co/sometimesanotion/Qwenvergence-14B-v10) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwenvergence-14B-v10-details) | sometimesanotion/Qwenvergence-14B-v10 | 49b05dd6652ff43233ced904b0b49775c06abf75 | 41.47601 | | 0 | 14.766 | false | false | false | false | 4.020614 | 0.675694 | 67.569383 | 0.631643 | 46.746254 | 0.478852 | 47.885196 | 0.379195 | 17.225951 | 0.499135 | 22.32526 | 0.523936 | 47.104019 | false | false | | 2025-01-22 | 0 | Removed |
sometimesanotion_Qwenvergence-14B-v11_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/sometimesanotion/Qwenvergence-14B-v11" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sometimesanotion/Qwenvergence-14B-v11</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwenvergence-14B-v11-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sometimesanotion/Qwenvergence-14B-v11
|
1bac3b52bdcbba680213f3771451d32ea86f3d28
| 41.516781
|
apache-2.0
| 4
| 14.766
| true
| false
| false
| false
| 3.71033
| 0.719233
| 71.923275
| 0.636755
| 47.548953
| 0.464502
| 46.450151
| 0.372483
| 16.331096
| 0.475448
| 18.764323
| 0.532746
| 48.08289
| true
| false
|
2025-01-29
|
2025-01-29
| 1
|
sometimesanotion/Qwenvergence-14B-v11 (Merge)
|
sometimesanotion_Qwenvergence-14B-v12-Prose_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/sometimesanotion/Qwenvergence-14B-v12-Prose" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sometimesanotion/Qwenvergence-14B-v12-Prose</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwenvergence-14B-v12-Prose-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sometimesanotion/Qwenvergence-14B-v12-Prose
|
9e84960b3d2b763ac5b3a3316340af39a4130ba9
| 38.052478
| 2
| 14.766
| false
| false
| false
| false
| 3.510997
| 0.541205
| 54.120511
| 0.650425
| 49.672529
| 0.353474
| 35.347432
| 0.386745
| 18.232662
| 0.499135
| 22.258594
| 0.538148
| 48.683141
| false
| false
|
2025-01-29
|
2025-01-30
| 1
|
sometimesanotion/Qwenvergence-14B-v12-Prose (Merge)
|
|
sometimesanotion_Qwenvergence-14B-v12-Prose-DS_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/sometimesanotion/Qwenvergence-14B-v12-Prose-DS" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sometimesanotion/Qwenvergence-14B-v12-Prose-DS</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwenvergence-14B-v12-Prose-DS-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sometimesanotion/Qwenvergence-14B-v12-Prose-DS
|
51d945881cab30d74de0c8f91a8dda4ea7ed9dc4
| 41.203248
| 7
| 14.766
| false
| false
| false
| false
| 3.361177
| 0.617342
| 61.734199
| 0.650673
| 49.86582
| 0.430514
| 43.05136
| 0.394295
| 19.239374
| 0.515073
| 24.784115
| 0.536902
| 48.544622
| false
| false
|
2025-01-30
|
2025-01-30
| 1
|
sometimesanotion/Qwenvergence-14B-v12-Prose-DS (Merge)
|
|
sometimesanotion_Qwenvergence-14B-v13-Prose-DS_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/sometimesanotion/Qwenvergence-14B-v13-Prose-DS" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sometimesanotion/Qwenvergence-14B-v13-Prose-DS</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwenvergence-14B-v13-Prose-DS-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sometimesanotion/Qwenvergence-14B-v13-Prose-DS
|
f2c9340915c7a0e49ba980baf66391bc1d568695
| 41.078776
|
apache-2.0
| 8
| 14.766
| true
| false
| false
| false
| 5.834947
| 0.717809
| 71.780875
| 0.640508
| 48.43969
| 0.385952
| 38.595166
| 0.383389
| 17.785235
| 0.492656
| 21.548698
| 0.534907
| 48.322991
| true
| false
|
2025-02-01
|
2025-02-05
| 1
|
sometimesanotion/Qwenvergence-14B-v13-Prose-DS (Merge)
|
sometimesanotion_Qwenvergence-14B-v15-Prose-MS_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/sometimesanotion/Qwenvergence-14B-v15-Prose-MS" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sometimesanotion/Qwenvergence-14B-v15-Prose-MS</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwenvergence-14B-v15-Prose-MS-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sometimesanotion/Qwenvergence-14B-v15-Prose-MS
| 37.711738
| 0
| 14.766
| false
| false
| false
| false
| 1.945424
| 0.503211
| 50.321148
| 0.655013
| 50.278196
| 0.363293
| 36.329305
| 0.395134
| 19.35123
| 0.491292
| 21.178125
| 0.539312
| 48.812426
| false
| false
|
2025-02-26
| 0
|
Removed
|
|||
sometimesanotion_Qwenvergence-14B-v2-Prose_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/sometimesanotion/Qwenvergence-14B-v2-Prose" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sometimesanotion/Qwenvergence-14B-v2-Prose</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwenvergence-14B-v2-Prose-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sometimesanotion/Qwenvergence-14B-v2-Prose
|
503b367e07a8ed3ce532d03ea35d40d8f17d6e35
| 36.955064
| 0
| 14
| false
| false
| false
| false
| 3.369578
| 0.470488
| 47.04883
| 0.651883
| 49.933472
| 0.35574
| 35.574018
| 0.393456
| 19.127517
| 0.492594
| 21.474219
| 0.537151
| 48.572326
| false
| false
|
2024-12-15
| 0
|
Removed
|
||
sometimesanotion_Qwenvergence-14B-v3_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/sometimesanotion/Qwenvergence-14B-v3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sometimesanotion/Qwenvergence-14B-v3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwenvergence-14B-v3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sometimesanotion/Qwenvergence-14B-v3
|
40c489fd71724f2fa3f7154e4874c6d00700c6c0
| 37.5173
| 0
| 14
| false
| false
| false
| false
| 3.80563
| 0.504411
| 50.441052
| 0.654824
| 50.352688
| 0.369335
| 36.933535
| 0.384228
| 17.897092
| 0.488594
| 20.740885
| 0.538647
| 48.738549
| false
| false
|
2024-12-21
| 0
|
Removed
|
||
sometimesanotion_Qwenvergence-14B-v3-Prose_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/sometimesanotion/Qwenvergence-14B-v3-Prose" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sometimesanotion/Qwenvergence-14B-v3-Prose</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwenvergence-14B-v3-Prose-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sometimesanotion/Qwenvergence-14B-v3-Prose
|
15e4222295ef31aee17c2e5b6e7a31ffd21e3c7b
| 37.521867
|
apache-2.0
| 5
| 14.766
| true
| false
| false
| false
| 3.422683
| 0.491771
| 49.177072
| 0.651291
| 49.798367
| 0.364804
| 36.480363
| 0.395134
| 19.35123
| 0.493896
| 21.770313
| 0.536985
| 48.553856
| true
| false
|
2024-12-21
|
2024-12-21
| 1
|
sometimesanotion/Qwenvergence-14B-v3-Prose (Merge)
|
sometimesanotion_Qwenvergence-14B-v3-Reason_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/sometimesanotion/Qwenvergence-14B-v3-Reason" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sometimesanotion/Qwenvergence-14B-v3-Reason</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwenvergence-14B-v3-Reason-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sometimesanotion/Qwenvergence-14B-v3-Reason
|
1e613b0e6bfdb08e7c21a3e6ba3b84e361cf8350
| 37.613265
| 0
| 14
| false
| false
| false
| false
| 3.791208
| 0.536684
| 53.668378
| 0.656128
| 50.694448
| 0.358006
| 35.800604
| 0.386745
| 18.232662
| 0.474021
| 18.452604
| 0.539478
| 48.830895
| false
| false
|
2024-12-21
| 0
|
Removed
|
||
sometimesanotion_Qwenvergence-14B-v3-Reason_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/sometimesanotion/Qwenvergence-14B-v3-Reason" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sometimesanotion/Qwenvergence-14B-v3-Reason</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwenvergence-14B-v3-Reason-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sometimesanotion/Qwenvergence-14B-v3-Reason
|
6acf3cbc9c36b19d66ac683f073e32a9bf86d56e
| 36.714048
| 0
| 14
| false
| false
| false
| false
| 1.926172
| 0.527816
| 52.781619
| 0.655744
| 50.635776
| 0.311934
| 31.193353
| 0.384228
| 17.897092
| 0.475417
| 18.927083
| 0.539644
| 48.849365
| false
| false
|
2024-12-21
| 0
|
Removed
|
||
sometimesanotion_Qwenvergence-14B-v6-Prose_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/sometimesanotion/Qwenvergence-14B-v6-Prose" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sometimesanotion/Qwenvergence-14B-v6-Prose</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwenvergence-14B-v6-Prose-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sometimesanotion/Qwenvergence-14B-v6-Prose
|
bbb6b0900b630a3120d036d3434ca0fa508ed559
| 38.950847
|
apache-2.0
| 0
| 14
| true
| false
| false
| false
| 3.870528
| 0.599007
| 59.90073
| 0.654375
| 50.119976
| 0.356495
| 35.649547
| 0.388423
| 18.456376
| 0.488656
| 21.015365
| 0.537068
| 48.563091
| true
| false
|
2024-12-26
|
2024-12-26
| 1
|
sometimesanotion/Qwenvergence-14B-v6-Prose (Merge)
|
sometimesanotion_Qwenvergence-14B-v6-Prose-model_stock_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/sometimesanotion/Qwenvergence-14B-v6-Prose-model_stock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sometimesanotion/Qwenvergence-14B-v6-Prose-model_stock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwenvergence-14B-v6-Prose-model_stock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sometimesanotion/Qwenvergence-14B-v6-Prose-model_stock
| 37.160676
| 0
| 14
| false
| false
| false
| false
| 1.966122
| 0.481105
| 48.110458
| 0.653044
| 49.914126
| 0.360272
| 36.02719
| 0.393456
| 19.127517
| 0.489896
| 21.036979
| 0.53873
| 48.747784
| false
| false
|
2024-12-26
| 0
|
Removed
|
|||
sometimesanotion_Qwenvergence-14B-v8_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/sometimesanotion/Qwenvergence-14B-v8" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sometimesanotion/Qwenvergence-14B-v8</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwenvergence-14B-v8-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sometimesanotion/Qwenvergence-14B-v8
|
2153b2ba874e99887b255967bb803222dc7d5c77
| 39.212222
| 0
| 14.766
| false
| false
| false
| false
| 4.021209
| 0.591339
| 59.133876
| 0.652246
| 49.834593
| 0.404834
| 40.483384
| 0.380872
| 17.449664
| 0.476781
| 19.097656
| 0.543467
| 49.274158
| false
| false
|
2025-01-16
| 0
|
Removed
|
||
sometimesanotion_Qwenvergence-14B-v9_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/sometimesanotion/Qwenvergence-14B-v9" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sometimesanotion/Qwenvergence-14B-v9</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwenvergence-14B-v9-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sometimesanotion/Qwenvergence-14B-v9
|
68f4f5d82011dec96bccec481788dd7e591a6d75
| 39.814874
| 0
| 14.766
| false
| false
| false
| false
| 3.830781
| 0.659807
| 65.980709
| 0.616562
| 44.843356
| 0.413897
| 41.389728
| 0.368289
| 15.771812
| 0.514115
| 25.23099
| 0.511054
| 45.672651
| false
| false
|
2025-01-17
| 0
|
Removed
|
||
sometimesanotion_lamarck-14b-prose-model_stock_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/sometimesanotion/lamarck-14b-prose-model_stock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sometimesanotion/lamarck-14b-prose-model_stock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__lamarck-14b-prose-model_stock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sometimesanotion/lamarck-14b-prose-model_stock
|
d71942f5b5471fca97914ea26a9f66bb5866693e
| 35.677974
| 0
| 14.766
| false
| false
| false
| false
| 3.115039
| 0.427649
| 42.764864
| 0.648762
| 49.383876
| 0.34139
| 34.138973
| 0.393456
| 19.127517
| 0.484594
| 20.274219
| 0.535406
| 48.378398
| false
| false
|
2024-12-09
| 0
|
Removed
|
||
sometimesanotion_lamarck-14b-reason-model_stock_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/sometimesanotion/lamarck-14b-reason-model_stock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sometimesanotion/lamarck-14b-reason-model_stock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__lamarck-14b-reason-model_stock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sometimesanotion/lamarck-14b-reason-model_stock
|
0f1d7f04b9219ffe3bc26aa3146380fba249d61a
| 36.961262
| 0
| 14.766
| false
| false
| false
| false
| 9.979105
| 0.496467
| 49.646715
| 0.65689
| 50.715404
| 0.358006
| 35.800604
| 0.384228
| 17.897092
| 0.474083
| 18.79375
| 0.540226
| 48.914007
| false
| false
|
2024-12-09
| 0
|
Removed
|
||
sonthenguyen_ft-unsloth-zephyr-sft-bnb-4bit-20241014-161415_float16
|
float16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Adapter
|
?
|
<a target="_blank" href="https://huggingface.co/sonthenguyen/ft-unsloth-zephyr-sft-bnb-4bit-20241014-161415" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sonthenguyen/ft-unsloth-zephyr-sft-bnb-4bit-20241014-161415</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sonthenguyen__ft-unsloth-zephyr-sft-bnb-4bit-20241014-161415-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sonthenguyen/ft-unsloth-zephyr-sft-bnb-4bit-20241014-161415
|
467eff1ac1c3395c130929bbe1f34a8194715e7c
| 8.889815
|
apache-2.0
| 0
| 7.723
| true
| false
| false
| true
| 3.255423
| 0.289338
| 28.933785
| 0.380418
| 12.789212
| 0.011329
| 1.132931
| 0.246644
| 0
| 0.386063
| 6.024479
| 0.140126
| 4.458481
| false
| false
|
2024-10-15
|
2024-10-16
| 1
|
unsloth/zephyr-sft-bnb-4bit
|
sonthenguyen_ft-unsloth-zephyr-sft-bnb-4bit-20241014-164205_float16
|
float16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Adapter
|
?
|
<a target="_blank" href="https://huggingface.co/sonthenguyen/ft-unsloth-zephyr-sft-bnb-4bit-20241014-164205" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sonthenguyen/ft-unsloth-zephyr-sft-bnb-4bit-20241014-164205</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sonthenguyen__ft-unsloth-zephyr-sft-bnb-4bit-20241014-164205-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sonthenguyen/ft-unsloth-zephyr-sft-bnb-4bit-20241014-164205
|
467eff1ac1c3395c130929bbe1f34a8194715e7c
| 12.932104
|
apache-2.0
| 0
| 7.723
| true
| false
| false
| true
| 3.177996
| 0.319938
| 31.993777
| 0.395862
| 16.710725
| 0.008308
| 0.830816
| 0.276007
| 3.467562
| 0.427177
| 12.097135
| 0.212434
| 12.492612
| false
| false
|
2024-10-15
|
2024-10-16
| 1
|
unsloth/zephyr-sft-bnb-4bit
|
sonthenguyen_ft-unsloth-zephyr-sft-bnb-4bit-20241014-170522_float16
|
float16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Adapter
|
?
|
<a target="_blank" href="https://huggingface.co/sonthenguyen/ft-unsloth-zephyr-sft-bnb-4bit-20241014-170522" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sonthenguyen/ft-unsloth-zephyr-sft-bnb-4bit-20241014-170522</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sonthenguyen__ft-unsloth-zephyr-sft-bnb-4bit-20241014-170522-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sonthenguyen/ft-unsloth-zephyr-sft-bnb-4bit-20241014-170522
|
467eff1ac1c3395c130929bbe1f34a8194715e7c
| 13.424509
|
apache-2.0
| 0
| 7.723
| true
| false
| false
| true
| 3.229396
| 0.376441
| 37.644118
| 0.382837
| 14.138282
| 0.009063
| 0.906344
| 0.265101
| 2.013423
| 0.440417
| 14.11875
| 0.205535
| 11.726138
| false
| false
|
2024-10-15
|
2024-10-16
| 1
|
unsloth/zephyr-sft-bnb-4bit
|
sonthenguyen_zephyr-sft-bnb-4bit-DPO-mtbc-213steps_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/sonthenguyen/zephyr-sft-bnb-4bit-DPO-mtbc-213steps" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sonthenguyen/zephyr-sft-bnb-4bit-DPO-mtbc-213steps</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sonthenguyen__zephyr-sft-bnb-4bit-DPO-mtbc-213steps-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sonthenguyen/zephyr-sft-bnb-4bit-DPO-mtbc-213steps
|
4ae2af48b6ac53f14e153b91309624100ae3d7c2
| 15.853793
|
apache-2.0
| 0
| 7.242
| true
| false
| false
| true
| 1.397621
| 0.427549
| 42.75489
| 0.419729
| 19.669907
| 0.02568
| 2.567976
| 0.261745
| 1.565996
| 0.408635
| 9.579427
| 0.270861
| 18.98456
| false
| false
|
2024-10-02
|
2024-10-03
| 0
|
sonthenguyen/zephyr-sft-bnb-4bit-DPO-mtbc-213steps
|
sonthenguyen_zephyr-sft-bnb-4bit-DPO-mtbo-180steps_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/sonthenguyen/zephyr-sft-bnb-4bit-DPO-mtbo-180steps" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sonthenguyen/zephyr-sft-bnb-4bit-DPO-mtbo-180steps</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sonthenguyen__zephyr-sft-bnb-4bit-DPO-mtbo-180steps-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sonthenguyen/zephyr-sft-bnb-4bit-DPO-mtbo-180steps
|
0393baf362e29cf51867596fb64746b5edafa6ed
| 15.602365
|
apache-2.0
| 0
| 7.242
| true
| false
| false
| true
| 1.35137
| 0.408714
| 40.871443
| 0.432259
| 21.351403
| 0.023414
| 2.34139
| 0.276007
| 3.467562
| 0.38851
| 6.163802
| 0.274767
| 19.418587
| false
| false
|
2024-10-02
|
2024-10-03
| 0
|
sonthenguyen/zephyr-sft-bnb-4bit-DPO-mtbo-180steps
|
sonthenguyen_zephyr-sft-bnb-4bit-DPO-mtbr-180steps_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/sonthenguyen/zephyr-sft-bnb-4bit-DPO-mtbr-180steps" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sonthenguyen/zephyr-sft-bnb-4bit-DPO-mtbr-180steps</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sonthenguyen__zephyr-sft-bnb-4bit-DPO-mtbr-180steps-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sonthenguyen/zephyr-sft-bnb-4bit-DPO-mtbr-180steps
|
c4ee848caf14649f9260166653d4cdb30bcfc52a
| 16.475407
|
apache-2.0
| 0
| 7.242
| true
| false
| false
| true
| 1.36845
| 0.403219
| 40.321901
| 0.430536
| 21.213568
| 0.024924
| 2.492447
| 0.280201
| 4.026846
| 0.42575
| 11.785417
| 0.27111
| 19.012264
| false
| false
|
2024-10-02
|
2024-10-03
| 0
|
sonthenguyen/zephyr-sft-bnb-4bit-DPO-mtbr-180steps
|
sophosympatheia_Midnight-Miqu-70B-v1.5_float16
|
float16
|
🤝 base merges and moerges
|
🤝
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/sophosympatheia/Midnight-Miqu-70B-v1.5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sophosympatheia/Midnight-Miqu-70B-v1.5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sophosympatheia__Midnight-Miqu-70B-v1.5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
sophosympatheia/Midnight-Miqu-70B-v1.5
|
f6062ca8ccba38ce91eef16f85138e279160b9b9
| 25.990195
|
other
| 196
| 68.977
| true
| false
| false
| true
| 12.905934
| 0.611847
| 61.184657
| 0.560623
| 38.541462
| 0.070242
| 7.024169
| 0.296141
| 6.152125
| 0.424417
| 11.652083
| 0.38248
| 31.386673
| true
| false
|
2024-03-11
|
2024-10-22
| 1
|
sophosympatheia/Midnight-Miqu-70B-v1.5 (Merge)
|
speakleash_Bielik-11B-v2_bfloat16
|
bfloat16
|
🟩 continuously pretrained
|
🟩
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/speakleash/Bielik-11B-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">speakleash/Bielik-11B-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/speakleash__Bielik-11B-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
speakleash/Bielik-11B-v2
|
a620588280793e605d1e0b125fe2a663030206ab
| 15.989069
|
apache-2.0
| 39
| 11.169
| true
| false
| false
| false
| 1.837466
| 0.238105
| 23.81049
| 0.493084
| 27.817907
| 0.07855
| 7.854985
| 0.288591
| 5.145414
| 0.392448
| 7.55599
| 0.313747
| 23.749631
| false
| true
|
2024-08-26
|
2024-10-16
| 0
|
speakleash/Bielik-11B-v2
|
speakleash_Bielik-11B-v2.0-Instruct_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/speakleash/Bielik-11B-v2.0-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">speakleash/Bielik-11B-v2.0-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/speakleash__Bielik-11B-v2.0-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
speakleash/Bielik-11B-v2.0-Instruct
|
e4721e2af1152bad2e077c34375911a28aa1b8dc
| 24.661167
|
apache-2.0
| 4
| 11.169
| true
| false
| false
| true
| 1.776849
| 0.525243
| 52.524302
| 0.536158
| 33.774676
| 0.11858
| 11.858006
| 0.317114
| 8.948546
| 0.446708
| 14.738542
| 0.335106
| 26.122931
| false
| true
|
2024-08-26
|
2024-10-16
| 1
|
speakleash/Bielik-11B-v2
|
speakleash_Bielik-11B-v2.1-Instruct_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/speakleash/Bielik-11B-v2.1-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">speakleash/Bielik-11B-v2.1-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/speakleash__Bielik-11B-v2.1-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
speakleash/Bielik-11B-v2.1-Instruct
|
c91776047eb235f51238a9e42f80f19e3ed114e7
| 27.197164
|
apache-2.0
| 3
| 11.169
| true
| false
| false
| true
| 2.611246
| 0.508982
| 50.898172
| 0.553012
| 36.290053
| 0.266616
| 26.661631
| 0.337248
| 11.63311
| 0.418521
| 10.515104
| 0.344664
| 27.184914
| false
| true
|
2024-08-26
|
2024-10-16
| 1
|
speakleash/Bielik-11B-v2
|
speakleash_Bielik-11B-v2.2-Instruct_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/speakleash/Bielik-11B-v2.2-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">speakleash/Bielik-11B-v2.2-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/speakleash__Bielik-11B-v2.2-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
speakleash/Bielik-11B-v2.2-Instruct
|
b5502dab61fcc5e087e72c8a120057dea78082ad
| 27.979278
|
apache-2.0
| 59
| 11.169
| true
| false
| false
| true
| 2.921851
| 0.555194
| 55.519355
| 0.559656
| 36.958041
| 0.268127
| 26.812689
| 0.331376
| 10.850112
| 0.417125
| 10.107292
| 0.348654
| 27.628177
| false
| true
|
2024-08-26
|
2024-10-16
| 1
|
speakleash/Bielik-11B-v2
|
speakleash_Bielik-11B-v2.3-Instruct_float16
|
float16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/speakleash/Bielik-11B-v2.3-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">speakleash/Bielik-11B-v2.3-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/speakleash__Bielik-11B-v2.3-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
speakleash/Bielik-11B-v2.3-Instruct
|
7494fdc4d648707ea12b908d40b0ae708989b329
| 28.331124
|
apache-2.0
| 47
| 11.169
| true
| false
| false
| true
| 1.812284
| 0.558291
| 55.829089
| 0.56627
| 38.062788
| 0.208459
| 20.845921
| 0.340604
| 12.080537
| 0.451823
| 16.011198
| 0.344415
| 27.15721
| true
| true
|
2024-08-30
|
2024-10-16
| 1
|
speakleash/Bielik-11B-v2.3-Instruct (Merge)
|
spmurrayzzz_Mistral-Syndicate-7B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/spmurrayzzz/Mistral-Syndicate-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">spmurrayzzz/Mistral-Syndicate-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/spmurrayzzz__Mistral-Syndicate-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
spmurrayzzz/Mistral-Syndicate-7B
|
c74379dd6055ef4a70339b105ea315cebec23d24
| 14.012818
|
apache-2.0
| 0
| 7.242
| true
| false
| false
| false
| 1.159718
| 0.249596
| 24.959552
| 0.424506
| 20.506252
| 0.033988
| 3.398792
| 0.276007
| 3.467562
| 0.438552
| 13.61901
| 0.263132
| 18.125739
| false
| false
|
2023-12-30
|
2024-06-27
| 1
|
mistralai/Mistral-7B-v0.1
|
spow12_ChatWaifu_12B_v2.0_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/spow12/ChatWaifu_12B_v2.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">spow12/ChatWaifu_12B_v2.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/spow12__ChatWaifu_12B_v2.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
spow12/ChatWaifu_12B_v2.0
|
1fb38700b2e2a66d4ff32636817df76285cea5f1
| 21.979986
|
cc-by-nc-4.0
| 20
| 12.248
| true
| false
| false
| true
| 5.178855
| 0.476758
| 47.675833
| 0.520768
| 31.16524
| 0.070997
| 7.099698
| 0.276846
| 3.579418
| 0.443177
| 15.830469
| 0.338763
| 26.529255
| true
| false
|
2024-10-10
|
2024-10-14
| 1
|
spow12/ChatWaifu_12B_v2.0 (Merge)
|
spow12_ChatWaifu_22B_v2.0_preview_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/spow12/ChatWaifu_22B_v2.0_preview" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">spow12/ChatWaifu_22B_v2.0_preview</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/spow12__ChatWaifu_22B_v2.0_preview-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
spow12/ChatWaifu_22B_v2.0_preview
|
36af7ec06bc85405e8641986ad45c6d21353b114
| 29.545969
|
cc-by-nc-4.0
| 6
| 22.247
| true
| false
| false
| true
| 2.988408
| 0.674495
| 67.449478
| 0.617015
| 45.488294
| 0.188822
| 18.882175
| 0.315436
| 8.724832
| 0.368542
| 3.534375
| 0.39877
| 33.196661
| true
| false
|
2024-09-23
|
2024-09-24
| 1
|
spow12/ChatWaifu_22B_v2.0_preview (Merge)
|
spow12_ChatWaifu_v1.4_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/spow12/ChatWaifu_v1.4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">spow12/ChatWaifu_v1.4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/spow12__ChatWaifu_v1.4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
spow12/ChatWaifu_v1.4
|
c5b2b30a8e9fa23722b6e30aa2ca1dab7fe1c2b5
| 25.706734
|
cc-by-nc-4.0
| 19
| 12.248
| true
| false
| false
| true
| 2.884285
| 0.569057
| 56.905677
| 0.517625
| 31.630554
| 0.10574
| 10.574018
| 0.307047
| 7.606264
| 0.474333
| 20.025
| 0.34749
| 27.498892
| true
| false
|
2024-09-03
|
2024-09-05
| 1
|
spow12/ChatWaifu_v1.4 (Merge)
|
spow12_ChatWaifu_v2.0_22B_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/spow12/ChatWaifu_v2.0_22B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">spow12/ChatWaifu_v2.0_22B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/spow12__ChatWaifu_v2.0_22B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
spow12/ChatWaifu_v2.0_22B
|
54771319920ed791ba3f0262b036f37a92b880f2
| 28.838098
|
cc-by-nc-4.0
| 10
| 22.247
| true
| false
| false
| true
| 2.739835
| 0.651089
| 65.108911
| 0.59263
| 42.286228
| 0.185801
| 18.58006
| 0.324664
| 9.955257
| 0.384198
| 5.591406
| 0.383561
| 31.506723
| true
| false
|
2024-10-11
|
2024-10-11
| 1
|
spow12/ChatWaifu_v2.0_22B (Merge)
|
spow12_ChatWaifu_v2.0_22B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/spow12/ChatWaifu_v2.0_22B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">spow12/ChatWaifu_v2.0_22B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/spow12__ChatWaifu_v2.0_22B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
spow12/ChatWaifu_v2.0_22B
|
a6e7c206d9af77d3f85faf0ce4a711d62815b2ab
| 29.032305
|
cc-by-nc-4.0
| 10
| 22.247
| true
| false
| false
| true
| 2.79172
| 0.651738
| 65.17385
| 0.590805
| 42.019798
| 0.203172
| 20.317221
| 0.323826
| 9.8434
| 0.384198
| 5.591406
| 0.381233
| 31.248153
| true
| false
|
2024-10-11
|
2024-10-14
| 1
|
spow12/ChatWaifu_v2.0_22B (Merge)
|
ssmits_Qwen2.5-95B-Instruct_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/ssmits/Qwen2.5-95B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ssmits/Qwen2.5-95B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ssmits__Qwen2.5-95B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
ssmits/Qwen2.5-95B-Instruct
|
9c0e7df57a4fcf4d364efd916a0fc0abdd2d20a3
| 45.257346
|
other
| 3
| 94.648
| true
| false
| false
| true
| 38.46699
| 0.843105
| 84.310518
| 0.70378
| 58.530351
| 0.530211
| 53.021148
| 0.364094
| 15.212528
| 0.428385
| 13.614844
| 0.521692
| 46.854684
| false
| false
|
2024-09-24
|
2024-09-26
| 1
|
ssmits/Qwen2.5-95B-Instruct (Merge)
|
stabilityai_StableBeluga2_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/stabilityai/StableBeluga2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">stabilityai/StableBeluga2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/stabilityai__StableBeluga2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
stabilityai/StableBeluga2
|
cb47d3db70ea3ddc2cabdeb358c303b328f65900
| 22.808723
| 885
| 68.977
| true
| false
| false
| false
| 12.509347
| 0.378714
| 37.871403
| 0.582413
| 41.263261
| 0.043807
| 4.380665
| 0.316275
| 8.836689
| 0.472969
| 18.654427
| 0.332613
| 25.845892
| false
| true
|
2023-07-20
|
2024-06-13
| 0
|
stabilityai/StableBeluga2
|
|
stabilityai_stablelm-2-12b_bfloat16
|
bfloat16
|
🟢 pretrained
|
🟢
|
Original
|
StableLmForCausalLM
|
<a target="_blank" href="https://huggingface.co/stabilityai/stablelm-2-12b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">stabilityai/stablelm-2-12b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/stabilityai__stablelm-2-12b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
stabilityai/stablelm-2-12b
|
fead13ddbf4492970666650c3cd6f85f485411ec
| 13.998663
|
other
| 120
| 12.143
| true
| false
| false
| false
| 2.946558
| 0.156921
| 15.692141
| 0.450865
| 22.685797
| 0.043051
| 4.305136
| 0.278523
| 3.803132
| 0.447885
| 14.485677
| 0.307181
| 23.020095
| false
| true
|
2024-03-21
|
2024-06-12
| 0
|
stabilityai/stablelm-2-12b
|
stabilityai_stablelm-2-12b-chat_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
StableLmForCausalLM
|
<a target="_blank" href="https://huggingface.co/stabilityai/stablelm-2-12b-chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">stabilityai/stablelm-2-12b-chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/stabilityai__stablelm-2-12b-chat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
stabilityai/stablelm-2-12b-chat
|
b6b62cd451b84e848514c00fafa66d9ead9297c5
| 16.778178
|
other
| 88
| 12.143
| true
| false
| false
| true
| 2.176193
| 0.408165
| 40.816478
| 0.467202
| 25.253697
| 0.053625
| 5.362538
| 0.266779
| 2.237136
| 0.391427
| 7.728385
| 0.273438
| 19.270833
| false
| true
|
2024-04-04
|
2024-06-12
| 0
|
stabilityai/stablelm-2-12b-chat
|
stabilityai_stablelm-2-1_6b_float16
|
float16
|
🟢 pretrained
|
🟢
|
Original
|
StableLmForCausalLM
|
<a target="_blank" href="https://huggingface.co/stabilityai/stablelm-2-1_6b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">stabilityai/stablelm-2-1_6b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/stabilityai__stablelm-2-1_6b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
stabilityai/stablelm-2-1_6b
|
8879812cccd176fbbe9ceb747b815bcc7d6499f8
| 5.316831
|
other
| 189
| 1.645
| true
| false
| false
| false
| 1.099744
| 0.115705
| 11.570522
| 0.338458
| 8.632695
| 0.007553
| 0.755287
| 0.248322
| 0
| 0.388198
| 5.791406
| 0.14636
| 5.151079
| false
| true
|
2024-01-18
|
2024-06-12
| 0
|
stabilityai/stablelm-2-1_6b
|