updated the graph image
README.md
CHANGED
@@ -23,7 +23,7 @@ The hybrid design combines Transformer attention with Mamba (a state-space model
 **Smart: Leading intelligence scores**
 The model outperforms competitors, such as Gemma 3 4B, Llama 3.2 3B, and Granite 4.0 Micro, on a combined intelligence score that averages 6 standard benchmarks.
 
-<img src="https://huggingface.co/ai21labs/AI21-Jamba-Reasoning-3B-GGUF/resolve/main/assets/
+<img src="https://huggingface.co/ai21labs/AI21-Jamba-Reasoning-3B-GGUF/resolve/main/assets/Benchmark%20Performance%20-%20Jamba%20Reasoning%203B.png" width="900"/>
 
 
 **Scalable: Handles very long contexts**