Create README.md
README.md
This repo contains a low-rank adapter for LLaMA-7b fit on the Stanford Alpaca Odia dataset.
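As a quick-start sketch (not part of this repo's files), the adapter can typically be applied on top of the base model with the Hugging Face `peft` library. The base-model repo id and the adapter path below are placeholders, not names confirmed by this README:

```python
# Hypothetical usage sketch: load LLaMA-7b and attach this low-rank adapter with peft.
# "decapoda-research/llama-7b-hf" and "path/to/this-adapter" are assumed placeholders.
import torch
from peft import PeftModel
from transformers import LlamaForCausalLM, LlamaTokenizer

base_model = LlamaForCausalLM.from_pretrained(
    "decapoda-research/llama-7b-hf",  # assumed base checkpoint, not stated in this repo
    torch_dtype=torch.float16,
    device_map="auto",
)
tokenizer = LlamaTokenizer.from_pretrained("decapoda-research/llama-7b-hf")

# Wrap the base model with the adapter weights from this repo.
model = PeftModel.from_pretrained(base_model, "path/to/this-adapter")
model.eval()
```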
This version of the weights was trained with the following hyperparameters:

- Epochs: 3 (load from best epoch)
- Batch size: 128
- Cutoff length: 512
- Learning rate: 3e-4
- Lora r: 16
- Lora target modules: q_proj, k_proj, v_proj, o_proj
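For reference, the LoRA settings above correspond roughly to a `peft.LoraConfig` like the sketch below; `lora_alpha`, `lora_dropout`, and `bias` are not listed in this README, so those values are assumptions:

```python
# Sketch of a peft LoraConfig matching the hyperparameters listed above.
from peft import LoraConfig

lora_config = LoraConfig(
    r=16,                                 # Lora r
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    lora_alpha=16,                        # assumed; not stated in this README
    lora_dropout=0.05,                    # assumed; not stated in this README
    bias="none",                          # assumed; not stated in this README
    task_type="CAUSAL_LM",
)
```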