minpeter committed (verified)
Commit b5b90c3 · 1 parent: e3a58f5

Training in progress, step 93
README.md ADDED
@@ -0,0 +1,204 @@
+ ---
+ library_name: peft
+ license: other
+ base_model: minpeter/HCX-SEED-FC-3B
+ tags:
+ - axolotl
+ - generated_from_trainer
+ datasets:
+ - minpeter/xlam-function-calling-60k-hermes
+ - minpeter/xlam-irrelevance-7.5k-qwen2.5-72b-distill-hermes
+ - minpeter/hermes-function-calling-v1-jsonl
+ model-index:
+ - name: LoRA-HCX-3b-sf-xlam-00
+   results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ [<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
+ <details><summary>See axolotl config</summary>
+
+ axolotl version: `0.10.0.dev0`
+ ```yaml
+ base_model: minpeter/HCX-SEED-FC-3B
+ hub_model_id: minpeter/LoRA-HCX-3b-sf-xlam-00
+
+ load_in_8bit: false
+ load_in_4bit: false
+ strict: false
+
+ datasets:
+   - path: minpeter/xlam-function-calling-60k-hermes
+     data_files:
+       - result.parquet
+     type: chat_template
+     roles_to_train: ["assistant"]
+     field_messages: conversations
+     message_property_mappings:
+       role: from
+       content: value
+     shards: 120
+   - path: minpeter/xlam-irrelevance-7.5k-qwen2.5-72b-distill-hermes
+     data_files:
+       - result.parquet
+     type: chat_template
+     roles_to_train: ["assistant"]
+     field_messages: conversations
+     message_property_mappings:
+       role: from
+       content: value
+     shards: 15
+   - path: minpeter/hermes-function-calling-v1-jsonl
+     data_files:
+       - func-calling-singleturn.jsonl
+       - func-calling.jsonl
+     type: chat_template
+     roles_to_train: ["assistant"]
+     field_messages: conversations
+     message_property_mappings:
+       role: from
+       content: value
+     shards: 3
+   - path: minpeter/hermes-function-calling-v1-jsonl
+     data_files:
+       - glaive-function-calling-5k.jsonl
+     type: chat_template
+     roles_to_train: ["assistant"]
+     field_messages: conversations
+     message_property_mappings:
+       role: from
+       content: value
+     shards: 5
+ chat_template: chatml
+
+ dataset_prepared_path: last_run_prepared
+
+ output_dir: ./output
+
+ adapter: lora
+ lora_model_dir:
+
+ sequence_len: 20000
+ pad_to_sequence_len: true
+ sample_packing: true
+
+ val_set_size: 0.05
+ eval_sample_packing: true
+ evals_per_epoch: 3
+
+ lora_r: 8
+ lora_alpha: 16
+ lora_dropout: 0.05
+ lora_fan_in_fan_out:
+ lora_target_modules:
+   - gate_proj
+   - down_proj
+   - up_proj
+   - q_proj
+   - v_proj
+   - k_proj
+   - o_proj
+
+ wandb_project: "axolotl"
+ wandb_entity: "kasfiekfs-e"
+ wandb_watch:
+ wandb_name:
+ wandb_log_model:
+
+ gradient_accumulation_steps: 2
+ micro_batch_size: 2
+ num_epochs: 2
+ optimizer: adamw_8bit
+ lr_scheduler: cosine
+ learning_rate: 0.0002
+
+ train_on_inputs: false
+ group_by_length: false
+ bf16: auto
+ fp16:
+ tf32: false
+
+ gradient_checkpointing: true
+ early_stopping_patience:
+ resume_from_checkpoint:
+ local_rank:
+ logging_steps: 1
+ xformers_attention:
+ flash_attention: true
+
+ loss_watchdog_threshold: 5.0
+ loss_watchdog_patience: 3
+
+ warmup_steps: 10
+ saves_per_epoch: 1
+ debug:
+ deepspeed:
+ weight_decay: 0.0
+ fsdp:
+ fsdp_config:
+
+ special_tokens:
+   bos_token: "<|im_start|>"
+   eos_token: "<|im_end|>"
+   pad_token: "<|endoftext|>"
+
+ ```
+
+ </details><br>
+
+ # LoRA-HCX-3b-sf-xlam-00
+
+ This model is a fine-tuned version of [minpeter/HCX-SEED-FC-3B](https://huggingface.co/minpeter/HCX-SEED-FC-3B) on the minpeter/xlam-function-calling-60k-hermes, minpeter/xlam-irrelevance-7.5k-qwen2.5-72b-distill-hermes, and minpeter/hermes-function-calling-v1-jsonl datasets (the last used twice, with different data files).
+ It achieves the following results on the evaluation set:
+ - Loss: 0.3603
+
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
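+
+ A minimal usage sketch, assuming the LoRA adapter applies cleanly to the base model via PEFT and that the bundled chatml template drives prompting; the example message and generation settings are illustrative, not prescribed by this repository:
+
+ ```python
+ # Sketch: load the base model, attach the LoRA adapter, and generate.
+ from peft import PeftModel
+ from transformers import AutoModelForCausalLM, AutoTokenizer
+
+ base = AutoModelForCausalLM.from_pretrained(
+     "minpeter/HCX-SEED-FC-3B", torch_dtype="auto", device_map="auto"
+ )
+ model = PeftModel.from_pretrained(base, "minpeter/LoRA-HCX-3b-sf-xlam-00")
+ tokenizer = AutoTokenizer.from_pretrained("minpeter/LoRA-HCX-3b-sf-xlam-00")
+
+ # Illustrative prompt; the adapter was trained on function-calling-style data.
+ messages = [{"role": "user", "content": "What is the weather in Seoul right now?"}]
+ input_ids = tokenizer.apply_chat_template(
+     messages, add_generation_prompt=True, return_tensors="pt"
+ ).to(base.device)
+
+ output = model.generate(input_ids, max_new_tokens=256)
+ print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
+ ```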
+
+ ## Training and evaluation data
+
+ More information needed
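+
+ The datasets listed above store ShareGPT-style records; per the axolotl config, `field_messages: conversations` with `message_property_mappings` renames `from`/`value` to `role`/`content` before the chatml template is applied, and only `assistant` turns contribute to the loss (`roles_to_train: ["assistant"]`). A sketch of that mapping, using an invented sample record:
+
+ ```python
+ # Hypothetical record in the shape the config expects; not taken from the datasets.
+ record = {
+     "conversations": [
+         {"from": "human", "value": "Book a table for two at 7pm."},
+         {"from": "assistant", "value": "<tool_call>{\"name\": \"book_table\"}</tool_call>"},
+     ]
+ }
+
+ # Apply the config's message_property_mappings (role <- from, content <- value).
+ messages = [{"role": m["from"], "content": m["value"]} for m in record["conversations"]]
+ print(messages)
+ ```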
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - learning_rate: 0.0002
+ - train_batch_size: 2
+ - eval_batch_size: 2
+ - seed: 42
+ - gradient_accumulation_steps: 2
+ - total_train_batch_size: 4
+ - optimizer: 8-bit AdamW (`OptimizerNames.ADAMW_8BIT`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
+ - lr_scheduler_type: cosine
+ - lr_scheduler_warmup_steps: 10
+ - num_epochs: 2.0
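+
+ A sketch of the schedule above using transformers' scheduler API; plain AdamW stands in for the 8-bit variant, and `num_training_steps=93` is an assumption taken from this commit's message rather than from the card:
+
+ ```python
+ import torch
+ from transformers import get_cosine_schedule_with_warmup
+
+ # Dummy parameter so the optimizer has something to schedule.
+ params = [torch.nn.Parameter(torch.zeros(1))]
+ optimizer = torch.optim.AdamW(params, lr=2e-4, betas=(0.9, 0.999), eps=1e-8, weight_decay=0.0)
+ scheduler = get_cosine_schedule_with_warmup(
+     optimizer, num_warmup_steps=10, num_training_steps=93  # 93: assumed total steps
+ )
+
+ for step in range(93):
+     optimizer.step()
+     scheduler.step()
+     if step in (0, 9, 46, 92):  # first step, end of warmup, midpoint, final step
+         print(f"step {step + 1}: lr = {scheduler.get_last_lr()[0]:.2e}")
+ ```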
+
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss |
+ |:-------------:|:------:|:----:|:---------------:|
+ | 0.8224 | 0.0215 | 1 | 0.7483 |
+ | 0.4965 | 0.3441 | 16 | 0.4406 |
+ | 0.5057 | 0.6882 | 32 | 0.3911 |
+ | 0.4464 | 1.0215 | 48 | 0.3737 |
+ | 0.4376 | 1.3656 | 64 | 0.3651 |
+ | 0.3833 | 1.7097 | 80 | 0.3603 |
+
+
+ ### Framework versions
+
+ - PEFT 0.15.2
+ - Transformers 4.51.3
+ - Pytorch 2.6.0+cu124
+ - Datasets 3.5.1
+ - Tokenizers 0.21.1
adapter_config.json ADDED
@@ -0,0 +1,39 @@
+ {
+   "alpha_pattern": {},
+   "auto_mapping": null,
+   "base_model_name_or_path": "minpeter/HyperCLOVAX-SEED-Text-Instruct-3B-hf",
+   "bias": "none",
+   "corda_config": null,
+   "eva_config": null,
+   "exclude_modules": null,
+   "fan_in_fan_out": null,
+   "inference_mode": true,
+   "init_lora_weights": true,
+   "layer_replication": null,
+   "layers_pattern": null,
+   "layers_to_transform": null,
+   "loftq_config": {},
+   "lora_alpha": 16,
+   "lora_bias": false,
+   "lora_dropout": 0.05,
+   "megatron_config": null,
+   "megatron_core": "megatron.core",
+   "modules_to_save": null,
+   "peft_type": "LORA",
+   "r": 8,
+   "rank_pattern": {},
+   "revision": null,
+   "target_modules": [
+     "q_proj",
+     "down_proj",
+     "o_proj",
+     "up_proj",
+     "v_proj",
+     "k_proj",
+     "gate_proj"
+   ],
+   "task_type": "CAUSAL_LM",
+   "trainable_token_indices": null,
+   "use_dora": false,
+   "use_rslora": false
+ }
adapter_model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:22c2863faf8428cfbf432f181c0172061eba1bcf31a29f4ec56f6c8b515585ba
+ size 52487848
added_tokens.json ADDED
@@ -0,0 +1,35 @@
+ {
+   "<EMAIL>": 110521,
+   "<KEY>": 110522,
+   "<NAME>": 110520,
+   "<PASSWORD>": 110523,
+   "<code_to_intermediate>": 110502,
+   "<empty_output>": 110501,
+   "<file_sep>": 110492,
+   "<intermediate_to_code>": 110503,
+   "<issue_closed>": 110495,
+   "<issue_comment>": 110494,
+   "<issue_start>": 110493,
+   "<jupyter_code>": 110498,
+   "<jupyter_output>": 110499,
+   "<jupyter_script>": 110500,
+   "<jupyter_start>": 110496,
+   "<jupyter_text>": 110497,
+   "<pr>": 110504,
+   "<pr_base>": 110507,
+   "<pr_base_code>": 110509,
+   "<pr_comment>": 110512,
+   "<pr_diff>": 110510,
+   "<pr_diff_hunk>": 110511,
+   "<pr_diff_hunk_comment_line>": 110519,
+   "<pr_event_id>": 110513,
+   "<pr_file>": 110508,
+   "<pr_in_reply_to_comment_id>": 110518,
+   "<pr_in_reply_to_review_id>": 110517,
+   "<pr_is_merged>": 110506,
+   "<pr_review>": 110514,
+   "<pr_review_comment>": 110516,
+   "<pr_review_state>": 110515,
+   "<pr_status>": 110505,
+   "<repo_name>": 110491
+ }
config.json ADDED
@@ -0,0 +1,34 @@
+ {
+   "_attn_implementation_autoset": true,
+   "architectures": [
+     "LlamaForCausalLM"
+   ],
+   "attention_bias": false,
+   "attention_dropout": 0.0,
+   "bos_token_id": 100272,
+   "end_token_id": 100257,
+   "eos_token_id": 100273,
+   "head_dim": 128,
+   "hidden_act": "silu",
+   "hidden_size": 3072,
+   "initializer_range": 0.02,
+   "intermediate_size": 7168,
+   "logits_scaling": 1.0,
+   "max_position_embeddings": 131072,
+   "mlp_bias": false,
+   "model_type": "llama",
+   "num_attention_heads": 24,
+   "num_hidden_layers": 32,
+   "num_key_value_heads": 8,
+   "pad_token_id": 100257,
+   "pretraining_tp": 1,
+   "resid_pdrop": 0.2,
+   "rms_norm_eps": 1e-05,
+   "rope_scaling": null,
+   "rope_theta": 100000000,
+   "tie_word_embeddings": true,
+   "torch_dtype": "bfloat16",
+   "transformers_version": "4.51.3",
+   "use_cache": false,
+   "vocab_size": 110592
+ }
merges.txt ADDED
The diff for this file is too large to render. See raw diff
 
special_tokens_map.json ADDED
@@ -0,0 +1,86 @@
+ {
+   "additional_special_tokens": [
+     "<|endoftext|>",
+     "<|fim_prefix|>",
+     "<|fim_middle|>",
+     "<|fim_suffix|>",
+     "<|endofprompt|>",
+     "<|_unuse_missing_100256|>",
+     "<|_unuse_missing_100261|>",
+     "<|_unuse_missing_100262|>",
+     "<|_unuse_missing_100263|>",
+     "<|_unuse_missing_100264|>",
+     "<|_unuse_missing_100265|>",
+     "<|_unuse_missing_100266|>",
+     "<|_unuse_missing_100267|>",
+     "<|_unuse_missing_100268|>",
+     "<|_unuse_missing_100269|>",
+     "<|_unuse_missing_100270|>",
+     "<|dummy3|>",
+     "<|im_start|>",
+     "<|im_end|>",
+     "<|stop|>",
+     "<|endofturn|>",
+     "<repo_name>",
+     "<file_sep>",
+     "<issue_start>",
+     "<issue_comment>",
+     "<issue_closed>",
+     "<jupyter_start>",
+     "<jupyter_text>",
+     "<jupyter_code>",
+     "<jupyter_output>",
+     "<jupyter_script>",
+     "<empty_output>",
+     "<code_to_intermediate>",
+     "<intermediate_to_code>",
+     "<pr>",
+     "<pr_status>",
+     "<pr_is_merged>",
+     "<pr_base>",
+     "<pr_file>",
+     "<pr_base_code>",
+     "<pr_diff>",
+     "<pr_diff_hunk>",
+     "<pr_comment>",
+     "<pr_event_id>",
+     "<pr_review>",
+     "<pr_review_state>",
+     "<pr_review_comment>",
+     "<pr_in_reply_to_review_id>",
+     "<pr_in_reply_to_comment_id>",
+     "<pr_diff_hunk_comment_line>",
+     "<NAME>",
+     "<EMAIL>",
+     "<KEY>",
+     "<PASSWORD>"
+   ],
+   "bos_token": {
+     "content": "<|im_start|>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "eos_token": {
+     "content": "<|im_end|>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "pad_token": {
+     "content": "<|endoftext|>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "unk_token": {
+     "content": "<|endoftext|>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   }
+ }
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,502 @@
+ {
+   "add_prefix_space": false,
+   "added_tokens_decoder": {
+     "100256": {
+       "content": "<|_unuse_missing_100256|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "100257": {
+       "content": "<|endoftext|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "100258": {
+       "content": "<|fim_prefix|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "100259": {
+       "content": "<|fim_middle|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "100260": {
+       "content": "<|fim_suffix|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "100261": {
+       "content": "<|_unuse_missing_100261|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "100262": {
+       "content": "<|_unuse_missing_100262|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "100263": {
+       "content": "<|_unuse_missing_100263|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "100264": {
+       "content": "<|_unuse_missing_100264|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "100265": {
+       "content": "<|_unuse_missing_100265|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "100266": {
+       "content": "<|_unuse_missing_100266|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "100267": {
+       "content": "<|_unuse_missing_100267|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "100268": {
+       "content": "<|_unuse_missing_100268|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "100269": {
+       "content": "<|_unuse_missing_100269|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "100270": {
+       "content": "<|_unuse_missing_100270|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "100271": {
+       "content": "<|dummy3|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "100272": {
+       "content": "<|im_start|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "100273": {
+       "content": "<|im_end|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "100274": {
+       "content": "<|stop|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "100275": {
+       "content": "<|endofturn|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "100276": {
+       "content": "<|endofprompt|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "110491": {
+       "content": "<repo_name>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "110492": {
+       "content": "<file_sep>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "110493": {
+       "content": "<issue_start>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "110494": {
+       "content": "<issue_comment>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "110495": {
+       "content": "<issue_closed>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "110496": {
+       "content": "<jupyter_start>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "110497": {
+       "content": "<jupyter_text>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "110498": {
+       "content": "<jupyter_code>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "110499": {
+       "content": "<jupyter_output>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "110500": {
+       "content": "<jupyter_script>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "110501": {
+       "content": "<empty_output>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "110502": {
+       "content": "<code_to_intermediate>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "110503": {
+       "content": "<intermediate_to_code>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "110504": {
+       "content": "<pr>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "110505": {
+       "content": "<pr_status>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "110506": {
+       "content": "<pr_is_merged>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "110507": {
+       "content": "<pr_base>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "110508": {
+       "content": "<pr_file>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "110509": {
+       "content": "<pr_base_code>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "110510": {
+       "content": "<pr_diff>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "110511": {
+       "content": "<pr_diff_hunk>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "110512": {
+       "content": "<pr_comment>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "110513": {
+       "content": "<pr_event_id>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "110514": {
+       "content": "<pr_review>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "110515": {
+       "content": "<pr_review_state>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "110516": {
+       "content": "<pr_review_comment>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "110517": {
+       "content": "<pr_in_reply_to_review_id>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "110518": {
+       "content": "<pr_in_reply_to_comment_id>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "110519": {
+       "content": "<pr_diff_hunk_comment_line>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "110520": {
+       "content": "<NAME>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "110521": {
+       "content": "<EMAIL>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "110522": {
+       "content": "<KEY>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "110523": {
+       "content": "<PASSWORD>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     }
+   },
+   "additional_special_tokens": [
+     "<|endoftext|>",
+     "<|fim_prefix|>",
+     "<|fim_middle|>",
+     "<|fim_suffix|>",
+     "<|endofprompt|>",
+     "<|_unuse_missing_100256|>",
+     "<|_unuse_missing_100261|>",
+     "<|_unuse_missing_100262|>",
+     "<|_unuse_missing_100263|>",
+     "<|_unuse_missing_100264|>",
+     "<|_unuse_missing_100265|>",
+     "<|_unuse_missing_100266|>",
+     "<|_unuse_missing_100267|>",
+     "<|_unuse_missing_100268|>",
+     "<|_unuse_missing_100269|>",
+     "<|_unuse_missing_100270|>",
+     "<|dummy3|>",
+     "<|im_start|>",
+     "<|im_end|>",
+     "<|stop|>",
+     "<|endofturn|>",
+     "<repo_name>",
+     "<file_sep>",
+     "<issue_start>",
+     "<issue_comment>",
+     "<issue_closed>",
+     "<jupyter_start>",
+     "<jupyter_text>",
+     "<jupyter_code>",
+     "<jupyter_output>",
+     "<jupyter_script>",
+     "<empty_output>",
+     "<code_to_intermediate>",
+     "<intermediate_to_code>",
+     "<pr>",
+     "<pr_status>",
+     "<pr_is_merged>",
+     "<pr_base>",
+     "<pr_file>",
+     "<pr_base_code>",
+     "<pr_diff>",
+     "<pr_diff_hunk>",
+     "<pr_comment>",
+     "<pr_event_id>",
+     "<pr_review>",
+     "<pr_review_state>",
+     "<pr_review_comment>",
+     "<pr_in_reply_to_review_id>",
+     "<pr_in_reply_to_comment_id>",
+     "<pr_diff_hunk_comment_line>",
+     "<NAME>",
+     "<EMAIL>",
+     "<KEY>",
+     "<PASSWORD>"
+   ],
+   "bos_token": "<|im_start|>",
+   "chat_template": "{% if not add_generation_prompt is defined %}{% set add_generation_prompt = false %}{% endif %}{% for message in messages %}{{'<|im_start|>' + message['role'] + '\n' + message['content'] + '<|im_end|>' + '\n'}}{% endfor %}{% if add_generation_prompt %}{{ '<|im_start|>assistant\n' }}{% endif %}",
+   "clean_up_tokenization_spaces": true,
+   "eos_token": "<|im_end|>",
+   "extra_special_tokens": {},
+   "model_max_length": 1000000000000000019884624838656,
+   "pad_token": "<|endoftext|>",
+   "tokenizer_class": "GPT2Tokenizer",
+   "unk_token": "<|endoftext|>"
+ }
training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:688a3530a9b0cfe1d6ee65ebe5f594ae8029f7a50bc277f60ac9ebf9638e945c
+ size 6968
vocab.json ADDED
The diff for this file is too large to render. See raw diff