yinchenghust committed on
Commit 14c2cd5 · verified · 1 Parent(s): e47f86a

Upload folder using huggingface_hub

.gitattributes CHANGED
@@ -33,3 +33,4 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
  *.zip filter=lfs diff=lfs merge=lfs -text
  *.zst filter=lfs diff=lfs merge=lfs -text
  *tfevents* filter=lfs diff=lfs merge=lfs -text
+ tokenizer.json filter=lfs diff=lfs merge=lfs -text
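The new rule routes `tokenizer.json` through Git LFS, like the other large-file patterns above it. A minimal sketch of how to confirm the rule takes effect (assumes `git` is installed; the temporary repo and file names are illustrative, not part of this commit):

```shell
# Sketch: verify the new .gitattributes line assigns the lfs filter.
set -e
tmp=$(mktemp -d)                       # throwaway repo, hypothetical
cd "$tmp"
git init -q .
printf '%s\n' 'tokenizer.json filter=lfs diff=lfs merge=lfs -text' > .gitattributes
# Ask git which filter applies to tokenizer.json under this rule.
git check-attr filter -- tokenizer.json
```

`git check-attr` reports `tokenizer.json: filter: lfs`, confirming that pushes of this file would go through the LFS clean/smudge filter rather than storing the blob directly.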
added_tokens.json ADDED
@@ -0,0 +1,2155 @@
+ {
+ "</box>": 151674,
+ "</image>": 151670,
+ "</image_id>": 151682,
+ "</point>": 151678,
+ "</quad>": 151676,
+ "</ref>": 151672,
+ "</slice>": 151680,
+ "</think>": 151668,
+ "</tool_call>": 151658,
+ "</tool_response>": 151666,
+ "</unit>": 151684,
+ "<action_0>": 151748,
+ "<action_1000>": 152748,
+ "<action_1001>": 152749,
+ "<action_1002>": 152750,
+ "<action_1003>": 152751,
+ "<action_1004>": 152752,
+ "<action_1005>": 152753,
+ "<action_1006>": 152754,
+ "<action_1007>": 152755,
+ "<action_1008>": 152756,
+ "<action_1009>": 152757,
+ "<action_100>": 151848,
+ "<action_1010>": 152758,
+ "<action_1011>": 152759,
+ "<action_1012>": 152760,
+ "<action_1013>": 152761,
+ "<action_1014>": 152762,
+ "<action_1015>": 152763,
+ "<action_1016>": 152764,
+ "<action_1017>": 152765,
+ "<action_1018>": 152766,
+ "<action_1019>": 152767,
+ "<action_101>": 151849,
+ "<action_1020>": 152768,
+ "<action_1021>": 152769,
+ "<action_1022>": 152770,
+ "<action_1023>": 152771,
+ "<action_1024>": 152772,
+ "<action_1025>": 152773,
+ "<action_1026>": 152774,
+ "<action_1027>": 152775,
+ "<action_1028>": 152776,
+ "<action_1029>": 152777,
+ "<action_102>": 151850,
+ "<action_1030>": 152778,
+ "<action_1031>": 152779,
+ "<action_1032>": 152780,
+ "<action_1033>": 152781,
+ "<action_1034>": 152782,
+ "<action_1035>": 152783,
+ "<action_1036>": 152784,
+ "<action_1037>": 152785,
+ "<action_1038>": 152786,
+ "<action_1039>": 152787,
+ "<action_103>": 151851,
+ "<action_1040>": 152788,
+ "<action_1041>": 152789,
+ "<action_1042>": 152790,
+ "<action_1043>": 152791,
+ "<action_1044>": 152792,
+ "<action_1045>": 152793,
+ "<action_1046>": 152794,
+ "<action_1047>": 152795,
+ "<action_1048>": 152796,
+ "<action_1049>": 152797,
+ "<action_104>": 151852,
+ "<action_1050>": 152798,
+ "<action_1051>": 152799,
+ "<action_1052>": 152800,
+ "<action_1053>": 152801,
+ "<action_1054>": 152802,
+ "<action_1055>": 152803,
+ "<action_1056>": 152804,
+ "<action_1057>": 152805,
+ "<action_1058>": 152806,
+ "<action_1059>": 152807,
+ "<action_105>": 151853,
+ "<action_1060>": 152808,
+ "<action_1061>": 152809,
+ "<action_1062>": 152810,
+ "<action_1063>": 152811,
+ "<action_1064>": 152812,
+ "<action_1065>": 152813,
+ "<action_1066>": 152814,
+ "<action_1067>": 152815,
+ "<action_1068>": 152816,
+ "<action_1069>": 152817,
+ "<action_106>": 151854,
+ "<action_1070>": 152818,
+ "<action_1071>": 152819,
+ "<action_1072>": 152820,
+ "<action_1073>": 152821,
+ "<action_1074>": 152822,
+ "<action_1075>": 152823,
+ "<action_1076>": 152824,
+ "<action_1077>": 152825,
+ "<action_1078>": 152826,
+ "<action_1079>": 152827,
+ "<action_107>": 151855,
+ "<action_1080>": 152828,
+ "<action_1081>": 152829,
+ "<action_1082>": 152830,
+ "<action_1083>": 152831,
+ "<action_1084>": 152832,
+ "<action_1085>": 152833,
+ "<action_1086>": 152834,
+ "<action_1087>": 152835,
+ "<action_1088>": 152836,
+ "<action_1089>": 152837,
+ "<action_108>": 151856,
+ "<action_1090>": 152838,
+ "<action_1091>": 152839,
+ "<action_1092>": 152840,
+ "<action_1093>": 152841,
+ "<action_1094>": 152842,
+ "<action_1095>": 152843,
+ "<action_1096>": 152844,
+ "<action_1097>": 152845,
+ "<action_1098>": 152846,
+ "<action_1099>": 152847,
+ "<action_109>": 151857,
+ "<action_10>": 151758,
+ "<action_1100>": 152848,
+ "<action_1101>": 152849,
+ "<action_1102>": 152850,
+ "<action_1103>": 152851,
+ "<action_1104>": 152852,
+ "<action_1105>": 152853,
+ "<action_1106>": 152854,
+ "<action_1107>": 152855,
+ "<action_1108>": 152856,
+ "<action_1109>": 152857,
+ "<action_110>": 151858,
+ "<action_1110>": 152858,
+ "<action_1111>": 152859,
+ "<action_1112>": 152860,
+ "<action_1113>": 152861,
+ "<action_1114>": 152862,
+ "<action_1115>": 152863,
+ "<action_1116>": 152864,
+ "<action_1117>": 152865,
+ "<action_1118>": 152866,
+ "<action_1119>": 152867,
+ "<action_111>": 151859,
+ "<action_1120>": 152868,
+ "<action_1121>": 152869,
+ "<action_1122>": 152870,
+ "<action_1123>": 152871,
+ "<action_1124>": 152872,
+ "<action_1125>": 152873,
+ "<action_1126>": 152874,
+ "<action_1127>": 152875,
+ "<action_1128>": 152876,
+ "<action_1129>": 152877,
+ "<action_112>": 151860,
+ "<action_1130>": 152878,
+ "<action_1131>": 152879,
+ "<action_1132>": 152880,
+ "<action_1133>": 152881,
+ "<action_1134>": 152882,
+ "<action_1135>": 152883,
+ "<action_1136>": 152884,
+ "<action_1137>": 152885,
+ "<action_1138>": 152886,
+ "<action_1139>": 152887,
+ "<action_113>": 151861,
+ "<action_1140>": 152888,
+ "<action_1141>": 152889,
+ "<action_1142>": 152890,
+ "<action_1143>": 152891,
+ "<action_1144>": 152892,
+ "<action_1145>": 152893,
+ "<action_1146>": 152894,
+ "<action_1147>": 152895,
+ "<action_1148>": 152896,
+ "<action_1149>": 152897,
+ "<action_114>": 151862,
+ "<action_1150>": 152898,
+ "<action_1151>": 152899,
+ "<action_1152>": 152900,
+ "<action_1153>": 152901,
+ "<action_1154>": 152902,
+ "<action_1155>": 152903,
+ "<action_1156>": 152904,
+ "<action_1157>": 152905,
+ "<action_1158>": 152906,
+ "<action_1159>": 152907,
+ "<action_115>": 151863,
+ "<action_1160>": 152908,
+ "<action_1161>": 152909,
+ "<action_1162>": 152910,
+ "<action_1163>": 152911,
+ "<action_1164>": 152912,
+ "<action_1165>": 152913,
+ "<action_1166>": 152914,
+ "<action_1167>": 152915,
+ "<action_1168>": 152916,
+ "<action_1169>": 152917,
+ "<action_116>": 151864,
+ "<action_1170>": 152918,
+ "<action_1171>": 152919,
+ "<action_1172>": 152920,
+ "<action_1173>": 152921,
+ "<action_1174>": 152922,
+ "<action_1175>": 152923,
+ "<action_1176>": 152924,
+ "<action_1177>": 152925,
+ "<action_1178>": 152926,
+ "<action_1179>": 152927,
+ "<action_117>": 151865,
+ "<action_1180>": 152928,
+ "<action_1181>": 152929,
+ "<action_1182>": 152930,
+ "<action_1183>": 152931,
+ "<action_1184>": 152932,
+ "<action_1185>": 152933,
+ "<action_1186>": 152934,
+ "<action_1187>": 152935,
+ "<action_1188>": 152936,
+ "<action_1189>": 152937,
+ "<action_118>": 151866,
+ "<action_1190>": 152938,
+ "<action_1191>": 152939,
+ "<action_1192>": 152940,
+ "<action_1193>": 152941,
+ "<action_1194>": 152942,
+ "<action_1195>": 152943,
+ "<action_1196>": 152944,
+ "<action_1197>": 152945,
+ "<action_1198>": 152946,
+ "<action_1199>": 152947,
+ "<action_119>": 151867,
+ "<action_11>": 151759,
+ "<action_1200>": 152948,
+ "<action_1201>": 152949,
+ "<action_1202>": 152950,
+ "<action_1203>": 152951,
+ "<action_1204>": 152952,
+ "<action_1205>": 152953,
+ "<action_1206>": 152954,
+ "<action_1207>": 152955,
+ "<action_1208>": 152956,
+ "<action_1209>": 152957,
+ "<action_120>": 151868,
+ "<action_1210>": 152958,
+ "<action_1211>": 152959,
+ "<action_1212>": 152960,
+ "<action_1213>": 152961,
+ "<action_1214>": 152962,
+ "<action_1215>": 152963,
+ "<action_1216>": 152964,
+ "<action_1217>": 152965,
+ "<action_1218>": 152966,
+ "<action_1219>": 152967,
+ "<action_121>": 151869,
+ "<action_1220>": 152968,
+ "<action_1221>": 152969,
+ "<action_1222>": 152970,
+ "<action_1223>": 152971,
+ "<action_1224>": 152972,
+ "<action_1225>": 152973,
+ "<action_1226>": 152974,
+ "<action_1227>": 152975,
+ "<action_1228>": 152976,
+ "<action_1229>": 152977,
+ "<action_122>": 151870,
+ "<action_1230>": 152978,
+ "<action_1231>": 152979,
+ "<action_1232>": 152980,
+ "<action_1233>": 152981,
+ "<action_1234>": 152982,
+ "<action_1235>": 152983,
+ "<action_1236>": 152984,
+ "<action_1237>": 152985,
+ "<action_1238>": 152986,
+ "<action_1239>": 152987,
+ "<action_123>": 151871,
+ "<action_1240>": 152988,
+ "<action_1241>": 152989,
+ "<action_1242>": 152990,
+ "<action_1243>": 152991,
+ "<action_1244>": 152992,
+ "<action_1245>": 152993,
+ "<action_1246>": 152994,
+ "<action_1247>": 152995,
+ "<action_1248>": 152996,
+ "<action_1249>": 152997,
+ "<action_124>": 151872,
+ "<action_1250>": 152998,
+ "<action_1251>": 152999,
+ "<action_1252>": 153000,
+ "<action_1253>": 153001,
+ "<action_1254>": 153002,
+ "<action_1255>": 153003,
+ "<action_1256>": 153004,
+ "<action_1257>": 153005,
+ "<action_1258>": 153006,
+ "<action_1259>": 153007,
+ "<action_125>": 151873,
+ "<action_1260>": 153008,
+ "<action_1261>": 153009,
+ "<action_1262>": 153010,
+ "<action_1263>": 153011,
+ "<action_1264>": 153012,
+ "<action_1265>": 153013,
+ "<action_1266>": 153014,
+ "<action_1267>": 153015,
+ "<action_1268>": 153016,
+ "<action_1269>": 153017,
+ "<action_126>": 151874,
+ "<action_1270>": 153018,
+ "<action_1271>": 153019,
+ "<action_1272>": 153020,
+ "<action_1273>": 153021,
+ "<action_1274>": 153022,
+ "<action_1275>": 153023,
+ "<action_1276>": 153024,
+ "<action_1277>": 153025,
+ "<action_1278>": 153026,
+ "<action_1279>": 153027,
+ "<action_127>": 151875,
+ "<action_1280>": 153028,
+ "<action_1281>": 153029,
+ "<action_1282>": 153030,
+ "<action_1283>": 153031,
+ "<action_1284>": 153032,
+ "<action_1285>": 153033,
+ "<action_1286>": 153034,
+ "<action_1287>": 153035,
+ "<action_1288>": 153036,
+ "<action_1289>": 153037,
+ "<action_128>": 151876,
+ "<action_1290>": 153038,
+ "<action_1291>": 153039,
+ "<action_1292>": 153040,
+ "<action_1293>": 153041,
+ "<action_1294>": 153042,
+ "<action_1295>": 153043,
+ "<action_1296>": 153044,
+ "<action_1297>": 153045,
+ "<action_1298>": 153046,
+ "<action_1299>": 153047,
+ "<action_129>": 151877,
+ "<action_12>": 151760,
+ "<action_1300>": 153048,
+ "<action_1301>": 153049,
+ "<action_1302>": 153050,
+ "<action_1303>": 153051,
+ "<action_1304>": 153052,
+ "<action_1305>": 153053,
+ "<action_1306>": 153054,
+ "<action_1307>": 153055,
+ "<action_1308>": 153056,
+ "<action_1309>": 153057,
+ "<action_130>": 151878,
+ "<action_1310>": 153058,
+ "<action_1311>": 153059,
+ "<action_1312>": 153060,
+ "<action_1313>": 153061,
+ "<action_1314>": 153062,
+ "<action_1315>": 153063,
+ "<action_1316>": 153064,
+ "<action_1317>": 153065,
+ "<action_1318>": 153066,
+ "<action_1319>": 153067,
+ "<action_131>": 151879,
+ "<action_1320>": 153068,
+ "<action_1321>": 153069,
+ "<action_1322>": 153070,
+ "<action_1323>": 153071,
+ "<action_1324>": 153072,
+ "<action_1325>": 153073,
+ "<action_1326>": 153074,
+ "<action_1327>": 153075,
+ "<action_1328>": 153076,
+ "<action_1329>": 153077,
+ "<action_132>": 151880,
+ "<action_1330>": 153078,
+ "<action_1331>": 153079,
+ "<action_1332>": 153080,
+ "<action_1333>": 153081,
+ "<action_1334>": 153082,
+ "<action_1335>": 153083,
+ "<action_1336>": 153084,
+ "<action_1337>": 153085,
+ "<action_1338>": 153086,
+ "<action_1339>": 153087,
+ "<action_133>": 151881,
+ "<action_1340>": 153088,
+ "<action_1341>": 153089,
+ "<action_1342>": 153090,
+ "<action_1343>": 153091,
+ "<action_1344>": 153092,
+ "<action_1345>": 153093,
+ "<action_1346>": 153094,
+ "<action_1347>": 153095,
+ "<action_1348>": 153096,
+ "<action_1349>": 153097,
+ "<action_134>": 151882,
+ "<action_1350>": 153098,
+ "<action_1351>": 153099,
+ "<action_1352>": 153100,
+ "<action_1353>": 153101,
+ "<action_1354>": 153102,
+ "<action_1355>": 153103,
+ "<action_1356>": 153104,
+ "<action_1357>": 153105,
+ "<action_1358>": 153106,
+ "<action_1359>": 153107,
+ "<action_135>": 151883,
+ "<action_1360>": 153108,
+ "<action_1361>": 153109,
+ "<action_1362>": 153110,
+ "<action_1363>": 153111,
+ "<action_1364>": 153112,
+ "<action_1365>": 153113,
+ "<action_1366>": 153114,
+ "<action_1367>": 153115,
+ "<action_1368>": 153116,
+ "<action_1369>": 153117,
+ "<action_136>": 151884,
+ "<action_1370>": 153118,
+ "<action_1371>": 153119,
+ "<action_1372>": 153120,
+ "<action_1373>": 153121,
+ "<action_1374>": 153122,
+ "<action_1375>": 153123,
+ "<action_1376>": 153124,
+ "<action_1377>": 153125,
+ "<action_1378>": 153126,
+ "<action_1379>": 153127,
+ "<action_137>": 151885,
+ "<action_1380>": 153128,
+ "<action_1381>": 153129,
+ "<action_1382>": 153130,
+ "<action_1383>": 153131,
+ "<action_1384>": 153132,
+ "<action_1385>": 153133,
+ "<action_1386>": 153134,
+ "<action_1387>": 153135,
+ "<action_1388>": 153136,
+ "<action_1389>": 153137,
+ "<action_138>": 151886,
+ "<action_1390>": 153138,
+ "<action_1391>": 153139,
+ "<action_1392>": 153140,
+ "<action_1393>": 153141,
+ "<action_1394>": 153142,
+ "<action_1395>": 153143,
+ "<action_1396>": 153144,
+ "<action_1397>": 153145,
+ "<action_1398>": 153146,
+ "<action_1399>": 153147,
+ "<action_139>": 151887,
+ "<action_13>": 151761,
+ "<action_1400>": 153148,
+ "<action_1401>": 153149,
+ "<action_1402>": 153150,
+ "<action_1403>": 153151,
+ "<action_1404>": 153152,
+ "<action_1405>": 153153,
+ "<action_1406>": 153154,
+ "<action_1407>": 153155,
+ "<action_1408>": 153156,
+ "<action_1409>": 153157,
+ "<action_140>": 151888,
+ "<action_1410>": 153158,
+ "<action_1411>": 153159,
+ "<action_1412>": 153160,
+ "<action_1413>": 153161,
+ "<action_1414>": 153162,
+ "<action_1415>": 153163,
+ "<action_1416>": 153164,
+ "<action_1417>": 153165,
+ "<action_1418>": 153166,
+ "<action_1419>": 153167,
+ "<action_141>": 151889,
+ "<action_1420>": 153168,
+ "<action_1421>": 153169,
+ "<action_1422>": 153170,
+ "<action_1423>": 153171,
+ "<action_1424>": 153172,
+ "<action_1425>": 153173,
+ "<action_1426>": 153174,
+ "<action_1427>": 153175,
+ "<action_1428>": 153176,
+ "<action_1429>": 153177,
+ "<action_142>": 151890,
+ "<action_1430>": 153178,
+ "<action_1431>": 153179,
+ "<action_1432>": 153180,
+ "<action_1433>": 153181,
+ "<action_1434>": 153182,
+ "<action_1435>": 153183,
+ "<action_1436>": 153184,
+ "<action_1437>": 153185,
+ "<action_1438>": 153186,
+ "<action_1439>": 153187,
+ "<action_143>": 151891,
+ "<action_1440>": 153188,
+ "<action_1441>": 153189,
+ "<action_1442>": 153190,
+ "<action_1443>": 153191,
+ "<action_1444>": 153192,
+ "<action_1445>": 153193,
+ "<action_1446>": 153194,
+ "<action_1447>": 153195,
+ "<action_1448>": 153196,
+ "<action_1449>": 153197,
+ "<action_144>": 151892,
+ "<action_1450>": 153198,
+ "<action_1451>": 153199,
+ "<action_1452>": 153200,
+ "<action_1453>": 153201,
+ "<action_1454>": 153202,
+ "<action_1455>": 153203,
+ "<action_1456>": 153204,
+ "<action_1457>": 153205,
+ "<action_1458>": 153206,
+ "<action_1459>": 153207,
+ "<action_145>": 151893,
+ "<action_1460>": 153208,
+ "<action_1461>": 153209,
+ "<action_1462>": 153210,
+ "<action_1463>": 153211,
+ "<action_1464>": 153212,
+ "<action_1465>": 153213,
+ "<action_1466>": 153214,
+ "<action_1467>": 153215,
+ "<action_1468>": 153216,
+ "<action_1469>": 153217,
+ "<action_146>": 151894,
+ "<action_1470>": 153218,
+ "<action_1471>": 153219,
+ "<action_1472>": 153220,
+ "<action_1473>": 153221,
+ "<action_1474>": 153222,
+ "<action_1475>": 153223,
+ "<action_1476>": 153224,
+ "<action_1477>": 153225,
+ "<action_1478>": 153226,
+ "<action_1479>": 153227,
+ "<action_147>": 151895,
+ "<action_1480>": 153228,
+ "<action_1481>": 153229,
+ "<action_1482>": 153230,
+ "<action_1483>": 153231,
+ "<action_1484>": 153232,
+ "<action_1485>": 153233,
+ "<action_1486>": 153234,
+ "<action_1487>": 153235,
+ "<action_1488>": 153236,
+ "<action_1489>": 153237,
+ "<action_148>": 151896,
+ "<action_1490>": 153238,
+ "<action_1491>": 153239,
+ "<action_1492>": 153240,
+ "<action_1493>": 153241,
+ "<action_1494>": 153242,
+ "<action_1495>": 153243,
+ "<action_1496>": 153244,
+ "<action_1497>": 153245,
+ "<action_1498>": 153246,
+ "<action_1499>": 153247,
+ "<action_149>": 151897,
+ "<action_14>": 151762,
+ "<action_1500>": 153248,
+ "<action_1501>": 153249,
+ "<action_1502>": 153250,
+ "<action_1503>": 153251,
+ "<action_1504>": 153252,
+ "<action_1505>": 153253,
+ "<action_1506>": 153254,
+ "<action_1507>": 153255,
+ "<action_1508>": 153256,
+ "<action_1509>": 153257,
+ "<action_150>": 151898,
+ "<action_1510>": 153258,
+ "<action_1511>": 153259,
+ "<action_1512>": 153260,
+ "<action_1513>": 153261,
+ "<action_1514>": 153262,
+ "<action_1515>": 153263,
+ "<action_1516>": 153264,
+ "<action_1517>": 153265,
+ "<action_1518>": 153266,
+ "<action_1519>": 153267,
+ "<action_151>": 151899,
+ "<action_1520>": 153268,
+ "<action_1521>": 153269,
+ "<action_1522>": 153270,
+ "<action_1523>": 153271,
+ "<action_1524>": 153272,
+ "<action_1525>": 153273,
+ "<action_1526>": 153274,
+ "<action_1527>": 153275,
+ "<action_1528>": 153276,
+ "<action_1529>": 153277,
+ "<action_152>": 151900,
+ "<action_1530>": 153278,
+ "<action_1531>": 153279,
+ "<action_1532>": 153280,
+ "<action_1533>": 153281,
+ "<action_1534>": 153282,
+ "<action_1535>": 153283,
+ "<action_1536>": 153284,
+ "<action_1537>": 153285,
+ "<action_1538>": 153286,
+ "<action_1539>": 153287,
+ "<action_153>": 151901,
+ "<action_1540>": 153288,
+ "<action_1541>": 153289,
+ "<action_1542>": 153290,
+ "<action_1543>": 153291,
+ "<action_1544>": 153292,
+ "<action_1545>": 153293,
+ "<action_1546>": 153294,
+ "<action_1547>": 153295,
+ "<action_1548>": 153296,
+ "<action_1549>": 153297,
+ "<action_154>": 151902,
+ "<action_1550>": 153298,
+ "<action_1551>": 153299,
+ "<action_1552>": 153300,
+ "<action_1553>": 153301,
+ "<action_1554>": 153302,
+ "<action_1555>": 153303,
+ "<action_1556>": 153304,
+ "<action_1557>": 153305,
+ "<action_1558>": 153306,
+ "<action_1559>": 153307,
+ "<action_155>": 151903,
+ "<action_1560>": 153308,
+ "<action_1561>": 153309,
+ "<action_1562>": 153310,
+ "<action_1563>": 153311,
+ "<action_1564>": 153312,
+ "<action_1565>": 153313,
+ "<action_1566>": 153314,
+ "<action_1567>": 153315,
+ "<action_1568>": 153316,
+ "<action_1569>": 153317,
+ "<action_156>": 151904,
+ "<action_1570>": 153318,
+ "<action_1571>": 153319,
+ "<action_1572>": 153320,
+ "<action_1573>": 153321,
+ "<action_1574>": 153322,
+ "<action_1575>": 153323,
+ "<action_1576>": 153324,
+ "<action_1577>": 153325,
+ "<action_1578>": 153326,
+ "<action_1579>": 153327,
+ "<action_157>": 151905,
+ "<action_1580>": 153328,
+ "<action_1581>": 153329,
+ "<action_1582>": 153330,
+ "<action_1583>": 153331,
+ "<action_1584>": 153332,
+ "<action_1585>": 153333,
+ "<action_1586>": 153334,
+ "<action_1587>": 153335,
+ "<action_1588>": 153336,
+ "<action_1589>": 153337,
+ "<action_158>": 151906,
+ "<action_1590>": 153338,
+ "<action_1591>": 153339,
+ "<action_1592>": 153340,
+ "<action_1593>": 153341,
+ "<action_1594>": 153342,
+ "<action_1595>": 153343,
+ "<action_1596>": 153344,
+ "<action_1597>": 153345,
+ "<action_1598>": 153346,
+ "<action_1599>": 153347,
+ "<action_159>": 151907,
+ "<action_15>": 151763,
+ "<action_1600>": 153348,
+ "<action_1601>": 153349,
+ "<action_1602>": 153350,
+ "<action_1603>": 153351,
+ "<action_1604>": 153352,
+ "<action_1605>": 153353,
+ "<action_1606>": 153354,
+ "<action_1607>": 153355,
+ "<action_1608>": 153356,
+ "<action_1609>": 153357,
+ "<action_160>": 151908,
+ "<action_1610>": 153358,
+ "<action_1611>": 153359,
+ "<action_1612>": 153360,
+ "<action_1613>": 153361,
+ "<action_1614>": 153362,
+ "<action_1615>": 153363,
+ "<action_1616>": 153364,
+ "<action_1617>": 153365,
+ "<action_1618>": 153366,
+ "<action_1619>": 153367,
+ "<action_161>": 151909,
+ "<action_1620>": 153368,
+ "<action_1621>": 153369,
+ "<action_1622>": 153370,
+ "<action_1623>": 153371,
+ "<action_1624>": 153372,
+ "<action_1625>": 153373,
+ "<action_1626>": 153374,
+ "<action_1627>": 153375,
+ "<action_1628>": 153376,
+ "<action_1629>": 153377,
+ "<action_162>": 151910,
+ "<action_1630>": 153378,
+ "<action_1631>": 153379,
+ "<action_1632>": 153380,
+ "<action_1633>": 153381,
+ "<action_1634>": 153382,
+ "<action_1635>": 153383,
+ "<action_1636>": 153384,
+ "<action_1637>": 153385,
+ "<action_1638>": 153386,
+ "<action_1639>": 153387,
+ "<action_163>": 151911,
+ "<action_1640>": 153388,
+ "<action_1641>": 153389,
+ "<action_1642>": 153390,
+ "<action_1643>": 153391,
+ "<action_1644>": 153392,
+ "<action_1645>": 153393,
+ "<action_1646>": 153394,
+ "<action_1647>": 153395,
+ "<action_1648>": 153396,
+ "<action_1649>": 153397,
+ "<action_164>": 151912,
+ "<action_1650>": 153398,
+ "<action_1651>": 153399,
+ "<action_1652>": 153400,
+ "<action_1653>": 153401,
+ "<action_1654>": 153402,
+ "<action_1655>": 153403,
+ "<action_1656>": 153404,
+ "<action_1657>": 153405,
+ "<action_1658>": 153406,
+ "<action_1659>": 153407,
+ "<action_165>": 151913,
+ "<action_1660>": 153408,
+ "<action_1661>": 153409,
+ "<action_1662>": 153410,
+ "<action_1663>": 153411,
+ "<action_1664>": 153412,
+ "<action_1665>": 153413,
+ "<action_1666>": 153414,
+ "<action_1667>": 153415,
+ "<action_1668>": 153416,
+ "<action_1669>": 153417,
+ "<action_166>": 151914,
+ "<action_1670>": 153418,
+ "<action_1671>": 153419,
+ "<action_1672>": 153420,
+ "<action_1673>": 153421,
+ "<action_1674>": 153422,
+ "<action_1675>": 153423,
+ "<action_1676>": 153424,
+ "<action_1677>": 153425,
+ "<action_1678>": 153426,
+ "<action_1679>": 153427,
+ "<action_167>": 151915,
+ "<action_1680>": 153428,
+ "<action_1681>": 153429,
+ "<action_1682>": 153430,
+ "<action_1683>": 153431,
+ "<action_1684>": 153432,
+ "<action_1685>": 153433,
+ "<action_1686>": 153434,
+ "<action_1687>": 153435,
+ "<action_1688>": 153436,
+ "<action_1689>": 153437,
+ "<action_168>": 151916,
+ "<action_1690>": 153438,
+ "<action_1691>": 153439,
+ "<action_1692>": 153440,
+ "<action_1693>": 153441,
+ "<action_1694>": 153442,
+ "<action_1695>": 153443,
+ "<action_1696>": 153444,
+ "<action_1697>": 153445,
+ "<action_1698>": 153446,
+ "<action_1699>": 153447,
+ "<action_169>": 151917,
+ "<action_16>": 151764,
+ "<action_1700>": 153448,
+ "<action_1701>": 153449,
+ "<action_1702>": 153450,
+ "<action_1703>": 153451,
+ "<action_1704>": 153452,
+ "<action_1705>": 153453,
+ "<action_1706>": 153454,
+ "<action_1707>": 153455,
+ "<action_1708>": 153456,
+ "<action_1709>": 153457,
+ "<action_170>": 151918,
+ "<action_1710>": 153458,
+ "<action_1711>": 153459,
+ "<action_1712>": 153460,
+ "<action_1713>": 153461,
+ "<action_1714>": 153462,
+ "<action_1715>": 153463,
+ "<action_1716>": 153464,
+ "<action_1717>": 153465,
+ "<action_1718>": 153466,
+ "<action_1719>": 153467,
+ "<action_171>": 151919,
+ "<action_1720>": 153468,
+ "<action_1721>": 153469,
+ "<action_1722>": 153470,
+ "<action_1723>": 153471,
+ "<action_1724>": 153472,
+ "<action_1725>": 153473,
+ "<action_1726>": 153474,
+ "<action_1727>": 153475,
+ "<action_1728>": 153476,
+ "<action_1729>": 153477,
+ "<action_172>": 151920,
+ "<action_1730>": 153478,
+ "<action_1731>": 153479,
+ "<action_1732>": 153480,
+ "<action_1733>": 153481,
+ "<action_1734>": 153482,
+ "<action_1735>": 153483,
+ "<action_1736>": 153484,
+ "<action_1737>": 153485,
+ "<action_1738>": 153486,
+ "<action_1739>": 153487,
+ "<action_173>": 151921,
+ "<action_1740>": 153488,
+ "<action_1741>": 153489,
+ "<action_1742>": 153490,
+ "<action_1743>": 153491,
+ "<action_1744>": 153492,
+ "<action_1745>": 153493,
+ "<action_1746>": 153494,
+ "<action_1747>": 153495,
+ "<action_1748>": 153496,
+ "<action_1749>": 153497,
+ "<action_174>": 151922,
+ "<action_1750>": 153498,
+ "<action_1751>": 153499,
+ "<action_1752>": 153500,
+ "<action_1753>": 153501,
+ "<action_1754>": 153502,
+ "<action_1755>": 153503,
+ "<action_1756>": 153504,
+ "<action_1757>": 153505,
+ "<action_1758>": 153506,
+ "<action_1759>": 153507,
+ "<action_175>": 151923,
+ "<action_1760>": 153508,
+ "<action_1761>": 153509,
+ "<action_1762>": 153510,
+ "<action_1763>": 153511,
+ "<action_1764>": 153512,
+ "<action_1765>": 153513,
+ "<action_1766>": 153514,
+ "<action_1767>": 153515,
+ "<action_1768>": 153516,
+ "<action_1769>": 153517,
+ "<action_176>": 151924,
+ "<action_1770>": 153518,
+ "<action_1771>": 153519,
+ "<action_1772>": 153520,
+ "<action_1773>": 153521,
+ "<action_1774>": 153522,
+ "<action_1775>": 153523,
+ "<action_1776>": 153524,
+ "<action_1777>": 153525,
+ "<action_1778>": 153526,
+ "<action_1779>": 153527,
+ "<action_177>": 151925,
+ "<action_1780>": 153528,
+ "<action_1781>": 153529,
+ "<action_1782>": 153530,
+ "<action_1783>": 153531,
+ "<action_1784>": 153532,
+ "<action_1785>": 153533,
+ "<action_1786>": 153534,
+ "<action_1787>": 153535,
+ "<action_1788>": 153536,
+ "<action_1789>": 153537,
+ "<action_178>": 151926,
+ "<action_1790>": 153538,
+ "<action_1791>": 153539,
+ "<action_1792>": 153540,
+ "<action_1793>": 153541,
+ "<action_1794>": 153542,
+ "<action_1795>": 153543,
+ "<action_1796>": 153544,
+ "<action_1797>": 153545,
+ "<action_1798>": 153546,
+ "<action_1799>": 153547,
+ "<action_179>": 151927,
+ "<action_17>": 151765,
+ "<action_1800>": 153548,
+ "<action_1801>": 153549,
+ "<action_1802>": 153550,
+ "<action_1803>": 153551,
+ "<action_1804>": 153552,
+ "<action_1805>": 153553,
+ "<action_1806>": 153554,
+ "<action_1807>": 153555,
+ "<action_1808>": 153556,
+ "<action_1809>": 153557,
+ "<action_180>": 151928,
+ "<action_1810>": 153558,
+ "<action_1811>": 153559,
+ "<action_1812>": 153560,
+ "<action_1813>": 153561,
+ "<action_1814>": 153562,
+ "<action_1815>": 153563,
+ "<action_1816>": 153564,
+ "<action_1817>": 153565,
+ "<action_1818>": 153566,
+ "<action_1819>": 153567,
+ "<action_181>": 151929,
+ "<action_1820>": 153568,
+ "<action_1821>": 153569,
+ "<action_1822>": 153570,
+ "<action_1823>": 153571,
+ "<action_1824>": 153572,
+ "<action_1825>": 153573,
+ "<action_1826>": 153574,
+ "<action_1827>": 153575,
+ "<action_1828>": 153576,
+ "<action_1829>": 153577,
+ "<action_182>": 151930,
+ "<action_1830>": 153578,
+ "<action_1831>": 153579,
+ "<action_1832>": 153580,
+ "<action_1833>": 153581,
+ "<action_1834>": 153582,
+ "<action_1835>": 153583,
+ "<action_1836>": 153584,
+ "<action_1837>": 153585,
+ "<action_1838>": 153586,
+ "<action_1839>": 153587,
+ "<action_183>": 151931,
+ "<action_1840>": 153588,
+ "<action_1841>": 153589,
948
+ "<action_1842>": 153590,
949
+ "<action_1843>": 153591,
950
+ "<action_1844>": 153592,
951
+ "<action_1845>": 153593,
952
+ "<action_1846>": 153594,
953
+ "<action_1847>": 153595,
954
+ "<action_1848>": 153596,
955
+ "<action_1849>": 153597,
956
+ "<action_184>": 151932,
957
+ "<action_1850>": 153598,
958
+ "<action_1851>": 153599,
959
+ "<action_1852>": 153600,
960
+ "<action_1853>": 153601,
961
+ "<action_1854>": 153602,
962
+ "<action_1855>": 153603,
963
+ "<action_1856>": 153604,
964
+ "<action_1857>": 153605,
965
+ "<action_1858>": 153606,
966
+ "<action_1859>": 153607,
967
+ "<action_185>": 151933,
968
+ "<action_1860>": 153608,
969
+ "<action_1861>": 153609,
970
+ "<action_1862>": 153610,
971
+ "<action_1863>": 153611,
972
+ "<action_1864>": 153612,
973
+ "<action_1865>": 153613,
974
+ "<action_1866>": 153614,
975
+ "<action_1867>": 153615,
976
+ "<action_1868>": 153616,
977
+ "<action_1869>": 153617,
978
+ "<action_186>": 151934,
979
+ "<action_1870>": 153618,
980
+ "<action_1871>": 153619,
981
+ "<action_1872>": 153620,
982
+ "<action_1873>": 153621,
983
+ "<action_1874>": 153622,
984
+ "<action_1875>": 153623,
985
+ "<action_1876>": 153624,
986
+ "<action_1877>": 153625,
987
+ "<action_1878>": 153626,
988
+ "<action_1879>": 153627,
989
+ "<action_187>": 151935,
990
+ "<action_1880>": 153628,
991
+ "<action_1881>": 153629,
992
+ "<action_1882>": 153630,
993
+ "<action_1883>": 153631,
994
+ "<action_1884>": 153632,
995
+ "<action_1885>": 153633,
996
+ "<action_1886>": 153634,
997
+ "<action_1887>": 153635,
998
+ "<action_1888>": 153636,
999
+ "<action_1889>": 153637,
1000
+ "<action_188>": 151936,
1001
+ "<action_1890>": 153638,
1002
+ "<action_1891>": 153639,
1003
+ "<action_1892>": 153640,
1004
+ "<action_1893>": 153641,
1005
+ "<action_1894>": 153642,
1006
+ "<action_1895>": 153643,
1007
+ "<action_1896>": 153644,
1008
+ "<action_1897>": 153645,
1009
+ "<action_1898>": 153646,
1010
+ "<action_1899>": 153647,
1011
+ "<action_189>": 151937,
1012
+ "<action_18>": 151766,
1013
+ "<action_1900>": 153648,
1014
+ "<action_1901>": 153649,
1015
+ "<action_1902>": 153650,
1016
+ "<action_1903>": 153651,
1017
+ "<action_1904>": 153652,
1018
+ "<action_1905>": 153653,
1019
+ "<action_1906>": 153654,
1020
+ "<action_1907>": 153655,
1021
+ "<action_1908>": 153656,
1022
+ "<action_1909>": 153657,
1023
+ "<action_190>": 151938,
1024
+ "<action_1910>": 153658,
1025
+ "<action_1911>": 153659,
1026
+ "<action_1912>": 153660,
1027
+ "<action_1913>": 153661,
1028
+ "<action_1914>": 153662,
1029
+ "<action_1915>": 153663,
1030
+ "<action_1916>": 153664,
1031
+ "<action_1917>": 153665,
1032
+ "<action_1918>": 153666,
1033
+ "<action_1919>": 153667,
1034
+ "<action_191>": 151939,
1035
+ "<action_1920>": 153668,
1036
+ "<action_1921>": 153669,
1037
+ "<action_1922>": 153670,
1038
+ "<action_1923>": 153671,
1039
+ "<action_1924>": 153672,
1040
+ "<action_1925>": 153673,
1041
+ "<action_1926>": 153674,
1042
+ "<action_1927>": 153675,
1043
+ "<action_1928>": 153676,
1044
+ "<action_1929>": 153677,
1045
+ "<action_192>": 151940,
1046
+ "<action_1930>": 153678,
1047
+ "<action_1931>": 153679,
1048
+ "<action_1932>": 153680,
1049
+ "<action_1933>": 153681,
1050
+ "<action_1934>": 153682,
1051
+ "<action_1935>": 153683,
1052
+ "<action_1936>": 153684,
1053
+ "<action_1937>": 153685,
1054
+ "<action_1938>": 153686,
1055
+ "<action_1939>": 153687,
1056
+ "<action_193>": 151941,
1057
+ "<action_1940>": 153688,
1058
+ "<action_1941>": 153689,
1059
+ "<action_1942>": 153690,
1060
+ "<action_1943>": 153691,
1061
+ "<action_1944>": 153692,
1062
+ "<action_1945>": 153693,
1063
+ "<action_1946>": 153694,
1064
+ "<action_1947>": 153695,
1065
+ "<action_1948>": 153696,
1066
+ "<action_1949>": 153697,
1067
+ "<action_194>": 151942,
1068
+ "<action_1950>": 153698,
1069
+ "<action_1951>": 153699,
1070
+ "<action_1952>": 153700,
1071
+ "<action_1953>": 153701,
1072
+ "<action_1954>": 153702,
1073
+ "<action_1955>": 153703,
1074
+ "<action_1956>": 153704,
1075
+ "<action_1957>": 153705,
1076
+ "<action_1958>": 153706,
1077
+ "<action_1959>": 153707,
1078
+ "<action_195>": 151943,
1079
+ "<action_1960>": 153708,
1080
+ "<action_1961>": 153709,
1081
+ "<action_1962>": 153710,
1082
+ "<action_1963>": 153711,
1083
+ "<action_1964>": 153712,
1084
+ "<action_1965>": 153713,
1085
+ "<action_1966>": 153714,
1086
+ "<action_1967>": 153715,
1087
+ "<action_1968>": 153716,
1088
+ "<action_1969>": 153717,
1089
+ "<action_196>": 151944,
1090
+ "<action_1970>": 153718,
1091
+ "<action_1971>": 153719,
1092
+ "<action_1972>": 153720,
1093
+ "<action_1973>": 153721,
1094
+ "<action_1974>": 153722,
1095
+ "<action_1975>": 153723,
1096
+ "<action_1976>": 153724,
1097
+ "<action_1977>": 153725,
1098
+ "<action_1978>": 153726,
1099
+ "<action_1979>": 153727,
1100
+ "<action_197>": 151945,
1101
+ "<action_1980>": 153728,
1102
+ "<action_1981>": 153729,
1103
+ "<action_1982>": 153730,
1104
+ "<action_1983>": 153731,
1105
+ "<action_1984>": 153732,
1106
+ "<action_1985>": 153733,
1107
+ "<action_1986>": 153734,
1108
+ "<action_1987>": 153735,
1109
+ "<action_1988>": 153736,
1110
+ "<action_1989>": 153737,
1111
+ "<action_198>": 151946,
1112
+ "<action_1990>": 153738,
1113
+ "<action_1991>": 153739,
1114
+ "<action_1992>": 153740,
1115
+ "<action_1993>": 153741,
1116
+ "<action_1994>": 153742,
1117
+ "<action_1995>": 153743,
1118
+ "<action_1996>": 153744,
1119
+ "<action_1997>": 153745,
1120
+ "<action_1998>": 153746,
1121
+ "<action_1999>": 153747,
1122
+ "<action_199>": 151947,
1123
+ "<action_19>": 151767,
1124
+ "<action_1>": 151749,
1125
+ "<action_2000>": 153748,
1126
+ "<action_2001>": 153749,
1127
+ "<action_2002>": 153750,
1128
+ "<action_2003>": 153751,
1129
+ "<action_2004>": 153752,
1130
+ "<action_2005>": 153753,
1131
+ "<action_2006>": 153754,
1132
+ "<action_2007>": 153755,
1133
+ "<action_2008>": 153756,
1134
+ "<action_2009>": 153757,
1135
+ "<action_200>": 151948,
1136
+ "<action_2010>": 153758,
1137
+ "<action_2011>": 153759,
1138
+ "<action_2012>": 153760,
1139
+ "<action_2013>": 153761,
1140
+ "<action_2014>": 153762,
1141
+ "<action_2015>": 153763,
1142
+ "<action_2016>": 153764,
1143
+ "<action_2017>": 153765,
1144
+ "<action_2018>": 153766,
1145
+ "<action_2019>": 153767,
1146
+ "<action_201>": 151949,
1147
+ "<action_2020>": 153768,
1148
+ "<action_2021>": 153769,
1149
+ "<action_2022>": 153770,
1150
+ "<action_2023>": 153771,
1151
+ "<action_2024>": 153772,
1152
+ "<action_2025>": 153773,
1153
+ "<action_2026>": 153774,
1154
+ "<action_2027>": 153775,
1155
+ "<action_2028>": 153776,
1156
+ "<action_2029>": 153777,
1157
+ "<action_202>": 151950,
1158
+ "<action_2030>": 153778,
1159
+ "<action_2031>": 153779,
1160
+ "<action_2032>": 153780,
1161
+ "<action_2033>": 153781,
1162
+ "<action_2034>": 153782,
1163
+ "<action_2035>": 153783,
1164
+ "<action_2036>": 153784,
1165
+ "<action_2037>": 153785,
1166
+ "<action_2038>": 153786,
1167
+ "<action_2039>": 153787,
1168
+ "<action_203>": 151951,
1169
+ "<action_2040>": 153788,
1170
+ "<action_2041>": 153789,
1171
+ "<action_2042>": 153790,
1172
+ "<action_2043>": 153791,
1173
+ "<action_2044>": 153792,
1174
+ "<action_2045>": 153793,
1175
+ "<action_2046>": 153794,
1176
+ "<action_2047>": 153795,
1177
+ "<action_204>": 151952,
1178
+ "<action_205>": 151953,
1179
+ "<action_206>": 151954,
1180
+ "<action_207>": 151955,
1181
+ "<action_208>": 151956,
1182
+ "<action_209>": 151957,
1183
+ "<action_20>": 151768,
1184
+ "<action_210>": 151958,
1185
+ "<action_211>": 151959,
1186
+ "<action_212>": 151960,
1187
+ "<action_213>": 151961,
1188
+ "<action_214>": 151962,
1189
+ "<action_215>": 151963,
1190
+ "<action_216>": 151964,
1191
+ "<action_217>": 151965,
1192
+ "<action_218>": 151966,
1193
+ "<action_219>": 151967,
1194
+ "<action_21>": 151769,
1195
+ "<action_220>": 151968,
1196
+ "<action_221>": 151969,
1197
+ "<action_222>": 151970,
1198
+ "<action_223>": 151971,
1199
+ "<action_224>": 151972,
1200
+ "<action_225>": 151973,
1201
+ "<action_226>": 151974,
1202
+ "<action_227>": 151975,
1203
+ "<action_228>": 151976,
1204
+ "<action_229>": 151977,
1205
+ "<action_22>": 151770,
1206
+ "<action_230>": 151978,
1207
+ "<action_231>": 151979,
1208
+ "<action_232>": 151980,
1209
+ "<action_233>": 151981,
1210
+ "<action_234>": 151982,
1211
+ "<action_235>": 151983,
1212
+ "<action_236>": 151984,
1213
+ "<action_237>": 151985,
1214
+ "<action_238>": 151986,
1215
+ "<action_239>": 151987,
1216
+ "<action_23>": 151771,
1217
+ "<action_240>": 151988,
1218
+ "<action_241>": 151989,
1219
+ "<action_242>": 151990,
1220
+ "<action_243>": 151991,
1221
+ "<action_244>": 151992,
1222
+ "<action_245>": 151993,
1223
+ "<action_246>": 151994,
1224
+ "<action_247>": 151995,
1225
+ "<action_248>": 151996,
1226
+ "<action_249>": 151997,
1227
+ "<action_24>": 151772,
1228
+ "<action_250>": 151998,
1229
+ "<action_251>": 151999,
1230
+ "<action_252>": 152000,
1231
+ "<action_253>": 152001,
1232
+ "<action_254>": 152002,
1233
+ "<action_255>": 152003,
1234
+ "<action_256>": 152004,
1235
+ "<action_257>": 152005,
1236
+ "<action_258>": 152006,
1237
+ "<action_259>": 152007,
1238
+ "<action_25>": 151773,
1239
+ "<action_260>": 152008,
1240
+ "<action_261>": 152009,
1241
+ "<action_262>": 152010,
1242
+ "<action_263>": 152011,
1243
+ "<action_264>": 152012,
1244
+ "<action_265>": 152013,
1245
+ "<action_266>": 152014,
1246
+ "<action_267>": 152015,
1247
+ "<action_268>": 152016,
1248
+ "<action_269>": 152017,
1249
+ "<action_26>": 151774,
1250
+ "<action_270>": 152018,
1251
+ "<action_271>": 152019,
1252
+ "<action_272>": 152020,
1253
+ "<action_273>": 152021,
1254
+ "<action_274>": 152022,
1255
+ "<action_275>": 152023,
1256
+ "<action_276>": 152024,
1257
+ "<action_277>": 152025,
1258
+ "<action_278>": 152026,
1259
+ "<action_279>": 152027,
1260
+ "<action_27>": 151775,
1261
+ "<action_280>": 152028,
1262
+ "<action_281>": 152029,
1263
+ "<action_282>": 152030,
1264
+ "<action_283>": 152031,
1265
+ "<action_284>": 152032,
1266
+ "<action_285>": 152033,
1267
+ "<action_286>": 152034,
1268
+ "<action_287>": 152035,
1269
+ "<action_288>": 152036,
1270
+ "<action_289>": 152037,
1271
+ "<action_28>": 151776,
1272
+ "<action_290>": 152038,
1273
+ "<action_291>": 152039,
1274
+ "<action_292>": 152040,
1275
+ "<action_293>": 152041,
1276
+ "<action_294>": 152042,
1277
+ "<action_295>": 152043,
1278
+ "<action_296>": 152044,
1279
+ "<action_297>": 152045,
1280
+ "<action_298>": 152046,
1281
+ "<action_299>": 152047,
1282
+ "<action_29>": 151777,
1283
+ "<action_2>": 151750,
1284
+ "<action_300>": 152048,
1285
+ "<action_301>": 152049,
1286
+ "<action_302>": 152050,
1287
+ "<action_303>": 152051,
1288
+ "<action_304>": 152052,
1289
+ "<action_305>": 152053,
1290
+ "<action_306>": 152054,
1291
+ "<action_307>": 152055,
1292
+ "<action_308>": 152056,
1293
+ "<action_309>": 152057,
1294
+ "<action_30>": 151778,
1295
+ "<action_310>": 152058,
1296
+ "<action_311>": 152059,
1297
+ "<action_312>": 152060,
1298
+ "<action_313>": 152061,
1299
+ "<action_314>": 152062,
1300
+ "<action_315>": 152063,
1301
+ "<action_316>": 152064,
1302
+ "<action_317>": 152065,
1303
+ "<action_318>": 152066,
1304
+ "<action_319>": 152067,
1305
+ "<action_31>": 151779,
1306
+ "<action_320>": 152068,
1307
+ "<action_321>": 152069,
1308
+ "<action_322>": 152070,
1309
+ "<action_323>": 152071,
1310
+ "<action_324>": 152072,
1311
+ "<action_325>": 152073,
1312
+ "<action_326>": 152074,
1313
+ "<action_327>": 152075,
1314
+ "<action_328>": 152076,
1315
+ "<action_329>": 152077,
1316
+ "<action_32>": 151780,
1317
+ "<action_330>": 152078,
1318
+ "<action_331>": 152079,
1319
+ "<action_332>": 152080,
1320
+ "<action_333>": 152081,
1321
+ "<action_334>": 152082,
1322
+ "<action_335>": 152083,
1323
+ "<action_336>": 152084,
1324
+ "<action_337>": 152085,
1325
+ "<action_338>": 152086,
1326
+ "<action_339>": 152087,
1327
+ "<action_33>": 151781,
1328
+ "<action_340>": 152088,
1329
+ "<action_341>": 152089,
1330
+ "<action_342>": 152090,
1331
+ "<action_343>": 152091,
1332
+ "<action_344>": 152092,
1333
+ "<action_345>": 152093,
1334
+ "<action_346>": 152094,
1335
+ "<action_347>": 152095,
1336
+ "<action_348>": 152096,
1337
+ "<action_349>": 152097,
1338
+ "<action_34>": 151782,
1339
+ "<action_350>": 152098,
1340
+ "<action_351>": 152099,
1341
+ "<action_352>": 152100,
1342
+ "<action_353>": 152101,
1343
+ "<action_354>": 152102,
1344
+ "<action_355>": 152103,
1345
+ "<action_356>": 152104,
1346
+ "<action_357>": 152105,
1347
+ "<action_358>": 152106,
1348
+ "<action_359>": 152107,
1349
+ "<action_35>": 151783,
1350
+ "<action_360>": 152108,
1351
+ "<action_361>": 152109,
1352
+ "<action_362>": 152110,
1353
+ "<action_363>": 152111,
1354
+ "<action_364>": 152112,
1355
+ "<action_365>": 152113,
1356
+ "<action_366>": 152114,
1357
+ "<action_367>": 152115,
1358
+ "<action_368>": 152116,
1359
+ "<action_369>": 152117,
1360
+ "<action_36>": 151784,
1361
+ "<action_370>": 152118,
1362
+ "<action_371>": 152119,
1363
+ "<action_372>": 152120,
1364
+ "<action_373>": 152121,
1365
+ "<action_374>": 152122,
1366
+ "<action_375>": 152123,
1367
+ "<action_376>": 152124,
1368
+ "<action_377>": 152125,
1369
+ "<action_378>": 152126,
1370
+ "<action_379>": 152127,
1371
+ "<action_37>": 151785,
1372
+ "<action_380>": 152128,
1373
+ "<action_381>": 152129,
1374
+ "<action_382>": 152130,
1375
+ "<action_383>": 152131,
1376
+ "<action_384>": 152132,
1377
+ "<action_385>": 152133,
1378
+ "<action_386>": 152134,
1379
+ "<action_387>": 152135,
1380
+ "<action_388>": 152136,
1381
+ "<action_389>": 152137,
1382
+ "<action_38>": 151786,
1383
+ "<action_390>": 152138,
1384
+ "<action_391>": 152139,
1385
+ "<action_392>": 152140,
1386
+ "<action_393>": 152141,
1387
+ "<action_394>": 152142,
1388
+ "<action_395>": 152143,
1389
+ "<action_396>": 152144,
1390
+ "<action_397>": 152145,
1391
+ "<action_398>": 152146,
1392
+ "<action_399>": 152147,
1393
+ "<action_39>": 151787,
1394
+ "<action_3>": 151751,
1395
+ "<action_400>": 152148,
1396
+ "<action_401>": 152149,
1397
+ "<action_402>": 152150,
1398
+ "<action_403>": 152151,
1399
+ "<action_404>": 152152,
1400
+ "<action_405>": 152153,
1401
+ "<action_406>": 152154,
1402
+ "<action_407>": 152155,
1403
+ "<action_408>": 152156,
1404
+ "<action_409>": 152157,
1405
+ "<action_40>": 151788,
1406
+ "<action_410>": 152158,
1407
+ "<action_411>": 152159,
1408
+ "<action_412>": 152160,
1409
+ "<action_413>": 152161,
1410
+ "<action_414>": 152162,
1411
+ "<action_415>": 152163,
1412
+ "<action_416>": 152164,
1413
+ "<action_417>": 152165,
1414
+ "<action_418>": 152166,
1415
+ "<action_419>": 152167,
1416
+ "<action_41>": 151789,
1417
+ "<action_420>": 152168,
1418
+ "<action_421>": 152169,
1419
+ "<action_422>": 152170,
1420
+ "<action_423>": 152171,
1421
+ "<action_424>": 152172,
1422
+ "<action_425>": 152173,
1423
+ "<action_426>": 152174,
1424
+ "<action_427>": 152175,
1425
+ "<action_428>": 152176,
1426
+ "<action_429>": 152177,
1427
+ "<action_42>": 151790,
1428
+ "<action_430>": 152178,
1429
+ "<action_431>": 152179,
1430
+ "<action_432>": 152180,
1431
+ "<action_433>": 152181,
1432
+ "<action_434>": 152182,
1433
+ "<action_435>": 152183,
1434
+ "<action_436>": 152184,
1435
+ "<action_437>": 152185,
1436
+ "<action_438>": 152186,
1437
+ "<action_439>": 152187,
1438
+ "<action_43>": 151791,
1439
+ "<action_440>": 152188,
1440
+ "<action_441>": 152189,
1441
+ "<action_442>": 152190,
1442
+ "<action_443>": 152191,
1443
+ "<action_444>": 152192,
1444
+ "<action_445>": 152193,
1445
+ "<action_446>": 152194,
1446
+ "<action_447>": 152195,
1447
+ "<action_448>": 152196,
1448
+ "<action_449>": 152197,
1449
+ "<action_44>": 151792,
1450
+ "<action_450>": 152198,
1451
+ "<action_451>": 152199,
1452
+ "<action_452>": 152200,
1453
+ "<action_453>": 152201,
1454
+ "<action_454>": 152202,
1455
+ "<action_455>": 152203,
1456
+ "<action_456>": 152204,
1457
+ "<action_457>": 152205,
1458
+ "<action_458>": 152206,
1459
+ "<action_459>": 152207,
1460
+ "<action_45>": 151793,
1461
+ "<action_460>": 152208,
1462
+ "<action_461>": 152209,
1463
+ "<action_462>": 152210,
1464
+ "<action_463>": 152211,
1465
+ "<action_464>": 152212,
1466
+ "<action_465>": 152213,
1467
+ "<action_466>": 152214,
1468
+ "<action_467>": 152215,
1469
+ "<action_468>": 152216,
1470
+ "<action_469>": 152217,
1471
+ "<action_46>": 151794,
1472
+ "<action_470>": 152218,
1473
+ "<action_471>": 152219,
1474
+ "<action_472>": 152220,
1475
+ "<action_473>": 152221,
1476
+ "<action_474>": 152222,
1477
+ "<action_475>": 152223,
1478
+ "<action_476>": 152224,
1479
+ "<action_477>": 152225,
1480
+ "<action_478>": 152226,
1481
+ "<action_479>": 152227,
1482
+ "<action_47>": 151795,
1483
+ "<action_480>": 152228,
1484
+ "<action_481>": 152229,
1485
+ "<action_482>": 152230,
1486
+ "<action_483>": 152231,
1487
+ "<action_484>": 152232,
1488
+ "<action_485>": 152233,
1489
+ "<action_486>": 152234,
1490
+ "<action_487>": 152235,
1491
+ "<action_488>": 152236,
1492
+ "<action_489>": 152237,
1493
+ "<action_48>": 151796,
1494
+ "<action_490>": 152238,
1495
+ "<action_491>": 152239,
1496
+ "<action_492>": 152240,
1497
+ "<action_493>": 152241,
1498
+ "<action_494>": 152242,
1499
+ "<action_495>": 152243,
1500
+ "<action_496>": 152244,
1501
+ "<action_497>": 152245,
1502
+ "<action_498>": 152246,
1503
+ "<action_499>": 152247,
1504
+ "<action_49>": 151797,
1505
+ "<action_4>": 151752,
1506
+ "<action_500>": 152248,
1507
+ "<action_501>": 152249,
1508
+ "<action_502>": 152250,
1509
+ "<action_503>": 152251,
1510
+ "<action_504>": 152252,
1511
+ "<action_505>": 152253,
1512
+ "<action_506>": 152254,
1513
+ "<action_507>": 152255,
1514
+ "<action_508>": 152256,
1515
+ "<action_509>": 152257,
1516
+ "<action_50>": 151798,
1517
+ "<action_510>": 152258,
1518
+ "<action_511>": 152259,
1519
+ "<action_512>": 152260,
1520
+ "<action_513>": 152261,
1521
+ "<action_514>": 152262,
1522
+ "<action_515>": 152263,
1523
+ "<action_516>": 152264,
1524
+ "<action_517>": 152265,
1525
+ "<action_518>": 152266,
1526
+ "<action_519>": 152267,
1527
+ "<action_51>": 151799,
1528
+ "<action_520>": 152268,
1529
+ "<action_521>": 152269,
1530
+ "<action_522>": 152270,
1531
+ "<action_523>": 152271,
1532
+ "<action_524>": 152272,
1533
+ "<action_525>": 152273,
1534
+ "<action_526>": 152274,
1535
+ "<action_527>": 152275,
1536
+ "<action_528>": 152276,
1537
+ "<action_529>": 152277,
1538
+ "<action_52>": 151800,
1539
+ "<action_530>": 152278,
1540
+ "<action_531>": 152279,
1541
+ "<action_532>": 152280,
1542
+ "<action_533>": 152281,
1543
+ "<action_534>": 152282,
1544
+ "<action_535>": 152283,
1545
+ "<action_536>": 152284,
1546
+ "<action_537>": 152285,
1547
+ "<action_538>": 152286,
1548
+ "<action_539>": 152287,
1549
+ "<action_53>": 151801,
1550
+ "<action_540>": 152288,
1551
+ "<action_541>": 152289,
1552
+ "<action_542>": 152290,
1553
+ "<action_543>": 152291,
1554
+ "<action_544>": 152292,
1555
+ "<action_545>": 152293,
1556
+ "<action_546>": 152294,
1557
+ "<action_547>": 152295,
1558
+ "<action_548>": 152296,
1559
+ "<action_549>": 152297,
1560
+ "<action_54>": 151802,
1561
+ "<action_550>": 152298,
1562
+ "<action_551>": 152299,
1563
+ "<action_552>": 152300,
1564
+ "<action_553>": 152301,
1565
+ "<action_554>": 152302,
1566
+ "<action_555>": 152303,
1567
+ "<action_556>": 152304,
1568
+ "<action_557>": 152305,
1569
+ "<action_558>": 152306,
1570
+ "<action_559>": 152307,
1571
+ "<action_55>": 151803,
1572
+ "<action_560>": 152308,
1573
+ "<action_561>": 152309,
1574
+ "<action_562>": 152310,
1575
+ "<action_563>": 152311,
1576
+ "<action_564>": 152312,
1577
+ "<action_565>": 152313,
1578
+ "<action_566>": 152314,
1579
+ "<action_567>": 152315,
1580
+ "<action_568>": 152316,
1581
+ "<action_569>": 152317,
1582
+ "<action_56>": 151804,
1583
+ "<action_570>": 152318,
1584
+ "<action_571>": 152319,
1585
+ "<action_572>": 152320,
1586
+ "<action_573>": 152321,
1587
+ "<action_574>": 152322,
1588
+ "<action_575>": 152323,
1589
+ "<action_576>": 152324,
1590
+ "<action_577>": 152325,
1591
+ "<action_578>": 152326,
1592
+ "<action_579>": 152327,
1593
+ "<action_57>": 151805,
1594
+ "<action_580>": 152328,
1595
+ "<action_581>": 152329,
1596
+ "<action_582>": 152330,
1597
+ "<action_583>": 152331,
1598
+ "<action_584>": 152332,
1599
+ "<action_585>": 152333,
1600
+ "<action_586>": 152334,
1601
+ "<action_587>": 152335,
1602
+ "<action_588>": 152336,
1603
+ "<action_589>": 152337,
1604
+ "<action_58>": 151806,
1605
+ "<action_590>": 152338,
1606
+ "<action_591>": 152339,
1607
+ "<action_592>": 152340,
1608
+ "<action_593>": 152341,
1609
+ "<action_594>": 152342,
1610
+ "<action_595>": 152343,
1611
+ "<action_596>": 152344,
1612
+ "<action_597>": 152345,
1613
+ "<action_598>": 152346,
1614
+ "<action_599>": 152347,
1615
+ "<action_59>": 151807,
1616
+ "<action_5>": 151753,
1617
+ "<action_600>": 152348,
1618
+ "<action_601>": 152349,
1619
+ "<action_602>": 152350,
1620
+ "<action_603>": 152351,
1621
+ "<action_604>": 152352,
1622
+ "<action_605>": 152353,
1623
+ "<action_606>": 152354,
1624
+ "<action_607>": 152355,
1625
+ "<action_608>": 152356,
1626
+ "<action_609>": 152357,
1627
+ "<action_60>": 151808,
1628
+ "<action_610>": 152358,
1629
+ "<action_611>": 152359,
1630
+ "<action_612>": 152360,
1631
+ "<action_613>": 152361,
1632
+ "<action_614>": 152362,
1633
+ "<action_615>": 152363,
1634
+ "<action_616>": 152364,
1635
+ "<action_617>": 152365,
1636
+ "<action_618>": 152366,
1637
+ "<action_619>": 152367,
1638
+ "<action_61>": 151809,
1639
+ "<action_620>": 152368,
1640
+ "<action_621>": 152369,
1641
+ "<action_622>": 152370,
1642
+ "<action_623>": 152371,
1643
+ "<action_624>": 152372,
1644
+ "<action_625>": 152373,
1645
+ "<action_626>": 152374,
1646
+ "<action_627>": 152375,
1647
+ "<action_628>": 152376,
1648
+ "<action_629>": 152377,
1649
+ "<action_62>": 151810,
1650
+ "<action_630>": 152378,
1651
+ "<action_631>": 152379,
1652
+ "<action_632>": 152380,
1653
+ "<action_633>": 152381,
1654
+ "<action_634>": 152382,
1655
+ "<action_635>": 152383,
1656
+ "<action_636>": 152384,
1657
+ "<action_637>": 152385,
1658
+ "<action_638>": 152386,
1659
+ "<action_639>": 152387,
1660
+ "<action_63>": 151811,
1661
+ "<action_640>": 152388,
1662
+ "<action_641>": 152389,
1663
+ "<action_642>": 152390,
1664
+ "<action_643>": 152391,
1665
+ "<action_644>": 152392,
1666
+ "<action_645>": 152393,
1667
+ "<action_646>": 152394,
1668
+ "<action_647>": 152395,
1669
+ "<action_648>": 152396,
1670
+ "<action_649>": 152397,
1671
+ "<action_64>": 151812,
1672
+ "<action_650>": 152398,
1673
+ "<action_651>": 152399,
1674
+ "<action_652>": 152400,
1675
+ "<action_653>": 152401,
1676
+ "<action_654>": 152402,
1677
+ "<action_655>": 152403,
1678
+ "<action_656>": 152404,
1679
+ "<action_657>": 152405,
1680
+ "<action_658>": 152406,
1681
+ "<action_659>": 152407,
1682
+ "<action_65>": 151813,
1683
+ "<action_660>": 152408,
1684
+ "<action_661>": 152409,
1685
+ "<action_662>": 152410,
1686
+ "<action_663>": 152411,
1687
+ "<action_664>": 152412,
1688
+ "<action_665>": 152413,
1689
+ "<action_666>": 152414,
1690
+ "<action_667>": 152415,
1691
+ "<action_668>": 152416,
1692
+ "<action_669>": 152417,
1693
+ "<action_66>": 151814,
1694
+ "<action_670>": 152418,
1695
+ "<action_671>": 152419,
1696
+ "<action_672>": 152420,
1697
+ "<action_673>": 152421,
1698
+ "<action_674>": 152422,
1699
+ "<action_675>": 152423,
1700
+ "<action_676>": 152424,
1701
+ "<action_677>": 152425,
1702
+ "<action_678>": 152426,
1703
+ "<action_679>": 152427,
1704
+ "<action_67>": 151815,
1705
+ "<action_680>": 152428,
1706
+ "<action_681>": 152429,
1707
+ "<action_682>": 152430,
1708
+ "<action_683>": 152431,
1709
+ "<action_684>": 152432,
1710
+ "<action_685>": 152433,
1711
+ "<action_686>": 152434,
1712
+ "<action_687>": 152435,
1713
+ "<action_688>": 152436,
1714
+ "<action_689>": 152437,
1715
+ "<action_68>": 151816,
1716
+ "<action_690>": 152438,
1717
+ "<action_691>": 152439,
1718
+ "<action_692>": 152440,
1719
+ "<action_693>": 152441,
1720
+ "<action_694>": 152442,
1721
+ "<action_695>": 152443,
1722
+ "<action_696>": 152444,
1723
+ "<action_697>": 152445,
1724
+ "<action_698>": 152446,
1725
+ "<action_699>": 152447,
1726
+ "<action_69>": 151817,
1727
+ "<action_6>": 151754,
1728
+ "<action_700>": 152448,
1729
+ "<action_701>": 152449,
1730
+ "<action_702>": 152450,
1731
+ "<action_703>": 152451,
1732
+ "<action_704>": 152452,
1733
+ "<action_705>": 152453,
1734
+ "<action_706>": 152454,
1735
+ "<action_707>": 152455,
1736
+ "<action_708>": 152456,
1737
+ "<action_709>": 152457,
1738
+ "<action_70>": 151818,
1739
+ "<action_710>": 152458,
1740
+ "<action_711>": 152459,
1741
+ "<action_712>": 152460,
1742
+ "<action_713>": 152461,
1743
+ "<action_714>": 152462,
1744
+ "<action_715>": 152463,
1745
+ "<action_716>": 152464,
1746
+ "<action_717>": 152465,
1747
+ "<action_718>": 152466,
1748
+ "<action_719>": 152467,
1749
+ "<action_71>": 151819,
1750
+ "<action_720>": 152468,
1751
+ "<action_721>": 152469,
1752
+ "<action_722>": 152470,
1753
+ "<action_723>": 152471,
1754
+ "<action_724>": 152472,
1755
+ "<action_725>": 152473,
1756
+ "<action_726>": 152474,
1757
+ "<action_727>": 152475,
1758
+ "<action_728>": 152476,
1759
+ "<action_729>": 152477,
1760
+ "<action_72>": 151820,
1761
+ "<action_730>": 152478,
1762
+ "<action_731>": 152479,
1763
+ "<action_732>": 152480,
1764
+ "<action_733>": 152481,
1765
+ "<action_734>": 152482,
1766
+ "<action_735>": 152483,
1767
+ "<action_736>": 152484,
1768
+ "<action_737>": 152485,
1769
+ "<action_738>": 152486,
1770
+ "<action_739>": 152487,
1771
+ "<action_73>": 151821,
1772
+ "<action_740>": 152488,
1773
+ "<action_741>": 152489,
1774
+ "<action_742>": 152490,
1775
+ "<action_743>": 152491,
1776
+ "<action_744>": 152492,
1777
+ "<action_745>": 152493,
1778
+ "<action_746>": 152494,
1779
+ "<action_747>": 152495,
1780
+ "<action_748>": 152496,
1781
+ "<action_749>": 152497,
1782
+ "<action_74>": 151822,
1783
+ "<action_750>": 152498,
1784
+ "<action_751>": 152499,
1785
+ "<action_752>": 152500,
1786
+ "<action_753>": 152501,
1787
+ "<action_754>": 152502,
1788
+ "<action_755>": 152503,
1789
+ "<action_756>": 152504,
1790
+ "<action_757>": 152505,
1791
+ "<action_758>": 152506,
1792
+ "<action_759>": 152507,
1793
+ "<action_75>": 151823,
1794
+ "<action_760>": 152508,
1795
+ "<action_761>": 152509,
1796
+ "<action_762>": 152510,
1797
+ "<action_763>": 152511,
1798
+ "<action_764>": 152512,
1799
+ "<action_765>": 152513,
1800
+ "<action_766>": 152514,
1801
+ "<action_767>": 152515,
1802
+ "<action_768>": 152516,
1803
+ "<action_769>": 152517,
1804
+ "<action_76>": 151824,
1805
+ "<action_770>": 152518,
1806
+ "<action_771>": 152519,
1807
+ "<action_772>": 152520,
1808
+ "<action_773>": 152521,
1809
+ "<action_774>": 152522,
1810
+ "<action_775>": 152523,
1811
+ "<action_776>": 152524,
1812
+ "<action_777>": 152525,
1813
+ "<action_778>": 152526,
1814
+ "<action_779>": 152527,
1815
+ "<action_77>": 151825,
1816
+ "<action_780>": 152528,
1817
+ "<action_781>": 152529,
1818
+ "<action_782>": 152530,
1819
+ "<action_783>": 152531,
1820
+ "<action_784>": 152532,
1821
+ "<action_785>": 152533,
1822
+ "<action_786>": 152534,
1823
+ "<action_787>": 152535,
1824
+ "<action_788>": 152536,
1825
+ "<action_789>": 152537,
1826
+ "<action_78>": 151826,
1827
+ "<action_790>": 152538,
1828
+ "<action_791>": 152539,
1829
+ "<action_792>": 152540,
1830
+ "<action_793>": 152541,
1831
+ "<action_794>": 152542,
1832
+ "<action_795>": 152543,
1833
+ "<action_796>": 152544,
1834
+ "<action_797>": 152545,
1835
+ "<action_798>": 152546,
1836
+ "<action_799>": 152547,
1837
+ "<action_79>": 151827,
1838
+ "<action_7>": 151755,
1839
+ "<action_800>": 152548,
1840
+ "<action_801>": 152549,
1841
+ "<action_802>": 152550,
1842
+ "<action_803>": 152551,
1843
+ "<action_804>": 152552,
1844
+ "<action_805>": 152553,
1845
+ "<action_806>": 152554,
1846
+ "<action_807>": 152555,
1847
+ "<action_808>": 152556,
1848
+ "<action_809>": 152557,
1849
+ "<action_80>": 151828,
1850
+ "<action_810>": 152558,
1851
+ "<action_811>": 152559,
1852
+ "<action_812>": 152560,
1853
+ "<action_813>": 152561,
1854
+ "<action_814>": 152562,
1855
+ "<action_815>": 152563,
1856
+ "<action_816>": 152564,
1857
+ "<action_817>": 152565,
1858
+ "<action_818>": 152566,
1859
+ "<action_819>": 152567,
1860
+ "<action_81>": 151829,
1861
+ "<action_820>": 152568,
1862
+ "<action_821>": 152569,
1863
+ "<action_822>": 152570,
1864
+ "<action_823>": 152571,
1865
+ "<action_824>": 152572,
1866
+ "<action_825>": 152573,
1867
+ "<action_826>": 152574,
1868
+ "<action_827>": 152575,
1869
+ "<action_828>": 152576,
1870
+ "<action_829>": 152577,
1871
+ "<action_82>": 151830,
1872
+ "<action_830>": 152578,
1873
+ "<action_831>": 152579,
1874
+ "<action_832>": 152580,
1875
+ "<action_833>": 152581,
1876
+ "<action_834>": 152582,
1877
+ "<action_835>": 152583,
1878
+ "<action_836>": 152584,
1879
+ "<action_837>": 152585,
1880
+ "<action_838>": 152586,
1881
+ "<action_839>": 152587,
1882
+ "<action_83>": 151831,
1883
+ "<action_840>": 152588,
1884
+ "<action_841>": 152589,
1885
+ "<action_842>": 152590,
1886
+ "<action_843>": 152591,
1887
+ "<action_844>": 152592,
1888
+ "<action_845>": 152593,
1889
+ "<action_846>": 152594,
1890
+ "<action_847>": 152595,
1891
+ "<action_848>": 152596,
1892
+ "<action_849>": 152597,
1893
+ "<action_84>": 151832,
1894
+ "<action_850>": 152598,
1895
+ "<action_851>": 152599,
1896
+ "<action_852>": 152600,
1897
+ "<action_853>": 152601,
1898
+ "<action_854>": 152602,
1899
+ "<action_855>": 152603,
1900
+ "<action_856>": 152604,
1901
+ "<action_857>": 152605,
1902
+ "<action_858>": 152606,
1903
+ "<action_859>": 152607,
1904
+ "<action_85>": 151833,
1905
+ "<action_860>": 152608,
1906
+ "<action_861>": 152609,
1907
+ "<action_862>": 152610,
1908
+ "<action_863>": 152611,
1909
+ "<action_864>": 152612,
1910
+ "<action_865>": 152613,
1911
+ "<action_866>": 152614,
1912
+ "<action_867>": 152615,
1913
+ "<action_868>": 152616,
1914
+ "<action_869>": 152617,
1915
+ "<action_86>": 151834,
1916
+ "<action_870>": 152618,
1917
+ "<action_871>": 152619,
1918
+ "<action_872>": 152620,
1919
+ "<action_873>": 152621,
1920
+ "<action_874>": 152622,
1921
+ "<action_875>": 152623,
1922
+ "<action_876>": 152624,
1923
+ "<action_877>": 152625,
1924
+ "<action_878>": 152626,
1925
+ "<action_879>": 152627,
1926
+ "<action_87>": 151835,
1927
+ "<action_880>": 152628,
1928
+ "<action_881>": 152629,
1929
+ "<action_882>": 152630,
1930
+ "<action_883>": 152631,
1931
+ "<action_884>": 152632,
1932
+ "<action_885>": 152633,
1933
+ "<action_886>": 152634,
1934
+ "<action_887>": 152635,
1935
+ "<action_888>": 152636,
1936
+ "<action_889>": 152637,
1937
+ "<action_88>": 151836,
1938
+ "<action_890>": 152638,
1939
+ "<action_891>": 152639,
1940
+ "<action_892>": 152640,
1941
+ "<action_893>": 152641,
1942
+ "<action_894>": 152642,
1943
+ "<action_895>": 152643,
1944
+ "<action_896>": 152644,
1945
+ "<action_897>": 152645,
1946
+ "<action_898>": 152646,
1947
+ "<action_899>": 152647,
1948
+ "<action_89>": 151837,
1949
+ "<action_8>": 151756,
1950
+ "<action_900>": 152648,
1951
+ "<action_901>": 152649,
1952
+ "<action_902>": 152650,
1953
+ "<action_903>": 152651,
1954
+ "<action_904>": 152652,
1955
+ "<action_905>": 152653,
1956
+ "<action_906>": 152654,
1957
+ "<action_907>": 152655,
1958
+ "<action_908>": 152656,
1959
+ "<action_909>": 152657,
1960
+ "<action_90>": 151838,
1961
+ "<action_910>": 152658,
1962
+ "<action_911>": 152659,
1963
+ "<action_912>": 152660,
1964
+ "<action_913>": 152661,
1965
+ "<action_914>": 152662,
1966
+ "<action_915>": 152663,
1967
+ "<action_916>": 152664,
1968
+ "<action_917>": 152665,
1969
+ "<action_918>": 152666,
1970
+ "<action_919>": 152667,
1971
+ "<action_91>": 151839,
1972
+ "<action_920>": 152668,
1973
+ "<action_921>": 152669,
1974
+ "<action_922>": 152670,
1975
+ "<action_923>": 152671,
1976
+ "<action_924>": 152672,
1977
+ "<action_925>": 152673,
1978
+ "<action_926>": 152674,
1979
+ "<action_927>": 152675,
1980
+ "<action_928>": 152676,
1981
+ "<action_929>": 152677,
1982
+ "<action_92>": 151840,
1983
+ "<action_930>": 152678,
1984
+ "<action_931>": 152679,
1985
+ "<action_932>": 152680,
1986
+ "<action_933>": 152681,
1987
+ "<action_934>": 152682,
1988
+ "<action_935>": 152683,
1989
+ "<action_936>": 152684,
1990
+ "<action_937>": 152685,
1991
+ "<action_938>": 152686,
1992
+ "<action_939>": 152687,
1993
+ "<action_93>": 151841,
1994
+ "<action_940>": 152688,
1995
+ "<action_941>": 152689,
1996
+ "<action_942>": 152690,
1997
+ "<action_943>": 152691,
1998
+ "<action_944>": 152692,
1999
+ "<action_945>": 152693,
2000
+ "<action_946>": 152694,
2001
+ "<action_947>": 152695,
2002
+ "<action_948>": 152696,
2003
+ "<action_949>": 152697,
2004
+ "<action_94>": 151842,
2005
+ "<action_950>": 152698,
2006
+ "<action_951>": 152699,
2007
+ "<action_952>": 152700,
2008
+ "<action_953>": 152701,
2009
+ "<action_954>": 152702,
2010
+ "<action_955>": 152703,
2011
+ "<action_956>": 152704,
2012
+ "<action_957>": 152705,
2013
+ "<action_958>": 152706,
2014
+ "<action_959>": 152707,
2015
+ "<action_95>": 151843,
2016
+ "<action_960>": 152708,
2017
+ "<action_961>": 152709,
2018
+ "<action_962>": 152710,
2019
+ "<action_963>": 152711,
2020
+ "<action_964>": 152712,
2021
+ "<action_965>": 152713,
2022
+ "<action_966>": 152714,
2023
+ "<action_967>": 152715,
2024
+ "<action_968>": 152716,
2025
+ "<action_969>": 152717,
2026
+ "<action_96>": 151844,
2027
+ "<action_970>": 152718,
2028
+ "<action_971>": 152719,
2029
+ "<action_972>": 152720,
2030
+ "<action_973>": 152721,
2031
+ "<action_974>": 152722,
2032
+ "<action_975>": 152723,
2033
+ "<action_976>": 152724,
2034
+ "<action_977>": 152725,
2035
+ "<action_978>": 152726,
2036
+ "<action_979>": 152727,
2037
+ "<action_97>": 151845,
2038
+ "<action_980>": 152728,
2039
+ "<action_981>": 152729,
2040
+ "<action_982>": 152730,
2041
+ "<action_983>": 152731,
2042
+ "<action_984>": 152732,
2043
+ "<action_985>": 152733,
2044
+ "<action_986>": 152734,
2045
+ "<action_987>": 152735,
2046
+ "<action_988>": 152736,
2047
+ "<action_989>": 152737,
2048
+ "<action_98>": 151846,
2049
+ "<action_990>": 152738,
2050
+ "<action_991>": 152739,
2051
+ "<action_992>": 152740,
2052
+ "<action_993>": 152741,
2053
+ "<action_994>": 152742,
2054
+ "<action_995>": 152743,
2055
+ "<action_996>": 152744,
2056
+ "<action_997>": 152745,
2057
+ "<action_998>": 152746,
2058
+ "<action_999>": 152747,
2059
+ "<action_99>": 151847,
2060
+ "<action_9>": 151757,
2061
+ "<box>": 151673,
2062
+ "<image>": 151669,
2063
+ "<image_id>": 151681,
2064
+ "<point>": 151677,
2065
+ "<quad>": 151675,
2066
+ "<ref>": 151671,
2067
+ "<slice>": 151679,
2068
+ "<think>": 151667,
2069
+ "<tool_call>": 151657,
2070
+ "<tool_response>": 151665,
2071
+ "<unit>": 151683,
2072
+ "<|box_end|>": 151649,
2073
+ "<|box_start|>": 151648,
2074
+ "<|endoftext|>": 151643,
2075
+ "<|file_sep|>": 151664,
2076
+ "<|fim_middle|>": 151660,
2077
+ "<|fim_pad|>": 151662,
2078
+ "<|fim_prefix|>": 151659,
2079
+ "<|fim_suffix|>": 151661,
2080
+ "<|im_end|>": 151645,
2081
+ "<|im_start|>": 151644,
2082
+ "<|image_pad|>": 151655,
2083
+ "<|object_ref_end|>": 151647,
2084
+ "<|object_ref_start|>": 151646,
2085
+ "<|quad_end|>": 151651,
2086
+ "<|quad_start|>": 151650,
2087
+ "<|repo_name|>": 151663,
2088
+ "<|reserved_0|>": 151685,
2089
+ "<|reserved_10|>": 151695,
2090
+ "<|reserved_11|>": 151696,
2091
+ "<|reserved_12|>": 151697,
2092
+ "<|reserved_13|>": 151698,
2093
+ "<|reserved_14|>": 151699,
2094
+ "<|reserved_15|>": 151700,
2095
+ "<|reserved_16|>": 151701,
2096
+ "<|reserved_17|>": 151702,
2097
+ "<|reserved_18|>": 151703,
2098
+ "<|reserved_19|>": 151704,
2099
+ "<|reserved_1|>": 151686,
2100
+ "<|reserved_20|>": 151705,
2101
+ "<|reserved_21|>": 151706,
2102
+ "<|reserved_22|>": 151707,
2103
+ "<|reserved_23|>": 151708,
2104
+ "<|reserved_24|>": 151709,
2105
+ "<|reserved_25|>": 151710,
2106
+ "<|reserved_26|>": 151711,
2107
+ "<|reserved_27|>": 151712,
2108
+ "<|reserved_28|>": 151713,
2109
+ "<|reserved_29|>": 151714,
2110
+ "<|reserved_2|>": 151687,
2111
+ "<|reserved_30|>": 151715,
2112
+ "<|reserved_31|>": 151716,
2113
+ "<|reserved_32|>": 151717,
2114
+ "<|reserved_33|>": 151718,
2115
+ "<|reserved_34|>": 151719,
2116
+ "<|reserved_35|>": 151720,
2117
+ "<|reserved_36|>": 151721,
2118
+ "<|reserved_37|>": 151722,
2119
+ "<|reserved_38|>": 151723,
2120
+ "<|reserved_39|>": 151724,
2121
+ "<|reserved_3|>": 151688,
2122
+ "<|reserved_40|>": 151725,
2123
+ "<|reserved_41|>": 151726,
2124
+ "<|reserved_42|>": 151727,
2125
+ "<|reserved_43|>": 151728,
2126
+ "<|reserved_44|>": 151729,
2127
+ "<|reserved_45|>": 151730,
2128
+ "<|reserved_46|>": 151731,
2129
+ "<|reserved_47|>": 151732,
2130
+ "<|reserved_48|>": 151733,
2131
+ "<|reserved_49|>": 151734,
2132
+ "<|reserved_4|>": 151689,
2133
+ "<|reserved_50|>": 151735,
2134
+ "<|reserved_51|>": 151736,
2135
+ "<|reserved_52|>": 151737,
2136
+ "<|reserved_53|>": 151738,
2137
+ "<|reserved_54|>": 151739,
2138
+ "<|reserved_55|>": 151740,
2139
+ "<|reserved_56|>": 151741,
2140
+ "<|reserved_57|>": 151742,
2141
+ "<|reserved_58|>": 151743,
2142
+ "<|reserved_59|>": 151744,
2143
+ "<|reserved_5|>": 151690,
2144
+ "<|reserved_60|>": 151745,
2145
+ "<|reserved_61|>": 151746,
2146
+ "<|reserved_62|>": 151747,
2147
+ "<|reserved_6|>": 151691,
2148
+ "<|reserved_7|>": 151692,
2149
+ "<|reserved_8|>": 151693,
2150
+ "<|reserved_9|>": 151694,
2151
+ "<|video_pad|>": 151656,
2152
+ "<|vision_end|>": 151653,
2153
+ "<|vision_pad|>": 151654,
2154
+ "<|vision_start|>": 151652
2155
+ }
config.json ADDED
@@ -0,0 +1,60 @@
+ {
+   "architectures": [
+     "MiniCPMV"
+   ],
+   "attention_bias": false,
+   "attention_dropout": 0.0,
+   "batch_3d_resampler": true,
+   "batch_vision_input": true,
+   "bos_token_id": 151643,
+   "drop_vision_last_layer": false,
+   "eos_token_id": 151645,
+   "head_dim": 128,
+   "hidden_act": "silu",
+   "hidden_size": 4096,
+   "image_size": 448,
+   "initializer_range": 0.02,
+   "intermediate_size": 12288,
+   "max_position_embeddings": 40960,
+   "max_window_layers": 36,
+   "model_type": "minicpmv",
+   "num_attention_heads": 32,
+   "num_hidden_layers": 36,
+   "num_key_value_heads": 8,
+   "patch_size": 14,
+   "query_num": 64,
+   "rms_norm_eps": 1e-06,
+   "rope_scaling": null,
+   "rope_theta": 1000000,
+   "slice_config": {
+     "max_slice_nums": 9,
+     "model_type": "minicpmv",
+     "patch_size": 14,
+     "scale_resolution": 448
+   },
+   "slice_mode": true,
+   "sliding_window": null,
+   "tie_word_embeddings": false,
+   "torch_dtype": "float32",
+   "transformers_version": "4.51.2",
+   "use_cache": true,
+   "use_image_id": true,
+   "use_sliding_window": false,
+   "version": 4.5,
+   "vision_batch_size": 16,
+   "vision_config": {
+     "_attn_implementation_autoset": true,
+     "attention_dropout": 0.0,
+     "hidden_act": "gelu_pytorch_tanh",
+     "hidden_size": 1152,
+     "image_size": 980,
+     "intermediate_size": 4304,
+     "layer_norm_eps": 1e-06,
+     "model_type": "siglip_vision_model",
+     "num_attention_heads": 16,
+     "num_channels": 3,
+     "num_hidden_layers": 27,
+     "patch_size": 14
+   },
+   "vocab_size": 153796
+ }
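A few of the config values above constrain each other; the sketch below spot-checks that internal consistency. All numbers are copied from `config.json` and `added_tokens.json` in this commit (it does not read anything from the Hub).

```python
# Consistency checks on the MiniCPM-V config above (values copied by hand).
hidden_size = 4096
num_attention_heads = 32
head_dim = 128
scale_resolution = 448     # slice resolution from slice_config
patch_size = 14
query_num = 64
vocab_size = 153796
max_added_token_id = 152747  # largest id in added_tokens.json ("<action_999>")

# per-head width matches the explicit head_dim entry
assert hidden_size // num_attention_heads == head_dim

# a 448x448 slice tiles evenly into 14x14 patches: 32 x 32 = 1024 patches,
# which the resampler compresses to query_num (64) visual tokens
assert scale_resolution % patch_size == 0
patches_per_slice = (scale_resolution // patch_size) ** 2
print(patches_per_slice, patches_per_slice // query_num)  # prints "1024 16"

# every added special/action token id fits inside the declared vocab
assert max_added_token_id < vocab_size
```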
generation_config.json ADDED
@@ -0,0 +1,16 @@
+ {
+   "bos_token_id": 151643,
+   "chat_template_kwargs": {
+     "enable_thinking": false
+   },
+   "do_sample": true,
+   "eos_token_id": [
+     151645,
+     151643
+   ],
+   "pad_token_id": 151643,
+   "temperature": 0.6,
+   "top_k": 20,
+   "top_p": 0.95,
+   "transformers_version": "4.51.2"
+ }
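The sampling defaults above (`temperature` 0.6, `top_k` 20, `top_p` 0.95) compose in the usual order: temperature scaling, then top-k truncation, then nucleus (top-p) truncation. A minimal pure-Python sketch of that pipeline follows; it is an illustration of the settings, not the `transformers` logits-processor implementation.

```python
import math

def candidate_tokens(logits, temperature=0.6, top_k=20, top_p=0.95):
    """Return the token ids that survive temperature + top-k + top-p filtering.

    A sketch of the defaults in generation_config.json; sampling would then
    draw from the returned ids with their probabilities re-normalized.
    """
    # temperature scaling, then a numerically stable softmax
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    weights = [math.exp(l - m) for l in scaled]
    total = sum(weights)
    probs = [w / total for w in weights]
    # top-k: keep at most top_k ids, highest probability first
    order = sorted(range(len(probs)), key=lambda i: -probs[i])[:top_k]
    # top-p: stop once the cumulative probability reaches the nucleus mass
    kept, cumulative = [], 0.0
    for i in order:
        kept.append(i)
        cumulative += probs[i]
        if cumulative >= top_p:
            break
    return kept
```

With a sharply peaked distribution the nucleus cutoff fires after only a few ids, so the effective candidate set is usually much smaller than `top_k`.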
image_processing_minicpmv.py ADDED
@@ -0,0 +1,501 @@
+ from typing import Optional, Union, Dict, Any, List
+ from itertools import chain
+
+ import torch
+ import math
+ import PIL.Image
+ import PIL.ImageSequence
+ import numpy as np
+ import PIL
+ from PIL import Image
+
+ from transformers.utils import TensorType, requires_backends, is_torch_dtype, is_torch_device
+ from transformers.image_processing_utils import BaseImageProcessor, BatchFeature
+ from transformers import AutoImageProcessor
+ from transformers.image_transforms import to_channel_dimension_format
+ from transformers.image_utils import (
+     ImageInput,
+     make_list_of_images,
+     valid_images,
+     is_torch_tensor,
+     is_batched,
+     to_numpy_array,
+     infer_channel_dimension_format,
+     ChannelDimension
+ )
+
+
+ def recursive_converter(converter, value):
+     if isinstance(value, list):
+         new_value = []
+         for v in value:
+             new_value += [recursive_converter(converter, v)]
+         return new_value
+     else:
+         return converter(value)
+
+ def list_depth(lst):
+     if not isinstance(lst, list) and not isinstance(lst, np.ndarray):
+         return 0
+     # if not lst:  # empty list
+     #     return 1
+     return 1 + max(list_depth(item) for item in lst)
+
+ class MiniCPMVBatchFeature(BatchFeature):
+     r"""
+     Extends BatchFeature to support variable image sizes.
+     """
+     def __init__(self, data: Optional[Dict[str, Any]] = None, tensor_type: Union[None, str, TensorType] = None):
+         super().__init__(data)
+         self.convert_to_tensors(tensor_type=tensor_type)
+
+     def convert_to_tensors(self, tensor_type: Optional[Union[str, TensorType]] = None):
+         if tensor_type is None:
+             return self
+
+         is_tensor, as_tensor = self._get_is_as_tensor_fns(tensor_type)
+
+         def converter(value):
+             try:
+                 if not is_tensor(value):
+                     tensor = as_tensor(value)
+                     return tensor
+             except:  # noqa E722
+                 # `key` is the loop variable below; converter is only called inside that loop
+                 if key == "overflowing_values":
+                     raise ValueError("Unable to create tensor returning overflowing values of different lengths. ")
+                 raise ValueError(
+                     "Unable to create tensor, you should probably activate padding "
+                     "with 'padding=True' to have batched tensors with the same length."
+                 )
+
+         for key, value in self.items():
+             self[key] = recursive_converter(converter, value)
+         return self
+
+     def to(self, *args, **kwargs) -> "MiniCPMVBatchFeature":
+         requires_backends(self, ["torch"])
+         import torch
+
+         def cast_tensor(v):
+             # check if v is a floating point
+             if torch.is_floating_point(v):
+                 # cast and send to device
+                 return v.to(*args, **kwargs)
+             elif device is not None:
+                 return v.to(device=device)
+             else:
+                 return v
+
+         new_data = {}
+         device = kwargs.get("device")
+         # Check if the args are a device or a dtype
+         if device is None and len(args) > 0:
+             # device should be always the first argument
+             arg = args[0]
+             if is_torch_dtype(arg):
+                 # The first argument is a dtype
+                 pass
+             elif isinstance(arg, str) or is_torch_device(arg) or isinstance(arg, int):
+                 device = arg
+             else:
+                 # it's something else
+                 raise ValueError(f"Attempting to cast a BatchFeature to type {str(arg)}. This is not supported.")
+         # We cast only floating point tensors to avoid issues with tokenizers casting `LongTensor` to `FloatTensor`
+         for k, v in self.items():
+             new_data[k] = recursive_converter(cast_tensor, v)
+         self.data = new_data
+         return self
+
+
+ class MiniCPMVImageProcessor(BaseImageProcessor):
+     model_input_names = ["pixel_values"]
+
+     def __init__(
+             self,
+             max_slice_nums=9,
+             scale_resolution=448,
+             patch_size=14,
+             **kwargs):
+         super().__init__(**kwargs)
+         self.max_slice_nums = max_slice_nums
+         self.scale_resolution = scale_resolution
+         self.patch_size = patch_size
+         self.use_image_id = kwargs.pop("use_image_id", False)
+         self.image_feature_size = kwargs.pop("image_feature_size", 64)
+         self.im_start_token = kwargs.pop("im_start", "<image>")
+         self.im_end_token = kwargs.pop("im_end", "</image>")
+         self.slice_start_token = kwargs.pop("slice_start", "<slice>")
+         self.slice_end_token = kwargs.pop("slice_end", "</slice>")
+         self.unk_token = kwargs.pop("unk", "<unk>")
+         self.im_id_start = kwargs.pop("im_id_start", "<image_id>")
+         self.im_id_end = kwargs.pop("im_id_end", "</image_id>")
+         self.slice_mode = kwargs.pop("slice_mode", True)
+         self.mean = np.array(kwargs.pop("norm_mean", [0.5, 0.5, 0.5]))
+         self.std = np.array(kwargs.pop("norm_std", [0.5, 0.5, 0.5]))
+         self.version = kwargs.pop("version", 2.0)
+
+     def ensure_divide(self, length, patch_size):
+         return max(round(length / patch_size) * patch_size, patch_size)
+
+     def find_best_resize(self,
+                          original_size,
+                          scale_resolution,
+                          patch_size,
+                          allow_upscale=False):
+         width, height = original_size
+         if (width * height >
+                 scale_resolution * scale_resolution) or allow_upscale:
+             r = width / height
+             height = int(scale_resolution / math.sqrt(r))
+             width = int(height * r)
+         best_width = self.ensure_divide(width, patch_size)
+         best_height = self.ensure_divide(height, patch_size)
+         return (best_width, best_height)
+
+     def get_refine_size(self,
+                         original_size,
+                         grid,
+                         scale_resolution,
+                         patch_size,
+                         allow_upscale=False):
+         width, height = original_size
+         grid_x, grid_y = grid
+
+         refine_width = self.ensure_divide(width, grid_x)
+         refine_height = self.ensure_divide(height, grid_y)
+
+         grid_width = refine_width / grid_x
+         grid_height = refine_height / grid_y
+
+         best_grid_size = self.find_best_resize((grid_width, grid_height),
+                                                scale_resolution,
+                                                patch_size,
+                                                allow_upscale=allow_upscale)
+         refine_size = (best_grid_size[0] * grid_x, best_grid_size[1] * grid_y)
+         return refine_size
+
+     def split_to_patches(self, image, grid):
+         patches = []
+         width, height = image.size
+         grid_x = int(width / grid[0])
+         grid_y = int(height / grid[1])
+         for i in range(0, height, grid_y):
+             images = []
+             for j in range(0, width, grid_x):
+                 box = (j, i, j + grid_x, i + grid_y)
+                 patch = image.crop(box)
+                 images.append(patch)
+             patches.append(images)
+         return patches
+
+     def slice_image(
+         self, image, max_slice_nums=9, scale_resolution=448, patch_size=14, never_split=False
+     ):
+         original_size = image.size
+         source_image = None
+         best_grid = self.get_sliced_grid(original_size, max_slice_nums, never_split)
+         patches = []
+
+         if best_grid is None:
+             # no need to slice, upsample
+             best_size = self.find_best_resize(
+                 original_size, scale_resolution, patch_size, allow_upscale=True
+             )
+             source_image = image.resize(best_size, resample=Image.Resampling.BICUBIC)
+         else:
+             # source image, down-sampling and ensure divided by patch_size
+             best_resize = self.find_best_resize(original_size, scale_resolution, patch_size)
+             source_image = image.copy().resize(best_resize, resample=Image.Resampling.BICUBIC)
+             refine_size = self.get_refine_size(
+                 original_size, best_grid, scale_resolution, patch_size, allow_upscale=True
+             )
+             refine_image = image.resize(refine_size, resample=Image.Resampling.BICUBIC)
+             patches = self.split_to_patches(refine_image, best_grid)
+
+         return source_image, patches, best_grid
+
+     def get_grid_placeholder(self, grid):
+         if grid is None:
+             return ""
+         slice_image_placeholder = (
+             self.slice_start_token
+             + self.unk_token * self.image_feature_size
+             + self.slice_end_token
+         )
+
+         cols = grid[0]
+         rows = grid[1]
+         slices = []
+         for i in range(rows):
+             lines = []
+             for j in range(cols):
+                 lines.append(slice_image_placeholder)
+             slices.append("".join(lines))
+
+         slice_placeholder = "\n".join(slices)
+         return slice_placeholder
+
+     def get_image_id_placeholder(self, idx=0):
+         return f"{self.im_id_start}{idx}{self.im_id_end}"
+
+     def get_sliced_images(self, image, max_slice_nums=None):
+         slice_images = []
+
+         if not self.slice_mode:
+             return [image]
+
+         max_slice_nums = self.max_slice_nums if max_slice_nums is None else int(max_slice_nums)
+         assert max_slice_nums > 0
+         source_image, patches, sliced_grid = self.slice_image(
+             image,
+             max_slice_nums,  # default: 9
+             self.scale_resolution,  # default: 448
+             self.patch_size  # default: 14
+         )
+
+         slice_images.append(source_image)
+         if len(patches) > 0:
+             for i in range(len(patches)):
+                 for j in range(len(patches[0])):
+                     slice_images.append(patches[i][j])
+         return slice_images
+
+     def get_sliced_grid(self, image_size, max_slice_nums, never_split=False):
+         original_width, original_height = image_size
+         log_ratio = math.log(original_width / original_height)
+         ratio = original_width * original_height / (self.scale_resolution * self.scale_resolution)
+         multiple = min(math.ceil(ratio), max_slice_nums)
+         if multiple <= 1 or never_split:
+             return None
+         candidate_split_grids_nums = []
+         for i in [multiple - 1, multiple, multiple + 1]:
+             if i == 1 or i > max_slice_nums:
+                 continue
+             candidate_split_grids_nums.append(i)
+
+         candidate_grids = []
+         for split_grids_nums in candidate_split_grids_nums:
+             m = 1
+             while m <= split_grids_nums:
+                 if split_grids_nums % m == 0:
+                     candidate_grids.append([m, split_grids_nums // m])
+                 m += 1
+
+         best_grid = [1, 1]
+         min_error = float("inf")
+         for grid in candidate_grids:
+             error = abs(log_ratio - math.log(grid[0] / grid[1]))
+             if error < min_error:
+                 best_grid = grid
+                 min_error = error
+
+         return best_grid
+
+     def get_slice_image_placeholder(self, image_size, image_idx=0, max_slice_nums=None, use_image_id=None):
+         max_slice_nums = self.max_slice_nums if max_slice_nums is None else int(max_slice_nums)
+         assert max_slice_nums > 0
+         grid = self.get_sliced_grid(image_size=image_size, max_slice_nums=max_slice_nums)
+
+         image_placeholder = (
+             self.im_start_token
+             + self.unk_token * self.image_feature_size
+             + self.im_end_token
+         )
+         use_image_id = self.use_image_id if use_image_id is None else bool(use_image_id)
+         if use_image_id:
+             final_placeholder = self.get_image_id_placeholder(image_idx) + image_placeholder
+         else:
+             final_placeholder = image_placeholder
+
+         if self.slice_mode:
+             final_placeholder = final_placeholder + self.get_grid_placeholder(grid=grid)
+         return final_placeholder
+
+     def to_pil_image(self, image, rescale=None) -> PIL.Image.Image:
+         """
+         Converts `image` to a PIL Image. Optionally rescales it and puts the channel dimension back as the last axis if
+         needed.
+
+         Args:
+             image (`PIL.Image.Image` or `numpy.ndarray` or `torch.Tensor`):
+                 The image to convert to the PIL Image format.
+             rescale (`bool`, *optional*):
+                 Whether or not to apply the scaling factor (to make pixel values integers between 0 and 255). Will
+                 default to `True` if the image type is a floating type, `False` otherwise.
+         """
+         if isinstance(image, PIL.Image.Image):
+             return image
+         if is_torch_tensor(image):
+             image = image.numpy()
+
+         if isinstance(image, np.ndarray):
+             if rescale is None:
+                 # rescale defaults to True for arrays of floating type.
+                 rescale = isinstance(image.flat[0], np.floating)
+             # If the channel has been moved to the first dim, we put it back at the end.
+             if image.ndim == 3 and image.shape[0] in [1, 3]:
+                 image = image.transpose(1, 2, 0)
+             if rescale:
+                 image = image * 255
+             image = image.astype(np.uint8)
+             return PIL.Image.fromarray(image)
+         return image
+
+     def reshape_by_patch(self, image):
+         """
+         :param image: shape [3, H, W]
+         :return: [3, patch_size, HW/patch_size]
+         """
+         image = torch.from_numpy(image)
+         patch_size = self.patch_size
+         patches = torch.nn.functional.unfold(
+             image,
+             (patch_size, patch_size),
+             stride=(patch_size, patch_size)
+         )
+
+         patches = patches.reshape(image.size(0), patch_size, patch_size, -1)
+         patches = patches.permute(0, 1, 3, 2).reshape(image.size(0), patch_size, -1)
+         return patches.numpy()
+
+     def preprocess(
+             self,
+             images: Union[Image.Image, List[Image.Image], List[List[Image.Image]]],
+             do_pad: Optional[bool] = True,  # TODO: add pad for MiniCPM-Llama3-V-2_5
+             max_slice_nums: int = None,
+             temporal_ids: Optional[Union[List[List[int]], List[List[List[int]]]]] = None,
+             return_tensors: Optional[Union[str, TensorType]] = None,
+             **kwargs
+     ) -> MiniCPMVBatchFeature:
+         if isinstance(images, Image.Image):
+             images_list = [[images]]
+         elif isinstance(images[0], Image.Image):
+             images_list = [images]
+         else:
+             images_list = images
+
+         if temporal_ids is not None:
+             if list_depth(temporal_ids) == 2:
+                 temporal_ids = [temporal_ids]
+
+         new_images_list = []
+         image_sizes_list = []
+         tgt_sizes_list = []
+         temporal_ids_list = []
+         skip_image_idx_list = []
+
+         for batch_idx, _images in enumerate(images_list):
+             if _images is None or len(_images) == 0:
+                 new_images_list.append([])
+                 image_sizes_list.append([])
+                 tgt_sizes_list.append([])
+                 temporal_ids_list.append([])
+                 skip_image_idx_list.append([])
+                 continue
+             if not valid_images(_images):
+                 raise ValueError(
+                     "Invalid image type. Must be of type PIL.Image.Image, numpy.ndarray, "
+                     "torch.Tensor, tf.Tensor or jax.ndarray."
+                 )
+
+             _images = [self.to_pil_image(image).convert("RGB") for image in _images]
+             input_data_format = infer_channel_dimension_format(np.array(_images[0]))
+
+             new_images = []
+             image_sizes = [image.size for image in _images]
+             tgt_sizes = []
+             tp_ids = []
+             skip_image_idx = []
+
+             if temporal_ids is None:
+                 # no temporal ids
+                 for image in _images:
+                     image_patches = self.get_sliced_images(image, max_slice_nums)
+                     image_patches = [to_numpy_array(image).astype(np.float32) / 255 for image in image_patches]
+                     image_patches = [
+                         self.normalize(image=image, mean=self.mean, std=self.std, input_data_format=input_data_format)
+                         for image in image_patches
+                     ]
+                     image_patches = [
+                         to_channel_dimension_format(image, ChannelDimension.FIRST, input_channel_dim=input_data_format)
+                         for image in image_patches
+                     ]
+                     for slice_image in image_patches:
+                         new_images.append(self.reshape_by_patch(slice_image))
+                         tgt_sizes.append(np.array((slice_image.shape[1] // self.patch_size, slice_image.shape[2] // self.patch_size)))
+
+                     tp_ids.extend([[-1]] * len(image_patches))
+             else:
+                 temporal_ids_flatten = list(chain.from_iterable(temporal_ids[batch_idx]))
+                 assert len(temporal_ids_flatten) == len(_images)
+                 frame_groups = []
+                 s = 0
+                 for group in temporal_ids[batch_idx]:
+                     frame_groups.append(_images[s:s+len(group)])
+                     s += len(group)
+
+                 skip_start = 0
+                 for frame_group, tp_id in zip(frame_groups, temporal_ids[batch_idx]):
+                     image_patches_group = []
+                     for frame in frame_group:
+                         image_patches = self.get_sliced_images(frame, max_slice_nums)
+                         image_patches = [to_numpy_array(image).astype(np.float32) / 255 for image in image_patches]
+                         image_patches = [
+                             self.normalize(image=image, mean=self.mean, std=self.std, input_data_format=input_data_format)
+                             for image in image_patches
+                         ]
+                         image_patches = [
+                             to_channel_dimension_format(image, ChannelDimension.FIRST, input_channel_dim=input_data_format)
+                             for image in image_patches
+                         ]
+                         image_patches_group.append(image_patches)
+
+                     group_cnt = len(image_patches_group[0])
+                     for gidx in range(group_cnt):
+                         group_images = [s[gidx] for s in image_patches_group]
+                         tgt_sizes.extend([np.array((i.shape[1] // self.patch_size, i.shape[2] // self.patch_size)) for i in group_images])
+
+                         group_images = [self.reshape_by_patch(i) for i in group_images]
+                         new_images.extend(group_images)
+                     tp_ids.append(tp_id)
+                     skip_image_idx.extend(list(range(skip_start + 1, skip_start + len(frame_group))))
+                     skip_start += len(frame_group)
+
+             if tgt_sizes:
+                 tgt_sizes = np.vstack(tgt_sizes)
+
+             new_images_list.append(new_images)
+             image_sizes_list.append(image_sizes)
+             tgt_sizes_list.append(tgt_sizes)
+             temporal_ids_list.append(tp_ids)
+             skip_image_idx_list.append(skip_image_idx)
+
+         data = {
+             "pixel_values": new_images_list,
+             "image_sizes": image_sizes_list,
+             "tgt_sizes": tgt_sizes_list,
+             "temporal_ids": temporal_ids_list,
+             "skip_image_idx": skip_image_idx_list
+         }
+
+         return MiniCPMVBatchFeature(data=data, tensor_type=return_tensors)
+
+ AutoImageProcessor.register("MiniCPMVImageProcessor", MiniCPMVImageProcessor)
merges.txt ADDED
The diff for this file is too large to render. See raw diff
 
model-00001-of-00008.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:21daa5f10bcc95802543a9f4a5a01228cbaf636093c4b2621bc31b192d2a13a5
+ size 4935818536
model-00002-of-00008.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:d7c0ad33b64caa673b436ca24ea4354eebc518f29d39572827614d34e931c739
+ size 4899159104
model-00003-of-00008.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:933abd60e66bf5f2dd2705e23de7d1b5b8878698c99e5130892dd5d32be88f92
+ size 4832048920
model-00004-of-00008.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:6d4bb152f4202524fc2a6dac4643ea1c4c8fc34d4bdd186ad91770252d2af37f
+ size 4999855832
model-00005-of-00008.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:2a5bafd22096d007999e2b9a6ffdf4a90f5d2a713484921c63b3f44ecf0509ce
+ size 4832048936
model-00006-of-00008.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:9508dc11390edfa1dde9ad4883a7b71609b316f6555577454435185e57e53593
+ size 4832048936
model-00007-of-00008.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:443a88f69bab4e9cfce48952b4a5ba1e8f36232616fe2ec18b5145ab173d907d
+ size 4997227096
model-00008-of-00008.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:7bb5dc4cdf0673de559548723e2683ae05fad88155eac993c69db64328eff681
+ size 522581504
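Each `.safetensors` entry above is a git-lfs pointer file (`version` / `oid` / `size` lines), not the weights themselves. The sketch below parses one pointer and cross-checks the eight shard sizes against the index's `total_size` of 34850689984; the file sizes exceed the tensor-byte total by a small margin because each safetensors shard carries its own JSON header.

```python
def parse_lfs_pointer(text):
    """Parse a git-lfs pointer file (key-value lines, as shown above) into a dict."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

# shard 1's pointer, copied verbatim from the diff
pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:21daa5f10bcc95802543a9f4a5a01228cbaf636093c4b2621bc31b192d2a13a5
size 4935818536"""

info = parse_lfs_pointer(pointer)
assert info["size"] == "4935818536"
assert info["oid"].startswith("sha256:")

# the eight shard sizes from the pointers above
shard_sizes = [4935818536, 4899159104, 4832048920, 4999855832,
               4832048936, 4832048936, 4997227096, 522581504]
total_file_bytes = sum(shard_sizes)
# model.safetensors.index.json reports total_size = 34850689984 (tensor bytes
# only); the surplus below ~1 MB is the per-shard safetensors headers
assert 0 < total_file_bytes - 34850689984 < 1_000_000
```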
model.safetensors.index.json ADDED
@@ -0,0 +1,856 @@
1
+ {
2
+ "metadata": {
3
+ "total_size": 34850689984
4
+ },
5
+ "weight_map": {
+ "llm.lm_head.weight": "model-00007-of-00008.safetensors",
+ "llm.model.embed_tokens.weight": "model-00001-of-00008.safetensors",
+ "llm.model.layers.0.input_layernorm.weight": "model-00001-of-00008.safetensors",
+ "llm.model.layers.0.mlp.down_proj.weight": "model-00001-of-00008.safetensors",
+ "llm.model.layers.0.mlp.gate_proj.weight": "model-00001-of-00008.safetensors",
+ "llm.model.layers.0.mlp.up_proj.weight": "model-00001-of-00008.safetensors",
+ "llm.model.layers.0.post_attention_layernorm.weight": "model-00001-of-00008.safetensors",
+ "llm.model.layers.0.self_attn.k_norm.weight": "model-00001-of-00008.safetensors",
+ "llm.model.layers.0.self_attn.k_proj.weight": "model-00001-of-00008.safetensors",
+ "llm.model.layers.0.self_attn.o_proj.weight": "model-00001-of-00008.safetensors",
+ "llm.model.layers.0.self_attn.q_norm.weight": "model-00001-of-00008.safetensors",
+ "llm.model.layers.0.self_attn.q_proj.weight": "model-00001-of-00008.safetensors",
+ "llm.model.layers.0.self_attn.v_proj.weight": "model-00001-of-00008.safetensors",
+ "llm.model.layers.1.input_layernorm.weight": "model-00001-of-00008.safetensors",
+ "llm.model.layers.1.mlp.down_proj.weight": "model-00001-of-00008.safetensors",
+ "llm.model.layers.1.mlp.gate_proj.weight": "model-00001-of-00008.safetensors",
+ "llm.model.layers.1.mlp.up_proj.weight": "model-00001-of-00008.safetensors",
+ "llm.model.layers.1.post_attention_layernorm.weight": "model-00001-of-00008.safetensors",
+ "llm.model.layers.1.self_attn.k_norm.weight": "model-00001-of-00008.safetensors",
+ "llm.model.layers.1.self_attn.k_proj.weight": "model-00001-of-00008.safetensors",
+ "llm.model.layers.1.self_attn.o_proj.weight": "model-00001-of-00008.safetensors",
+ "llm.model.layers.1.self_attn.q_norm.weight": "model-00001-of-00008.safetensors",
+ "llm.model.layers.1.self_attn.q_proj.weight": "model-00001-of-00008.safetensors",
+ "llm.model.layers.1.self_attn.v_proj.weight": "model-00001-of-00008.safetensors",
+ "llm.model.layers.10.input_layernorm.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.10.mlp.down_proj.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.10.mlp.gate_proj.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.10.mlp.up_proj.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.10.post_attention_layernorm.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.10.self_attn.k_norm.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.10.self_attn.k_proj.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.10.self_attn.o_proj.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.10.self_attn.q_norm.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.10.self_attn.q_proj.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.10.self_attn.v_proj.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.11.input_layernorm.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.11.mlp.down_proj.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.11.mlp.gate_proj.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.11.mlp.up_proj.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.11.post_attention_layernorm.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.11.self_attn.k_norm.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.11.self_attn.k_proj.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.11.self_attn.o_proj.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.11.self_attn.q_norm.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.11.self_attn.q_proj.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.11.self_attn.v_proj.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.12.input_layernorm.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.12.mlp.down_proj.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.12.mlp.gate_proj.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.12.mlp.up_proj.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.12.post_attention_layernorm.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.12.self_attn.k_norm.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.12.self_attn.k_proj.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.12.self_attn.o_proj.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.12.self_attn.q_norm.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.12.self_attn.q_proj.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.12.self_attn.v_proj.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.13.input_layernorm.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.13.mlp.down_proj.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.13.mlp.gate_proj.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.13.mlp.up_proj.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.13.post_attention_layernorm.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.13.self_attn.k_norm.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.13.self_attn.k_proj.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.13.self_attn.o_proj.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.13.self_attn.q_norm.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.13.self_attn.q_proj.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.13.self_attn.v_proj.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.14.input_layernorm.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.14.mlp.down_proj.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.14.mlp.gate_proj.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.14.mlp.up_proj.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.14.post_attention_layernorm.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.14.self_attn.k_norm.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.14.self_attn.k_proj.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.14.self_attn.o_proj.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.14.self_attn.q_norm.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.14.self_attn.q_proj.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.14.self_attn.v_proj.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.15.input_layernorm.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.15.mlp.down_proj.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.15.mlp.gate_proj.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.15.mlp.up_proj.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.15.post_attention_layernorm.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.15.self_attn.k_norm.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.15.self_attn.k_proj.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.15.self_attn.o_proj.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.15.self_attn.q_norm.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.15.self_attn.q_proj.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.15.self_attn.v_proj.weight": "model-00003-of-00008.safetensors",
+ "llm.model.layers.16.input_layernorm.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.16.mlp.down_proj.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.16.mlp.gate_proj.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.16.mlp.up_proj.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.16.post_attention_layernorm.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.16.self_attn.k_norm.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.16.self_attn.k_proj.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.16.self_attn.o_proj.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.16.self_attn.q_norm.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.16.self_attn.q_proj.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.16.self_attn.v_proj.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.17.input_layernorm.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.17.mlp.down_proj.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.17.mlp.gate_proj.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.17.mlp.up_proj.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.17.post_attention_layernorm.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.17.self_attn.k_norm.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.17.self_attn.k_proj.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.17.self_attn.o_proj.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.17.self_attn.q_norm.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.17.self_attn.q_proj.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.17.self_attn.v_proj.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.18.input_layernorm.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.18.mlp.down_proj.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.18.mlp.gate_proj.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.18.mlp.up_proj.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.18.post_attention_layernorm.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.18.self_attn.k_norm.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.18.self_attn.k_proj.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.18.self_attn.o_proj.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.18.self_attn.q_norm.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.18.self_attn.q_proj.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.18.self_attn.v_proj.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.19.input_layernorm.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.19.mlp.down_proj.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.19.mlp.gate_proj.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.19.mlp.up_proj.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.19.post_attention_layernorm.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.19.self_attn.k_norm.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.19.self_attn.k_proj.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.19.self_attn.o_proj.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.19.self_attn.q_norm.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.19.self_attn.q_proj.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.19.self_attn.v_proj.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.2.input_layernorm.weight": "model-00001-of-00008.safetensors",
+ "llm.model.layers.2.mlp.down_proj.weight": "model-00001-of-00008.safetensors",
+ "llm.model.layers.2.mlp.gate_proj.weight": "model-00001-of-00008.safetensors",
+ "llm.model.layers.2.mlp.up_proj.weight": "model-00001-of-00008.safetensors",
+ "llm.model.layers.2.post_attention_layernorm.weight": "model-00001-of-00008.safetensors",
+ "llm.model.layers.2.self_attn.k_norm.weight": "model-00001-of-00008.safetensors",
+ "llm.model.layers.2.self_attn.k_proj.weight": "model-00001-of-00008.safetensors",
+ "llm.model.layers.2.self_attn.o_proj.weight": "model-00001-of-00008.safetensors",
+ "llm.model.layers.2.self_attn.q_norm.weight": "model-00001-of-00008.safetensors",
+ "llm.model.layers.2.self_attn.q_proj.weight": "model-00001-of-00008.safetensors",
+ "llm.model.layers.2.self_attn.v_proj.weight": "model-00001-of-00008.safetensors",
+ "llm.model.layers.20.input_layernorm.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.20.mlp.down_proj.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.20.mlp.gate_proj.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.20.mlp.up_proj.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.20.post_attention_layernorm.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.20.self_attn.k_norm.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.20.self_attn.k_proj.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.20.self_attn.o_proj.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.20.self_attn.q_norm.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.20.self_attn.q_proj.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.20.self_attn.v_proj.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.21.input_layernorm.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.21.mlp.down_proj.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.21.mlp.gate_proj.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.21.mlp.up_proj.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.21.post_attention_layernorm.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.21.self_attn.k_norm.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.21.self_attn.k_proj.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.21.self_attn.o_proj.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.21.self_attn.q_norm.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.21.self_attn.q_proj.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.21.self_attn.v_proj.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.22.input_layernorm.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.22.mlp.down_proj.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.22.mlp.gate_proj.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.22.mlp.up_proj.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.22.post_attention_layernorm.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.22.self_attn.k_norm.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.22.self_attn.k_proj.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.22.self_attn.o_proj.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.22.self_attn.q_norm.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.22.self_attn.q_proj.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.22.self_attn.v_proj.weight": "model-00004-of-00008.safetensors",
+ "llm.model.layers.23.input_layernorm.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.23.mlp.down_proj.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.23.mlp.gate_proj.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.23.mlp.up_proj.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.23.post_attention_layernorm.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.23.self_attn.k_norm.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.23.self_attn.k_proj.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.23.self_attn.o_proj.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.23.self_attn.q_norm.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.23.self_attn.q_proj.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.23.self_attn.v_proj.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.24.input_layernorm.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.24.mlp.down_proj.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.24.mlp.gate_proj.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.24.mlp.up_proj.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.24.post_attention_layernorm.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.24.self_attn.k_norm.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.24.self_attn.k_proj.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.24.self_attn.o_proj.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.24.self_attn.q_norm.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.24.self_attn.q_proj.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.24.self_attn.v_proj.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.25.input_layernorm.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.25.mlp.down_proj.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.25.mlp.gate_proj.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.25.mlp.up_proj.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.25.post_attention_layernorm.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.25.self_attn.k_norm.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.25.self_attn.k_proj.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.25.self_attn.o_proj.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.25.self_attn.q_norm.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.25.self_attn.q_proj.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.25.self_attn.v_proj.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.26.input_layernorm.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.26.mlp.down_proj.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.26.mlp.gate_proj.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.26.mlp.up_proj.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.26.post_attention_layernorm.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.26.self_attn.k_norm.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.26.self_attn.k_proj.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.26.self_attn.o_proj.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.26.self_attn.q_norm.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.26.self_attn.q_proj.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.26.self_attn.v_proj.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.27.input_layernorm.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.27.mlp.down_proj.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.27.mlp.gate_proj.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.27.mlp.up_proj.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.27.post_attention_layernorm.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.27.self_attn.k_norm.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.27.self_attn.k_proj.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.27.self_attn.o_proj.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.27.self_attn.q_norm.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.27.self_attn.q_proj.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.27.self_attn.v_proj.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.28.input_layernorm.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.28.mlp.down_proj.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.28.mlp.gate_proj.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.28.mlp.up_proj.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.28.post_attention_layernorm.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.28.self_attn.k_norm.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.28.self_attn.k_proj.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.28.self_attn.o_proj.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.28.self_attn.q_norm.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.28.self_attn.q_proj.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.28.self_attn.v_proj.weight": "model-00005-of-00008.safetensors",
+ "llm.model.layers.29.input_layernorm.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.29.mlp.down_proj.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.29.mlp.gate_proj.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.29.mlp.up_proj.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.29.post_attention_layernorm.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.29.self_attn.k_norm.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.29.self_attn.k_proj.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.29.self_attn.o_proj.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.29.self_attn.q_norm.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.29.self_attn.q_proj.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.29.self_attn.v_proj.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.3.input_layernorm.weight": "model-00002-of-00008.safetensors",
+ "llm.model.layers.3.mlp.down_proj.weight": "model-00002-of-00008.safetensors",
+ "llm.model.layers.3.mlp.gate_proj.weight": "model-00002-of-00008.safetensors",
+ "llm.model.layers.3.mlp.up_proj.weight": "model-00002-of-00008.safetensors",
+ "llm.model.layers.3.post_attention_layernorm.weight": "model-00002-of-00008.safetensors",
+ "llm.model.layers.3.self_attn.k_norm.weight": "model-00002-of-00008.safetensors",
+ "llm.model.layers.3.self_attn.k_proj.weight": "model-00001-of-00008.safetensors",
+ "llm.model.layers.3.self_attn.o_proj.weight": "model-00002-of-00008.safetensors",
+ "llm.model.layers.3.self_attn.q_norm.weight": "model-00002-of-00008.safetensors",
+ "llm.model.layers.3.self_attn.q_proj.weight": "model-00001-of-00008.safetensors",
+ "llm.model.layers.3.self_attn.v_proj.weight": "model-00001-of-00008.safetensors",
+ "llm.model.layers.30.input_layernorm.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.30.mlp.down_proj.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.30.mlp.gate_proj.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.30.mlp.up_proj.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.30.post_attention_layernorm.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.30.self_attn.k_norm.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.30.self_attn.k_proj.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.30.self_attn.o_proj.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.30.self_attn.q_norm.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.30.self_attn.q_proj.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.30.self_attn.v_proj.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.31.input_layernorm.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.31.mlp.down_proj.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.31.mlp.gate_proj.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.31.mlp.up_proj.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.31.post_attention_layernorm.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.31.self_attn.k_norm.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.31.self_attn.k_proj.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.31.self_attn.o_proj.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.31.self_attn.q_norm.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.31.self_attn.q_proj.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.31.self_attn.v_proj.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.32.input_layernorm.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.32.mlp.down_proj.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.32.mlp.gate_proj.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.32.mlp.up_proj.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.32.post_attention_layernorm.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.32.self_attn.k_norm.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.32.self_attn.k_proj.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.32.self_attn.o_proj.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.32.self_attn.q_norm.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.32.self_attn.q_proj.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.32.self_attn.v_proj.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.33.input_layernorm.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.33.mlp.down_proj.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.33.mlp.gate_proj.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.33.mlp.up_proj.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.33.post_attention_layernorm.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.33.self_attn.k_norm.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.33.self_attn.k_proj.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.33.self_attn.o_proj.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.33.self_attn.q_norm.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.33.self_attn.q_proj.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.33.self_attn.v_proj.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.34.input_layernorm.weight": "model-00007-of-00008.safetensors",
+ "llm.model.layers.34.mlp.down_proj.weight": "model-00007-of-00008.safetensors",
+ "llm.model.layers.34.mlp.gate_proj.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.34.mlp.up_proj.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.34.post_attention_layernorm.weight": "model-00007-of-00008.safetensors",
+ "llm.model.layers.34.self_attn.k_norm.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.34.self_attn.k_proj.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.34.self_attn.o_proj.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.34.self_attn.q_norm.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.34.self_attn.q_proj.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.34.self_attn.v_proj.weight": "model-00006-of-00008.safetensors",
+ "llm.model.layers.35.input_layernorm.weight": "model-00007-of-00008.safetensors",
+ "llm.model.layers.35.mlp.down_proj.weight": "model-00007-of-00008.safetensors",
+ "llm.model.layers.35.mlp.gate_proj.weight": "model-00007-of-00008.safetensors",
+ "llm.model.layers.35.mlp.up_proj.weight": "model-00007-of-00008.safetensors",
+ "llm.model.layers.35.post_attention_layernorm.weight": "model-00007-of-00008.safetensors",
+ "llm.model.layers.35.self_attn.k_norm.weight": "model-00007-of-00008.safetensors",
+ "llm.model.layers.35.self_attn.k_proj.weight": "model-00007-of-00008.safetensors",
+ "llm.model.layers.35.self_attn.o_proj.weight": "model-00007-of-00008.safetensors",
+ "llm.model.layers.35.self_attn.q_norm.weight": "model-00007-of-00008.safetensors",
+ "llm.model.layers.35.self_attn.q_proj.weight": "model-00007-of-00008.safetensors",
+ "llm.model.layers.35.self_attn.v_proj.weight": "model-00007-of-00008.safetensors",
+ "llm.model.layers.4.input_layernorm.weight": "model-00002-of-00008.safetensors",
+ "llm.model.layers.4.mlp.down_proj.weight": "model-00002-of-00008.safetensors",
+ "llm.model.layers.4.mlp.gate_proj.weight": "model-00002-of-00008.safetensors",
+ "llm.model.layers.4.mlp.up_proj.weight": "model-00002-of-00008.safetensors",
+ "llm.model.layers.4.post_attention_layernorm.weight": "model-00002-of-00008.safetensors",
+ "llm.model.layers.4.self_attn.k_norm.weight": "model-00002-of-00008.safetensors",
+ "llm.model.layers.4.self_attn.k_proj.weight": "model-00002-of-00008.safetensors",
+ "llm.model.layers.4.self_attn.o_proj.weight": "model-00002-of-00008.safetensors",
+ "llm.model.layers.4.self_attn.q_norm.weight": "model-00002-of-00008.safetensors",
+ "llm.model.layers.4.self_attn.q_proj.weight": "model-00002-of-00008.safetensors",
+ "llm.model.layers.4.self_attn.v_proj.weight": "model-00002-of-00008.safetensors",
+ "llm.model.layers.5.input_layernorm.weight": "model-00002-of-00008.safetensors",
+ "llm.model.layers.5.mlp.down_proj.weight": "model-00002-of-00008.safetensors",
+ "llm.model.layers.5.mlp.gate_proj.weight": "model-00002-of-00008.safetensors",
+ "llm.model.layers.5.mlp.up_proj.weight": "model-00002-of-00008.safetensors",
+ "llm.model.layers.5.post_attention_layernorm.weight": "model-00002-of-00008.safetensors",
+ "llm.model.layers.5.self_attn.k_norm.weight": "model-00002-of-00008.safetensors",
+ "llm.model.layers.5.self_attn.k_proj.weight": "model-00002-of-00008.safetensors",
+ "llm.model.layers.5.self_attn.o_proj.weight": "model-00002-of-00008.safetensors",
+ "llm.model.layers.5.self_attn.q_norm.weight": "model-00002-of-00008.safetensors",
+ "llm.model.layers.5.self_attn.q_proj.weight": "model-00002-of-00008.safetensors",
+ "llm.model.layers.5.self_attn.v_proj.weight": "model-00002-of-00008.safetensors",
+ "llm.model.layers.6.input_layernorm.weight": "model-00002-of-00008.safetensors",
+ "llm.model.layers.6.mlp.down_proj.weight": "model-00002-of-00008.safetensors",
+ "llm.model.layers.6.mlp.gate_proj.weight": "model-00002-of-00008.safetensors",
+ "llm.model.layers.6.mlp.up_proj.weight": "model-00002-of-00008.safetensors",
+ "llm.model.layers.6.post_attention_layernorm.weight": "model-00002-of-00008.safetensors",
+ "llm.model.layers.6.self_attn.k_norm.weight": "model-00002-of-00008.safetensors",
+ "llm.model.layers.6.self_attn.k_proj.weight": "model-00002-of-00008.safetensors",
+ "llm.model.layers.6.self_attn.o_proj.weight": "model-00002-of-00008.safetensors",
+ "llm.model.layers.6.self_attn.q_norm.weight": "model-00002-of-00008.safetensors",
+ "llm.model.layers.6.self_attn.q_proj.weight": "model-00002-of-00008.safetensors",
+ "llm.model.layers.6.self_attn.v_proj.weight": "model-00002-of-00008.safetensors",
+ "llm.model.layers.7.input_layernorm.weight": "model-00002-of-00008.safetensors",
+ "llm.model.layers.7.mlp.down_proj.weight": "model-00002-of-00008.safetensors",
+ "llm.model.layers.7.mlp.gate_proj.weight": "model-00002-of-00008.safetensors",
+ "llm.model.layers.7.mlp.up_proj.weight": "model-00002-of-00008.safetensors",
+ "llm.model.layers.7.post_attention_layernorm.weight": "model-00002-of-00008.safetensors",
+ "llm.model.layers.7.self_attn.k_norm.weight": "model-00002-of-00008.safetensors",
+ "llm.model.layers.7.self_attn.k_proj.weight": "model-00002-of-00008.safetensors",
+ "llm.model.layers.7.self_attn.o_proj.weight": "model-00002-of-00008.safetensors",
+ "llm.model.layers.7.self_attn.q_norm.weight": "model-00002-of-00008.safetensors",
+ "llm.model.layers.7.self_attn.q_proj.weight": "model-00002-of-00008.safetensors",
+ "llm.model.layers.7.self_attn.v_proj.weight": "model-00002-of-00008.safetensors",
+ "llm.model.layers.8.input_layernorm.weight": "model-00002-of-00008.safetensors",
+ "llm.model.layers.8.mlp.down_proj.weight": "model-00002-of-00008.safetensors",
384
+ "llm.model.layers.8.mlp.gate_proj.weight": "model-00002-of-00008.safetensors",
385
+ "llm.model.layers.8.mlp.up_proj.weight": "model-00002-of-00008.safetensors",
386
+ "llm.model.layers.8.post_attention_layernorm.weight": "model-00002-of-00008.safetensors",
387
+ "llm.model.layers.8.self_attn.k_norm.weight": "model-00002-of-00008.safetensors",
388
+ "llm.model.layers.8.self_attn.k_proj.weight": "model-00002-of-00008.safetensors",
389
+ "llm.model.layers.8.self_attn.o_proj.weight": "model-00002-of-00008.safetensors",
390
+ "llm.model.layers.8.self_attn.q_norm.weight": "model-00002-of-00008.safetensors",
391
+ "llm.model.layers.8.self_attn.q_proj.weight": "model-00002-of-00008.safetensors",
392
+ "llm.model.layers.8.self_attn.v_proj.weight": "model-00002-of-00008.safetensors",
393
+ "llm.model.layers.9.input_layernorm.weight": "model-00003-of-00008.safetensors",
394
+ "llm.model.layers.9.mlp.down_proj.weight": "model-00003-of-00008.safetensors",
395
+ "llm.model.layers.9.mlp.gate_proj.weight": "model-00002-of-00008.safetensors",
396
+ "llm.model.layers.9.mlp.up_proj.weight": "model-00003-of-00008.safetensors",
397
+ "llm.model.layers.9.post_attention_layernorm.weight": "model-00003-of-00008.safetensors",
398
+ "llm.model.layers.9.self_attn.k_norm.weight": "model-00002-of-00008.safetensors",
399
+ "llm.model.layers.9.self_attn.k_proj.weight": "model-00002-of-00008.safetensors",
400
+ "llm.model.layers.9.self_attn.o_proj.weight": "model-00002-of-00008.safetensors",
401
+ "llm.model.layers.9.self_attn.q_norm.weight": "model-00002-of-00008.safetensors",
402
+ "llm.model.layers.9.self_attn.q_proj.weight": "model-00002-of-00008.safetensors",
403
+ "llm.model.layers.9.self_attn.v_proj.weight": "model-00002-of-00008.safetensors",
404
+ "llm.model.norm.weight": "model-00007-of-00008.safetensors",
405
+ "resampler.attn.in_proj_bias": "model-00008-of-00008.safetensors",
406
+ "resampler.attn.in_proj_weight": "model-00008-of-00008.safetensors",
407
+ "resampler.attn.out_proj.bias": "model-00008-of-00008.safetensors",
408
+ "resampler.attn.out_proj.weight": "model-00008-of-00008.safetensors",
409
+ "resampler.kv_proj.weight": "model-00008-of-00008.safetensors",
410
+ "resampler.ln_kv.bias": "model-00008-of-00008.safetensors",
411
+ "resampler.ln_kv.weight": "model-00008-of-00008.safetensors",
412
+ "resampler.ln_post.bias": "model-00008-of-00008.safetensors",
413
+ "resampler.ln_post.weight": "model-00008-of-00008.safetensors",
414
+ "resampler.ln_q.bias": "model-00008-of-00008.safetensors",
415
+ "resampler.ln_q.weight": "model-00008-of-00008.safetensors",
416
+ "resampler.proj": "model-00008-of-00008.safetensors",
417
+ "resampler.query": "model-00008-of-00008.safetensors",
418
+ "vpm.embeddings.patch_embedding.bias": "model-00007-of-00008.safetensors",
419
+ "vpm.embeddings.patch_embedding.weight": "model-00007-of-00008.safetensors",
420
+ "vpm.embeddings.position_embedding.weight": "model-00007-of-00008.safetensors",
421
+ "vpm.encoder.layers.0.layer_norm1.bias": "model-00007-of-00008.safetensors",
422
+ "vpm.encoder.layers.0.layer_norm1.weight": "model-00007-of-00008.safetensors",
423
+ "vpm.encoder.layers.0.layer_norm2.bias": "model-00007-of-00008.safetensors",
424
+ "vpm.encoder.layers.0.layer_norm2.weight": "model-00007-of-00008.safetensors",
425
+ "vpm.encoder.layers.0.mlp.fc1.bias": "model-00007-of-00008.safetensors",
426
+ "vpm.encoder.layers.0.mlp.fc1.weight": "model-00007-of-00008.safetensors",
427
+ "vpm.encoder.layers.0.mlp.fc2.bias": "model-00007-of-00008.safetensors",
428
+ "vpm.encoder.layers.0.mlp.fc2.weight": "model-00007-of-00008.safetensors",
429
+ "vpm.encoder.layers.0.self_attn.k_proj.bias": "model-00007-of-00008.safetensors",
430
+ "vpm.encoder.layers.0.self_attn.k_proj.weight": "model-00007-of-00008.safetensors",
431
+ "vpm.encoder.layers.0.self_attn.out_proj.bias": "model-00007-of-00008.safetensors",
432
+ "vpm.encoder.layers.0.self_attn.out_proj.weight": "model-00007-of-00008.safetensors",
433
+ "vpm.encoder.layers.0.self_attn.q_proj.bias": "model-00007-of-00008.safetensors",
434
+ "vpm.encoder.layers.0.self_attn.q_proj.weight": "model-00007-of-00008.safetensors",
435
+ "vpm.encoder.layers.0.self_attn.v_proj.bias": "model-00007-of-00008.safetensors",
436
+ "vpm.encoder.layers.0.self_attn.v_proj.weight": "model-00007-of-00008.safetensors",
437
+ "vpm.encoder.layers.1.layer_norm1.bias": "model-00007-of-00008.safetensors",
438
+ "vpm.encoder.layers.1.layer_norm1.weight": "model-00007-of-00008.safetensors",
439
+ "vpm.encoder.layers.1.layer_norm2.bias": "model-00007-of-00008.safetensors",
440
+ "vpm.encoder.layers.1.layer_norm2.weight": "model-00007-of-00008.safetensors",
441
+ "vpm.encoder.layers.1.mlp.fc1.bias": "model-00007-of-00008.safetensors",
442
+ "vpm.encoder.layers.1.mlp.fc1.weight": "model-00007-of-00008.safetensors",
443
+ "vpm.encoder.layers.1.mlp.fc2.bias": "model-00007-of-00008.safetensors",
444
+ "vpm.encoder.layers.1.mlp.fc2.weight": "model-00007-of-00008.safetensors",
445
+ "vpm.encoder.layers.1.self_attn.k_proj.bias": "model-00007-of-00008.safetensors",
446
+ "vpm.encoder.layers.1.self_attn.k_proj.weight": "model-00007-of-00008.safetensors",
447
+ "vpm.encoder.layers.1.self_attn.out_proj.bias": "model-00007-of-00008.safetensors",
448
+ "vpm.encoder.layers.1.self_attn.out_proj.weight": "model-00007-of-00008.safetensors",
449
+ "vpm.encoder.layers.1.self_attn.q_proj.bias": "model-00007-of-00008.safetensors",
450
+ "vpm.encoder.layers.1.self_attn.q_proj.weight": "model-00007-of-00008.safetensors",
451
+ "vpm.encoder.layers.1.self_attn.v_proj.bias": "model-00007-of-00008.safetensors",
452
+ "vpm.encoder.layers.1.self_attn.v_proj.weight": "model-00007-of-00008.safetensors",
453
+ "vpm.encoder.layers.10.layer_norm1.bias": "model-00007-of-00008.safetensors",
454
+ "vpm.encoder.layers.10.layer_norm1.weight": "model-00007-of-00008.safetensors",
455
+ "vpm.encoder.layers.10.layer_norm2.bias": "model-00007-of-00008.safetensors",
456
+ "vpm.encoder.layers.10.layer_norm2.weight": "model-00007-of-00008.safetensors",
457
+ "vpm.encoder.layers.10.mlp.fc1.bias": "model-00007-of-00008.safetensors",
458
+ "vpm.encoder.layers.10.mlp.fc1.weight": "model-00007-of-00008.safetensors",
459
+ "vpm.encoder.layers.10.mlp.fc2.bias": "model-00007-of-00008.safetensors",
460
+ "vpm.encoder.layers.10.mlp.fc2.weight": "model-00007-of-00008.safetensors",
461
+ "vpm.encoder.layers.10.self_attn.k_proj.bias": "model-00007-of-00008.safetensors",
462
+ "vpm.encoder.layers.10.self_attn.k_proj.weight": "model-00007-of-00008.safetensors",
463
+ "vpm.encoder.layers.10.self_attn.out_proj.bias": "model-00007-of-00008.safetensors",
464
+ "vpm.encoder.layers.10.self_attn.out_proj.weight": "model-00007-of-00008.safetensors",
465
+ "vpm.encoder.layers.10.self_attn.q_proj.bias": "model-00007-of-00008.safetensors",
466
+ "vpm.encoder.layers.10.self_attn.q_proj.weight": "model-00007-of-00008.safetensors",
467
+ "vpm.encoder.layers.10.self_attn.v_proj.bias": "model-00007-of-00008.safetensors",
468
+ "vpm.encoder.layers.10.self_attn.v_proj.weight": "model-00007-of-00008.safetensors",
469
+ "vpm.encoder.layers.11.layer_norm1.bias": "model-00007-of-00008.safetensors",
470
+ "vpm.encoder.layers.11.layer_norm1.weight": "model-00007-of-00008.safetensors",
471
+ "vpm.encoder.layers.11.layer_norm2.bias": "model-00007-of-00008.safetensors",
472
+ "vpm.encoder.layers.11.layer_norm2.weight": "model-00007-of-00008.safetensors",
473
+ "vpm.encoder.layers.11.mlp.fc1.bias": "model-00007-of-00008.safetensors",
474
+ "vpm.encoder.layers.11.mlp.fc1.weight": "model-00007-of-00008.safetensors",
475
+ "vpm.encoder.layers.11.mlp.fc2.bias": "model-00007-of-00008.safetensors",
476
+ "vpm.encoder.layers.11.mlp.fc2.weight": "model-00007-of-00008.safetensors",
477
+ "vpm.encoder.layers.11.self_attn.k_proj.bias": "model-00007-of-00008.safetensors",
478
+ "vpm.encoder.layers.11.self_attn.k_proj.weight": "model-00007-of-00008.safetensors",
479
+ "vpm.encoder.layers.11.self_attn.out_proj.bias": "model-00007-of-00008.safetensors",
480
+ "vpm.encoder.layers.11.self_attn.out_proj.weight": "model-00007-of-00008.safetensors",
481
+ "vpm.encoder.layers.11.self_attn.q_proj.bias": "model-00007-of-00008.safetensors",
482
+ "vpm.encoder.layers.11.self_attn.q_proj.weight": "model-00007-of-00008.safetensors",
483
+ "vpm.encoder.layers.11.self_attn.v_proj.bias": "model-00007-of-00008.safetensors",
484
+ "vpm.encoder.layers.11.self_attn.v_proj.weight": "model-00007-of-00008.safetensors",
485
+ "vpm.encoder.layers.12.layer_norm1.bias": "model-00007-of-00008.safetensors",
486
+ "vpm.encoder.layers.12.layer_norm1.weight": "model-00007-of-00008.safetensors",
487
+ "vpm.encoder.layers.12.layer_norm2.bias": "model-00007-of-00008.safetensors",
488
+ "vpm.encoder.layers.12.layer_norm2.weight": "model-00007-of-00008.safetensors",
489
+ "vpm.encoder.layers.12.mlp.fc1.bias": "model-00007-of-00008.safetensors",
490
+ "vpm.encoder.layers.12.mlp.fc1.weight": "model-00007-of-00008.safetensors",
491
+ "vpm.encoder.layers.12.mlp.fc2.bias": "model-00007-of-00008.safetensors",
492
+ "vpm.encoder.layers.12.mlp.fc2.weight": "model-00007-of-00008.safetensors",
493
+ "vpm.encoder.layers.12.self_attn.k_proj.bias": "model-00007-of-00008.safetensors",
494
+ "vpm.encoder.layers.12.self_attn.k_proj.weight": "model-00007-of-00008.safetensors",
495
+ "vpm.encoder.layers.12.self_attn.out_proj.bias": "model-00007-of-00008.safetensors",
496
+ "vpm.encoder.layers.12.self_attn.out_proj.weight": "model-00007-of-00008.safetensors",
497
+ "vpm.encoder.layers.12.self_attn.q_proj.bias": "model-00007-of-00008.safetensors",
498
+ "vpm.encoder.layers.12.self_attn.q_proj.weight": "model-00007-of-00008.safetensors",
499
+ "vpm.encoder.layers.12.self_attn.v_proj.bias": "model-00007-of-00008.safetensors",
500
+ "vpm.encoder.layers.12.self_attn.v_proj.weight": "model-00007-of-00008.safetensors",
501
+ "vpm.encoder.layers.13.layer_norm1.bias": "model-00007-of-00008.safetensors",
502
+ "vpm.encoder.layers.13.layer_norm1.weight": "model-00007-of-00008.safetensors",
503
+ "vpm.encoder.layers.13.layer_norm2.bias": "model-00007-of-00008.safetensors",
504
+ "vpm.encoder.layers.13.layer_norm2.weight": "model-00007-of-00008.safetensors",
505
+ "vpm.encoder.layers.13.mlp.fc1.bias": "model-00007-of-00008.safetensors",
506
+ "vpm.encoder.layers.13.mlp.fc1.weight": "model-00007-of-00008.safetensors",
507
+ "vpm.encoder.layers.13.mlp.fc2.bias": "model-00007-of-00008.safetensors",
508
+ "vpm.encoder.layers.13.mlp.fc2.weight": "model-00007-of-00008.safetensors",
509
+ "vpm.encoder.layers.13.self_attn.k_proj.bias": "model-00007-of-00008.safetensors",
510
+ "vpm.encoder.layers.13.self_attn.k_proj.weight": "model-00007-of-00008.safetensors",
511
+ "vpm.encoder.layers.13.self_attn.out_proj.bias": "model-00007-of-00008.safetensors",
512
+ "vpm.encoder.layers.13.self_attn.out_proj.weight": "model-00007-of-00008.safetensors",
513
+ "vpm.encoder.layers.13.self_attn.q_proj.bias": "model-00007-of-00008.safetensors",
514
+ "vpm.encoder.layers.13.self_attn.q_proj.weight": "model-00007-of-00008.safetensors",
515
+ "vpm.encoder.layers.13.self_attn.v_proj.bias": "model-00007-of-00008.safetensors",
516
+ "vpm.encoder.layers.13.self_attn.v_proj.weight": "model-00007-of-00008.safetensors",
517
+ "vpm.encoder.layers.14.layer_norm1.bias": "model-00007-of-00008.safetensors",
518
+ "vpm.encoder.layers.14.layer_norm1.weight": "model-00007-of-00008.safetensors",
519
+ "vpm.encoder.layers.14.layer_norm2.bias": "model-00007-of-00008.safetensors",
520
+ "vpm.encoder.layers.14.layer_norm2.weight": "model-00007-of-00008.safetensors",
521
+ "vpm.encoder.layers.14.mlp.fc1.bias": "model-00007-of-00008.safetensors",
522
+ "vpm.encoder.layers.14.mlp.fc1.weight": "model-00007-of-00008.safetensors",
523
+ "vpm.encoder.layers.14.mlp.fc2.bias": "model-00007-of-00008.safetensors",
524
+ "vpm.encoder.layers.14.mlp.fc2.weight": "model-00007-of-00008.safetensors",
525
+ "vpm.encoder.layers.14.self_attn.k_proj.bias": "model-00007-of-00008.safetensors",
526
+ "vpm.encoder.layers.14.self_attn.k_proj.weight": "model-00007-of-00008.safetensors",
527
+ "vpm.encoder.layers.14.self_attn.out_proj.bias": "model-00007-of-00008.safetensors",
528
+ "vpm.encoder.layers.14.self_attn.out_proj.weight": "model-00007-of-00008.safetensors",
529
+ "vpm.encoder.layers.14.self_attn.q_proj.bias": "model-00007-of-00008.safetensors",
530
+ "vpm.encoder.layers.14.self_attn.q_proj.weight": "model-00007-of-00008.safetensors",
531
+ "vpm.encoder.layers.14.self_attn.v_proj.bias": "model-00007-of-00008.safetensors",
532
+ "vpm.encoder.layers.14.self_attn.v_proj.weight": "model-00007-of-00008.safetensors",
533
+ "vpm.encoder.layers.15.layer_norm1.bias": "model-00007-of-00008.safetensors",
534
+ "vpm.encoder.layers.15.layer_norm1.weight": "model-00007-of-00008.safetensors",
535
+ "vpm.encoder.layers.15.layer_norm2.bias": "model-00007-of-00008.safetensors",
536
+ "vpm.encoder.layers.15.layer_norm2.weight": "model-00007-of-00008.safetensors",
537
+ "vpm.encoder.layers.15.mlp.fc1.bias": "model-00007-of-00008.safetensors",
538
+ "vpm.encoder.layers.15.mlp.fc1.weight": "model-00007-of-00008.safetensors",
539
+ "vpm.encoder.layers.15.mlp.fc2.bias": "model-00007-of-00008.safetensors",
540
+ "vpm.encoder.layers.15.mlp.fc2.weight": "model-00007-of-00008.safetensors",
541
+ "vpm.encoder.layers.15.self_attn.k_proj.bias": "model-00007-of-00008.safetensors",
542
+ "vpm.encoder.layers.15.self_attn.k_proj.weight": "model-00007-of-00008.safetensors",
543
+ "vpm.encoder.layers.15.self_attn.out_proj.bias": "model-00007-of-00008.safetensors",
544
+ "vpm.encoder.layers.15.self_attn.out_proj.weight": "model-00007-of-00008.safetensors",
545
+ "vpm.encoder.layers.15.self_attn.q_proj.bias": "model-00007-of-00008.safetensors",
546
+ "vpm.encoder.layers.15.self_attn.q_proj.weight": "model-00007-of-00008.safetensors",
547
+ "vpm.encoder.layers.15.self_attn.v_proj.bias": "model-00007-of-00008.safetensors",
548
+ "vpm.encoder.layers.15.self_attn.v_proj.weight": "model-00007-of-00008.safetensors",
549
+ "vpm.encoder.layers.16.layer_norm1.bias": "model-00007-of-00008.safetensors",
550
+ "vpm.encoder.layers.16.layer_norm1.weight": "model-00007-of-00008.safetensors",
551
+ "vpm.encoder.layers.16.layer_norm2.bias": "model-00007-of-00008.safetensors",
552
+ "vpm.encoder.layers.16.layer_norm2.weight": "model-00007-of-00008.safetensors",
553
+ "vpm.encoder.layers.16.mlp.fc1.bias": "model-00007-of-00008.safetensors",
554
+ "vpm.encoder.layers.16.mlp.fc1.weight": "model-00007-of-00008.safetensors",
555
+ "vpm.encoder.layers.16.mlp.fc2.bias": "model-00007-of-00008.safetensors",
556
+ "vpm.encoder.layers.16.mlp.fc2.weight": "model-00007-of-00008.safetensors",
557
+ "vpm.encoder.layers.16.self_attn.k_proj.bias": "model-00007-of-00008.safetensors",
558
+ "vpm.encoder.layers.16.self_attn.k_proj.weight": "model-00007-of-00008.safetensors",
559
+ "vpm.encoder.layers.16.self_attn.out_proj.bias": "model-00007-of-00008.safetensors",
560
+ "vpm.encoder.layers.16.self_attn.out_proj.weight": "model-00007-of-00008.safetensors",
561
+ "vpm.encoder.layers.16.self_attn.q_proj.bias": "model-00007-of-00008.safetensors",
562
+ "vpm.encoder.layers.16.self_attn.q_proj.weight": "model-00007-of-00008.safetensors",
563
+ "vpm.encoder.layers.16.self_attn.v_proj.bias": "model-00007-of-00008.safetensors",
564
+ "vpm.encoder.layers.16.self_attn.v_proj.weight": "model-00007-of-00008.safetensors",
565
+ "vpm.encoder.layers.17.layer_norm1.bias": "model-00007-of-00008.safetensors",
566
+ "vpm.encoder.layers.17.layer_norm1.weight": "model-00007-of-00008.safetensors",
567
+ "vpm.encoder.layers.17.layer_norm2.bias": "model-00007-of-00008.safetensors",
568
+ "vpm.encoder.layers.17.layer_norm2.weight": "model-00007-of-00008.safetensors",
569
+ "vpm.encoder.layers.17.mlp.fc1.bias": "model-00007-of-00008.safetensors",
570
+ "vpm.encoder.layers.17.mlp.fc1.weight": "model-00007-of-00008.safetensors",
571
+ "vpm.encoder.layers.17.mlp.fc2.bias": "model-00007-of-00008.safetensors",
572
+ "vpm.encoder.layers.17.mlp.fc2.weight": "model-00007-of-00008.safetensors",
573
+ "vpm.encoder.layers.17.self_attn.k_proj.bias": "model-00007-of-00008.safetensors",
574
+ "vpm.encoder.layers.17.self_attn.k_proj.weight": "model-00007-of-00008.safetensors",
575
+ "vpm.encoder.layers.17.self_attn.out_proj.bias": "model-00007-of-00008.safetensors",
576
+ "vpm.encoder.layers.17.self_attn.out_proj.weight": "model-00007-of-00008.safetensors",
577
+ "vpm.encoder.layers.17.self_attn.q_proj.bias": "model-00007-of-00008.safetensors",
578
+ "vpm.encoder.layers.17.self_attn.q_proj.weight": "model-00007-of-00008.safetensors",
579
+ "vpm.encoder.layers.17.self_attn.v_proj.bias": "model-00007-of-00008.safetensors",
580
+ "vpm.encoder.layers.17.self_attn.v_proj.weight": "model-00007-of-00008.safetensors",
581
+ "vpm.encoder.layers.18.layer_norm1.bias": "model-00007-of-00008.safetensors",
582
+ "vpm.encoder.layers.18.layer_norm1.weight": "model-00007-of-00008.safetensors",
583
+ "vpm.encoder.layers.18.layer_norm2.bias": "model-00007-of-00008.safetensors",
584
+ "vpm.encoder.layers.18.layer_norm2.weight": "model-00007-of-00008.safetensors",
585
+ "vpm.encoder.layers.18.mlp.fc1.bias": "model-00007-of-00008.safetensors",
586
+ "vpm.encoder.layers.18.mlp.fc1.weight": "model-00007-of-00008.safetensors",
587
+ "vpm.encoder.layers.18.mlp.fc2.bias": "model-00007-of-00008.safetensors",
588
+ "vpm.encoder.layers.18.mlp.fc2.weight": "model-00007-of-00008.safetensors",
589
+ "vpm.encoder.layers.18.self_attn.k_proj.bias": "model-00007-of-00008.safetensors",
590
+ "vpm.encoder.layers.18.self_attn.k_proj.weight": "model-00007-of-00008.safetensors",
591
+ "vpm.encoder.layers.18.self_attn.out_proj.bias": "model-00007-of-00008.safetensors",
592
+ "vpm.encoder.layers.18.self_attn.out_proj.weight": "model-00007-of-00008.safetensors",
593
+ "vpm.encoder.layers.18.self_attn.q_proj.bias": "model-00007-of-00008.safetensors",
594
+ "vpm.encoder.layers.18.self_attn.q_proj.weight": "model-00007-of-00008.safetensors",
595
+ "vpm.encoder.layers.18.self_attn.v_proj.bias": "model-00007-of-00008.safetensors",
596
+ "vpm.encoder.layers.18.self_attn.v_proj.weight": "model-00007-of-00008.safetensors",
597
+ "vpm.encoder.layers.19.layer_norm1.bias": "model-00007-of-00008.safetensors",
598
+ "vpm.encoder.layers.19.layer_norm1.weight": "model-00007-of-00008.safetensors",
599
+ "vpm.encoder.layers.19.layer_norm2.bias": "model-00007-of-00008.safetensors",
600
+ "vpm.encoder.layers.19.layer_norm2.weight": "model-00007-of-00008.safetensors",
601
+ "vpm.encoder.layers.19.mlp.fc1.bias": "model-00007-of-00008.safetensors",
602
+ "vpm.encoder.layers.19.mlp.fc1.weight": "model-00007-of-00008.safetensors",
603
+ "vpm.encoder.layers.19.mlp.fc2.bias": "model-00007-of-00008.safetensors",
604
+ "vpm.encoder.layers.19.mlp.fc2.weight": "model-00007-of-00008.safetensors",
605
+ "vpm.encoder.layers.19.self_attn.k_proj.bias": "model-00007-of-00008.safetensors",
606
+ "vpm.encoder.layers.19.self_attn.k_proj.weight": "model-00007-of-00008.safetensors",
607
+ "vpm.encoder.layers.19.self_attn.out_proj.bias": "model-00007-of-00008.safetensors",
608
+ "vpm.encoder.layers.19.self_attn.out_proj.weight": "model-00007-of-00008.safetensors",
609
+ "vpm.encoder.layers.19.self_attn.q_proj.bias": "model-00007-of-00008.safetensors",
610
+ "vpm.encoder.layers.19.self_attn.q_proj.weight": "model-00007-of-00008.safetensors",
611
+ "vpm.encoder.layers.19.self_attn.v_proj.bias": "model-00007-of-00008.safetensors",
612
+ "vpm.encoder.layers.19.self_attn.v_proj.weight": "model-00007-of-00008.safetensors",
613
+ "vpm.encoder.layers.2.layer_norm1.bias": "model-00007-of-00008.safetensors",
614
+ "vpm.encoder.layers.2.layer_norm1.weight": "model-00007-of-00008.safetensors",
615
+ "vpm.encoder.layers.2.layer_norm2.bias": "model-00007-of-00008.safetensors",
616
+ "vpm.encoder.layers.2.layer_norm2.weight": "model-00007-of-00008.safetensors",
617
+ "vpm.encoder.layers.2.mlp.fc1.bias": "model-00007-of-00008.safetensors",
618
+ "vpm.encoder.layers.2.mlp.fc1.weight": "model-00007-of-00008.safetensors",
619
+ "vpm.encoder.layers.2.mlp.fc2.bias": "model-00007-of-00008.safetensors",
620
+ "vpm.encoder.layers.2.mlp.fc2.weight": "model-00007-of-00008.safetensors",
621
+ "vpm.encoder.layers.2.self_attn.k_proj.bias": "model-00007-of-00008.safetensors",
622
+ "vpm.encoder.layers.2.self_attn.k_proj.weight": "model-00007-of-00008.safetensors",
623
+ "vpm.encoder.layers.2.self_attn.out_proj.bias": "model-00007-of-00008.safetensors",
624
+ "vpm.encoder.layers.2.self_attn.out_proj.weight": "model-00007-of-00008.safetensors",
625
+ "vpm.encoder.layers.2.self_attn.q_proj.bias": "model-00007-of-00008.safetensors",
626
+ "vpm.encoder.layers.2.self_attn.q_proj.weight": "model-00007-of-00008.safetensors",
627
+ "vpm.encoder.layers.2.self_attn.v_proj.bias": "model-00007-of-00008.safetensors",
628
+ "vpm.encoder.layers.2.self_attn.v_proj.weight": "model-00007-of-00008.safetensors",
629
+ "vpm.encoder.layers.20.layer_norm1.bias": "model-00007-of-00008.safetensors",
630
+ "vpm.encoder.layers.20.layer_norm1.weight": "model-00007-of-00008.safetensors",
631
+ "vpm.encoder.layers.20.layer_norm2.bias": "model-00007-of-00008.safetensors",
632
+ "vpm.encoder.layers.20.layer_norm2.weight": "model-00007-of-00008.safetensors",
633
+ "vpm.encoder.layers.20.mlp.fc1.bias": "model-00007-of-00008.safetensors",
634
+ "vpm.encoder.layers.20.mlp.fc1.weight": "model-00007-of-00008.safetensors",
635
+ "vpm.encoder.layers.20.mlp.fc2.bias": "model-00007-of-00008.safetensors",
636
+ "vpm.encoder.layers.20.mlp.fc2.weight": "model-00007-of-00008.safetensors",
637
+ "vpm.encoder.layers.20.self_attn.k_proj.bias": "model-00007-of-00008.safetensors",
638
+ "vpm.encoder.layers.20.self_attn.k_proj.weight": "model-00007-of-00008.safetensors",
639
+ "vpm.encoder.layers.20.self_attn.out_proj.bias": "model-00007-of-00008.safetensors",
640
+ "vpm.encoder.layers.20.self_attn.out_proj.weight": "model-00007-of-00008.safetensors",
641
+ "vpm.encoder.layers.20.self_attn.q_proj.bias": "model-00007-of-00008.safetensors",
642
+ "vpm.encoder.layers.20.self_attn.q_proj.weight": "model-00007-of-00008.safetensors",
643
+ "vpm.encoder.layers.20.self_attn.v_proj.bias": "model-00007-of-00008.safetensors",
644
+ "vpm.encoder.layers.20.self_attn.v_proj.weight": "model-00007-of-00008.safetensors",
645
+ "vpm.encoder.layers.21.layer_norm1.bias": "model-00007-of-00008.safetensors",
646
+ "vpm.encoder.layers.21.layer_norm1.weight": "model-00007-of-00008.safetensors",
647
+ "vpm.encoder.layers.21.layer_norm2.bias": "model-00007-of-00008.safetensors",
648
+ "vpm.encoder.layers.21.layer_norm2.weight": "model-00007-of-00008.safetensors",
649
+ "vpm.encoder.layers.21.mlp.fc1.bias": "model-00007-of-00008.safetensors",
650
+ "vpm.encoder.layers.21.mlp.fc1.weight": "model-00007-of-00008.safetensors",
651
+ "vpm.encoder.layers.21.mlp.fc2.bias": "model-00007-of-00008.safetensors",
652
+ "vpm.encoder.layers.21.mlp.fc2.weight": "model-00007-of-00008.safetensors",
653
+ "vpm.encoder.layers.21.self_attn.k_proj.bias": "model-00007-of-00008.safetensors",
654
+ "vpm.encoder.layers.21.self_attn.k_proj.weight": "model-00007-of-00008.safetensors",
655
+ "vpm.encoder.layers.21.self_attn.out_proj.bias": "model-00007-of-00008.safetensors",
656
+ "vpm.encoder.layers.21.self_attn.out_proj.weight": "model-00007-of-00008.safetensors",
657
+ "vpm.encoder.layers.21.self_attn.q_proj.bias": "model-00007-of-00008.safetensors",
658
+ "vpm.encoder.layers.21.self_attn.q_proj.weight": "model-00007-of-00008.safetensors",
659
+ "vpm.encoder.layers.21.self_attn.v_proj.bias": "model-00007-of-00008.safetensors",
660
+ "vpm.encoder.layers.21.self_attn.v_proj.weight": "model-00007-of-00008.safetensors",
661
+ "vpm.encoder.layers.22.layer_norm1.bias": "model-00007-of-00008.safetensors",
662
+ "vpm.encoder.layers.22.layer_norm1.weight": "model-00007-of-00008.safetensors",
663
+ "vpm.encoder.layers.22.layer_norm2.bias": "model-00007-of-00008.safetensors",
664
+ "vpm.encoder.layers.22.layer_norm2.weight": "model-00007-of-00008.safetensors",
665
+ "vpm.encoder.layers.22.mlp.fc1.bias": "model-00007-of-00008.safetensors",
666
+ "vpm.encoder.layers.22.mlp.fc1.weight": "model-00007-of-00008.safetensors",
667
+ "vpm.encoder.layers.22.mlp.fc2.bias": "model-00007-of-00008.safetensors",
668
+ "vpm.encoder.layers.22.mlp.fc2.weight": "model-00007-of-00008.safetensors",
669
+ "vpm.encoder.layers.22.self_attn.k_proj.bias": "model-00007-of-00008.safetensors",
670
+ "vpm.encoder.layers.22.self_attn.k_proj.weight": "model-00007-of-00008.safetensors",
671
+ "vpm.encoder.layers.22.self_attn.out_proj.bias": "model-00007-of-00008.safetensors",
672
+ "vpm.encoder.layers.22.self_attn.out_proj.weight": "model-00007-of-00008.safetensors",
673
+ "vpm.encoder.layers.22.self_attn.q_proj.bias": "model-00007-of-00008.safetensors",
674
+ "vpm.encoder.layers.22.self_attn.q_proj.weight": "model-00007-of-00008.safetensors",
675
+ "vpm.encoder.layers.22.self_attn.v_proj.bias": "model-00007-of-00008.safetensors",
676
+ "vpm.encoder.layers.22.self_attn.v_proj.weight": "model-00007-of-00008.safetensors",
677
+ "vpm.encoder.layers.23.layer_norm1.bias": "model-00007-of-00008.safetensors",
678
+ "vpm.encoder.layers.23.layer_norm1.weight": "model-00007-of-00008.safetensors",
679
+ "vpm.encoder.layers.23.layer_norm2.bias": "model-00007-of-00008.safetensors",
680
+ "vpm.encoder.layers.23.layer_norm2.weight": "model-00007-of-00008.safetensors",
681
+ "vpm.encoder.layers.23.mlp.fc1.bias": "model-00007-of-00008.safetensors",
682
+ "vpm.encoder.layers.23.mlp.fc1.weight": "model-00007-of-00008.safetensors",
683
+ "vpm.encoder.layers.23.mlp.fc2.bias": "model-00007-of-00008.safetensors",
684
+ "vpm.encoder.layers.23.mlp.fc2.weight": "model-00007-of-00008.safetensors",
685
+ "vpm.encoder.layers.23.self_attn.k_proj.bias": "model-00007-of-00008.safetensors",
686
+ "vpm.encoder.layers.23.self_attn.k_proj.weight": "model-00007-of-00008.safetensors",
687
+ "vpm.encoder.layers.23.self_attn.out_proj.bias": "model-00007-of-00008.safetensors",
688
+ "vpm.encoder.layers.23.self_attn.out_proj.weight": "model-00007-of-00008.safetensors",
689
+ "vpm.encoder.layers.23.self_attn.q_proj.bias": "model-00007-of-00008.safetensors",
690
+ "vpm.encoder.layers.23.self_attn.q_proj.weight": "model-00007-of-00008.safetensors",
691
+ "vpm.encoder.layers.23.self_attn.v_proj.bias": "model-00007-of-00008.safetensors",
692
+ "vpm.encoder.layers.23.self_attn.v_proj.weight": "model-00007-of-00008.safetensors",
693
+ "vpm.encoder.layers.24.layer_norm1.bias": "model-00008-of-00008.safetensors",
694
+ "vpm.encoder.layers.24.layer_norm1.weight": "model-00008-of-00008.safetensors",
695
+ "vpm.encoder.layers.24.layer_norm2.bias": "model-00008-of-00008.safetensors",
696
+ "vpm.encoder.layers.24.layer_norm2.weight": "model-00008-of-00008.safetensors",
697
+ "vpm.encoder.layers.24.mlp.fc1.bias": "model-00008-of-00008.safetensors",
698
+ "vpm.encoder.layers.24.mlp.fc1.weight": "model-00008-of-00008.safetensors",
699
+ "vpm.encoder.layers.24.mlp.fc2.bias": "model-00008-of-00008.safetensors",
700
+ "vpm.encoder.layers.24.mlp.fc2.weight": "model-00008-of-00008.safetensors",
701
+ "vpm.encoder.layers.24.self_attn.k_proj.bias": "model-00007-of-00008.safetensors",
702
+ "vpm.encoder.layers.24.self_attn.k_proj.weight": "model-00007-of-00008.safetensors",
703
+ "vpm.encoder.layers.24.self_attn.out_proj.bias": "model-00008-of-00008.safetensors",
704
+ "vpm.encoder.layers.24.self_attn.out_proj.weight": "model-00008-of-00008.safetensors",
705
+ "vpm.encoder.layers.24.self_attn.q_proj.bias": "model-00007-of-00008.safetensors",
706
+ "vpm.encoder.layers.24.self_attn.q_proj.weight": "model-00007-of-00008.safetensors",
707
+ "vpm.encoder.layers.24.self_attn.v_proj.bias": "model-00007-of-00008.safetensors",
708
+ "vpm.encoder.layers.24.self_attn.v_proj.weight": "model-00007-of-00008.safetensors",
709
+ "vpm.encoder.layers.25.layer_norm1.bias": "model-00008-of-00008.safetensors",
710
+ "vpm.encoder.layers.25.layer_norm1.weight": "model-00008-of-00008.safetensors",
711
+ "vpm.encoder.layers.25.layer_norm2.bias": "model-00008-of-00008.safetensors",
712
+ "vpm.encoder.layers.25.layer_norm2.weight": "model-00008-of-00008.safetensors",
713
+ "vpm.encoder.layers.25.mlp.fc1.bias": "model-00008-of-00008.safetensors",
714
+ "vpm.encoder.layers.25.mlp.fc1.weight": "model-00008-of-00008.safetensors",
715
+ "vpm.encoder.layers.25.mlp.fc2.bias": "model-00008-of-00008.safetensors",
716
+ "vpm.encoder.layers.25.mlp.fc2.weight": "model-00008-of-00008.safetensors",
717
+ "vpm.encoder.layers.25.self_attn.k_proj.bias": "model-00008-of-00008.safetensors",
718
+ "vpm.encoder.layers.25.self_attn.k_proj.weight": "model-00008-of-00008.safetensors",
719
+ "vpm.encoder.layers.25.self_attn.out_proj.bias": "model-00008-of-00008.safetensors",
720
+ "vpm.encoder.layers.25.self_attn.out_proj.weight": "model-00008-of-00008.safetensors",
721
+ "vpm.encoder.layers.25.self_attn.q_proj.bias": "model-00008-of-00008.safetensors",
722
+ "vpm.encoder.layers.25.self_attn.q_proj.weight": "model-00008-of-00008.safetensors",
723
+ "vpm.encoder.layers.25.self_attn.v_proj.bias": "model-00008-of-00008.safetensors",
724
+ "vpm.encoder.layers.25.self_attn.v_proj.weight": "model-00008-of-00008.safetensors",
725
+ "vpm.encoder.layers.26.layer_norm1.bias": "model-00008-of-00008.safetensors",
726
+ "vpm.encoder.layers.26.layer_norm1.weight": "model-00008-of-00008.safetensors",
727
+ "vpm.encoder.layers.26.layer_norm2.bias": "model-00008-of-00008.safetensors",
728
+ "vpm.encoder.layers.26.layer_norm2.weight": "model-00008-of-00008.safetensors",
729
+ "vpm.encoder.layers.26.mlp.fc1.bias": "model-00008-of-00008.safetensors",
730
+ "vpm.encoder.layers.26.mlp.fc1.weight": "model-00008-of-00008.safetensors",
731
+ "vpm.encoder.layers.26.mlp.fc2.bias": "model-00008-of-00008.safetensors",
732
+ "vpm.encoder.layers.26.mlp.fc2.weight": "model-00008-of-00008.safetensors",
733
+ "vpm.encoder.layers.26.self_attn.k_proj.bias": "model-00008-of-00008.safetensors",
734
+ "vpm.encoder.layers.26.self_attn.k_proj.weight": "model-00008-of-00008.safetensors",
735
+ "vpm.encoder.layers.26.self_attn.out_proj.bias": "model-00008-of-00008.safetensors",
736
+ "vpm.encoder.layers.26.self_attn.out_proj.weight": "model-00008-of-00008.safetensors",
737
+ "vpm.encoder.layers.26.self_attn.q_proj.bias": "model-00008-of-00008.safetensors",
738
+ "vpm.encoder.layers.26.self_attn.q_proj.weight": "model-00008-of-00008.safetensors",
739
+ "vpm.encoder.layers.26.self_attn.v_proj.bias": "model-00008-of-00008.safetensors",
740
+ "vpm.encoder.layers.26.self_attn.v_proj.weight": "model-00008-of-00008.safetensors",
741
+ "vpm.encoder.layers.3.layer_norm1.bias": "model-00007-of-00008.safetensors",
742
+ "vpm.encoder.layers.3.layer_norm1.weight": "model-00007-of-00008.safetensors",
743
+ "vpm.encoder.layers.3.layer_norm2.bias": "model-00007-of-00008.safetensors",
744
+ "vpm.encoder.layers.3.layer_norm2.weight": "model-00007-of-00008.safetensors",
745
+ "vpm.encoder.layers.3.mlp.fc1.bias": "model-00007-of-00008.safetensors",
746
+ "vpm.encoder.layers.3.mlp.fc1.weight": "model-00007-of-00008.safetensors",
747
+ "vpm.encoder.layers.3.mlp.fc2.bias": "model-00007-of-00008.safetensors",
748
+ "vpm.encoder.layers.3.mlp.fc2.weight": "model-00007-of-00008.safetensors",
749
+ "vpm.encoder.layers.3.self_attn.k_proj.bias": "model-00007-of-00008.safetensors",
750
+ "vpm.encoder.layers.3.self_attn.k_proj.weight": "model-00007-of-00008.safetensors",
751
+ "vpm.encoder.layers.3.self_attn.out_proj.bias": "model-00007-of-00008.safetensors",
752
+ "vpm.encoder.layers.3.self_attn.out_proj.weight": "model-00007-of-00008.safetensors",
753
+ "vpm.encoder.layers.3.self_attn.q_proj.bias": "model-00007-of-00008.safetensors",
754
+ "vpm.encoder.layers.3.self_attn.q_proj.weight": "model-00007-of-00008.safetensors",
755
+ "vpm.encoder.layers.3.self_attn.v_proj.bias": "model-00007-of-00008.safetensors",
756
+ "vpm.encoder.layers.3.self_attn.v_proj.weight": "model-00007-of-00008.safetensors",
757
+ "vpm.encoder.layers.4.layer_norm1.bias": "model-00007-of-00008.safetensors",
758
+ "vpm.encoder.layers.4.layer_norm1.weight": "model-00007-of-00008.safetensors",
759
+ "vpm.encoder.layers.4.layer_norm2.bias": "model-00007-of-00008.safetensors",
760
+ "vpm.encoder.layers.4.layer_norm2.weight": "model-00007-of-00008.safetensors",
761
+ "vpm.encoder.layers.4.mlp.fc1.bias": "model-00007-of-00008.safetensors",
762
+ "vpm.encoder.layers.4.mlp.fc1.weight": "model-00007-of-00008.safetensors",
763
+ "vpm.encoder.layers.4.mlp.fc2.bias": "model-00007-of-00008.safetensors",
764
+ "vpm.encoder.layers.4.mlp.fc2.weight": "model-00007-of-00008.safetensors",
765
+ "vpm.encoder.layers.4.self_attn.k_proj.bias": "model-00007-of-00008.safetensors",
766
+ "vpm.encoder.layers.4.self_attn.k_proj.weight": "model-00007-of-00008.safetensors",
767
+ "vpm.encoder.layers.4.self_attn.out_proj.bias": "model-00007-of-00008.safetensors",
768
+ "vpm.encoder.layers.4.self_attn.out_proj.weight": "model-00007-of-00008.safetensors",
769
+ "vpm.encoder.layers.4.self_attn.q_proj.bias": "model-00007-of-00008.safetensors",
770
+ "vpm.encoder.layers.4.self_attn.q_proj.weight": "model-00007-of-00008.safetensors",
771
+ "vpm.encoder.layers.4.self_attn.v_proj.bias": "model-00007-of-00008.safetensors",
772
+ "vpm.encoder.layers.4.self_attn.v_proj.weight": "model-00007-of-00008.safetensors",
773
+ "vpm.encoder.layers.5.layer_norm1.bias": "model-00007-of-00008.safetensors",
774
+ "vpm.encoder.layers.5.layer_norm1.weight": "model-00007-of-00008.safetensors",
775
+ "vpm.encoder.layers.5.layer_norm2.bias": "model-00007-of-00008.safetensors",
776
+ "vpm.encoder.layers.5.layer_norm2.weight": "model-00007-of-00008.safetensors",
777
+ "vpm.encoder.layers.5.mlp.fc1.bias": "model-00007-of-00008.safetensors",
778
+ "vpm.encoder.layers.5.mlp.fc1.weight": "model-00007-of-00008.safetensors",
779
+ "vpm.encoder.layers.5.mlp.fc2.bias": "model-00007-of-00008.safetensors",
780
+ "vpm.encoder.layers.5.mlp.fc2.weight": "model-00007-of-00008.safetensors",
781
+ "vpm.encoder.layers.5.self_attn.k_proj.bias": "model-00007-of-00008.safetensors",
782
+ "vpm.encoder.layers.5.self_attn.k_proj.weight": "model-00007-of-00008.safetensors",
783
+ "vpm.encoder.layers.5.self_attn.out_proj.bias": "model-00007-of-00008.safetensors",
784
+ "vpm.encoder.layers.5.self_attn.out_proj.weight": "model-00007-of-00008.safetensors",
785
+ "vpm.encoder.layers.5.self_attn.q_proj.bias": "model-00007-of-00008.safetensors",
786
+ "vpm.encoder.layers.5.self_attn.q_proj.weight": "model-00007-of-00008.safetensors",
787
+ "vpm.encoder.layers.5.self_attn.v_proj.bias": "model-00007-of-00008.safetensors",
788
+ "vpm.encoder.layers.5.self_attn.v_proj.weight": "model-00007-of-00008.safetensors",
789
+ "vpm.encoder.layers.6.layer_norm1.bias": "model-00007-of-00008.safetensors",
790
+ "vpm.encoder.layers.6.layer_norm1.weight": "model-00007-of-00008.safetensors",
791
+ "vpm.encoder.layers.6.layer_norm2.bias": "model-00007-of-00008.safetensors",
792
+ "vpm.encoder.layers.6.layer_norm2.weight": "model-00007-of-00008.safetensors",
793
+ "vpm.encoder.layers.6.mlp.fc1.bias": "model-00007-of-00008.safetensors",
794
+ "vpm.encoder.layers.6.mlp.fc1.weight": "model-00007-of-00008.safetensors",
795
+ "vpm.encoder.layers.6.mlp.fc2.bias": "model-00007-of-00008.safetensors",
796
+ "vpm.encoder.layers.6.mlp.fc2.weight": "model-00007-of-00008.safetensors",
797
+ "vpm.encoder.layers.6.self_attn.k_proj.bias": "model-00007-of-00008.safetensors",
798
+ "vpm.encoder.layers.6.self_attn.k_proj.weight": "model-00007-of-00008.safetensors",
799
+ "vpm.encoder.layers.6.self_attn.out_proj.bias": "model-00007-of-00008.safetensors",
800
+ "vpm.encoder.layers.6.self_attn.out_proj.weight": "model-00007-of-00008.safetensors",
801
+ "vpm.encoder.layers.6.self_attn.q_proj.bias": "model-00007-of-00008.safetensors",
802
+ "vpm.encoder.layers.6.self_attn.q_proj.weight": "model-00007-of-00008.safetensors",
803
+ "vpm.encoder.layers.6.self_attn.v_proj.bias": "model-00007-of-00008.safetensors",
804
+ "vpm.encoder.layers.6.self_attn.v_proj.weight": "model-00007-of-00008.safetensors",
805
+ "vpm.encoder.layers.7.layer_norm1.bias": "model-00007-of-00008.safetensors",
806
+ "vpm.encoder.layers.7.layer_norm1.weight": "model-00007-of-00008.safetensors",
807
+ "vpm.encoder.layers.7.layer_norm2.bias": "model-00007-of-00008.safetensors",
808
+ "vpm.encoder.layers.7.layer_norm2.weight": "model-00007-of-00008.safetensors",
809
+ "vpm.encoder.layers.7.mlp.fc1.bias": "model-00007-of-00008.safetensors",
810
+ "vpm.encoder.layers.7.mlp.fc1.weight": "model-00007-of-00008.safetensors",
811
+ "vpm.encoder.layers.7.mlp.fc2.bias": "model-00007-of-00008.safetensors",
812
+ "vpm.encoder.layers.7.mlp.fc2.weight": "model-00007-of-00008.safetensors",
813
+ "vpm.encoder.layers.7.self_attn.k_proj.bias": "model-00007-of-00008.safetensors",
814
+ "vpm.encoder.layers.7.self_attn.k_proj.weight": "model-00007-of-00008.safetensors",
815
+ "vpm.encoder.layers.7.self_attn.out_proj.bias": "model-00007-of-00008.safetensors",
816
+ "vpm.encoder.layers.7.self_attn.out_proj.weight": "model-00007-of-00008.safetensors",
817
+ "vpm.encoder.layers.7.self_attn.q_proj.bias": "model-00007-of-00008.safetensors",
818
+ "vpm.encoder.layers.7.self_attn.q_proj.weight": "model-00007-of-00008.safetensors",
819
+ "vpm.encoder.layers.7.self_attn.v_proj.bias": "model-00007-of-00008.safetensors",
820
+ "vpm.encoder.layers.7.self_attn.v_proj.weight": "model-00007-of-00008.safetensors",
821
+ "vpm.encoder.layers.8.layer_norm1.bias": "model-00007-of-00008.safetensors",
822
+ "vpm.encoder.layers.8.layer_norm1.weight": "model-00007-of-00008.safetensors",
823
+ "vpm.encoder.layers.8.layer_norm2.bias": "model-00007-of-00008.safetensors",
824
+ "vpm.encoder.layers.8.layer_norm2.weight": "model-00007-of-00008.safetensors",
825
+ "vpm.encoder.layers.8.mlp.fc1.bias": "model-00007-of-00008.safetensors",
826
+ "vpm.encoder.layers.8.mlp.fc1.weight": "model-00007-of-00008.safetensors",
827
+ "vpm.encoder.layers.8.mlp.fc2.bias": "model-00007-of-00008.safetensors",
828
+ "vpm.encoder.layers.8.mlp.fc2.weight": "model-00007-of-00008.safetensors",
829
+ "vpm.encoder.layers.8.self_attn.k_proj.bias": "model-00007-of-00008.safetensors",
830
+ "vpm.encoder.layers.8.self_attn.k_proj.weight": "model-00007-of-00008.safetensors",
831
+ "vpm.encoder.layers.8.self_attn.out_proj.bias": "model-00007-of-00008.safetensors",
832
+ "vpm.encoder.layers.8.self_attn.out_proj.weight": "model-00007-of-00008.safetensors",
833
+ "vpm.encoder.layers.8.self_attn.q_proj.bias": "model-00007-of-00008.safetensors",
834
+ "vpm.encoder.layers.8.self_attn.q_proj.weight": "model-00007-of-00008.safetensors",
835
+ "vpm.encoder.layers.8.self_attn.v_proj.bias": "model-00007-of-00008.safetensors",
836
+ "vpm.encoder.layers.8.self_attn.v_proj.weight": "model-00007-of-00008.safetensors",
837
+ "vpm.encoder.layers.9.layer_norm1.bias": "model-00007-of-00008.safetensors",
838
+ "vpm.encoder.layers.9.layer_norm1.weight": "model-00007-of-00008.safetensors",
839
+ "vpm.encoder.layers.9.layer_norm2.bias": "model-00007-of-00008.safetensors",
840
+ "vpm.encoder.layers.9.layer_norm2.weight": "model-00007-of-00008.safetensors",
841
+ "vpm.encoder.layers.9.mlp.fc1.bias": "model-00007-of-00008.safetensors",
842
+ "vpm.encoder.layers.9.mlp.fc1.weight": "model-00007-of-00008.safetensors",
843
+ "vpm.encoder.layers.9.mlp.fc2.bias": "model-00007-of-00008.safetensors",
844
+ "vpm.encoder.layers.9.mlp.fc2.weight": "model-00007-of-00008.safetensors",
845
+ "vpm.encoder.layers.9.self_attn.k_proj.bias": "model-00007-of-00008.safetensors",
846
+ "vpm.encoder.layers.9.self_attn.k_proj.weight": "model-00007-of-00008.safetensors",
847
+ "vpm.encoder.layers.9.self_attn.out_proj.bias": "model-00007-of-00008.safetensors",
848
+ "vpm.encoder.layers.9.self_attn.out_proj.weight": "model-00007-of-00008.safetensors",
849
+ "vpm.encoder.layers.9.self_attn.q_proj.bias": "model-00007-of-00008.safetensors",
850
+ "vpm.encoder.layers.9.self_attn.q_proj.weight": "model-00007-of-00008.safetensors",
851
+ "vpm.encoder.layers.9.self_attn.v_proj.bias": "model-00007-of-00008.safetensors",
852
+ "vpm.encoder.layers.9.self_attn.v_proj.weight": "model-00007-of-00008.safetensors",
853
+ "vpm.post_layernorm.bias": "model-00008-of-00008.safetensors",
854
+ "vpm.post_layernorm.weight": "model-00008-of-00008.safetensors"
855
+ }
856
+ }
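
The `weight_map` above tells a loader which shard file holds each tensor, so resolving a shard is a plain dictionary lookup. A minimal sketch, assuming nothing beyond the index format (the two entries are copied from the map above; the helper name is illustrative):

```python
import json

# Two entries copied from the weight_map above.
index = json.loads("""
{
  "weight_map": {
    "vpm.encoder.layers.3.mlp.fc1.weight": "model-00007-of-00008.safetensors",
    "vpm.post_layernorm.weight": "model-00008-of-00008.safetensors"
  }
}
""")

def shard_for(tensor_name: str) -> str:
    # Illustrative helper: which .safetensors shard stores a given tensor.
    return index["weight_map"][tensor_name]

print(shard_for("vpm.post_layernorm.weight"))  # model-00008-of-00008.safetensors
```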
preprocessor_config.json ADDED
@@ -0,0 +1,24 @@
+ {
+   "image_processor_type": "MiniCPMVImageProcessor",
+   "auto_map": {
+     "AutoProcessor": "processing_minicpmv.MiniCPMVProcessor",
+     "AutoImageProcessor": "image_processing_minicpmv.MiniCPMVImageProcessor"
+   },
+   "processor_class": "MiniCPMVProcessor",
+   "max_slice_nums": 9,
+   "scale_resolution": 448,
+   "patch_size": 14,
+   "use_image_id": true,
+   "image_feature_size": 64,
+   "im_start": "<image>",
+   "im_end": "</image>",
+   "slice_start": "<slice>",
+   "slice_end": "</slice>",
+   "unk": "<unk>",
+   "im_id_start": "<image_id>",
+   "im_id_end": "</image_id>",
+   "slice_mode": true,
+   "norm_mean": [0.5, 0.5, 0.5],
+   "norm_std": [0.5, 0.5, 0.5],
+   "version": 2.6
+ }
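
With `norm_mean` and `norm_std` both set to 0.5 per channel, standard `(x - mean) / std` normalization maps pixel values from [0, 1] to [-1, 1]. A quick sketch of that arithmetic (the function name is illustrative, not part of the config):

```python
def normalize(pixel: float, mean: float = 0.5, std: float = 0.5) -> float:
    # (x - mean) / std with mean = std = 0.5, as in the config above.
    return (pixel - mean) / std

# 0.0 -> -1.0, 0.5 -> 0.0, 1.0 -> 1.0
print(normalize(0.0), normalize(0.5), normalize(1.0))
```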
processing_minicpmv.py ADDED
@@ -0,0 +1,255 @@
+ # coding=utf-8
+ # Copyright 2024 The HuggingFace Inc. team.
+ #
+ # Licensed under the Apache License, Version 2.0 (the "License");
+ # you may not use this file except in compliance with the License.
+ # You may obtain a copy of the License at
+ #
+ # http://www.apache.org/licenses/LICENSE-2.0
+ #
+ # Unless required by applicable law or agreed to in writing, software
+ # distributed under the License is distributed on an "AS IS" BASIS,
+ # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ # See the License for the specific language governing permissions and
+ # limitations under the License.
+ """
+ Processor class for MiniCPMV.
+ """
+ 
+ from typing import List, Optional, Union, Dict, Any
+ import torch
+ import re
+ 
+ from transformers.image_processing_utils import BatchFeature
+ from transformers.image_utils import ImageInput
+ from transformers.processing_utils import ProcessorMixin
+ from transformers.tokenization_utils_base import PaddingStrategy, PreTokenizedInput, TextInput, TruncationStrategy
+ from transformers.utils import TensorType, requires_backends, is_torch_dtype, is_torch_device
+ 
+ from .image_processing_minicpmv import MiniCPMVBatchFeature
+ 
+ 
+ class MiniCPMVProcessor(ProcessorMixin):
+     r"""
+     Constructs a MiniCPMV processor which wraps a MiniCPMV image processor and a MiniCPMV tokenizer into a single processor.
+ 
+     [`MiniCPMVProcessor`] offers all the functionalities of [`MiniCPMVImageProcessor`] and [`LlamaTokenizerWrapper`]. See the
+     [`~MiniCPMVProcessor.__call__`] and [`~MiniCPMVProcessor.decode`] for more information.
+ 
+     Args:
+         image_processor ([`MiniCPMVImageProcessor`], *optional*):
+             The image processor is a required input.
+         tokenizer ([`LlamaTokenizerWrapper`], *optional*):
+             The tokenizer is a required input.
+     """
+     attributes = ["image_processor", "tokenizer"]
+     image_processor_class = "AutoImageProcessor"
+     tokenizer_class = "AutoTokenizer"
+ 
+     def __init__(self, image_processor=None, tokenizer=None):
+         super().__init__(image_processor, tokenizer)
+         self.version = image_processor.version
+ 
+     def __call__(
+         self,
+         text: Union[TextInput, PreTokenizedInput, List[TextInput], List[PreTokenizedInput]],
+         images: ImageInput = None,
+         max_length: Optional[int] = None,
+         do_pad: Optional[bool] = True,
+         max_slice_nums: int = None,
+         use_image_id: bool = None,
+         temporal_ids: Optional[Union[List[List[int]], List[List[List[int]]]]] = None,
+         return_tensors: Optional[Union[str, TensorType]] = TensorType.PYTORCH,
+         **kwargs
+     ) -> MiniCPMVBatchFeature:
+ 
+         if images is not None:
+             # image_inputs = self.image_processor(images, do_pad=do_pad, max_slice_nums=max_slice_nums, return_tensors=return_tensors)
+             image_inputs = self.image_processor(images, do_pad=do_pad, max_slice_nums=max_slice_nums, temporal_ids=temporal_ids, return_tensors=return_tensors)
+         # return self._convert_images_texts_to_inputs(image_inputs, text, max_slice_nums=max_slice_nums, use_image_id=use_image_id, max_length=max_length, **kwargs)
+         return self._convert_images_texts_to_inputs(image_inputs, text, max_slice_nums=max_slice_nums, use_image_id=use_image_id, max_length=max_length, temporal_ids=temporal_ids, **kwargs)
+ 
+     # Copied from transformers.models.clip.processing_clip.CLIPProcessor.batch_decode with CLIP->Llama
+     def batch_decode(self, *args, **kwargs):
+         """
+         This method forwards all its arguments to LlamaTokenizerFast's [`~PreTrainedTokenizer.batch_decode`]. Please
+         refer to the docstring of this method for more information.
+         """
+         output_ids = args[0]
+         result_text = []
+         for result in output_ids:
+             result = result[result != 0]
+             if result[0] == self.tokenizer.bos_id:
+                 result = result[1:]
+             if result[-1] == self.tokenizer.eos_id:
+                 result = result[:-1]
+             result_text.append(self.tokenizer.decode(result, *args[1:], **kwargs).strip())
+         return result_text
+         # return self.tokenizer.batch_decode(*args, **kwargs)
+ 
+     # Copied from transformers.models.clip.processing_clip.CLIPProcessor.decode with CLIP->Llama
+     def decode(self, *args, **kwargs):
+         """
+         This method forwards all its arguments to LlamaTokenizerFast's [`~PreTrainedTokenizer.decode`]. Please refer to
+         the docstring of this method for more information.
+         """
+         result = args[0]
+         result = result[result != 0]
+         if result[0] == self.tokenizer.bos_id:
+             result = result[1:]
+         if result[-1] == self.tokenizer.eos_id or (hasattr(self.tokenizer, "eot_id") and result[-1] == self.tokenizer.eot_id):
+             result = result[:-1]
+         return self.tokenizer.decode(result, *args[1:], **kwargs).strip()
+ 
+     def _convert(
+         self, input_str, max_inp_length: Optional[int] = None
+     ):
+         if self.version > 2.5 or not getattr(self.tokenizer, "add_bos_token", False):
+             input_ids = self.tokenizer.encode(input_str)
+         else:
+             input_ids = [self.tokenizer.bos_id] + self.tokenizer.encode(input_str)
+         if max_inp_length is not None:
+             input_ids = input_ids[:max_inp_length]
+         input_ids = torch.tensor(input_ids, dtype=torch.int32)
+ 
+         start_cond = (input_ids == self.tokenizer.im_start_id) | (input_ids == self.tokenizer.slice_start_id)
+         end_cond = (input_ids == self.tokenizer.im_end_id) | (input_ids == self.tokenizer.slice_end_id)
+ 
+         image_start_tokens = torch.where(start_cond)[0]
+         image_start_tokens += 1
+         image_end_tokens = torch.where(end_cond)[0]
+ 
+         valid_image_nums = max(len(image_start_tokens), len(image_end_tokens))
+ 
+         image_bounds = torch.hstack(
+             [
+                 image_start_tokens[:valid_image_nums].unsqueeze(-1),
+                 image_end_tokens[:valid_image_nums].unsqueeze(-1),
+             ]
+         )
+         return input_ids, image_bounds
+ 
+     def _convert_images_texts_to_inputs(
+         self,
+         images,
+         texts: Union[str, List[str]],
+         truncation=None,
+         max_length=None,
+         max_slice_nums=None,
+         use_image_id=None,
+         return_tensors=None,
+         **kwargs
+     ):
+         if images is None or not len(images):
+             model_inputs = self.tokenizer(texts, return_tensors=return_tensors, truncation=truncation, max_length=max_length, **kwargs)
+             return MiniCPMVBatchFeature(data={**model_inputs})
+ 
+         pattern = "(<image>./</image>)"
+         # images, image_sizes, tgt_sizes = images["pixel_values"], images["image_sizes"], images["tgt_sizes"]
+         images, image_sizes, tgt_sizes, temporal_ids, skip_image_idx = images["pixel_values"], images["image_sizes"], images["tgt_sizes"], images["temporal_ids"], images["skip_image_idx"]
+ 
+         if isinstance(texts, str):
+             texts = [texts]
+         input_ids_list = []
+         image_bounds_list = []
+         for index, (text, skip_idx) in enumerate(zip(texts, skip_image_idx)):
+             image_tags = re.findall(pattern, text)
+             assert len(image_tags) == len(image_sizes[index])
+             text_chunks = text.split(pattern)
+             final_text = ""
+ 
+             for i in range(len(image_tags)):
+                 if i in skip_idx:
+                     image_placeholder = ''
+                     text_chunk = text_chunks[i].strip()
+ 
+                 else:
+                     image_placeholder = self.image_processor.get_slice_image_placeholder(
+                         image_sizes[index][i],
+                         i,
+                         max_slice_nums,
+                         use_image_id
+                     )
+                     text_chunk = text_chunks[i]
+ 
+                 final_text = final_text + text_chunk + image_placeholder
+ 
+             final_text += text_chunks[-1]
+ 
+             input_ids, image_bounds = self._convert(final_text, max_length)
+             input_ids_list.append(input_ids)
+             image_bounds_list.append(image_bounds)
+         padded_input_ids, padding_lengths = self.pad(
+             input_ids_list,
+             padding_side="left"
+         )
+         for i, length in enumerate(padding_lengths):
+             image_bounds_list[i] = image_bounds_list[i] + length
+         attention_mask = padded_input_ids.ne(0)
+ 
+         return MiniCPMVBatchFeature(data={
+             "input_ids": padded_input_ids,
+             "attention_mask": attention_mask,
+             "pixel_values": images,
+             "image_sizes": image_sizes,
+             "image_bound": image_bounds_list,
+             "tgt_sizes": tgt_sizes,
+             "temporal_ids": temporal_ids
+         })
+ 
+     @property
+     # Copied from transformers.models.clip.processing_clip.CLIPProcessor.model_input_names
+     def model_input_names(self):
+         tokenizer_input_names = self.tokenizer.model_input_names
+         image_processor_input_names = self.image_processor.model_input_names
+         return list(dict.fromkeys(tokenizer_input_names + image_processor_input_names))
+ 
+ 
+     def pad(self, inputs, max_length=None, padding_value=0, padding_side="left"):
+         items = []
+         if isinstance(inputs[0], list):
+             assert isinstance(inputs[0][0], torch.Tensor)
+             for it in inputs:
+                 for tr in it:
+                     items.append(tr)
+         else:
+             assert isinstance(inputs[0], torch.Tensor)
+             items = inputs
+ 
+         batch_size = len(items)
+         shape = items[0].shape
+         dim = len(shape)
+         assert dim <= 2
+         if max_length is None:
+             max_length = 0
+         max_length = max(max_length, max(item.shape[-1] for item in items))
+         min_length = min(item.shape[-1] for item in items)
+         dtype = items[0].dtype
+ 
+         if dim == 0:
+             return torch.stack([item for item in items], dim=0), [0]
+         elif dim == 1:
+             if max_length == min_length:
+                 return torch.stack([item for item in items], dim=0), [0] * batch_size
+             tensor = torch.zeros((batch_size, max_length), dtype=dtype) + padding_value
+         else:
+             tensor = (
+                 torch.zeros((batch_size, max_length, shape[-1]), dtype=dtype)
+                 + padding_value
+             )
+ 
+         padding_length = []
+         for i, item in enumerate(items):
+             if dim == 1:
+                 if padding_side == "left":
+                     tensor[i, -len(item) :] = item.clone()
+                 else:
+                     tensor[i, : len(item)] = item.clone()
+             elif dim == 2:
+                 if padding_side == "left":
+                     tensor[i, -len(item) :, :] = item.clone()
+                 else:
+                     tensor[i, : len(item), :] = item.clone()
+             padding_length.append(tensor.shape[-1] - len(item))
+ 
+         return tensor, padding_length
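
The `pad` method in this file left-pads variable-length `input_ids` to a common length and returns the per-row padding lengths, which `_convert_images_texts_to_inputs` then adds to each `image_bound` so the bounds still point at the right token positions after padding. A torch-free sketch of that left-padding behavior (the helper name is illustrative, not from the file):

```python
def left_pad(seqs, pad_value=0):
    # Mirror the 1-D case of MiniCPMVProcessor.pad: left-pad every sequence
    # to the batch max length and report how much padding each row received
    # (the offsets used to shift the image bounds).
    max_len = max(len(s) for s in seqs)
    padded, pad_lens = [], []
    for s in seqs:
        n = max_len - len(s)
        padded.append([pad_value] * n + list(s))
        pad_lens.append(n)
    return padded, pad_lens

rows, lens = left_pad([[5, 6, 7], [8]])
print(rows)  # [[5, 6, 7], [0, 0, 8]]
print(lens)  # [0, 2]
```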
special_tokens_map.json ADDED
@@ -0,0 +1,112 @@
+ {
+   "additional_special_tokens": [
+     "<unk>",
+     "<image>",
+     "</image>",
+     "<ref>",
+     "</ref>",
+     "<box>",
+     "</box>",
+     "<quad>",
+     "</quad>",
+     "<point>",
+     "</point>",
+     "<slice>",
+     "</slice>",
+     "<image_id>",
+     "</image_id>",
+     "<unit>",
+     "</unit>",
+     "<|reserved_0|>",
+     "<|reserved_1|>",
+     "<|reserved_2|>",
+     "<|reserved_3|>",
+     "<|reserved_4|>",
+     "<|reserved_5|>",
+     "<|reserved_6|>",
+     "<|reserved_7|>",
+     "<|reserved_8|>",
+     "<|reserved_9|>",
+     "<|reserved_10|>",
+     "<|reserved_11|>",
+     "<|reserved_12|>",
+     "<|reserved_13|>",
+     "<|reserved_14|>",
+     "<|reserved_15|>",
+     "<|reserved_16|>",
+     "<|reserved_17|>",
+     "<|reserved_18|>",
+     "<|reserved_19|>",
+     "<|reserved_20|>",
+     "<|reserved_21|>",
+     "<|reserved_22|>",
+     "<|reserved_23|>",
+     "<|reserved_24|>",
+     "<|reserved_25|>",
+     "<|reserved_26|>",
+     "<|reserved_27|>",
+     "<|reserved_28|>",
+     "<|reserved_29|>",
+     "<|reserved_30|>",
+     "<|reserved_31|>",
+     "<|reserved_32|>",
+     "<|reserved_33|>",
+     "<|reserved_34|>",
+     "<|reserved_35|>",
+     "<|reserved_36|>",
+     "<|reserved_37|>",
+     "<|reserved_38|>",
+     "<|reserved_39|>",
+     "<|reserved_40|>",
+     "<|reserved_41|>",
+     "<|reserved_42|>",
+     "<|reserved_43|>",
+     "<|reserved_44|>",
+     "<|reserved_45|>",
+     "<|reserved_46|>",
+     "<|reserved_47|>",
+     "<|reserved_48|>",
+     "<|reserved_49|>",
+     "<|reserved_50|>",
+     "<|reserved_51|>",
+     "<|reserved_52|>",
+     "<|reserved_53|>",
+     "<|reserved_54|>",
+     "<|reserved_55|>",
+     "<|reserved_56|>",
+     "<|reserved_57|>",
+     "<|reserved_58|>",
+     "<|reserved_59|>",
+     "<|reserved_60|>",
+     "<|reserved_61|>",
+     "<|reserved_62|>"
+   ],
+   "bos_token": {
+     "content": "<|im_start|>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "eos_token": {
+     "content": "<|im_end|>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "pad_token": {
+     "content": "<|endoftext|>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "unk_token": {
+     "content": "<unk>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   }
+ }
tokenization_minicpmv_fast.py ADDED
@@ -0,0 +1,66 @@
+ from transformers import Qwen2TokenizerFast
+ 
+ 
+ class MiniCPMVTokenizerFast(Qwen2TokenizerFast):
+     def __init__(self, **kwargs):
+         super().__init__(**kwargs)
+         self.im_start = "<image>"
+         self.im_end = "</image>"
+         self.ref_start = "<ref>"
+         self.ref_end = "</ref>"
+         self.box_start = "<box>"
+         self.box_end = "</box>"
+         self.quad_start = "<quad>"
+         self.quad_end = "</quad>"
+         self.slice_start = "<slice>"
+         self.slice_end = "</slice>"
+         self.im_id_start = "<image_id>"
+         self.im_id_end = "</image_id>"
+ 
+     @property
+     def eos_id(self):
+         return self.eos_token_id
+ 
+     @property
+     def bos_id(self):
+         return self.bos_token_id
+ 
+     @property
+     def unk_id(self):
+         return self.unk_token_id
+ 
+     @property
+     def im_start_id(self):
+         return self.convert_tokens_to_ids(self.im_start)
+ 
+     @property
+     def im_end_id(self):
+         return self.convert_tokens_to_ids(self.im_end)
+ 
+     @property
+     def slice_start_id(self):
+         return self.convert_tokens_to_ids(self.slice_start)
+ 
+     @property
+     def slice_end_id(self):
+         return self.convert_tokens_to_ids(self.slice_end)
+ 
+     @property
+     def im_id_start_id(self):
+         return self.convert_tokens_to_ids(self.im_id_start)
+ 
+     @property
+     def im_id_end_id(self):
+         return self.convert_tokens_to_ids(self.im_id_end)
+ 
+     @property
+     def newline_id(self):
+         return self.convert_tokens_to_ids('\n')
+ 
+     @staticmethod
+     def escape(text: str) -> str:
+         return text
+ 
+     @staticmethod
+     def unescape(text: str) -> str:
+         return text
tokenizer.json ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:0ea5706e1afcf5774ae15138f8418dfd46b30558d6a2c9782f4eae9ff25341e0
+ size 11825878
tokenizer_config.json ADDED
The diff for this file is too large to render. See raw diff
 
vocab.json ADDED
The diff for this file is too large to render. See raw diff