irene93 committed
Commit 5791e02 · verified · 1 Parent(s): 866f9b5

Upload folder using huggingface_hub

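The commit message indicates this upload was made with `huggingface_hub`'s folder upload. A minimal sketch of the call that produces a commit like this one; the local path and repo id below are placeholders, not taken from this page:

```python
from huggingface_hub import HfApi

api = HfApi()
# folder_path and repo_id are illustrative placeholders
api.upload_folder(
    folder_path="./finetuned-bge-m3",  # local directory with the model files
    repo_id="user/model-repo",         # destination repo on the Hub
    repo_type="model",
    commit_message="Upload folder using huggingface_hub",
)
```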
.gitattributes CHANGED
@@ -33,3 +33,4 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
  *.zip filter=lfs diff=lfs merge=lfs -text
  *.zst filter=lfs diff=lfs merge=lfs -text
  *tfevents* filter=lfs diff=lfs merge=lfs -text
+ tokenizer.json filter=lfs diff=lfs merge=lfs -text
1_Pooling/config.json ADDED
@@ -0,0 +1,10 @@
+ {
+ "word_embedding_dimension": 1024,
+ "pooling_mode_cls_token": false,
+ "pooling_mode_mean_tokens": true,
+ "pooling_mode_max_tokens": false,
+ "pooling_mode_mean_sqrt_len_tokens": false,
+ "pooling_mode_weightedmean_tokens": false,
+ "pooling_mode_lasttoken": false,
+ "include_prompt": true
+ }
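For reference, a minimal sketch of the masked mean pooling this config selects (`pooling_mode_mean_tokens: true`, all other modes off): token embeddings are averaged over non-padding positions to produce the 1024-dimensional sentence vector. The function and tensor names are illustrative.

```python
import torch

def mean_pool(token_embeddings: torch.Tensor, attention_mask: torch.Tensor) -> torch.Tensor:
    # token_embeddings: (batch, seq_len, 1024); attention_mask: (batch, seq_len) of 0/1
    mask = attention_mask.unsqueeze(-1).to(token_embeddings.dtype)
    summed = (token_embeddings * mask).sum(dim=1)  # sum embeddings of real tokens only
    counts = mask.sum(dim=1).clamp(min=1e-9)       # token counts, guarded against zero
    return summed / counts                         # (batch, 1024) sentence embeddings
```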
README.md ADDED
@@ -0,0 +1,693 @@
+ ---
+ tags:
+ - sentence-transformers
+ - sentence-similarity
+ - feature-extraction
+ - generated_from_trainer
+ - dataset_size:23404
+ - loss:MultipleNegativesRankingLoss
+ base_model: BAAI/bge-m3
+ widget:
+ - source_sentence: What concern did analysts have regarding Microsoft's canceled data
+ center leases, and how might recent growth in Azure address these concerns?
+ sentences:
+ - 'Microsoft''s Azure cloud division in particular grew by a third over
+
+ the quarter. The company says about half of that was contributed by AI, news
+
+ that could ease worries about a slowdown in demand for the new tech. Some
+
+ analysts had previously pointed to canceled data center leases by Microsoft as
+
+ a sign of excess capacity.'
+ - "The 2-year note <US2YT=RR> yield, which typically moves in\nstep with interest\
+ \ rate expectations for the Federal Reserve,\nfell 2.5 basis points to 3.66%,\
+ \ from 3.685% late on Monday. Oil prices slid on fears of a global recession and\
+ \ dampening\ndemand due to Trump's trade war. U.S. crude <CLc1> fell 2.63% to\
+ \ settle at $60.42 per barrel,\n while Brent <LCOc1> settled at $64.25 per barrel,\
+ \ down 2.44% on\nthe day."
+ - 'They tried to bring back tariffs to save our country, but it
+
+ was gone. It was too late. Nothing could have been done.'
+ - source_sentence: How are evolving domestic alternatives in China expected to impact
+ U.S. semiconductor firms?
+ sentences:
+ - 'And Mr. Secretary, why don''t you talk a
+
+ little bit about? >> So it''s really a three legged stool in the economic
+
+ policy. It''s trade, it''s tax, and it''s deregulation.'
+ - "* \n Arm's CPUs gain popularity due to lower power consumption\n \n\n\
+ \ * \n Amazon, Google, Microsoft design Arm-based data center\nchips\n\
+ \ \n\n * \n Arm's journey to 50% market share took 20 years\n \
+ \ \n\n \n By Max A. Cherney\n SAN FRANCISCO, March 31 (Reuters) -\
+ \ Arm Holdings\n<O9Ty.F> expects its share of the global market for data center\n\
+ central processing units to surge to 50% by the end of the year,\nup from about\
+ \ 15% in 2024 with gains driven by the boom in\nartificial intelligence, a senior\
+ \ executive said. Arm's CPUs are often used as a \"host\" chip inside of an AI\n\
+ computing system and act as a kind of traffic controller for\nother AI chips.\
+ \ Nvidia <NVDA.O>, for example, uses an Arm-based\nchip called Grace in some of\
+ \ its advanced AI systems which\ncontain two of its Blackwell chips."
+ - "\"Semiconductors will feel a greater impact ... We're already\nwitnessing a domestic\
+ \ ecosystem evolve in China, with direct\nalternatives for every major US semiconductor\
+ \ firm. This trend\nis likely to accelerate,\" Udupa said. NATURAL RESOURCES\n\
+ \ Crude prices, already under pressure from an expected OPEC+\noil output hike\
+ \ in May, added to the losses."
+ - source_sentence: Which company has declared a quarterly coupon on the Alerian MLP
+ Index ETN?
+ sentences:
+ - JPMORGAN CHASE FINANCIAL COMPANY LLC DECLARES QUARTERLY COUPON ON ALERIAN MLP
+ INDEX ETN
+ - 'The
+
+ industry has been clouded by fears that a trade dispute could
+
+ temper consumer confidence, reduce spending, weaken loan demand
+
+ and pressure fees from advising on deals. JPMorgan Chase <JPM.N>, the biggest
+ U.S. bank by assets,
+
+ sank 7%. Wall Street titans Goldman Sachs <GS.N> and Morgan
+
+ Stanley <MS.N> dropped more than 7% each.'
+ - 'They went to foreign countries and
+
+ they built companies are pouring into our country at levels never seen before
+
+ with jobs and money to follow, and it''s really beautiful. In the coming days,
+
+ there will be complaints from the globalists and the outsources and special
+
+ interests. And always fake news will always complain.'
+ - source_sentence: What reasons did the Data Center Coalition give for opposing ERCOT's
+ proposal regarding data centers and crypto miners?
+ sentences:
+ - "The company said in August 2024 that it was set to start\ntalks with other pharmaceutical\
+ \ companies in the second half of\nthis year for potential partnerships to develop\
+ \ and\ncommercialize petrelintide. STRUCTURE THERAPEUTICS\n Structure Therapeutics\
+ \ <GPCR.O> said last year its\nexperimental oral obesity drug helped reduce weight\
+ \ by 6.2% on\naverage at the end of 12 weeks in a mid-stage study. <^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\
+ Weight-loss drug forecasts jump to $150 billion as supply grows \n https://www.reuters.com/business/healthcare-pharmaceuticals/weight-loss-drug-forecasts-jump-150-billion-supply-grows-2024-05-28/\n\
+ \ ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^>\n (Reporting\
+ \ by Mariam Sunny, Kamal Choudhury, Pratik Jain,\nSriparna Roy, Leroy Leo, Sneha\
+ \ S K and Nathan Gomes in\nBengaluru; Editing by Shilpi Majumdar and Alan Barona)\n\
+ \ "
+ - '"When you
+
+ make that shift from de-risking your books to getting back in a
+
+ risk-on attitude, that is going to show up first in all things
+
+ technology and specifically the Mag Seven for sure." The group''s 2025 struggles
+ are a dramatic shift from the
+
+ prior two years. The Magnificent Seven''s stunning gains meant
+
+ they were responsible for well over half of the S&P 500''s 58%
+
+ two-year return in 2023 and 2024.'
+ - 'But
+
+ data center operators are opposed because of the risk of
+
+ damaging electronic equipment and cooling systems. ERCOT last year withdrew a
+ proposal that would have imposed
+
+ ride-through restrictions on data centers and crypto miners
+
+ after facing pushback from an industry group, the Data Center
+
+ Coalition. The group, whose members include Amazon, Google, and Meta,
+
+ cited costs and the risk of damaging computer chips and cooling
+
+ systems exposed to fluctuating voltage levels.'
+ - source_sentence: What factors are contributing to pressure on Apple's market share
+ in China?
+ sentences:
+ - 'The company forecast low-to-mid single-digit
+
+ revenue growth, in line with muted expectations. In China, Apple posted $16 billion
+ in revenue, slightly
+
+ above forecasts, though competition from Huawei and slower AI
+
+ rollout continue to pressure market share. If losses hold, Apple is on track to
+ shed more than $150
+
+ billion in market value, while a bullish outlook from Microsoft
+
+ <MSFT.O> earlier this week has helped the Windows-maker become
+
+ the world''s most valuable company.'
+ - 'With recent
+
+ exchange rate fluctuations adding to the uncertainty, we are
+
+ taking a more cautious outlook for the near future." While Washington and Beijing
+ on Monday agreed to slash
+
+ tariffs for at least 90 days, the cheer over the temporary truce
+
+ was tempered by caution given a more permanent trade deal needs
+
+ to be struck, while higher tariffs overall could still weigh on
+
+ the global economy. Most of the iPhones Foxconn makes for Apple are assembled
+ in
+
+ China.'
+ - "It sounds\nlike what you’re saying CATL obviously major battery supplier,\
+ \ huge kind\nof company undergirding the electric revolution, you have DeepSeek,\
+ \ as you\nsaid, kind of prove the viability of AI engineering in a world where\
+ \ the US is\ntrying to crack down on Chinese access to leading edge technology.\
+ \ So, like\nboth of those are like, “Oh, you can actually get a tech rally\
+ \ over in\nChina despite what’s happening in the trade war.” Is\
+ \ that kind of\nlike it’s just the same mania, it’s just going\
+ \ where it has a new\noutlet or? Well, I mean, I think to some extent."
+ pipeline_tag: sentence-similarity
+ library_name: sentence-transformers
+ metrics:
+ - cosine_accuracy@1
+ - cosine_accuracy@3
+ - cosine_accuracy@5
+ - cosine_accuracy@10
+ - cosine_precision@1
+ - cosine_precision@3
+ - cosine_precision@5
+ - cosine_precision@10
+ - cosine_recall@1
+ - cosine_recall@3
+ - cosine_recall@5
+ - cosine_recall@10
+ - cosine_ndcg@10
+ - cosine_mrr@10
+ - cosine_map@100
+ model-index:
+ - name: SentenceTransformer based on BAAI/bge-m3
+ results:
+ - task:
+ type: information-retrieval
+ name: Information Retrieval
+ dataset:
+ name: Unknown
+ type: unknown
+ metrics:
+ - type: cosine_accuracy@1
+ value: 0.34541104084771834
+ name: Cosine Accuracy@1
+ - type: cosine_accuracy@3
+ value: 0.6057084259101009
+ name: Cosine Accuracy@3
+ - type: cosine_accuracy@5
+ value: 0.7222696974876089
+ name: Cosine Accuracy@5
+ - type: cosine_accuracy@10
+ value: 0.8465219620577679
+ name: Cosine Accuracy@10
+ - type: cosine_precision@1
+ value: 0.34541104084771834
+ name: Cosine Precision@1
+ - type: cosine_precision@3
+ value: 0.20190280863670027
+ name: Cosine Precision@3
+ - type: cosine_precision@5
+ value: 0.14445393949752178
+ name: Cosine Precision@5
+ - type: cosine_precision@10
+ value: 0.08465219620577678
+ name: Cosine Precision@10
+ - type: cosine_recall@1
+ value: 0.34541104084771834
+ name: Cosine Recall@1
+ - type: cosine_recall@3
+ value: 0.6057084259101009
+ name: Cosine Recall@3
+ - type: cosine_recall@5
+ value: 0.7222696974876089
+ name: Cosine Recall@5
+ - type: cosine_recall@10
+ value: 0.8465219620577679
+ name: Cosine Recall@10
+ - type: cosine_ndcg@10
+ value: 0.5858686159391115
+ name: Cosine Ndcg@10
+ - type: cosine_mrr@10
+ value: 0.503433682480002
+ name: Cosine Mrr@10
+ - type: cosine_map@100
+ value: 0.5104997716404978
+ name: Cosine Map@100
+ ---
+
+ # SentenceTransformer based on BAAI/bge-m3
+
+ This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [BAAI/bge-m3](https://huggingface.co/BAAI/bge-m3). It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
+
+ ## Model Details
+
+ ### Model Description
+ - **Model Type:** Sentence Transformer
+ - **Base model:** [BAAI/bge-m3](https://huggingface.co/BAAI/bge-m3) <!-- at revision 5617a9f61b028005a4858fdac845db406aefb181 -->
+ - **Maximum Sequence Length:** 8192 tokens
+ - **Output Dimensionality:** 1024 dimensions
+ - **Similarity Function:** Cosine Similarity
+ <!-- - **Training Dataset:** Unknown -->
+ <!-- - **Language:** Unknown -->
+ <!-- - **License:** Unknown -->
+
+ ### Model Sources
+
+ - **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
+ - **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
+ - **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
+
+ ### Full Model Architecture
+
+ ```
+ SentenceTransformer(
+ (0): Transformer({'max_seq_length': 8192, 'do_lower_case': False}) with Transformer model: XLMRobertaModel
+ (1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
+ )
+ ```
+
+ ## Usage
+
+ ### Direct Usage (Sentence Transformers)
+
+ First install the Sentence Transformers library:
+
+ ```bash
+ pip install -U sentence-transformers
+ ```
+
+ Then you can load this model and run inference.
+ ```python
+ from sentence_transformers import SentenceTransformer
+
+ # Download from the 🤗 Hub
+ model = SentenceTransformer("sentence_transformers_model_id")
+ # Run inference
+ sentences = [
+ "What factors are contributing to pressure on Apple's market share in China?",
+ "The company forecast low-to-mid single-digit\nrevenue growth, in line with muted expectations. In China, Apple posted $16 billion in revenue, slightly\nabove forecasts, though competition from Huawei and slower AI\nrollout continue to pressure market share. If losses hold, Apple is on track to shed more than $150\nbillion in market value, while a bullish outlook from Microsoft\n<MSFT.O> earlier this week has helped the Windows-maker become\nthe world's most valuable company.",
+ 'With recent\nexchange rate fluctuations adding to the uncertainty, we are\ntaking a more cautious outlook for the near future." While Washington and Beijing on Monday agreed to slash\ntariffs for at least 90 days, the cheer over the temporary truce\nwas tempered by caution given a more permanent trade deal needs\nto be struck, while higher tariffs overall could still weigh on\nthe global economy. Most of the iPhones Foxconn makes for Apple are assembled in\nChina.',
+ ]
+ embeddings = model.encode(sentences)
+ print(embeddings.shape)
+ # [3, 1024]
+
+ # Get the similarity scores for the embeddings
+ similarities = model.similarity(embeddings, embeddings)
+ print(similarities.shape)
+ # [3, 3]
+ ```
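Since `model.similarity()` returns the full pairwise score matrix, semantic search over a small corpus reduces to sorting one row of it. A short sketch continuing from the `model` loaded above; the query and corpus strings are made-up examples:

```python
query = "What is pressuring Apple's market share in China?"
corpus = [
    "Competition from Huawei and a slower AI rollout weigh on Apple in China.",
    "Oil prices slid on fears of a global recession.",
]
query_embedding = model.encode([query])
corpus_embeddings = model.encode(corpus)
scores = model.similarity(query_embedding, corpus_embeddings)  # shape [1, len(corpus)]
for idx in scores[0].argsort(descending=True):                 # best match first
    print(f"{scores[0][idx].item():.3f}  {corpus[idx]}")
```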
+
+ <!--
+ ### Direct Usage (Transformers)
+
+ <details><summary>Click to see the direct usage in Transformers</summary>
+
+ </details>
+ -->
+
+ <!--
+ ### Downstream Usage (Sentence Transformers)
+
+ You can finetune this model on your own dataset.
+
+ <details><summary>Click to expand</summary>
+
+ </details>
+ -->
+
+ <!--
+ ### Out-of-Scope Use
+
+ *List how the model may foreseeably be misused and address what users ought not to do with the model.*
+ -->
+
+ ## Evaluation
+
+ ### Metrics
+
+ #### Information Retrieval
+
+ * Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)
+
+ | Metric | Value |
+ |:--------------------|:-----------|
+ | cosine_accuracy@1 | 0.3454 |
+ | cosine_accuracy@3 | 0.6057 |
+ | cosine_accuracy@5 | 0.7223 |
+ | cosine_accuracy@10 | 0.8465 |
+ | cosine_precision@1 | 0.3454 |
+ | cosine_precision@3 | 0.2019 |
+ | cosine_precision@5 | 0.1445 |
+ | cosine_precision@10 | 0.0847 |
+ | cosine_recall@1 | 0.3454 |
+ | cosine_recall@3 | 0.6057 |
+ | cosine_recall@5 | 0.7223 |
+ | cosine_recall@10 | 0.8465 |
+ | **cosine_ndcg@10** | **0.5859** |
+ | cosine_mrr@10 | 0.5034 |
+ | cosine_map@100 | 0.5105 |
+
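The numbers above come from sentence-transformers' `InformationRetrievalEvaluator`. A minimal sketch of running the same kind of evaluation on your own data; the toy queries, corpus, and relevance mapping below are made up:

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import InformationRetrievalEvaluator

model = SentenceTransformer("sentence_transformers_model_id")  # placeholder id, as in Usage

queries = {"q1": "Which company declared a quarterly coupon on the Alerian MLP Index ETN?"}
corpus = {
    "d1": "JPMORGAN CHASE FINANCIAL COMPANY LLC DECLARES QUARTERLY COUPON ON ALERIAN MLP INDEX ETN",
    "d2": "Oil prices slid on fears of a global recession.",
}
relevant_docs = {"q1": {"d1"}}  # query id -> set of relevant corpus ids

evaluator = InformationRetrievalEvaluator(queries, corpus, relevant_docs, name="toy-ir")
results = evaluator(model)  # dict of accuracy@k, precision@k, recall@k, NDCG, MRR, MAP
print(results)
```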
+ <!--
+ ## Bias, Risks and Limitations
+
+ *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
+ -->
+
+ <!--
+ ### Recommendations
+
+ *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
+ -->
+
+ ## Training Details
+
+ ### Training Dataset
+
+ #### Unnamed Dataset
+
+ * Size: 23,404 training samples
+ * Columns: <code>sentence_0</code> and <code>sentence_1</code>
+ * Approximate statistics based on the first 1000 samples:
+ | | sentence_0 | sentence_1 |
+ |:--------|:-----------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------|
+ | type | string | string |
+ | details | <ul><li>min: 10 tokens</li><li>mean: 24.88 tokens</li><li>max: 53 tokens</li></ul> | <ul><li>min: 10 tokens</li><li>mean: 257.76 tokens</li><li>max: 8022 tokens</li></ul> |
+ * Samples:
+ | sentence_0 | sentence_1 |
+ |:--------------------------------------------------------------------------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
+ | <code>By approximately what percentage did Meta's shares increase in after-hours trading following the announcement of its results?</code> | <code>Its shares<br>jumped around 7% in after-hours trade on the news of the results. Meanwhile,<br>Meta beat estimates with $42 billion in revenue last quarter. It's also said<br>its daily active users across Facebook, Instagram, and the rest of its<br>services rose 6% year-on-year, marking welcome news for advertisers.</code> |
+ | <code>How have drugmakers responded to proposed tariffs on imported pharmaceutical products during the Commerce Department's investigation?</code> | <code>The move triggered a 21-day public comment period as part of<br> the investigation led by the Commerce Department. Drugmakers see the probe as a chance to show the<br>administration that high tariffs would hinder their efforts to<br>swiftly ramp up U.S. production, and to propose alternatives,<br>said Ted Murphy, a trade lawyer at law firm Sidley Austin, which<br>is advising companies on their submissions to the Commerce<br>Department. Drugmakers have also lobbied Trump to phase in tariffs on<br>imported pharmaceutical products in hopes of reducing the sting<br>from the charges.</code> |
+ | <code>Which South American companies currently use the company's regional services, and what growth expectations does Estevez have for the area?</code> | <code>The company already has 36 regions and 114 availability<br>zones worldwide used by companies such as Netflix, General<br>Electric and Sony for storage, networking and remote security. Chilean retailer Cencosud, online retail giant MercadoLibre,<br>and mining companies already use the company's other regional<br>services, it said. Amazon's first-quarter cloud revenue and income forecast<br>came in below estimates last Thursday, but Estevez said he's<br>expecting strong growth in Chile and across the region.</code> |
+ * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
+ ```json
+ {
+ "scale": 20.0,
+ "similarity_fct": "cos_sim"
+ }
+ ```
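A compact sketch of the training setup these parameters describe: `MultipleNegativesRankingLoss` treats each `(sentence_0, sentence_1)` pair as a positive and uses the other `sentence_1` values in the batch as negatives, so the batch size (3, per the hyperparameters below) bounds the number of in-batch negatives. The dataset contents here are stand-ins:

```python
from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import MultipleNegativesRankingLoss

model = SentenceTransformer("BAAI/bge-m3")

# Two-column (anchor, positive) dataset mirroring sentence_0 / sentence_1 above
train_dataset = Dataset.from_dict({
    "sentence_0": ["What concern did analysts have about Microsoft's canceled leases?"],
    "sentence_1": ["Some analysts had pointed to canceled data center leases as a sign of excess capacity."],
})

loss = MultipleNegativesRankingLoss(model, scale=20.0)  # cosine similarity is the default

args = SentenceTransformerTrainingArguments(
    output_dir="out", per_device_train_batch_size=3, num_train_epochs=2
)
trainer = SentenceTransformerTrainer(model=model, args=args, train_dataset=train_dataset, loss=loss)
trainer.train()
```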
+
+ ### Training Hyperparameters
+ #### Non-Default Hyperparameters
+
+ - `eval_strategy`: steps
+ - `per_device_train_batch_size`: 3
+ - `per_device_eval_batch_size`: 3
+ - `num_train_epochs`: 2
+ - `multi_dataset_batch_sampler`: round_robin
+
+ #### All Hyperparameters
+ <details><summary>Click to expand</summary>
+
+ - `overwrite_output_dir`: False
+ - `do_predict`: False
+ - `eval_strategy`: steps
+ - `prediction_loss_only`: True
+ - `per_device_train_batch_size`: 3
+ - `per_device_eval_batch_size`: 3
+ - `per_gpu_train_batch_size`: None
+ - `per_gpu_eval_batch_size`: None
+ - `gradient_accumulation_steps`: 1
+ - `eval_accumulation_steps`: None
+ - `torch_empty_cache_steps`: None
+ - `learning_rate`: 5e-05
+ - `weight_decay`: 0.0
+ - `adam_beta1`: 0.9
+ - `adam_beta2`: 0.999
+ - `adam_epsilon`: 1e-08
+ - `max_grad_norm`: 1
+ - `num_train_epochs`: 2
+ - `max_steps`: -1
+ - `lr_scheduler_type`: linear
+ - `lr_scheduler_kwargs`: {}
+ - `warmup_ratio`: 0.0
+ - `warmup_steps`: 0
+ - `log_level`: passive
+ - `log_level_replica`: warning
+ - `log_on_each_node`: True
+ - `logging_nan_inf_filter`: True
+ - `save_safetensors`: True
+ - `save_on_each_node`: False
+ - `save_only_model`: False
+ - `restore_callback_states_from_checkpoint`: False
+ - `no_cuda`: False
+ - `use_cpu`: False
+ - `use_mps_device`: False
+ - `seed`: 42
+ - `data_seed`: None
+ - `jit_mode_eval`: False
+ - `use_ipex`: False
+ - `bf16`: False
+ - `fp16`: False
+ - `fp16_opt_level`: O1
+ - `half_precision_backend`: auto
+ - `bf16_full_eval`: False
+ - `fp16_full_eval`: False
+ - `tf32`: None
+ - `local_rank`: 0
+ - `ddp_backend`: None
+ - `tpu_num_cores`: None
+ - `tpu_metrics_debug`: False
+ - `debug`: []
+ - `dataloader_drop_last`: False
+ - `dataloader_num_workers`: 0
+ - `dataloader_prefetch_factor`: None
+ - `past_index`: -1
+ - `disable_tqdm`: False
+ - `remove_unused_columns`: True
+ - `label_names`: None
+ - `load_best_model_at_end`: False
+ - `ignore_data_skip`: False
+ - `fsdp`: []
+ - `fsdp_min_num_params`: 0
+ - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
+ - `fsdp_transformer_layer_cls_to_wrap`: None
+ - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
+ - `deepspeed`: None
+ - `label_smoothing_factor`: 0.0
+ - `optim`: adamw_torch
+ - `optim_args`: None
+ - `adafactor`: False
+ - `group_by_length`: False
+ - `length_column_name`: length
+ - `ddp_find_unused_parameters`: None
+ - `ddp_bucket_cap_mb`: None
+ - `ddp_broadcast_buffers`: False
+ - `dataloader_pin_memory`: True
+ - `dataloader_persistent_workers`: False
+ - `skip_memory_metrics`: True
+ - `use_legacy_prediction_loop`: False
+ - `push_to_hub`: False
+ - `resume_from_checkpoint`: None
+ - `hub_model_id`: None
+ - `hub_strategy`: every_save
+ - `hub_private_repo`: None
+ - `hub_always_push`: False
+ - `hub_revision`: None
+ - `gradient_checkpointing`: False
+ - `gradient_checkpointing_kwargs`: None
+ - `include_inputs_for_metrics`: False
+ - `include_for_metrics`: []
+ - `eval_do_concat_batches`: True
+ - `fp16_backend`: auto
+ - `push_to_hub_model_id`: None
+ - `push_to_hub_organization`: None
+ - `mp_parameters`:
+ - `auto_find_batch_size`: False
+ - `full_determinism`: False
+ - `torchdynamo`: None
+ - `ray_scope`: last
+ - `ddp_timeout`: 1800
+ - `torch_compile`: False
+ - `torch_compile_backend`: None
+ - `torch_compile_mode`: None
+ - `include_tokens_per_second`: False
+ - `include_num_input_tokens_seen`: False
+ - `neftune_noise_alpha`: None
+ - `optim_target_modules`: None
+ - `batch_eval_metrics`: False
+ - `eval_on_start`: False
+ - `use_liger_kernel`: False
+ - `liger_kernel_config`: None
+ - `eval_use_gather_object`: False
+ - `average_tokens_across_devices`: False
+ - `prompts`: None
+ - `batch_sampler`: batch_sampler
+ - `multi_dataset_batch_sampler`: round_robin
+
+ </details>
+
+ ### Training Logs
+ <details><summary>Click to expand</summary>
+
+ | Epoch | Step | Training Loss | cosine_ndcg@10 |
+ |:------:|:----:|:-------------:|:--------------:|
+ | 0.0192 | 50 | - | 0.5170 |
+ | 0.0384 | 100 | - | 0.5279 |
+ | 0.0577 | 150 | - | 0.5324 |
+ | 0.0769 | 200 | - | 0.5336 |
+ | 0.0961 | 250 | - | 0.5456 |
+ | 0.1153 | 300 | - | 0.5535 |
+ | 0.1346 | 350 | - | 0.5507 |
+ | 0.1538 | 400 | - | 0.5532 |
+ | 0.1730 | 450 | - | 0.5591 |
+ | 0.1922 | 500 | 0.2091 | 0.5693 |
+ | 0.2115 | 550 | - | 0.5666 |
+ | 0.2307 | 600 | - | 0.5669 |
+ | 0.2499 | 650 | - | 0.5668 |
+ | 0.2691 | 700 | - | 0.5636 |
+ | 0.2884 | 750 | - | 0.5650 |
+ | 0.3076 | 800 | - | 0.5636 |
+ | 0.3268 | 850 | - | 0.5677 |
+ | 0.3460 | 900 | - | 0.5686 |
+ | 0.3652 | 950 | - | 0.5678 |
+ | 0.3845 | 1000 | 0.0546 | 0.5624 |
+ | 0.4037 | 1050 | - | 0.5659 |
+ | 0.4229 | 1100 | - | 0.5687 |
+ | 0.4421 | 1150 | - | 0.5704 |
+ | 0.4614 | 1200 | - | 0.5695 |
+ | 0.4806 | 1250 | - | 0.5702 |
+ | 0.4998 | 1300 | - | 0.5582 |
+ | 0.5190 | 1350 | - | 0.5703 |
+ | 0.5383 | 1400 | - | 0.5688 |
+ | 0.5575 | 1450 | - | 0.5722 |
+ | 0.5767 | 1500 | 0.0529 | 0.5673 |
+ | 0.5959 | 1550 | - | 0.5669 |
+ | 0.6151 | 1600 | - | 0.5597 |
+ | 0.6344 | 1650 | - | 0.5666 |
+ | 0.6536 | 1700 | - | 0.5626 |
+ | 0.6728 | 1750 | - | 0.5627 |
+ | 0.6920 | 1800 | - | 0.5641 |
+ | 0.7113 | 1850 | - | 0.5572 |
+ | 0.7305 | 1900 | - | 0.5632 |
+ | 0.7497 | 1950 | - | 0.5733 |
+ | 0.7689 | 2000 | 0.0478 | 0.5644 |
+ | 0.7882 | 2050 | - | 0.5658 |
+ | 0.8074 | 2100 | - | 0.5608 |
+ | 0.8266 | 2150 | - | 0.5687 |
+ | 0.8458 | 2200 | - | 0.5728 |
+ | 0.8651 | 2250 | - | 0.5581 |
+ | 0.8843 | 2300 | - | 0.5612 |
+ | 0.9035 | 2350 | - | 0.5616 |
+ | 0.9227 | 2400 | - | 0.5650 |
+ | 0.9419 | 2450 | - | 0.5626 |
+ | 0.9612 | 2500 | 0.0482 | 0.5665 |
+ | 0.9804 | 2550 | - | 0.5668 |
+ | 0.9996 | 2600 | - | 0.5552 |
+ | 1.0 | 2601 | - | 0.5556 |
+ | 1.0188 | 2650 | - | 0.5681 |
+ | 1.0381 | 2700 | - | 0.5620 |
+ | 1.0573 | 2750 | - | 0.5639 |
+ | 1.0765 | 2800 | - | 0.5646 |
+ | 1.0957 | 2850 | - | 0.5714 |
+ | 1.1150 | 2900 | - | 0.5748 |
+ | 1.1342 | 2950 | - | 0.5739 |
+ | 1.1534 | 3000 | 0.033 | 0.5630 |
+ | 1.1726 | 3050 | - | 0.5655 |
+ | 1.1918 | 3100 | - | 0.5711 |
+ | 1.2111 | 3150 | - | 0.5680 |
+ | 1.2303 | 3200 | - | 0.5742 |
+ | 1.2495 | 3250 | - | 0.5714 |
+ | 1.2687 | 3300 | - | 0.5657 |
+ | 1.2880 | 3350 | - | 0.5636 |
+ | 1.3072 | 3400 | - | 0.5701 |
+ | 1.3264 | 3450 | - | 0.5720 |
+ | 1.3456 | 3500 | 0.0276 | 0.5733 |
+ | 1.3649 | 3550 | - | 0.5738 |
+ | 1.3841 | 3600 | - | 0.5743 |
+ | 1.4033 | 3650 | - | 0.5702 |
+ | 1.4225 | 3700 | - | 0.5732 |
+ | 1.4418 | 3750 | - | 0.5705 |
+ | 1.4610 | 3800 | - | 0.5774 |
+ | 1.4802 | 3850 | - | 0.5735 |
+ | 1.4994 | 3900 | - | 0.5781 |
+ | 1.5186 | 3950 | - | 0.5691 |
+ | 1.5379 | 4000 | 0.0266 | 0.5729 |
+ | 1.5571 | 4050 | - | 0.5712 |
+ | 1.5763 | 4100 | - | 0.5685 |
+ | 1.5955 | 4150 | - | 0.5711 |
+ | 1.6148 | 4200 | - | 0.5712 |
+ | 1.6340 | 4250 | - | 0.5716 |
+ | 1.6532 | 4300 | - | 0.5762 |
+ | 1.6724 | 4350 | - | 0.5813 |
+ | 1.6917 | 4400 | - | 0.5822 |
+ | 1.7109 | 4450 | - | 0.5805 |
+ | 1.7301 | 4500 | 0.0337 | 0.5789 |
+ | 1.7493 | 4550 | - | 0.5745 |
+ | 1.7686 | 4600 | - | 0.5752 |
+ | 1.7878 | 4650 | - | 0.5780 |
+ | 1.8070 | 4700 | - | 0.5815 |
+ | 1.8262 | 4750 | - | 0.5833 |
+ | 1.8454 | 4800 | - | 0.5809 |
+ | 1.8647 | 4850 | - | 0.5711 |
+ | 1.8839 | 4900 | - | 0.5716 |
+ | 1.9031 | 4950 | - | 0.5816 |
+ | 1.9223 | 5000 | 0.0299 | 0.5815 |
+ | 1.9416 | 5050 | - | 0.5816 |
+ | 1.9608 | 5100 | - | 0.5847 |
+ | 1.9800 | 5150 | - | 0.5831 |
+ | 1.9992 | 5200 | - | 0.5847 |
+ | 2.0 | 5202 | - | 0.5859 |
+
+ </details>
+
+ ### Framework Versions
+ - Python: 3.10.12
+ - Sentence Transformers: 3.4.1
+ - Transformers: 4.53.0
+ - PyTorch: 2.2.0+cu121
+ - Accelerate: 1.8.1
+ - Datasets: 3.6.0
+ - Tokenizers: 0.21.2
+
+ ## Citation
+
+ ### BibTeX
+
+ #### Sentence Transformers
+ ```bibtex
+ @inproceedings{reimers-2019-sentence-bert,
+ title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
+ author = "Reimers, Nils and Gurevych, Iryna",
+ booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
+ month = "11",
+ year = "2019",
+ publisher = "Association for Computational Linguistics",
+ url = "https://arxiv.org/abs/1908.10084",
+ }
+ ```
+
+ #### MultipleNegativesRankingLoss
+ ```bibtex
+ @misc{henderson2017efficient,
+ title={Efficient Natural Language Response Suggestion for Smart Reply},
+ author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
+ year={2017},
+ eprint={1705.00652},
+ archivePrefix={arXiv},
+ primaryClass={cs.CL}
+ }
+ ```
+
+ <!--
+ ## Glossary
+
+ *Clearly define terms in order to be accessible across audiences.*
+ -->
+
+ <!--
+ ## Model Card Authors
+
+ *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
+ -->
+
+ <!--
+ ## Model Card Contact
+
+ *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
+ -->
config.json ADDED
@@ -0,0 +1,27 @@
+ {
+ "architectures": [
+ "XLMRobertaModel"
+ ],
+ "attention_probs_dropout_prob": 0.1,
+ "bos_token_id": 0,
+ "classifier_dropout": null,
+ "eos_token_id": 2,
+ "hidden_act": "gelu",
+ "hidden_dropout_prob": 0.1,
+ "hidden_size": 1024,
+ "initializer_range": 0.02,
+ "intermediate_size": 4096,
+ "layer_norm_eps": 1e-05,
+ "max_position_embeddings": 8194,
+ "model_type": "xlm-roberta",
+ "num_attention_heads": 16,
+ "num_hidden_layers": 24,
+ "output_past": true,
+ "pad_token_id": 1,
+ "position_embedding_type": "absolute",
+ "torch_dtype": "float32",
+ "transformers_version": "4.53.0",
+ "type_vocab_size": 1,
+ "use_cache": true,
+ "vocab_size": 250002
+ }
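The card's "Direct Usage (Transformers)" section is left empty, but given this config (an `XLMRobertaModel` backbone) plus the mean pooling sketched earlier, a hedged sketch of using the checkpoint with plain `transformers` could look like this; the repo id is a placeholder:

```python
import torch
from transformers import AutoModel, AutoTokenizer

repo_id = "sentence_transformers_model_id"  # placeholder, as in the Usage section
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModel.from_pretrained(repo_id)  # loads the XLMRobertaModel defined above

batch = tokenizer(["Hello world"], padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    token_embeddings = model(**batch).last_hidden_state  # (1, seq_len, 1024)
mask = batch["attention_mask"].unsqueeze(-1).float()
sentence_embeddings = (token_embeddings * mask).sum(1) / mask.sum(1)  # masked mean
sentence_embeddings = torch.nn.functional.normalize(sentence_embeddings, dim=-1)  # for cosine use
```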
config_sentence_transformers.json ADDED
@@ -0,0 +1,10 @@
+ {
+ "__version__": {
+ "sentence_transformers": "3.4.1",
+ "transformers": "4.53.0",
+ "pytorch": "2.2.0+cu121"
+ },
+ "prompts": {},
+ "default_prompt_name": null,
+ "similarity_fn_name": "cosine"
+ }
eval/Information-Retrieval_evaluation_results.csv ADDED
@@ -0,0 +1,3 @@
+ epoch,steps,cosine-Accuracy@1,cosine-Accuracy@3,cosine-Accuracy@5,cosine-Accuracy@10,cosine-Precision@1,cosine-Recall@1,cosine-Precision@3,cosine-Recall@3,cosine-Precision@5,cosine-Recall@5,cosine-Precision@10,cosine-Recall@10,cosine-MRR@10,cosine-NDCG@10,cosine-MAP@100
+ 1.0,2601,0.32490172620064944,0.5742608101179285,0.6841565544351393,0.8080669970945138,0.32490172620064944,0.32490172620064944,0.19142027003930953,0.5742608101179285,0.13683131088702785,0.6841565544351393,0.08080669970945137,0.8080669970945138,0.47589938499184925,0.555604442571936,0.4841771909816718
+ 2.0,5202,0.34541104084771834,0.6057084259101009,0.7222696974876089,0.8465219620577679,0.34541104084771834,0.34541104084771834,0.20190280863670027,0.6057084259101009,0.14445393949752178,0.7222696974876089,0.08465219620577678,0.8465219620577679,0.503433682480002,0.5858686159391115,0.5104997716404978
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:8d185e9d6e1cb7175e25ec450b93d8bb880897cf549d24bb6f05a7b305a9530f
+ size 2271064456
modules.json ADDED
@@ -0,0 +1,14 @@
+ [
+ {
+ "idx": 0,
+ "name": "0",
+ "path": "",
+ "type": "sentence_transformers.models.Transformer"
+ },
+ {
+ "idx": 1,
+ "name": "1",
+ "path": "1_Pooling",
+ "type": "sentence_transformers.models.Pooling"
+ }
+ ]
sentence_bert_config.json ADDED
@@ -0,0 +1,4 @@
+ {
+ "max_seq_length": 8192,
+ "do_lower_case": false
+ }
special_tokens_map.json ADDED
@@ -0,0 +1,51 @@
+ {
+ "bos_token": {
+ "content": "<s>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "cls_token": {
+ "content": "<s>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "eos_token": {
+ "content": "</s>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "mask_token": {
+ "content": "<mask>",
+ "lstrip": true,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "pad_token": {
+ "content": "<pad>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "sep_token": {
+ "content": "</s>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "unk_token": {
+ "content": "<unk>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ }
+ }
tokenizer.json ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:e4f7e21bec3fb0044ca0bb2d50eb5d4d8c596273c422baef84466d2c73748b9c
+ size 17083053
tokenizer_config.json ADDED
@@ -0,0 +1,56 @@
+ {
+ "added_tokens_decoder": {
+ "0": {
+ "content": "<s>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "1": {
+ "content": "<pad>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "2": {
+ "content": "</s>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "3": {
+ "content": "<unk>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "250001": {
+ "content": "<mask>",
+ "lstrip": true,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ }
+ },
+ "bos_token": "<s>",
+ "clean_up_tokenization_spaces": true,
+ "cls_token": "<s>",
+ "eos_token": "</s>",
+ "extra_special_tokens": {},
+ "mask_token": "<mask>",
+ "model_max_length": 8192,
+ "pad_token": "<pad>",
+ "sep_token": "</s>",
+ "sp_model_kwargs": {},
+ "tokenizer_class": "XLMRobertaTokenizer",
+ "unk_token": "<unk>"
+ }