---
library_name: transformers
license: apache-2.0
base_model: answerdotai/ModernBERT-base
tags:
  - generated_from_trainer
model-index:
  - name: nci-technique-classifier-v5.2
    results: []
---

# nci-technique-classifier-v5.2

This model is a fine-tuned version of [answerdotai/ModernBERT-base](https://huggingface.co/answerdotai/ModernBERT-base) on an unknown dataset. It achieves the following results on the evaluation set (a usage sketch follows the metrics):

- Loss: 0.0173
- Micro F1: 0.7718
- Macro F1: 0.5789
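
Loading the model follows the standard `transformers` sequence-classification pattern. A minimal sketch, assuming the Hub repo id `synapti/nci-technique-classifier-v5.2` and a multi-label head (the micro/macro F1 metrics suggest multi-label classification; swap the sigmoid for an argmax if the head is single-label):

```python
# Minimal inference sketch -- the repo id and multi-label head are assumptions.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo_id = "synapti/nci-technique-classifier-v5.2"  # assumed Hub path
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

inputs = tokenizer("Example text to classify.", return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

# Multi-label: an independent sigmoid per label, thresholded at 0.5.
probs = torch.sigmoid(logits)[0]
predicted = [model.config.id2label[i] for i, p in enumerate(probs) if p > 0.5]
print(predicted)
```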

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the configuration sketch after this list):

- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 32
- seed: 42
- optimizer: AdamW (torch fused) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3
- mixed_precision_training: Native AMP
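
These settings map onto a `transformers` `TrainingArguments` configuration roughly as follows. This is a reconstruction sketch, not the actual training script; in particular, `output_dir` and the choice of `fp16` (versus `bf16`) for Native AMP are assumptions not stated on the card.

```python
from transformers import TrainingArguments

# Reconstruction sketch of the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="nci-technique-classifier-v5.2",  # assumed
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=32,
    seed=42,
    optim="adamw_torch_fused",   # AdamW with betas=(0.9, 0.999), eps=1e-08
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=3,
    fp16=True,                   # "Native AMP"; could equally have been bf16
)
```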

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Micro F1 | Macro F1 |
|:-------------:|:------:|:----:|:---------------:|:--------:|:--------:|
| 0.0275        | 0.1570 | 200  | 0.0272          | 0.6634   | 0.2831   |
| 0.0256        | 0.3140 | 400  | 0.0238          | 0.6844   | 0.3147   |
| 0.0211        | 0.4710 | 600  | 0.0226          | 0.7276   | 0.2792   |
| 0.0224        | 0.6279 | 800  | 0.0206          | 0.7140   | 0.4159   |
| 0.0198        | 0.7849 | 1000 | 0.0203          | 0.7180   | 0.4403   |
| 0.0175        | 0.9419 | 1200 | 0.0192          | 0.7481   | 0.4333   |
| 0.018         | 1.0989 | 1400 | 0.0190          | 0.7320   | 0.4845   |
| 0.017         | 1.2559 | 1600 | 0.0191          | 0.7199   | 0.4723   |
| 0.0165        | 1.4129 | 1800 | 0.0188          | 0.7597   | 0.4633   |
| 0.0165        | 1.5699 | 2000 | 0.0182          | 0.7434   | 0.5247   |
| 0.0167        | 1.7268 | 2200 | 0.0183          | 0.7345   | 0.5005   |
| 0.0167        | 1.8838 | 2400 | 0.0182          | 0.7629   | 0.5162   |
| 0.0143        | 2.0408 | 2600 | 0.0180          | 0.7493   | 0.5557   |
| 0.016         | 2.1978 | 2800 | 0.0183          | 0.7588   | 0.5513   |
| 0.0157        | 2.3548 | 3000 | 0.0185          | 0.7663   | 0.5457   |
| 0.0157        | 2.5118 | 3200 | 0.0183          | 0.7665   | 0.5756   |
| 0.0146        | 2.6688 | 3400 | 0.0179          | 0.7641   | 0.5885   |
| 0.0123        | 2.8257 | 3600 | 0.0182          | 0.7719   | 0.5734   |
| 0.0136        | 2.9827 | 3800 | 0.0179          | 0.7682   | 0.5952   |
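
The card does not state how micro and macro F1 were computed. One plausible way, assuming binary multi-label indicator arrays, is scikit-learn's `f1_score` with the two averaging modes:

```python
# Sketch of micro/macro F1 over multi-label predictions (assumed metric
# implementation; not confirmed by the card).
import numpy as np
from sklearn.metrics import f1_score

# Toy (num_examples, num_labels) indicator arrays; the real label set is
# not documented here.
y_true = np.array([[1, 0, 1], [0, 1, 0], [1, 1, 0]])
y_pred = np.array([[1, 0, 0], [0, 1, 1], [1, 1, 0]])

print("Micro F1:", f1_score(y_true, y_pred, average="micro"))
print("Macro F1:", f1_score(y_true, y_pred, average="macro"))
```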

### Framework versions

- Transformers 4.57.3
- Pytorch 2.9.1+cu128
- Datasets 4.4.1
- Tokenizers 0.22.1