camembert-pcg-real-transactions

This model is a fine-tuned version of camembert-base on an unspecified dataset (the training data is not documented in this card). It achieves the following results on the evaluation set; a minimal inference sketch follows the metrics below.

  • Loss: 0.7377
  • Accuracy: 0.8906
  • Precision: 0.9037
  • Recall: 0.8906
  • F1: 0.8906
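
The metrics above (accuracy, precision, recall, F1) indicate a sequence-classification head. A minimal loading and inference sketch under that assumption is shown below; the example transaction string and the repository id `Tiime/camembert-pcg-real-transactions` are taken from this page, everything else follows the standard `transformers` API.

```python
# Minimal inference sketch (assumes a sequence-classification checkpoint,
# which the accuracy/precision/recall/F1 metrics above suggest).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "Tiime/camembert-pcg-real-transactions"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# Hypothetical bank-transaction label; the real label set lives in the model config.
text = "PAIEMENT CB CARREFOUR PARIS"
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
predicted_id = logits.argmax(dim=-1).item()
print(model.config.id2label.get(predicted_id, predicted_id))
```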

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a hedged Trainer sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: adamw_torch_fused with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 120
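
A hedged reconstruction of the corresponding `Trainer` setup is sketched below. The label count, datasets, and metric function are placeholders, since none of them are documented in this card; only the hyperparameters listed above are taken from it.

```python
# Hedged reconstruction of the training setup implied by the hyperparameters above.
# Label count, datasets, and metric function are placeholders (not documented in this card).
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

base_model = "camembert-base"
num_labels = 8  # placeholder; the real label set is not documented here

tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForSequenceClassification.from_pretrained(base_model, num_labels=num_labels)

training_args = TrainingArguments(
    output_dir="camembert-pcg-real-transactions",
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    optim="adamw_torch_fused",   # AdamW (fused); betas=(0.9, 0.999) and eps=1e-8 are the defaults
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=120,
    eval_strategy="steps",       # the log below reports an evaluation every 200 steps
    eval_steps=200,
    logging_steps=200,
)

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=train_dataset,      # placeholder: tokenized training split (not documented)
    eval_dataset=eval_dataset,        # placeholder: tokenized evaluation split (not documented)
    processing_class=tokenizer,
    compute_metrics=compute_metrics,  # e.g. the sketch under "Training results" below
)
trainer.train()
```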

Training results

| Training Loss | Epoch    | Step | Validation Loss | Accuracy | Precision | Recall | F1     |
|---------------|----------|------|-----------------|----------|-----------|--------|--------|
| 3.5853        | 2.5641   | 200  | 3.5139          | 0.3019   | 0.2288    | 0.3019 | 0.2345 |
| 3.2713        | 5.1282   | 400  | 3.0864          | 0.5377   | 0.4640    | 0.5377 | 0.4615 |
| 2.7401        | 7.6923   | 600  | 2.5215          | 0.6302   | 0.5401    | 0.6302 | 0.5601 |
| 2.1343        | 10.2564  | 800  | 1.9296          | 0.7170   | 0.6254    | 0.7170 | 0.6571 |
| 1.5063        | 12.8205  | 1000 | 1.3920          | 0.7566   | 0.6875    | 0.7566 | 0.7114 |
| 1.0361        | 15.3846  | 1200 | 1.0702          | 0.8132   | 0.7713    | 0.8132 | 0.7810 |
| 0.769         | 17.9487  | 1400 | 0.9110          | 0.8189   | 0.7982    | 0.8189 | 0.7962 |
| 0.6071        | 20.5128  | 1600 | 0.8106          | 0.8302   | 0.8136    | 0.8302 | 0.8125 |
| 0.4613        | 23.0769  | 1800 | 0.7313          | 0.8491   | 0.8441    | 0.8491 | 0.8370 |
| 0.358         | 25.6410  | 2000 | 0.6852          | 0.8547   | 0.8498    | 0.8547 | 0.8452 |
| 0.284         | 28.2051  | 2200 | 0.6448          | 0.8679   | 0.8611    | 0.8679 | 0.8600 |
| 0.22          | 30.7692  | 2400 | 0.6581          | 0.8679   | 0.8781    | 0.8679 | 0.8660 |
| 0.1745        | 33.3333  | 2600 | 0.6068          | 0.8717   | 0.8729    | 0.8717 | 0.8666 |
| 0.1375        | 35.8974  | 2800 | 0.6068          | 0.8830   | 0.8946    | 0.8830 | 0.8827 |
| 0.1096        | 38.4615  | 3000 | 0.5848          | 0.8962   | 0.9096    | 0.8962 | 0.8961 |
| 0.0889        | 41.0256  | 3200 | 0.5778          | 0.8925   | 0.9057    | 0.8925 | 0.8922 |
| 0.0722        | 43.5897  | 3400 | 0.6095          | 0.8830   | 0.8983    | 0.8830 | 0.8835 |
| 0.0636        | 46.1538  | 3600 | 0.6074          | 0.8849   | 0.8986    | 0.8849 | 0.8849 |
| 0.0548        | 48.7179  | 3800 | 0.5972          | 0.8962   | 0.9094    | 0.8962 | 0.8962 |
| 0.0428        | 51.2821  | 4000 | 0.6226          | 0.8906   | 0.9069    | 0.8906 | 0.8928 |
| 0.0416        | 53.8462  | 4200 | 0.6335          | 0.8849   | 0.9036    | 0.8849 | 0.8861 |
| 0.0375        | 56.4103  | 4400 | 0.6533          | 0.8811   | 0.8930    | 0.8811 | 0.8795 |
| 0.037         | 58.9744  | 4600 | 0.6516          | 0.8849   | 0.8941    | 0.8849 | 0.8843 |
| 0.028         | 61.5385  | 4800 | 0.6452          | 0.8887   | 0.8993    | 0.8887 | 0.8893 |
| 0.0341        | 64.1026  | 5000 | 0.6523          | 0.8925   | 0.9056    | 0.8925 | 0.8938 |
| 0.0253        | 66.6667  | 5200 | 0.6546          | 0.8887   | 0.9020    | 0.8887 | 0.8902 |
| 0.024         | 69.2308  | 5400 | 0.7024          | 0.8830   | 0.8950    | 0.8830 | 0.8830 |
| 0.0244        | 71.7949  | 5600 | 0.6767          | 0.8925   | 0.9043    | 0.8925 | 0.8932 |
| 0.0201        | 74.3590  | 5800 | 0.6885          | 0.8887   | 0.9009    | 0.8887 | 0.8897 |
| 0.0258        | 76.9231  | 6000 | 0.6870          | 0.8925   | 0.9059    | 0.8925 | 0.8937 |
| 0.0227        | 79.4872  | 6200 | 0.7150          | 0.8868   | 0.9003    | 0.8868 | 0.8873 |
| 0.019         | 82.0513  | 6400 | 0.7534          | 0.8830   | 0.8964    | 0.8830 | 0.8834 |
| 0.0224        | 84.6154  | 6600 | 0.7238          | 0.8887   | 0.9002    | 0.8887 | 0.8882 |
| 0.0173        | 87.1795  | 6800 | 0.7238          | 0.8906   | 0.9043    | 0.8906 | 0.8911 |
| 0.0212        | 89.7436  | 7000 | 0.7031          | 0.8925   | 0.9064    | 0.8925 | 0.8934 |
| 0.0187        | 92.3077  | 7200 | 0.7364          | 0.8943   | 0.9088    | 0.8943 | 0.8956 |
| 0.0187        | 94.8718  | 7400 | 0.7152          | 0.8943   | 0.9110    | 0.8943 | 0.8961 |
| 0.0168        | 97.4359  | 7600 | 0.7188          | 0.8849   | 0.8980    | 0.8849 | 0.8856 |
| 0.0191        | 100.0    | 7800 | 0.7268          | 0.8887   | 0.9039    | 0.8887 | 0.8903 |
| 0.0167        | 102.5641 | 8000 | 0.7339          | 0.8906   | 0.9033    | 0.8906 | 0.8911 |
| 0.013         | 105.1282 | 8200 | 0.7268          | 0.8925   | 0.9065    | 0.8925 | 0.8936 |
| 0.0168        | 107.6923 | 8400 | 0.7346          | 0.8906   | 0.9038    | 0.8906 | 0.8909 |
| 0.0132        | 110.2564 | 8600 | 0.7319          | 0.8906   | 0.9038    | 0.8906 | 0.8909 |
| 0.0139        | 112.8205 | 8800 | 0.7405          | 0.8906   | 0.9024    | 0.8906 | 0.8908 |
| 0.0136        | 115.3846 | 9000 | 0.7361          | 0.8906   | 0.9024    | 0.8906 | 0.8908 |
| 0.0144        | 117.9487 | 9200 | 0.7377          | 0.8906   | 0.9037    | 0.8906 | 0.8906 |
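
The precision, recall, and F1 columns track accuracy closely, which is consistent with weighted averaging over the classes. A minimal `compute_metrics` sketch under that assumption is shown below; scikit-learn is not mentioned in this card and is used here only for illustration.

```python
# Hedged compute_metrics sketch; weighted averaging is an assumption inferred from
# recall matching accuracy in the table above.
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "precision": precision,
        "recall": recall,
        "f1": f1,
    }
```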

Framework versions

  • Transformers 4.57.3
  • Pytorch 2.9.1+cu128
  • Datasets 4.4.1
  • Tokenizers 0.22.1