dinov2-base-imagenet1k-1-layer-finetuned-galaxy10-decals-head-finetuned-100-galaxy_mnist

This model is a fine-tuned version of matthieulel/dinov2-base-imagenet1k-1-layer-finetuned-galaxy10-decals on the matthieulel/galaxy_mnist dataset. It achieves the following results on the evaluation set:

  • Loss: 0.2235
  • Accuracy: 0.909
  • Precision: 0.9108
  • Recall: 0.909
  • F1: 0.9089
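
A minimal inference sketch, not part of the original card: it loads this checkpoint with the Transformers image-classification pipeline. The repository id is taken from the card; the local image path is hypothetical.

```python
from transformers import pipeline
from PIL import Image

# Load the fine-tuned classifier from the Hub (repository id as named in this card).
classifier = pipeline(
    "image-classification",
    model="matthieulel/dinov2-base-imagenet1k-1-layer-finetuned-galaxy10-decals-head-finetuned-100-galaxy_mnist",
)

# Classify a single galaxy image (hypothetical local file).
image = Image.open("galaxy.png")
predictions = classifier(image)
print(predictions)  # list of {"label": ..., "score": ...} dicts, highest score first
```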

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 0.0001
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 128
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 30
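
A sketch of how these hyperparameters map onto transformers.TrainingArguments, assuming the standard Trainer API; the output directory name is illustrative, and the Adam betas/epsilon listed above are the optimizer defaults.

```python
from transformers import TrainingArguments

# Hedged sketch: the hyperparameters above expressed as TrainingArguments.
# output_dir is illustrative; it is not taken from the card.
training_args = TrainingArguments(
    output_dir="dinov2-galaxy_mnist-head-finetuned",
    learning_rate=1e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,   # 32 * 4 = 128 total train batch size
    num_train_epochs=30,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    seed=42,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the default optimizer.
)
```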

Training results

Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1
1.1235 | 0.99 | 62 | 0.9579 | 0.622 | 0.6410 | 0.622 | 0.6249
0.5488 | 2.0 | 125 | 0.3964 | 0.852 | 0.8534 | 0.852 | 0.8523
0.384 | 2.99 | 187 | 0.2877 | 0.882 | 0.8820 | 0.882 | 0.8819
0.3493 | 4.0 | 250 | 0.2621 | 0.8885 | 0.8885 | 0.8885 | 0.8885
0.3184 | 4.99 | 312 | 0.2570 | 0.8945 | 0.8957 | 0.8945 | 0.8946
0.3027 | 6.0 | 375 | 0.2485 | 0.9 | 0.9013 | 0.9 | 0.9000
0.3245 | 6.99 | 437 | 0.2397 | 0.9035 | 0.9038 | 0.9035 | 0.9034
0.2666 | 8.0 | 500 | 0.2413 | 0.9035 | 0.9051 | 0.9035 | 0.9034
0.2705 | 8.99 | 562 | 0.2336 | 0.905 | 0.9057 | 0.905 | 0.9050
0.2524 | 10.0 | 625 | 0.2320 | 0.9065 | 0.9072 | 0.9065 | 0.9065
0.271 | 10.99 | 687 | 0.2293 | 0.908 | 0.9085 | 0.908 | 0.9079
0.2624 | 12.0 | 750 | 0.2266 | 0.9045 | 0.9047 | 0.9045 | 0.9045
0.2681 | 12.99 | 812 | 0.2286 | 0.908 | 0.9088 | 0.908 | 0.9079
0.262 | 14.0 | 875 | 0.2239 | 0.9085 | 0.9089 | 0.9085 | 0.9085
0.3015 | 14.99 | 937 | 0.2265 | 0.9075 | 0.9085 | 0.9075 | 0.9074
0.2635 | 16.0 | 1000 | 0.2256 | 0.9065 | 0.9082 | 0.9065 | 0.9064
0.2708 | 16.99 | 1062 | 0.2239 | 0.9075 | 0.9089 | 0.9075 | 0.9075
0.2616 | 18.0 | 1125 | 0.2231 | 0.9075 | 0.9084 | 0.9075 | 0.9074
0.2658 | 18.99 | 1187 | 0.2229 | 0.9075 | 0.9092 | 0.9075 | 0.9075
0.2578 | 20.0 | 1250 | 0.2216 | 0.908 | 0.9093 | 0.908 | 0.9080
0.2823 | 20.99 | 1312 | 0.2206 | 0.9075 | 0.9084 | 0.9075 | 0.9075
0.2714 | 22.0 | 1375 | 0.2223 | 0.9085 | 0.9100 | 0.9085 | 0.9084
0.2631 | 22.99 | 1437 | 0.2235 | 0.909 | 0.9108 | 0.909 | 0.9089
0.2695 | 24.0 | 1500 | 0.2214 | 0.9085 | 0.9101 | 0.9085 | 0.9084
0.2698 | 24.99 | 1562 | 0.2208 | 0.908 | 0.9095 | 0.908 | 0.9079
0.2285 | 26.0 | 1625 | 0.2188 | 0.9075 | 0.9083 | 0.9075 | 0.9075
0.2716 | 26.99 | 1687 | 0.2202 | 0.9075 | 0.9089 | 0.9075 | 0.9075
0.2628 | 28.0 | 1750 | 0.2205 | 0.9075 | 0.9090 | 0.9075 | 0.9074
0.2619 | 28.99 | 1812 | 0.2207 | 0.907 | 0.9085 | 0.907 | 0.9069
0.2568 | 29.76 | 1860 | 0.2204 | 0.907 | 0.9085 | 0.907 | 0.9069
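
In the results above, precision, recall, and F1 track accuracy closely, which is consistent with weighted averaging over classes. A minimal compute_metrics sketch under that assumption (the averaging mode and scikit-learn usage are not confirmed by the card):

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    """Return accuracy, precision, recall, and F1 for a Trainer evaluation step."""
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    # Weighted averaging is an assumption; the card does not state the averaging mode.
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted"
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "precision": precision,
        "recall": recall,
        "f1": f1,
    }
```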

Framework versions

  • Transformers 4.37.2
  • Pytorch 2.3.0
  • Datasets 2.19.1
  • Tokenizers 0.15.1