DinoAmoros is a fine-tuned version of DinoAmoros-small-2025_05_06_35187-prova_bs16_freeze_monolabel. It achieves the following results on the test set (a short sketch for reproducing these metrics follows the list):

  • Loss: 3.1431
  • F1 Micro: 0.4000
  • F1 Macro: 0.1538
  • Accuracy: 0.4000
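
In a single-label setting, micro-averaged F1 is by definition equal to accuracy, which is why those two values coincide above. The metrics can be reproduced with scikit-learn; the sketch below uses illustrative labels, not the model's actual outputs.

```python
from sklearn.metrics import accuracy_score, f1_score

# Illustrative ground truth and predictions only, not the model's real outputs.
y_true = [3, 1, 0, 2, 3, 1, 0, 2, 1, 3]
y_pred = [3, 1, 1, 2, 0, 1, 0, 0, 2, 3]

print("Accuracy:", accuracy_score(y_true, y_pred))             # 0.6
print("F1 Micro:", f1_score(y_true, y_pred, average="micro"))  # 0.6 (equals accuracy)
print("F1 Macro:", f1_score(y_true, y_pred, average="macro"))
```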

Model description

DinoAmoros is built on top of the DinoAmoros-small-2025_05_06_35187-prova_bs16_freeze_monolabel model for underwater multilabel image classification. The classification head is a combination of linear, ReLU, batch normalization, and dropout layers.
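
The card does not specify the head's exact layer sizes or ordering. The following is a minimal PyTorch sketch under assumptions: a 384-dimensional embedding (the DINOv2-small width), 24 output classes (the number of classes in the table below), and an arbitrary 256-unit hidden layer.

```python
import torch.nn as nn

def make_classification_head(embed_dim: int = 384, num_classes: int = 24) -> nn.Sequential:
    """Hypothetical head combining the layer types named in the card."""
    return nn.Sequential(
        nn.Linear(embed_dim, 256),    # linear projection (hidden size is an assumption)
        nn.BatchNorm1d(256),          # batch normalization
        nn.ReLU(),                    # non-linearity
        nn.Dropout(p=0.5),            # dropout (rate is an assumption)
        nn.Linear(256, num_classes),  # class logits
    )
```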

The source code for training the model can be found in this Git repository.


Intended uses & limitations

You can use the raw model to classify diverse marine species, including coral morphotype classes taken from the Global Coral Reef Monitoring Network (GCRMN), habitat classes, and seagrass species.
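
A hedged usage sketch with the transformers library is shown below; the repository id is a placeholder, and whether this checkpoint loads through the auto classes (rather than the custom code in the training repository) is an assumption.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

ckpt = "your-org/DinoAmoros"  # hypothetical repository id; substitute the real one
processor = AutoImageProcessor.from_pretrained(ckpt)
model = AutoModelForImageClassification.from_pretrained(ckpt).eval()

image = Image.open("reef_photo.jpg")  # any underwater survey image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(model.config.id2label[logits.argmax(-1).item()])
```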


Training and evaluation data

Details on the number of images for each class are given in the following table:

Class    Train  Test  Val  Total
ALGAE        2     0    2      4
Acr          0     0    0      0
Acr_Br       0     0    0      0
Anem         0     0    0      0
CCA          0     0    0      0
Ech          0     0    0      0
Fts          0     0    0      0
Gal          0     0    0      0
Gon          0     0    0      0
H_oval       0     0    0      0
H_uni        0     0    0      0
Mtp          2     0    0      2
P            0     0    0      0
Poc          0     0    0      0
Por          0     0    0      0
R            2     4    1      7
RDC          0     0    0      0
S            3     4    3     10
SG           1     2    4      7
Sarg         0     0    0      0
Ser          0     0    0      0
Slt          0     0    0      0
Sp           0     0    0      0
Turf         0     0    0      0

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a minimal PyTorch sketch of this setup follows the list):

  • Number of Epochs: 49
  • Learning Rate: 0.001
  • Train Batch Size: 16
  • Eval Batch Size: 16
  • Optimizer: Adam
  • LR Scheduler Type: ReduceLROnPlateau with a patience of 5 epochs and a factor of 0.1
  • Freeze Encoder: Yes
  • Data Augmentation: Yes
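
The optimizer, scheduler, and encoder freezing can be reproduced as sketched below; the module layout (`encoder`/`head`) and shapes are stand-ins, not the project's actual classes.

```python
import torch
import torch.nn as nn

# Stand-in modules: in the real run, `encoder` is the frozen DINO backbone
# and `head` is the classifier described above.
model = nn.ModuleDict({
    "encoder": nn.Linear(384, 384),
    "head": nn.Linear(384, 24),
})

for p in model["encoder"].parameters():
    p.requires_grad = False  # Freeze Encoder: Yes

optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-3
)
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
    optimizer, mode="min", factor=0.1, patience=5
)

for epoch in range(49):
    val_loss = 0.0  # placeholder; pass the epoch's real validation loss
    scheduler.step(val_loss)  # cuts the LR by 10x after 5 stagnant epochs
```

The results table below is consistent with this schedule: the learning rate drops from 0.001 to 0.0001 at epoch 46, after the validation loss had stopped improving for five epochs.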

Data Augmentation

Data were augmented using the following transformations (a Kornia-based sketch of the train pipeline follows the lists):

Train Transforms

  • PreProcess: No additional parameters
  • Resize: probability=1.00
  • RandomHorizontalFlip: probability=0.25
  • RandomVerticalFlip: probability=0.25
  • ColorJiggle: probability=0.25
  • RandomPerspective: probability=0.25
  • Normalize: probability=1.00

Val Transforms

  • PreProcess: No additional parameters
  • Resize: probability=1.00
  • Normalize: probability=1.00
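
The transform names above match Kornia's augmentation API (ColorJiggle in particular is a Kornia class), so the train pipeline plausibly looks like the sketch below; the resize target, jitter and perspective magnitudes, and the ImageNet normalization statistics are assumptions.

```python
import torch
import kornia.augmentation as K

# Assumed Kornia train pipeline; only the probabilities come from the card.
train_transforms = K.AugmentationSequential(
    K.Resize((224, 224)),  # target size is an assumption
    K.RandomHorizontalFlip(p=0.25),
    K.RandomVerticalFlip(p=0.25),
    K.ColorJiggle(brightness=0.2, contrast=0.2, saturation=0.2, hue=0.1, p=0.25),
    K.RandomPerspective(distortion_scale=0.5, p=0.25),
    K.Normalize(mean=torch.tensor([0.485, 0.456, 0.406]),  # ImageNet stats, assumed
                std=torch.tensor([0.229, 0.224, 0.225])),
)

batch = torch.rand(16, 3, 256, 256)  # dummy (B, C, H, W) batch
augmented = train_transforms(batch)
```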

Training results

Epoch Validation Loss Accuracy F1 Micro F1 Macro Learning Rate
1 3.1656250953674316 0.0000 0.0000 0.0000 0.001
2 3.168750286102295 0.0000 0.0000 0.0000 0.001
3 3.171875 0.0000 0.0000 0.0000 0.001
4 3.181640625 0.0000 0.0000 0.0000 0.001
5 3.126953125 0.0000 0.0000 0.0000 0.001
6 3.076953411102295 0.2000 0.2000 0.0889 0.001
7 3.030468463897705 0.3000 0.3000 0.1769 0.001
8 2.979687213897705 0.3000 0.3000 0.1769 0.001
9 2.943066120147705 0.3000 0.3000 0.1769 0.001
10 2.9164061546325684 0.3000 0.3000 0.1548 0.001
11 2.9022459983825684 0.3000 0.3000 0.1769 0.001
12 2.8817381858825684 0.3000 0.3000 0.1667 0.001
13 2.866894245147705 0.3000 0.3000 0.1458 0.001
14 2.857128620147705 0.3000 0.3000 0.1429 0.001
15 2.8280272483825684 0.3000 0.3000 0.1429 0.001
16 2.8199219703674316 0.3000 0.3000 0.1429 0.001
17 2.81640625 0.3000 0.3000 0.1429 0.001
18 2.810497760772705 0.3000 0.3000 0.1429 0.001
19 2.8023924827575684 0.3000 0.3000 0.1429 0.001
20 2.7597899436950684 0.4000 0.4000 0.2333 0.001
21 2.699267864227295 0.4000 0.4000 0.2333 0.001
22 2.672900676727295 0.4000 0.4000 0.2333 0.001
23 2.623706340789795 0.4000 0.4000 0.2333 0.001
24 2.5790038108825684 0.5000 0.5000 0.3333 0.001
25 2.543994426727295 0.5000 0.5000 0.3333 0.001
26 2.5174317359924316 0.5000 0.5000 0.3333 0.001
27 2.4705567359924316 0.5000 0.5000 0.3333 0.001
28 2.47216796875 0.4000 0.4000 0.2800 0.001
29 2.458691120147705 0.4000 0.4000 0.2800 0.001
30 2.414844036102295 0.4000 0.4000 0.2800 0.001
31 2.39306640625 0.4000 0.4000 0.2800 0.001
32 2.3921875953674316 0.4000 0.4000 0.2800 0.001
33 2.3856444358825684 0.4000 0.4000 0.2800 0.001
34 2.4126830101013184 0.4000 0.4000 0.2800 0.001
35 2.4005675315856934 0.4000 0.4000 0.2800 0.001
36 2.3883910179138184 0.4000 0.4000 0.2800 0.001
37 2.414501667022705 0.4000 0.4000 0.2800 0.001
38 2.39424467086792 0.4000 0.4000 0.2800 0.001
39 2.380664348602295 0.4000 0.4000 0.3133 0.001
40 2.38067626953125 0.4000 0.4000 0.3133 0.001
41 2.388671875 0.4000 0.4000 0.2800 0.001
42 2.3944458961486816 0.4000 0.4000 0.3133 0.001
43 2.4193968772888184 0.4000 0.4000 0.3133 0.001
44 2.461944580078125 0.4000 0.4000 0.3133 0.001
45 2.4820556640625 0.4000 0.4000 0.3133 0.001
46 2.4767489433288574 0.4000 0.4000 0.3133 0.0001
47 2.4376220703125 0.4000 0.4000 0.2800 0.0001
48 2.4618897438049316 0.4000 0.4000 0.2800 0.0001
49 2.44083833694458 0.4000 0.4000 0.2800 0.0001

Framework Versions

  • Transformers: 4.48.0
  • PyTorch: 2.6.0+cu118
  • Datasets: 3.0.2
  • Tokenizers: 0.21.1