SegForCoral-2025_05_12_76513-bs16_refine is a fine-tuned version of nvidia/mit-b0.


Model description

SegForCoral-2025_05_12_76513-bs16_refine is built on top of the nvidia/mit-b0 model for underwater multilabel image classification. The classification head is a stack of linear, ReLU, batch normalization, and dropout layers.
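The card does not publish the exact layer sizes of this head. A minimal PyTorch sketch of such a head, where the hidden size (256, matching MiT-B0's final hidden dimension), dropout rate (0.3), and label count (38) are illustrative assumptions rather than values from this model, might look like:

```python
import torch
import torch.nn as nn

class MultilabelHead(nn.Module):
    """Hypothetical head: linear -> ReLU -> batch norm -> dropout -> linear."""

    def __init__(self, in_features=256, hidden=256, num_labels=38, p=0.3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_features, hidden),
            nn.ReLU(),
            nn.BatchNorm1d(hidden),
            nn.Dropout(p),
            nn.Linear(hidden, num_labels),  # raw logits; sigmoid is applied downstream
        )

    def forward(self, x):
        return self.net(x)

head = MultilabelHead()
head.eval()  # disable dropout, use running batch-norm statistics
logits = head(torch.randn(4, 256))
print(logits.shape)  # torch.Size([4, 38])
```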

The source code for training the model can be found in this Git repository.


Intended uses & limitations

You can use the raw model to classify diverse marine species, including coral morphotype classes taken from the Global Coral Reef Monitoring Network (GCRMN), habitat classes, and seagrass species.
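Because this is a multilabel task, the model's raw logits should be passed through a sigmoid (not a softmax) and thresholded per class, so several labels can be active for one image. A minimal post-processing sketch, where the logits tensor, the five class slots, and the 0.5 threshold are all illustrative:

```python
import torch

# Illustrative logits for one image over 5 hypothetical classes
# (e.g. coral morphotype / habitat / seagrass labels).
logits = torch.tensor([[2.0, -1.0, 0.5, -3.0, 1.2]])

# Sigmoid per class, then an independent threshold per class.
probs = torch.sigmoid(logits)
predictions = (probs > 0.5).int()
print(predictions.tolist())  # [[1, 0, 1, 0, 1]]
```

In a full pipeline, the logits would come from the checkpoint loaded via the Transformers library; the thresholding step is the part that differs from single-label classification.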


Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • Number of Epochs: 51
  • Learning Rate: 1e-05
  • Train Batch Size: 16
  • Eval Batch Size: 16
  • Optimizer: Adam
  • LR Scheduler Type: ReduceLROnPlateau with a patience of 5 epochs and a factor of 0.1
  • Freeze Encoder: Yes
  • Data Augmentation: Yes

Training results

Validation loss was not recorded (N/A) for any epoch, and no accuracy, F1 macro, or F1 micro scores were logged. The learning rate followed the ReduceLROnPlateau schedule:

Epochs   Learning Rate
1–35     1e-05
36–47    1e-06
48–51    1e-07

Framework Versions

  • Transformers: 4.49.0
  • Pytorch: 2.3.1+cu121
  • Datasets: 3.5.0
  • Tokenizers: 0.21.1
Model size: 3.72M parameters, stored in Safetensors format (F32 tensors).