
🌐 FLAIR-HUB Model Collection

  • Trained on: FLAIR-HUB dataset 🔗
  • Available modalities: Aerial images, SPOT images, Topographic info, Sentinel-2 yearly time-series, Sentinel-1 yearly time-series, Historical aerial images
  • Encoders: ConvNeXTV2, Swin (Tiny, Small, Base, Large)
  • Decoders: UNet, UPerNet
  • Tasks: Land-cover mapping (LC), Crop-type mapping (LPIS)
  • Class nomenclature: 15 classes for LC, 23 classes for LPIS
Collection overview table (columns: 🆔 Model ID, 🗺️ Land-cover, 🌾 Crop-types, 🛩️ Aerial, ⛰️ Elevation, 🛰️ SPOT, 🛰️ S2 time-series, 🛰️ S1 time-series, 🛩️ Historical), one row per model: LC-A, LC-D, LC-F, LC-G, LC-I, LC-L, LPIS-A, LPIS-F, LPIS-I and LPIS-J.

🔍 Model: FLAIR-HUB_LPIS-J_swinbase-upernet

  • Encoder: swin_base_patch4_window12_384
  • Decoder: upernet
  • Metrics:

    | mIoU   | O.A.   | F-score | Precision | Recall |
    |--------|--------|---------|-----------|--------|
    | 32.35% | 87.97% | 43.04%  | 50.96%    | 42.60% |

  • Params.: 186.9 M
  • Code: GitHub

General Information


Training Config Hyperparameters

- Model architecture: swin_base_patch4_window12_384-upernet
- Optimizer: AdamW (betas=[0.9, 0.999], weight_decay=0.01)
- Learning rate: 5e-5
- Scheduler: one_cycle_lr (warmup_fraction=0.2)
- Epochs: 150
- Batch size: 5
- Seed: 2025
- Early stopping: patience 20, monitor val_miou (mode=max)
- Class weights:
    - default: 1.0
    - masked classes: [clear cut, ligneous, mixed, other]  weight = 0
- Input channels:
    - AERIAL_RGBI: [4, 1, 2]
    - SPOT_RGBI: [4, 1, 2]
    - SENTINEL2_TS: [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
    - SENTINEL1-ASC_TS: [1, 2]
    - SENTINEL1-DESC_TS: [1, 2]
- Input normalization (custom):
    - AERIAL_RGBI:
        mean: [106.59, 105.66, 111.35]
        std:  [39.78, 52.23, 45.62]
    - SPOT_RGBI:
        mean: [1137.03, 433.26, 508.75]
        std:  [543.11, 312.76, 284.61]
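The scheduler and normalization settings above can be illustrated with a short sketch. This is plain Python, not the official FLAIR-HUB training code; in particular, the exact `one_cycle_lr` implementation may differ from this simplified linear-warmup + cosine-decay version:

```python
import math

# Values taken from the training config above.
MAX_LR = 5e-5
WARMUP_FRACTION = 0.2
AERIAL_RGBI_MEAN = [106.59, 105.66, 111.35]
AERIAL_RGBI_STD = [39.78, 52.23, 45.62]

def one_cycle_lr(step, total_steps, max_lr=MAX_LR, warmup_fraction=WARMUP_FRACTION):
    """Simplified one-cycle schedule: linear warmup over the first
    warmup_fraction of steps, then cosine decay back toward 0."""
    warmup_steps = int(total_steps * warmup_fraction)
    if step < warmup_steps:
        return max_lr * step / max(1, warmup_steps)
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return max_lr * 0.5 * (1.0 + math.cos(math.pi * progress))

def normalize_aerial_pixel(pixel):
    """Standardize one aerial RGBI pixel with the custom per-channel
    statistics: (x - mean) / std."""
    return [(x - m) / s for x, m, s in zip(pixel, AERIAL_RGBI_MEAN, AERIAL_RGBI_STD)]
```

Under this sketch, the peak learning rate of 5e-5 is reached after the first 20% of training steps, then decays smoothly to zero by the final step.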

Training Data

- Train patches: 152225
- Validation patches: 38175
- Test patches: 50700
(Figure: class distribution.)
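As a quick sanity check on the split sizes above (plain Python, counts copied from this card):

```python
# Patch counts from the training data section above.
splits = {"train": 152225, "val": 38175, "test": 50700}

total = sum(splits.values())
fractions = {name: round(100 * count / total, 1) for name, count in splits.items()}
# Roughly a 63 / 16 / 21 train/val/test split.
```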

Training Logging

(Figure: training logging.)

Metrics

| Metric           | Value  |
|------------------|--------|
| mIoU             | 32.35% |
| Overall Accuracy | 87.97% |
| F-score          | 43.04% |
| Precision        | 50.96% |
| Recall           | 42.60% |
| Class                 | IoU (%) | F-score (%) | Precision (%) | Recall (%) |
|-----------------------|---------|-------------|---------------|------------|
| grasses               | 52.02   | 68.44       | 73.20         | 64.26      |
| wheat                 | 57.45   | 72.97       | 64.92         | 83.30      |
| barley                | 30.96   | 47.29       | 72.74         | 35.03      |
| maize                 | 78.30   | 87.83       | 84.77         | 91.11      |
| other cereals         | 8.15    | 15.08       | 18.59         | 12.69      |
| rice                  | 0.00    | 0.00        | 0.00          | 0.00       |
| flax/hemp/tobacco     | 10.61   | 19.19       | 88.66         | 10.76      |
| sunflower             | 45.82   | 62.85       | 53.61         | 75.92      |
| rapeseed              | 71.89   | 83.64       | 82.53         | 84.78      |
| other oilseed crops   | 0.00    | 0.00        | 0.00          | 0.00       |
| soy                   | 33.68   | 50.38       | 62.54         | 42.19      |
| other protein crops   | 8.93    | 16.39       | 18.95         | 14.44      |
| fodder legumes        | 27.19   | 42.76       | 44.10         | 41.50      |
| beetroots             | 75.31   | 85.91       | 84.14         | 87.77      |
| potatoes              | 14.37   | 25.13       | 19.45         | 35.48      |
| other arable crops    | 22.10   | 36.20       | 39.78         | 33.22      |
| vineyard              | 44.55   | 61.64       | 56.82         | 67.36      |
| olive groves          | 16.38   | 28.14       | 55.59         | 18.84      |
| fruits orchards       | 36.57   | 53.55       | 47.56         | 61.27      |
| nut orchards          | 6.60    | 12.38       | 19.69         | 9.03       |
| other permanent crops | 12.12   | 21.62       | 84.16         | 12.40      |
| mixed crops           | 2.30    | 4.50        | 7.39          | 3.24       |
| background            | 88.73   | 94.03       | 92.86         | 95.22      |
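The summary metrics appear consistent with the per-class table: mIoU matches the unweighted mean of the 23 class IoUs, and each class F-score is the harmonic mean of its precision and recall. A quick check in plain Python (values copied from the table above):

```python
# Per-class IoU (%) from the table above.
class_iou = {
    "grasses": 52.02, "wheat": 57.45, "barley": 30.96, "maize": 78.30,
    "other cereals": 8.15, "rice": 0.00, "flax/hemp/tobacco": 10.61,
    "sunflower": 45.82, "rapeseed": 71.89, "other oilseed crops": 0.00,
    "soy": 33.68, "other protein crops": 8.93, "fodder legumes": 27.19,
    "beetroots": 75.31, "potatoes": 14.37, "other arable crops": 22.10,
    "vineyard": 44.55, "olive groves": 16.38, "fruits orchards": 36.57,
    "nut orchards": 6.60, "other permanent crops": 12.12,
    "mixed crops": 2.30, "background": 88.73,
}

miou = sum(class_iou.values()) / len(class_iou)
# round(miou, 2) → 32.35, matching the reported mIoU.

def f_score(precision, recall):
    """Harmonic mean of precision and recall (both in %)."""
    return 2 * precision * recall / (precision + recall)

wheat_f = f_score(64.92, 83.30)
# round(wheat_f, 2) → 72.97, matching the wheat row.
```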

Inference

Aerial ROI

(Figure: aerial image over the region of interest.)

Inference ROI

(Figure: model inference over the same region of interest.)

Cite

BibTeX:

@article{ign2025flairhub,
  doi = {10.48550/arXiv.2506.07080},
  url = {https://arxiv.org/abs/2506.07080},
  author = {Garioud, Anatol and Giordano, Sébastien and David, Nicolas and Gonthier, Nicolas},
  title = {FLAIR-HUB: Large-scale Multimodal Dataset for Land Cover and Crop Mapping},
  publisher = {arXiv},
  year = {2025}
}

APA:

Garioud, A., Giordano, S., David, N., & Gonthier, N. (2025). FLAIR-HUB: Large-scale multimodal dataset for land cover and crop mapping. arXiv. https://doi.org/10.48550/arXiv.2506.07080