---
dataset_info:
  features:
  - name: image
    dtype: image
  - name: age
    dtype:
      class_label:
        names:
          '0': 0-2
          '1': 3-9
          '2': 10-19
          '3': 20-29
          '4': 30-39
          '5': 40-49
          '6': 50-59
          '7': 60-69
          '8': more than 70
  - name: gender
    dtype:
      class_label:
        names:
          '0': Male
          '1': Female
  - name: race
    dtype:
      class_label:
        names:
          '0': East Asian
          '1': Indian
          '2': Black
          '3': White
          '4': Middle Eastern
          '5': Latino_Hispanic
          '6': Southeast Asian
  - name: service_test
    dtype: bool
  splits:
  - name: train
    num_bytes: 65869100.551
    num_examples: 3031
  download_size: 65046038
  dataset_size: 65869100.551
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
license: apache-2.0
language:
- en
pretty_name: Fairness
size_categories:
- 1K<n<10K
---
<h1 align="center" style="font-family:Arial, sans-serif; font-size:36px; color:#333;">FairFace_Balanced_3K</h1>
## Overview
`FairFace_Balanced_3K` is a balanced subset of the original [HuggingFaceM4/FairFace](https://huggingface.co/datasets/HuggingFaceM4/FairFace) dataset created to support bias-sensitive experiments in facial attribute recognition. This subset includes **3,031 samples**, with **433 images per race class**, across **7 race categories**:
- White
- Black
- East Asian
- Southeast Asian
- Indian
- Middle Eastern
- Latino_Hispanic
Each entry contains:
- RGB facial image
- Age group label (9 categories)
- Gender label (Male, Female)
- Race label (7 classes)
The subset is balanced to mitigate data bias and allow fair evaluation across racial groups.
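The per-race balancing described above can be sketched as an equal-sized random draw from each race group. This is a minimal illustration with synthetic records, not the actual script used to build the subset; only the race names and the 433-per-class target come from this card, everything else is hypothetical:

```python
import random

RACES = ["White", "Black", "East Asian", "Southeast Asian",
         "Indian", "Middle Eastern", "Latino_Hispanic"]
PER_CLASS = 433  # 7 races * 433 = 3,031 samples

def balance_by_race(records, per_class=PER_CLASS, seed=0):
    """Draw an equal-sized random subset from each race group."""
    rng = random.Random(seed)
    by_race = {race: [] for race in RACES}
    for rec in records:
        by_race[rec["race"]].append(rec)
    balanced = []
    for race in RACES:
        pool = by_race[race]
        balanced.extend(rng.sample(pool, min(per_class, len(pool))))
    return balanced

# Synthetic demo: 1,000 records per race, subsampled to 433 each.
demo = [{"race": r, "id": i} for r in RACES for i in range(1000)]
subset = balance_by_race(demo)
print(len(subset))  # 3031
```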
Random samples from the dataset: <img src="testimgs/random_samples.png" alt="Sample" width="300"/>
## Data Format
The dataset is stored in the Hugging Face Hub using the `datasets` library and Parquet format. Each row includes:
- `image`: RGB facial image stored as encoded bytes (decoded to a PIL image by the `datasets` library)
- `age`: Integer (mapped to age group)
- `gender`: Integer (0: Male, 1: Female)
- `race`: Integer (0–6, mapped to race category)
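The integer-to-name mappings above (copied from this card's YAML header) can be sketched with plain dicts; this is only an illustration of the encoding, independent of the `ClassLabel` machinery `datasets` provides:

```python
# Integer-to-name mappings as declared in the dataset card's YAML header.
AGE_GROUPS = ["0-2", "3-9", "10-19", "20-29", "30-39",
              "40-49", "50-59", "60-69", "more than 70"]
GENDERS = ["Male", "Female"]
RACES = ["East Asian", "Indian", "Black", "White",
         "Middle Eastern", "Latino_Hispanic", "Southeast Asian"]

def decode(row):
    """Map a row's integer labels to their string names."""
    return {
        "age": AGE_GROUPS[row["age"]],
        "gender": GENDERS[row["gender"]],
        "race": RACES[row["race"]],
    }

# Example row with integer labels as stored in the Parquet files.
print(decode({"age": 3, "gender": 1, "race": 5}))
# {'age': '20-29', 'gender': 'Female', 'race': 'Latino_Hispanic'}
```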
## Visuals
- Race Distribution
<img src="testimgs/race_distribution.png" alt="Sample" width="300"/>
- Age Group Distribution
<img src="testimgs/age_roup_distribution.png" alt="Sample" width="300"/>
- Gender Distribution
<img src="testimgs/gender_distribution.png" alt="Sample" width="300"/>
## Citation
If you use this dataset or the original FairFace dataset, please cite the following work:
```bibtex
@inproceedings{karkkainenfairface,
  title={FairFace: Face Attribute Dataset for Balanced Race, Gender, and Age for Bias Measurement and Mitigation},
  author={Karkkainen, Kimmo and Joo, Jungseock},
  booktitle={Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision},
  year={2021},
  pages={1548--1558}
}
```
## Original Sources
- Original repo: [joojs/fairface](https://github.com/joojs/fairface)
- Original dataset: [HuggingFaceM4/FairFace](https://huggingface.co/datasets/HuggingFaceM4/FairFace)
- Original paper: [FairFace](https://openaccess.thecvf.com/content/WACV2021/papers/Karkkainen_FairFace_Face_Attribute_Dataset_for_Balanced_Race_Gender_and_Age_WACV_2021_paper.pdf)