Modalities: Image, Text
Formats: webdataset
Languages: English
ArXiv: arXiv:2409.20324
Libraries: Datasets, WebDataset
This repository is publicly accessible, but you need to agree to share your contact information and accept the access conditions before you can download its files and content.

HEADS-UP: Head-Mounted Egocentric Dataset for Trajectory Prediction in Blind Assistance Systems

We introduce HEADS-UP, the first egocentric dataset collected from head-mounted cameras and designed specifically for trajectory prediction in blind assistance systems. Given the large population of blind and visually impaired individuals, there is a growing need for intelligent assistive tools that provide real-time warnings about potential collisions with dynamic obstacles. Such systems depend on algorithms that predict the trajectories of moving objects, such as pedestrians, in order to issue timely hazard alerts. However, existing datasets do not capture this information from the perspective of a blind individual. HEADS-UP bridges this gap with data recorded from the wearer's point of view, tailored to trajectory prediction in this context.
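Since the card lists the webdataset format and the Datasets/WebDataset libraries, below is a minimal sketch of streaming the data with the Hugging Face datasets library. The repository ID "vita-epfl/HEADS-UP" and the "train" split are assumptions; use the values shown on this page. Because access is gated, accept the conditions on the Hub and authenticate (for example with huggingface-cli login) before running it.

# Minimal sketch, assuming the dataset can be streamed with the Hugging Face
# `datasets` library, which supports the WebDataset format.
# NOTE: the repo ID "vita-epfl/HEADS-UP" and the "train" split are assumptions;
# replace them with the values shown on this dataset page.
from datasets import load_dataset

ds = load_dataset("vita-epfl/HEADS-UP", split="train", streaming=True)

# Inspect the fields of the first few samples without downloading everything.
for i, sample in enumerate(ds):
    print(sample.keys())
    if i >= 2:
        break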

If you use this dataset, please cite:

@article{haghighi2024heads,
  title={HEADS-UP: Head-Mounted Egocentric Dataset for Trajectory Prediction in Blind Assistance Systems},
  author={Haghighi, Yasaman and Demonsant, Celine and Chalimourdas, Panagiotis and Naeini, Maryam Tavasoli and Munoz, Jhon Kevin and Bacca, Bladimir and Suter, Silvan and Gani, Matthieu and Alahi, Alexandre},
  journal={arXiv preprint arXiv:2409.20324},
  year={2024}
}