Sparsh-skin model

Sparsh-skin is a transformer-based backbone for full-hand tactile sensing with the Xela sensor. The model is trained with self-distillation SSL and is adapted specifically for full-hand Xela sensing, taking the hand's configuration into account.

Disclaimer: This model card was written by the Sparsh-skin authors. The Transformer architecture and DINO objectives have been adapted for full-hand tactile SSL.

Intended uses & limitations

You can use the Sparsh-skin model to extract touch representations for the Xela sensor. You have two options:

  1. Use the frozen Sparsh-skin encoder: This allows you to leverage the pre-trained weights of the Sparsh-skin model without modifying them.
  2. Fine-tune the Sparsh-skin encoder: You can fine-tune the Sparsh-skin encoder along with the training of your downstream task, allowing the model to adapt to your specific use case.

Both options enable you to take advantage of the powerful touch representations learned by the Sparsh-skin model.
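The two options above differ only in whether the encoder's weights receive gradients. A minimal PyTorch sketch of both setups is shown below; the encoder here is a stand-in module with illustrative dimensions, since the actual Sparsh-skin loader and embedding size are provided in the GitHub repository.

```python
import torch
import torch.nn as nn

# Stand-in for the Sparsh-skin backbone; the real loader and the
# embedding dimension come from the GitHub repository.
encoder = nn.Sequential(nn.Linear(128, 256), nn.GELU(), nn.Linear(256, 256))

# Option 1: frozen encoder -- disable gradients so the pre-trained
# weights are left unmodified while only the task head trains.
for p in encoder.parameters():
    p.requires_grad_(False)

# Downstream task head on top of the touch representations
# (10 output classes is an arbitrary example).
head = nn.Linear(256, 10)

x = torch.randn(4, 128)      # fake batch of tactile inputs
with torch.no_grad():
    z = encoder(x)           # frozen touch representation
logits = head(z)             # shape: (4, 10)

# Option 2: fine-tuning -- re-enable gradients and optimize the
# encoder jointly with the downstream head.
for p in encoder.parameters():
    p.requires_grad_(True)
optimizer = torch.optim.AdamW(
    list(encoder.parameters()) + list(head.parameters()), lr=1e-4
)
```

In practice, the frozen option is cheaper and a good first baseline; fine-tuning typically helps when the downstream data distribution differs from the pre-training data.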

How to Use

For detailed instructions on how to load the encoder and integrate it into your downstream task, please refer to our GitHub repository.

Citation

@inproceedings{sharma2025selfsupervised,
  title={Self-supervised perception for tactile skin covered dexterous hands},
  author={Akash Sharma and Carolina Higuera and Chaithanya Krishna Bodduluri and Zixi Liu and Taosha Fan and Tess Hellebrekers and Mike Lambeta and Byron Boots and Michael Kaess and Tingfan Wu and Francois Robert Hogan and Mustafa Mukadam},
  booktitle={9th Annual Conference on Robot Learning},
  year={2025},
  url={https://openreview.net/forum?id=eLeCrM5PEO}
}