Dataset Details
This dataset contains paired tactile and force data, intended for use in predicting the 3-axis normal and shear forces applied to the sensor's elastomer. We used three different indenter shapes to collect force-labeled data: hemisphere, sharp, and flat. To measure force ground truths, we employed the ATI Nano17 force/torque sensor. The protocol consisted of applying a random normal load (up to 5 N), followed by a shear load achieved by sliding the probe 2 mm across the sensor's elastomer.
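As a rough sketch of the trial parameters implied by this protocol (the sampling distributions and the in-plane slide direction are assumptions of this example, not documented specifics):

```python
import random

def sample_trial(max_normal_n=5.0, slide_mm=2.0):
    """Hypothetical sketch of one labeling trial: sample a normal load
    up to 5 N, then command a 2 mm slide in a random in-plane direction
    to induce shear.  Distribution choices here are illustrative only."""
    normal_load = random.uniform(0.0, max_normal_n)  # normal load, N
    shear_angle = random.uniform(0.0, 360.0)         # slide direction, degrees
    return {"normal_N": normal_load, "slide_mm": slide_mm, "angle_deg": shear_angle}
```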
This dataset is part of TacBench for evaluating Sparsh touch representations. For more information, please visit https://sparsh-ssl.github.io/.
Uses
This dataset includes aligned tactile data, 3-axis force, and slip labels collected with the DIGIT sensor. It is designed to evaluate the performance of Sparsh encoders in understanding tactile properties such as force estimation and slip detection. Note that slip labels {0: no_slip, 1: slip} were obtained indirectly via the friction cone method.
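The friction cone criterion labels a contact as slipping when the tangential (shear) force exceeds the cone boundary given by the friction coefficient times the normal force magnitude. A minimal sketch of this idea (the friction coefficient here is hypothetical; the value used to generate the labels is not specified in this card):

```python
import math

# Hypothetical friction coefficient for the elastomer surface; purely
# illustrative, not the value used to produce the dataset labels.
MU = 0.5

def label_slip(fx, fy, fz, mu=MU):
    """Friction-cone slip heuristic: slipping (1) when the shear force
    magnitude sqrt(fx^2 + fy^2) exceeds mu * |fz|, otherwise no_slip (0)."""
    shear = math.hypot(fx, fy)
    return int(shear > mu * abs(fz))

label_slip(0.5, 0.0, 2.0)  # shear 0.5 N <= cone boundary 1.0 N -> 0 (no_slip)
label_slip(1.5, 0.0, 2.0)  # shear 1.5 N >  cone boundary 1.0 N -> 1 (slip)
```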
For more information on how to use this dataset and set up corresponding downstream tasks, please refer to the Sparsh repository.
Dataset Structure
The dataset consists of a collection of normal/shear load trajectories for each probe. The structure is as follows:
```
sphere
├── batch_1
│   ├── dataset_digit_00.pkl
│   ├── ...
│   ├── dataset_digit_03.pkl
│   └── dataset_slip_forces.pkl
├── batch_2
│   └── ...
flat
├── batch_1
│   ├── dataset_digit_00.pkl
│   ├── ...
│   ├── dataset_digit_03.pkl
│   └── dataset_slip_forces.pkl
├── ...
sharp
└── ...
```
For each batch:
- `dataset_digit_xy.pkl`: contains the binarized tactile images only.
- `dataset_slip_forces.pkl`: a dictionary where each key represents a sliding trajectory; each trajectory holds the corresponding force and slip labels.
```python
import io
import pickle

import numpy as np
from PIL import Image


def load_pickle_dataset(file_dataset):
    """Load a pickled collection of binarized tactile frames."""
    with open(file_dataset, "rb") as f:
        all_frames = pickle.load(f)
    return all_frames


def load_bin_image(io_buf):
    """Decode a binarized image buffer into a NumPy array."""
    img = Image.open(io.BytesIO(io_buf))
    img = np.array(img)
    return img


frames = load_pickle_dataset("sphere/batch_1/dataset_digit_00.pkl")
img = load_bin_image(frames[0])
```
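The labels file can be loaded the same way. The per-trajectory field names are not documented in this card (the Sparsh repository is the authoritative source for the schema), so this sketch only loads the dictionary and enumerates its trajectories:

```python
import pickle

def load_slip_forces(file_dataset):
    """Load the force/slip label dictionary; each top-level key is one
    sliding trajectory.  The per-trajectory contents (force arrays and
    slip labels) follow the schema defined in the Sparsh repository."""
    with open(file_dataset, "rb") as f:
        return pickle.load(f)

# Example usage (path taken from the dataset layout above):
# labels = load_slip_forces("sphere/batch_1/dataset_slip_forces.pkl")
# for traj_id, traj in labels.items():
#     print(traj_id)
```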
Please refer to the Sparsh repository for further information about extracting the force/slip dataset.
BibTeX entry and citation info
```bibtex
@inproceedings{higuera2024sparsh,
  title={Sparsh: Self-supervised touch representations for vision-based tactile sensing},
  author={Carolina Higuera and Akash Sharma and Chaithanya Krishna Bodduluri and Taosha Fan and Patrick Lancaster and Mrinal Kalakrishnan and Michael Kaess and Byron Boots and Mike Lambeta and Tingfan Wu and Mustafa Mukadam},
  booktitle={8th Annual Conference on Robot Learning},
  year={2024},
  url={https://openreview.net/forum?id=xYJn2e1uu8}
}
```