
The Tremendous TabLib Trawl (T4) is a dataset for training tabular foundation models. The dataset is described in detail in our paper, "Large Scale Transfer Learning for Tabular Data via Language Modeling." The paper also includes a datasheet for this dataset.

T4 consists of a set of Parquet files (described below). For examples and infrastructure showing how to train a language model on T4, see our open-source Python library, rtfm, which was used to train TabuLa-8B on T4.

Files and Directory Structure

The T4 dataset contains approximately 3.1M tables. Each table is a separate Parquet file, named according to the content_hash of the dataset in TabLib. The dataset is stored in "chunk" subdirectories, which represent batches of tables from the preprocessing phase. Each chunk directory (e.g. chunk-0000) is stored as a single .zip file; unzip these files to access the underlying Parquet files.

The dataset occupies a total of 219GB compressed (1.34TB uncompressed) on disk.
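As a minimal sketch of how to access the data (the local paths and the specific chunk name chunk-0000.zip below are assumptions based on the structure described above), a chunk can be unzipped and one of its tables loaded with standard Python tooling:

```python
import zipfile
from pathlib import Path

import pandas as pd  # requires pyarrow or fastparquet for Parquet support

# Hypothetical local paths; adjust to wherever you downloaded the dataset.
chunk_zip = Path("t4-full/chunk-0000.zip")
extract_dir = Path("t4-full/chunk-0000")

# Extract the chunk's Parquet files (one file per table, named by content_hash).
with zipfile.ZipFile(chunk_zip) as zf:
    zf.extractall(extract_dir)

# Load a single table into a DataFrame and inspect it.
parquet_files = sorted(extract_dir.rglob("*.parquet"))
df = pd.read_parquet(parquet_files[0])
print(df.shape)
print(df.head())
```

For training at scale, you will likely want to stream tables chunk by chunk rather than extracting everything up front; see the rtfm library for the data-loading setup used to train TabuLa-8B.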

License and Acceptable Use

We release this dataset under the same license as the original corpus from which it was derived, TabLib.

By using this dataset, you are acknowledging that you have permission to access the TabLib dataset, and you agree to abide by the terms of use and license of TabLib.

TabLib can be accessed on HF Datasets, and you can read more about TabLib in the associated paper and blog post.

We claim no affiliation with the original creators of TabLib, and this dataset release is not associated with Approximate Labs (but we are grateful to the original TabLib authors for their contributions to the research community and for releasing TabLib).
