Dataset: Bradley/fineweb-sample-100BT_over-2048-tokens-subset-split-processed-l3tokenizer
Formats: parquet
Size: 1M - 10M
Libraries: Datasets, Dask, Croissant, + 1
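Since the card lists the `Datasets` library, one way to pull the data is a streaming load, which avoids downloading every parquet shard up front. This is a minimal sketch: the repo id comes from the page above, while `split="train"` is an assumption about the repo's split names.

```python
# Repo id as shown on the dataset card.
REPO_ID = "Bradley/fineweb-sample-100BT_over-2048-tokens-subset-split-processed-l3tokenizer"

if __name__ == "__main__":
    # Requires: pip install datasets
    from datasets import load_dataset

    # streaming=True iterates over the parquet shards without a full download;
    # split="train" is assumed, not confirmed by the card.
    ds = load_dataset(REPO_ID, split="train", streaming=True)
    for example in ds:
        print(example)  # inspect the first record's columns
        break
```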
1 contributor, History: 4 commits

Latest commit: Bradley, "Upload dataset (part 00002-of-00003)", 91fb401 (verified), 4 days ago

Files:
data/            Upload dataset (part 00001-of-00003)   4 days ago
.gitattributes   2.46 kB    initial commit              4 days ago
README.md        340 Bytes  Upload dataset (part 00002-of-00003)   4 days ago