---
dataset_info:
  features:
  - name: text
    dtype: string
  - name: id
    dtype: string
  - name: dump
    dtype: string
  - name: url
    dtype: string
  - name: date
    dtype: string
  - name: file_path
    dtype: string
  - name: language
    dtype: string
  - name: language_score
    dtype: float64
  - name: token_count
    dtype: int64
  splits:
  - name: train
    num_bytes: 124335849835.91966
    num_examples: 13377130
  download_size: 42308647425
  dataset_size: 124335849835.91966
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---

A subset of FineWeb sample-100BT containing documents whose sequence length is >= 1024 tokens when tokenized with the Llama 2 tokenizer (including special tokens).
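
A minimal sketch of how such a filter could be reproduced, assuming the source is the `HuggingFaceFW/fineweb` dataset's `sample-100BT` config and the tokenizer is `meta-llama/Llama-2-7b-hf`; those repo IDs, the `num_proc` value, and the exact special-token handling are assumptions, since the card only states "FineWeb sample-100BT" and "Llama 2 tokenizer":

```python
# Sketch only: repo IDs and special-token handling are assumptions, not the exact pipeline used.
from datasets import load_dataset
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-hf")  # Llama 2 tokenizer
fineweb = load_dataset("HuggingFaceFW/fineweb", "sample-100BT", split="train")

def long_enough(example):
    # Count tokens including special tokens (e.g. BOS), matching the ">= 1024" criterion above.
    return len(tokenizer(example["text"], add_special_tokens=True).input_ids) >= 1024

subset = fineweb.filter(long_enough, num_proc=8)
```

The `token_count` column in the schema above presumably stores this per-document count, so an equivalent subset could likely also be obtained by filtering on that column directly (e.g. `ds.filter(lambda x: x["token_count"] >= 1024)`).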