How much disk space will be consumed by downloading all the images?
1 TB, 10 TB, 100 TB, 1 PB?
@eeyrw It depends on the image size and encode quality, and I'm also not sure what the link-rot % is now. ~2 years ago I think it was 240TB for the full 5B, and 2B-en was ~100TB. I believe that download used the default JPEG encode quality of 95 and a max size of 384; I think the smallest edge was resized to a max of 384 (left as-is if smaller).
I'd recommend setting up img2dataset with your desired settings and downloading 2-5M images. You'll be able to estimate reasonably from that: you'll get an idea of the % success and the avg size/img.
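To make that estimation concrete, here's a minimal sketch using img2dataset's Python API with roughly the settings described above. The parquet path, column names, sample/total counts, and process/thread counts are placeholders, and the exact parameter set may differ between img2dataset versions:

```python
from img2dataset import download
import os

# "relaion_sample.parquet" is a hypothetical 2-5M row slice of the
# metadata; the url/caption column names depend on the actual schema.
download(
    url_list="relaion_sample.parquet",
    input_format="parquet",
    url_col="url",
    caption_col="caption",
    output_folder="./sample_download",
    output_format="webdataset",
    image_size=384,
    resize_mode="keep_ratio",     # resize so the smallest edge is 384
    resize_only_if_bigger=True,   # leave images already below 384 as-is
    encode_format="jpg",
    encode_quality=95,
    processes_count=16,
    thread_count=64,
)

# Extrapolate from the sample to the full URL list.
sample_bytes = sum(
    os.path.getsize(os.path.join(root, f))
    for root, _, files in os.walk("./sample_download")
    for f in files
)
n_sampled = 5_000_000        # rows attempted in the sample
n_total = 2_000_000_000      # rows in the full 2B-en list
print(f"projected full download: {sample_bytes * n_total / n_sampled / 1e12:.1f} TB")
```

The extrapolation bakes in the observed success rate automatically, since failed downloads in the sample simply contribute no bytes.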
That sounds quite big; a professional storage solution would be needed. It's hard to select the samples I want right now due to the lack of a CLIP embedding index for Re-LAION. LAION's captions are quite noisy and weren't produced by a modern VLM, so it's also hard to filter by caption. Will the CLIP embedding index for Re-LAION come back?
@eeyrw Not sure what the plans are for the embeddings; someone might know if you ask in the LAION Discord.
If you can afford to train on 2B images, the storage is usually a much smaller challenge. A 'professional GPU' setup, i.e. hundreds of A100s/H100s, costs >> 100 TB of drive space. Most clusters with the needed GPUs will have the storage :)
I am not rich enough to train on 2B images :( Sometimes people need to create topic-specific subsets of LAION, like human posture, logos, plants, butterflies, whatever... Those kinds of subsets can be used to train a LoRA as a domain expert. With the embeddings I could use CLIP to do a rough pre-filter on the 2B set without downloading all the images.
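For context, a minimal sketch of what that pre-filter could look like, assuming the embeddings ship as .npy shards with matching parquet metadata (the layout the old LAION releases used). The shard paths, threshold, and ViT-B-32 checkpoint are placeholders; the text encoder has to match whichever model actually produced the image embeddings:

```python
import numpy as np
import pandas as pd
import torch
import open_clip

# Hypothetical shard paths -- adjust to however the embeddings are published.
EMB_SHARD = "img_emb_0000.npy"
META_SHARD = "metadata_0000.parquet"
QUERY = "a photo of a butterfly"
THRESHOLD = 0.25  # cosine-similarity cutoff; tune it on a small labeled sample

# Placeholder model: must match the model that produced the image embeddings.
model, _, _ = open_clip.create_model_and_transforms(
    "ViT-B-32", pretrained="laion2b_s34b_b79k"
)
tokenizer = open_clip.get_tokenizer("ViT-B-32")

with torch.no_grad():
    text = model.encode_text(tokenizer([QUERY]))
    text = (text / text.norm(dim=-1, keepdim=True)).numpy().astype(np.float32)[0]

# Normalize image embeddings too, so the dot product is cosine similarity.
embs = np.load(EMB_SHARD).astype(np.float32)
embs /= np.linalg.norm(embs, axis=1, keepdims=True)
keep = (embs @ text) >= THRESHOLD

# Keep only the matching metadata rows; their URLs can then be fed
# to img2dataset so only the topic subset ever gets downloaded.
meta = pd.read_parquet(META_SHARD)
meta[keep].to_parquet("butterfly_subset_0000.parquet")
print(f"kept {keep.sum()} of {len(keep)} rows")
```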
Yeah, I plan to do the filtering but haven't had time yet. Maybe in a week or two.
Glad to hear it's planned. It's not urgent for me to use the Re-LAION 2B CLIP embeddings immediately; currently I use laion-pop, which is significantly smaller, as my base dataset. But if there's a chance, releasing new embeddings would be better.