# TiM / requirements.txt
gradio>=4.0.0
spaces>=0.28.0
torch==2.8.0
torchvision
diffusers
transformers>=4.25.0
omegaconf
einops
numpy
Pillow
safetensors
tqdm
# Prebuilt flash-attn wheel (CUDA 12.9, torch 2.8, CPython 3.10, Linux x86_64);
# it must match the torch==2.8.0 pin above and the runtime Python/CUDA versions.
flash-attn @ https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.3.14/flash_attn-2.8.2+cu129torch2.8-cp310-cp310-linux_x86_64.whl
accelerate
kernels
timm