1. Where is the code
All code can be found at https://huggingface.co/Apostasi0225/susie-finetuned. The repository also contains the dataset and our model checkpoint.
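If you prefer to download everything from Python instead of cloning the repo with git, the Hugging Face Hub client can fetch a full snapshot. This is an optional convenience, not part of the original instructions; the call downloads into the local Hub cache and prints the resulting path.
# Optional: download the repository (code, dataset, checkpoint) with the Hub client.
from huggingface_hub import snapshot_download

local_dir = snapshot_download("Apostasi0225/susie-finetuned")
print(local_dir)  # local path containing the downloaded files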
2. Environment Setup
System requirements:
- Linux
- CUDA 11.8
- CUDNN 8.6
Build the environment using conda:
conda create -n susie python=3.10
conda activate susie
cd susie
pip install -r requirements.txt
pip install -e .
Additionally, install the following packages:
# Install PyTorch for model loading
pip install torch==2.6.0
# Install jaxlib 0.4.11
pip install https://storage.googleapis.com/jax-releases/cuda11/jaxlib-0.4.11+cuda11.cudnn86-cp310-cp310-manylinux2014_x86_64.whl
pip install scipy==1.12.0
# Downgrade numpy and orbax-checkpoint
pip install orbax-checkpoint==0.3.5
pip install numpy==1.24
# Install skimage
pip install scikit-image
pip install ipykernel
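Before moving on, it can be worth a quick check that the CUDA build of jaxlib actually sees the GPU. This is an optional sanity check, assuming jax itself was pulled in by requirements.txt.
# Optional: verify that jaxlib 0.4.11 (CUDA 11.8 / cuDNN 8.6) detects the GPU.
import jax
print(jax.__version__)
print(jax.devices())  # should list a CUDA device, e.g. [cuda(id=0)]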
Modify the diffusers library code:
- Find the location of the package:
pip show diffusers # e.g. Location: /home/username/miniconda3/envs/susie/lib/python3.10/site-packages/diffusers
- Open that folder and find utils/dynamic_modules_utils.py.
- Remove cached_download from the import on line 28 (from huggingface_hub import ...). Newer versions of huggingface_hub no longer export cached_download, so leaving it in the import causes an ImportError.
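For reference, the edit looks roughly like the following. The exact set of names imported on line 28 depends on your diffusers version, so the names other than cached_download are illustrative.
# utils/dynamic_modules_utils.py, line 28
# Before (illustrative):
from huggingface_hub import cached_download, hf_hub_download, model_info
# After removing cached_download:
from huggingface_hub import hf_hub_download, model_info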
3. Getting Started
Step 1: Generate TFRecord Data
Run 0-generate_dataset.ipynb to generate the TFRecord data. This will automatically split the dataset into training and validation sets.
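To sanity-check the generated files, a TFRecord can be inspected directly with tf.data. The file path below is hypothetical; the notebook's actual output names may differ.
import tensorflow as tf

# Read one serialized example from a generated TFRecord (path is hypothetical).
dataset = tf.data.TFRecordDataset("data/train.tfrecord")
for raw_record in dataset.take(1):
    example = tf.train.Example()
    example.ParseFromString(raw_record.numpy())
    print(list(example.features.feature.keys()))  # feature names stored per example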
Step 2: Train the Model
Follow the command in 1-train.md to start training. You can modify the training configuration parameters by editing susie/configs/base.py.
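As a rough illustration of what such an edit might look like, here is a minimal sketch assuming the config is an ml_collections.ConfigDict, as is common in JAX training repos; the field names are assumptions, not the actual contents of susie/configs/base.py.
# Hypothetical excerpt of a base config; field names are illustrative only.
from ml_collections import ConfigDict

def get_config():
    config = ConfigDict()
    config.batch_size = 64          # e.g. reduce if you run out of GPU memory
    config.num_train_steps = 40000
    config.learning_rate = 1e-4
    return config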
Step 3: Evaluate the Model
We provide the checkpoint for our model. After training, evaluate the model using 3-eval.ipynb. This will test the model on the validation set and output evaluation metrics such as SSIM and PSNR for each task.
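For context, SSIM and PSNR can be computed per image pair with the scikit-image package installed above. The arrays here are placeholders standing in for a generated frame and its ground truth; the notebook's own evaluation loop may differ.
import numpy as np
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

# Placeholder images; in 3-eval.ipynb these would be the model output and ground truth.
pred = np.random.rand(256, 256, 3).astype(np.float32)
target = np.random.rand(256, 256, 3).astype(np.float32)

ssim = structural_similarity(target, pred, channel_axis=-1, data_range=1.0)
psnr = peak_signal_noise_ratio(target, pred, data_range=1.0)
print(f"SSIM: {ssim:.4f}  PSNR: {psnr:.2f} dB")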