---
license: apache-2.0
datasets:
  - trollek/ImagePromptHelper-v02
  - Gustavosta/Stable-Diffusion-Prompts
  - k-mktr/improved-flux-prompts
  - Falah/image_generation_prompts_SDXL
  - ChrisGoringe/flux_prompts
language:
  - en
base_model:
  - HuggingFaceTB/SmolLM2-135M
library_name: transformers
tags:
  - llama-factory
  - full
---

# Smol Image Prompt Helper

This is meant to be a drop-in replacement for my last image prompt helper but with a new trick and a much smaller size. It achieves the following results on the evaluation set:

- Loss: 1.0077

## Model description

Let's say you have a node in ComfyUI to parse JSON and send the appropriate prompt to each text encoder. Tadaaa:

```
You are an AI assistant tasked with expanding and formatting image prompts. You are given an input that you will need to write image prompts for different text encoders.
Always respond with the following format:
{
    "clip_l": "<keywords from image analysis>",
    "clip_g": "<simple descriptions of the image>",
    "t5xxl": "<complex semanticly rich description of the image>",
    "negative": "<contrasting keywords for what is not in the image>"
}
```
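
Below is a minimal sketch of calling the model with `transformers` and parsing the JSON reply. The repo id, the user prompt, and the generation settings are placeholders, and it assumes the tokenizer ships a chat template from fine-tuning.

```python
import json

from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder id: swap in this repository's actual model id.
model_id = "trollek/Smol-Image-Prompt-Helper"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

system_prompt = """You are an AI assistant tasked with expanding and formatting image prompts. You are given an input that you will need to write image prompts for different text encoders.
Always respond with the following format:
{
    "clip_l": "<keywords from image analysis>",
    "clip_g": "<simple descriptions of the image>",
    "t5xxl": "<complex semanticly rich description of the image>",
    "negative": "<contrasting keywords for what is not in the image>"
}"""

messages = [
    {"role": "system", "content": system_prompt},
    {"role": "user", "content": "a lighthouse on a cliff at dusk"},  # example input
]

# Assumes the tokenizer carries a chat template from the SFT run.
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)
output_ids = model.generate(input_ids, max_new_tokens=512, do_sample=True, temperature=0.7)
reply = tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True)

# The model is trained to answer with a JSON object; parsing can still fail
# if generation drifts, so guard this in real pipelines.
prompts = json.loads(reply)
print(prompts["clip_l"])
print(prompts["t5xxl"])
```

From there, a JSON-parsing node in ComfyUI can route `clip_l`, `clip_g`, and `t5xxl` to their respective text encoders and `negative` to the negative conditioning.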

## Intended uses & limitations

Have a look at the dataset I created, [ImagePromptHelper-v02](https://huggingface.co/datasets/trollek/ImagePromptHelper-v02) (CC BY 4.0), and you will see whaaaaat I've doooone.
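
If you just want to poke at the data, a quick sketch with the `datasets` library (the split name is an assumption; check the dataset card for the actual layout):

```python
from datasets import load_dataset

# Split name assumed; see the dataset card for the real configuration.
ds = load_dataset("trollek/ImagePromptHelper-v02", split="train")
print(ds[0])
```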

## Training procedure

I continued pretraining on SDXL and Flux prompts, then SFT'd the model on my own dataset.

### Training hyperparameters

The following hyperparameters were used during training (a rough `transformers` equivalent is sketched after the list):

- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 1
- seed: 443
- gradient_accumulation_steps: 8
- total_train_batch_size: 64
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 3
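
For orientation, here is how those settings map onto `transformers.TrainingArguments`. The actual run was driven by LLaMA-Factory, so this is an illustrative equivalent rather than the original config, and `output_dir` is a placeholder:

```python
from transformers import TrainingArguments

# Illustrative mapping of the listed hyperparameters; the real training
# used LLaMA-Factory, and output_dir is a placeholder.
args = TrainingArguments(
    output_dir="smol-image-prompt-helper",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=1,
    seed=443,
    gradient_accumulation_steps=8,  # 8 per device x 8 steps = 64 effective
    optim="adamw_torch",            # AdamW defaults: betas=(0.9, 0.999), eps=1e-08
    lr_scheduler_type="cosine",
    warmup_ratio=0.05,
    num_train_epochs=3,
)
```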

### Training results

| Training Loss | Epoch  | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 1.1631        | 0.3966 | 500  | 1.2816          |
| 1.019         | 0.7932 | 1000 | 1.1431          |
| 0.9857        | 1.1896 | 1500 | 1.0818          |
| 1.0436        | 1.5862 | 2000 | 1.0459          |
| 0.9918        | 1.9827 | 2500 | 1.0235          |
| 0.9287        | 2.3791 | 3000 | 1.0114          |
| 0.9205        | 2.7757 | 3500 | 1.0079          |

### Framework versions

- Transformers 4.50.0
- Pytorch 2.6.0+cu126
- Datasets 3.4.1
- Tokenizers 0.21.0