Schedulers
🤗 Diffusers provides many scheduler functions for the diffusion process. A scheduler takes a model’s output (the sample which the diffusion process is iterating on) and a timestep to return a denoised sample. The timestep is important because it dictates where in the diffusion process the step is; data is generated by iterating forward n timesteps and inference occurs by propagating backward through the timesteps. Based on the timestep, a scheduler may be discrete, in which case the timestep is an int, or continuous, in which case the timestep is a float.
Depending on the context, a scheduler defines how to iteratively add noise to an image or how to update a sample based on a model’s output:
- during training, a scheduler adds noise (there are different algorithms for how to add noise) to a sample to train a diffusion model
- during inference, a scheduler defines how to update a sample based on a pretrained model’s output (both roles are sketched below)
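As a minimal sketch of both roles, the snippet below uses DDPMScheduler and UNet2DModel and assumes the google/ddpm-celebahq-256 pipeline repository layout (unet and scheduler subfolders); the random tensors only stand in for real data:

```python
import torch
from diffusers import DDPMScheduler, UNet2DModel

repo = "google/ddpm-celebahq-256"  # example model id also used in from_pretrained() below
scheduler = DDPMScheduler.from_pretrained(repo, subfolder="scheduler")
model = UNet2DModel.from_pretrained(repo, subfolder="unet")

# training: the scheduler adds noise to a clean sample at a random timestep
clean_sample = torch.randn(1, 3, 256, 256)  # stand-in for a real training image
noise = torch.randn_like(clean_sample)
timesteps = torch.randint(0, scheduler.config.num_train_timesteps, (1,))
noisy_sample = scheduler.add_noise(clean_sample, noise, timesteps)
# a diffusion model is trained to predict `noise` from `noisy_sample` at `timesteps`

# inference: the scheduler updates the sample from the pretrained model's output
scheduler.set_timesteps(50)
sample = torch.randn(1, 3, 256, 256)
for t in scheduler.timesteps:
    with torch.no_grad():
        model_output = model(sample, t).sample
    sample = scheduler.step(model_output, t, sample).prev_sample
```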
Many schedulers are implemented from the k-diffusion library by Katherine Crowson, and they’re also widely used in A1111. To help you map the schedulers from k-diffusion and A1111 to the schedulers in 🤗 Diffusers, take a look at the table below:
A1111/k-diffusion | 🤗 Diffusers | Usage |
---|---|---|
DPM++ 2M | DPMSolverMultistepScheduler | |
DPM++ 2M Karras | DPMSolverMultistepScheduler | init with use_karras_sigmas=True |
DPM++ 2M SDE | DPMSolverMultistepScheduler | init with algorithm_type="sde-dpmsolver++" |
DPM++ 2M SDE Karras | DPMSolverMultistepScheduler | init with use_karras_sigmas=True and algorithm_type="sde-dpmsolver++" |
DPM++ 2S a | N/A | very similar to DPMSolverSinglestepScheduler |
DPM++ 2S a Karras | N/A | very similar to DPMSolverSinglestepScheduler(use_karras_sigmas=True, ...) |
DPM++ SDE | DPMSolverSinglestepScheduler | |
DPM++ SDE Karras | DPMSolverSinglestepScheduler | init with use_karras_sigmas=True |
DPM2 | KDPM2DiscreteScheduler | |
DPM2 Karras | KDPM2DiscreteScheduler | init with use_karras_sigmas=True |
DPM2 a | KDPM2AncestralDiscreteScheduler | |
DPM2 a Karras | KDPM2AncestralDiscreteScheduler | init with use_karras_sigmas=True |
DPM adaptive | N/A | |
DPM fast | N/A | |
Euler | EulerDiscreteScheduler | |
Euler a | EulerAncestralDiscreteScheduler | |
Heun | HeunDiscreteScheduler | |
LMS | LMSDiscreteScheduler | |
LMS Karras | LMSDiscreteScheduler | init with use_karras_sigmas=True |
N/A | DEISMultistepScheduler | |
N/A | UniPCMultistepScheduler | |
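For example, to reproduce A1111’s DPM++ 2M Karras from the table above, swap a pipeline’s scheduler for DPMSolverMultistepScheduler initialized with use_karras_sigmas=True (a sketch; the checkpoint id is only an illustration):

```python
from diffusers import DiffusionPipeline, DPMSolverMultistepScheduler

pipe = DiffusionPipeline.from_pretrained("stabilityai/stable-diffusion-2")
# DPM++ 2M Karras: DPMSolverMultistepScheduler with use_karras_sigmas=True
pipe.scheduler = DPMSolverMultistepScheduler.from_config(
    pipe.scheduler.config, use_karras_sigmas=True
)
```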
Noise schedules and schedule types
A1111/k-diffusion | 🤗 Diffusers |
---|---|
Karras | init with use_karras_sigmas=True |
sgm_uniform | init with timestep_spacing="trailing" |
simple | init with timestep_spacing="trailing" |
exponential | init with timestep_spacing="linspace", use_exponential_sigmas=True |
beta | init with timestep_spacing="linspace", use_beta_sigmas=True |
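A schedule type is selected the same way, by passing the corresponding flag when re-creating the scheduler from an existing config; a sketch (again with an illustrative checkpoint):

```python
from diffusers import DiffusionPipeline, EulerDiscreteScheduler

pipe = DiffusionPipeline.from_pretrained("stabilityai/stable-diffusion-2")
# sgm_uniform / simple schedule from the table above
pipe.scheduler = EulerDiscreteScheduler.from_config(
    pipe.scheduler.config, timestep_spacing="trailing"
)
```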
All schedulers are built from the base SchedulerMixin class, which implements the low-level utilities shared by all schedulers.
SchedulerMixin
Base class for all schedulers.
SchedulerMixin contains common functions shared by all schedulers such as general loading and saving functionalities.
ConfigMixin takes care of storing the configuration attributes (like num_train_timesteps) that are passed to the scheduler’s __init__ function, and the attributes can be accessed by scheduler.config.num_train_timesteps.
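A minimal sketch of config access (the scheduler class is only an example):

```python
from diffusers import DDPMScheduler

scheduler = DDPMScheduler(num_train_timesteps=1000)
print(scheduler.config.num_train_timesteps)  # 1000, stored by ConfigMixin
```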
Class attributes:
- _compatibles (List[str]) — A list of scheduler classes that are compatible with the parent scheduler class. Use from_config() to load a different compatible scheduler class (should be overridden by parent class).
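For instance (a sketch; the pipeline checkpoint is only an illustration), the classes listed in _compatibles are exposed through the public compatibles property, and any of them can be loaded over the current config with from_config():

```python
from diffusers import DiffusionPipeline

pipe = DiffusionPipeline.from_pretrained("stabilityai/stable-diffusion-2")
print(pipe.scheduler.compatibles)  # scheduler classes compatible with the current one
```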
from_pretrained
( pretrained_model_name_or_path: typing.Union[str, os.PathLike, NoneType] = None, subfolder: typing.Optional[str] = None, return_unused_kwargs = False, **kwargs )
Parameters
- pretrained_model_name_or_path (str or os.PathLike, optional) — Can be either:
  - A string, the model id (for example google/ddpm-celebahq-256) of a pretrained model hosted on the Hub.
  - A path to a directory (for example ./my_model_directory) containing the scheduler configuration saved with save_pretrained().
- subfolder (str, optional) — The subfolder location of a model file within a larger model repository on the Hub or locally.
- return_unused_kwargs (bool, optional, defaults to False) — Whether kwargs that are not consumed by the Python class should be returned or not.
- cache_dir (Union[str, os.PathLike], optional) — Path to a directory where a downloaded pretrained model configuration is cached if the standard cache is not used.
- force_download (bool, optional, defaults to False) — Whether or not to force the (re-)download of the model weights and configuration files, overriding the cached versions if they exist.
- proxies (Dict[str, str], optional) — A dictionary of proxy servers to use by protocol or endpoint, for example, {'http': 'foo.bar:3128', 'http://hostname': 'foo.bar:4012'}. The proxies are used on each request.
- output_loading_info (bool, optional, defaults to False) — Whether or not to also return a dictionary containing missing keys, unexpected keys, and error messages.
- local_files_only (bool, optional, defaults to False) — Whether to only load local model weights and configuration files or not. If set to True, the model won’t be downloaded from the Hub.
- token (str or bool, optional) — The token to use as HTTP bearer authorization for remote files. If True, the token generated from diffusers-cli login (stored in ~/.huggingface) is used.
- revision (str, optional, defaults to "main") — The specific model version to use. It can be a branch name, a tag name, a commit id, or any identifier allowed by Git.
Instantiate a scheduler from a pre-defined JSON configuration file in a local directory or Hub repository.
To use private or gated models, log in with huggingface-cli login. You can also activate the special “offline-mode” to use this method in a firewalled environment.
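A brief sketch (the Hub repository and the local path mirror the examples in the parameter list above):

```python
from diffusers import DDPMScheduler, DPMSolverMultistepScheduler

# from a pipeline repository on the Hub, pointing at its scheduler/ subfolder
scheduler = DPMSolverMultistepScheduler.from_pretrained(
    "stabilityai/stable-diffusion-2", subfolder="scheduler"
)

# from a local directory previously created with save_pretrained()
# scheduler = DDPMScheduler.from_pretrained("./my_model_directory")
```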
save_pretrained
( save_directory: typing.Union[str, os.PathLike], push_to_hub: bool = False, **kwargs )
Parameters
- save_directory (str or os.PathLike) — Directory where the configuration JSON file will be saved (will be created if it does not exist).
- push_to_hub (bool, optional, defaults to False) — Whether or not to push your model to the Hugging Face Hub after saving it. You can specify the repository you want to push to with repo_id (will default to the name of save_directory in your namespace).
- kwargs (Dict[str, Any], optional) — Additional keyword arguments passed along to the push_to_hub() method.
Save a scheduler configuration object to a directory so that it can be reloaded using the from_pretrained() class method.
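A minimal save-and-reload round trip (the directory name is illustrative):

```python
from diffusers import DDPMScheduler

scheduler = DDPMScheduler(num_train_timesteps=1000)
scheduler.save_pretrained("./my_model_directory")  # writes the configuration JSON file
reloaded = DDPMScheduler.from_pretrained("./my_model_directory")
```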
SchedulerOutput
class diffusers.schedulers.scheduling_utils.SchedulerOutput
( prev_sample: Tensor )

Base class for the output of a scheduler’s step function.
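In a denoising loop, step() returns a SchedulerOutput (or a scheduler-specific subclass) and the denoised sample is read from prev_sample; a self-contained sketch where a random tensor stands in for a real model prediction:

```python
import torch
from diffusers import DDPMScheduler

scheduler = DDPMScheduler(num_train_timesteps=1000)
scheduler.set_timesteps(10)

sample = torch.randn(1, 3, 64, 64)
model_output = torch.randn_like(sample)  # stand-in for a real model prediction

output = scheduler.step(model_output, scheduler.timesteps[0], sample)
sample = output.prev_sample  # the sample to feed into the next denoising step
```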
KarrasDiffusionSchedulers
KarrasDiffusionSchedulers are a broad generalization of schedulers in 🤗 Diffusers. The schedulers in this class are distinguished at a high level by their noise sampling strategy, the type of network and scaling, the training strategy, and how the loss is weighted.
The different schedulers in this class, depending on the ordinary differential equation (ODE) solver type, fall into the above taxonomy and provide a good abstraction for the design of the main schedulers implemented in 🤗 Diffusers. The schedulers in this class are listed in the KarrasDiffusionSchedulers enum.
PushToHubMixin
A Mixin to push a model, scheduler, or pipeline to the Hugging Face Hub.
push_to_hub
( repo_id: str, commit_message: typing.Optional[str] = None, private: typing.Optional[bool] = None, token: typing.Optional[str] = None, create_pr: bool = False, safe_serialization: bool = True, variant: typing.Optional[str] = None )
Parameters
- repo_id (str) — The name of the repository you want to push your model, scheduler, or pipeline files to. It should contain your organization name when pushing to an organization. repo_id can also be a path to a local directory.
- commit_message (str, optional) — Message to commit while pushing. Defaults to "Upload {object}".
- private (bool, optional) — Whether to make the repo private. If None (default), the repo will be public unless the organization’s default is private. This value is ignored if the repo already exists.
- token (str, optional) — The token to use as HTTP bearer authorization for remote files. The token generated when running huggingface-cli login (stored in ~/.huggingface).
- create_pr (bool, optional, defaults to False) — Whether or not to create a PR with the uploaded files or directly commit.
- safe_serialization (bool, optional, defaults to True) — Whether or not to convert the model weights to the safetensors format.
- variant (str, optional) — If specified, weights are saved in the format pytorch_model.<variant>.bin.
Upload model, scheduler, or pipeline files to the 🤗 Hugging Face Hub.
Examples:
```python
from diffusers import UNet2DConditionModel

unet = UNet2DConditionModel.from_pretrained("stabilityai/stable-diffusion-2", subfolder="unet")

# Push the `unet` to your namespace with the name "my-finetuned-unet".
unet.push_to_hub("my-finetuned-unet")

# Push the `unet` to an organization with the name "my-finetuned-unet".
unet.push_to_hub("your-org/my-finetuned-unet")
```