Configuration

Schedulers from SchedulerMixin and models from ModelMixin inherit from ConfigMixin, which stores all the parameters passed to their respective __init__ methods in a JSON configuration file.

To use private or gated models, log in with boincai-cli login.

ConfigMixin

class diffusers.ConfigMixin

( )

Base class for all configuration classes. All configuration parameters are stored under self.config. Also provides the from_config() and save_config() methods for loading, downloading, and saving classes that inherit from ConfigMixin.

Class attributes:

  • config_name (str) — A filename under which the config should be stored when calling save_config() (should be overridden by parent class).

  • ignore_for_config (List[str]) — A list of attributes that should not be saved in the config (should be overridden by subclass).

  • has_compatibles (bool) — Whether the class has compatible classes (should be overridden by subclass).

  • _deprecated_kwargs (List[str]) — Keyword arguments that are deprecated. Note that the init function should only have a kwargs argument if at least one argument is deprecated (should be overridden by subclass).
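
A minimal, hypothetical sketch of a subclass (MyScheduler and its parameters are made up for illustration): it declares config_name and records its __init__ arguments with the register_to_config decorator from diffusers.configuration_utils, so they become available under self.config.

>>> from diffusers import ConfigMixin
>>> from diffusers.configuration_utils import register_to_config

>>> class MyScheduler(ConfigMixin):
...     config_name = "scheduler_config.json"  # file name used by save_config()
...
...     @register_to_config
...     def __init__(self, num_train_timesteps: int = 1000, beta_start: float = 0.0001):
...         pass

>>> scheduler = MyScheduler()
>>> scheduler.config.num_train_timesteps  # all __init__ arguments are stored under self.config
1000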

load_config

( pretrained_model_name_or_path: typing.Union[str, os.PathLike], return_unused_kwargs = False, return_commit_hash = False, **kwargs ) → dict

Parameters

  • pretrained_model_name_or_path (str or os.PathLike, optional) — Can be either:

    • A string, the model id (for example google/ddpm-celebahq-256) of a pretrained model hosted on the Hub.

    • A path to a directory (for example ./my_model_directory) containing model weights saved with save_config().

  • cache_dir (Union[str, os.PathLike], optional) — Path to a directory where a downloaded pretrained model configuration is cached if the standard cache is not used.

  • force_download (bool, optional, defaults to False) — Whether or not to force the (re-)download of the model weights and configuration files, overriding the cached versions if they exist.

  • resume_download (bool, optional, defaults to False) — Whether or not to resume downloading the model weights and configuration files. If set to False, any incompletely downloaded files are deleted.

  • proxies (Dict[str, str], optional) — A dictionary of proxy servers to use by protocol or endpoint, for example, {'http': 'foo.bar:3128', 'http://hostname': 'foo.bar:4012'}. The proxies are used on each request.

  • output_loading_info (bool, optional, defaults to False) — Whether or not to also return a dictionary containing missing keys, unexpected keys and error messages.

  • local_files_only (bool, optional, defaults to False) — Whether to only load local model weights and configuration files or not. If set to True, the model won't be downloaded from the Hub.

  • use_auth_token (str or bool, optional) — The token to use as HTTP bearer authorization for remote files. If True, the token generated from diffusers-cli login (stored in ~/.boincai) is used.

  • revision (str, optional, defaults to "main") — The specific model version to use. It can be a branch name, a tag name, a commit id, or any identifier allowed by Git.

  • subfolder (str, optional, defaults to "") — The subfolder location of a model file within a larger model repository on the Hub or locally.

  • return_unused_kwargs (bool, optional, defaults to False) — Whether unused keyword arguments of the config are returned.

  • return_commit_hash (bool, optional, defaults to False) — Whether the commit_hash of the loaded configuration is returned.

Returns

dict

A dictionary of all the parameters stored in a JSON configuration file.

Load a model or scheduler configuration.
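
For example (a minimal sketch reusing the google/ddpm-cifar10-32 repository from the example further below), the raw configuration can be inspected as a dictionary before deciding which class to instantiate:

>>> from diffusers import DDPMScheduler

>>> # load_config returns the JSON configuration as a plain dictionary; no class is instantiated yet
>>> config = DDPMScheduler.load_config("google/ddpm-cifar10-32")

>>> # The keys mirror the scheduler's __init__ parameters
>>> params = sorted(config.keys())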

from_config

( config: typing.Union[diffusers.configuration_utils.FrozenDict, typing.Dict[str, typing.Any]] = None, return_unused_kwargs = False, **kwargs ) → ModelMixin or SchedulerMixin

Parameters

  • config (Dict[str, Any]) — A config dictionary from which the Python class is instantiated. Make sure to only load configuration files of compatible classes.

  • return_unused_kwargs (bool, optional, defaults to False) — Whether kwargs that are not consumed by the Python class should be returned or not.

  • kwargs (remaining dictionary of keyword arguments, optional) — Can be used to update the configuration object (after it is loaded) and instantiate the Python class. **kwargs are passed directly to the underlying scheduler/model's __init__ method and eventually overwrite the same named arguments in config (see the sketch after the example below).

Returns

ModelMixin or SchedulerMixin

A model or scheduler object instantiated from a config dictionary.

Instantiate a Python class from a config dictionary.

Examples:

>>> from diffusers import DDPMScheduler, DDIMScheduler, PNDMScheduler

>>> # Download scheduler from boincai.com and cache.
>>> scheduler = DDPMScheduler.from_pretrained("google/ddpm-cifar10-32")

>>> # Instantiate DDIM scheduler class with same config as DDPM
>>> scheduler = DDIMScheduler.from_config(scheduler.config)

>>> # Instantiate PNDM scheduler class with same config as DDPM
>>> scheduler = PNDMScheduler.from_config(scheduler.config)
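
Continuing the example above (an illustrative sketch, not from the original docs): keyword arguments overwrite individual config values, and return_unused_kwargs surfaces anything the class did not consume. beta_end is one of DDIMScheduler's __init__ parameters.

>>> # Overwrite a single config value while instantiating the new class
>>> scheduler = DDIMScheduler.from_config(scheduler.config, beta_end=0.02)

>>> # Also return any keyword arguments the class did not consume
>>> scheduler, unused_kwargs = DDIMScheduler.from_config(scheduler.config, return_unused_kwargs=True)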

save_config

( save_directory: typing.Union[str, os.PathLike], push_to_hub: bool = False, **kwargs )

Parameters

  • save_directory (str or os.PathLike) — Directory where the configuration JSON file is saved (will be created if it does not exist).

  • push_to_hub (bool, optional, defaults to False) — Whether or not to push your model to the BOINC AI Hub after saving it. You can specify the repository you want to push to with repo_id (will default to the name of save_directory in your namespace).

  • kwargs (Dict[str, Any], optional) — Additional keyword arguments passed along to the push_to_hub() method.

Save a configuration object to the directory specified in save_directory so that it can be reloaded using the from_config() class method.
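
A short sketch (./my_scheduler_config is an arbitrary local path): save a scheduler's configuration to disk and reload it.

>>> from diffusers import DDPMScheduler

>>> scheduler = DDPMScheduler.from_pretrained("google/ddpm-cifar10-32")

>>> # Writes scheduler_config.json (the class's config_name) into the directory
>>> scheduler.save_config("./my_scheduler_config")

>>> # Reload the saved configuration and re-instantiate the scheduler from it
>>> config = DDPMScheduler.load_config("./my_scheduler_config")
>>> scheduler = DDPMScheduler.from_config(config)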

to_json_file

( json_file_path: typing.Union[str, os.PathLike] )

Parameters

  • json_file_path (str or os.PathLike) — Path to the JSON file to save a configuration instance's parameters.

Save the configuration instance's parameters to a JSON file.

to_json_string

( ) → str

Returns

str

String containing all the attributes that make up the configuration instance in JSON format.

Serializes the configuration instance to a JSON string.
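
Both serialization helpers operate on an instantiated config object; a brief sketch continuing with the scheduler from the earlier examples:

>>> # JSON string representation of the configuration
>>> config_str = scheduler.to_json_string()

>>> # Write the same content to a file
>>> scheduler.to_json_file("scheduler_config.json")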
