PEFT model

Models

PeftModel is the base model class for specifying the base Transformer model and configuration to apply a PEFT method to. The base PeftModel contains methods for loading and saving models from the Hub, and supports the PromptEncoder for prompt learning.

PeftModel

class peft.PeftModel

<source>

( model: PreTrainedModel, peft_config: PeftConfig, adapter_name: str = 'default' )

Parameters

  • model (PreTrainedModel) β€” The base transformer model used for Peft.

  • peft_config (PeftConfig) β€” The configuration of the Peft model.

  • adapter_name (str) β€” The name of the adapter, defaults to "default".

Base model encompassing various Peft methods.

Attributes:

  • base_model (PreTrainedModel) β€” The base transformer model used for Peft.

  • peft_config (PeftConfig) β€” The configuration of the Peft model.

  • modules_to_save (list of str) β€” The list of sub-module names to save when saving the model.

  • prompt_encoder (PromptEncoder) β€” The prompt encoder used for Peft if using PromptLearningConfig.

  • prompt_tokens (torch.Tensor) β€” The virtual prompt tokens used for Peft if using PromptLearningConfig.

  • transformer_backbone_name (str) β€” The name of the transformer backbone in the base model if using PromptLearningConfig.

  • word_embeddings (torch.nn.Embedding) β€” The word embeddings of the transformer backbone in the base model if using PromptLearningConfig.

create_or_update_model_card

<source>

( output_dir: str )

Updates or creates the model card to include information about PEFT:

  1. Adds peft library tag

  2. Adds peft version

  3. Adds base model info

  4. Adds quantization information if it was used

disable_adapter

<source>

( )

Disables the adapter module.

forward

<source>

( *args: Any, **kwargs: Any )

Forward pass of the model.

from_pretrained

<source>

( model: PreTrainedModel, model_id: Union[str, os.PathLike], adapter_name: str = 'default', is_trainable: bool = False, config: Optional[PeftConfig] = None, **kwargs: Any )

Parameters

  • model (PreTrainedModel) β€” The model to be adapted. The model should be initialized with the from_pretrained method from the 🤗 Transformers library.

  • model_id (str or os.PathLike) β€” The name of the PEFT configuration to use. Can be either:

    • A string, the model id of a PEFT configuration hosted inside a model repo on the Hugging Face Hub.

    • A path to a directory containing a PEFT configuration file saved using the save_pretrained method (./my_peft_config_directory/).

  • adapter_name (str, optional, defaults to "default") β€” The name of the adapter to be loaded. This is useful for loading multiple adapters.

  • is_trainable (bool, optional, defaults to False) β€” Whether the adapter should be trainable or not. If False, the adapter will be frozen and used for inference.

  • config (PeftConfig, optional) β€” The configuration object to use instead of an automatically loaded configuration. This configuration object is mutually exclusive with model_id and kwargs. This is useful when the configuration is already loaded before calling from_pretrained.

  • kwargs (optional) β€” Additional keyword arguments passed along to the specific PEFT configuration class.

Instantiate a PEFT model from a pretrained model and loaded PEFT weights.

Note that the passed model may be modified in place.

get_base_model

<source>

( )

Returns the base model.

get_nb_trainable_parameters

<source>

( )

Returns the number of trainable parameters and number of all parameters in the model.

get_prompt

<source>

( batch_size: int, task_ids: Optional[torch.Tensor] = None )

Returns the virtual prompts to use for Peft. Only applicable when peft_config.peft_type != PeftType.LORA.

get_prompt_embedding_to_save

<source>

( adapter_name: str )

Returns the prompt embedding to save when saving the model. Only applicable when peft_config.peft_type != PeftType.LORA.

print_trainable_parameters

<source>

( )

Prints the number of trainable parameters in the model.

save_pretrained

<source>

( save_directory: str, safe_serialization: bool = False, selected_adapters: Optional[List[str]] = None, **kwargs: Any )

Parameters

  • save_directory (str) β€” Directory where the adapter model and configuration files will be saved (will be created if it does not exist).

  • safe_serialization (bool, optional, defaults to False) β€” Whether to save the adapter weights in safetensors format.

  • selected_adapters (list of str, optional) β€” The names of the adapters to save. If None, all adapters will be saved.

  • kwargs (additional keyword arguments, optional) β€” Additional keyword arguments passed along to the push_to_hub method.

This function saves the adapter model and the adapter configuration files to a directory, so that it can be reloaded using the PeftModel.from_pretrained() class method, and also used by the PeftModel.push_to_hub() method.

set_adapter

<source>

( adapter_name: str )

Sets the active adapter.

PeftModelForSequenceClassification

A PeftModel for sequence classification tasks.

class peft.PeftModelForSequenceClassification

<source>

( model, peft_config: PeftConfig, adapter_name = 'default' )

Parameters

  • model (PreTrainedModel) β€” The base transformer model used for Peft.

  • peft_config (PeftConfig) β€” The configuration of the Peft model.

  • adapter_name (str) β€” The name of the adapter, defaults to "default".

Peft model for sequence classification tasks.

Attributes:

  • config (PretrainedConfig) β€” The configuration object of the base model.

  • cls_layer_name (str) β€” The name of the classification layer.

Example:


PeftModelForTokenClassification

A PeftModel for token classification tasks.

class peft.PeftModelForTokenClassification

<source>

( model, peft_config: PeftConfig = None, adapter_name = 'default' )

Parameters

  • model (PreTrainedModel) β€” The base transformer model used for Peft.

  • peft_config (PeftConfig) β€” The configuration of the Peft model.

  • adapter_name (str) β€” The name of the adapter, defaults to "default".

Peft model for token classification tasks.

Attributes:

  • config (PretrainedConfig) β€” The configuration object of the base model.

  • cls_layer_name (str) β€” The name of the classification layer.

Example:


PeftModelForCausalLM

A PeftModel for causal language modeling.

class peft.PeftModelForCausalLM

<source>

( model, peft_config: PeftConfig, adapter_name = 'default' )

Parameters

  • model (PreTrainedModel) β€” The base transformer model used for Peft.

  • peft_config (PeftConfig) β€” The configuration of the Peft model.

  • adapter_name (str) β€” The name of the adapter, defaults to "default".

Peft model for causal language modeling.

Example:


PeftModelForSeq2SeqLM

A PeftModel for sequence-to-sequence language modeling.

class peft.PeftModelForSeq2SeqLM

<source>

( model, peft_config: PeftConfig, adapter_name = 'default' )

Parameters

  • model (PreTrainedModel) β€” The base transformer model used for Peft.

  • peft_config (PeftConfig) β€” The configuration of the Peft model.

  • adapter_name (str) β€” The name of the adapter, defaults to "default".

Peft model for sequence-to-sequence language modeling.

Example:


PeftModelForQuestionAnswering

A PeftModel for question answering.

class peft.PeftModelForQuestionAnswering

<source>

( model, peft_config: PeftConfig = None, adapter_name = 'default' )

Parameters

  • model (PreTrainedModel) β€” The base transformer model used for Peft.

  • peft_config (PeftConfig) β€” The configuration of the Peft model.

  • adapter_name (str) β€” The name of the adapter, defaults to "default".

Peft model for extractive question answering.

Attributes:

  • config (PretrainedConfig) β€” The configuration object of the base model.

  • cls_layer_name (str) β€” The name of the classification layer.

Example:


PeftModelForFeatureExtraction

A PeftModel for extracting features/embeddings from transformer models.

class peft.PeftModelForFeatureExtraction

<source>

( model, peft_config: PeftConfig = None, adapter_name = 'default' )

Parameters

  • model (PreTrainedModel) β€” The base transformer model used for Peft.

  • peft_config (PeftConfig) β€” The configuration of the Peft model.

  • adapter_name (str) β€” The name of the adapter, defaults to "default".

Peft model for extracting features/embeddings from transformer models.


Example:

