PEFT model
Models
PeftModel is the base model class for specifying the base Transformer model and configuration to apply a PEFT method to. The base PeftModel
contains methods for loading and saving models from the Hub, and supports the PromptEncoder for prompt learning.
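A PeftModel is typically created by wrapping a base Transformers model with get_peft_model(). A minimal sketch, tying in the PromptEncoder mentioned above via a p-tuning config (the checkpoint name and hyperparameters are illustrative, not prescribed by the library):

```python
from transformers import AutoModelForSequenceClassification
from peft import PromptEncoderConfig, TaskType, get_peft_model

# Load a base Transformers model (checkpoint name is illustrative).
base_model = AutoModelForSequenceClassification.from_pretrained("bert-base-cased", num_labels=2)

# A prompt learning configuration (p-tuning) that uses a PromptEncoder.
peft_config = PromptEncoderConfig(task_type=TaskType.SEQ_CLS, num_virtual_tokens=20, encoder_hidden_size=128)

# get_peft_model wraps the base model in the task-specific PeftModel subclass.
peft_model = get_peft_model(base_model, peft_config)
peft_model.print_trainable_parameters()
```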
PeftModel
class peft.PeftModel
( model: PreTrainedModel, peft_config: PeftConfig, adapter_name: str = 'default' )
Parameters
model (PreTrainedModel) — The base transformer model used for Peft.
peft_config (PeftConfig) — The configuration of the Peft model.
adapter_name (str) — The name of the adapter, defaults to "default".
Base model encompassing various Peft methods.
Attributes:
base_model (PreTrainedModel) — The base transformer model used for Peft.
peft_config (PeftConfig) — The configuration of the Peft model.
modules_to_save (list of str) — The list of sub-module names to save when saving the model.
prompt_encoder (PromptEncoder) — The prompt encoder used for Peft if using PromptLearningConfig.
prompt_tokens (torch.Tensor) — The virtual prompt tokens used for Peft if using PromptLearningConfig.
transformer_backbone_name (str) — The name of the transformer backbone in the base model if using PromptLearningConfig.
word_embeddings (torch.nn.Embedding) — The word embeddings of the transformer backbone in the base model if using PromptLearningConfig.
create_or_update_model_card
( output_dir: str )
Updates or creates the model card to include information about PEFT:
Adds the peft library tag
Adds the peft version
Adds base model info
Adds quantization information if it was used
disable_adapter
( )
Disables the adapter module.
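A minimal sketch, assuming disable_adapter() is used as a context manager as in recent PEFT releases (the checkpoint name is illustrative):

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer
from peft import LoraConfig, TaskType, get_peft_model

tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
model = AutoModelForSequenceClassification.from_pretrained("bert-base-cased")
peft_model = get_peft_model(model, LoraConfig(task_type=TaskType.SEQ_CLS))

inputs = tokenizer("Adapters can be switched off temporarily.", return_tensors="pt")

# Inside the context manager, only the frozen base model weights are used.
with peft_model.disable_adapter():
    base_only_outputs = peft_model(**inputs)
```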
forward
( *args: Any, **kwargs: Any )
Forward pass of the model.
from_pretrained
( model: PreTrainedModel, model_id: Union[str, os.PathLike], adapter_name: str = 'default', is_trainable: bool = False, config: Optional[PeftConfig] = None, **kwargs: Any )
Parameters
model (PreTrainedModel) — The model to be adapted. The model should be initialized with the from_pretrained method from the 🌍 Transformers library.
model_id (str or os.PathLike) — The name of the PEFT configuration to use. Can be either:
A string, the model id of a PEFT configuration hosted inside a model repo on the BOINC AI Hub.
A path to a directory containing a PEFT configuration file saved using the save_pretrained method (./my_peft_config_directory/).
adapter_name (str, optional, defaults to "default") — The name of the adapter to be loaded. This is useful for loading multiple adapters.
is_trainable (bool, optional, defaults to False) — Whether the adapter should be trainable or not. If False, the adapter will be frozen and used for inference.
config (PeftConfig, optional) — The configuration object to use instead of an automatically loaded configuration. This configuration object is mutually exclusive with model_id and kwargs. This is useful when the configuration is already loaded before calling from_pretrained.
kwargs (optional) — Additional keyword arguments passed along to the specific PEFT configuration class.
Instantiate a PEFT model from a pretrained model and loaded PEFT weights.
Note that the passed model may be modified in place.
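A minimal sketch of loading pretrained PEFT weights on top of a base model; the base checkpoint and adapter id are placeholders:

```python
from transformers import AutoModelForCausalLM
from peft import PeftModel

# Load the base model first, then attach the adapter weights to it.
base_model = AutoModelForCausalLM.from_pretrained("gpt2")
peft_model = PeftModel.from_pretrained(base_model, "my-user/my-lora-adapter")

# Pass is_trainable=True instead if the loaded adapter should be fine-tuned further;
# by default it is frozen and ready for inference.
```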
get_base_model
( )
Returns the base model.
get_nb_trainable_parameters
( )
Returns the number of trainable parameters and number of all parameters in the model.
get_prompt
( batch_size: inttask_ids: Optional[torch.Tensor] = None )
Returns the virtual prompts to use for Peft. Only applicable when peft_config.peft_type != PeftType.LORA.
get_prompt_embedding_to_save
( adapter_name: str )
Returns the prompt embedding to save when saving the model. Only applicable when peft_config.peft_type != PeftType.LORA.
print_trainable_parameters
( )
Prints the number of trainable parameters in the model.
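For example, on a LoRA-wrapped model (the checkpoint name and the printed figures are illustrative, not actual output):

```python
from transformers import AutoModelForSequenceClassification
from peft import LoraConfig, TaskType, get_peft_model

model = AutoModelForSequenceClassification.from_pretrained("bert-base-cased")
peft_model = get_peft_model(model, LoraConfig(task_type=TaskType.SEQ_CLS))

# Prints a one-line summary along the lines of:
# trainable params: 294,912 || all params: 108,606,724 || trainable%: 0.2715
peft_model.print_trainable_parameters()
```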
save_pretrained
( save_directory: str, safe_serialization: bool = False, selected_adapters: Optional[List[str]] = None, **kwargs: Any )
Parameters
save_directory (str) — Directory where the adapter model and configuration files will be saved (will be created if it does not exist).
kwargs (additional keyword arguments, optional) — Additional keyword arguments passed along to the push_to_hub method.
This function saves the adapter model and the adapter configuration files to a directory, so that it can be reloaded using the PeftModel.from_pretrained() class method, and also used by the PeftModel.push_to_hub() method.
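A minimal sketch of the save/reload round trip; the checkpoint name and directory are placeholders:

```python
from transformers import AutoModelForCausalLM
from peft import LoraConfig, TaskType, PeftModel, get_peft_model

base_model = AutoModelForCausalLM.from_pretrained("gpt2")
peft_model = get_peft_model(base_model, LoraConfig(task_type=TaskType.CAUSAL_LM))

# Writes only the adapter weights and the adapter configuration, not the base model.
peft_model.save_pretrained("./my_peft_adapter")

# Reload the adapter on top of a freshly loaded base model.
reloaded = PeftModel.from_pretrained(AutoModelForCausalLM.from_pretrained("gpt2"), "./my_peft_adapter")
```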
set_adapter
( adapter_name: str )
Sets the active adapter.
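A minimal sketch, assuming a second adapter is first loaded with load_adapter(); the base checkpoint and adapter ids are placeholders:

```python
from transformers import AutoModelForCausalLM
from peft import PeftModel

base_model = AutoModelForCausalLM.from_pretrained("gpt2")
peft_model = PeftModel.from_pretrained(base_model, "my-user/lora-adapter-a", adapter_name="adapter_a")

# Load a second adapter alongside the first, then switch between them.
peft_model.load_adapter("my-user/lora-adapter-b", adapter_name="adapter_b")
peft_model.set_adapter("adapter_b")  # "adapter_b" is now used in forward passes
peft_model.set_adapter("adapter_a")  # switch back to the first adapter
```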
PeftModelForSequenceClassification
A PeftModel for sequence classification tasks.
class peft.PeftModelForSequenceClassification
( model, peft_config: PeftConfig, adapter_name = 'default' )
Parameters
model (PreTrainedModel) — Base transformer model.
peft_config (PeftConfig) — Peft config.
Peft model for sequence classification tasks.
Attributes:
config (PretrainedConfig) — The configuration object of the base model.
cls_layer_name (str) — The name of the classification layer.
Example:
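A minimal usage sketch; the checkpoint name and LoRA hyperparameters are illustrative:

```python
from transformers import AutoModelForSequenceClassification
from peft import LoraConfig, TaskType, get_peft_model

model = AutoModelForSequenceClassification.from_pretrained("bert-base-cased", num_labels=2)
peft_config = LoraConfig(task_type=TaskType.SEQ_CLS, r=8, lora_alpha=32, lora_dropout=0.1)

# Returns a PeftModelForSequenceClassification wrapping the base model.
peft_model = get_peft_model(model, peft_config)
peft_model.print_trainable_parameters()
```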
PeftModelForTokenClassification
A PeftModel for token classification tasks.
class peft.PeftModelForTokenClassification
( model, peft_config: PeftConfig = None, adapter_name = 'default' )
Parameters
model (PreTrainedModel) — Base transformer model.
peft_config (PeftConfig) — Peft config.
Peft model for token classification tasks.
Attributes:
config (PretrainedConfig) — The configuration object of the base model.
cls_layer_name (str) — The name of the classification layer.
Example:
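A minimal usage sketch; the checkpoint name, label count, and LoRA hyperparameters are illustrative:

```python
from transformers import AutoModelForTokenClassification
from peft import LoraConfig, TaskType, get_peft_model

model = AutoModelForTokenClassification.from_pretrained("bert-base-cased", num_labels=9)
peft_config = LoraConfig(task_type=TaskType.TOKEN_CLS, r=8, lora_alpha=32, lora_dropout=0.1)

# Returns a PeftModelForTokenClassification wrapping the base model.
peft_model = get_peft_model(model, peft_config)
peft_model.print_trainable_parameters()
```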
PeftModelForCausalLM
A PeftModel for causal language modeling.
class peft.PeftModelForCausalLM
( model, peft_config: PeftConfig, adapter_name = 'default' )
Parameters
model (PreTrainedModel) — Base transformer model.
peft_config (PeftConfig) — Peft config.
Peft model for causal language modeling.
Example:
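A minimal usage sketch; the checkpoint name and LoRA hyperparameters are illustrative:

```python
from transformers import AutoModelForCausalLM
from peft import LoraConfig, TaskType, get_peft_model

model = AutoModelForCausalLM.from_pretrained("gpt2")
peft_config = LoraConfig(task_type=TaskType.CAUSAL_LM, r=8, lora_alpha=32, lora_dropout=0.1)

# Returns a PeftModelForCausalLM wrapping the base model.
peft_model = get_peft_model(model, peft_config)
peft_model.print_trainable_parameters()
```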
PeftModelForSeq2SeqLM
A PeftModel for sequence-to-sequence language modeling.
class peft.PeftModelForSeq2SeqLM
( model, peft_config: PeftConfig, adapter_name = 'default' )
Parameters
model (PreTrainedModel) — Base transformer model.
peft_config (PeftConfig) — Peft config.
Peft model for sequence-to-sequence language modeling.
Example:
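A minimal usage sketch; the checkpoint name and LoRA hyperparameters are illustrative:

```python
from transformers import AutoModelForSeq2SeqLM
from peft import LoraConfig, TaskType, get_peft_model

model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")
peft_config = LoraConfig(task_type=TaskType.SEQ_2_SEQ_LM, r=8, lora_alpha=32, lora_dropout=0.1)

# Returns a PeftModelForSeq2SeqLM wrapping the base model.
peft_model = get_peft_model(model, peft_config)
peft_model.print_trainable_parameters()
```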
PeftModelForQuestionAnswering
A PeftModel for question answering.
class peft.PeftModelForQuestionAnswering
( model, peft_config: PeftConfig = None, adapter_name = 'default' )
Parameters
model (PreTrainedModel) — Base transformer model.
peft_config (PeftConfig) — Peft config.
Peft model for extractive question answering.
Attributes:
config (PretrainedConfig) — The configuration object of the base model.
cls_layer_name (str) — The name of the classification layer.
Example:
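A minimal usage sketch; the checkpoint name and LoRA hyperparameters are illustrative:

```python
from transformers import AutoModelForQuestionAnswering
from peft import LoraConfig, TaskType, get_peft_model

model = AutoModelForQuestionAnswering.from_pretrained("bert-base-cased")
peft_config = LoraConfig(task_type=TaskType.QUESTION_ANS, r=8, lora_alpha=32, lora_dropout=0.1)

# Returns a PeftModelForQuestionAnswering wrapping the base model.
peft_model = get_peft_model(model, peft_config)
peft_model.print_trainable_parameters()
```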
PeftModelForFeatureExtraction
A PeftModel for extracting features/embeddings from transformer models.
class peft.PeftModelForFeatureExtraction
( model, peft_config: PeftConfig = None, adapter_name = 'default' )
Parameters
model (PreTrainedModel) — Base transformer model.
peft_config (PeftConfig) — Peft config.
Peft model for extracting features/embeddings from transformer models.
Attributes:
config (PretrainedConfig) — The configuration object of the base model.
Example:
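A minimal usage sketch; the checkpoint name and LoRA hyperparameters are illustrative:

```python
from transformers import AutoModel
from peft import LoraConfig, TaskType, get_peft_model

model = AutoModel.from_pretrained("bert-base-cased")
peft_config = LoraConfig(task_type=TaskType.FEATURE_EXTRACTION, r=8, lora_alpha=32, lora_dropout=0.1)

# Returns a PeftModelForFeatureExtraction wrapping the base model.
peft_model = get_peft_model(model, peft_config)
peft_model.print_trainable_parameters()
```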