Convert Transformers models to use BetterTransformer
You can easily use the BetterTransformer integration with Optimum. First, install the dependencies as follows:
Also, make sure to install the latest version of PyTorch by following the guidelines on the official PyTorch installation page. Note that the BetterTransformer API is only compatible with torch>=1.13, so make sure that version is installed in your environment before starting. If you want to benefit from the scaled_dot_product_attention function (for decoder-based models), use at least torch>=2.0.
First, load your model using the Transformers library. Make sure to download one of the models supported by the BetterTransformer API:
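A minimal loading example might look like the following; the checkpoint name `bert-base-uncased` is only an illustrative choice, any BetterTransformer-supported model works here:

```python
from transformers import AutoModel

# Example checkpoint - replace with any BetterTransformer-supported model.
model_id = "bert-base-uncased"
model = AutoModel.from_pretrained(model_id)
```
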
You can also load the model directly on your GPU devices using the `accelerate` library, so you can optionally try the following:
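Assuming `accelerate` is installed, this could look like the sketch below (again with `bert-base-uncased` as a placeholder checkpoint):

```python
from transformers import AutoModel

# Requires the `accelerate` package; device_map="auto" dispatches the
# weights across the available GPUs (falling back to CPU if needed).
model = AutoModel.from_pretrained("bert-base-uncased", device_map="auto")
```
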
If you did not use device_map="auto" to load your model (or if your model does not support device_map="auto"), you can manually move your model to a GPU:
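A sketch of the manual move, guarded so it also runs on CPU-only machines (checkpoint name is again a placeholder):

```python
import torch
from transformers import AutoModel

model = AutoModel.from_pretrained("bert-base-uncased")  # example checkpoint

# Move the model to the first GPU if one is available.
if torch.cuda.is_available():
    model = model.to("cuda:0")
```
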
Now it is time to convert your model using the BetterTransformer API! You can run the commands below:
By default, BetterTransformer.transform overwrites your model, which means your previous native model can no longer be used. If you want to keep it for some reason, just pass the flag keep_original_model=True!
If your model does not support the BetterTransformer API, an error will be raised. Note also that decoder-based models (OPT, BLOOM, etc.) are not supported yet, but this is on the PyTorch roadmap.
If you want to run a pipeline on a GPU device, run:
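A sketch of a GPU pipeline, guarded so it falls back to CPU when no GPU is present (the task and checkpoint are illustrative choices):

```python
import torch
from transformers import pipeline

# device=0 places the pipeline on the first GPU; -1 keeps it on CPU.
device = 0 if torch.cuda.is_available() else -1
pipe = pipeline("fill-mask", model="distilbert-base-uncased", device=device)
out = pipe("I am a student at [MASK] University.")
```
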
You can also use transformers.pipeline
as usual and pass the converted model directly:
You can now benefit from the BetterTransformer API in your training scripts. Just make sure to convert your model back to its original version by calling BetterTransformer.reverse before saving it. The code snippet below shows how:
The Optimum pipeline function is also compatible with this integration, and you can use BetterTransformer as an accelerator for your pipelines. The code snippet below shows how:
Please refer to the official BetterTransformer documentation for further usage. If you run into any issue, do not hesitate to open an issue on GitHub!