Export to TFLite
TensorFlow Lite is a lightweight framework for deploying machine learning models on resource-constrained devices, such as mobile phones, embedded systems, and Internet of Things (IoT) devices. TFLite is designed to optimize and run models efficiently on devices with limited compute, memory, and power. A TensorFlow Lite model is represented in an efficient, portable format identified by the .tflite file extension.
🤗 Optimum offers functionality to export 🤗 Transformers models to TFLite through the exporters.tflite module. For the list of supported model architectures, please refer to the 🤗 Optimum documentation.
To export a model to TFLite, install the required dependencies:
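```bash
# Install Optimum together with the TensorFlow-based exporter dependencies
pip install optimum[exporters-tf]
```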
To check out all available arguments, refer to the 🤗 Optimum docs, or view the help on the command line:
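```bash
optimum-cli export tflite --help
```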
To export a model's checkpoint from the 🤗 Hub, for example, bert-base-uncased, run the following command:
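```bash
# TFLite requires static input shapes, so the sequence length is fixed at export time
optimum-cli export tflite --model bert-base-uncased --sequence_length 128 bert_tflite/
```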
You should see logs indicating the progress of the export and showing where the resulting model.tflite is saved. The exact output varies with the model and the Optimum version, but it looks something like this:
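```
Validating TFLite model...
	-[✓] TFLite model output names match reference model (logits)
	- Validating TFLite Model output "logits":
		-[✓] (1, 128, 30522) matches (1, 128, 30522)
		-[x] values not close enough, max diff: 5.817413330078125e-05 (atol: 1e-05)
The TensorFlow Lite export succeeded with the warning: The maximum absolute difference between the output of the reference model and the TFLite exported model is not within the set tolerance 1e-05:
- logits: max diff = 5.817413330078125e-05.
 The exported model was saved at: bert_tflite
```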
The example above illustrates exporting a checkpoint from the 🤗 Hub. When exporting a local model, first make sure that you saved both the model's weights and tokenizer files in the same directory (local_path). When using the CLI, pass the local_path to the model argument instead of the checkpoint name on the 🤗 Hub.
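For instance, assuming the weights and tokenizer were saved with save_pretrained() into a directory named local_path (a placeholder for your own path), the export command would look like:

```bash
# Point --model at the local directory instead of a Hub checkpoint name
optimum-cli export tflite --model local_path --sequence_length 128 bert_tflite/
```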