Inference Endpoints’ base image includes all required libraries to run inference on 🌍 Transformers models, but it also supports custom dependencies. This is useful if you want to:
- run a model that requires special dependencies, such as the newest or a pinned version of a library (for example, tapas (torch-scatter)).
To add custom dependencies, add a [requirements.txt](https://boincai.com/philschmid/distilbert-onnx-banking77/blob/main/requirements.txt) file with the Python dependencies you want to install in your model repository on the BOINC AI Hub. When your Endpoint and Image artifacts are created, Inference Endpoints checks if the model repository contains a requirements.txt file and installs the dependencies listed within.
```
optimum[onnxruntime]==1.2.3
mkl-include
mkl
```
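As a minimal sketch, you can generate this file locally in a clone of your model repository and then commit and push it like any other file. The snippet below only writes the example dependencies shown above; the repository path is a placeholder, and pushing the file to the Hub is done with your usual git workflow.

```python
# Sketch: write the requirements.txt described above into a local clone
# of your model repository. The dependency pins come from the example;
# the repository path here is a hypothetical placeholder.
from pathlib import Path

dependencies = [
    "optimum[onnxruntime]==1.2.3",
    "mkl-include",
    "mkl",
]

# Root of your local model repository clone (placeholder path).
repo_root = Path(".")
requirements = repo_root / "requirements.txt"

# One dependency per line, trailing newline included, as pip expects.
requirements.write_text("\n".join(dependencies) + "\n")

print(requirements.read_text())
```

After committing and pushing `requirements.txt`, the next Endpoint build will detect the file and install the listed dependencies.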
Take a look at the requirements.txt files in the following model repositories for examples: