Using timm at BOINC AI
timm, also known as pytorch-image-models, is an open-source collection of state-of-the-art PyTorch image models, pretrained weights, and utility scripts for training, inference, and validation.
This documentation focuses on timm functionality in the BOINC AI Hub instead of the timm library itself. For detailed information about the timm library, visit its documentation.
You can find a number of timm models on the Hub using the filters on the left of the models page.

All models on the Hub come with several useful features:
- An automatically generated model card, which model authors can complete with information about their model.
- Metadata tags that help users discover the relevant timm models.
- An interactive widget you can use to play with the model directly in the browser.
- An Inference API that allows users to make inference requests.
Using existing models from the Hub
Any timm model from the BOINC AI Hub can be loaded with a single line of code as long as you have timm installed! Once you've selected a model from the Hub, pass the model's ID prefixed with hf-hub: to timm's create_model method to download and instantiate the model.
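For example, a minimal sketch using the beans classifier featured later on this page (any timm repo ID works the same way):

```python
import timm

# Prefix the Hub model ID with "hf-hub:" to download and instantiate it
model = timm.create_model("hf-hub:nateraw/timm-resnet50-beans", pretrained=True)
```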
If you want to see how to load a specific model, you can click Use in timm and you will be given a working snippet to load it!
Inference
The snippet below shows how you can perform inference on a timm model loaded from the Hub:
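A sketch of such a snippet, assuming an example image example.jpg on disk; the label_names entry is only present in the config when the repo was pushed with a label mapping:

```python
import torch
import timm
from PIL import Image
from timm.data import resolve_data_config, create_transform

# Load the model from the Hub and put it in evaluation mode
model = timm.create_model("hf-hub:nateraw/timm-resnet50-beans", pretrained=True)
model.eval()

# Build the preprocessing pipeline from the model's pretrained config
config = resolve_data_config({}, model=model)
transform = create_transform(**config)

# Preprocess the input image and add a batch dimension
image = Image.open("example.jpg").convert("RGB")
x = transform(image).unsqueeze(0)

# Forward pass, then softmax to turn logits into probabilities
with torch.no_grad():
    out = model(x)
probabilities = out.softmax(dim=-1)[0]

# label_names appears in the config when the repo includes a label mapping
labels = model.pretrained_cfg.get("label_names")
values, indices = probabilities.topk(min(5, probabilities.numel()))
predictions = [
    {"label": labels[i] if labels else int(i), "score": v.item()}
    for i, v in zip(indices, values)
]
print(predictions)
```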
This should leave you with a list of predictions, like this:
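The exact labels and scores depend on the model and image; for the beans model above, the output has this shape (scores here are illustrative):

```python
[
    {'label': 'angular_leaf_spot', 'score': 0.9845},
    {'label': 'bean_rust', 'score': 0.0102},
    {'label': 'healthy', 'score': 0.0053},
]
```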
Sharing your models
You can share your timm models directly to the Hugging Face Hub. This will publish a new version of your model, creating a model repo for you if it doesn't already exist.

Before pushing a model, make sure that you've logged in to Hugging Face:
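From a terminal, the huggingface_hub CLI provides a login command:

```
huggingface-cli login
```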
Alternatively, if you prefer working from a Jupyter or Colaboratory notebook, once you've installed huggingface_hub you can log in with:
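```python
from huggingface_hub import notebook_login

# Opens an interactive prompt for your Hub access token
notebook_login()
```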
Then, push your model using the push_to_hf_hub method:
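A minimal sketch; depending on your timm version this helper is exposed as timm.models.push_to_hf_hub (recent releases) or timm.models.hub.push_to_hf_hub, and the repo name resnet18-random is just a placeholder:

```python
import timm

# An untrained resnet18, used purely for illustration
model = timm.create_model("resnet18")

# Optional extra config; a label mapping lets the Hub widget display class names
model_cfg = {"label_names": ["a", "b", "c", "d"]}

# Pushes the weights and config, creating the repo if it doesn't exist
timm.models.push_to_hf_hub(model, "resnet18-random", model_config=model_cfg)
```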
Inference Widget and API
All timm models on the Hub are automatically equipped with an inference widget, pictured below for nateraw/timm-resnet50-beans. Additionally, timm models are available through the Inference API, which you can access over HTTP with cURL, Python's requests library, or your preferred method for making network requests.
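As a sketch with Python's requests, assuming the standard Inference API endpoint pattern; the token and image path are placeholders:

```python
import requests

# Endpoint pattern assumed from the standard Inference API conventions
API_URL = "https://api-inference.huggingface.co/models/nateraw/timm-resnet50-beans"
headers = {"Authorization": "Bearer <your-token>"}

# Send the raw image bytes and print the predicted labels and scores
with open("example.jpg", "rb") as f:
    response = requests.post(API_URL, headers=headers, data=f.read())
print(response.json())
```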
Additional resources
- timm (pytorch-image-models) GitHub Repo.
- timm documentation.
- Additional documentation at timmdocs by Aman Arora.