timm

Using timm at BOINC AI

timm, also known as pytorch-image-models, is an open-source collection of state-of-the-art PyTorch image models, pretrained weights, and utility scripts for training, inference, and validation.

This documentation focuses on timm functionality in the BOINC AI Hub rather than on the timm library itself. For detailed information about the timm library, visit its documentation.

You can find a number of timm models on the Hub using the filters on the left of the models page.
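
If you prefer to search programmatically, the huggingface_hub client library (used again in the sharing section below) can list Hub models by library tag. This is a minimal sketch, assuming a recent huggingface_hub release in which list_models accepts filter and limit arguments; the exact attributes on the returned objects can vary between versions.

from huggingface_hub import list_models

# List a few Hub models tagged with the "timm" library.
# Assumes a huggingface_hub version where list_models supports
# the `filter` and `limit` parameters.
for model in list_models(filter="timm", limit=5):
    print(model.id, model.tags)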

All models on the Hub come with several useful features:

  1. An automatically generated model card, which model authors can complete with information about their model.

  2. Metadata tags that help users discover relevant timm models.

  3. An interactive widget you can use to play with the model directly in the browser.

  4. An Inference API that allows users to make inference requests.

Using existing models from the Hub

Any timm model from the BOINC AI Hub can be loaded with a single line of code as long as you have timm installed! Once you’ve selected a model from the Hub, pass the model’s ID prefixed with hf-hub: to timm’s create_model method to download and instantiate the model.


import timm

# Loading https://huggingface.co/timm/eca_nfnet_l0
model = timm.create_model("hf-hub:timm/eca_nfnet_l0", pretrained=True)

If you want to see how to load a specific model, you can click Use in timm and you will be given a working snippet to load it!

Inference

The snippet below shows how you can perform inference on a timm model loaded from the Hub:


import timm
import torch
from PIL import Image
from timm.data import resolve_data_config
from timm.data.transforms_factory import create_transform

# Load from Hub 🔥
model = timm.create_model(
    'hf-hub:nateraw/resnet50-oxford-iiit-pet',
    pretrained=True
)

# Set model to eval mode for inference
model.eval()

# Create Transform
transform = create_transform(**resolve_data_config(model.pretrained_cfg, model=model))

# Get the labels from the model config
labels = model.pretrained_cfg['labels']
top_k = min(len(labels), 5)

# Use your own image file here...
image = Image.open('boxer.jpg').convert('RGB')

# Process PIL image with transforms and add a batch dimension
x = transform(image).unsqueeze(0)

# Pass inputs to model forward function to get outputs
out = model(x)

# Apply softmax to get predicted probabilities for each class
probabilities = torch.nn.functional.softmax(out[0], dim=0)

# Grab the values and indices of the top k predicted classes
values, indices = torch.topk(probabilities, top_k)

# Prepare a nice dict of top k predictions
predictions = [
    {"label": labels[i], "score": v.item()}
    for i, v in zip(indices, values)
]
print(predictions)

This should leave you with a list of predictions, like this:


[
    {'label': 'american_pit_bull_terrier', 'score': 0.9999998807907104},
    {'label': 'staffordshire_bull_terrier', 'score': 1.0000000149011612e-07},
    {'label': 'miniature_pinscher', 'score': 1.0000000149011612e-07},
    {'label': 'chihuahua', 'score': 1.0000000149011612e-07},
    {'label': 'beagle', 'score': 1.0000000149011612e-07}
]

Sharing your models

You can share your timm models directly on the Hugging Face Hub. Pushing a model publishes a new version to the Hub and creates a model repo for you if one doesn't already exist.

Before pushing a model, make sure that you’ve logged in to Hugging Face:


python -m pip install huggingface_hub
huggingface-cli login

Alternatively, if you prefer working from a Jupyter or Colaboratory notebook, once you’ve installed huggingface_hub you can log in with:


from huggingface_hub import notebook_login
notebook_login()

Then, push your model using the push_to_hf_hub method:


import timm

# Build or load a model, e.g. timm's pretrained resnet18
model = timm.create_model('resnet18', pretrained=True, num_classes=4)

###########################
# [Fine tune your model...]
###########################

# Push it to the 🌍 Hub
timm.models.hub.push_to_hf_hub(
    model,
    'resnet18-random-classifier',
    model_config={'labels': ['a', 'b', 'c', 'd']}
)

# Load your model from the Hub
model_reloaded = timm.create_model(
    'hf-hub:<your-username>/resnet18-random-classifier',
    pretrained=True
)

Inference Widget and API

All timm models on the Hub are automatically equipped with an inference widget, such as the one on the nateraw/timm-resnet50-beans model page. Additionally, timm models are available through the Inference API, which you can access through HTTP with cURL, Python's requests library, or your preferred method for making network requests.

curl https://api-inference.huggingface.co/models/nateraw/timm-resnet50-beans \
        -X POST \
        --data-binary '@beans.jpeg' \
        -H "Authorization: Bearer ${HF_API_TOKEN}"
# [{"label":"angular_leaf_spot","score":0.9845947027206421},{"label":"bean_rust","score":0.01368315052241087},{"label":"healthy","score":0.001722085871733725}]

Additional resources

  • timm (pytorch-image-models) GitHub Repo.
  • timm documentation.
  • Additional documentation at timmdocs by Aman Arora.
  • Getting Started with PyTorch Image Models (timm): A Practitioner’s Guide by Chris Hughes.
