Adapter Transformers


Using Adapter Transformers at BOINC AI

adapter-transformers is a library that extends 🌍 transformers by allowing you to integrate, train, and use Adapters and other efficient fine-tuning methods. The library is fully compatible with 🌍 transformers. Adapters are small learnt layers inserted within each layer of a pre-trained model. You can learn more about this in the original paper.
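To make the idea of "small learnt layers" concrete, here is a minimal NumPy sketch of a bottleneck adapter with a residual connection. The dimensions, ReLU non-linearity, and zero initialisation of the up-projection are illustrative assumptions, not the library's exact implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

hidden_size = 768  # width of the pre-trained transformer layer
bottleneck = 64    # much smaller adapter dimension

# A bottleneck adapter is just a down-projection, a non-linearity,
# and an up-projection, wrapped in a residual connection.
W_down = rng.normal(scale=0.02, size=(hidden_size, bottleneck))
W_up = np.zeros((bottleneck, hidden_size))  # zero init: adapter starts as identity

def adapter(x):
    h = np.maximum(x @ W_down, 0.0)  # ReLU
    return x + h @ W_up              # residual around the adapter

x = rng.normal(size=(4, hidden_size))  # a batch of 4 token representations
out = adapter(x)
print(out.shape)            # (4, 768)
print(np.allclose(out, x))  # True: with a zero-initialised up-projection,
                            # the adapter leaves the pre-trained model unchanged
```

Only `W_down` and `W_up` would be trained; the surrounding pre-trained weights stay frozen, which is what makes this fine-tuning method efficient.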

Exploring adapter-transformers in the Hub

You can find over a hundred adapter-transformers models by filtering at the left of the models page. Some adapter models can be found in the Adapter Hub repository. Models from both sources are then aggregated in the AdapterHub.

Using existing models

For a full guide on loading pre-trained adapters, we recommend checking out the official guide.

As a brief summary, once you load a model with the usual *Model classes from 🌍 transformers, you can use the load_adapter method to load and activate the Adapter (remember, adapter-transformers extends 🌍 transformers).


from transformers import AutoModelWithHeads

# Load the base model, then load and activate a pre-trained adapter from the Hub
model = AutoModelWithHeads.from_pretrained("bert-base-uncased")
adapter_name = model.load_adapter("AdapterHub/bert-base-uncased-pf-imdb", source="hf")
model.active_adapters = adapter_name

You can also use list_adapters to find all Adapter Models programmatically:


from transformers import list_adapters

# source can be "ah" (AdapterHub), "hf" (hf.co) or None (for both, default)
adapter_infos = list_adapters(source="hf", model_name="bert-base-uncased")

If you want to see how to load a specific model, you can click Use in Adapter Transformers and you will be given a working snippet that loads it!

Sharing your models

You can share your Adapter by using the push_adapter_to_hub method from a model that already contains an adapter.


model.push_adapter_to_hub(
    "my-awesome-adapter",             # name of the repository to create
    "awesome_adapter",                # name of the adapter within the model
    adapterhub_tag="sentiment/imdb",  # task/subtask tag on AdapterHub
    datasets_tag="imdb"               # dataset the adapter was trained on
)

This command creates a repository with an automatically generated model card and all necessary metadata.
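Part of what makes sharing adapters attractive is that the uploaded weights are tiny compared with the base model. A rough back-of-the-envelope calculation, assuming BERT-base-like dimensions (hidden size 768, 12 layers) and two bottleneck-64 adapters per layer; all numbers here are illustrative:

```python
hidden, layers, bottleneck = 768, 12, 64

# Each bottleneck adapter holds a down-projection and an up-projection, plus biases.
per_adapter = (hidden * bottleneck + bottleneck) + (bottleneck * hidden + hidden)
adapters_per_layer = 2  # e.g. one after attention, one after the feed-forward block
adapter_params = per_adapter * adapters_per_layer * layers

base_params = 110_000_000  # BERT-base, roughly
print(f"adapter params: {adapter_params:,}")                      # ~2.4M
print(f"fraction of base model: {adapter_params / base_params:.1%}")
```

So a shared adapter repository holds on the order of a few percent of the base model's parameters, which keeps uploads and downloads fast.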

Additional resources

For a full guide on sharing models with adapter-transformers, we recommend checking out the official guide.

  • Adapter Transformers library
  • Adapter Transformers docs
  • Integration with Hub docs