Notebooks with examples

🌍 Transformers Notebooks

Here you can find a list of the official notebooks provided by BOINC AI.

We would also like to list interesting content created by the community. If you have written a notebook leveraging 🌍 Transformers and would like it listed here, please open a Pull Request so it can be included under the Community notebooks.

BOINC AI’s notebooks 🌍

Documentation notebooks

You can open any page of the documentation as a notebook in Colab (there is a button directly on those pages), but they are also listed here if you need them:

| Description | Colab | AWS Studio |
| :--- | :--- | :--- |
| A presentation of the various APIs in Transformers | Open in Colab | Open in AWS Studio |
| How to run the models of the Transformers library task by task | Open in Colab | Open in AWS Studio |
| How to use a tokenizer to preprocess your data | Open in Colab | Open in AWS Studio |
| How to use the Trainer to fine-tune a pretrained model | Open in Colab | Open in AWS Studio |
| The differences between the tokenizer algorithms | Open in Colab | Open in AWS Studio |
| How to use the multilingual models of the library | Open in Colab | Open in AWS Studio |
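
As a quick taste of the APIs these notebooks walk through, here is a minimal sketch of tokenizer preprocessing and the `pipeline` API; the checkpoint name and example strings are illustrative, not the ones used in the notebooks.

```python
# Minimal sketch of the APIs covered by the documentation notebooks.
# "bert-base-uncased" is an illustrative checkpoint, not a requirement.
from transformers import AutoTokenizer, pipeline

# Preprocess text with a tokenizer (see the preprocessing notebook).
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
batch = tokenizer(
    ["Hello world!", "Notebooks are handy."],
    padding=True,
    return_tensors="pt",  # assumes PyTorch is installed
)
print(batch["input_ids"].shape)

# Run a task end to end with a pipeline (see the task-by-task notebook).
classifier = pipeline("sentiment-analysis")
print(classifier("Transformers notebooks make learning easy."))
```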

PyTorch Examples

Natural Language Processing

| Description | Colab | AWS Studio |
| :--- | :--- | :--- |
| How to train and use your very own tokenizer | Open in Colab | Open in AWS Studio |
| How to easily start using transformers | Open in Colab | Open in AWS Studio |
| Show how to preprocess the data and fine-tune a pretrained model on any GLUE task. | Open in Colab | Open in AWS Studio |
| Show how to preprocess the data and fine-tune a pretrained model on a causal or masked LM task. | Open in Colab | Open in AWS Studio |
| Show how to preprocess the data and fine-tune a pretrained model on a token classification task (NER, PoS). | Open in Colab | Open in AWS Studio |
| Show how to preprocess the data and fine-tune a pretrained model on SQuAD. | Open in Colab | Open in AWS Studio |
| Show how to preprocess the data and fine-tune a pretrained model on SWAG. | Open in Colab | Open in AWS Studio |
| Show how to preprocess the data and fine-tune a pretrained model on WMT. | Open in Colab | Open in AWS Studio |
| Show how to preprocess the data and fine-tune a pretrained model on XSUM. | Open in Colab | Open in AWS Studio |
| Highlight all the steps to effectively train a Transformer model on custom data | Open in Colab | Open in AWS Studio |
| How to use different decoding methods for language generation with transformers | Open in Colab | Open in AWS Studio |
| How to guide language generation with user-provided constraints | Open in Colab | Open in AWS Studio |
| How Reformer pushes the limits of language modeling | Open in Colab | Open in AWS Studio |
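
The fine-tuning notebooks above all follow the same recipe. Here is a condensed sketch of that recipe for a GLUE-style text classification task with the Trainer; the dataset and checkpoint names are examples, and hyperparameters are placeholders rather than tuned values.

```python
# Condensed sketch of the Trainer fine-tuning recipe used across the
# NLP notebooks; dataset/checkpoint names are illustrative.
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

raw = load_dataset("glue", "mrpc")
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

def tokenize(examples):
    # MRPC is a sentence-pair task; both sentences go to the tokenizer.
    return tokenizer(examples["sentence1"], examples["sentence2"], truncation=True)

tokenized = raw.map(tokenize, batched=True)
model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

args = TrainingArguments(
    output_dir="mrpc-finetuned",          # placeholder output directory
    per_device_train_batch_size=16,
    num_train_epochs=3,
)
trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["validation"],
    tokenizer=tokenizer,  # used for dynamic padding of batches
)
trainer.train()
```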

Computer Vision

| Description | Colab | AWS Studio |
| :--- | :--- | :--- |
| Show how to preprocess the data using Torchvision and fine-tune any pretrained Vision model on Image Classification | Open in Colab | Open in AWS Studio |
| Show how to preprocess the data using Albumentations and fine-tune any pretrained Vision model on Image Classification | Open in Colab | Open in AWS Studio |
| Show how to preprocess the data using Kornia and fine-tune any pretrained Vision model on Image Classification | Open in Colab | Open in AWS Studio |
| Show how to perform zero-shot object detection on images with text queries | Open in Colab | Open in AWS Studio |
| Show how to fine-tune BLIP for image captioning on a custom dataset | Open in Colab | Open in AWS Studio |
| Show how to build an image similarity system | Open in Colab | Open in AWS Studio |
| Show how to preprocess the data and fine-tune a pretrained SegFormer model on Semantic Segmentation | Open in Colab | Open in AWS Studio |
| Show how to preprocess the data and fine-tune a pretrained VideoMAE model on Video Classification | Open in Colab | Open in AWS Studio |
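
As a short illustration of one of the tasks above, here is a sketch of zero-shot object detection with text queries through the `pipeline` API; the OWL-ViT checkpoint, image path, and labels are placeholder choices, not what the notebook prescribes.

```python
# Sketch of zero-shot object detection with free-form text queries.
# Checkpoint, image path, and labels are illustrative placeholders.
from PIL import Image
from transformers import pipeline

detector = pipeline("zero-shot-object-detection", model="google/owlvit-base-patch32")
image = Image.open("cats.png")  # path to any local image

# Query the image with arbitrary text labels instead of a fixed label set.
for pred in detector(image, candidate_labels=["cat", "remote control"]):
    print(pred["label"], round(pred["score"], 3), pred["box"])
```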

Audio

| Description | Colab | AWS Studio |
| :--- | :--- | :--- |
| Show how to preprocess the data and fine-tune a pretrained Speech model on TIMIT | Open in Colab | Open in AWS Studio |
| Show how to preprocess the data and fine-tune a multi-lingually pretrained speech model on Common Voice | Open in Colab | Open in AWS Studio |
| Show how to preprocess the data and fine-tune a pretrained Speech model on Keyword Spotting | Open in Colab | Open in AWS Studio |
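
To get a feel for what a fine-tuned speech model does at inference time, here is a minimal sketch of running automatic speech recognition through a pipeline; the checkpoint name and audio file path are examples only.

```python
# Minimal inference sketch in the spirit of the ASR notebooks above.
# The checkpoint and "sample.flac" path are illustrative placeholders.
from transformers import pipeline

asr = pipeline("automatic-speech-recognition", model="facebook/wav2vec2-base-960h")
# The pipeline accepts a path to an audio file (or a raw waveform array).
print(asr("sample.flac")["text"])
```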

Biological Sequences

| Description | Colab | AWS Studio |
| :--- | :--- | :--- |
| See how to tokenize proteins and fine-tune a large pre-trained protein “language” model | Open in Colab | Open in AWS Studio |
| See how to go from protein sequence to a full protein model and PDB file | Open in Colab | Open in AWS Studio |
| See how to tokenize DNA and fine-tune a large pre-trained DNA “language” model | Open in Colab | Open in AWS Studio |
| Train even larger DNA models in a memory-efficient way | Open in Colab | Open in AWS Studio |
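
The protein notebooks build on models that treat amino-acid sequences like text. Here is a small sketch of tokenizing a protein sequence with one such pre-trained model; the ESM-2 checkpoint is one example of this model family, and the sequence is a made-up placeholder.

```python
# Sketch of tokenizing proteins with a pre-trained protein "language" model.
# The ESM-2 checkpoint is one example; the sequence is a placeholder.
from transformers import AutoModelForMaskedLM, AutoTokenizer

checkpoint = "facebook/esm2_t12_35M_UR50D"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForMaskedLM.from_pretrained(checkpoint)

# Protein sequences are tokenized one amino-acid residue at a time.
inputs = tokenizer("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ", return_tensors="pt")
outputs = model(**inputs)
print(outputs.logits.shape)  # (batch, sequence length, vocabulary size)
```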

Other modalities

| Description | Colab | AWS Studio |
| :--- | :--- | :--- |
| See how to train Time Series Transformer on a custom dataset | Open in Colab | Open in AWS Studio |

Utility notebooks

| Description | Colab | AWS Studio |
| :--- | :--- | :--- |
| Highlight how to export and run inference workloads through ONNX | | |
| How to benchmark models with transformers | Open in Colab | Open in AWS Studio |

TensorFlow Examples

Natural Language Processing

| Description | Colab | AWS Studio |
| :--- | :--- | :--- |
| How to train and use your very own tokenizer | Open in Colab | Open in AWS Studio |
| How to easily start using transformers | Open in Colab | Open in AWS Studio |
| Show how to preprocess the data and fine-tune a pretrained model on any GLUE task. | Open in Colab | Open in AWS Studio |
| Show how to preprocess the data and fine-tune a pretrained model on a causal or masked LM task. | Open in Colab | Open in AWS Studio |
| Show how to preprocess the data and fine-tune a pretrained model on a token classification task (NER, PoS). | Open in Colab | Open in AWS Studio |
| Show how to preprocess the data and fine-tune a pretrained model on SQuAD. | Open in Colab | Open in AWS Studio |
| Show how to preprocess the data and fine-tune a pretrained model on SWAG. | Open in Colab | Open in AWS Studio |
| Show how to preprocess the data and fine-tune a pretrained model on WMT. | Open in Colab | Open in AWS Studio |
| Show how to preprocess the data and fine-tune a pretrained model on XSUM. | Open in Colab | Open in AWS Studio |
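
The TensorFlow notebooks mirror the PyTorch recipe, but hand the model to Keras for training. Here is a condensed sketch of that pattern, assuming the same illustrative GLUE/MRPC setup as the PyTorch sketch earlier on this page; names and hyperparameters are placeholders.

```python
# Condensed TensorFlow counterpart of the fine-tuning recipe; dataset and
# checkpoint names are illustrative placeholders.
import tensorflow as tf
from datasets import load_dataset
from transformers import AutoTokenizer, TFAutoModelForSequenceClassification

raw = load_dataset("glue", "mrpc")
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

def tokenize(examples):
    return tokenizer(examples["sentence1"], examples["sentence2"], truncation=True)

tokenized = raw.map(tokenize, batched=True)
model = TFAutoModelForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

# prepare_tf_dataset pads and batches the data into a tf.data.Dataset.
train_set = model.prepare_tf_dataset(
    tokenized["train"], batch_size=16, shuffle=True, tokenizer=tokenizer
)

# With no loss passed to compile(), the model uses its internal task loss.
model.compile(optimizer=tf.keras.optimizers.Adam(3e-5))
model.fit(train_set, epochs=3)
```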

Computer Vision

| Description | Colab | AWS Studio |
| :--- | :--- | :--- |
| Show how to preprocess the data and fine-tune any pretrained Vision model on Image Classification | Open in Colab | Open in AWS Studio |
| Show how to preprocess the data and fine-tune a pretrained SegFormer model on Semantic Segmentation | Open in Colab | Open in AWS Studio |

Biological Sequences

| Description | Colab | AWS Studio |
| :--- | :--- | :--- |
| See how to tokenize proteins and fine-tune a large pre-trained protein “language” model | Open in Colab | Open in AWS Studio |

Utility notebooks

| Description | Colab | AWS Studio |
| :--- | :--- | :--- |
| See how to train at high speed on Google’s TPU hardware | Open in Colab | Open in AWS Studio |

Optimum notebooks

🌍 Optimum is an extension of 🌍 Transformers, providing a set of performance optimization tools that enable maximum efficiency when training and running models on targeted hardware.

| Description | Colab | AWS Studio |
| :--- | :--- | :--- |
| Show how to apply static and dynamic quantization on a model using ONNX Runtime for any GLUE task. | Open in Colab | Open in AWS Studio |
| Show how to apply static and dynamic quantization, as well as quantization-aware training, on a model using Intel Neural Compressor (INC) for any GLUE task. | Open in Colab | Open in AWS Studio |
| Show how to preprocess the data and fine-tune a model on any GLUE task using ONNX Runtime. | Open in Colab | Open in AWS Studio |
| Show how to preprocess the data and fine-tune a model on XSUM using ONNX Runtime. | Open in Colab | Open in AWS Studio |
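
To show the flavor of what these notebooks cover, here is a minimal sketch of loading a Transformers checkpoint through Optimum's ONNX Runtime integration; the checkpoint name is an example, and the sketch assumes the `optimum[onnxruntime]` extra is installed.

```python
# Minimal sketch of ONNX Runtime inference via Optimum; the checkpoint is
# an illustrative example and optimum[onnxruntime] must be installed.
from optimum.onnxruntime import ORTModelForSequenceClassification
from transformers import AutoTokenizer, pipeline

checkpoint = "distilbert-base-uncased-finetuned-sst-2-english"
# export=True converts the PyTorch checkpoint to ONNX on the fly.
model = ORTModelForSequenceClassification.from_pretrained(checkpoint, export=True)
tokenizer = AutoTokenizer.from_pretrained(checkpoint)

# ORT models plug into the familiar Transformers pipeline API.
classifier = pipeline("text-classification", model=model, tokenizer=tokenizer)
print(classifier("Optimum makes ONNX Runtime inference straightforward."))
```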

Community notebooks

More notebooks developed by the community are available here.
