AutoTokenizer
This is a generic tokenizer class that will be instantiated as one of the tokenizer classes of the library when created with the AutoTokenizer.from_pretrained() class method.
This class cannot be instantiated directly using __init__() (throws an error).
from_pretrained
( pretrained_model_name_or_path, *inputs, **kwargs )
Parameters
pretrained_model_name_or_path (str or os.PathLike) — Can be either:
A string, the model id of a predefined tokenizer hosted inside a model repo on huggingface.co. Valid model ids can be located at the root-level, like bert-base-uncased, or namespaced under a user or organization name, like dbmdz/bert-base-german-cased.
A path to a directory containing vocabulary files required by the tokenizer, for instance saved using the save_pretrained() method, e.g., ./my_model_directory/.
A path or url to a single saved vocabulary file if and only if the tokenizer only requires a single vocabulary file (like Bert or XLNet), e.g., ./my_model_directory/vocab.txt. (Not applicable to all derived classes)
inputs (additional positional arguments, optional) — Will be passed along to the Tokenizer __init__() method.
config (PretrainedConfig, optional) — The configuration object used to determine the tokenizer class to instantiate.
cache_dir (str or os.PathLike, optional) — Path to a directory in which a downloaded pretrained model configuration should be cached if the standard cache should not be used.
force_download (bool, optional, defaults to False) — Whether or not to force a (re-)download of the model weights and configuration files, overriding the cached versions if they exist.
resume_download (bool, optional, defaults to False) — Whether or not to delete incompletely received files. Will attempt to resume the download if such a file exists.
proxies (Dict[str, str], optional) — A dictionary of proxy servers to use by protocol or endpoint, e.g., {'http': 'foo.bar:3128', 'http://hostname': 'foo.bar:4012'}. The proxies are used on each request.
revision (str, optional, defaults to "main") — The specific model version to use. It can be a branch name, a tag name, or a commit id; since huggingface.co stores models and other artifacts in a git-based system, revision can be any identifier allowed by git.
subfolder (str, optional) — In case the relevant files are located inside a subfolder of the model repo on huggingface.co (e.g. for facebook/rag-token-base), specify it here.
use_fast (bool, optional, defaults to True) — Use a fast Rust-based tokenizer if one is supported for the given model. If a fast tokenizer is not available, a normal Python-based tokenizer is returned instead.
tokenizer_type (str, optional) — Tokenizer type to be loaded.
trust_remote_code (bool, optional, defaults to False) — Whether or not to allow custom models defined on the Hub in their own modeling files. This option should only be set to True for repositories you trust and in which you have read the code, as it will execute code present on the Hub on your local machine.
kwargs (additional keyword arguments, optional) — Will be passed to the Tokenizer __init__() method. Can be used to set special tokens like bos_token, eos_token, unk_token, sep_token, pad_token, cls_token, mask_token, and additional_special_tokens. See the parameters of __init__() for more details.
Instantiate one of the tokenizer classes of the library from a pretrained model vocabulary.
The tokenizer class to instantiate is selected based on the model_type property of the config object (either passed as an argument or loaded from pretrained_model_name_or_path if possible), or, when that is missing, by falling back to pattern matching on pretrained_model_name_or_path.
Examples:
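A minimal usage sketch (the model ids match those in the parameter description above; ./my_model_directory/ is a placeholder local path):

```python
from transformers import AutoTokenizer

# Download vocabulary from huggingface.co and cache it locally.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

# Download vocabulary from a user- or organization-namespaced repo.
tokenizer = AutoTokenizer.from_pretrained("dbmdz/bert-base-german-cased")

# Load vocabulary files from a local directory
# (e.g. a tokenizer previously saved with save_pretrained("./my_model_directory/")).
tokenizer = AutoTokenizer.from_pretrained("./my_model_directory/")
```

In each case the concrete class of the returned tokenizer (e.g. BertTokenizerFast for the first call) is resolved from the config's model_type, as described above.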
register
( config_class, slow_tokenizer_class = None, fast_tokenizer_class = None, exist_ok = False )
Parameters
slow_tokenizer_class (PreTrainedTokenizer, optional) — The slow tokenizer to register.
fast_tokenizer_class (PreTrainedTokenizerFast, optional) — The fast tokenizer to register.
Register a new tokenizer in this mapping.
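A minimal sketch of how register can be used; CustomConfig and CustomTokenizer are hypothetical stand-ins for a user-defined configuration/tokenizer pair:

```python
from transformers import AutoTokenizer, PretrainedConfig, PreTrainedTokenizer

# Hypothetical configuration class; its model_type must not collide
# with a model type already present in the mapping.
class CustomConfig(PretrainedConfig):
    model_type = "custom"

# Hypothetical slow (Python-based) tokenizer stub.
class CustomTokenizer(PreTrainedTokenizer):
    pass

# Register the pair so that AutoTokenizer.from_pretrained can resolve
# a config whose model_type is "custom" to CustomTokenizer.
AutoTokenizer.register(CustomConfig, slow_tokenizer_class=CustomTokenizer)
```

A fast tokenizer for the same config could be registered in the same call via fast_tokenizer_class; passing exist_ok=True allows overwriting an existing entry instead of raising.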
The model_type values recognized by this mapping are:
albert — ALBERT model
align — ALIGN model
bark — Bark model
bart — BART model
barthez — BARThez model
bartpho — BARTpho model
bert — BERT model
bert-generation — Bert Generation model
bert-japanese — BertJapanese model
bertweet — BERTweet model
big_bird — BigBird model
bigbird_pegasus — BigBird-Pegasus model
biogpt — BioGpt model
blenderbot — Blenderbot model
blenderbot-small — BlenderbotSmall model
blip — BLIP model
blip-2 — BLIP-2 model
bloom — BLOOM model
bridgetower — BridgeTower model
bros — BROS model
byt5 — ByT5 model
camembert — CamemBERT model
canine — CANINE model
chinese_clip — Chinese-CLIP model
clap — CLAP model
clip — CLIP model
clipseg — CLIPSeg model
code_llama — CodeLlama model
codegen — CodeGen model
convbert — ConvBERT model
cpm — CPM model
cpmant — CPM-Ant model
ctrl — CTRL model
data2vec-audio — Data2VecAudio model
data2vec-text — Data2VecText model
deberta — DeBERTa model
deberta-v2 — DeBERTa-v2 model
distilbert — DistilBERT model
dpr — DPR model
electra — ELECTRA model
ernie — ERNIE model
ernie_m — ErnieM model
esm — ESM model
flaubert — FlauBERT model
fnet — FNet model
fsmt — FairSeq Machine-Translation model
funnel — Funnel Transformer model
git — GIT model
gpt-sw3 — GPT-Sw3 model
gpt2 — OpenAI GPT-2 model
gpt_bigcode — GPTBigCode model
gpt_neo — GPT Neo model
gpt_neox — GPT NeoX model
gpt_neox_japanese — GPT NeoX Japanese model
gptj — GPT-J model
gptsan-japanese — GPTSAN-japanese model
groupvit — GroupViT model
herbert — HerBERT model
hubert — Hubert model
ibert — I-BERT model
idefics — IDEFICS model
instructblip — InstructBLIP model
jukebox — Jukebox model
layoutlm — LayoutLM model
layoutlmv2 — LayoutLMv2 model
layoutlmv3 — LayoutLMv3 model
layoutxlm — LayoutXLM model
led — LED model
lilt — LiLT model
llama — LLaMA model
longformer — Longformer model
longt5 — LongT5 model
luke — LUKE model
lxmert — LXMERT model
m2m_100 — M2M100 model
marian — Marian model
mbart — mBART model
mbart50 — mBART-50 model
mega — MEGA model
megatron-bert — Megatron-BERT model
mgp-str — MGP-STR model
mistral — Mistral model
mluke — mLUKE model
mobilebert — MobileBERT model
mpnet — MPNet model
mpt — MPT model
mra — MRA model
mt5 — MT5 model
musicgen — MusicGen model
mvp — MVP model
nezha — Nezha model
nllb — NLLB model
nllb-moe — NLLB-MOE model
nystromformer — Nyströmformer model
oneformer — OneFormer model
openai-gpt — OpenAI GPT model
opt — OPT model
owlvit — OWL-ViT model
pegasus — Pegasus model
pegasus_x — PEGASUS-X model
perceiver — Perceiver model
persimmon — Persimmon model
phobert — PhoBERT model
pix2struct — Pix2Struct model
plbart — PLBart model
prophetnet — ProphetNet model
qdqbert — QDQBert model
rag — RAG model
realm — REALM model
reformer — Reformer model
rembert — RemBERT model
retribert — RetriBERT model
roberta — RoBERTa model
roberta-prelayernorm — RoBERTa-PreLayerNorm model
roc_bert — RoCBert model
roformer — RoFormer model
rwkv — RWKV model
speech_to_text — Speech2Text model
speech_to_text_2 — Speech2Text2 model
speecht5 — SpeechT5 model
splinter — Splinter model
squeezebert — SqueezeBERT model
switch_transformers — SwitchTransformers model
t5 — T5 model
tapas — TAPAS model
tapex — TAPEX model
transfo-xl — Transformer-XL model
umt5 — UMT5 model
vilt — ViLT model
visual_bert — VisualBERT model
vits — VITS model
wav2vec2 — Wav2Vec2 model
wav2vec2-conformer — Wav2Vec2-Conformer model
wav2vec2_phoneme — Wav2Vec2Phoneme model
whisper — Whisper model
xclip — X-CLIP model
xglm — XGLM model
xlm — XLM model
xlm-prophetnet — XLM-ProphetNet model
xlm-roberta — XLM-RoBERTa model
xlm-roberta-xl — XLM-RoBERTa-XL model
xlnet — XLNet model
xmod — X-MOD model
yoso — YOSO model
config_class (PretrainedConfig) — The configuration corresponding to the model to register.