# AutoModelForSequenceClassification

#### class transformers.AutoModelForSequenceClassification

[\<source>](https://github.com/huggingface/transformers/blob/v4.34.1/src/transformers/models/auto/modeling_auto.py#L1269)

( \*args, \*\*kwargs )

This is a generic model class that will be instantiated as one of the model classes of the library (with a sequence classification head) when created with the [from\_pretrained()](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/auto#transformers.FlaxAutoModelForVision2Seq.from_pretrained) class method or the [from\_config()](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/auto#transformers.FlaxAutoModelForVision2Seq.from_config) class method.

This class cannot be instantiated directly using `__init__()` (throws an error).

**from\_config**

[\<source>](https://github.com/huggingface/transformers/blob/v4.34.1/src/transformers/models/auto/auto_factory.py#L417)

( \*\*kwargs )

Parameters

* **config** ([PretrainedConfig](https://huggingface.co/docs/transformers/v4.34.1/en/main_classes/configuration#transformers.PretrainedConfig)) — The model class to instantiate is selected based on the configuration class:
  * [AlbertConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/albert#transformers.AlbertConfig) configuration class: [AlbertForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/albert#transformers.AlbertForSequenceClassification) (ALBERT model)
  * [BartConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/bart#transformers.BartConfig) configuration class: [BartForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/bart#transformers.BartForSequenceClassification) (BART model)
  * [BertConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/bert#transformers.BertConfig) configuration class: [BertForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/bert#transformers.BertForSequenceClassification) (BERT model)
  * [BigBirdConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/big_bird#transformers.BigBirdConfig) configuration class: [BigBirdForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/big_bird#transformers.BigBirdForSequenceClassification) (BigBird model)
  * [BigBirdPegasusConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/bigbird_pegasus#transformers.BigBirdPegasusConfig) configuration class: [BigBirdPegasusForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/bigbird_pegasus#transformers.BigBirdPegasusForSequenceClassification) (BigBird-Pegasus model)
  * [BioGptConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/biogpt#transformers.BioGptConfig) configuration class: [BioGptForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/biogpt#transformers.BioGptForSequenceClassification) (BioGpt model)
  * [BloomConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/bloom#transformers.BloomConfig) configuration class: [BloomForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/bloom#transformers.BloomForSequenceClassification) (BLOOM model)
  * [CTRLConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/ctrl#transformers.CTRLConfig) configuration class: [CTRLForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/ctrl#transformers.CTRLForSequenceClassification) (CTRL model)
  * [CamembertConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/camembert#transformers.CamembertConfig) configuration class: [CamembertForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/camembert#transformers.CamembertForSequenceClassification) (CamemBERT model)
  * [CanineConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/canine#transformers.CanineConfig) configuration class: [CanineForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/canine#transformers.CanineForSequenceClassification) (CANINE model)
  * [ConvBertConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/convbert#transformers.ConvBertConfig) configuration class: [ConvBertForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/convbert#transformers.ConvBertForSequenceClassification) (ConvBERT model)
  * [Data2VecTextConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/data2vec#transformers.Data2VecTextConfig) configuration class: [Data2VecTextForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/data2vec#transformers.Data2VecTextForSequenceClassification) (Data2VecText model)
  * [DebertaConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/deberta#transformers.DebertaConfig) configuration class: [DebertaForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/deberta#transformers.DebertaForSequenceClassification) (DeBERTa model)
  * [DebertaV2Config](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/deberta-v2#transformers.DebertaV2Config) configuration class: [DebertaV2ForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/deberta-v2#transformers.DebertaV2ForSequenceClassification) (DeBERTa-v2 model)
  * [DistilBertConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/distilbert#transformers.DistilBertConfig) configuration class: [DistilBertForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/distilbert#transformers.DistilBertForSequenceClassification) (DistilBERT model)
  * [ElectraConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/electra#transformers.ElectraConfig) configuration class: [ElectraForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/electra#transformers.ElectraForSequenceClassification) (ELECTRA model)
  * [ErnieConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/ernie#transformers.ErnieConfig) configuration class: [ErnieForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/ernie#transformers.ErnieForSequenceClassification) (ERNIE model)
  * [ErnieMConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/ernie_m#transformers.ErnieMConfig) configuration class: [ErnieMForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/ernie_m#transformers.ErnieMForSequenceClassification) (ErnieM model)
  * [EsmConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/esm#transformers.EsmConfig) configuration class: [EsmForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/esm#transformers.EsmForSequenceClassification) (ESM model)
  * [FNetConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/fnet#transformers.FNetConfig) configuration class: [FNetForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/fnet#transformers.FNetForSequenceClassification) (FNet model)
  * [FalconConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/falcon#transformers.FalconConfig) configuration class: [FalconForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/falcon#transformers.FalconForSequenceClassification) (Falcon model)
  * [FlaubertConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/flaubert#transformers.FlaubertConfig) configuration class: [FlaubertForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/flaubert#transformers.FlaubertForSequenceClassification) (FlauBERT model)
  * [FunnelConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/funnel#transformers.FunnelConfig) configuration class: [FunnelForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/funnel#transformers.FunnelForSequenceClassification) (Funnel Transformer model)
  * [GPT2Config](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/gpt2#transformers.GPT2Config) configuration class: [GPT2ForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/gpt2#transformers.GPT2ForSequenceClassification) (OpenAI GPT-2 model)
  * [GPTBigCodeConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/gpt_bigcode#transformers.GPTBigCodeConfig) configuration class: [GPTBigCodeForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/gpt_bigcode#transformers.GPTBigCodeForSequenceClassification) (GPTBigCode model)
  * [GPTJConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/gptj#transformers.GPTJConfig) configuration class: [GPTJForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/gptj#transformers.GPTJForSequenceClassification) (GPT-J model)
  * [GPTNeoConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/gpt_neo#transformers.GPTNeoConfig) configuration class: [GPTNeoForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/gpt_neo#transformers.GPTNeoForSequenceClassification) (GPT Neo model)
  * [GPTNeoXConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/gpt_neox#transformers.GPTNeoXConfig) configuration class: [GPTNeoXForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/gpt_neox#transformers.GPTNeoXForSequenceClassification) (GPT NeoX model)
  * [IBertConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/ibert#transformers.IBertConfig) configuration class: [IBertForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/ibert#transformers.IBertForSequenceClassification) (I-BERT model)
  * [LEDConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/led#transformers.LEDConfig) configuration class: [LEDForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/led#transformers.LEDForSequenceClassification) (LED model)
  * [LayoutLMConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/layoutlm#transformers.LayoutLMConfig) configuration class: [LayoutLMForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/layoutlm#transformers.LayoutLMForSequenceClassification) (LayoutLM model)
  * [LayoutLMv2Config](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/layoutlmv2#transformers.LayoutLMv2Config) configuration class: [LayoutLMv2ForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/layoutlmv2#transformers.LayoutLMv2ForSequenceClassification) (LayoutLMv2 model)
  * [LayoutLMv3Config](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/layoutlmv3#transformers.LayoutLMv3Config) configuration class: [LayoutLMv3ForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/layoutlmv3#transformers.LayoutLMv3ForSequenceClassification) (LayoutLMv3 model)
  * [LiltConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/lilt#transformers.LiltConfig) configuration class: [LiltForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/lilt#transformers.LiltForSequenceClassification) (LiLT model)
  * [LlamaConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/llama2#transformers.LlamaConfig) configuration class: [LlamaForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/llama2#transformers.LlamaForSequenceClassification) (LLaMA model)
  * [LongformerConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/longformer#transformers.LongformerConfig) configuration class: [LongformerForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/longformer#transformers.LongformerForSequenceClassification) (Longformer model)
  * [LukeConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/luke#transformers.LukeConfig) configuration class: [LukeForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/luke#transformers.LukeForSequenceClassification) (LUKE model)
  * [MBartConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/mbart#transformers.MBartConfig) configuration class: [MBartForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/mbart#transformers.MBartForSequenceClassification) (mBART model)
  * [MPNetConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/mpnet#transformers.MPNetConfig) configuration class: [MPNetForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/mpnet#transformers.MPNetForSequenceClassification) (MPNet model)
  * [MT5Config](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/mt5#transformers.MT5Config) configuration class: [MT5ForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/mt5#transformers.MT5ForSequenceClassification) (MT5 model)
  * [MarkupLMConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/markuplm#transformers.MarkupLMConfig) configuration class: [MarkupLMForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/markuplm#transformers.MarkupLMForSequenceClassification) (MarkupLM model)
  * [MegaConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/mega#transformers.MegaConfig) configuration class: [MegaForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/mega#transformers.MegaForSequenceClassification) (MEGA model)
  * [MegatronBertConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/megatron-bert#transformers.MegatronBertConfig) configuration class: [MegatronBertForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/megatron-bert#transformers.MegatronBertForSequenceClassification) (Megatron-BERT model)
  * [MistralConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/mistral#transformers.MistralConfig) configuration class: [MistralForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/mistral#transformers.MistralForSequenceClassification) (Mistral model)
  * [MobileBertConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/mobilebert#transformers.MobileBertConfig) configuration class: [MobileBertForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/mobilebert#transformers.MobileBertForSequenceClassification) (MobileBERT model)
  * [MptConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/mpt#transformers.MptConfig) configuration class: [MptForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/mpt#transformers.MptForSequenceClassification) (MPT model)
  * [MraConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/mra#transformers.MraConfig) configuration class: [MraForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/mra#transformers.MraForSequenceClassification) (MRA model)
  * [MvpConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/mvp#transformers.MvpConfig) configuration class: [MvpForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/mvp#transformers.MvpForSequenceClassification) (MVP model)
  * [NezhaConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/nezha#transformers.NezhaConfig) configuration class: [NezhaForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/nezha#transformers.NezhaForSequenceClassification) (Nezha model)
  * [NystromformerConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/nystromformer#transformers.NystromformerConfig) configuration class: [NystromformerForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/nystromformer#transformers.NystromformerForSequenceClassification) (Nyströmformer model)
  * [OPTConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/opt#transformers.OPTConfig) configuration class: [OPTForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/opt#transformers.OPTForSequenceClassification) (OPT model)
  * [OpenAIGPTConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/openai-gpt#transformers.OpenAIGPTConfig) configuration class: [OpenAIGPTForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/openai-gpt#transformers.OpenAIGPTForSequenceClassification) (OpenAI GPT model)
  * [OpenLlamaConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/open-llama#transformers.OpenLlamaConfig) configuration class: [OpenLlamaForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/open-llama#transformers.OpenLlamaForSequenceClassification) (OpenLlama model)
  * [PLBartConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/plbart#transformers.PLBartConfig) configuration class: [PLBartForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/plbart#transformers.PLBartForSequenceClassification) (PLBart model)
  * [PerceiverConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/perceiver#transformers.PerceiverConfig) configuration class: [PerceiverForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/perceiver#transformers.PerceiverForSequenceClassification) (Perceiver model)
  * [PersimmonConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/persimmon#transformers.PersimmonConfig) configuration class: [PersimmonForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/persimmon#transformers.PersimmonForSequenceClassification) (Persimmon model)
  * [QDQBertConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/qdqbert#transformers.QDQBertConfig) configuration class: [QDQBertForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/qdqbert#transformers.QDQBertForSequenceClassification) (QDQBert model)
  * [ReformerConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/reformer#transformers.ReformerConfig) configuration class: [ReformerForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/reformer#transformers.ReformerForSequenceClassification) (Reformer model)
  * [RemBertConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/rembert#transformers.RemBertConfig) configuration class: [RemBertForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/rembert#transformers.RemBertForSequenceClassification) (RemBERT model)
  * [RoCBertConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/roc_bert#transformers.RoCBertConfig) configuration class: [RoCBertForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/roc_bert#transformers.RoCBertForSequenceClassification) (RoCBert model)
  * [RoFormerConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/roformer#transformers.RoFormerConfig) configuration class: [RoFormerForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/roformer#transformers.RoFormerForSequenceClassification) (RoFormer model)
  * [RobertaConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/roberta#transformers.RobertaConfig) configuration class: [RobertaForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/roberta#transformers.RobertaForSequenceClassification) (RoBERTa model)
  * [RobertaPreLayerNormConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/roberta-prelayernorm#transformers.RobertaPreLayerNormConfig) configuration class: [RobertaPreLayerNormForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/roberta-prelayernorm#transformers.RobertaPreLayerNormForSequenceClassification) (RoBERTa-PreLayerNorm model)
  * [SqueezeBertConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/squeezebert#transformers.SqueezeBertConfig) configuration class: [SqueezeBertForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/squeezebert#transformers.SqueezeBertForSequenceClassification) (SqueezeBERT model)
  * [T5Config](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/t5#transformers.T5Config) configuration class: [T5ForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/t5#transformers.T5ForSequenceClassification) (T5 model)
  * [TapasConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/tapas#transformers.TapasConfig) configuration class: [TapasForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/tapas#transformers.TapasForSequenceClassification) (TAPAS model)
  * [TransfoXLConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/transfo-xl#transformers.TransfoXLConfig) configuration class: [TransfoXLForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/transfo-xl#transformers.TransfoXLForSequenceClassification) (Transformer-XL model)
  * [UMT5Config](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/umt5#transformers.UMT5Config) configuration class: [UMT5ForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/umt5#transformers.UMT5ForSequenceClassification) (UMT5 model)
  * [XLMConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/xlm#transformers.XLMConfig) configuration class: [XLMForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/xlm#transformers.XLMForSequenceClassification) (XLM model)
  * [XLMRobertaConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/xlm-roberta#transformers.XLMRobertaConfig) configuration class: [XLMRobertaForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/xlm-roberta#transformers.XLMRobertaForSequenceClassification) (XLM-RoBERTa model)
  * [XLMRobertaXLConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/xlm-roberta-xl#transformers.XLMRobertaXLConfig) configuration class: [XLMRobertaXLForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/xlm-roberta-xl#transformers.XLMRobertaXLForSequenceClassification) (XLM-RoBERTa-XL model)
  * [XLNetConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/xlnet#transformers.XLNetConfig) configuration class: [XLNetForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/xlnet#transformers.XLNetForSequenceClassification) (XLNet model)
  * [XmodConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/xmod#transformers.XmodConfig) configuration class: [XmodForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/xmod#transformers.XmodForSequenceClassification) (X-MOD model)
  * [YosoConfig](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/yoso#transformers.YosoConfig) configuration class: [YosoForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/yoso#transformers.YosoForSequenceClassification) (YOSO model)

Instantiates one of the model classes of the library (with a sequence classification head) from a configuration.

Note: Loading a model from its configuration file does **not** load the model weights. It only affects the model’s configuration. Use [from\_pretrained()](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/auto#transformers.FlaxAutoModelForVision2Seq.from_pretrained) to load the model weights.

Examples:


```python
>>> from transformers import AutoConfig, AutoModelForSequenceClassification

>>> # Download configuration from huggingface.co and cache.
>>> config = AutoConfig.from_pretrained("bert-base-cased")
>>> model = AutoModelForSequenceClassification.from_config(config)
```
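Because `from_config()` only builds the architecture, the configuration can also be constructed directly and edited before instantiation, with no download involved. A minimal offline sketch, assuming `transformers` with a PyTorch backend is installed (the tiny hyperparameters below are arbitrary and only keep the model small):

```python
from transformers import BertConfig, AutoModelForSequenceClassification

# Build a small BERT configuration directly and set the number of
# classification labels; from_config() then creates a randomly
# initialized model with a 3-way classification head.
config = BertConfig(
    num_hidden_layers=2,
    hidden_size=64,
    num_attention_heads=2,
    intermediate_size=128,
    num_labels=3,
)
model = AutoModelForSequenceClassification.from_config(config)

print(type(model).__name__)           # BertForSequenceClassification
print(model.classifier.out_features)  # 3
```

Since the config is a `BertConfig`, the auto class resolves to `BertForSequenceClassification`; the weights are randomly initialized and the model still needs to be trained before use.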

**from\_pretrained**

[\<source>](https://github.com/huggingface/transformers/blob/v4.34.1/src/transformers/models/auto/auto_factory.py#L448)

( \*model\_args, \*\*kwargs )

Parameters

* **pretrained\_model\_name\_or\_path** (`str` or `os.PathLike`) — Can be either:
  * A string, the *model id* of a pretrained model hosted inside a model repo on huggingface.co. Valid model ids can be located at the root-level, like `bert-base-uncased`, or namespaced under a user or organization name, like `dbmdz/bert-base-german-cased`.
  * A path to a *directory* containing model weights saved using [save\_pretrained()](https://huggingface.co/docs/transformers/v4.34.1/en/main_classes/model#transformers.PreTrainedModel.save_pretrained), e.g., `./my_model_directory/`.
  * A path or url to a *tensorflow index checkpoint file* (e.g., `./tf_model/model.ckpt.index`). In this case, `from_tf` should be set to `True` and a configuration object should be provided as the `config` argument. This loading path is slower than converting the TensorFlow checkpoint to a PyTorch model using the provided conversion scripts and loading the PyTorch model afterwards.
* **model\_args** (additional positional arguments, *optional*) — Will be passed along to the underlying model `__init__()` method.
* **config** ([PretrainedConfig](https://huggingface.co/docs/transformers/v4.34.1/en/main_classes/configuration#transformers.PretrainedConfig), *optional*) — Configuration for the model to use instead of an automatically loaded configuration. Configuration can be automatically loaded when:
  * The model is a model provided by the library (loaded with the *model id* string of a pretrained model).
  * The model was saved using [save\_pretrained()](https://huggingface.co/docs/transformers/v4.34.1/en/main_classes/model#transformers.PreTrainedModel.save_pretrained) and is reloaded by supplying the save directory.
  * The model is loaded by supplying a local directory as `pretrained_model_name_or_path` and a configuration JSON file named *config.json* is found in the directory.
* **state\_dict** (*Dict\[str, torch.Tensor]*, *optional*) — A state dictionary to use instead of a state dictionary loaded from saved weights file.

  This option can be used if you want to create a model from a pretrained configuration but load your own weights. In that case, though, consider whether using [save\_pretrained()](https://huggingface.co/docs/transformers/v4.34.1/en/main_classes/model#transformers.PreTrainedModel.save_pretrained) and [from\_pretrained()](https://huggingface.co/docs/transformers/v4.34.1/en/main_classes/model#transformers.PreTrainedModel.from_pretrained) would be a simpler option.
* **cache\_dir** (`str` or `os.PathLike`, *optional*) — Path to a directory in which a downloaded pretrained model configuration should be cached if the standard cache should not be used.
* **from\_tf** (`bool`, *optional*, defaults to `False`) — Load the model weights from a TensorFlow checkpoint save file (see docstring of `pretrained_model_name_or_path` argument).
* **force\_download** (`bool`, *optional*, defaults to `False`) — Whether or not to force the (re-)download of the model weights and configuration files, overriding the cached versions if they exist.
* **resume\_download** (`bool`, *optional*, defaults to `False`) — Whether or not to delete incompletely received files. Will attempt to resume the download if such a file exists.
* **proxies** (`Dict[str, str]`, *optional*) — A dictionary of proxy servers to use by protocol or endpoint, e.g., `{'http': 'foo.bar:3128', 'http://hostname': 'foo.bar:4012'}`. The proxies are used on each request.
* **output\_loading\_info** (`bool`, *optional*, defaults to `False`) — Whether or not to also return a dictionary containing missing keys, unexpected keys and error messages.
* **local\_files\_only** (`bool`, *optional*, defaults to `False`) — Whether or not to only look at local files (e.g., not try downloading the model).
* **revision** (`str`, *optional*, defaults to `"main"`) — The specific model version to use. It can be a branch name, a tag name, or a commit id, since we use a git-based system for storing models and other artifacts on huggingface.co, so `revision` can be any identifier allowed by git.
* **trust\_remote\_code** (`bool`, *optional*, defaults to `False`) — Whether or not to allow for custom models defined on the Hub in their own modeling files. This option should only be set to `True` for repositories you trust and in which you have read the code, as it will execute code present on the Hub on your local machine.
* **code\_revision** (`str`, *optional*, defaults to `"main"`) — The specific revision to use for the code on the Hub, if the code lives in a different repository than the rest of the model. It can be a branch name, a tag name, or a commit id, since we use a git-based system for storing models and other artifacts on huggingface.co, so `revision` can be any identifier allowed by git.
* **kwargs** (additional keyword arguments, *optional*) — Can be used to update the configuration object (after it has been loaded) and to initialize the model (e.g., `output_attentions=True`). Behaves differently depending on whether a `config` is provided or automatically loaded:
  * If a configuration is provided with `config`, `**kwargs` will be directly passed to the underlying model’s `__init__` method (we assume all relevant updates to the configuration have already been done).
  * If a configuration is not provided, `kwargs` will be first passed to the configuration class initialization function ([from\_pretrained()](https://huggingface.co/docs/transformers/v4.34.1/en/main_classes/configuration#transformers.PretrainedConfig.from_pretrained)). Each key of `kwargs` that corresponds to a configuration attribute will be used to override said attribute with the supplied `kwargs` value. Remaining keys that do not correspond to any configuration attribute will be passed to the underlying model’s `__init__` function.
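The save/reload round trip and the kwargs-override behavior described above can be sketched offline, assuming `transformers` with a PyTorch backend is installed (the tiny BERT configuration below is arbitrary and only keeps the example fast):

```python
import tempfile
from transformers import BertConfig, AutoModelForSequenceClassification

# Create a tiny randomly initialized model, save it to a local directory,
# then reload it with from_pretrained().
config = BertConfig(
    num_hidden_layers=2,
    hidden_size=64,
    num_attention_heads=2,
    intermediate_size=128,
)
model = AutoModelForSequenceClassification.from_config(config)

with tempfile.TemporaryDirectory() as tmp:
    model.save_pretrained(tmp)
    # No explicit `config` is passed, so `output_attentions=True` is first
    # applied to the automatically loaded configuration before the model
    # is initialized.
    reloaded = AutoModelForSequenceClassification.from_pretrained(
        tmp, output_attentions=True
    )

print(reloaded.config.output_attentions)  # True
```

The same call shape works with a Hub *model id* (e.g., `bert-base-cased`) in place of the local directory; the directory form simply skips the download.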

Instantiate one of the model classes of the library (with a sequence classification head) from a pretrained model.

The model class to instantiate is selected based on the `model_type` property of the config object (either passed as an argument or loaded from `pretrained_model_name_or_path` if possible), or when it’s missing, by falling back to using pattern matching on `pretrained_model_name_or_path`:

* **albert** — [AlbertForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/albert#transformers.AlbertForSequenceClassification) (ALBERT model)
* **bart** — [BartForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/bart#transformers.BartForSequenceClassification) (BART model)
* **bert** — [BertForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/bert#transformers.BertForSequenceClassification) (BERT model)
* **big\_bird** — [BigBirdForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/big_bird#transformers.BigBirdForSequenceClassification) (BigBird model)
* **bigbird\_pegasus** — [BigBirdPegasusForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/bigbird_pegasus#transformers.BigBirdPegasusForSequenceClassification) (BigBird-Pegasus model)
* **biogpt** — [BioGptForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/biogpt#transformers.BioGptForSequenceClassification) (BioGpt model)
* **bloom** — [BloomForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/bloom#transformers.BloomForSequenceClassification) (BLOOM model)
* **camembert** — [CamembertForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/camembert#transformers.CamembertForSequenceClassification) (CamemBERT model)
* **canine** — [CanineForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/canine#transformers.CanineForSequenceClassification) (CANINE model)
* **code\_llama** — [LlamaForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/llama2#transformers.LlamaForSequenceClassification) (CodeLlama model)
* **convbert** — [ConvBertForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/convbert#transformers.ConvBertForSequenceClassification) (ConvBERT model)
* **ctrl** — [CTRLForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/ctrl#transformers.CTRLForSequenceClassification) (CTRL model)
* **data2vec-text** — [Data2VecTextForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/data2vec#transformers.Data2VecTextForSequenceClassification) (Data2VecText model)
* **deberta** — [DebertaForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/deberta#transformers.DebertaForSequenceClassification) (DeBERTa model)
* **deberta-v2** — [DebertaV2ForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/deberta-v2#transformers.DebertaV2ForSequenceClassification) (DeBERTa-v2 model)
* **distilbert** — [DistilBertForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/distilbert#transformers.DistilBertForSequenceClassification) (DistilBERT model)
* **electra** — [ElectraForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/electra#transformers.ElectraForSequenceClassification) (ELECTRA model)
* **ernie** — [ErnieForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/ernie#transformers.ErnieForSequenceClassification) (ERNIE model)
* **ernie\_m** — [ErnieMForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/ernie_m#transformers.ErnieMForSequenceClassification) (ErnieM model)
* **esm** — [EsmForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/esm#transformers.EsmForSequenceClassification) (ESM model)
* **falcon** — [FalconForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/falcon#transformers.FalconForSequenceClassification) (Falcon model)
* **flaubert** — [FlaubertForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/flaubert#transformers.FlaubertForSequenceClassification) (FlauBERT model)
* **fnet** — [FNetForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/fnet#transformers.FNetForSequenceClassification) (FNet model)
* **funnel** — [FunnelForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/funnel#transformers.FunnelForSequenceClassification) (Funnel Transformer model)
* **gpt-sw3** — [GPT2ForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/gpt2#transformers.GPT2ForSequenceClassification) (GPT-Sw3 model)
* **gpt2** — [GPT2ForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/gpt2#transformers.GPT2ForSequenceClassification) (OpenAI GPT-2 model)
* **gpt\_bigcode** — [GPTBigCodeForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/gpt_bigcode#transformers.GPTBigCodeForSequenceClassification) (GPTBigCode model)
* **gpt\_neo** — [GPTNeoForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/gpt_neo#transformers.GPTNeoForSequenceClassification) (GPT Neo model)
* **gpt\_neox** — [GPTNeoXForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/gpt_neox#transformers.GPTNeoXForSequenceClassification) (GPT NeoX model)
* **gptj** — [GPTJForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/gptj#transformers.GPTJForSequenceClassification) (GPT-J model)
* **ibert** — [IBertForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/ibert#transformers.IBertForSequenceClassification) (I-BERT model)
* **layoutlm** — [LayoutLMForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/layoutlm#transformers.LayoutLMForSequenceClassification) (LayoutLM model)
* **layoutlmv2** — [LayoutLMv2ForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/layoutlmv2#transformers.LayoutLMv2ForSequenceClassification) (LayoutLMv2 model)
* **layoutlmv3** — [LayoutLMv3ForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/layoutlmv3#transformers.LayoutLMv3ForSequenceClassification) (LayoutLMv3 model)
* **led** — [LEDForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/led#transformers.LEDForSequenceClassification) (LED model)
* **lilt** — [LiltForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/lilt#transformers.LiltForSequenceClassification) (LiLT model)
* **llama** — [LlamaForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/llama2#transformers.LlamaForSequenceClassification) (LLaMA model)
* **longformer** — [LongformerForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/longformer#transformers.LongformerForSequenceClassification) (Longformer model)
* **luke** — [LukeForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/luke#transformers.LukeForSequenceClassification) (LUKE model)
* **markuplm** — [MarkupLMForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/markuplm#transformers.MarkupLMForSequenceClassification) (MarkupLM model)
* **mbart** — [MBartForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/mbart#transformers.MBartForSequenceClassification) (mBART model)
* **mega** — [MegaForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/mega#transformers.MegaForSequenceClassification) (MEGA model)
* **megatron-bert** — [MegatronBertForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/megatron-bert#transformers.MegatronBertForSequenceClassification) (Megatron-BERT model)
* **mistral** — [MistralForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/mistral#transformers.MistralForSequenceClassification) (Mistral model)
* **mobilebert** — [MobileBertForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/mobilebert#transformers.MobileBertForSequenceClassification) (MobileBERT model)
* **mpnet** — [MPNetForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/mpnet#transformers.MPNetForSequenceClassification) (MPNet model)
* **mpt** — [MptForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/mpt#transformers.MptForSequenceClassification) (MPT model)
* **mra** — [MraForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/mra#transformers.MraForSequenceClassification) (MRA model)
* **mt5** — [MT5ForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/mt5#transformers.MT5ForSequenceClassification) (MT5 model)
* **mvp** — [MvpForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/mvp#transformers.MvpForSequenceClassification) (MVP model)
* **nezha** — [NezhaForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/nezha#transformers.NezhaForSequenceClassification) (Nezha model)
* **nystromformer** — [NystromformerForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/nystromformer#transformers.NystromformerForSequenceClassification) (Nyströmformer model)
* **open-llama** — [OpenLlamaForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/open-llama#transformers.OpenLlamaForSequenceClassification) (OpenLlama model)
* **openai-gpt** — [OpenAIGPTForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/openai-gpt#transformers.OpenAIGPTForSequenceClassification) (OpenAI GPT model)
* **opt** — [OPTForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/opt#transformers.OPTForSequenceClassification) (OPT model)
* **perceiver** — [PerceiverForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/perceiver#transformers.PerceiverForSequenceClassification) (Perceiver model)
* **persimmon** — [PersimmonForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/persimmon#transformers.PersimmonForSequenceClassification) (Persimmon model)
* **plbart** — [PLBartForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/plbart#transformers.PLBartForSequenceClassification) (PLBart model)
* **qdqbert** — [QDQBertForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/qdqbert#transformers.QDQBertForSequenceClassification) (QDQBert model)
* **reformer** — [ReformerForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/reformer#transformers.ReformerForSequenceClassification) (Reformer model)
* **rembert** — [RemBertForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/rembert#transformers.RemBertForSequenceClassification) (RemBERT model)
* **roberta** — [RobertaForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/roberta#transformers.RobertaForSequenceClassification) (RoBERTa model)
* **roberta-prelayernorm** — [RobertaPreLayerNormForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/roberta-prelayernorm#transformers.RobertaPreLayerNormForSequenceClassification) (RoBERTa-PreLayerNorm model)
* **roc\_bert** — [RoCBertForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/roc_bert#transformers.RoCBertForSequenceClassification) (RoCBert model)
* **roformer** — [RoFormerForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/roformer#transformers.RoFormerForSequenceClassification) (RoFormer model)
* **squeezebert** — [SqueezeBertForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/squeezebert#transformers.SqueezeBertForSequenceClassification) (SqueezeBERT model)
* **t5** — [T5ForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/t5#transformers.T5ForSequenceClassification) (T5 model)
* **tapas** — [TapasForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/tapas#transformers.TapasForSequenceClassification) (TAPAS model)
* **transfo-xl** — [TransfoXLForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/transfo-xl#transformers.TransfoXLForSequenceClassification) (Transformer-XL model)
* **umt5** — [UMT5ForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/umt5#transformers.UMT5ForSequenceClassification) (UMT5 model)
* **xlm** — [XLMForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/xlm#transformers.XLMForSequenceClassification) (XLM model)
* **xlm-roberta** — [XLMRobertaForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/xlm-roberta#transformers.XLMRobertaForSequenceClassification) (XLM-RoBERTa model)
* **xlm-roberta-xl** — [XLMRobertaXLForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/xlm-roberta-xl#transformers.XLMRobertaXLForSequenceClassification) (XLM-RoBERTa-XL model)
* **xlnet** — [XLNetForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/xlnet#transformers.XLNetForSequenceClassification) (XLNet model)
* **xmod** — [XmodForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/xmod#transformers.XmodForSequenceClassification) (X-MOD model)
* **yoso** — [YosoForSequenceClassification](https://huggingface.co/docs/transformers/v4.34.1/en/model_doc/yoso#transformers.YosoForSequenceClassification) (YOSO model)

The model is set in evaluation mode by default using `model.eval()` (so, for instance, dropout modules are deactivated). To train the model, you should first set it back in training mode with `model.train()`.

Examples:


```
>>> from transformers import AutoConfig, AutoModelForSequenceClassification

>>> # Download model and configuration from huggingface.co and cache.
>>> model = AutoModelForSequenceClassification.from_pretrained("bert-base-cased")

>>> # Update configuration during loading
>>> model = AutoModelForSequenceClassification.from_pretrained("bert-base-cased", output_attentions=True)
>>> model.config.output_attentions
True

>>> # Loading from a TF checkpoint file instead of a PyTorch model (slower)
>>> config = AutoConfig.from_pretrained("./tf_model/bert_tf_model_config.json")
>>> model = AutoModelForSequenceClassification.from_pretrained(
...     "./tf_model/bert_tf_checkpoint.ckpt.index", from_tf=True, config=config
... )
```

