Neuron Trainer
The NeuronTrainer class provides an extended API for the feature-complete Transformers Trainer. It is used in all the example scripts.
The class is optimized for 🤗 Transformers models running on AWS Trainium.
Here is an example of how to customize the NeuronTrainer to use a weighted loss (useful when you have an unbalanced training set):
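A minimal sketch of such a weighted loss, following the customization pattern from the Transformers Trainer docs. The class weights below are hypothetical placeholders for a 3-class imbalanced dataset; the commented `WeightedLossTrainer` shows where the function would plug into a NeuronTrainer subclass.

```python
import torch
from torch import nn

# Hypothetical per-class weights: up-weight the rarer classes of an
# imbalanced 3-class training set.
CLASS_WEIGHTS = torch.tensor([1.0, 2.0, 4.0])

def weighted_loss(logits, labels):
    """Cross-entropy where each sample is weighted by its class weight."""
    loss_fct = nn.CrossEntropyLoss(weight=CLASS_WEIGHTS.to(logits.device))
    return loss_fct(logits.view(-1, CLASS_WEIGHTS.numel()), labels.view(-1))

# Sketch of wiring this into training (mirrors the Transformers Trainer
# `compute_loss` override pattern; not run here):
#
# from optimum.neuron import NeuronTrainer
#
# class WeightedLossTrainer(NeuronTrainer):
#     def compute_loss(self, model, inputs, return_outputs=False):
#         labels = inputs.pop("labels")
#         outputs = model(**inputs)
#         loss = weighted_loss(outputs.logits, labels)
#         return (loss, outputs) if return_outputs else loss
```

With these weights, misclassifying a sample of class 2 costs four times as much as one of class 0, which pushes the model to pay attention to the rare class.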
class optimum.neuron.NeuronTrainer( *args, **kwargs )

Trainer that is suited for performing training on AWS Trainium instances.
class optimum.neuron.Seq2SeqNeuronTrainer( *args, **kwargs )

Seq2SeqTrainer that is suited for performing training on AWS Trainium instances.
Another way to customize the training loop behavior for the PyTorch NeuronTrainer is to use callbacks that can inspect the training loop state (for progress reporting, logging on TensorBoard or other ML platforms…) and take decisions (like early stopping).