PaddleNLP
Leveraging the PaddlePaddle framework, PaddleNLP is an easy-to-use and powerful NLP library with an awesome pre-trained model zoo, supporting a wide range of NLP tasks from research to industrial applications.
You can find PaddleNLP models by filtering at the left of the models page.
All models on the Hub come with the following features:
An automatically generated model card with a brief description and metadata tags that help with discoverability.
An interactive widget you can use to play with the model directly in the browser.
An Inference API that allows you to make inference requests.
Easily deploy your model as a Gradio app on Spaces.
To get started, you can follow the PaddlePaddle Quick Start to install the PaddlePaddle framework for your OS, package manager, and compute platform.
paddlenlp offers a quick one-line install through pip:
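For example, a typical command to install or upgrade to the latest release from PyPI looks like:

```bash
pip install --upgrade paddlenlp
```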
Similar to transformers models, the paddlenlp library provides a simple one-liner to load models from the Hugging Face Hub by setting from_hf_hub=True! Depending on how you want to use them, you can use the high-level Taskflow API, or AutoModel and AutoTokenizer for more control.
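A minimal sketch of both approaches; the repo ids (PaddlePaddle/uie-base, PaddlePaddle/ernie-3.0-nano-zh), the task name, and the schema below are illustrative, so adapt them to the model you actually want to load:

```python
from paddlenlp import Taskflow
from paddlenlp.transformers import AutoModel, AutoTokenizer

# High-level API: Taskflow builds a ready-to-use pipeline around a Hub model.
ie = Taskflow(
    "information_extraction",
    schema=["time", "location"],
    task_path="PaddlePaddle/uie-base",
    from_hf_hub=True,
)

# Lower-level API: load the tokenizer and model directly for more control.
tokenizer = AutoTokenizer.from_pretrained("PaddlePaddle/ernie-3.0-nano-zh", from_hf_hub=True)
model = AutoModel.from_pretrained("PaddlePaddle/ernie-3.0-nano-zh", from_hf_hub=True)
```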
If you want to see how to load a specific model, you can click Use in paddlenlp and you will be given a working snippet to load it!
You can share your PaddleNLP models by using the save_to_hf_hub method, available on all Model and Tokenizer classes.
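A minimal sketch, assuming the repo id below is a placeholder for your own namespace and repository name, and that save_to_hf_hub accepts the target repository as repo_id:

```python
from paddlenlp.transformers import AutoModel, AutoTokenizer

# Load (or fine-tune) a model and tokenizer, then push both to the Hub.
tokenizer = AutoTokenizer.from_pretrained("PaddlePaddle/ernie-3.0-nano-zh", from_hf_hub=True)
model = AutoModel.from_pretrained("PaddlePaddle/ernie-3.0-nano-zh", from_hf_hub=True)

tokenizer.save_to_hf_hub(repo_id="my-username/my-paddlenlp-model")
model.save_to_hf_hub(repo_id="my-username/my-paddlenlp-model")
```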
PaddlePaddle Installation guide.
PaddleNLP GitHub Repo.