Hub Python Library
Quickstart

The BOINC AI Hub is the go-to place for sharing machine learning models, demos, datasets, and metrics. The boincai_hub library helps you interact with the Hub without leaving your development environment. You can create and manage repositories easily, download and upload files, and get useful model and dataset metadata from the Hub.

Installation

To get started, install the boincai_hub library:


pip install --upgrade boincai_hub

For more details, check out the installation guide.

Download files

Repositories on the Hub are git version controlled, and users can download a single file or the whole repository. You can use the hf_hub_download() function to download files. This function will download and cache a file on your local disk. The next time you need that file, it will be loaded from your cache, so you don't need to re-download it.

You will need the repository id and the filename of the file you want to download. For example, to download the Pegasus model configuration file:


>>> from boincai_hub import hf_hub_download
>>> hf_hub_download(repo_id="google/pegasus-xsum", filename="config.json")

To download a specific version of the file, use the revision parameter to specify the branch name, tag, or commit hash. If you choose to use the commit hash, it must be the full-length hash instead of the shorter 7-character commit hash:


>>> from boincai_hub import hf_hub_download
>>> hf_hub_download(
...     repo_id="google/pegasus-xsum", 
...     filename="config.json", 
...     revision="4d33b01d79672f27f001f6abade33f22d993b151"
... )
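Note that hf_hub_download() fetches one file at a time. To grab an entire repository instead, a snapshot-style helper is the usual route; the sketch below assumes boincai_hub exposes a snapshot_download() function that takes the same repo_id:

>>> from boincai_hub import snapshot_download  # assumed helper for whole-repo downloads
>>> # Downloads and caches every file in the repository, returning the local folder path
>>> snapshot_download(repo_id="google/pegasus-xsum")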

For more details and options, see the API reference for hf_hub_download().

Login

In a lot of cases, you must be logged in with a BOINC AI account to interact with the Hub: download private repos, upload files, create PRs,… Create an account if you don't already have one, and then sign in to get your User Access Token from your Settings page. The User Access Token is used to authenticate your identity to the Hub.

Once you have your User Access Token, run the following command in your terminal:


boincai-cli login
# or using an environment variable
boincai-cli login --token $BOINCAI_TOKEN

Alternatively, you can programmatically login using login() in a notebook or a script:

>>> from boincai_hub import login
>>> login()

It is also possible to login programmatically without being prompted to enter your token by directly passing the token to login(), like login(token="hf_xxx"). If you do so, be careful when sharing your source code. It is a best practice to load the token from a secure vault instead of saving it explicitly in your codebase/notebook.

You can be logged in to only one account at a time. If you log your machine in to a new account, you will be logged out of the previous one. Make sure to always check which account you are using with the command boincai-cli whoami. If you want to handle several accounts in the same script, you can provide your token when calling each method. This is also useful if you don't want to store any token on your machine.
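For example, here is a minimal sketch, assuming each HfApi method accepts an optional token argument (the repository ids and token strings below are placeholders):

>>> from boincai_hub import HfApi
>>> api = HfApi()
>>> # Placeholder tokens for two different accounts; load real tokens from a secure vault
>>> api.create_repo(repo_id="account-a/demo-model", token="hf_token_a")
>>> api.upload_file(
...     path_or_fileobj="README.md",
...     path_in_repo="README.md",
...     repo_id="account-b/demo-model",
...     token="hf_token_b",
... )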

Once you are logged in, all requests to the Hub (even methods that don't necessarily require authentication) will use your access token by default. If you want to disable implicit use of your token, set the HF_HUB_DISABLE_IMPLICIT_TOKEN environment variable.
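As a minimal sketch, assuming the variable is read when the library is imported, you would set it before the import:

>>> import os
>>> # Assumption: must be set before boincai_hub is imported
>>> os.environ["HF_HUB_DISABLE_IMPLICIT_TOKEN"] = "1"
>>> from boincai_hub import HfApi
>>> api = HfApi()  # unauthenticated calls will no longer send your token implicitly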

Create a repository

Once you've registered and logged in, create a repository with the create_repo() function:

>>> from boincai_hub import HfApi
>>> api = HfApi()
>>> api.create_repo(repo_id="super-cool-model")

If you want your repository to be private, then:


>>> from boincai_hub import HfApi
>>> api = HfApi()
>>> api.create_repo(repo_id="super-cool-model", private=True)

Private repositories will not be visible to anyone except yourself.

To create a repository or to push content to the Hub, you must provide a User Access Token that has the write permission. You can choose the permission when creating the token in your Settings page.

Upload files

Use the upload_file() function to add a file to your newly created repository. You need to specify:

  1. The path of the file to upload.

  2. The path of the file in the repository.

  3. The repository id of where you want to add the file.


>>> from boincai_hub import HfApi
>>> api = HfApi()
>>> api.upload_file(
...     path_or_fileobj="/home/lysandre/dummy-test/README.md",
...     path_in_repo="README.md",
...     repo_id="lysandre/test-model",
... )
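To push a whole local folder in one call, a companion helper to upload_file() is the usual pattern; a hedged sketch, assuming boincai_hub provides upload_folder() (the folder path reuses the example above):

>>> from boincai_hub import HfApi
>>> api = HfApi()
>>> # Uploads every file under the local folder to the root of the repository (assumed API)
>>> api.upload_folder(
...     folder_path="/home/lysandre/dummy-test",
...     repo_id="lysandre/test-model",
... )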

To upload more than one file at a time, take a look at the Upload guide, which will introduce you to several methods for uploading files (with or without git).

Next steps

The boincai_hub library provides an easy way for users to interact with the Hub with Python. To learn more about how you can manage your files and repositories on the Hub, we recommend reading our how-to guides to:

  • Manage your repository.

  • Download files from the Hub.

  • Upload files to the Hub.

  • Search the Hub for your desired model or dataset.

  • Access the Inference API for fast inference.