BOINC AI Inference Endpoints


🌍 Inference Endpoints offers a secure production solution for easily deploying any 🌍 Transformers, Sentence-Transformers, or Diffusion model from the Hub on dedicated, autoscaling infrastructure managed by BOINC AI.
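Once deployed, an Endpoint is typically queried over HTTPS with a JSON payload. The sketch below only builds such a request; the URL, token, and `{"inputs": ...}` payload shape are placeholders and assumptions, not a confirmed BOINC AI API:

```python
import json

# Placeholders: substitute your real Endpoint URL and access token.
API_URL = "https://my-endpoint.example.com/v1/predict"  # hypothetical URL
API_TOKEN = "my-secret-token"  # hypothetical token

def build_request(text: str):
    """Build headers and a JSON body for a text inference call."""
    headers = {
        "Authorization": f"Bearer {API_TOKEN}",
        "Content-Type": "application/json",
    }
    payload = json.dumps({"inputs": text})
    return headers, payload

headers, payload = build_request("I love this product!")
# Send with, e.g.: requests.post(API_URL, headers=headers, data=payload)
```

Keeping the token in an `Authorization: Bearer` header (rather than in the URL or body) is the common pattern for hosted inference services.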

A BOINC AI Endpoint is built from a BOINC AI Model Repository. When an Endpoint is created, the service creates image artifacts that are either built from the model you select or a custom-provided container image. The image artifacts are completely decoupled from the BOINC AI Hub source repositories to ensure the highest levels of security and reliability.

🌍 Inference Endpoints supports all of the 🌍 Transformers, Sentence-Transformers, and Diffusion tasks, as well as custom tasks not yet supported by 🌍 Transformers, such as speaker diarization and diffusion.
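For custom tasks, services of this kind typically load a user-provided handler class from the model repository and call it with the request payload. Purely as an illustration (the class name `EndpointHandler`, its interface, and the stubbed output are assumptions, not a confirmed BOINC AI convention), a minimal handler sketch might look like:

```python
from typing import Any, Dict, List

class EndpointHandler:
    """Hypothetical custom-task handler; the name and interface are assumed."""

    def __init__(self, model_dir: str = ""):
        # Load model weights and configuration from model_dir here.
        self.model_dir = model_dir

    def __call__(self, data: Dict[str, Any]) -> List[Dict[str, Any]]:
        # Preprocess, run inference, postprocess; stubbed for illustration.
        inputs = data.get("inputs", "")
        return [{"task": "speaker-diarization", "input": inputs}]
```

The service would instantiate the handler once at startup and invoke it per request, so expensive model loading belongs in `__init__`, not `__call__`.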

In addition, 🌍 Inference Endpoints gives you the option to use a custom container image managed on an external service such as Docker Hub, AWS ECR, Azure ACR, or Google GCR.

Figure: Endpoint creation flow
