BOINC AI Inference Endpoints
Inference Endpoints offers a secure production solution for easily deploying any Transformers, Sentence-Transformers, or Diffusion model from the Hub on dedicated, autoscaling infrastructure managed by BOINC AI.
A BOINC AI Endpoint is built from a BOINC AI Model Repository. When an Endpoint is created, the service builds image artifacts either from the model you select or from a custom container image you provide. The image artifacts are completely decoupled from the BOINC AI Hub source repositories to ensure the highest levels of security and reliability.
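Once deployed, an Endpoint of this kind is typically invoked over HTTPS with a JSON body and a bearer token. The following is a minimal sketch of assembling such a request in Python; the endpoint URL, token, and the `{"inputs": ...}` payload shape are illustrative assumptions, not a documented contract for this service.

```python
import json
import urllib.request


def build_inference_request(endpoint_url: str, token: str, inputs) -> urllib.request.Request:
    """Assemble a typical JSON-over-HTTPS inference call.

    Assumes the common pattern of a POST with an "inputs" field and a
    bearer token; the exact schema is endpoint-specific.
    """
    payload = json.dumps({"inputs": inputs}).encode("utf-8")
    return urllib.request.Request(
        endpoint_url,
        data=payload,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


# Hypothetical endpoint URL and token, for illustration only.
req = build_inference_request(
    "https://my-endpoint.example.com",
    "api_token_placeholder",
    "I love this product!",
)
# Sending it would then be: urllib.request.urlopen(req)
```

The request object can be sent with `urllib.request.urlopen` or any HTTP client; the response is usually a JSON document whose shape depends on the model's task.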
Inference Endpoints supports all Transformers, Sentence-Transformers, and Diffusion tasks, as well as custom tasks not yet supported by Transformers, such as speaker diarization.
In addition, Inference Endpoints gives you the option to use a custom container image hosted on an external service, for instance, Docker Hub, AWS ECR, Azure ACR, or Google GCR.