How-to: Build a Discussion bot based on BLOOM
Webhooks are now publicly available!
Here’s a short guide on how to use BOINC AI Webhooks to build a bot that replies to Discussion comments on the Hub with a response generated by BLOOM, a multilingual language model, using the free Inference API.
First, let’s create a Webhook from your settings.
Input a few target repositories that your Webhook will listen to.
You can put a dummy Webhook URL for now, but defining your webhook will let you look at the events that will be sent to it (and you can replay them, which will be useful for debugging).
Also input a secret, as it makes your Webhook more secure.
Subscribe to Community (PR & discussions) events, as we are building a Discussion bot.
Your Webhook will look like this:
In this guide, we create a separate user account to host a Space and to post comments:
When creating a bot that will interact with other users on the Hub, we ask that you clearly label the account as a "Bot" (see profile screenshot).
The third step is actually to listen to the Webhook events.
An easy way is to use a Space for this. We use the user account we created, but you could do it from your main user account if you wanted to.
The Space’s code is here.
We used NodeJS and TypeScript to implement it, but any language or framework would work equally well. Read more about Docker Spaces here.
The main `server.ts` file is here.
Let’s walk through what happens in this file:
Here, we listen to POST requests made to `/`, and then check that the `X-Webhook-Secret` header is equal to the secret we previously defined (you also need to set the `WEBHOOK_SECRET` secret in your Space’s settings to be able to verify it).
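This check might be sketched as follows, using Node’s built-in `http` module. Only the `/` route, the `X-Webhook-Secret` header, and the `WEBHOOK_SECRET` secret come from the text above; the handler structure itself is illustrative:

```typescript
import { createServer, IncomingMessage, ServerResponse } from "node:http";

// Compare the incoming header against the secret configured in the
// Space's settings. Illustrative helper; both names come from the guide.
function isValidSecret(
  header: string | undefined,
  secret: string | undefined
): boolean {
  return Boolean(secret) && header === secret;
}

const server = createServer((req: IncomingMessage, res: ServerResponse) => {
  if (req.method !== "POST" || req.url !== "/") {
    res.writeHead(404).end();
    return;
  }
  const header = req.headers["x-webhook-secret"] as string | undefined;
  if (!isValidSecret(header, process.env.WEBHOOK_SECRET)) {
    res.writeHead(400, { "Content-Type": "application/json" });
    res.end(JSON.stringify({ error: "incorrect secret" }));
    return;
  }
  // ...handle the Webhook payload (next steps)...
  res.writeHead(200).end("OK");
});

// server.listen(7860); // Docker Spaces expose port 7860 by default
```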
The event’s payload is encoded as JSON. Here, we specify that we will run our Webhook only when:
the event concerns a discussion comment
the event is a creation, i.e. a new comment has been posted
the comment’s content contains `@discussion-bot`, i.e., our bot was just mentioned in a comment.
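The three conditions above can be expressed as a small predicate. The payload field names used here (`event.scope`, `event.action`, `comment.content`) are assumptions about the event shape; replay an event from your Webhook settings to confirm them:

```typescript
// Relevant subset of a Webhook payload. These field names are an
// assumption about the event shape; check a replayed event to confirm.
interface WebhookPayload {
  event: { action: string; scope: string };
  comment?: { content: string };
}

const BOT_MENTION = "@discussion-bot";

function shouldReply(payload: WebhookPayload): boolean {
  return (
    payload.event.scope === "discussion.comment" && // a discussion comment
    payload.event.action === "create" && // that was just created
    (payload.comment?.content ?? "").includes(BOT_MENTION) // mentioning our bot
  );
}
```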
In that case, we will continue to the next step:
This is the coolest part: we call the Inference API for the BLOOM model, prompting it with `PROMPT`, and we get the continuation text, i.e., the part generated by the model.
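A sketch of that call is below. The endpoint URL and the response shape (an array of objects with a `generated_text` field that echoes the prompt) are assumptions to verify against the Inference API docs:

```typescript
// Assumed Inference API endpoint for BLOOM; verify against the docs.
const MODEL_URL =
  "https://api-inference.huggingface.co/models/bigscience/bloom";

// The API typically returns the prompt followed by the generated text,
// so strip the prompt to keep only the continuation.
function extractContinuation(generatedText: string, prompt: string): string {
  return generatedText.startsWith(prompt)
    ? generatedText.slice(prompt.length)
    : generatedText;
}

async function queryBloom(prompt: string): Promise<string> {
  // Requires Node 18+ for the global fetch
  const res = await fetch(MODEL_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ inputs: prompt }),
  });
  // Assumed response shape: [{ generated_text: "<prompt + continuation>" }]
  const output = (await res.json()) as Array<{ generated_text: string }>;
  return extractContinuation(output[0].generated_text, prompt);
}
```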
Finally, we will post it as a reply in the same discussion thread:
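Posting the reply could be sketched as follows. The Hub API route and request body are assumptions to check against the Hub API reference, and `HF_TOKEN` stands for an access token for the bot account, stored as a Space secret:

```typescript
// Build the (assumed) Hub API route for posting a comment in a
// discussion; repoType is e.g. "model", "dataset" or "space".
function commentApiUrl(
  repoType: string,
  repoName: string,
  discussionNum: number
): string {
  return `https://huggingface.co/api/${repoType}s/${repoName}/discussions/${discussionNum}/comment`;
}

async function postReply(
  repoType: string,
  repoName: string,
  discussionNum: number,
  reply: string
): Promise<void> {
  await fetch(commentApiUrl(repoType, repoName, discussionNum), {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      // Access token of the bot account, stored as a Space secret
      Authorization: `Bearer ${process.env.HF_TOKEN}`,
    },
    // Assumed request body; verify the field name in the API reference
    body: JSON.stringify({ comment: reply }),
  });
}
```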
Last but not least, you’ll need to configure your Webhook to send POST requests to your Space.
Let’s first grab our Space’s “direct URL” from the contextual menu. Click on “Embed this Space” and copy the “Direct URL”.
Update your webhook to send requests to that URL: