Building a Next.js application
In this tutorial, we'll build a simple Next.js application that performs sentiment analysis using Transformers.js! Since Transformers.js can run in the browser or in Node.js, you can choose whether you want to perform inference client-side or server-side (we'll show you how to do both). In either case, we will be developing with the new App Router paradigm. The final product will look something like the demo linked below:

Useful links:
Demo site: client-side or server-side
Source code: client-side or server-side
Prerequisites
Client-side inference
Step 1: Initialise the project
Start by creating a new Next.js application using create-next-app:
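A typical invocation is:

```bash
npx create-next-app@latest
```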
On installation, you'll see various prompts. For this demo, we'll be selecting the options shown below:
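The exact prompts vary between create-next-app versions, but a typical sequence looks like the following. The project name and most choices are placeholders; the selections for the `src/` directory and App Router match the paths used in the rest of this tutorial:

```text
What is your project named?  next-client
Would you like to use TypeScript?  No
Would you like to use ESLint?  Yes
Would you like to use Tailwind CSS?  Yes
Would you like to use `src/` directory?  Yes
Would you like to use App Router? (recommended)  Yes
Would you like to customize the default import alias?  No
```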
Step 2: Install and configure Transformers.js
You can install Transformers.js from NPM with the following command:
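Assuming the @xenova/transformers package (the distribution whose Xenova/… model IDs are used throughout this tutorial):

```bash
npm i @xenova/transformers
```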
We also need to update the next.config.js file to ignore node-specific modules when bundling for the browser:
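A minimal configuration along these lines should work. It marks sharp and onnxruntime-node (both Node-only dependencies of Transformers.js) as unresolvable in the browser bundle; the output: 'export' line is an assumption that enables the static export used in Step 4:

```js
/** @type {import('next').NextConfig} */
const nextConfig = {
    // (Optional) Export the app as static HTML so it can be hosted on a static Space.
    output: 'export',

    // Override the default webpack configuration
    webpack: (config) => {
        // Ignore node-specific modules when bundling for the browser
        config.resolve.alias = {
            ...config.resolve.alias,
            "sharp$": false,
            "onnxruntime-node$": false,
        };
        return config;
    },
};

module.exports = nextConfig;
```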
Next, we'll create a new Web Worker script where we'll place all ML-related code. This is to ensure that the main thread is not blocked while the model is loading and performing inference. For this application, we'll be using Xenova/distilbert-base-uncased-finetuned-sst-2-english, a ~67M parameter model finetuned on the Stanford Sentiment Treebank dataset. Add the following code to ./src/app/worker.js:
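A sketch of the worker, again assuming the @xenova/transformers package. It lazily constructs a text-classification pipeline the first time a message arrives, forwards loading progress to the main thread, and posts the result when inference completes:

```js
import { pipeline, env } from '@xenova/transformers';

// Skip the check for local model files, since we only load models from the Hub.
env.allowLocalModels = false;

// Use the Singleton pattern to enable lazy construction of the pipeline.
class PipelineSingleton {
    static task = 'text-classification';
    static model = 'Xenova/distilbert-base-uncased-finetuned-sst-2-english';
    static instance = null;

    static async getInstance(progress_callback = null) {
        if (this.instance === null) {
            this.instance = pipeline(this.task, this.model, { progress_callback });
        }
        return this.instance;
    }
}

// Listen for messages from the main thread.
self.addEventListener('message', async (event) => {
    // Retrieve the classification pipeline. When called for the first time,
    // this will load the pipeline and save it for future use.
    const classifier = await PipelineSingleton.getInstance((x) => {
        // Forward progress updates (e.g., model download progress) to the main thread.
        self.postMessage(x);
    });

    // Perform the classification.
    const output = await classifier(event.data.text);

    // Send the result back to the main thread.
    self.postMessage({
        status: 'complete',
        output: output,
    });
});
```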
Step 3: Design the user interface
We'll now modify the default ./src/app/page.js file so that it connects to our worker thread. Since we'll only be performing in-browser inference, we can opt in to Client Components using the 'use client' directive.
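The details of your page will differ, but the worker wiring looks roughly like this (a sketch, assuming the worker file above lives at ./src/app/worker.js; the state and UI are filled in over the next few steps):

```jsx
'use client';

import { useState, useEffect, useRef, useCallback } from 'react';

export default function Home() {
  // Create a reference to the worker object.
  const worker = useRef(null);

  // Set up the worker as soon as the component mounts.
  useEffect(() => {
    if (!worker.current) {
      // Create the worker if it does not yet exist.
      worker.current = new Worker(new URL('./worker.js', import.meta.url), {
        type: 'module',
      });
    }

    // Callback function for messages received from the worker.
    const onMessageReceived = (e) => {
      // TODO: filled in below
    };

    // Attach the callback as an event listener.
    worker.current.addEventListener('message', onMessageReceived);

    // Clean up the listener when the component unmounts.
    return () => worker.current.removeEventListener('message', onMessageReceived);
  });

  // Helper function to send text to the worker for classification.
  const classify = useCallback((text) => {
    if (worker.current) {
      worker.current.postMessage({ text });
    }
  }, []);

  return (
    <main>
      {/* UI added in Step 3 below */}
    </main>
  );
}
```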
Initialise the following state variables at the beginning of the Home component:
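For example (the names are suggestions; they are referenced again in the UI below):

```jsx
  // Model loading state: null until loading starts, then false/true.
  const [ready, setReady] = useState(null);

  // Latest classification result returned by the worker.
  const [result, setResult] = useState(null);
```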
and fill in the onMessageReceived function to update these variables when the worker thread sends a message:
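One way to write it, matching the status values posted by the worker sketch above:

```jsx
    const onMessageReceived = (e) => {
      switch (e.data.status) {
        case 'initiate':
          // Model files have started loading.
          setReady(false);
          break;
        case 'ready':
          // The pipeline is ready to classify.
          setReady(true);
          break;
        case 'complete':
          // Inference finished: store the top prediction.
          setResult(e.data.output[0]);
          break;
      }
    };
```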
Finally, we can add a simple UI to the Home component, consisting of an input textbox and a preformatted text element to display the classification result:
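A minimal version of that UI (the Tailwind classes are illustrative and can be adjusted or removed) might look like:

```jsx
  return (
    <main className="flex min-h-screen flex-col items-center justify-center p-12">
      <h1 className="text-5xl font-bold mb-2 text-center">Transformers.js</h1>
      <h2 className="text-2xl mb-4 text-center">Next.js template</h2>

      <input
        className="w-full max-w-xs p-2 border border-gray-300 rounded mb-4"
        type="text"
        placeholder="Enter text here"
        onInput={(e) => classify(e.target.value)}
      />

      {ready !== null && (
        <pre className="bg-gray-100 p-2 rounded">
          {!ready || !result ? 'Loading...' : JSON.stringify(result, null, 2)}
        </pre>
      )}
    </main>
  );
```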
You can now run your application using the following command:
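Assuming the default scripts generated by create-next-app:

```bash
npm run dev
```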
Visit the URL shown in the terminal (e.g., http://localhost:3000/) to see your application in action!
(Optional) Step 4: Build and deploy
To build your application, simply run:
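With the default scripts, that is:

```bash
npm run build
```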
This will bundle your application and output the static files to the out folder.
For this demo, we will deploy our application as a static BOINC AI Space, but you can deploy it anywhere you like! If you haven't already, you can create a free BOINC AI account here.
Visit https://boincai.com/new-space and fill in the form. Remember to select "Static" as the space type.
Click the "Create space" button at the bottom of the page.
Go to "Files" → "Add file" → "Upload files". Drag the files from the out folder into the upload box and click "Upload". After they have uploaded, scroll down and click the "Commit changes to main" button.
That's it! Your application should now be live at https://boincai.com/spaces/<your-username>/<your-space-name>!
Server-side inference
While there are many different ways to perform server-side inference, the simplest (which we will discuss in this tutorial) is using the new Route Handlers feature.
Step 1: Initialise the project
Start by creating a new Next.js application using create-next-app:
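As before:

```bash
npx create-next-app@latest
```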
On installation, you'll see various prompts. For this demo, we'll be selecting the options shown below:
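The prompt sequence mirrors the client-side setup; the project name below is just a placeholder, and the `src/` directory and App Router selections again match the paths used later in this section:

```text
What is your project named?  next-server
Would you like to use TypeScript?  No
Would you like to use ESLint?  Yes
Would you like to use Tailwind CSS?  Yes
Would you like to use `src/` directory?  Yes
Would you like to use App Router? (recommended)  Yes
Would you like to customize the default import alias?  No
```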
Step 2: Install and configure Transformers.js
You can install Transformers.js from NPM with the following command:
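Again assuming the @xenova/transformers package:

```bash
npm i @xenova/transformers
```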
We also need to update the next.config.js file to prevent Webpack from bundling certain packages:
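A configuration along these lines tells Next.js not to bundle the Node-only packages that Transformers.js depends on; the output: 'standalone' line is an assumption that produces a self-contained build, which is convenient for the Docker deployment in Step 4:

```js
/** @type {import('next').NextConfig} */
const nextConfig = {
    // (Optional) Produce a standalone build for Docker deployment.
    output: 'standalone',
    experimental: {
        // Keep these packages external so webpack does not try to bundle them.
        serverComponentsExternalPackages: ['sharp', 'onnxruntime-node'],
    },
};

module.exports = nextConfig;
```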
Next, let's set up our Route Handler. We can do this by creating two files in a new ./src/app/classify/ directory:
pipeline.js - to handle the construction of our pipeline.
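A sketch of pipeline.js, assuming the @xenova/transformers package. The singleton is attached to the global object in development so that hot reloading does not construct a new pipeline on every change:

```js
import { pipeline } from '@xenova/transformers';

// Use the Singleton pattern to enable lazy construction of the pipeline.
// The class is wrapped in a factory so it can be created once and cached.
const P = () => class PipelineSingleton {
    static task = 'text-classification';
    static model = 'Xenova/distilbert-base-uncased-finetuned-sst-2-english';
    static instance = null;

    static async getInstance(progress_callback = null) {
        if (this.instance === null) {
            this.instance = pipeline(this.task, this.model, { progress_callback });
        }
        return this.instance;
    }
};

let PipelineSingleton;
if (process.env.NODE_ENV !== 'production') {
    // In development, preserve the pipeline across hot reloads.
    if (!global.PipelineSingleton) {
        global.PipelineSingleton = P();
    }
    PipelineSingleton = global.PipelineSingleton;
} else {
    PipelineSingleton = P();
}

export default PipelineSingleton;
```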
route.js - to process requests made to the /classify route.
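And a sketch of route.js, which exposes a GET handler that classifies the text query parameter:

```js
import { NextResponse } from 'next/server';
import PipelineSingleton from './pipeline.js';

export async function GET(request) {
    const text = request.nextUrl.searchParams.get('text');
    if (!text) {
        return NextResponse.json({ error: 'Missing text parameter' }, { status: 400 });
    }

    // Get the classification pipeline. When called for the first time,
    // this will load the pipeline and cache it for future use.
    const classifier = await PipelineSingleton.getInstance();

    // Perform the classification and return the result as JSON.
    const result = await classifier(text);
    return NextResponse.json(result);
}
```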
Step 3: Design the user interface
We'll now modify the default ./src/app/page.js file to make requests to our newly-created Route Handler.
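A sketch of the updated page: it keeps the same input/output UI as the client-side version, but calls the /classify endpoint instead of a worker (the Tailwind classes are illustrative):

```jsx
'use client';

import { useState } from 'react';

export default function Home() {
  // Classification result returned by the /classify route.
  const [result, setResult] = useState(null);

  // Query the Route Handler with the current input text.
  const classify = async (text) => {
    if (!text) return;
    const response = await fetch(`/classify?text=${encodeURIComponent(text)}`);
    const json = await response.json();
    setResult(json);
  };

  return (
    <main className="flex min-h-screen flex-col items-center justify-center p-12">
      <h1 className="text-5xl font-bold mb-2 text-center">Transformers.js</h1>
      <h2 className="text-2xl mb-4 text-center">Next.js template (server-side)</h2>

      <input
        className="w-full max-w-xs p-2 border border-gray-300 rounded mb-4"
        type="text"
        placeholder="Enter text here"
        onInput={(e) => classify(e.target.value)}
      />

      {result && (
        <pre className="bg-gray-100 p-2 rounded">
          {JSON.stringify(result, null, 2)}
        </pre>
      )}
    </main>
  );
}
```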
You can now run your application using the following command:
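Assuming the default scripts generated by create-next-app:

```bash
npm run dev
```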
Visit the URL shown in the terminal (e.g., http://localhost:3000/) to see your application in action!
(Optional) Step 4: Build and deploy
For this demo, we will build and deploy our application to BOINC AI Spaces. If you haven't already, you can create a free BOINC AI account here.
Create a new Dockerfile in your project's root folder. You can use our example Dockerfile as a template.
Visit https://boincai.com/new-space and fill in the form. Remember to select "Docker" as the space type (you can choose the "Blank" Docker template).
Click the "Create space" button at the bottom of the page.
Go to "Files" → "Add file" → "Upload files". Drag the files from your project folder (excluding node_modules and .next, if present) into the upload box and click "Upload". After they have uploaded, scroll down and click the "Commit changes to main" button.
Add the following lines to the top of your README.md:
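The important settings are the Docker SDK and the port your Next.js server listens on; the other metadata fields are placeholders you can adjust:

```yaml
---
title: Next Server Example App
emoji: 🔥
colorFrom: yellow
colorTo: red
sdk: docker
pinned: false
app_port: 3000
---
```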
That's it! Your application should now be live at https://boincai.com/spaces/<your-username>/<your-space-name>!