Let's solve together the pain points of developing and using AI apps

Our mission is to help developers build sustainable AI apps, agents and tools with improved UX for everyone

How to handle unpredictable inference costs?

The high cost of AI inference makes it impossible for indie devs to build free or inexpensive apps. Why not let each user pay for exactly what they consume in every app, without the friction of paywalls and subscriptions?

Why does each user need to subscribe to every AI app to gain access to the same inference APIs?

We solve this with portable accounts that can be connected to every app with a single click. We call these accounts 'brains'.

How it Works

Brain (account)

Each user has a portable brain that they can link to apps with a single click.

The brain provides your app with features that help you develop and monetize without paywalls. Conversion increases because users don't need to sign up and add payment details to every app separately.

Wallet

Forget about inference costs.

Users can top up their brains and spend credits across all apps. A user's brain automatically covers inference costs on your app, so you don't have to worry about them.

Signup

Frictionless sign-up without forms

Brains are linked to your app with a single click, providing you with the user's information and eliminating the need for extra sign-up forms.

Monetize

Monetize your app without paywalls.

Users are tired of subscriptions. You can set a markup on users' inference spend and/or charge per run. Your app monetizes automatically through the user's brain, eliminating the need for extra paywalls.
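As an illustration of how markup plus per-run pricing adds up (the helper function and numbers below are purely hypothetical, not part of the BrainLink API):

```javascript
// Hypothetical illustration: a run consumes $0.002 of raw inference,
// the app sets a 20% markup plus a flat $0.001 per-run fee. The user's
// brain is charged the total; the app earns the difference.
function chargeForRun(inferenceCostUsd, markupRate, perRunFeeUsd) {
    const markup = inferenceCostUsd * markupRate;
    const totalChargedUsd = inferenceCostUsd + markup + perRunFeeUsd;
    const appRevenueUsd = markup + perRunFeeUsd;
    return { totalChargedUsd, appRevenueUsd };
}

const { totalChargedUsd, appRevenueUsd } = chargeForRun(0.002, 0.2, 0.001);
// totalChargedUsd ≈ 0.0034 (user pays), appRevenueUsd ≈ 0.0014 (app earns)
```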

Distribution channel

Leverage an effective distribution channel

Benefit from the large user base we generate together: every brain holder is just one click away from becoming a user of your app as well.

LLM Proxy

Unified LLM API

Use a growing set of 150+ AI models through a single OpenAI-compatible API.


Brain Linking UX

You can create your own custom button. Click here to play with the live demo.


Frequently Asked Questions

Take a look at the most common questions we receive.

How to integrate

Step 1

Show the brainlink connect button

Create a button or link pointing to the BrainLink connection page, or use our embeddable button. When clicked, the user will be prompted to accept the connection.

import BrainLinkButton from "@brainlink/react-button";
...
<BrainLinkButton appClientId="your-brainlink-app-client-id" />
Step 2

Obtain the user access token

Once the user accepts the connection, they will be redirected back to your application. You can then get the user access token to perform inference calls on behalf of the user.

import * as BrainLink from "@brainlink/spa-sdk";
...
const userAccessToken = await BrainLink.getUserToken();
Step 3

Use the access token on your inference requests

Use the user access token to make inference requests to the model of your choice on behalf of the user. You can use any AI SDK by configuring the baseURL (BrainLink is OpenAI-compatible), or make direct REST API calls.

import OpenAI from "openai";
import * as BrainLink from "@brainlink/spa-sdk";

if (BrainLink.isConnected()) {
    const userAccessToken = await BrainLink.getUserToken();
    const openai = new OpenAI({
        baseURL: "https://www.brainlink.dev/api/v1",
        apiKey: userAccessToken,
        // Required to use the OpenAI SDK in the browser.
        // Safe here: the app uses the user's own access token via BrainLink,
        // so no app-owned API key is exposed to the client.
        dangerouslyAllowBrowser: true,
    });

    const completion = await openai.chat.completions.create({
        model: "google/gemini-2.0-flash-lite-preview-02-05:free",
        messages: messages,
    });
}
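For the direct REST route, a sketch of the same request without any SDK is shown below. It assumes the standard OpenAI `/chat/completions` path and `Authorization: Bearer` header, which follow from the API being OpenAI-compatible; the request builder is split out only to make the shape easy to inspect.

```javascript
// Build a plain OpenAI-style chat-completions request against the
// BrainLink endpoint; the fetch call itself is standard.
function buildChatRequest(userAccessToken, model, messages) {
    return {
        url: "https://www.brainlink.dev/api/v1/chat/completions",
        options: {
            method: "POST",
            headers: {
                "Content-Type": "application/json",
                Authorization: `Bearer ${userAccessToken}`,
            },
            body: JSON.stringify({ model, messages }),
        },
    };
}

const { url, options } = buildChatRequest(
    "user-access-token",
    "google/gemini-2.0-flash-lite-preview-02-05:free",
    [{ role: "user", content: "Hello!" }],
);
// const response = await fetch(url, options);
// const completion = await response.json();
```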
Developer Docs