Replace bring-your-own-key (BYOK) with a one-click solution in your AI application

Stop asking for AI API keys

We provide every user with a global AI account they can connect to your app with one click. After connection, your app obtains a secure access token to run inference on behalf of the user. 150+ models supported.
Note: we are NOT a key aggregator. Users never have to share any key with us.

BrainLink flow diagram

BrainLink connection UX

You can create your own custom button. Click here to play with the live demo.

Demo GIF

How to integrate

Step 1

Show the BrainLink connect button

Create a button or link pointing to the BrainLink connection page, or use our embeddable button. When clicked, the user is prompted to accept the connection.

import BrainLinkButton from "@brainlink/react-button";
...
<BrainLinkButton appClientId="your-brainlink-app-client-id" />
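
For example, here is a minimal sketch of a React component rendering the embeddable button. The appClientId value is a placeholder you get from your BrainLink app settings; the surrounding component is illustrative.

import BrainLinkButton from "@brainlink/react-button";

export default function ConnectPanel() {
    return (
        <div>
            <p>Connect your AI account to start using the app:</p>
            {/* Replace the placeholder with your app's client ID. */}
            <BrainLinkButton appClientId="your-brainlink-app-client-id" />
        </div>
    );
}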
Step 2

Obtain the user access token

Once the user accepts the connection, they will be redirected back to your application. You can then get the user access token to perform inference calls on their behalf.

import * as BrainLink from "@brainlink/spa-sdk";
...
const userAccessToken = await BrainLink.getUserToken();
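
As an illustrative sketch, a small React hook could wait for the redirect and expose the token to the rest of your app. isConnected and getUserToken are the SDK calls used throughout this guide; the hook itself is just one way you might structure this.

import { useEffect, useState } from "react";
import * as BrainLink from "@brainlink/spa-sdk";

export function useBrainLinkToken() {
    const [token, setToken] = useState(null);

    useEffect(() => {
        // After the user accepts the connection and is redirected back,
        // the SDK reports an active connection and can return the token.
        if (BrainLink.isConnected()) {
            BrainLink.getUserToken().then(setToken);
        }
    }, []);

    return token;
}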
Step 3

Use the access token in your inference requests

Use the user access token to make inference requests to the model of your choice on behalf of the user. You can use any AI SDK by configuring its base URL (BrainLink is OpenAI compatible), or make direct REST API calls (a sketch follows the SDK example below).

import OpenAI from "openai";
import * as BrainLink from "@brainlink/spa-sdk";
...
if (BrainLink.isConnected()) {
    const userAccessToken = await BrainLink.getUserToken();
    const openai = new OpenAI({
        baseURL: "https://www.brainlink.dev/api/v1",
        apiKey: userAccessToken,
        // Required to use the OpenAI SDK in the browser.
        // Safe here: the app only handles the user's access token obtained
        // via BrainLink, never a raw API key.
        dangerouslyAllowBrowser: true,
    });

    const completion = await openai.chat.completions.create({
        model: "google/gemini-2.0-flash-lite-preview-02-05:free",
        messages: messages,
    });
}
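
If you would rather not use an SDK, a plain fetch call works as well. This sketch assumes the standard OpenAI-compatible /chat/completions path under the base URL above, and that you already obtained userAccessToken as shown in Step 2.

const response = await fetch("https://www.brainlink.dev/api/v1/chat/completions", {
    method: "POST",
    headers: {
        "Content-Type": "application/json",
        // The user access token takes the place of an API key.
        "Authorization": `Bearer ${userAccessToken}`,
    },
    body: JSON.stringify({
        model: "google/gemini-2.0-flash-lite-preview-02-05:free",
        messages: messages,
    }),
});
const completion = await response.json();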

Why should you use BrainLink?

Avoid manual configuration

Most non-technical people don't know how to configure API keys or other credentials. Allow your users to connect their inference account with a single click.

Compliant with all the AI providers' policies

Most AI providers discourage using API keys on the client side and strictly prohibit storing users' API keys on servers. We remove API keys from the equation and follow secure OAuth/PKCE flows.

Increased security

Most users don't know how to manage API keys properly. We replace them with secure access tokens issued through high-security OAuth/PKCE flows.

Full model control

Take control of which model to use for each task, without being limited to the models a single provider supports. We support most models available today.

Unified LLM API

Combine multiple models without configuring different inference providers or SDKs; access all of them through a single OpenAI-compatible API.
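
For instance, the same OpenAI-compatible client from Step 3 can target a different model per request. The second model identifier below is purely illustrative.

// One client, many models: only the model string changes per request.
const draft = await openai.chat.completions.create({
    model: "google/gemini-2.0-flash-lite-preview-02-05:free",
    messages: [{ role: "user", content: "Draft a short product tagline." }],
});

const critique = await openai.chat.completions.create({
    model: "another-provider/another-model", // illustrative model ID
    messages: [
        { role: "user", content: "Critique this tagline: " + draft.choices[0].message.content },
    ],
});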

Seamless user experience

Let users top up their inference credits without leaving your app, and show them exactly how much they are consuming.

Get rid of unpredictable inference costs

Forget about complex monetization strategies to cover unpredictable inference costs. Each user pays for what they consume by linking their own account with one click.

Platform agnostic

Whether you are building a web, mobile, or desktop application, BrainLink works in all of them.

Pricing

$0

Totally free for developers
Get started in minutes