Bring your own key (BYOK)

Use your own LLM provider key for billing, model choice, and rate limits.

By default, BestAI uses our own Anthropic key, so you do not need to set anything up to start chatting. If you would prefer to use your own LLM provider account — for example, to keep usage on your own bill, to satisfy a compliance requirement, or to use a different model — you can plug in your own key.

This feature is called BYOK, short for “bring your own key”, and it is available now.

What you can do

  • Use your own API key from one of four providers
  • Choose the default model for each provider
  • Replace, disable, or delete a key without losing your conversations
  • Keep your key encrypted; it is never visible after you save it

Step-by-step

1. Decide whether you need it

Most businesses do not need BYOK. Use this table to decide:

| Situation | Use BYOK? |
| --- | --- |
| Just trying BestAI, no LLM provider account yet | No — use the platform default |
| You already have credit on Anthropic, OpenAI, Groq, or DeepSeek | Optional, but useful |
| Your business policy says AI traffic must run on your own provider | Yes |
| You want a stronger model than the default (for example claude-sonnet-4-5) | Yes |

2. Pick a provider

We support four providers today:

| Provider | Default model | Notes |
| --- | --- | --- |
| Anthropic | claude-haiku-4-5 | Our own default. |
| OpenAI | gpt-4o-mini | Uses the standard chat completions API. |
| Groq | llama-3.1-70b-versatile | Lowest latency. |
| DeepSeek | deepseek-chat | Best price for volume. |

Get the API key from your provider’s console.

3. Add the key in BestAI

Only an Owner can add or replace an LLM key. Admins, Viewers, and Support can see the masked value but cannot change it.

  1. Sign in to the admin.
  2. Open Integrations, then LLM.
  3. Click Add provider (or Replace on an existing row).
  4. Choose your provider.
  5. Paste the API key from your provider’s console.
  6. (Optional) Override the model — for example claude-sonnet-4-5. Leave blank to use the default in step 2.
  7. Click Save.

The key is encrypted on save. From that moment on, the admin only ever shows you a masked version like sk-...abcd. Even our staff cannot read the original value.
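A masked value like sk-...abcd could be produced along these lines. This is a sketch of the display format only, not how BestAI actually stores or masks keys; `mask_key` is a hypothetical name:

```python
def mask_key(key: str) -> str:
    # Illustrative only: keep the short "sk-" style prefix and the last
    # four characters, hiding everything in between.
    prefix = key.split("-", 1)[0] + "-"   # e.g. "sk-"
    return f"{prefix}...{key[-4:]}"
```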

4. Confirm it is working

Send a test message through your chat widget. Then return to Integrations, then LLM, and check that the Last used timestamp has updated. If the provider rejects the key, you will see an error in the admin and your conversations will fall back to a generic error message.

5. Replace, disable, or remove a key

| Action | What it does |
| --- | --- |
| Replace | Saves a new key over the old one. The next chat uses the new key. |
| Disable | Stops using this provider without removing it. |
| Delete | Removes the row entirely. The encrypted value is purged from our database. |

All changes are recorded to your workspace audit log, with provider name and masked key only. The full key never appears anywhere visible.

Common questions

My key has run out or my provider has blocked it. What happens to visitors?

A new visitor message hits your provider, fails, and BestAI shows the visitor a friendly error. We do not silently swap to our platform key. To stay safe, top up before you run out, or keep a second provider configured as a manual backup.
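The behaviour described above can be sketched as follows. This is not BestAI's actual code, just an illustration of the documented contract: one attempt against your configured provider, then a friendly error, with no silent fallback to the platform key:

```python
def handle_visitor_message(message, call_provider):
    # call_provider is any callable that sends the message to your
    # configured LLM provider and raises on failure (e.g. quota exhausted
    # or a revoked key). There is no silent swap to the platform key.
    try:
        return call_provider(message)
    except Exception:
        return "Sorry, the assistant is temporarily unavailable."
```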

Can I use a different key per chatbot?

Not yet. Today, one workspace has one key per provider. Per-chatbot keys are on the roadmap.

If I replace a key, does it cut off conversations in flight?

No. The current chat finishes on the previous key. The next message uses the new key.

I have both Anthropic and OpenAI keys. Which one runs my chats?

The default provider is Anthropic. If you want to force a specific provider for a chatbot, disable the others, or use the model override field.

Where can I see how many tokens or how much my key has been used?

We track input and output tokens per message internally. Per-key billing dashboards are on the roadmap. Until then, check usage and billing in your provider’s own console.
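Until a per-key dashboard exists, you could keep your own tally alongside your chat traffic. A minimal sketch, with hypothetical names (`TokenTally`, `record`), not BestAI's internal tracking:

```python
from collections import defaultdict

class TokenTally:
    """Accumulate input/output token counts per provider."""

    def __init__(self):
        self.totals = defaultdict(lambda: {"input": 0, "output": 0})

    def record(self, provider: str, input_tokens: int, output_tokens: int):
        t = self.totals[provider]
        t["input"] += input_tokens
        t["output"] += output_tokens
```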

Is my key really safe?

Yes. The key is encrypted using a secret only the BestAI server holds, never written to logs, never returned in any API response, and never visible to BestAI staff. Even a database backup is unreadable without the application secret.
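The "never written to logs" guarantee is the kind of thing a logging redaction filter enforces. A minimal sketch of that general technique, assuming `sk-`-prefixed keys; it is not BestAI's actual safeguard:

```python
import logging
import re

class RedactKeys(logging.Filter):
    # Scrub anything that looks like a provider API key (an "sk-" prefix
    # followed by a long token) before the log record is emitted.
    PATTERN = re.compile(r"sk-[A-Za-z0-9_-]{8,}")

    def filter(self, record: logging.LogRecord) -> bool:
        record.msg = self.PATTERN.sub("sk-[REDACTED]", str(record.msg))
        return True
```

Attaching such a filter to every handler means that even an accidental `logger.info(key)` leaves only `sk-[REDACTED]` on disk.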

Next steps