    Integration Ecosystem

    Works With Your Stack

    OpenAI-compatible API. If your app can call OpenAI, it can call Behest.

    No SDK required. Standard HTTP with JSON — from any language, any framework, any platform.

    LLM Providers

    Smart routing across multiple providers. BYO API keys or use models through Behest.

    Google Gemini (Live Now): Gemini 2.5 Flash — live and routing traffic today

    OpenAI (Coming Soon): GPT-4o, GPT-4.1, o3, and more via smart routing

    Anthropic (Coming Soon): Claude Sonnet, Opus, and Haiku via smart routing

    Mistral (Coming Soon): Mistral Large, Medium, and Small models

    Meta (Llama) (Coming Soon): Llama 4 open-weight models

    Cohere (Coming Soon): Command R+ for enterprise RAG workloads

    Behest exposes an OpenAI-compatible API (POST /v1/chat/completions). Once BYO keys ships, you'll be able to bring your own key for any provider.

    Frontend Frameworks

    CORS handled — call Behest directly from any frontend

    React (Compatible): direct browser calls via the CORS-ready API

    Next.js (Compatible): server components, API routes, or client-side

    Vue (Compatible): Composition API or Options API — just fetch

    Svelte (Compatible): lightweight calls from any Svelte component

    Angular (Compatible): HttpClient or standalone fetch

    Astro (Compatible): server-side or client-island integration

    Remix (Compatible): loaders, actions, or client-side fetch

    React Native (Compatible): mobile apps — same API, same fetch call

    Flutter (Compatible): Dart http package — standard REST call

    Swift / iOS (Compatible): URLSession or Alamofire — any HTTP client

    Kotlin / Android (Compatible): OkHttp, Ktor, or Retrofit — standard REST

    Backend Languages & SDKs

    Any language that can make an HTTP request works with Behest

    Python (Compatible): requests, httpx, or the OpenAI SDK

    Node.js / TypeScript (Compatible): fetch, axios, or the OpenAI SDK

    Go (Compatible): net/http — idiomatic Go HTTP calls

    Ruby (Compatible): Net::HTTP, Faraday, or HTTParty

    Java (Compatible): HttpClient, OkHttp, or Spring WebClient

    C# / .NET (Compatible): HttpClient — standard .NET HTTP

    PHP (Compatible): Guzzle, cURL, or native HTTP

    Rust (Compatible): reqwest or hyper — async HTTP

    Official SDKs for Python and Node.js coming soon.
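    Until the official SDKs land, a plain HTTP call is all Node needs. A minimal sketch of the request shape, assuming only the OpenAI-compatible schema stated above — the instance URL and API key in the usage comment are placeholders, not real values:

```javascript
// Minimal Node 18+ helper for an OpenAI-compatible chat endpoint — no SDK needed.

function buildChatRequest(apiKey, model, messages) {
  // Request shape follows the OpenAI chat-completions schema.
  return {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "Authorization": `Bearer ${apiKey}`,
    },
    body: JSON.stringify({ model, messages }),
  };
}

async function chatCompletion(baseUrl, apiKey, model, messages) {
  const res = await fetch(`${baseUrl}/v1/chat/completions`,
    buildChatRequest(apiKey, model, messages));
  if (!res.ok) throw new Error(`Request failed: ${res.status}`);
  const data = await res.json();
  return data.choices[0].message.content;
}

// Usage (hypothetical instance URL and key):
// const reply = await chatCompletion(
//   "https://your-instance.behest.ai", process.env.BEHEST_API_KEY,
//   "gemini-2.5-flash", [{ role: "user", content: "Hello" }]);
```

    Separating request construction from the network call keeps the payload easy to unit-test, and the same helper works unchanged in browsers that expose fetch.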

    Infrastructure

    Self-hosted in your cloud — deploy on any container platform

    Kubernetes (Compatible): deploy Behest in any K8s cluster

    Docker (Compatible): container-first — a single docker run

    Helm (Compatible): Helm chart for repeatable K8s deploys

    GKE (Compatible): Google Kubernetes Engine — native support

    AWS EKS (Compatible): Amazon Elastic Kubernetes Service

    Terraform (Compatible): provision with infrastructure as code

    ArgoCD (Compatible): GitOps continuous delivery

    Auth Providers

    Multi-tenant authentication with JWT verification

    Supabase (Live Now): JWT verification — fully supported

    Custom JWT (Live Now): any JWT issuer — bring your own tokens

    Auth0 (Coming Soon): Auth0 JWT integration

    Clerk (Coming Soon): Clerk session tokens

    Firebase Auth (Coming Soon): Firebase ID tokens
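    All of these providers converge on one mechanism: a JWT in the Authorization header whose claims Behest verifies server-side. A sketch of what such a token carries — the claim names shown (iss, sub, exp) are standard registered JWT claims; which claims Behest specifically requires isn't stated here, and the token below is hand-built for illustration:

```javascript
// Decode (not verify) a JWT payload to inspect its claims.
// Signature and expiry checks are the server's job — never trust a decode alone.

function decodeJwtPayload(token) {
  // A JWT is three base64url segments: header.payload.signature.
  const [, payload] = token.split(".");
  return JSON.parse(Buffer.from(payload, "base64url").toString("utf8"));
}

// Hand-built, unsigned token (illustrative only — a real one comes from your
// auth provider, e.g. a Supabase session):
const b64 = (obj) => Buffer.from(JSON.stringify(obj)).toString("base64url");
const token = [
  b64({ alg: "HS256", typ: "JWT" }),
  b64({ iss: "https://example.supabase.co", sub: "user-123", exp: 1900000000 }),
  "signature",
].join(".");

console.log(decodeJwtPayload(token).sub); // "user-123"
```

    Decoding like this is useful for debugging which user a request will be attributed to; the actual verification happens inside Behest.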

    How Integration Works

    One HTTP request. That's it.

    Your App → HTTP request → Behest → LLM

    CORS, auth, memory, PII scrubbing, prompt defense, rate limiting, caching — all handled by Behest in the middle.

    fetch — no SDK required
    const response = await fetch("https://your-instance.behest.ai/v1/chat/completions", {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        "Authorization": "Bearer YOUR_API_KEY"
      },
      body: JSON.stringify({
        model: "gemini-2.5-flash",
        messages: [{ role: "user", content: "Hello, world!" }]
      })
    });
    
    const data = await response.json();
    console.log(data.choices[0].message.content);
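    OpenAI-compatible endpoints typically also accept stream: true and reply with server-sent events; whether Behest streams today isn't stated on this page, so treat this as an assumption. If the chunk format matches OpenAI's, reassembling the streamed text looks like this:

```javascript
// Reassemble streamed content from OpenAI-style SSE chunks.
// Each event line carries a JSON delta; "[DONE]" terminates the stream.

function extractDeltas(sseText) {
  const parts = [];
  for (const line of sseText.split("\n")) {
    if (!line.startsWith("data: ")) continue;        // skip blank lines/comments
    const payload = line.slice("data: ".length).trim();
    if (payload === "[DONE]") break;                 // end-of-stream sentinel
    const delta = JSON.parse(payload).choices[0].delta;
    if (delta.content) parts.push(delta.content);
  }
  return parts.join("");
}

// Usage against a captured stream body (sample chunks, not real output):
const sample = [
  'data: {"choices":[{"delta":{"role":"assistant"}}]}',
  'data: {"choices":[{"delta":{"content":"Hello"}}]}',
  'data: {"choices":[{"delta":{"content":", world!"}}]}',
  "data: [DONE]",
].join("\n\n");

console.log(extractDeltas(sample)); // "Hello, world!"
```

    In a real client you would feed this parser incrementally from response.body rather than a complete string, but the per-chunk delta handling is the same.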

    Start integrating in 5 minutes

    No SDK to install. No configuration ceremony. Just an HTTP request with your API key.