# v0 Prompt: Behest Chat UI
For Vercel v0. v0 is strongest at UI generation — use it to produce the chat surface, then hand the Behest integration to a backend-capable tool (Cursor, Claude) or do it yourself.
## Two-step workflow
v0 shines at pixel-perfect React + Tailwind + shadcn/ui. It's weaker at server endpoints and env wiring. So:
- Use v0 to scaffold the chat UI + hook signature.
- Add the Behest backend + token route yourself, following the Next.js quickstart.
## Prompt (step 1 — UI only)
Paste into v0:

```text
SECURITY NON-NEGOTIABLES (do not deviate)
- This component must NOT import fetch, @behest/client-ts, OpenAI SDK, or any network code. IO goes through the onSend() prop only.
- Do NOT invent a token endpoint here; the parent page owns that. The parent will derive user_id from its verified session and mint on the server.
- Do NOT put BEHEST_KEY anywhere in this component or reference it from env. The key is server-only.
Build a chat UI called BehestChat with these requirements:
PROPS
- threadId: string
- onSend(message: string, opts: { threadId: string; signal: AbortSignal }): AsyncIterable<string>
(Returns an async iterable of string deltas. Caller handles the Behest SDK.)
UI
- Two-column layout, 280px sidebar + flex main.
- Sidebar:
- "+ New chat" button at top.
- List of threads: title, timestamp, active highlight.
- Accept a `threads` prop: { id, title, last_message_at }[]
- Accept `activeThreadId` + onSelect(threadId) + onDelete(threadId).
- Main:
- Message list, user bubbles right-aligned, assistant left.
- Smooth streaming — render deltas as they arrive.
- Typing indicator ("...") before first delta.
- Stop button shown while a stream is active.
- Input at bottom: shadcn Textarea, Enter to send, Shift-Enter for newline.
- Disabled while a stream is in-flight (except the Stop button).
STATE MANAGEMENT
- Use useState + useRef. No external state lib.
- Messages = { id, role: "user" | "assistant", content: string, createdAt }[]
- When user submits:
1. Add user message.
2. Add empty assistant message.
3. Call onSend(text, { threadId, signal }).
4. For each delta, append to the last assistant message.
5. Catch AbortError quietly. Other errors: render inline "Something went wrong" with retry.
STYLING
- shadcn/ui primitives.
- Clean, minimal, dark-mode friendly (use CSS variables, not hard-coded colors).
- Messages max-width 720px centered.
- Autoscroll to bottom as streaming deltas arrive, but respect user scroll position — if they scroll up, don't yank them back down.
DO NOT
- Include any network code. The chat component must not import fetch, OpenAI SDK, or @behest/client-ts. All IO goes through onSend().
- Add features not listed (attachments, voice, settings page).
Output:
- src/components/BehestChat.tsx
- src/components/MessageList.tsx
- src/components/ThreadSidebar.tsx
- src/components/ChatInput.tsx
```
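For reference, this is roughly how the generated component is expected to drive that contract from the inside. It's a sketch, not part of the prompt; `setMessages`, `setError`, and `abortRef` are illustrative names that v0 will replace with its own:

```tsx
// Illustrative consumption of onSend() inside the component.
const abortRef = useRef<AbortController | null>(null);

async function handleSubmit(text: string) {
  const controller = new AbortController();
  abortRef.current = controller; // the Stop button calls abortRef.current?.abort()

  // Steps 1-2: append the user message plus an empty assistant placeholder.
  setMessages((prev) => [
    ...prev,
    { id: crypto.randomUUID(), role: "user", content: text, createdAt: Date.now() },
    { id: crypto.randomUUID(), role: "assistant", content: "", createdAt: Date.now() },
  ]);

  try {
    // Steps 3-4: stream deltas into the trailing assistant message.
    for await (const delta of onSend(text, { threadId, signal: controller.signal })) {
      setMessages((prev) => {
        const last = prev[prev.length - 1];
        return [...prev.slice(0, -1), { ...last, content: last.content + delta }];
      });
    }
  } catch (err) {
    // Step 5: a Stop click surfaces as an abort; swallow it, surface anything else.
    if ((err as Error)?.name !== "AbortError") setError("Something went wrong");
  }
}
```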
## Step 2 — wire Behest
Once v0 gives you the UI, add the Behest integration in your Next.js app:
### app/api/behest/token/route.ts
```ts
import { auth } from "@/auth";
import { NextResponse } from "next/server";
import { Behest } from "@behest/client-ts";

const behest = new Behest(); // reads BEHEST_KEY + BEHEST_BASE_URL from env

export async function POST() {
  const session = await auth();
  if (!session?.user?.id) {
    return new NextResponse("Unauthorized", { status: 401 });
  }

  try {
    const { token, ttl, sessionId, expiresAt } = await behest.auth.mint({
      user_id: session.user.id,
      // tier should be read from your billing system, not hardcoded.
      tier: (session.user as any).plan ?? 1,
      ttl: 900,
    });
    return NextResponse.json({ token, ttl, sessionId, expiresAt });
  } catch (err) {
    return NextResponse.json({ error: String(err) }, { status: 500 });
  }
}
```
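With the route in place, a quick sanity check from the devtools console of a signed-in tab (an ad-hoc check, not something the quickstart prescribes) should return a token bundle:

```ts
// Paste into the browser console while signed in; the session cookie
// rides along automatically on same-origin requests.
const res = await fetch("/api/behest/token", { method: "POST" });
console.log(res.status, await res.json()); // 200 and { token, ttl, sessionId, expiresAt }
```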
### Parent page — wires v0's UI to Behest (browser uses OpenAI SDK directly)

```tsx
"use client";

import { useState } from "react";
import OpenAI from "openai";
import { BehestChat } from "@/components/BehestChat";

type TokenBundle = {
  token: string;
  ttl: number;
  sessionId: string;
  expiresAt: number;
};

// Module-level cache; refreshed 60 seconds before the token expires.
let cached: TokenBundle | null = null;

async function getToken(): Promise<TokenBundle> {
  const now = Math.floor(Date.now() / 1000);
  if (cached && cached.expiresAt - now > 60) return cached;
  cached = await (await fetch("/api/behest/token", { method: "POST" })).json();
  return cached!;
}

export default function ChatPage() {
  const [threadId] = useState(() => crypto.randomUUID());

  async function* onSend(
    message: string,
    { threadId, signal }: { threadId: string; signal: AbortSignal }
  ) {
    const { token, sessionId } = await getToken();
    const openai = new OpenAI({
      apiKey: token, // short-lived Behest token, never the BEHEST_KEY
      baseURL: `${process.env.NEXT_PUBLIC_BEHEST_BASE_URL}/v1`,
      dangerouslyAllowBrowser: true,
      defaultHeaders: { "X-Session-Id": sessionId, "X-Thread-Id": threadId },
    });
    const stream = await openai.chat.completions.create(
      // If your Behest gateway doesn't pin a model server-side, add a
      // `model` field here; the OpenAI SDK's types expect one.
      { messages: [{ role: "user", content: message }], stream: true },
      { signal }
    );
    for await (const chunk of stream) {
      yield chunk.choices[0]?.delta?.content ?? "";
    }
  }

  return (
    <BehestChat
      threadId={threadId}
      threads={[]}
      onSend={onSend}
      activeThreadId={threadId}
      onSelect={() => {}}
      onDelete={() => {}}
    />
  );
}
```

Add env vars:

```bash
BEHEST_KEY=behest_sk_live_...
BEHEST_BASE_URL=https://amber-fox-042.behest.app
NEXT_PUBLIC_BEHEST_BASE_URL=https://amber-fox-042.behest.app
```
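The `getToken()` helper refreshes 60 seconds before expiry, but a long-lived tab can still race the boundary. One guard, assuming Behest surfaces a stale token as a 401 through the OpenAI SDK's `APIError`, is to drop the cache and retry once:

```ts
// Hypothetical retry wrapper: on a 401 (stale token), clear the cache so
// getToken() mints a fresh bundle, then retry the request once.
async function withFreshToken<T>(run: () => Promise<T>): Promise<T> {
  try {
    return await run();
  } catch (err) {
    if (err instanceof OpenAI.APIError && err.status === 401) {
      cached = null;
      return await run();
    }
    throw err;
  }
}
```

In `onSend()` above, `run` would need to cover both the `getToken()` call and the completion request so the retry picks up the fresh bundle.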
## Why the split
v0 produces excellent UI but often mixes frontend code with invented server patterns. Keeping the contract at `onSend(): AsyncIterable<string>` gives v0 a clean boundary, and the Behest wiring is a few dozen lines of boilerplate that the Next.js quickstart already documents.
## See also
- Next.js App Router quickstart
- Streaming UI
- Cursor prompt — better for end-to-end generation