
AI SDK System Prompts

Store system prompts as agentdocs. Fetch the .md at runtime. Ship a prompt edit without a deploy.

Guide · 2 min read

Why this works

  • Edit without deploying — your app fetches the latest .md.
  • Version history — every change attributed, pinnable, revertible.
  • Team collaboration — product, engineering, and AI teams all edit the same doc.
  • Caching built in — /@you/slug.md is a static-feeling endpoint you can cache at the edge.

Fetch the prompt

Option A — raw markdown (no API key, public docs)

export async function getSystemPrompt(slug: string) {
  const res = await fetch(`https://agentdoc.com/@yourteam/${slug}.md`, {
    next: { revalidate: 60 },
  });
  return res.text();
}
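If agentdoc is briefly unreachable you probably don't want the route to crash. A minimal fallback wrapper, as a sketch (the FALLBACK_PROMPT constant and getSystemPromptSafe name are assumptions, not part of the agentdoc API):

```typescript
// Illustrative default, used only when the fetch fails.
const FALLBACK_PROMPT = 'You are a helpful assistant.';

export async function getSystemPromptSafe(slug: string): Promise<string> {
  try {
    const res = await fetch(`https://agentdoc.com/@yourteam/${slug}.md`);
    if (!res.ok) return FALLBACK_PROMPT;
    return await res.text();
  } catch {
    // Network error: fall back rather than failing the whole request.
    return FALLBACK_PROMPT;
  }
}
```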

Option B — REST API (private docs)

export async function getSystemPrompt(slug: string) {
  const res = await fetch(`https://api.agentdoc.com/v1/docs/${slug}`, {
    headers: { Authorization: `Bearer ${process.env.AGENTDOC_API_KEY}` },
    next: { revalidate: 60 },
  });
  const doc = await res.json();
  return doc.content;
}

Get an API key instantly:

curl -X POST https://api.agentdoc.com/v1/keys \
  -H 'Content-Type: application/json' \
  -d '{"name":"my-app"}'
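Keep the key out of source control; the variable name matches the fetch code above (the sk_ prefix is illustrative):

```shell
# .env.local — never commit this file
AGENTDOC_API_KEY=sk_...
```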

Use with the Vercel AI SDK

import { streamText } from 'ai';
import { anthropic } from '@ai-sdk/anthropic';
import { getSystemPrompt } from '@/lib/prompts';

export async function POST(req: Request) {
  const { messages } = await req.json();
  const system = await getSystemPrompt('customer-support-prompt');

  return streamText({
    model: anthropic('claude-sonnet-4-6'),
    system,
    messages,
  }).toDataStreamResponse();
}

Works identically with the Anthropic SDK (client.messages.create({ system, ... })) or OpenAI (role: 'system').
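For OpenAI-style chat APIs, the fetched prompt simply becomes the first message. A tiny helper sketch (prependSystem is a name invented here, not an SDK function):

```typescript
type ChatMessage = { role: 'system' | 'user' | 'assistant'; content: string };

// Put the fetched prompt at the front of the message list.
export function prependSystem(system: string, messages: ChatMessage[]): ChatMessage[] {
  return [{ role: 'system', content: system }, ...messages];
}
```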

Pin a version in production

export async function getPromptVersion(slug: string, version: number) {
  const res = await fetch(
    `https://api.agentdoc.com/v1/docs/${slug}/versions/${version}`,
    { headers: { Authorization: `Bearer ${process.env.AGENTDOC_API_KEY}` } },
  );
  return (await res.json()).content;
}
  • Staging → getSystemPrompt() (always latest).
  • Production → getPromptVersion(slug, 12) (pinned).
  • Promote by bumping the pinned number.
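The staging/production split above fits in a one-line selector. A sketch (resolvePromptRef is a hypothetical helper; version 12 is illustrative):

```typescript
type PromptRef = { slug: string; version?: number };

// Latest everywhere except production, where the pinned version wins.
export function resolvePromptRef(slug: string, env: string, pinned: number): PromptRef {
  return env === 'production' ? { slug, version: pinned } : { slug };
}
```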

Cache at the edge

import { unstable_cache } from 'next/cache';

const getPrompt = unstable_cache(
  (slug: string) =>
    fetch(`https://agentdoc.com/@yourteam/${slug}.md`).then((r) => r.text()),
  ['system-prompt'],
  { revalidate: 300 }
);
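Outside Next.js, the same idea is a few lines of in-memory TTL caching. A sketch (cachedPrompt is a hypothetical helper, not a library API; the 300-second default mirrors the revalidate window above, and you pass your real fetch as the loader):

```typescript
// Minimal TTL cache for runtimes without Next.js cache helpers.
const promptCache = new Map<string, { value: string; expires: number }>();

export async function cachedPrompt(
  key: string,
  loader: () => Promise<string>,
  ttlMs = 300_000,
): Promise<string> {
  const hit = promptCache.get(key);
  if (hit && hit.expires > Date.now()) return hit.value;
  const value = await loader();
  promptCache.set(key, { value, expires: Date.now() + ttlMs });
  return value;
}
```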

Organize many prompts

agentdoc create "Support Agent Prompt"
agentdoc create "Summarizer Prompt"
agentdoc create "Intent Classifier Prompt"
agentdoc project create "AI Prompts"
agentdoc project assign <prompt-slug> ai-prompts

agentdoc list shows every doc; agentdoc history <slug> shows every edit.