# AI & LLMs

Integrate AI functionality into Fumadocs.
## Docs for LLM
You can make your docs site more AI-friendly with dedicated docs content for large language models.
To begin, create a `getLLMText` function that converts pages into static MDX content.
In Fumadocs MDX, you can do:
```ts
import { source } from '@/lib/source';
import type { InferPageType } from 'fumadocs-core/source';

export async function getLLMText(page: InferPageType<typeof source>) {
  const processed = await page.data.getText('processed');

  return `# ${page.data.title} (${page.url})
${processed}`;
}
```

It requires `includeProcessedMarkdown` to be enabled:
```ts
import { defineDocs } from 'fumadocs-mdx/config';

export const docs = defineDocs({
  docs: {
    postprocess: {
      includeProcessedMarkdown: true,
    },
  },
});
```

## llms.txt
You can generate it from the page tree using the Loader API.
```ts
import { source } from '@/lib/source';
import { llms } from 'fumadocs-core/source';

// cached forever
export const revalidate = false;

export function GET() {
  return new Response(llms(source).index());
}
```

## llms-full.txt
A version of the docs for AIs to read.
```ts
import { source } from '@/lib/source';
import { getLLMText } from '@/lib/get-llm-text';

// cached forever
export const revalidate = false;

export async function GET() {
  const scan = source.getPages().map(getLLMText);
  const scanned = await Promise.all(scan);

  return new Response(scanned.join('\n\n'));
}
```

## *.mdx
Allow AI agents to fetch the content of a page as Markdown/MDX by appending `.mdx` to the end of its path.
Create a route handler that returns the page content, and a rewrite that points to it:
```ts
import { getLLMText } from '@/lib/get-llm-text';
import { source } from '@/lib/source';
import { notFound } from 'next/navigation';

export const revalidate = false;

export async function GET(
  _req: Request,
  { params }: RouteContext<'/llms.mdx/docs/[[...slug]]'>,
) {
  const { slug } = await params;
  const page = source.getPage(slug);
  if (!page) notFound();

  return new Response(await getLLMText(page), {
    headers: {
      'Content-Type': 'text/markdown',
    },
  });
}

export function generateStaticParams() {
  return source.generateParams();
}
```

```ts
import type { NextConfig } from 'next';

const config: NextConfig = {
  async rewrites() {
    return [
      {
        source: '/docs/:path*.mdx',
        destination: '/llms.mdx/docs/:path*',
      },
    ];
  },
};

export default config;
```

## Accept
To serve Markdown content to AI agents instead, you can leverage the `Accept` header.
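As a rough sketch of what this negotiation involves (this is not fumadocs-core's actual implementation; `prefersMarkdown` is a hypothetical helper), the idea is to compare the quality values of `text/markdown` and `text/html` in the `Accept` header:

```ts
// Hypothetical sketch of Accept-header negotiation;
// fumadocs-core's isMarkdownPreferred handles this for you.
function prefersMarkdown(accept: string | null): boolean {
  if (!accept) return false;

  // Return the quality value (q) for an exact media type, or 0 if absent.
  const q = (type: string): number => {
    for (const part of accept.split(',')) {
      const [media, ...params] = part.trim().split(';');
      if (media.trim() === type) {
        const qParam = params.map((p) => p.trim()).find((p) => p.startsWith('q='));
        return qParam ? parseFloat(qParam.slice(2)) : 1;
      }
    }
    return 0;
  };

  return q('text/markdown') > q('text/html');
}
```

A request with `Accept: text/markdown;q=0.9,text/html;q=0.8` would then be routed to the Markdown version, while a typical browser request (which ranks `text/html` first) gets the HTML page.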
```ts
import { NextRequest, NextResponse } from 'next/server';
import { isMarkdownPreferred, rewritePath } from 'fumadocs-core/negotiation';

const { rewrite: rewriteLLM } = rewritePath('/docs{/*path}', '/llms.mdx/docs{/*path}');

export default function proxy(request: NextRequest) {
  if (isMarkdownPreferred(request)) {
    const result = rewriteLLM(request.nextUrl.pathname);

    if (result) {
      return NextResponse.rewrite(new URL(result, request.nextUrl));
    }
  }

  return NextResponse.next();
}
```

## Page Actions
Common page actions for AI; these require `*.mdx` to be implemented first.

```bash
npx @fumadocs/cli add ai/page-actions
```

Use it in your docs page like:
```tsx
// URL to fetch Markdown content: you only need to append `.mdx`
// to the URL if you have `*.mdx` configured.
const markdownUrl = `${page.url}.mdx`;

<div className="flex flex-row gap-2 items-center border-b pt-2 pb-6">
  <LLMCopyButton markdownUrl={markdownUrl} />
  <ViewOptions
    markdownUrl={markdownUrl}
    githubUrl={`https://github.com/${owner}/${repo}/blob/main/content/docs/${page.path}`}
  />
</div>
```

## Ask AI

You can install the AI chat dialog using the Fumadocs CLI.

```bash
npx @fumadocs/cli add ai/openrouter
```

It's automatically configured for OpenRouter using the Vercel AI SDK, with a `/search` tool for AI.

```bash
npx @fumadocs/cli add ai/inkeep
```

It's automatically configured for Inkeep AI using the Vercel AI SDK.
Add your Inkeep API key to environment variables:

```bash
INKEEP_API_KEY="..."
```

Add the component & trigger to the root layout (or anywhere you prefer):
```tsx
import { AISearch, AISearchPanel, AISearchTrigger } from '@/components/ai/search';
import { MessageCircleIcon } from 'lucide-react';
import { cn } from 'fumadocs-ui/utils/cn';
// or import your own button styles
import { buttonVariants } from 'fumadocs-ui/components/ui/button';

export default function RootLayout({ children }: { children: React.ReactNode }) {
  return (
    <html lang="en">
      <body>
        <AISearch>
          <AISearchPanel />
          <AISearchTrigger
            position="float"
            className={cn(
              buttonVariants({
                variant: 'secondary',
                className: 'text-fd-muted-foreground rounded-2xl',
              }),
            )}
          >
            <MessageCircleIcon className="size-4.5" />
            Ask AI
          </AISearchTrigger>
        </AISearch>
        {children}
      </body>
    </html>
  );
}
```

To use your own AI models, update the configuration in `useChat` and the `/api/chat` route.
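For example, here is a minimal sketch of pointing a `/api/chat` route at a different provider with the Vercel AI SDK. The provider, model name, and response helper are assumptions (helper names vary between AI SDK major versions), so treat this as wiring to adapt rather than a drop-in route:

```ts
// Hypothetical /api/chat route using the Vercel AI SDK (v4-style API);
// swap the provider and model for your own.
import { streamText } from 'ai';
import { openai } from '@ai-sdk/openai';

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = streamText({
    model: openai('gpt-4o-mini'), // any model your provider supports
    messages,
  });

  return result.toDataStreamResponse();
}
```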
Note that Fumadocs doesn't provide the AI model; that part is up to you.
Your AI model can use the `llms-full.txt` file generated above, or draw on more diversified sources of information when combined with third-party solutions.
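Since `llms-full.txt` simply joins the per-page outputs of `getLLMText`, a third-party pipeline can split it back into per-page chunks before indexing. A minimal sketch, assuming each page starts with the `# Title (url)` heading emitted above (`splitPages` is a name of my choosing):

```ts
// Split the combined llms-full.txt text back into per-page chunks,
// keyed on the "# Title (url)" heading line that getLLMText emits.
function splitPages(fullText: string): string[] {
  const chunks: string[] = [];

  for (const line of fullText.split('\n')) {
    if (/^# .+ \(.+\)$/.test(line)) {
      chunks.push(line); // a new page starts here
    } else if (chunks.length > 0) {
      chunks[chunks.length - 1] += '\n' + line;
    }
  }

  return chunks;
}
```

Each chunk then maps back to one docs page, which is convenient for embedding or retrieval.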