Route Handlers are how you build API endpoints in Next.js these days. They replaced the old Pages Router API Routes, and honestly, the upgrade is worth it — you get full control over HTTP semantics using standard Web Request and Response APIs. Whether you're building a public REST API, handling webhooks from Stripe, serving file downloads, or streaming AI responses in real time, Route Handlers are the right tool.
This guide walks through everything you need to build production-ready APIs with Next.js 16 and the App Router. We'll start with basic CRUD and work our way up to streaming, rate limiting, authentication middleware, and more.
What Are Route Handlers?
Route Handlers let you define server-side API endpoints inside the app/ directory using route.ts files. Each file exports functions named after HTTP methods — GET, POST, PUT, PATCH, DELETE, HEAD, and OPTIONS. Next.js automatically maps these exports to the corresponding HTTP methods at that route's URL path.
If you've worked with the Pages Router's API Routes before, the biggest shift is this: Route Handlers use the standard Web Request and Response APIs instead of the Express-style req/res objects. Next.js also provides extended versions — NextRequest and NextResponse — with convenience helpers for cookies, headers, and URL manipulation.
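Because these really are the standard Web APIs, you can experiment with the same primitives in any modern runtime without Next.js at all. Here's a minimal sketch (plain TypeScript, Node 18+) of the Request and Response objects a handler works with — note that `Response.json` is the standard static method that `NextResponse.json` builds on:

```typescript
// Standard Web API primitives — the same objects a Route Handler receives
// and returns. Runs anywhere with fetch globals (Node 18+, Deno, Bun, browsers).
const request = new Request('https://example.com/api/posts?page=2', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ title: 'Hello' }),
});

// Read the URL and query string with the standard URL API
const url = new URL(request.url);
const page = url.searchParams.get('page'); // '2'

// Build a JSON response, much like NextResponse.json does under the hood
const response = Response.json({ ok: true, page }, { status: 201 });
```

`NextRequest` adds conveniences on top of this (like the pre-parsed `nextUrl`), but everything it does is grounded in these standards.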
File Structure and Routing
Route Handlers follow the same file-system routing conventions as pages and layouts. You create a route.ts file inside any folder under app/, and the folder path becomes the URL path:
app/
├── api/
│ ├── users/
│ │ ├── route.ts → GET/POST /api/users
│ │ └── [id]/
│ │ └── route.ts → GET/PUT/DELETE /api/users/:id
│ ├── webhooks/
│ │ └── stripe/
│ │ └── route.ts → POST /api/webhooks/stripe
│ └── health/
│ └── route.ts → GET /api/health
There's one important rule to keep in mind: a route.ts file and a page.tsx file can't exist at the same route segment level. Makes sense when you think about it — a single URL can't serve both a page and an API response.
Your First Route Handler: Basic CRUD
Let's build a complete CRUD API for managing blog posts. This covers all four HTTP methods with proper request parsing and response handling.
List and Create Posts
// app/api/posts/route.ts
import { NextRequest, NextResponse } from 'next/server';
import { db } from '@/lib/db';
import { posts } from '@/lib/db/schema';
// GET /api/posts — List all posts
export async function GET(request: NextRequest) {
const { searchParams } = request.nextUrl;
const page = parseInt(searchParams.get('page') ?? '1', 10);
const limit = parseInt(searchParams.get('limit') ?? '10', 10);
const offset = (page - 1) * limit;
const allPosts = await db
.select()
.from(posts)
.limit(limit)
.offset(offset);
return NextResponse.json({
data: allPosts,
page,
limit,
});
}
// POST /api/posts — Create a new post
export async function POST(request: NextRequest) {
const body = await request.json();
if (!body.title || !body.content) {
return NextResponse.json(
{ error: 'Title and content are required' },
{ status: 400 }
);
}
const [newPost] = await db
.insert(posts)
.values({
title: body.title,
content: body.content,
slug: body.title.toLowerCase().replace(/\s+/g, '-'),
createdAt: new Date(),
})
.returning();
return NextResponse.json(newPost, { status: 201 });
}
Read, Update, and Delete a Single Post
// app/api/posts/[id]/route.ts
import { NextRequest, NextResponse } from 'next/server';
import { db } from '@/lib/db';
import { posts } from '@/lib/db/schema';
import { eq } from 'drizzle-orm';
// GET /api/posts/:id
export async function GET(
request: NextRequest,
{ params }: { params: Promise<{ id: string }> }
) {
const { id } = await params;
const [post] = await db
.select()
.from(posts)
.where(eq(posts.id, parseInt(id)));
if (!post) {
return NextResponse.json(
{ error: 'Post not found' },
{ status: 404 }
);
}
return NextResponse.json(post);
}
// PUT /api/posts/:id
export async function PUT(
request: NextRequest,
{ params }: { params: Promise<{ id: string }> }
) {
const { id } = await params;
const body = await request.json();
const [updated] = await db
.update(posts)
.set({
...body,
updatedAt: new Date(),
})
.where(eq(posts.id, parseInt(id)))
.returning();
if (!updated) {
return NextResponse.json(
{ error: 'Post not found' },
{ status: 404 }
);
}
return NextResponse.json(updated);
}
// DELETE /api/posts/:id
export async function DELETE(
request: NextRequest,
{ params }: { params: Promise<{ id: string }> }
) {
const { id } = await params;
const [deleted] = await db
.delete(posts)
.where(eq(posts.id, parseInt(id)))
.returning();
if (!deleted) {
return NextResponse.json(
{ error: 'Post not found' },
{ status: 404 }
);
}
return NextResponse.json({ message: 'Post deleted' });
}
One thing that trips people up: the params type is Promise<{ id: string }> and must be awaited. This changed in Next.js 15 to support streaming, and it's still required in Next.js 16. If you forget the await, you'll get a runtime error that isn't always obvious.
Working with Request Data
Route Handlers give you several ways to read incoming data depending on the content type. Let's go through each one.
Query Parameters
export async function GET(request: NextRequest) {
const searchParams = request.nextUrl.searchParams;
const query = searchParams.get('q'); // ?q=nextjs
const sort = searchParams.get('sort'); // ?sort=date
const tags = searchParams.getAll('tag'); // ?tag=react&tag=nextjs
// Use the params to build your query...
}
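Keep in mind that `parseInt` returns `NaN` for missing or malformed values, which then silently poisons your pagination math. A small guard helper (hypothetical, with assumed defaults) keeps that honest:

```typescript
// A defensive query-param parser (hypothetical helper): falls back on
// missing or invalid values and clamps to a sane range.
export function parseIntParam(
  value: string | null,
  fallback: number,
  { min = 1, max = 100 }: { min?: number; max?: number } = {}
): number {
  const parsed = Number.parseInt(value ?? '', 10);
  if (Number.isNaN(parsed)) return fallback;
  return Math.min(Math.max(parsed, min), max);
}
```

Usage in a handler: `const limit = parseIntParam(searchParams.get('limit'), 10, { max: 50 });` — a request for `?limit=99999` no longer turns into an unbounded query.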
JSON Body
export async function POST(request: NextRequest) {
const body = await request.json();
// body is typed as `any` — always validate at runtime
}
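Also note that `request.json()` throws on a malformed or empty body, which surfaces as a 500 unless you catch it. A tiny wrapper (hypothetical helper) that turns that into a value you can check lets the handler respond with a proper 400 instead:

```typescript
// request.json() throws a SyntaxError on malformed JSON. This wrapper
// (hypothetical helper) returns null instead so the caller can respond 400.
export async function readJsonBody(request: Request): Promise<unknown | null> {
  try {
    return await request.json();
  } catch {
    return null; // malformed or empty body
  }
}
```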
FormData (File Uploads)
export async function POST(request: NextRequest) {
const formData = await request.formData();
const name = formData.get('name') as string;
const file = formData.get('file') as File;
if (!file) {
return NextResponse.json(
{ error: 'No file provided' },
{ status: 400 }
);
}
const bytes = await file.arrayBuffer();
const buffer = Buffer.from(bytes);
// Save buffer to storage (S3, local filesystem, etc.)
return NextResponse.json({
name: file.name,
size: file.size,
type: file.type,
});
}
Headers and Cookies
import { cookies, headers } from 'next/headers';
import { NextResponse } from 'next/server';
export async function GET() {
const cookieStore = await cookies();
const token = cookieStore.get('session-token');
const headerList = await headers();
const userAgent = headerList.get('user-agent');
const authorization = headerList.get('authorization');
return NextResponse.json({ token: token?.value, userAgent });
}
CORS Configuration
If your Route Handlers serve as a public API consumed by external clients, you'll need to handle Cross-Origin Resource Sharing (CORS). There are two approaches — per-route and global — and which one you pick really depends on how many routes need it.
Per-Route CORS
Export an OPTIONS handler that returns preflight headers, and include the same headers in all other responses:
// app/api/public/route.ts
const corsHeaders = {
'Access-Control-Allow-Origin': 'https://your-frontend.com',
'Access-Control-Allow-Methods': 'GET, POST, PUT, DELETE, OPTIONS',
'Access-Control-Allow-Headers': 'Content-Type, Authorization',
};
export async function OPTIONS() {
return new Response(null, {
status: 204,
headers: corsHeaders,
});
}
export async function GET() {
const data = { message: 'Hello from the API' };
return NextResponse.json(data, {
headers: corsHeaders,
});
}
Global CORS via Middleware
For app-wide CORS, handle it in middleware.ts so every route is covered automatically:
// middleware.ts
import { NextRequest, NextResponse } from 'next/server';
const allowedOrigins = ['https://your-frontend.com'];
export function middleware(request: NextRequest) {
const origin = request.headers.get('origin') ?? '';
const isAllowed = allowedOrigins.includes(origin);
// Handle preflight
if (request.method === 'OPTIONS') {
const response = new NextResponse(null, { status: 204 });
if (isAllowed) {
response.headers.set('Access-Control-Allow-Origin', origin);
response.headers.set('Access-Control-Allow-Methods', 'GET, POST, PUT, DELETE, OPTIONS');
response.headers.set('Access-Control-Allow-Headers', 'Content-Type, Authorization');
}
return response;
}
const response = NextResponse.next();
if (isAllowed) {
response.headers.set('Access-Control-Allow-Origin', origin);
}
return response;
}
export const config = {
matcher: '/api/:path*',
};
A common pitfall (and one I've seen trip up even experienced developers): forgetting the OPTIONS handler. Browsers send a preflight OPTIONS request before POST, PUT, and DELETE requests that have custom headers. Without it, CORS fails silently, and you're left wondering why your fetch calls aren't working.
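If you go the per-route route, repeating the header block in every method gets tedious. A small helper (hypothetical, not a Next.js API) that stamps CORS headers onto any outgoing Response keeps things DRY — note the `Vary: Origin` header, which matters once caches are involved:

```typescript
// Stamps CORS headers onto any Response before it goes out (hypothetical
// helper). Only echoes the origin back when it's on the allowlist.
const ALLOWED_ORIGINS = ['https://your-frontend.com'];

export function withCors(response: Response, origin: string | null): Response {
  if (origin && ALLOWED_ORIGINS.includes(origin)) {
    response.headers.set('Access-Control-Allow-Origin', origin);
    response.headers.set('Vary', 'Origin'); // caches must key on the Origin header
  }
  return response;
}
```

Usage in a handler: `return withCors(Response.json(data), request.headers.get('origin'));`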
Caching and Revalidation
Route Handlers support the same caching controls as pages and layouts. Understanding when your handlers are static versus dynamic is key to getting good performance out of your API.
Static vs. Dynamic Behavior
In Next.js 16, GET handlers default to dynamic (uncached). To make a GET handler static and cached, you need to explicitly opt in:
// app/api/config/route.ts
import { NextResponse } from 'next/server';
// Force static — response is cached at build time
export const dynamic = 'force-static';
export async function GET() {
return NextResponse.json({
version: '1.0.0',
features: ['dark-mode', 'notifications'],
});
}
This is a change from earlier Next.js versions, by the way. The default used to be static for GET handlers that didn't read dynamic data. Now it's the other way around.
Time-Based Revalidation
// Revalidate every 60 seconds
export const revalidate = 60;
export async function GET() {
const data = await fetchExternalApi();
return NextResponse.json(data);
}
On-Demand Revalidation
You can also trigger revalidation from a Route Handler after a mutation or webhook:
// app/api/revalidate/route.ts
import { revalidatePath, revalidateTag } from 'next/cache';
import { NextRequest, NextResponse } from 'next/server';
export async function POST(request: NextRequest) {
const { path, tag, secret } = await request.json();
if (secret !== process.env.REVALIDATION_SECRET) {
return NextResponse.json({ error: 'Invalid secret' }, { status: 401 });
}
if (tag) {
revalidateTag(tag);
} else if (path) {
revalidatePath(path);
}
return NextResponse.json({ revalidated: true, now: Date.now() });
}
Streaming Responses
Route Handlers support streaming out of the box using the Web Streams API. This is where things get really interesting, especially if you're working with AI/LLM integrations where you want to send tokens to the client as they're generated.
Basic Streaming with ReadableStream
// app/api/stream/route.ts
export async function GET() {
const encoder = new TextEncoder();
const stream = new ReadableStream({
async start(controller) {
for (let i = 0; i < 10; i++) {
controller.enqueue(
encoder.encode(`data: Message ${i}\n\n`)
);
await new Promise((r) => setTimeout(r, 500));
}
controller.close();
},
});
return new Response(stream, {
headers: {
'Content-Type': 'text/event-stream',
'Cache-Control': 'no-cache',
Connection: 'keep-alive',
},
});
}
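On the client, `fetch` exposes the response body as a ReadableStream you can read incrementally. Here's a sketch of a parser for the `data:` lines the handler above emits — the event framing (blank-line separators) is part of the Server-Sent Events format:

```typescript
// Reads an SSE-formatted ReadableStream and collects the `data:` payloads
// as they arrive. On the client you'd pass it (await fetch('/api/stream')).body.
export async function readSseMessages(
  stream: ReadableStream<Uint8Array>
): Promise<string[]> {
  const reader = stream.getReader();
  const decoder = new TextDecoder();
  let buffer = '';
  const messages: string[] = [];
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    // SSE events are separated by a blank line
    let idx: number;
    while ((idx = buffer.indexOf('\n\n')) !== -1) {
      const event = buffer.slice(0, idx);
      buffer = buffer.slice(idx + 2);
      for (const line of event.split('\n')) {
        if (line.startsWith('data: ')) messages.push(line.slice(6));
      }
    }
  }
  return messages;
}
```

In a real UI you'd render each message as it arrives rather than collecting them all, but the buffering logic is the same.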
AI/LLM Streaming with the Vercel AI SDK
The Vercel AI SDK is probably the most popular way to handle streaming AI responses in Next.js right now. It pairs a Route Handler on the server with the useChat hook on the client, and it handles all the streaming plumbing for you:
// app/api/chat/route.ts
import { streamText } from 'ai';
import { openai } from '@ai-sdk/openai';
export async function POST(request: Request) {
const { messages } = await request.json();
const result = streamText({
model: openai('gpt-4o'),
system: 'You are a helpful assistant.',
messages,
});
return result.toUIMessageStreamResponse();
}
// app/chat/page.tsx
'use client';
import { useState } from 'react';
import { useChat } from '@ai-sdk/react';
export default function ChatPage() {
const [input, setInput] = useState('');
const { messages, sendMessage } = useChat();
return (
<div>
{messages.map((m) => (
<div key={m.id}>
<strong>{m.role}:</strong>{' '}
{m.parts.map((part, i) =>
part.type === 'text' ? <span key={i}>{part.text}</span> : null
)}
</div>
))}
<form
onSubmit={(e) => {
e.preventDefault();
sendMessage({ text: input });
setInput('');
}}
>
<input value={input} onChange={(e) => setInput(e.target.value)} />
<button type="submit">Send</button>
</form>
</div>
);
}
This pattern works with any AI provider — just swap openai for anthropic, google, or whatever adapter you need.
File Downloads
Serving files from a Route Handler is straightforward. Return a Response with the right headers and you're good:
// app/api/download/[filename]/route.ts
import { readFile } from 'fs/promises';
import path from 'path';
import { NextRequest, NextResponse } from 'next/server';
export async function GET(
request: NextRequest,
{ params }: { params: Promise<{ filename: string }> }
) {
const { filename } = await params;
const filePath = path.join(process.cwd(), 'private-files', filename);
try {
const fileBuffer = await readFile(filePath);
return new Response(fileBuffer, {
headers: {
'Content-Disposition': `attachment; filename="${filename}"`,
'Content-Type': 'application/octet-stream',
'Content-Length': fileBuffer.length.toString(),
},
});
} catch {
return NextResponse.json(
{ error: 'File not found' },
{ status: 404 }
);
}
}
For large files, though, don't read the entire thing into memory. Use Node.js createReadStream and convert it to a Web ReadableStream, or better yet, proxy directly from cloud storage like S3.
Webhook Handling
Route Handlers are the natural choice for receiving webhooks from services like Stripe, GitHub, or Twilio. The critical part — and this is the one thing you absolutely cannot skip — is verifying the webhook signature:
// app/api/webhooks/stripe/route.ts
import Stripe from 'stripe';
import { NextRequest, NextResponse } from 'next/server';
const stripe = new Stripe(process.env.STRIPE_SECRET_KEY!);
export async function POST(request: NextRequest) {
const body = await request.text();
const signature = request.headers.get('stripe-signature')!;
let event: Stripe.Event;
try {
event = stripe.webhooks.constructEvent(
body,
signature,
process.env.STRIPE_WEBHOOK_SECRET!
);
} catch (err) {
console.error('Webhook signature verification failed');
return NextResponse.json(
{ error: 'Invalid signature' },
{ status: 400 }
);
}
switch (event.type) {
case 'checkout.session.completed':
// Fulfill the order
break;
case 'invoice.payment_failed':
// Notify the customer
break;
}
return NextResponse.json({ received: true });
}
A quick heads-up: use request.text() instead of request.json() for webhooks. Signature verification requires the raw body string, not a parsed object. I've seen this mistake cause hours of debugging more than once.
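Stripe's SDK handles the signature math for you, but plenty of services (internal tools, smaller providers) just send an HMAC of the raw body. The check boils down to recomputing the HMAC and comparing in constant time — a generic sketch (header names and signing schemes vary by provider, so check their docs):

```typescript
// Generic HMAC-SHA256 webhook signature verification — a sketch for
// providers without an SDK. Assumes a hex-encoded sha256 HMAC of the raw body.
import { createHmac, timingSafeEqual } from 'node:crypto';

export function verifyWebhookSignature(
  rawBody: string,
  signature: string,
  secret: string
): boolean {
  const expected = createHmac('sha256', secret).update(rawBody).digest('hex');
  const a = Buffer.from(expected, 'utf8');
  const b = Buffer.from(signature, 'utf8');
  // timingSafeEqual throws on length mismatch, so guard first
  return a.length === b.length && timingSafeEqual(a, b);
}
```

The `timingSafeEqual` call matters: a plain `===` comparison can leak information through timing differences.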
Authentication Patterns
Protecting Route Handlers means verifying the caller's identity before doing anything else. Here are two common approaches.
Inline Auth Check
// lib/auth.ts
import { cookies } from 'next/headers';
import { verifyToken } from './jwt';
export async function getAuthUser() {
const cookieStore = await cookies();
const token = cookieStore.get('session-token')?.value;
if (!token) return null;
try {
return await verifyToken(token);
} catch {
return null;
}
}
// app/api/profile/route.ts
import { getAuthUser } from '@/lib/auth';
import { NextResponse } from 'next/server';
export async function GET() {
const user = await getAuthUser();
if (!user) {
return NextResponse.json(
{ error: 'Unauthorized' },
{ status: 401 }
);
}
return NextResponse.json({ user });
}
Reusable Higher-Order Handler
If you've got more than a couple of protected routes (and you probably do), repeating the auth check everywhere gets tedious fast. Here's a wrapper that handles it:
// lib/with-auth.ts
import { NextRequest, NextResponse } from 'next/server';
import { getAuthUser } from './auth';
type AuthHandler = (
request: NextRequest,
context: { params: Promise<Record<string, string>>; user: { id: string; email: string } }
) => Promise<Response>;
export function withAuth(handler: AuthHandler) {
return async (request: NextRequest, context: { params: Promise<Record<string, string>> }) => {
const user = await getAuthUser();
if (!user) {
return NextResponse.json(
{ error: 'Unauthorized' },
{ status: 401 }
);
}
return handler(request, { ...context, user });
};
}
// app/api/profile/route.ts
import { withAuth } from '@/lib/with-auth';
import { NextResponse } from 'next/server';
export const GET = withAuth(async (request, { user }) => {
return NextResponse.json({ user });
});
Rate Limiting
Rate limiting is essential for any public-facing API. Without it, a single client can hammer your endpoints and rack up costs (or just bring things to a crawl). For production, you'll want a distributed store like Redis. But for prototyping and development, an in-memory map does the job.
In-Memory Rate Limiter
// lib/rate-limit.ts
const rateMap = new Map<string, { count: number; resetAt: number }>();
export function rateLimit(
key: string,
limit: number = 10,
windowMs: number = 60_000
): { success: boolean; remaining: number } {
const now = Date.now();
const entry = rateMap.get(key);
if (!entry || now > entry.resetAt) {
rateMap.set(key, { count: 1, resetAt: now + windowMs });
return { success: true, remaining: limit - 1 };
}
if (entry.count >= limit) {
return { success: false, remaining: 0 };
}
entry.count++;
return { success: true, remaining: limit - entry.count };
}
// app/api/generate/route.ts
import { rateLimit } from '@/lib/rate-limit';
import { NextRequest, NextResponse } from 'next/server';
export async function POST(request: NextRequest) {
const ip = request.headers.get('x-forwarded-for')?.split(',')[0]?.trim() ?? 'anonymous';
const { success, remaining } = rateLimit(ip, 5, 60_000);
if (!success) {
return NextResponse.json(
{ error: 'Too many requests' },
{
status: 429,
headers: { 'Retry-After': '60' },
}
);
}
// Process the request...
return NextResponse.json(
{ data: 'result' },
{ headers: { 'X-RateLimit-Remaining': remaining.toString() } }
);
}
Production Rate Limiting with Upstash Redis
For anything going to production — especially on Vercel where serverless instances don't share memory — you need a distributed store:
// lib/rate-limit-redis.ts
import { Ratelimit } from '@upstash/ratelimit';
import { Redis } from '@upstash/redis';
export const rateLimiter = new Ratelimit({
redis: Redis.fromEnv(),
limiter: Ratelimit.slidingWindow(10, '1m'),
analytics: true,
});
// Usage in a route handler
export async function POST(request: NextRequest) {
const ip = request.headers.get('x-forwarded-for')?.split(',')[0]?.trim() ?? 'anonymous';
const { success, limit, remaining, reset } = await rateLimiter.limit(ip);
if (!success) {
return NextResponse.json(
{ error: 'Rate limit exceeded' },
{
status: 429,
headers: {
'X-RateLimit-Limit': limit.toString(),
'X-RateLimit-Remaining': remaining.toString(),
'X-RateLimit-Reset': reset.toString(),
},
}
);
}
// Process the request...
}
The in-memory approach resets when the server restarts and won't work across multiple serverless instances. Keep that in mind before shipping it.
Route Handlers vs. Server Actions
This is a question that comes up a lot, so here's a quick breakdown to help you pick the right tool for each situation:
| Scenario | Use |
|---|---|
| Internal form submissions and mutations | Server Actions |
| Public REST API for external clients | Route Handlers |
| Webhook endpoints (Stripe, GitHub) | Route Handlers |
| Streaming responses (AI chat, SSE) | Route Handlers |
| File downloads | Route Handlers |
| Cached GET endpoints | Route Handlers |
| Mobile app backend | Route Handlers |
| Component-level data mutations | Server Actions |
The golden rule: default to Server Actions for internal mutations from your React components, and use Route Handlers when you need external access, explicit HTTP semantics, caching, or streaming.
And if you need both — say, a Server Action for your web app and a public API for a mobile client — just extract the core logic into a shared data access layer. Both the Server Action and the Route Handler can call into it.
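That shared layer can be as simple as a module both entry points import. A sketch (the in-memory store is a hypothetical stand-in for your real database calls):

```typescript
// lib/posts.ts — shared data access layer. The in-memory store is a
// hypothetical stand-in; swap in your real database queries.
export interface PostInput { title: string; content: string; }
export interface Post extends PostInput { id: number; }

const store: Post[] = [];

export async function createPost(input: PostInput): Promise<Post> {
  const post: Post = { id: store.length + 1, ...input };
  store.push(post);
  return post;
}

// A Server Action and a Route Handler can then both call into it:
//
// app/actions.ts
//   'use server';
//   export async function createPostAction(formData: FormData) {
//     return createPost({
//       title: String(formData.get('title')),
//       content: String(formData.get('content')),
//     });
//   }
//
// app/api/posts/route.ts
//   export async function POST(request: Request) {
//     const body = await request.json();
//     return Response.json(await createPost(body), { status: 201 });
//   }
```

Validation, authorization, and business rules live in the shared layer once, instead of being duplicated per entry point.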
Production Best Practices
1. Validate Request Bodies at Runtime
The request body is always any in TypeScript. Don't trust it. Use a validation library like Zod to enforce schemas:
import { z } from 'zod';
import { NextRequest, NextResponse } from 'next/server';
const CreatePostSchema = z.object({
title: z.string().min(1).max(200),
content: z.string().min(1),
tags: z.array(z.string()).optional(),
});
export async function POST(request: NextRequest) {
const body = await request.json();
const parsed = CreatePostSchema.safeParse(body);
if (!parsed.success) {
return NextResponse.json(
{ error: 'Validation failed', issues: parsed.error.issues },
{ status: 400 }
);
}
// parsed.data is now fully typed
const post = await createPost(parsed.data);
return NextResponse.json(post, { status: 201 });
}
2. Handle Errors Gracefully
Wrap your handler logic in try/catch blocks and return appropriate HTTP status codes. And whatever you do, don't leak internal error messages to clients in production — that's both a security risk and a bad user experience:
export async function GET(request: NextRequest) {
try {
const data = await fetchData();
return NextResponse.json(data);
} catch (error) {
console.error('API error:', error);
return NextResponse.json(
{ error: 'Internal server error' },
{ status: 500 }
);
}
}
3. Use the Edge Runtime When It Fits
For lightweight handlers that don't need Node.js-specific APIs, the Edge Runtime gives you lower latency:
import { NextResponse } from 'next/server';
export const runtime = 'edge';
export async function GET() {
return NextResponse.json({ status: 'healthy', timestamp: Date.now() });
}
Edge handlers run closer to the user and start faster, but they can't use Node.js-specific modules like fs or native database drivers (Web Crypto is available, but Node's crypto module is not). So it's a tradeoff.
4. Keep Handlers Small
Since Route Handlers deploy as serverless functions, keep them focused. Push heavy or long-running work to background jobs, queues, or external services. Serverless functions have execution time limits and connection isolation — they're not the place for persistent connections or long-lived shared state.
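The usual shape for this is accept-then-defer: validate, enqueue, and return 202 immediately. A sketch (the in-memory queue is a hypothetical stand-in for a real job system like SQS, QStash, or BullMQ — it won't survive a serverless cold start):

```typescript
// Accept-then-defer: respond 202 right away and let a worker do the heavy
// lifting. The in-memory queue is a stand-in for a real job system.
import { randomUUID } from 'node:crypto';

type Job = { id: string; payload: unknown };
const queue: Job[] = [];

export function enqueue(payload: unknown): Job {
  const job = { id: randomUUID(), payload };
  queue.push(job);
  return job;
}

export async function POST(request: Request): Promise<Response> {
  const payload = await request.json();
  const job = enqueue(payload); // hand off; don't await the heavy work
  return Response.json({ jobId: job.id, status: 'queued' }, { status: 202 });
}
```

The client polls a status endpoint (or listens on a websocket) for completion, and the handler itself stays well inside its execution limit.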
Frequently Asked Questions
Can I use Route Handlers and Server Actions in the same project?
Absolutely. This is actually the recommended approach. Use Server Actions for internal component mutations and Route Handlers for external APIs, webhooks, and streaming. Extract shared business logic into a data access layer that both can call.
How do I handle CORS in Next.js Route Handlers?
Export an OPTIONS function in your route.ts that returns CORS headers (Access-Control-Allow-Origin, Access-Control-Allow-Methods, Access-Control-Allow-Headers). Include the same headers in all your other method responses too. If you need CORS across your whole app, handle it in middleware.ts instead.
Are Route Handlers cached by default in Next.js 16?
No. Starting in Next.js 15, GET handlers default to dynamic (uncached) rendering. To enable caching, export const dynamic = 'force-static' or set a revalidate value in your route file.
How do I verify Stripe webhook signatures in a Route Handler?
Read the raw body with request.text() (not request.json()), grab the stripe-signature header, and call stripe.webhooks.constructEvent(body, signature, webhookSecret). This ensures the event actually came from Stripe and hasn't been tampered with.
What is the difference between Route Handlers and API Routes?
API Routes belong to the Pages Router (pages/api/) and use Express-style req/res objects. Route Handlers belong to the App Router (app/) and use standard Web Request/Response APIs. If you're starting a new Next.js project, Route Handlers are the way to go.