File uploads are one of those features that seem straightforward until you actually try to build them in Next.js App Router. The App Router introduced Server Actions — a fundamentally different way to handle form submissions, files included. But here's the thing: Server Actions are only one piece of the puzzle. Depending on your file size requirements, hosting setup, and how polished you want the UX to be, you might need Route Handlers, presigned URLs for direct cloud uploads, or a drag-and-drop interface with real progress tracking.
This guide walks through every practical approach to file uploads in the Next.js App Router. We'll build working examples for each method, and I'll help you figure out when to reach for which tool.
Understanding Your Options
Before we write any code, let's lay out the three main approaches and when each one makes sense:
- Server Actions — The simplest approach. Files go through your Next.js server via FormData. Best for small-to-medium files (under 10 MB) where you want minimal setup.
- Route Handlers — API-style endpoints that give you full control over the HTTP request and response. Useful when you need custom headers, streaming responses, or want to support non-form clients.
- Presigned URLs (S3/Cloud Storage) — Files upload directly from the browser to cloud storage, bypassing your server entirely. Essential for large files, progress tracking, and production scalability.
You can also combine these approaches — and honestly, that's what most production apps end up doing. A common pattern uses a Server Action or Route Handler to generate a presigned URL, then the client uploads directly to S3.
Method 1: File Upload with Server Actions
Server Actions are the most ergonomic way to handle file uploads in the App Router. They eliminate the need for API routes entirely — you define an async function marked with "use server" and attach it directly to a form. It's almost suspiciously simple.
Basic Server Action Upload
Start by creating the server action that handles the file:
// app/actions/upload.ts
"use server";
import { writeFile, mkdir } from "fs/promises";
import path from "path";
export async function uploadFile(formData: FormData) {
const file = formData.get("file") as File | null;
if (!file || file.size === 0) {
return { error: "No file provided" };
}
const bytes = await file.arrayBuffer();
const buffer = Buffer.from(bytes);
// Ensure the upload directory exists
const uploadDir = path.join(process.cwd(), "public", "uploads");
await mkdir(uploadDir, { recursive: true });
// Generate a unique filename to prevent overwrites
const uniqueName = `${Date.now()}-${file.name.replace(/[^a-zA-Z0-9.-]/g, "_")}`;
const filePath = path.join(uploadDir, uniqueName);
await writeFile(filePath, buffer);
return { success: true, filename: uniqueName, url: `/uploads/${uniqueName}` };
}
Then create a page that uses this action:
// app/upload/page.tsx
import { uploadFile } from "@/app/actions/upload";
export default function UploadPage() {
return (
<main>
<h1>Upload a File</h1>
<form action={uploadFile}>
<input type="file" name="file" required />
<button type="submit">Upload</button>
</form>
</main>
);
}
That's it. Seriously — that's the complete implementation. When the form submits, Next.js automatically serializes the file into FormData, sends a POST request to the server, and your action receives the file as a standard Web API File object. No API route needed.
Adding Pending State with useActionState
The basic version works, but your users deserve better than a form that just... sits there while uploading. Let's add a loading indicator. Convert the form to a client component and use useActionState:
// app/upload/upload-form.tsx
"use client";
import { useActionState } from "react";
import { uploadFile } from "@/app/actions/upload";
const initialState = { error: "", success: false, url: "" };
export default function UploadForm() {
const [state, action, pending] = useActionState(
async (_prev: typeof initialState, formData: FormData) => {
const result = await uploadFile(formData);
if ("error" in result) return { error: result.error, success: false, url: "" };
return { error: "", success: true, url: result.url };
},
initialState
);
return (
<form action={action}>
<input type="file" name="file" required disabled={pending} />
<button type="submit" disabled={pending}>
{pending ? "Uploading..." : "Upload"}
</button>
{state.error && <p style={{ color: "red" }}>{state.error}</p>}
{state.success && <p>Uploaded to: {state.url}</p>}
</form>
);
}
The pending boolean from useActionState flips to true while the server action is executing. This lets you disable the form and show loading feedback without managing any state manually.
Configuring the Body Size Limit
Here's something that trips up a lot of developers: Server Actions have a default body size limit of 1 MB. For file uploads, you'll almost certainly need to bump this up via the experimental.serverActions.bodySizeLimit option in your next.config.ts:
// next.config.ts
import type { NextConfig } from "next";
const nextConfig: NextConfig = {
  // Note: this option lives under `experimental` in current Next.js versions
  experimental: {
    serverActions: {
      bodySizeLimit: "10mb",
    },
  },
};
export default nextConfig;
Pick a limit that matches your actual requirements. For profile avatars, 5mb is usually plenty. For document uploads, 10mb to 25mb covers most cases. Anything larger than that? Skip ahead to the presigned URL approach.
Method 2: File Upload with Route Handlers
Route Handlers give you a proper API endpoint with full control over the request and response. This is the way to go when your upload client isn't a browser form — maybe it's a mobile app, a CLI tool, or you need to return specific HTTP status codes and headers.
// app/api/upload/route.ts
import { writeFile, mkdir } from "fs/promises";
import path from "path";
import { NextRequest, NextResponse } from "next/server";
export async function POST(request: NextRequest) {
const formData = await request.formData();
const file = formData.get("file") as File | null;
if (!file || file.size === 0) {
return NextResponse.json(
{ error: "No file provided" },
{ status: 400 }
);
}
// Validate file type
const allowedTypes = ["image/jpeg", "image/png", "image/webp", "application/pdf"];
if (!allowedTypes.includes(file.type)) {
return NextResponse.json(
{ error: "File type not allowed" },
{ status: 415 }
);
}
// Validate file size (10 MB)
const maxSize = 10 * 1024 * 1024;
if (file.size > maxSize) {
return NextResponse.json(
{ error: "File too large. Maximum size is 10 MB." },
{ status: 413 }
);
}
const bytes = await file.arrayBuffer();
const buffer = Buffer.from(bytes);
const uploadDir = path.join(process.cwd(), "public", "uploads");
await mkdir(uploadDir, { recursive: true });
const uniqueName = `${Date.now()}-${file.name.replace(/[^a-zA-Z0-9.-]/g, "_")}`;
await writeFile(path.join(uploadDir, uniqueName), buffer);
return NextResponse.json({
success: true,
filename: uniqueName,
url: `/uploads/${uniqueName}`,
});
}
Call this endpoint from a client component using fetch:
// app/upload/route-handler-form.tsx
"use client";
import { useState } from "react";
export default function RouteHandlerForm() {
const [status, setStatus] = useState<string>("");
async function handleSubmit(e: React.FormEvent<HTMLFormElement>) {
e.preventDefault();
const formData = new FormData(e.currentTarget);
setStatus("Uploading...");
const res = await fetch("/api/upload", { method: "POST", body: formData });
const data = await res.json();
if (res.ok) {
setStatus(`Uploaded: ${data.url}`);
} else {
setStatus(`Error: ${data.error}`);
}
}
return (
<form onSubmit={handleSubmit}>
<input type="file" name="file" required />
<button type="submit">Upload</button>
{status && <p>{status}</p>}
</form>
);
}
One thing to keep in mind: Route Handlers don't share the Server Actions body size limit. Instead, you're bound by your runtime or hosting platform. On Vercel, serverless functions cap the request payload at roughly 4.5 MB. Self-hosted deployments can accept larger payloads, but for really big files, presigned URLs are still your best bet.
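If you do self-host and accept bigger payloads, it's worth streaming the request body to disk rather than buffering the whole file in memory with arrayBuffer(). Here's a minimal sketch of that idea — the streamToFile helper name is mine, and it assumes a raw binary request body (not multipart form data):

```typescript
// lib/stream-to-file.ts
import { createWriteStream } from "fs";
import { Readable } from "stream";
import { pipeline } from "stream/promises";

// Pipe a Web ReadableStream (like request.body in a Route Handler)
// to disk without holding the whole file in memory.
export async function streamToFile(
  webStream: ReadableStream<Uint8Array>,
  destPath: string
): Promise<void> {
  // Readable.fromWeb bridges the Web stream to a Node stream;
  // pipeline handles backpressure and cleans up on error.
  await pipeline(Readable.fromWeb(webStream as any), createWriteStream(destPath));
}
```

In a Route Handler you'd call await streamToFile(request.body!, destPath) after validating headers like Content-Type and Content-Length.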
Method 3: Direct-to-Cloud Upload with S3 Presigned URLs
For production apps handling files over a few megabytes, this is the approach I'd recommend. Instead of routing files through your Next.js server, the browser uploads directly to cloud storage. This eliminates server bandwidth bottlenecks, sidesteps payload size limits, and — here's the real win — enables actual upload progress tracking.
How Presigned URLs Work
The flow is straightforward (once you see it):
- Your client requests a presigned URL from your Next.js server (via a Server Action or Route Handler).
- Your server generates a time-limited, signed URL using AWS credentials — the client never sees your secret keys.
- The client uploads the file directly to S3 using an HTTP PUT request to that presigned URL.
- The server can optionally record the uploaded file URL in your database.
Setting Up the Server Action for Presigned URL Generation
First, install the AWS SDK packages:
npm install @aws-sdk/client-s3 @aws-sdk/s3-request-presigner
Then create a server action that generates the presigned upload URL:
// app/actions/s3-upload.ts
"use server";
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";
import { getSignedUrl } from "@aws-sdk/s3-request-presigner";
import crypto from "crypto";
const s3 = new S3Client({
region: process.env.AWS_REGION!,
credentials: {
accessKeyId: process.env.AWS_ACCESS_KEY_ID!,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
},
});
export async function getPresignedUploadUrl(
filename: string,
contentType: string
) {
// Validate content type on the server
const allowedTypes = ["image/jpeg", "image/png", "image/webp", "application/pdf"];
if (!allowedTypes.includes(contentType)) {
return { error: "File type not allowed" };
}
// Generate a unique key to prevent collisions
const fileKey = `uploads/${crypto.randomUUID()}-${filename.replace(/[^a-zA-Z0-9.-]/g, "_")}`;
const command = new PutObjectCommand({
Bucket: process.env.AWS_S3_BUCKET!,
Key: fileKey,
ContentType: contentType,
});
const presignedUrl = await getSignedUrl(s3, command, { expiresIn: 300 });
return {
presignedUrl,
fileKey,
fileUrl: `https://${process.env.AWS_S3_BUCKET}.s3.${process.env.AWS_REGION}.amazonaws.com/${fileKey}`,
};
}
Client Upload with Progress Tracking
So here's an annoying quirk: the fetch API doesn't support upload progress events. To show a real progress bar, you need to reach for XMLHttpRequest. Yeah, XHR in 2026 — but it's the only browser API that gives you native upload progress through xhr.upload.onprogress:
// app/upload/s3-upload-form.tsx
"use client";
import { useState, useRef } from "react";
import { getPresignedUploadUrl } from "@/app/actions/s3-upload";
export default function S3UploadForm() {
const [progress, setProgress] = useState(0);
const [status, setStatus] = useState<"idle" | "uploading" | "done" | "error">("idle");
const [fileUrl, setFileUrl] = useState("");
const abortRef = useRef<XMLHttpRequest | null>(null);
async function handleUpload(e: React.ChangeEvent<HTMLInputElement>) {
const file = e.target.files?.[0];
if (!file) return;
setStatus("uploading");
setProgress(0);
// Step 1: Get presigned URL from the server
const result = await getPresignedUploadUrl(file.name, file.type);
if ("error" in result) {
setStatus("error");
return;
}
// Step 2: Upload directly to S3 with progress tracking
const xhr = new XMLHttpRequest();
abortRef.current = xhr;
xhr.upload.addEventListener("progress", (event) => {
if (event.lengthComputable) {
setProgress(Math.round((event.loaded / event.total) * 100));
}
});
xhr.addEventListener("load", () => {
if (xhr.status >= 200 && xhr.status < 300) {
setStatus("done");
setFileUrl(result.fileUrl);
} else {
setStatus("error");
}
});
xhr.addEventListener("error", () => setStatus("error"));
xhr.open("PUT", result.presignedUrl);
xhr.setRequestHeader("Content-Type", file.type);
xhr.send(file);
}
function handleCancel() {
abortRef.current?.abort();
setStatus("idle");
setProgress(0);
}
return (
<div>
<input
type="file"
onChange={handleUpload}
disabled={status === "uploading"}
/>
{status === "uploading" && (
<div>
<div style={{
width: "100%", background: "#e0e0e0", borderRadius: 4, overflow: "hidden"
}}>
<div style={{
width: `${progress}%`, background: "#0070f3", height: 8,
transition: "width 0.2s ease"
}} />
</div>
<p>{progress}% uploaded</p>
<button onClick={handleCancel}>Cancel</button>
</div>
)}
{status === "done" && <p>Upload complete: {fileUrl}</p>}
{status === "error" && <p style={{ color: "red" }}>Upload failed</p>}
</div>
);
}
This gives you a genuine progress bar that updates in real time as bytes transfer to S3. And users can cancel mid-upload by aborting the XHR request — a small detail that makes a big difference in UX.
Building a Drag-and-Drop Upload Zone
A polished upload experience usually includes drag-and-drop support. The good news? You can build this with native HTML5 drag events — no external libraries needed:
// app/upload/drop-zone.tsx
"use client";
import { useState, useCallback } from "react";
interface DropZoneProps {
onFileSelected: (file: File) => void;
accept?: string;
disabled?: boolean;
}
export default function DropZone({ onFileSelected, accept, disabled }: DropZoneProps) {
const [isDragging, setIsDragging] = useState(false);
const handleDragOver = useCallback((e: React.DragEvent) => {
e.preventDefault();
e.stopPropagation();
if (!disabled) setIsDragging(true);
}, [disabled]);
const handleDragLeave = useCallback((e: React.DragEvent) => {
e.preventDefault();
e.stopPropagation();
setIsDragging(false);
}, []);
const handleDrop = useCallback((e: React.DragEvent) => {
e.preventDefault();
e.stopPropagation();
setIsDragging(false);
if (disabled) return;
const file = e.dataTransfer.files[0];
if (file) onFileSelected(file);
}, [disabled, onFileSelected]);
return (
<div
onDragOver={handleDragOver}
onDragLeave={handleDragLeave}
onDrop={handleDrop}
style={{
border: `2px dashed ${isDragging ? "#0070f3" : "#ccc"}`,
borderRadius: 8,
padding: 40,
textAlign: "center",
background: isDragging ? "#f0f7ff" : "transparent",
cursor: disabled ? "not-allowed" : "pointer",
transition: "all 0.2s ease",
}}
>
<p>{isDragging ? "Drop your file here" : "Drag and drop a file, or click to browse"}</p>
<input
type="file"
accept={accept}
disabled={disabled}
onChange={(e) => {
const file = e.target.files?.[0];
if (file) onFileSelected(file);
}}
style={{ display: "none" }}
id="file-input"
/>
<label htmlFor="file-input" style={{ cursor: "pointer", color: "#0070f3" }}>
Browse files
</label>
</div>
);
}
This component handles both drag-and-drop and traditional file browsing. The visual feedback — border color change and background highlight — gives users a clear signal that the drop zone is active. You can compose this with any of the three upload methods we've covered.
Server-Side Validation with Zod
Client-side validation is nice for the user experience, but let's be real: it can be bypassed entirely. Every Server Action is a public HTTP POST endpoint — anyone with curl can call it directly. You must validate files on the server, no exceptions.
Zod with the zod-form-data library makes this pretty painless:
npm install zod zod-form-data
// app/actions/validated-upload.ts
"use server";
import { zfd } from "zod-form-data";
import { writeFile, mkdir } from "fs/promises";
import path from "path";
import crypto from "crypto";
const MAX_FILE_SIZE = 5 * 1024 * 1024; // 5 MB
const ALLOWED_TYPES = ["image/jpeg", "image/png", "image/webp"] as const;
const uploadSchema = zfd.formData({
file: zfd
.file()
.refine((file) => file.size > 0, "File is required")
.refine((file) => file.size <= MAX_FILE_SIZE, "File must be under 5 MB")
.refine(
(file) => ALLOWED_TYPES.includes(file.type as typeof ALLOWED_TYPES[number]),
"Only JPEG, PNG, and WebP images are allowed"
),
});
export async function uploadValidatedFile(formData: FormData) {
const result = uploadSchema.safeParse(formData);
if (!result.success) {
const errors = result.error.flatten().fieldErrors;
return { error: errors.file?.[0] ?? "Validation failed" };
}
const { file } = result.data;
const bytes = await file.arrayBuffer();
const buffer = Buffer.from(bytes);
const uploadDir = path.join(process.cwd(), "public", "uploads");
await mkdir(uploadDir, { recursive: true });
const ext = file.name.split(".").pop() ?? "bin";
const uniqueName = `${Date.now()}-${crypto.randomUUID()}.${ext}`;
await writeFile(path.join(uploadDir, uniqueName), buffer);
return { success: true, url: `/uploads/${uniqueName}` };
}
The safeParse method returns structured errors instead of throwing, which makes it easy to return field-level error messages to the client. The .refine() calls chain together to validate file presence, size, and MIME type in sequence.
Security Best Practices for File Uploads
File uploads are genuinely one of the most common attack vectors in web applications. I've seen production apps get burned by skipping these steps, so don't treat this section as optional.
Never Trust the File Extension or MIME Type
Both the file extension and the Content-Type header are set by the client and can be spoofed trivially. For images, verify the actual file contents by reading the magic bytes — the first few bytes that identify the real file format:
// lib/validate-image.ts
export function isValidImage(buffer: Buffer): boolean {
  const matches = (bytes: number[], offset = 0) =>
    bytes.every((byte, i) => buffer[offset + i] === byte);
  // JPEG: FF D8 FF
  if (matches([0xff, 0xd8, 0xff])) return true;
  // PNG: 89 50 4E 47 ("\x89PNG")
  if (matches([0x89, 0x50, 0x4e, 0x47])) return true;
  // WebP: "RIFF" at offset 0 plus "WEBP" at offset 8
  // (the RIFF header alone also matches WAV and AVI files)
  if (matches([0x52, 0x49, 0x46, 0x46]) && matches([0x57, 0x45, 0x42, 0x50], 8)) {
    return true;
  }
  return false;
}
Sanitize Filenames
Never use user-provided filenames directly for storage. They might contain path traversal attacks (../../etc/passwd), special characters that break filesystems, or absurdly long names. Always generate your own filename on the server — every code example in this article already does this.
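As a concrete sketch of that rule, here's a small helper that derives a storage-safe name from an untrusted upload filename. The safeFilename name and the 100-character cap are my choices, not from any library:

```typescript
// lib/safe-filename.ts
import crypto from "crypto";

// Strip directory components, whitelist characters, cap the length,
// and prefix a UUID so names never collide or traverse paths.
export function safeFilename(original: string): string {
  const base = original.split(/[\\/]/).pop() || "file";
  const cleaned = base.replace(/[^a-zA-Z0-9.-]/g, "_").slice(0, 100);
  return `${crypto.randomUUID()}-${cleaned}`;
}
```

Feeding it something hostile like "../../etc/passwd" yields a flat name such as "3f2c…-passwd" with no path separators left to traverse.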
Store Uploads Outside the Public Directory in Production
Storing files in public/uploads works fine for local development, but don't do this in production. Use dedicated object storage like S3, Google Cloud Storage, or Cloudflare R2 instead. This prevents your uploads directory from growing unbounded, avoids serving user-uploaded content from the same origin as your app (which mitigates certain XSS risks), and gives you CDN distribution for free.
Authenticate Upload Requests
Remember: every Server Action is a publicly accessible HTTP endpoint. Always verify that the user is authenticated and authorized before processing an upload:
// app/actions/secure-upload.ts
"use server";
import { auth } from "@/lib/auth";
export async function secureUpload(formData: FormData) {
const session = await auth();
if (!session?.user) {
return { error: "Authentication required" };
}
// Proceed with upload...
}
Set Rate Limits
Without rate limiting, a single bot could flood your server with upload requests and either eat through your storage or bring things to a crawl. Apply rate limiting at the middleware level or within the server action itself. Something like @upstash/ratelimit with Redis works well for this in serverless environments.
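To make the idea concrete, here's a minimal fixed-window limiter sketch. It's in-memory, so it only works on a single long-lived server process — that's exactly the gap a Redis-backed library like @upstash/ratelimit fills in serverless deployments. The checkRateLimit name and defaults are mine:

```typescript
// lib/rate-limit.ts

type Window = { count: number; resetAt: number };
const windows = new Map<string, Window>();

// Allow up to `limit` calls per `windowMs` for a given key
// (e.g. a user id or IP). Returns false once the limit is hit.
export function checkRateLimit(key: string, limit = 10, windowMs = 60_000): boolean {
  const now = Date.now();
  const entry = windows.get(key);
  if (!entry || now >= entry.resetAt) {
    // First call, or the previous window expired: start a fresh window
    windows.set(key, { count: 1, resetAt: now + windowMs });
    return true;
  }
  if (entry.count >= limit) return false;
  entry.count += 1;
  return true;
}
```

In a Server Action you'd call checkRateLimit(session.user.id) before touching the file and return an error when it comes back false.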
Choosing the Right Approach
So, which method should you actually use? Here's my take:
- Use Server Actions when files are small (under 10 MB), the upload is part of a form submission, and you want the simplest possible setup.
- Use Route Handlers when you need an API that non-browser clients can consume, you want fine-grained control over HTTP status codes and headers, or you need to process the file as a stream.
- Use Presigned URLs when files could be large (over 10 MB), you need real upload progress tracking, you're deploying on Vercel or another serverless platform with payload limits, or you want to keep your server out of the data path entirely.
For most production applications, the best architecture is a hybrid: use a Server Action to validate the request and generate a presigned URL, then let the browser upload directly to cloud storage. You get server-side validation, minimal server load, progress tracking, and support for files of any size. It's a bit more work upfront, but it scales beautifully.
Handling Multiple File Uploads
Need to accept multiple files at once? Add the multiple attribute to your file input and use formData.getAll() instead of formData.get():
// app/actions/multi-upload.ts
"use server";
import { writeFile, mkdir } from "fs/promises";
import path from "path";
export async function uploadMultipleFiles(formData: FormData) {
const files = formData.getAll("files") as File[];
if (files.length === 0) {
return { error: "No files provided" };
}
const uploadDir = path.join(process.cwd(), "public", "uploads");
await mkdir(uploadDir, { recursive: true });
const results = await Promise.all(
files.map(async (file) => {
if (file.size === 0) return null;
const bytes = await file.arrayBuffer();
const buffer = Buffer.from(bytes);
const uniqueName = `${Date.now()}-${file.name.replace(/[^a-zA-Z0-9.-]/g, "_")}`;
await writeFile(path.join(uploadDir, uniqueName), buffer);
return { filename: uniqueName, url: `/uploads/${uniqueName}` };
})
);
return {
success: true,
files: results.filter(Boolean),
};
}
And the matching form:
<form action={uploadMultipleFiles}>
<input type="file" name="files" multiple />
<button type="submit">Upload All</button>
</form>
When using presigned URLs for multiple files, generate all your URLs upfront and upload files in parallel from the client. Your users will thank you for the speed.
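Fully unbounded parallelism can backfire, though — a 50-file drop would open 50 simultaneous connections. A tiny generic helper caps how many uploads are in flight at once; this is a sketch (the mapWithConcurrency name is mine), where each item would be one presigned-URL upload:

```typescript
// lib/map-with-concurrency.ts

// Run an async mapper over items with at most `limit` in flight at a
// time, preserving input order in the results array.
export async function mapWithConcurrency<T, R>(
  items: T[],
  limit: number,
  fn: (item: T) => Promise<R>
): Promise<R[]> {
  const results: R[] = new Array(items.length);
  let next = 0;
  async function worker(): Promise<void> {
    // Each worker pulls the next unclaimed index until none remain
    while (next < items.length) {
      const index = next;
      next += 1;
      results[index] = await fn(items[index]);
    }
  }
  const workers = Array.from({ length: Math.min(limit, items.length) }, worker);
  await Promise.all(workers);
  return results;
}
```

Something like await mapWithConcurrency(files, 4, uploadOneFile) keeps the browser at four uploads in flight while still finishing much faster than a sequential loop.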
FAQ
What is the maximum file size for Next.js Server Action uploads?
The default maximum body size for Server Actions is 1 MB. You can increase it by setting experimental.serverActions.bodySizeLimit in next.config.ts — for example, "10mb" allows files up to 10 MB. For files larger than about 10-25 MB, you're better off using presigned URLs for direct-to-cloud uploads.
How do I show an upload progress bar in Next.js?
The fetch API doesn't support upload progress events (frustrating, I know). To track real upload progress, use XMLHttpRequest with its xhr.upload.onprogress event. This works best with the presigned URL approach where the browser uploads directly to cloud storage. Server Action uploads only give you a binary pending/done state through useActionState — no granular progress.
Can I upload files to S3 from a Vercel-deployed Next.js app?
Absolutely. Use presigned URLs to upload files directly from the browser to S3, bypassing the Vercel serverless function entirely. This sidesteps Vercel's request payload limit (roughly 4.5 MB) and means the serverless function isn't a bandwidth bottleneck. Your Next.js server only generates the presigned URL — the actual file transfer happens between the browser and S3.
Should I use Server Actions or Route Handlers for file uploads?
Use Server Actions for form-based uploads where simplicity is your priority. Use Route Handlers when you need to support non-browser clients, return specific HTTP status codes, or handle streaming. For production apps dealing with large files, combine either approach with presigned URLs — that way you get the best of both worlds.
How do I validate file uploads securely in Next.js?
Always validate on the server, even if you also validate on the client. Use Zod with zod-form-data to check file size, MIME type, and presence in your Server Action. For images, go a step further and verify the file's magic bytes to confirm the actual format matches the declared MIME type. And never trust user-provided filenames — always generate unique names on the server to prevent path traversal attacks.