Overview
Vercel Edge Functions run in V8 isolates distributed across Vercel’s global network. They start in under 50ms, run close to the user, and have no Node.js runtime. The constraints are real: no node:* modules, no native dependencies, no filesystem access. Edge functions pay off when the workload is latency-sensitive, globally distributed, and fits the Web API surface. This page covers when to use them, what the platform actually provides, and how to move work that does not fit.
Choose edge for latency-critical, globally distributed work
Edge functions eliminate the round-trip to a regional data center for work that does not need one. The use cases that justify the runtime constraints are:
- Auth token verification and session cookie reads before page render.
- A/B routing and feature-flag redirects based on header values.
- Geolocation-based redirects or locale selection.
- Request signing or header injection before passing upstream.
- Lightweight JSON responses that depend only on request data.
Do not use edge for database queries, heavy computation, file generation, or anything that calls a Node-only SDK. Those belong in a Node.js function or a server component. See vercel for the runtime selection overview and nextjs-middleware for the middleware variant of this pattern.
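The A/B routing case above needs nothing beyond the Web `Request` API. A minimal sketch of deterministic bucketing — the `ab-id` cookie name, the two-bucket split, and the hash constant are all assumptions, not Vercel APIs:

```ts
// Pick a stable A/B variant from a cookie, using only Web APIs (edge-safe).
function pickVariant(req: Request): "control" | "test" {
  const cookie = req.headers.get("cookie") ?? "";
  const id = /(?:^|;\s*)ab-id=([^;]+)/.exec(cookie)?.[1] ?? "anon";
  // Cheap deterministic hash — enough to split traffic evenly, not cryptographic.
  let h = 0;
  for (const c of id) h = (h * 31 + c.charCodeAt(0)) >>> 0;
  return h % 2 === 0 ? "control" : "test";
}
```

Because the bucket is derived from the cookie rather than stored, the same visitor lands in the same variant on every request with no database round-trip.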
Set the runtime with the exported config object
In Next.js, declare the runtime per route or middleware file.
```ts
// app/api/check-session/route.ts
export const runtime = "edge";

export async function GET(request: Request) {
  const token = request.headers.get("authorization");
  if (!token) return new Response("Unauthorized", { status: 401 });
  const valid = await verifyJwt(token); // must use Web Crypto, not jsonwebtoken
  return Response.json({ valid });
}
```

Routes default to `nodejs` when the `runtime` export is absent. Declare `edge` only when the route passes the constraints check below.
For standalone Vercel functions (outside Next.js), create a file in api/ and export a default handler:
```ts
// api/geo-redirect.ts
import { geolocation } from "@vercel/edge";

export const config = { runtime: "edge" };

export default function handler(req: Request) {
  // geolocation() reads the geo data Vercel attaches to edge requests,
  // avoiding an unsafe cast on the Request object.
  const country = geolocation(req).country ?? "us";
  return Response.redirect(`https://example.com/${country.toLowerCase()}`);
}
```

Know the Web API surface and its limits
Edge functions have access to: fetch, Request, Response, URL, URLSearchParams, Headers, ReadableStream, TextEncoder/TextDecoder, crypto (Web Crypto API), atob/btoa, setTimeout, setInterval, and console.
What is not available: fs, path, child_process, Buffer (use Uint8Array), the full process API (environment variables remain readable via process.env, but the rest of process is absent), node:crypto (use globalThis.crypto), and any native .node addon.
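Replacing Buffer is usually mechanical: TextEncoder/TextDecoder plus atob/btoa cover the common string-to-base64 conversions. A sketch of the two directions:

```ts
// Buffer.from(s).toString("base64") without Buffer (edge-safe).
function toBase64(s: string): string {
  const bytes = new TextEncoder().encode(s);
  let bin = "";
  for (const b of bytes) bin += String.fromCharCode(b);
  return btoa(bin);
}

// Buffer.from(b64, "base64").toString() without Buffer.
function fromBase64(b64: string): string {
  const bin = atob(b64);
  const bytes = Uint8Array.from(bin, (c) => c.charCodeAt(0));
  return new TextDecoder().decode(bytes);
}
```

The intermediate binary-string step matters: passing a UTF-8 string straight to `btoa` throws on non-Latin-1 characters, so encode to bytes first.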
Common trap: jsonwebtoken uses node:crypto internally and will not work on edge. Replace it with jose, a Web Crypto-based JWT library. Similarly, bcrypt is a native addon; keep password hashing on a Node route (where @node-rs/bcrypt or similar works) rather than attempting it on edge.
Keep edge functions under 1 MB; check the bundle
Vercel enforces a compressed size limit on edge function bundles (1 MB on the Hobby plan; paid plans allow more). Import graphs that pull in heavy libraries (a full ORM, a large utility package, an analytics SDK) push past this limit.
Check the bundle after adding new imports:
```bash
ANALYZE=true next build
```

When the edge function is over budget, move the heavy import to a separate Node function and call it from edge via fetch. The overhead of an internal fetch is usually lower than the cost of a failed build or a 500 at runtime. See nextjs-route-handlers for the Node handler pattern.
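The delegation pattern can be sketched as an edge handler that forwards to a Node route over fetch. The `/api/render-pdf` path and the forwarding header are hypothetical, and `fetchImpl` is injectable purely to make the sketch testable:

```ts
type Fetcher = typeof fetch;

// Edge handler that stays lean by forwarding heavy work to a Node function.
export async function delegate(
  req: Request,
  fetchImpl: Fetcher = fetch,
): Promise<Response> {
  const url = new URL(req.url);
  // Same-origin internal call to the Node route; forward the query string
  // so the heavy import lives only in the Node function's bundle.
  const upstream = new URL("/api/render-pdf" + url.search, url.origin);
  return fetchImpl(upstream.toString(), {
    method: "POST",
    headers: { "x-forwarded-by": "edge" },
  });
}
```

The edge bundle now contains only URL plumbing; the ORM or PDF library it would have dragged in stays behind the Node route.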
Handle cold starts correctly; do not warm-up poll
V8 isolate cold starts are fast (under 50ms), but they do happen. Do not write warming cron jobs or ping schedulers for edge functions. The platform handles isolate recycling; warming is counterproductive and wastes function invocations.
What does affect cold start time: the size of the function bundle. A large import graph takes longer to parse. The 1 MB limit is also a cold-start latency hint: lean bundles start faster. Keep edge function files focused on one task, with a minimal import graph.
Test the edge runtime locally before deploying
Vercel Dev runs the Edge runtime locally when the function exports runtime = "edge". Start the dev server with vercel dev (not next dev) to exercise the real runtime constraints before pushing.
```bash
vercel dev
# Edge functions run at http://localhost:3000/api/*
```

next dev runs all functions under Node, which hides edge incompatibilities. A function that passes next dev and fails on deploy usually imports a Node-only module. Use vercel dev for edge function development. See vercel-preview-deployments for the preview environment where edge functions run in production conditions.