Edge Functions and the Future of Serverless: Modern Architecture in 2025
Hello HaWkers, the serverless world has evolved significantly in recent years, and 2025 marks the consolidation of Edge Functions as the standard for high-performance applications. If you're not yet familiar with edge computing or want to understand how to leverage this technology, this article will guide you through the most important concepts and practices.
What exactly are Edge Functions? Why do they offer superior performance to traditional serverless functions? And how can you implement them in your projects? Let's explore all of this with practical examples.
What Are Edge Functions
Edge Functions are serverless functions that execute on geographically distributed servers, close to end users. Unlike traditional functions that run in specific regions, edge computing brings computation to the edge of the network.
Difference Between Traditional Serverless and Edge
Traditional Serverless:
- Runs in specific regions (us-east-1, eu-west-1, etc.)
- Latency depends on user distance to datacenter
- Cold start can be significant (hundreds of ms to seconds)
- Full Node.js environment generally available
Edge Functions:
- Execute in hundreds of global points of presence
- Consistently low latency regardless of location
- Minimal cold start (few milliseconds)
- Optimized runtime based on V8 or similar
💡 Context: For a user in Brazil accessing a function in us-east-1, network latency can be 100-200ms. With edge, this latency drops to 10-30ms.
Why Edge Functions Dominate in 2025
Edge function adoption has grown exponentially for several reasons:
1. Global Performance
With servers in more than 300 locations around the world, users everywhere see similarly low latency.
2. Cost-Benefit
Despite similar per-execution costs, lower latency means less wait time and better experience, justifying the investment.
3. Expanded Use Cases
Initially used only for simple redirects, they now support:
Modern use cases:
- Authentication and authorization
- Content personalization
- A/B testing
- Rate limiting
- Header manipulation
- Server-side rendering
- Complete APIs
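To make one of these use cases concrete, here is a minimal sketch of header manipulation at the edge, using only the Web-standard Request/Response APIs that all of these runtimes (and Node 18+) share. The function name and header choices are illustrative, not from any specific platform:

```typescript
// Hypothetical sketch: wrap an origin handler and add security headers
// at the edge before the response reaches the user.
async function withSecurityHeaders(
  request: Request,
  origin: (req: Request) => Promise<Response> | Response
): Promise<Response> {
  const upstream = await origin(request);
  // Re-wrap the response so its headers are mutable
  const response = new Response(upstream.body, upstream);
  response.headers.set('X-Frame-Options', 'DENY');
  response.headers.set('X-Content-Type-Options', 'nosniff');
  return response;
}
```

Because it only touches standard APIs, the same wrapper works unchanged on Cloudflare Workers, Vercel Edge, and Deno Deploy.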
4. Mature Ecosystem
Platforms like Cloudflare Workers, Vercel Edge Functions, Deno Deploy, and AWS CloudFront Functions have matured significantly.
Implementing Edge Functions in Practice
Let's see practical examples using different platforms.
Vercel Edge Functions
Perfect for Next.js projects:
// pages/api/geo.ts (App Router route handlers use `export const runtime = 'edge'` instead)
import { NextRequest, NextResponse } from 'next/server';
export const config = {
runtime: 'edge', // Runs this API route on the Edge Runtime
};
export default function handler(req: NextRequest) {
// Geolocation is populated automatically by Vercel on the edge runtime
// (in Next.js 15+, req.geo was removed; use geolocation() from @vercel/functions)
const country = req.geo?.country ?? 'Unknown';
const city = req.geo?.city ?? 'Unknown';
const region = req.geo?.region ?? 'Unknown';
return NextResponse.json({
message: `Hello visitor from ${city}, ${region}, ${country}!`,
latency: 'This response came from the edge closest to you',
timestamp: new Date().toISOString(),
});
}
Cloudflare Workers
One of the most popular edge platforms:
// worker.ts
export interface Env {
MY_KV: KVNamespace;
}
export default {
async fetch(
request: Request,
env: Env,
ctx: ExecutionContext
): Promise<Response> {
const url = new URL(request.url);
// Simple rate limiting using KV
const ip = request.headers.get('CF-Connecting-IP') ?? 'unknown';
const rateLimitKey = `ratelimit:${ip}`;
const currentCount = await env.MY_KV.get(rateLimitKey);
const count = currentCount ? parseInt(currentCount, 10) : 0;
if (count >= 100) { // allow at most 100 requests per window
return new Response('Rate limit exceeded', { status: 429 });
}
// Increment counter with 1 minute TTL
ctx.waitUntil(
env.MY_KV.put(rateLimitKey, String(count + 1), {
expirationTtl: 60
})
);
// Main logic
if (url.pathname === '/api/data') {
const data = await fetchDataFromOrigin();
return new Response(JSON.stringify(data), {
headers: { 'Content-Type': 'application/json' },
});
}
return new Response('Not Found', { status: 404 });
},
};
async function fetchDataFromOrigin() {
// Simulates data fetching
return { message: 'Data from edge', timestamp: Date.now() };
}
Deno Deploy
Native edge platform with Deno:
// server.ts
// Deno.serve is built into modern Deno; the std/http serve() import is deprecated
interface RequestInfo {
method: string;
url: string;
userAgent: string | null;
region: string | null;
}
Deno.serve(async (request: Request): Promise<Response> => {
const requestInfo: RequestInfo = {
method: request.method,
url: request.url,
userAgent: request.headers.get('user-agent'),
region: Deno.env.get('DENO_REGION') ?? 'unknown',
};
// Simple routing
const url = new URL(request.url);
switch (url.pathname) {
case '/':
return new Response('Hello from Deno Edge!', {
headers: { 'content-type': 'text/plain' },
});
case '/api/info':
return new Response(JSON.stringify(requestInfo), {
headers: { 'content-type': 'application/json' },
});
case '/api/time': {
const now = new Date();
return new Response(
JSON.stringify({
utc: now.toUTCString(),
iso: now.toISOString(),
region: requestInfo.region,
}),
{ headers: { 'content-type': 'application/json' } }
);
}
default:
return new Response('Not Found', { status: 404 });
}
});
Advanced Patterns with Edge Functions
Authentication Middleware
// middleware.ts (Next.js Edge Middleware)
import { NextRequest, NextResponse } from 'next/server';
import { jwtVerify } from 'jose';
const PUBLIC_PATHS = ['/', '/login', '/api/auth'];
const JWT_SECRET = new TextEncoder().encode(process.env.JWT_SECRET);
export async function middleware(request: NextRequest) {
const { pathname } = request.nextUrl;
// Allow public routes ('/' must match exactly, or every path would pass)
if (
PUBLIC_PATHS.some(path =>
path === '/' ? pathname === '/' : pathname.startsWith(path)
)
) {
return NextResponse.next();
}
// Verify JWT token
const token = request.cookies.get('auth-token')?.value;
if (!token) {
return NextResponse.redirect(new URL('/login', request.url));
}
try {
const { payload } = await jwtVerify(token, JWT_SECRET);
// Add user info to headers
const response = NextResponse.next();
response.headers.set('x-user-id', payload.sub as string);
response.headers.set('x-user-role', payload.role as string);
return response;
} catch (error) {
// Invalid token
const response = NextResponse.redirect(new URL('/login', request.url));
response.cookies.delete('auth-token');
return response;
}
}
export const config = {
matcher: [
'/((?!_next/static|_next/image|favicon.ico).*)',
],
};
A/B Testing on the Edge
// Cloudflare Worker for A/B testing
export default {
async fetch(request: Request): Promise<Response> {
const url = new URL(request.url);
// Check existing cookie or assign variant
const cookies = request.headers.get('Cookie') ?? '';
let variant = getCookie(cookies, 'ab-variant');
if (!variant) {
// Randomly assign variant (50/50)
variant = Math.random() < 0.5 ? 'A' : 'B';
}
// Modify response based on variant
const response = await fetch(request);
const html = await response.text();
let modifiedHtml = html;
if (variant === 'B') {
// Apply variant B changes
modifiedHtml = html.replace(
'<button class="cta">Buy Now</button>',
'<button class="cta-new">Add to Cart</button>'
);
}
const newResponse = new Response(modifiedHtml, response);
// Set cookie to maintain consistency
newResponse.headers.append(
'Set-Cookie',
`ab-variant=${variant}; Path=/; Max-Age=604800`
);
return newResponse;
},
};
function getCookie(cookies: string, name: string): string | null {
const match = cookies.match(new RegExp(`${name}=([^;]+)`));
return match ? match[1] : null;
}
Limitations and Considerations
What Edge Functions Don't Do Well
Important limitations:
- Limited execution time (wall-clock limits vary by platform, and CPU time can be capped at tens of milliseconds)
- Restricted memory (128MB is typical)
- No filesystem access
- Limited APIs (no full native Node.js environment)
- Long-lived connections are problematic
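One practical way to live with the execution-time limit is a deadline guard: race the real work against a timer and return a fallback if it doesn't finish in time. This is a generic sketch, assuming a budget you pick for your platform; the 20ms figure in the usage below is arbitrary:

```typescript
// Race `work` against a deadline; resolve with `fallback` if time runs out.
async function withDeadline<T>(
  work: Promise<T>,
  budgetMs: number,
  fallback: T
): Promise<T> {
  let timer: ReturnType<typeof setTimeout> | undefined;
  const timeout = new Promise<T>(resolve => {
    timer = setTimeout(() => resolve(fallback), budgetMs);
  });
  try {
    return await Promise.race([work, timeout]);
  } finally {
    clearTimeout(timer); // don't leave the timer running after the race settles
  }
}
```

Used around a slow upstream call, this lets the edge function degrade gracefully (stale data, a cached page) instead of being killed mid-request.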
When to Use Traditional Serverless
Prefer traditional serverless for:
- CPU-intensive processing
- Complex database operations
- Long-running tasks
- When you need complete Node.js ecosystem
- Large file processing
Hybrid Architecture
The best approach usually combines edge and serverless:
User
↓
Edge Function (authentication, cache, personalization)
↓
Serverless Function (business logic, database)
↓
Database / Services
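The flow above can be sketched as a thin edge layer that answers what it can locally and proxies everything else to the serverless origin. ORIGIN_URL and handleAtEdge are placeholders for this example, not a real endpoint or API:

```typescript
const ORIGIN_URL = 'https://api.example.com'; // placeholder origin, not a real endpoint

// Edge layer: fast paths are answered here; the rest goes to the origin.
async function handleAtEdge(request: Request): Promise<Response> {
  const url = new URL(request.url);
  // Fast path: no round trip to the origin at all
  if (url.pathname === '/healthz') {
    return new Response('ok');
  }
  // Proxy to the regional serverless origin, preserving method, headers, and body
  return fetch(new Request(ORIGIN_URL + url.pathname + url.search, request));
}
```

In a real deployment the fast paths would grow to include authentication checks, cached reads, and personalization, exactly as the diagram suggests.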
Platform Comparison
| Platform | Cold Start | Locations | Languages | Free Tier |
|---|---|---|---|---|
| Cloudflare Workers | ~0ms | 300+ | JS/TS, Rust, WASM | 100k req/day |
| Vercel Edge | ~1ms | 100+ | JS/TS | Per plan |
| Deno Deploy | ~1ms | 35+ | JS/TS, WASM | 1M req/month |
| AWS CloudFront | ~5ms | 400+ | JS | Per usage |
| Fastly Compute | ~5ms | 70+ | JS, Rust, WASM | Per usage |
Choosing the Right Platform
Cloudflare Workers:
- Best for standalone applications
- Most mature ecosystem (KV, D1, R2)
- Competitive pricing
Vercel Edge:
- Ideal for Next.js projects
- Perfect framework integration
- Powerful Edge Middleware
Deno Deploy:
- Great for those who prefer Deno
- Simple GitHub deployment
- TypeScript first-class
Best Practices
1. Aggressive Caching
// Implement cache for cacheable responses
export default {
// env and ctx come from the Workers runtime; ctx.waitUntil lets the
// cache write finish after the response has been returned
async fetch(request: Request, env: unknown, ctx: ExecutionContext): Promise<Response> {
const cacheKey = new Request(request.url, request);
const cache = caches.default;
// Check cache first
let response = await cache.match(cacheKey);
if (!response) {
response = await fetch(request);
// Clone to cache
response = new Response(response.body, response);
response.headers.set('Cache-Control', 'public, max-age=3600');
// Cache in background
ctx.waitUntil(cache.put(cacheKey, response.clone()));
}
return response;
},
};
2. Robust Error Handling
export default {
async fetch(request: Request): Promise<Response> {
try {
return await handleRequest(request);
} catch (error) {
console.error('Edge function error:', error);
// Return user-friendly error response
return new Response(
JSON.stringify({
error: 'Internal Error',
message: 'Something went wrong',
}),
{
status: 500,
headers: { 'Content-Type': 'application/json' },
}
);
}
},
};
3. Monitoring and Observability
// Add structured logging
interface LogEntry {
timestamp: string;
level: 'info' | 'warn' | 'error';
message: string;
metadata?: Record<string, unknown>;
}
function log(entry: LogEntry) {
console.log(JSON.stringify(entry));
}
export default {
async fetch(request: Request): Promise<Response> {
const startTime = Date.now();
const response = await handleRequest(request);
log({
timestamp: new Date().toISOString(),
level: 'info',
message: 'Request handled',
metadata: {
url: request.url,
method: request.method,
status: response.status,
duration: Date.now() - startTime,
},
});
return response;
},
};
Conclusion
Edge Functions represent a significant evolution in serverless architecture, offering consistently low latency for global users. In 2025, they've become an essential component for applications that prioritize performance.
The key is understanding when to use edge versus traditional serverless. For authentication, personalization, rate limiting, and fast responses, edge is ideal. For heavy processing and complex business logic, traditional serverless remains the right choice.
The combination of both offers the best of both worlds: fast responses at the edge with processing power in the backend when needed.
If you want to dive deeper into modern architectures, I recommend checking out another article: PWAs with JavaScript: The Web Apps Revolution where you'll discover how to build modern and performant web experiences.
Let's go! 🦅
🎯 Join the Developers Who Are Evolving
Thousands of developers already use our material to accelerate their studies and achieve better positions in the market.
Why invest in structured knowledge?
Learning in an organized way with practical examples makes all the difference in your journey as a developer.
Start now:
- 1x R$9.90 by card
- or R$9.90 upfront
"Excellent material for those who want to go deeper!" - João, Developer