
Edge Computing and Serverless: The Future of JavaScript Applications in 2025

Hey HaWkers, imagine running JavaScript code just milliseconds away from your users, across 300+ locations around the world, paying only for what you actually use. This isn't the future—it's the reality of Edge Computing and Serverless in 2025.

Have you ever wondered why some applications respond instantly while others take seconds to process simple requests?

What Is Edge Computing and Why It Changed Everything

Edge Computing is the practice of executing code as close as possible to the end user, instead of in distant centralized servers. In 2025, platforms like Cloudflare Workers, Vercel Edge Functions, and Deno Deploy distribute your JavaScript code across hundreds of global data centers.

The fundamental difference:

Traditional model (centralized server):

  • User in Brazil → request goes to server in USA (200ms latency)
  • Server processes → response comes back to Brazil (200ms latency)
  • Total: 400ms+ latency from network alone

Edge model:

  • User in Brazil → request goes to edge node in São Paulo (5ms latency)
  • Edge processes → response comes back immediately (5ms latency)
  • Total: 10ms latency

Result: 97% latency reduction for global users.
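The arithmetic above can be sketched as a tiny model. The 200ms and 5ms figures are the illustrative assumptions from the comparison, not measurements:

```javascript
// Toy latency model: two network legs plus optional processing time.
// The inputs are the illustrative figures from the comparison above.
function totalLatencyMs(oneWayNetworkMs, processingMs = 0) {
  return 2 * oneWayNetworkMs + processingMs;
}

const centralized = totalLatencyMs(200); // Brazil -> USA origin and back
const edge = totalLatencyMs(5);          // Brazil -> São Paulo edge node
const reductionPct = Math.floor((1 - edge / centralized) * 100);

console.log({ centralized, edge, reductionPct }); // → { centralized: 400, edge: 10, reductionPct: 97 }
```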

Real Benefits of Edge in 2025:

  • Ultra-low latency: 5-20ms vs 200-500ms (traditional server)
  • Automatic scalability: From 0 to 10 million requests with no configuration
  • Reduced costs: 60-70% cheaper than traditional VMs
  • Native geo-location: Code runs near each user automatically

Cloudflare Workers: JavaScript on the Edge in Practice

Cloudflare Workers lets you run JavaScript/TypeScript across 300+ cities globally. Let's see practical examples:

Example 1: REST API on the Edge

// worker.js - API running on global edge
export default {
  async fetch(request, env, ctx) {
    const url = new URL(request.url);

    // Simple routing
    if (url.pathname === '/api/user') {
      return handleGetUser(request, env);
    }

    if (url.pathname === '/api/products') {
      return handleGetProducts(request, env);
    }

    return new Response('Not Found', { status: 404 });
  },
};

async function handleGetUser(request, env) {
  const userId = new URL(request.url).searchParams.get('id');

  if (!userId) {
    return new Response('Missing id parameter', { status: 400 });
  }

  // Access D1 (SQLite on Cloudflare edge)
  const result = await env.DB.prepare(
    'SELECT * FROM users WHERE id = ?'
  ).bind(userId).first();

  if (!result) {
    return new Response('User not found', { status: 404 });
  }

  return new Response(JSON.stringify(result), {
    headers: { 'Content-Type': 'application/json' },
  });
}

async function handleGetProducts(request, env) {
  // Cache on the edge (KV storage)
  const cached = await env.PRODUCTS_CACHE.get('all-products');

  if (cached) {
    return new Response(cached, {
      headers: {
        'Content-Type': 'application/json',
        'X-Cache': 'HIT',
      },
    });
  }

  // Fetch from data source
  const products = await fetchProductsFromOrigin(env);
  const json = JSON.stringify(products);

  // Cache for 1 hour
  await env.PRODUCTS_CACHE.put('all-products', json, {
    expirationTtl: 3600,
  });

  return new Response(json, {
    headers: {
      'Content-Type': 'application/json',
      'X-Cache': 'MISS',
    },
  });
}
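A detail the example glosses over: `env.DB` and `env.PRODUCTS_CACHE` are bindings declared in the Worker's `wrangler.toml`. A minimal sketch (all names and IDs below are placeholders, not real values):

```toml
name = "edge-api"
main = "worker.js"
compatibility_date = "2025-01-01"

# D1 database binding -> available as env.DB
[[d1_databases]]
binding = "DB"
database_name = "app-db"
database_id = "<your-d1-database-id>"

# KV namespace binding -> available as env.PRODUCTS_CACHE
[[kv_namespaces]]
binding = "PRODUCTS_CACHE"
id = "<your-kv-namespace-id>"
```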

Advantages of this code:

  • ✅ Runs across 300+ locations automatically
  • ✅ 5-20ms latency globally
  • ✅ Infinite scalability (0 to millions of requests)
  • ✅ Cost: $5/month for 10 million requests (vs $200+ on VMs)

Example 2: Authentication and Middleware on the Edge

// auth-middleware.js - Route protection on the edge
export default {
  async fetch(request, env, ctx) {
    const url = new URL(request.url);

    // Public routes don't need auth ('/' is matched exactly, because
    // startsWith('/') would otherwise make every route public)
    const publicPaths = ['/login', '/register', '/api/public'];
    if (
      url.pathname === '/' ||
      publicPaths.some(path => url.pathname.startsWith(path))
    ) {
      return fetch(request); // Pass through
    }

    // Verify JWT
    const authHeader = request.headers.get('Authorization');
    if (!authHeader || !authHeader.startsWith('Bearer ')) {
      return new Response('Unauthorized', { status: 401 });
    }

    const token = authHeader.substring(7);

    try {
      // Validate JWT (jose library for edge)
      const { jwtVerify } = await import('jose');
      const secret = new TextEncoder().encode(env.JWT_SECRET);

      const { payload } = await jwtVerify(token, secret);

      // Add user info to headers for origin
      const newRequest = new Request(request);
      newRequest.headers.set('X-User-Id', payload.sub);
      newRequest.headers.set('X-User-Role', payload.role);

      return fetch(newRequest);
    } catch (error) {
      return new Response('Invalid token', { status: 401 });
    }
  },
};

Result: Authentication processed in <10ms on the edge, without touching origin server.
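To see what the middleware actually forwards, here is a sketch that decodes a JWT payload. This is for illustration only: decoding is NOT verification, and real auth decisions must go through `jwtVerify` as in the worker above.

```javascript
// Illustration only: decode (not verify!) a JWT payload to inspect the
// claims the middleware forwards as X-User-Id / X-User-Role.
function decodeJwtPayload(token) {
  const [, payloadPart] = token.split('.'); // header.payload.signature
  const json = Buffer.from(payloadPart, 'base64url').toString('utf8');
  return JSON.parse(json);
}
```

Note that `Buffer` is Node-specific; on an edge runtime you would use `atob` and `TextDecoder` instead.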

[Image: edge computing diagram]

Vercel Edge Functions: Edge + Next.js

Vercel Edge Functions integrates seamlessly with Next.js, enabling SSR on the edge:

// middleware.js - Next.js middleware on the edge
import { NextResponse } from 'next/server';

export function middleware(request) {
  const url = request.nextUrl;

  // A/B Testing on the edge
  const bucket = Math.random() < 0.5 ? 'a' : 'b';
  url.searchParams.set('variant', bucket);

  const response = NextResponse.rewrite(url);

  // Set cookie to maintain variant
  response.cookies.set('ab-test-variant', bucket, {
    maxAge: 60 * 60 * 24 * 30, // 30 days
  });

  return response;
}

export const config = {
  matcher: '/product/:path*',
};

// app/api/personalize/route.js - Edge API Route
export const runtime = 'edge'; // Mark to run on the edge!

export async function GET(request) {
  // request.geo is populated by Vercel in production; fall back to an
  // empty object so local development doesn't crash
  const geo = request.geo ?? {};

  // Personalize content by location
  const content = {
    country: geo.country,
    city: geo.city,
    currency: getCurrencyForCountry(geo.country),
    language: getLanguageForCountry(geo.country),
    shippingEstimate: getShippingEstimate(geo.country),
  };

  return Response.json(content);
}

function getCurrencyForCountry(country) {
  const currencies = {
    BR: 'BRL',
    US: 'USD',
    GB: 'GBP',
    EU: 'EUR',
  };
  return currencies[country] || 'USD';
}

function getShippingEstimate(country) {
  const estimates = {
    BR: '3-5 business days',
    US: '2-3 business days',
    GB: '1-2 business days',
  };
  return estimates[country] || '5-7 business days';
}

Benefits:

  • ✅ Instant personalization by geolocation
  • ✅ A/B testing without affecting performance
  • ✅ Zero-config integration with Next.js
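One caveat on the A/B middleware: Math.random() assigns a fresh bucket on every request until the cookie round-trips. A common alternative, sketched here assuming you already have a stable anonymous ID to hash, is deterministic bucketing:

```javascript
// Sketch: deterministic A/B bucketing. Hashing a stable identifier
// (e.g. an anonymous ID cookie) yields the same variant on every
// request, unlike Math.random(), which can flip before the cookie
// is set.
function abBucket(id, variants = ['a', 'b']) {
  let hash = 0;
  for (const ch of id) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0; // simple 32-bit rolling hash
  }
  return variants[hash % variants.length];
}
```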

Serverless vs Serverful: When to Use Each

Serverless doesn't mean "without servers"—it means you don't manage servers. In 2025, choosing between serverless and traditional servers depends on your use case:

Use Serverless/Edge when:

Unpredictable and variable traffic

  • E-commerce with seasonal spikes (Black Friday)
  • Viral apps that can scale 1000x overnight
  • APIs with sporadic usage

Latency is critical

  • Real-time applications
  • Online games, chat apps
  • APIs serving global users

Cost optimized for intermittent workloads

  • Cron jobs running a few times a day
  • Webhooks and event-driven architectures
  • Asynchronous event processing

Use Traditional Servers when:

Constant and predictable workloads

  • Corporate applications with stable traffic
  • Continuous batch processing

Long-running processes

  • Machine learning training
  • Video encoding that takes hours
  • Long-duration WebSocket connections

Full environment control

  • Specific kernel configurations
  • Complex native libraries

Real Cost Comparison:

Scenario: API with 10 million requests/month, 50ms average execution

Platform             Type         Cost/month   Average Latency
AWS Lambda           Serverless   $25          80ms
Cloudflare Workers   Edge         $5           15ms
Vercel Edge          Edge         $20          20ms
AWS EC2 (t3.medium)  VM           $35          120ms
DigitalOcean (4GB)   VM           $24          150ms

Conclusion: Edge serverless wins in both cost AND performance for most cases.

Deno Deploy: TypeScript-First Alternative to Node.js

Deno Deploy is an edge serverless platform optimized for TypeScript and Web Standards:

// main.ts - Deno Deploy edge function
import { serve } from 'https://deno.land/std@0.200.0/http/server.ts';

// TypeScript interface
interface Product {
  id: number;
  name: string;
  price: number;
  stock: number;
}

// Connect to PostgreSQL (Supabase, Neon, etc)
async function getProducts(): Promise<Product[]> {
  const response = await fetch(
    `${Deno.env.get('SUPABASE_URL')}/rest/v1/products`,
    {
      headers: {
        apikey: Deno.env.get('SUPABASE_KEY')!,
        Authorization: `Bearer ${Deno.env.get('SUPABASE_KEY')}`,
      },
    }
  );

  return response.json();
}

serve(async (req: Request) => {
  const url = new URL(req.url);

  // Automatic CORS
  if (req.method === 'OPTIONS') {
    return new Response(null, {
      headers: {
        'Access-Control-Allow-Origin': '*',
        'Access-Control-Allow-Methods': 'GET, POST, OPTIONS',
      },
    });
  }

  // Routing
  switch (url.pathname) {
    case '/api/products': {
      const products = await getProducts();

      return new Response(JSON.stringify(products), {
        headers: {
          'Content-Type': 'application/json',
          'Access-Control-Allow-Origin': '*',
        },
      });
    }

    case '/api/search': {
      const query = url.searchParams.get('q');
      if (!query) {
        return new Response('Missing query parameter', { status: 400 });
      }

      // Full-text search on the edge
      const products = await getProducts();
      const filtered = products.filter(p =>
        p.name.toLowerCase().includes(query.toLowerCase())
      );

      return new Response(JSON.stringify(filtered), {
        headers: { 'Content-Type': 'application/json' },
      });
    }

    default:
      return new Response('Not found', { status: 404 });
  }
});

Advantages of Deno Deploy:

  • ✅ Native TypeScript (no transpilation)
  • ✅ Web Standards (fetch, Response, Request)
  • ✅ Direct URL imports (no package.json)
  • ✅ Instant deployment via CLI or GitHub

Architecture Patterns: Edge + Origin

The ideal architecture in 2025 combines edge and origin servers:

Pattern 1: Edge Cache + Origin API

// edge-cache.js - Cloudflare Worker
export default {
  async fetch(request, env, ctx) {
    const cache = caches.default;
    const url = new URL(request.url);

    // Try cache first
    let response = await cache.match(request);

    if (response) {
      return response;
    }

    // Cache miss - fetch from origin
    response = await fetch(`https://origin-api.example.com${url.pathname}`);

    // Cache for 5 minutes (a response body stream can only be read once,
    // so return the rebuilt response, not the original)
    const cachedResponse = new Response(response.body, response);
    cachedResponse.headers.set('Cache-Control', 'public, max-age=300');

    ctx.waitUntil(cache.put(request, cachedResponse.clone()));

    return cachedResponse;
  },
};
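The cache-aside flow here (and the KV `expirationTtl` in Example 1) boils down to "serve if fresh, rebuild if expired". A minimal in-memory sketch of that logic, for illustration outside of Workers:

```javascript
// Sketch: cache-aside expiry logic as a plain in-memory TTL cache.
// The optional `now` parameter makes expiry deterministic to test;
// in real code you'd just use Date.now().
class TtlCache {
  constructor() {
    this.store = new Map();
  }

  get(key, now = Date.now()) {
    const entry = this.store.get(key);
    if (!entry || entry.expiresAt <= now) return undefined; // miss or stale
    return entry.value;
  }

  put(key, value, ttlMs, now = Date.now()) {
    this.store.set(key, { value, expiresAt: now + ttlMs });
  }
}
```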

Pattern 2: Edge Authentication + Origin Logic

// edge-auth.js - Authentication on edge, logic on origin
export default {
  async fetch(request, env) {
    // Auth on edge (fast)
    const user = await authenticateUser(request, env);

    if (!user) {
      return new Response('Unauthorized', { status: 401 });
    }

    // Pass authenticated request to origin
    const originRequest = new Request(request);
    originRequest.headers.set('X-User-Id', user.id);
    originRequest.headers.set('X-User-Tier', user.tier);

    // Origin processes complex logic
    return fetch(`https://origin.example.com${new URL(request.url).pathname}`, originRequest);
  },
};

Pattern 3: Edge Transformations

// edge-transform.js - Transform responses on the edge
export default {
  async fetch(request, env) {
    const response = await fetch(request);

    // Minify HTML on the edge (save bandwidth)
    if (response.headers.get('Content-Type')?.includes('text/html')) {
      let html = await response.text();

      // Remove unnecessary whitespace (naive: this also collapses
      // whitespace inside <pre> and <textarea>)
      html = html.replace(/\s+/g, ' ').replace(/>\s+</g, '><');

      return new Response(html, {
        status: response.status,
        headers: response.headers,
      });
    }

    return response;
  },
};
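Regex minification like this is deliberately simplistic. A slightly safer sketch (illustrative only; production code should use a real HTML minifier) shields `<pre>` and `<textarea>` blocks before collapsing whitespace:

```javascript
// Sketch: whitespace collapsing that preserves <pre> and <textarea>
// content by swapping those blocks out before minifying and restoring
// them afterwards.
function minifyHtml(html) {
  const protectedBlocks = [];

  // Shield blocks whose whitespace is meaningful
  const shielded = html.replace(/<(pre|textarea)\b[\s\S]*?<\/\1>/gi, match => {
    protectedBlocks.push(match);
    return `\u0000${protectedBlocks.length - 1}\u0000`;
  });

  const collapsed = shielded.replace(/\s+/g, ' ').replace(/>\s+</g, '><');

  // Restore the shielded blocks
  return collapsed.replace(/\u0000(\d+)\u0000/g, (_, i) => protectedBlocks[+i]);
}
```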

Limitations and Challenges of Edge Computing

Despite its benefits, edge computing has important limitations:

1. Resource Constraints

Cloudflare Workers:

  • CPU time: 10ms per request (free plan), up to 30s (paid)
  • Memory: 128MB
  • Request size: 100MB

Vercel Edge Functions:

  • Execution time: 30s max
  • Memory: 128MB
  • Response size: 4MB

Implication: Heavy operations (machine learning, video processing) don't fit on the edge.

2. Cold Starts (smaller than traditional serverless)

// edge-cold-start.js
let globalCounter = 0; // Persists across warm invocations in the same isolate

export default {
  async fetch(request) {
    globalCounter++;

    // globalCounter may reset on cold starts
    return new Response(`Request #${globalCounter}`);
  },
};

Edge functions have cold starts of 0-10ms (vs 100-1000ms on Lambda), but global state is not guaranteed.
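You can turn this behavior to your advantage by memoizing expensive one-time setup at module level, so warm invocations reuse it and a cold start simply rebuilds it. A sketch, where the build function is a hypothetical stand-in for any costly initialization (parsing config, constructing a client, etc.):

```javascript
// Sketch: amortize expensive init across warm invocations.
let resourcePromise; // survives warm invocations in the same isolate only

function getResource(build) {
  // Caching the promise (not the value) also dedupes concurrent callers
  resourcePromise ??= build();
  return resourcePromise;
}
```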

3. API Compatibility

Edge runtimes don't support all Node.js APIs:

// ❌ Does NOT work on the edge
const fs = require('fs'); // No filesystem
const child_process = require('child_process'); // No child processes

// ✅ Works on the edge
fetch(); // Web Standard
crypto.subtle; // Web Crypto API
Response, Request, Headers; // Web Standards

Solution: Use libraries compatible with Web Standards.

Conclusion: Edge + Serverless Is the New Standard

In 2025, the combination of edge computing and serverless has become the standard for modern JavaScript applications. The advantages in latency, cost, and scalability are compelling for most use cases.

Practical recommendations:

For new projects:

  1. Start with edge serverless (Cloudflare Workers or Vercel Edge)
  2. Use origin servers only for heavy logic
  3. Implement aggressive caching on the edge

For existing projects:

  1. Migrate authentication and static routes to edge
  2. Use edge as a cache layer in front of origin
  3. Gradually migrate API endpoints to edge

Recommended 2025 stack:

  • Edge Runtime: Cloudflare Workers or Vercel Edge Functions
  • Framework: Next.js (App Router with edge support)
  • Database: Cloudflare D1, Turso, or PlanetScale (edge-compatible)
  • Storage: Cloudflare R2 or Vercel Blob

If you feel inspired by edge computing, I recommend checking out another article: React 19 and Server Components where you'll discover how to combine edge with React Server Components.

Let's go! 🦅

📚 Want to Deepen Your JavaScript Knowledge?

This article covered edge computing and serverless, but there's much more to explore in the world of modern development.

Developers who invest in solid and structured knowledge tend to have more opportunities in the market.

Complete Study Material

If you want to master JavaScript from beginner to advanced, I've prepared a complete guide:

Investment options:

  • 3x of $11.52 on credit card
  • or $32.63 cash

👉 Get to Know the JavaScript Guide

💡 Material updated with best market practices
