
Edge Computing with JavaScript: Cloudflare Workers, Vercel Edge and Deno Deploy

Hey HaWkers, edge computing is transforming how we develop web applications. Instead of processing requests on a distant central server, code runs at globally distributed points of presence, drastically reducing latency.

And the best part: you can use JavaScript for this. Let's explore the main platforms and understand when to use each one.

What Is Edge Computing

Edge computing moves computation closer to the end user. Instead of a request traveling thousands of miles to a central data center, it's processed at the nearest point of presence.

Practical benefits:

  • Reduced latency (< 50ms globally)
  • Better user experience
  • Lower load on origin server
  • Possibility of region-specific personalization
  • Virtually non-existent cold starts

Ideal use cases:

  • Low-latency APIs
  • Authentication and authorization
  • Smart redirects
  • A/B testing
  • Content personalization
  • Rate limiting
  • Response transformation
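All three platforms below share essentially the same programming model: a handler that takes a standard Request and returns a Response. As a minimal, platform-neutral sketch (the route and header names here are made up for illustration; the code runs as-is in Node 18+ or Deno, which expose the same Web APIs):

```typescript
// A fetch-style handler built only on standard Web APIs.
// The same shape works, with thin wrappers, on Cloudflare Workers,
// Vercel Edge Functions, and Deno Deploy.
async function handleRequest(request: Request): Promise<Response> {
  const url = new URL(request.url);

  if (url.pathname === '/api/ping') {
    return new Response(JSON.stringify({ ok: true, path: url.pathname }), {
      headers: { 'Content-Type': 'application/json' },
    });
  }

  // Response transformation: tag every other response with a custom header
  return new Response('Not Found', {
    status: 404,
    headers: { 'X-Served-By': 'edge-example' }, // illustrative header name
  });
}
```

Because the handler depends only on Request/Response, you can unit-test it locally before deploying it to any of the platforms.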

Cloudflare Workers

Cloudflare Workers pioneered edge computing with JavaScript. It uses the V8 isolates runtime, which is different from traditional Node.js.

Characteristics

Strengths:

  • Largest edge network (300+ locations)
  • Virtually zero cold start
  • Competitive pricing
  • Mature ecosystem (KV, Durable Objects, R2)
  • WebSocket support

Limitations:

  • Runtime is not Node.js (some different APIs)
  • CPU time limit per request
  • Learning curve for advanced features
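A common way to live with the CPU-time limit is to keep the request path minimal and push non-critical work (logging, analytics, cache writes) into ctx.waitUntil, which lets it finish after the response has been sent. A sketch, where the recordMetrics sink is hypothetical and the ctx shape follows the Workers ExecutionContext:

```typescript
// Sketch: defer non-critical work with ctx.waitUntil so it doesn't
// count against the time spent producing the response.
interface ExecutionContextLike {
  waitUntil(promise: Promise<unknown>): void;
}

async function handle(
  request: Request,
  ctx: ExecutionContextLike
): Promise<Response> {
  const started = Date.now();

  // Respond right away; don't block on telemetry.
  const response = new Response('ok');

  // Deferred work keeps running after the response is returned.
  ctx.waitUntil(
    recordMetrics({ path: new URL(request.url).pathname, started })
  );

  return response;
}

async function recordMetrics(data: { path: string; started: number }) {
  // Hypothetical sink: in a real Worker this could be a fetch() to an
  // analytics endpoint or a KV write.
  console.log('metrics', data);
}
```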

Practical Example

// src/index.js - Cloudflare Worker
export default {
  async fetch(request, env, ctx) {
    const url = new URL(request.url);

    // Basic routing
    if (url.pathname === '/api/hello') {
      return new Response(JSON.stringify({
        message: 'Hello from the edge!',
        location: request.cf?.city || 'unknown',
        country: request.cf?.country || 'unknown',
      }), {
        headers: { 'Content-Type': 'application/json' },
      });
    }

    // Cache with KV Storage
    if (url.pathname.startsWith('/api/cached/')) {
      const key = url.pathname.replace('/api/cached/', '');
      const cached = await env.MY_KV.get(key);

      if (cached) {
        return new Response(cached, {
          headers: {
            'Content-Type': 'application/json',
            'X-Cache': 'HIT',
          },
        });
      }

      // Fetch from origin and cache
      const data = await fetchFromOrigin(key);
      await env.MY_KV.put(key, JSON.stringify(data), {
        expirationTtl: 3600, // 1 hour
      });

      return new Response(JSON.stringify(data), {
        headers: {
          'Content-Type': 'application/json',
          'X-Cache': 'MISS',
        },
      });
    }

    return new Response('Not Found', { status: 404 });
  },
};

async function fetchFromOrigin(key) {
  const response = await fetch(`https://api.origin.com/data/${key}`);
  return response.json();
}

Configuration

# wrangler.toml
name = "my-edge-api"
main = "src/index.js"
compatibility_date = "2025-11-01"

[[kv_namespaces]]
binding = "MY_KV"
id = "abc123"

[vars]
# Plain vars are fine for non-sensitive config; store real secrets
# with `wrangler secret put API_KEY` instead of committing them here.
API_KEY = "your-api-key"

Vercel Edge Functions

Vercel Edge Functions integrate seamlessly with Next.js and other frameworks. They run on the same V8-isolate model as Cloudflare Workers, via Vercel's Edge Runtime.

Characteristics

Strengths:

  • Native integration with Next.js
  • Automatic deploy via Git
  • Excellent monitoring UI
  • Powerful middleware
  • Easy to get started

Limitations:

  • Fewer locations than Cloudflare
  • Pricing can scale quickly
  • Fewer native storage features

Example with Next.js

// middleware.ts - Vercel Edge Middleware
import { NextResponse } from 'next/server';
import type { NextRequest } from 'next/server';

export function middleware(request: NextRequest) {
  const { pathname } = request.nextUrl;

  // Geolocation (request.geo is populated on Vercel's network)
  const country = request.geo?.country || 'US';
  const city = request.geo?.city || 'Unknown';

  // Region redirect
  if (pathname === '/' && country === 'BR') {
    return NextResponse.redirect(new URL('/pt-br', request.url));
  }

  // A/B Testing
  const bucket = request.cookies.get('ab-bucket')?.value;
  if (!bucket && pathname === '/landing') {
    const newBucket = Math.random() < 0.5 ? 'A' : 'B';
    const response = NextResponse.rewrite(
      new URL(`/landing/${newBucket}`, request.url)
    );
    response.cookies.set('ab-bucket', newBucket, {
      maxAge: 60 * 60 * 24 * 30, // 30 days
    });
    return response;
  }

  // Custom headers
  const response = NextResponse.next();
  response.headers.set('X-Geo-Country', country);
  response.headers.set('X-Geo-City', city);

  return response;
}

export const config = {
  matcher: ['/', '/landing/:path*', '/api/:path*'],
};

Edge API Route

// app/api/edge/route.ts
import { NextRequest } from 'next/server';

export const runtime = 'edge';

export async function GET(request: NextRequest) {
  const { searchParams } = new URL(request.url);
  const query = searchParams.get('q');

  // Lightweight processing at the edge
  const results = await searchIndex(query);

  return Response.json({
    results,
    processed_at: new Date().toISOString(),
    edge_location: request.geo?.city,
  });
}

async function searchIndex(query: string | null) {
  // Search logic optimized for edge
  return [];
}
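The searchIndex function above is left as a stub. One edge-friendly approach (an assumption here, not something Vercel prescribes) is to bundle a small prebuilt inverted index with the deployment and query it in memory, avoiding any origin round trip. The index contents below are made up for illustration:

```typescript
// Sketch: a tiny in-memory inverted index shipped with the bundle.
// Suitable only for small, mostly-static datasets; anything larger
// belongs in an edge KV store or behind the origin.
const INDEX: Record<string, string[]> = {
  edge: ['intro-to-edge', 'edge-vs-origin'],
  deno: ['deno-deploy-guide'],
};

function searchIndex(query: string | null): string[] {
  if (!query) return [];
  const terms = query.toLowerCase().split(/\s+/).filter(Boolean);
  const hits = new Set<string>();
  for (const term of terms) {
    for (const doc of INDEX[term] ?? []) hits.add(doc);
  }
  return [...hits];
}
```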

Deno Deploy

Deno Deploy uses the Deno runtime, offering native TypeScript and modern APIs.

Characteristics

Strengths:

  • Native TypeScript without build
  • Standard Web APIs (fetch, Request, Response)
  • Integrated Deno KV
  • Broadcast Channels for communication
  • Generous free tier

Limitations:

  • Smaller ecosystem than Node.js
  • Fewer enterprise integrations
  • Learning curve if coming from Node.js

Practical Example

// main.ts - Deno Deploy
// (Deno.serve is built into the runtime; the old std/http serve import
// is deprecated and no longer needed)

const kv = await Deno.openKv();

async function handler(request: Request): Promise<Response> {
  const url = new URL(request.url);

  // Rate Limiting API
  if (url.pathname.startsWith('/api/')) {
    const clientIP = request.headers.get('x-forwarded-for') || 'unknown';
    const rateLimit = await checkRateLimit(clientIP);

    if (!rateLimit.allowed) {
      return new Response(JSON.stringify({
        error: 'Rate limit exceeded',
        retry_after: rateLimit.retryAfter,
      }), {
        status: 429,
        headers: {
          'Content-Type': 'application/json',
          'Retry-After': String(rateLimit.retryAfter),
        },
      });
    }
  }

  // Main route
  if (url.pathname === '/api/data') {
    const data = await fetchData();
    return Response.json(data);
  }

  // Cache with Deno KV
  if (url.pathname.startsWith('/api/cached/')) {
    const key = url.pathname.replace('/api/cached/', '');
    const cached = await kv.get(['cache', key]);

    if (cached.value) {
      return Response.json(cached.value, {
        headers: { 'X-Cache': 'HIT' },
      });
    }

    const data = await fetchFromOrigin(key);
    await kv.set(['cache', key], data, {
      expireIn: 3600 * 1000, // 1 hour
    });

    return Response.json(data, {
      headers: { 'X-Cache': 'MISS' },
    });
  }

  return new Response('Not Found', { status: 404 });
}

async function checkRateLimit(clientIP: string) {
  const key = ['ratelimit', clientIP];
  const now = Date.now();
  const windowMs = 60000; // 1 minute
  const maxRequests = 100;

  const entry = await kv.get<{ count: number; resetAt: number }>(key);

  if (!entry.value || now > entry.value.resetAt) {
    await kv.set(key, { count: 1, resetAt: now + windowMs });
    return { allowed: true };
  }

  if (entry.value.count >= maxRequests) {
    return {
      allowed: false,
      retryAfter: Math.ceil((entry.value.resetAt - now) / 1000),
    };
  }

  await kv.set(key, {
    count: entry.value.count + 1,
    resetAt: entry.value.resetAt,
  });

  return { allowed: true };
}

async function fetchData() {
  return { message: 'Hello from Deno Deploy!' };
}

async function fetchFromOrigin(key: string) {
  return { key, data: 'origin data' };
}

Deno.serve(handler);

Platform Comparison

Performance and Network

Platform      Locations   Cold Start   CPU Limit
Cloudflare    300+        ~0ms         50ms
Vercel Edge   100+        ~25ms        25ms
Deno Deploy   35+         ~10ms        50ms

Storage and Resources

Platform      KV Storage    Durable Objects   Database
Cloudflare    ✅ KV          ✅                 D1 (SQLite)
Vercel        Via Upstash   ❌                 Via integration
Deno          ✅ Deno KV     ❌                 Deno KV

Pricing (Free Tier)

Platform      Requests/day   CPU Time    Bandwidth
Cloudflare    100k           10ms/req    Unlimited
Vercel        100k           -           100GB
Deno          1M/month       -           100GB

When to Use Edge Computing

Use Edge When:

  • You need low global latency
  • You're doing lightweight request transformation
  • You're implementing authentication/authorization
  • You're serving region-personalized content
  • You need distributed rate limiting
  • You're doing smart redirects

Avoid Edge When:

  • You need heavy CPU processing
  • You require a persistent connection to a traditional database
  • You have incompatible Node.js dependencies
  • You need batch processing
  • You have compliance requirements that mandate a specific region

Architecture Patterns

Edge + Origin

// Pattern: Edge as gateway, Origin for heavy logic.
// (validateAuth, getCacheKey, and cache are placeholders for your
// platform's auth helper and cache binding, e.g. Workers KV or the
// Cache API)
async function handler(request: Request) {
  // Quick validation at edge
  const authResult = await validateAuth(request);
  if (!authResult.valid) {
    return new Response('Unauthorized', { status: 401 });
  }

  // Cache check at edge
  const cacheKey = getCacheKey(request);
  const cached = await cache.get(cacheKey);
  if (cached) {
    return new Response(cached);
  }

  // Forward to origin if needed
  const originResponse = await fetch('https://origin.example.com/api', {
    headers: {
      'Authorization': `Bearer ${authResult.token}`,
      'X-User-Id': authResult.userId,
    },
  });

  // Cache response at edge
  const data = await originResponse.text();
  await cache.put(cacheKey, data, { ttl: 300 });

  return new Response(data);
}
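The getCacheKey helper in the pattern above is left undefined; one reasonable implementation (a sketch, not the only option) derives a deterministic key from the method, path, and sorted query parameters, so equivalent URLs share a single cache entry:

```typescript
// Sketch: deterministic cache key from method, path, and sorted query
// params, so /api?a=1&b=2 and /api?b=2&a=1 map to the same entry.
function getCacheKey(request: Request): string {
  const url = new URL(request.url);
  const params = [...url.searchParams.entries()]
    .sort(([a], [b]) => a.localeCompare(b))
    .map(([k, v]) => `${k}=${v}`)
    .join('&');
  return `${request.method}:${url.pathname}?${params}`;
}
```

In production you would typically also account for headers the response varies on (e.g. Accept-Language), but the idea is the same.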

Conclusion

Edge computing with JavaScript offers a powerful way to improve web application performance. The platform choice depends on your context:

Cloudflare Workers is ideal for those who need the largest distribution network and advanced features like Durable Objects.

Vercel Edge Functions is perfect for Next.js projects that want seamless integration and good developer experience.

Deno Deploy is excellent for those who want native TypeScript and are comfortable with the Deno ecosystem.

The good news is that all platforms use standard Web APIs, so much of the code is portable between them.

If you're interested in exploring more about modern architectures with JavaScript, I recommend checking out the article Node.js vs Deno vs Bun: Which JavaScript Runtime to Choose in 2025 to better understand the runtimes that power these edge platforms.

Let's go! 🦅
