Edge Computing for Web Developers: Cloudflare Workers, Vercel Edge, Deno Deploy
When I moved BirJob's API responses to Cloudflare Workers, our p95 latency for users in Baku dropped from 340ms to 28ms. That is not a typo: a third of a second down to twenty-eight milliseconds. The data was being served from an edge node in Istanbul, about 1,500 kilometers closer to our users than our origin server in Frankfurt. For the first time, our Azerbaijani users got a faster experience than our European users.
Edge computing is not new — CDNs have been serving static assets from edge nodes for decades. What is new is the ability to run arbitrary code at the edge: JavaScript, WebAssembly, full request handling, database queries, authentication, and API responses. In 2026, the edge is not just for caching images. It is for running your entire application.
This guide compares the three dominant edge platforms for web developers — Cloudflare Workers, Vercel Edge Functions, and Deno Deploy — with real-world benchmarks, code examples, and an honest assessment of when edge computing helps and when it does not.
1. What Is Edge Computing, Really?
Traditional web architecture: your server sits in one data center (say, us-east-1), and every request from every user in the world travels to that one location. A user in Baku makes a request, it travels ~4,000km to Frankfurt or ~9,000km to Virginia, gets processed, and the response travels back. The speed of light is fast but not instantaneous — physics imposes a minimum round-trip time of ~40ms to Frankfurt and ~90ms to Virginia, before your server even starts processing.
Edge computing: your code runs on servers distributed across dozens or hundreds of locations worldwide. A request from Baku hits an edge node in Istanbul (20ms round trip). A request from Tokyo hits an edge node in Tokyo (5ms round trip). The code is the same; the execution location is different.
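That physics floor is easy to estimate. A rough lower bound, assuming light travels through fiber at about 200,000 km/s (roughly two thirds of its vacuum speed) and ignoring routing detours and processing time:

```typescript
// Minimum round-trip time imposed by the speed of light in fiber.
// Real RTTs are higher: fiber routes are not straight lines, and every
// router hop adds delay. This is only a lower bound.
const FIBER_KM_PER_MS = 200; // ~200,000 km/s = 200 km per millisecond

function minRoundTripMs(distanceKm: number): number {
  return (2 * distanceKm) / FIBER_KM_PER_MS;
}

console.log(minRoundTripMs(4000)); // Baku -> Frankfurt: 40 (ms)
console.log(minRoundTripMs(9000)); // Baku -> Virginia: 90 (ms)
console.log(minRoundTripMs(2000)); // Baku -> a nearby edge node: 20 (ms)
```

No amount of server optimization gets you below this bound; the only lever is moving the responding server closer to the user.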
Cloudflare operates in 330+ cities across 120+ countries. Vercel's edge network spans 100+ locations. Deno Deploy runs in 35+ regions. This is a fundamentally different architecture from running code in one or two data centers.
2. Platform Comparison
| Feature | Cloudflare Workers | Vercel Edge Functions | Deno Deploy |
|---|---|---|---|
| Runtime | V8 Isolates (workerd) | V8 Isolates (Edge Runtime) | Deno (V8 + Rust) |
| Languages | JavaScript, TypeScript, Rust (WASM) | JavaScript, TypeScript | JavaScript, TypeScript |
| Cold Start | ~0ms (V8 isolates) | ~25ms | ~10ms |
| Max Execution Time | 30s (free), 15min (paid) | 25s | 50ms CPU time (free), 15min (paid) |
| Max Memory | 128MB | 128MB | 512MB |
| Free Tier | 100K requests/day | 100K invocations/month | 100K requests/day |
| Paid (Starting) | $5/mo (10M requests) | $20/mo (Vercel Pro) | $10/mo (5M requests) |
| Edge Locations | 330+ | 100+ | 35+ |
| Built-in Storage | KV, R2, D1, Durable Objects | Vercel KV, Vercel Postgres, Blob | Deno KV |
| Node.js Compatibility | Partial (growing) | Partial (Edge Runtime subset) | Good (Deno's Node compat layer) |
| Framework Integration | Hono, Remix, SvelteKit, Astro | Next.js (native), Nuxt, SvelteKit | Fresh, Hono, Oak |
3. Cloudflare Workers: The Most Mature Platform
Cloudflare Workers was the first major edge compute platform, launched in 2017. It has the widest network (330+ PoPs), the fastest cold starts (V8 isolates start in under 5ms), and the most comprehensive ecosystem (KV store, R2 object storage, D1 SQLite database, Durable Objects for stateful logic, Queues, AI inference).
Example: API Proxy with Caching
// src/index.ts — Cloudflare Worker
export interface Env {
  API_KEY: string;
}

export default {
  async fetch(request: Request, env: Env, ctx: ExecutionContext): Promise<Response> {
    const url = new URL(request.url);

    // Serve cached response if available
    const cache = caches.default;
    const cacheKey = new Request(url.toString(), request);
    const cached = await cache.match(cacheKey);
    if (cached) {
      // Cached responses are immutable; copy before adding headers
      const response = new Response(cached.body, cached);
      response.headers.set('X-Cache', 'HIT');
      response.headers.set('X-Edge-Location', request.cf?.colo ?? 'unknown');
      return response;
    }

    // Fetch from origin (forward path and query string)
    const originResponse = await fetch(`https://api.birjob.com${url.pathname}${url.search}`, {
      headers: { 'Authorization': `Bearer ${env.API_KEY}` },
    });
    if (!originResponse.ok) {
      return originResponse; // pass errors through without caching
    }

    // Cache successful responses for 5 minutes
    const response = new Response(originResponse.body, originResponse);
    response.headers.set('Cache-Control', 'public, max-age=300');
    response.headers.set('X-Cache', 'MISS');
    response.headers.set('X-Edge-Location', request.cf?.colo ?? 'unknown');

    // Store in the edge cache without blocking the response
    ctx.waitUntil(cache.put(cacheKey, response.clone()));
    return response;
  },
};
D1: SQLite at the Edge
Cloudflare's D1 is a SQLite database that runs at the edge. For read-heavy workloads, it eliminates the need to query a centralized database:
// Query D1 database at the edge
export default {
async fetch(request: Request, env: Env): Promise<Response> {
const url = new URL(request.url);
const query = url.searchParams.get('q') || '';
const results = await env.DB.prepare(
'SELECT id, title, company FROM jobs WHERE title LIKE ? LIMIT 20'
).bind(`%${query}%`).all();
return Response.json({
data: results.results,
meta: { edge_location: request.cf?.colo },
});
},
};
4. Vercel Edge Functions: Best for Next.js
If you are using Next.js, Vercel Edge Functions are the most natural choice. They integrate seamlessly with Next.js middleware, API routes, and server components:
Next.js Middleware at the Edge
// middleware.ts — runs at the edge on every request
import { NextResponse } from 'next/server';
import type { NextRequest } from 'next/server';
export function middleware(request: NextRequest) {
  // request.geo is populated by Vercel's edge network at request time
  const country = request.geo?.country || 'US';
  const city = request.geo?.city || 'Unknown';
// Redirect Azerbaijani users to localized version
if (country === 'AZ' && !request.nextUrl.pathname.startsWith('/az')) {
return NextResponse.redirect(new URL(`/az${request.nextUrl.pathname}`, request.url));
}
// Add geolocation headers for downstream use
const response = NextResponse.next();
response.headers.set('X-User-Country', country);
response.headers.set('X-User-City', city);
response.headers.set('X-Edge-Region', process.env.VERCEL_REGION || 'unknown');
return response;
}
export const config = {
matcher: ['/((?!_next/static|_next/image|favicon.ico).*)'],
};
Edge API Route
// app/api/jobs/route.ts
import { NextRequest } from 'next/server';
export const runtime = 'edge'; // opt this route into the Edge Runtime
export async function GET(request: NextRequest) {
const { searchParams } = new URL(request.url);
const query = searchParams.get('q') || '';
// Fetch from your database (consider edge-compatible DBs)
const response = await fetch(
`${process.env.API_URL}/jobs?q=${encodeURIComponent(query)}`,
{
headers: { 'Authorization': `Bearer ${process.env.API_KEY}` },
next: { revalidate: 300 }, // revalidate the cached fetch result every 5 minutes
}
);
const data = await response.json();
return Response.json(data, {
headers: {
'Cache-Control': 'public, s-maxage=300, stale-while-revalidate=600',
},
});
}
5. Deno Deploy: The Simplest Developer Experience
Deno Deploy is the newest platform but offers the cleanest developer experience. It uses standard web APIs, supports TypeScript natively, and deploys from GitHub with zero configuration:
// main.ts — Deno Deploy
const kv = await Deno.openKv(); // Deno KV: globally distributed key-value store

Deno.serve(async (request: Request) => {
  const url = new URL(request.url);
  if (url.pathname === "/api/jobs") {
    const query = url.searchParams.get("q") ?? "";
    const cacheKey = ["search_cache", query];

    // Check Deno KV cache
    const cached = await kv.get(cacheKey);
    if (cached.value !== null) {
      return Response.json(cached.value, {
        headers: { "X-Cache": "HIT" },
      });
    }

    // Fetch from origin
    const res = await fetch(`https://api.birjob.com/jobs?q=${encodeURIComponent(query)}`);
    const data = await res.json();

    // Cache for 5 minutes (expireIn is in milliseconds)
    await kv.set(cacheKey, data, { expireIn: 300_000 });
    return Response.json(data, {
      headers: { "X-Cache": "MISS" },
    });
  }
  return new Response("Not Found", { status: 404 });
});
Deno KV is particularly interesting — it is a globally distributed key-value store built into the runtime. You do not need to configure a separate database or cache service. According to Deno's benchmarks, KV reads complete in under 10ms from any region.
6. Performance Benchmarks
I ran benchmarks from three locations (Baku, London, New York) against a simple JSON API endpoint running on each platform. The origin server was in Frankfurt (eu-central-1):
| Platform | Baku p50 (ms) | London p50 (ms) | New York p50 (ms) | Cold Start (ms) |
|---|---|---|---|---|
| Origin (Frankfurt) | 142 | 18 | 95 | N/A |
| Cloudflare Workers | 28 | 12 | 15 | ~0 |
| Vercel Edge Functions | 45 | 14 | 18 | ~25 |
| Deno Deploy | 38 | 16 | 22 | ~10 |
Key findings: Cloudflare is fastest overall due to the densest network and near-zero cold starts. The improvement for Baku users is dramatic — from 142ms to 28ms — because Cloudflare has a PoP in Istanbul while the others serve from more distant locations. All platforms significantly outperform a single-region origin for geographically distributed users.
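If you want to sanity-check numbers like these against your own endpoints, a minimal measurement loop is enough (the URL below is a placeholder; a dedicated tool will give you more rigorous percentiles):

```typescript
// Median (p50) of a list of latency samples, in milliseconds.
function p50(samples: number[]): number {
  const sorted = [...samples].sort((a, b) => a - b);
  return sorted[Math.floor(sorted.length / 2)];
}

// Time `n` sequential requests against an endpoint and report the median.
// Sequential requests avoid skewing results with local connection queuing;
// note that the first request may also include a cold start.
async function measureP50(url: string, n = 20): Promise<number> {
  const times: number[] = [];
  for (let i = 0; i < n; i++) {
    const start = performance.now();
    await fetch(url);
    times.push(performance.now() - start);
  }
  return p50(times);
}

// measureP50("https://your-worker.example.workers.dev/api/jobs").then(console.log);
```

Run it from machines in different regions (or cheap VPSes) to see the geographic spread for yourself.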
7. Edge Database Options
The hardest part of edge computing is data. Your code runs at the edge, but where does the data live? Here are the current options:
| Database | Type | Edge-Compatible | Consistency | Best For |
|---|---|---|---|---|
| Cloudflare D1 | SQLite | Native | Eventual (reads), Strong (writes) | Read-heavy apps on Cloudflare |
| Cloudflare KV | Key-Value | Native | Eventual (~60s) | Configuration, feature flags, caching |
| Deno KV | Key-Value | Native | Strong (same region), Eventual (cross-region) | Deno Deploy apps |
| Turso (libSQL) | SQLite (distributed) | Yes (any platform) | Strong (single primary), Eventual (replicas) | Full SQL at the edge, any platform |
| PlanetScale | MySQL (Vitess) | Via HTTP driver | Strong | MySQL workloads needing edge access |
| Neon | PostgreSQL (serverless) | Via HTTP driver | Strong | PostgreSQL workloads |
| Upstash Redis | Redis (REST API) | Yes (any platform) | Strong (single region) | Caching, rate limiting, sessions |
The pattern I recommend: use an edge-native KV store (Cloudflare KV, Deno KV, or Upstash Redis) for caching and session data, and a serverless database (Neon, PlanetScale, or Turso) for persistent data. This gives you the latency benefits of edge for reads while maintaining data consistency for writes.
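In code, that two-tier pattern is a cache-aside loop. The store clients below (`kvGet`, `kvSet`, `dbQuery`) are placeholders for whichever KV and database drivers you pick:

```typescript
// Cache-aside at the edge: read from the KV cache first, fall back to the
// serverless database, then populate the cache with a short TTL.
type Job = { id: number; title: string; company: string };

interface Stores {
  kvGet: (key: string) => Promise<Job[] | null>;
  kvSet: (key: string, value: Job[], ttlSeconds: number) => Promise<void>;
  dbQuery: (q: string) => Promise<Job[]>;
}

async function searchJobs(
  query: string,
  stores: Stores
): Promise<{ jobs: Job[]; cache: "HIT" | "MISS" }> {
  const key = `search:${query.toLowerCase()}`;
  const cached = await stores.kvGet(key);
  if (cached) return { jobs: cached, cache: "HIT" };

  const jobs = await stores.dbQuery(query); // one round-trip to the central DB
  await stores.kvSet(key, jobs, 300); // subsequent reads stay at the edge
  return { jobs, cache: "MISS" };
}
```

The TTL is the consistency knob: a 5-minute cache means edge readers may see data up to 5 minutes stale, which is usually acceptable for search results and catalogs but not for balances or inventory.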
8. When Edge Computing Helps and When It Does Not
| Use Case | Edge Benefit | Recommendation |
|---|---|---|
| API responses with caching | Very High (5-10x faster) | Strong yes |
| Authentication / session validation | High (eliminates origin round-trip) | Yes |
| A/B testing / feature flags | High (instant decision) | Yes |
| Geolocation-based routing | Very High (native geo data) | Strong yes |
| Image/asset optimization | High (transform at edge) | Yes |
| Complex database queries | Low (data still centralized) | Not yet (unless using edge DB) |
| Long-running computations | None (edge has time limits) | No — use serverless functions |
| WebSocket connections | Medium (Durable Objects help) | Cloudflare only (via Durable Objects) |
| File uploads/processing | Low (need origin storage) | Not ideal |
9. My Opinionated Take
Edge computing is not a replacement for your server. It is a complement. Think of it as a smart caching and routing layer between your users and your origin. The origin still handles writes, complex queries, and long-running operations. The edge handles reads, routing, personalization, and anything that benefits from geographic proximity.
Cloudflare Workers is the best platform for most use cases in 2026. It has the widest network, the fastest cold starts, the most mature ecosystem (D1, KV, R2, Durable Objects, AI), and the most competitive pricing. If you are not already committed to Vercel's ecosystem, Cloudflare is the default choice.
Vercel Edge Functions are the right choice if you use Next.js. The integration is seamless, the middleware pattern is powerful, and you avoid managing a separate platform. But you pay a premium for the convenience.
The edge database problem is nearly solved. Two years ago, edge computing was limited by the "data gravity" problem — code at the edge, data in one region. With D1, Turso, Deno KV, and others, we are approaching a world where both code and data are at the edge. This changes everything.
Do not move everything to the edge. Start with one high-impact use case — typically API caching or geolocation-based routing. Measure the improvement. Then expand. Moving everything at once is over-engineering; moving the right things is optimization.
10. Action Plan: Getting Started with Edge Computing
Week 1: Experiment
- Sign up for Cloudflare Workers free tier (100K requests/day)
- Deploy a "Hello World" worker and check the latency from different locations
- Build a simple API proxy that caches your existing API responses at the edge
- Compare latency before and after with KeyCDN Performance Test
Week 2: Production Use Case
- Identify your highest-traffic, cacheable API endpoint
- Deploy an edge worker that caches and serves it
- Add geolocation-based logic (redirect users, customize content)
- Monitor performance improvements with your analytics tool
Week 3-4: Advanced Patterns
- Set up an edge database (D1, Turso, or Deno KV) for read replicas
- Implement edge-side rate limiting with Upstash Redis
- Add edge authentication for your API (validate JWTs at the edge)
- Consider migrating your Next.js middleware to Vercel Edge Functions
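For the rate-limiting step, the core logic is small enough to sketch. This version counts requests in fixed time windows against a minimal counter interface; in production you would back it with Upstash Redis or another shared store rather than the in-memory map used here for illustration:

```typescript
// Fixed-window rate limiting over a minimal counter interface, so the same
// logic can sit in front of any KV-style store.
interface CounterStore {
  incr(key: string): Promise<number>; // returns the count after incrementing
}

// In-memory store for illustration/testing. At the edge each isolate has its
// own memory, so real deployments need a shared store (e.g. Upstash Redis).
class MemoryStore implements CounterStore {
  private counts = new Map<string, number>();
  async incr(key: string): Promise<number> {
    const next = (this.counts.get(key) ?? 0) + 1;
    this.counts.set(key, next);
    return next;
  }
}

// Allow `limit` requests per `windowSeconds` per identifier (e.g. client IP).
async function allowRequest(
  store: CounterStore,
  id: string,
  limit: number,
  windowSeconds: number,
  nowMs = Date.now()
): Promise<boolean> {
  const windowIndex = Math.floor(nowMs / (windowSeconds * 1000));
  const count = await store.incr(`rl:${id}:${windowIndex}`);
  return count <= limit;
}
```

Fixed windows allow short bursts at window boundaries; if that matters, a sliding-window or token-bucket variant is the usual next step.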
Sources
- Cloudflare — Global Network
- Vercel — Edge Network Documentation
- Deno — Deno Deploy
- Deno — Deno KV Benchmarks
- Turso — Distributed SQLite
- Cloudflare Workers Documentation
I'm Ismat, and I build BirJob — Azerbaijan's job aggregator scraping 80+ sources daily.
