Serverless Showdown: Cloudflare Workers vs Lambda vs Cloud Functions vs Azure Functions
A deep technical comparison of serverless compute platforms — Cloudflare Workers, AWS Lambda, Google Cloud Functions, and Azure Functions — covering runtime architecture, cold starts, programming models, pricing, and the edge vs region debate.
Frequently Asked Questions
Find answers to common questions
Workers use V8 isolates instead of containers. A V8 isolate is a lightweight execution environment within the V8 JavaScript engine (the same engine that powers Chrome). Isolates spin up in under 1 millisecond because they do not need to boot an operating system, load a runtime, or initialize a container — they just create a new execution context within an already-running V8 instance. Lambda, Cloud Functions, and Azure Functions run code in containers or microVMs (Lambda uses Firecracker microVMs), which require 100ms to several seconds to initialize.
Not natively. Workers natively support JavaScript and TypeScript. Other languages can run via WebAssembly (WASM) — Rust, C, C++, and Go can be compiled to WASM and executed in Workers. Python support is available through Pyodide (a Python interpreter compiled to WASM), but it adds startup overhead and lacks full standard library support. For workloads requiring native Python, Java, or Go runtimes, Lambda, Cloud Functions, or Azure Functions are better choices.
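The JavaScript programming model is just a plain module export with a `fetch` handler. A minimal sketch (runnable even outside the Workers runtime on Node 18+, which provides the same `Request`/`Response` globals):

```javascript
// Minimal Workers-style module handler. In production this object would be
// the module's default export; here it is a local so it can be called directly.
const worker = {
  async fetch(request, env, ctx) {
    const url = new URL(request.url);
    return new Response(`hello from ${url.pathname}`, {
      status: 200,
      headers: { "content-type": "text/plain" },
    });
  },
};
```

There is no initialization phase to speak of: the isolate evaluates the module once, and every subsequent request reuses the already-live context.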
At 100M requests/month with 10ms average CPU time: Workers costs approximately $85/month (10M included free, 90M at $0.30/million, plus CPU time). Lambda with 128MB memory and 100ms average duration costs approximately $87/month (1M free, 99M at $0.20/million, plus GB-seconds). At this scale they are comparable, but Workers is significantly cheaper for I/O-heavy workloads where CPU time is low relative to wall-clock time, since you do not pay for time spent waiting on network requests.
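The per-request component of those estimates can be checked directly. This sketch uses only the request prices quoted above; the compute charges (CPU-ms for Workers, GB-seconds for Lambda) are additional and depend on current published rates:

```javascript
// Request-charge portion of the monthly bill at 100M requests/month,
// using the per-million request prices quoted above. Compute is extra.
function requestCharge(totalRequests, freeRequests, pricePerMillion) {
  const billable = Math.max(0, totalRequests - freeRequests);
  return (billable / 1e6) * pricePerMillion;
}

const workersRequests = requestCharge(100e6, 10e6, 0.30); // 90M billable × $0.30/M = $27
const lambdaRequests  = requestCharge(100e6, 1e6, 0.20);  // 99M billable × $0.20/M = $19.80
```

The rest of each estimate comes from compute: Workers bills CPU-milliseconds actually consumed, while Lambda bills GB-seconds of wall-clock duration — which is exactly why Workers pulls ahead on I/O-bound workloads.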
Durable Objects provide strongly consistent, single-threaded state coordination at the edge. Each Durable Object is a JavaScript class instance that runs in a single location and provides transactional storage. They solve the hard problem of distributed state without a traditional database — useful for real-time collaboration, rate limiting, game state, WebSocket coordination, and session management. No equivalent exists on Lambda, Cloud Functions, or Azure Functions.
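A Durable Object is essentially a class whose single live instance serializes all access to its storage. This sketch mimics that shape with an in-memory stand-in for `state.storage` (in a real Worker the runtime supplies a transactional storage API) so it runs anywhere:

```javascript
// Durable Object-shaped rate limiter sketch. The real Workers runtime
// provides `state.storage` (transactional, persisted); a Map stands in here.
class RateLimiter {
  constructor(state, limit = 3) {
    this.state = state;
    this.limit = limit;
  }
  // A single instance handles every request for its key, so this
  // read-check-write sequence is naturally serialized — no distributed locks.
  async allow() {
    const count = (await this.state.storage.get("count")) ?? 0;
    if (count >= this.limit) return false;
    await this.state.storage.put("count", count + 1);
    return true;
  }
}

// In-memory stand-in for the Workers storage API.
const storage = {
  map: new Map(),
  async get(k) { return this.map.get(k); },
  async put(k, v) { this.map.set(k, v); },
};
```

The single-threaded, single-location guarantee is the whole trick: because only one instance ever owns a given key, consistency falls out for free instead of requiring a coordination protocol.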
No. Lambda@Edge runs Node.js or Python functions at CloudFront's 13 regional edge caches, not at all 600+ edge locations. It has cold starts (100ms-5s), a 5-second execution limit for viewer triggers (30 seconds for origin triggers), and memory capped at 128MB for viewer triggers (up to 10GB for origin triggers). Workers runs V8 isolates at all 310+ PoPs with zero cold starts and sub-millisecond startup. CloudFront Functions is closer to Workers in concept (lightweight JavaScript at every edge location) but is severely limited: 1ms max execution, 2MB memory, no network access.
Workers: 30 seconds (free plan 10ms CPU time, paid plan 30 seconds CPU time). Lambda: 15 minutes. Google Cloud Functions 2nd gen: 60 minutes for HTTP-triggered functions (event-driven functions are capped at 9 minutes). Azure Functions Consumption plan: 10 minutes (can be extended to 60 minutes on Premium/Dedicated plans). Workers' shorter limit reflects its edge architecture — these are meant for request/response handling, not long-running batch jobs.
For globally distributed API backends with low-latency requirements, Cloudflare Workers is the best choice — zero cold starts, global deployment, and sub-10ms response times at the edge. For API backends that need heavy computation, large dependencies, or deep cloud service integration, Lambda with API Gateway is the most mature and feature-rich option. Azure Functions with Durable Functions is best for orchestration-heavy APIs with complex workflows.
Both provide serverless orchestration, but with different models. Azure Durable Functions lets you write orchestration logic in code (C#, JavaScript, Python) using familiar async/await patterns — the orchestrator function itself manages the workflow. AWS Step Functions uses a JSON-based state machine definition (Amazon States Language) that is separate from your function code. Durable Functions feels more natural to developers; Step Functions provides better visual workflow design and state management.
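The code-first model can be illustrated with a toy generator-driven orchestrator — the same shape Durable Functions' JavaScript API uses, though the real `durable-functions` package also replays and checkpoints state for durability:

```javascript
// Toy generator-driven orchestrator, echoing the code-first style of
// Azure Durable Functions. Each `yield` hands an activity request to the
// runner, which executes it and resumes the orchestrator with the result.
function* orchestrator(ctx) {
  const a = yield ctx.call("addOne", 2);   // hypothetical activity names
  const b = yield ctx.call("timesTen", a);
  return b;
}

function run(gen, activities) {
  const ctx = { call: (name, input) => ({ name, input }) };
  const it = gen(ctx);
  let step = it.next();
  while (!step.done) {
    const { name, input } = step.value;
    step = it.next(activities[name](input)); // feed the result back in
  }
  return step.value;
}

const out = run(orchestrator, { addOne: (x) => x + 1, timesTen: (x) => x * 10 });
// out === 30
```

A Step Functions equivalent would express the same two-step flow as an Amazon States Language JSON document deployed separately from the activity code — the trade-off the answer above describes.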
By default, Workers deploys to all 310+ locations globally. If you need to constrain where code and data reside (for data residency or compliance), Workers offers Jurisdiction restrictions on Durable Objects (EU-only, for example). Smart Placement is a related but distinct feature: it automatically runs your Worker close to the backend services it communicates with, as a performance optimization rather than a residency control. You cannot pin a Worker to a single specific region the way you can with Lambda.
Lambda's maximum memory allocation is 10GB (10,240 MB), and CPU scales proportionally with memory. If you need more than 10GB, you must move to a different compute model: ECS/Fargate tasks (up to 120GB), EC2 instances, or SageMaker for ML workloads. Similarly, if your function exceeds the 15-minute timeout, you need Step Functions for orchestration or ECS for long-running tasks. Workers' 128MB limit is the most restrictive of the four platforms.
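Because CPU scales linearly with memory (AWS documents roughly one full vCPU per 1,769MB of configured memory), you can estimate allocated vCPUs from the memory setting. A hedged sketch under that documented ratio:

```javascript
// Approximate vCPUs allocated to a Lambda function, assuming AWS's
// documented ratio of one vCPU per 1,769MB of configured memory.
const MB_PER_VCPU = 1769;
const vcpus = (memoryMb) => memoryMb / MB_PER_VCPU;

vcpus(1769);  // ≈ 1 vCPU
vcpus(10240); // ≈ 5.8 — at the 10GB cap Lambda allocates 6 vCPUs
```

This is why memory tuning doubles as CPU tuning on Lambda: a CPU-bound function at 1GB may run faster and cheaper at 3GB, since duration drops roughly in proportion.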