Cloudflare Cheat Sheet
wrangler.toml
1name = "my-site"
2compatibility_date = "2025-09-24"
3
4[site]
5bucket = "./public"
6
7[triggers]
8crons = ["0 0 * * *"] # daily at midnight UTC
9
10# Bind the Rust Worker under /api/rust
11[[routes]]
12pattern = "example.com/api/rust/*"
13zone_name = "example.com"
14service = "rust-worker"
Folder structure
.
├── wrangler.toml           # main site config
├── public/
│   └── index.html          # static site root
├── functions/              # best for small APIs tied to the site
│   ├── api/
│   │   └── hello.ts        # /api/hello
│   ├── auth/
│   │   └── login.ts        # /auth/login
│   └── health.ts           # /health
├── worker.ts               # cron job handler (JS/TS)
└── rust-worker/            # separate Worker (best for heavier/isolated services)
    ├── Cargo.toml
    ├── wrangler.toml
    └── src/
        └── lib.rs
Pages Functions routing
- functions/api/hello.ts → /api/hello
- functions/auth/login.ts → /auth/login
- functions/health.ts → /health
- The Rust Worker is mounted separately at /api/rust/*.
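Dynamic segments are also supported via bracketed file names; a hypothetical example (this file is not part of the tree above):
functions/api/[id].ts
export async function onRequestGet({ params }) {
  // params.id carries the matched path segment, e.g. /api/42 -> "42"
  return new Response(`Requested id: ${params.id}`);
}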
Static Asset Headers (_headers)
Custom headers for static assets: create a _headers file in your static directory (the file itself is not served as an asset).
public/_headers
/assets/*
  Cache-Control: public, max-age=31536000

/*
  X-Frame-Options: DENY

# Block search crawlers on preview URLs
https://:project.pages.dev/*
  X-Robots-Tag: noindex

https://:version.:project.pages.dev/*
  X-Robots-Tag: noindex
Note: _headers only applies to static assets, not to responses generated by Pages Functions.
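Pages Functions set headers on their Response objects directly instead; a minimal sketch:
export async function onRequestGet() {
  // Headers for Function responses are set here, not in _headers
  return new Response("handled by a Function", {
    headers: {
      "Cache-Control": "no-store",
      "X-Frame-Options": "DENY",
    },
  });
}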
Workers
functions/api/hello.ts
export async function onRequestGet() {
  return new Response("Hello from /api/hello");
}
functions/auth/login.ts
export async function onRequestPost({ request }) {
  const body = await request.json().catch(() => ({}));
  return new Response(`Login called with: ${JSON.stringify(body)}`);
}
functions/health.ts
export async function onRequestGet() {
  return new Response("OK", { status: 200 });
}
Cron Trigger
worker.ts
export default {
  async scheduled(event, env, ctx) {
    console.log("Cron ran at", event.cron);
  },
};
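For asynchronous work in the handler, wrap the promise in ctx.waitUntil so the runtime keeps the invocation alive until it settles; a sketch with a placeholder URL:
export default {
  async scheduled(event, env, ctx) {
    // Placeholder nightly job; waitUntil keeps the Worker alive until the fetch completes
    ctx.waitUntil(fetch("https://example.com/nightly-job"));
  },
};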
Rust Worker (rust-worker/)
rust-worker/wrangler.toml
name = "rust-worker"
main = "build/worker/shim.mjs"
compatibility_date = "2025-09-24"

[build]
command = "cargo install -q worker-build && worker-build --release"
rust-worker/src/lib.rs
use worker::*;

#[event(fetch)]
async fn main(_req: Request, _env: Env, _ctx: Context) -> Result<Response> {
    Response::ok("Hello from Rust Worker at /api/rust")
}
Deployment commands
Dev locally:
npx wrangler dev
Publish to Cloudflare:
npx wrangler deploy # wrangler publish is the older, deprecated name
Check logs:
npx wrangler tail
Test cron locally:
npx wrangler dev --test-scheduled
Run Rust worker (inside rust-worker/):
npx wrangler dev
npx wrangler deploy
Caching
Three levels of caching in Cloudflare Workers:
Browser Cache
- User’s browser caches responses locally based on Cache-Control headers
- Cost: Free (happens on user’s device)
Edge Cache
- Cloudflare’s edge servers cache responses using cf.cacheTtlByStatus configuration
- Cost: Free (Cloudflare’s standard edge caching)
Cache API
- Worker explicitly checks caches.default.match() for cached responses before making upstream requests
- Programmatic control over what gets cached and when
- Cost: Free (Cloudflare Workers Cache API has no charges)
Two ways to interact with Cloudflare cache:
- Via fetch() with cf options: customize caching of proxied requests
- Via the Cache API: cache Worker-generated responses with custom keys
Cache control headers
// Cache for 1 hour in browser, 1 day at edge
return new Response(data, {
  headers: {
    "Cache-Control": "public, max-age=3600, s-maxage=86400",
  },
});

// No cache
return new Response(data, {
  headers: {
    "Cache-Control": "no-cache, no-store, must-revalidate",
  },
});
fetch() with cf options
Control caching behavior when proxying requests through Cloudflare:
// Force cache with TTL (ignores origin headers)
const response = await fetch(url, {
  cf: {
    cacheTtl: 3600, // Cache for 1 hour
    cacheEverything: true, // Cache all content types, not just static
  },
});

// TTL by status code
const response = await fetch(url, {
  cf: {
    cacheTtlByStatus: {
      "200-299": 86400, // Success: 1 day
      "404": 60, // Not found: 1 minute
      "500-599": 0, // Errors: no cache
    },
  },
});

// Custom cache key (serve same cached response for different URLs)
const response = await fetch(url, {
  cf: { cacheKey: "my-custom-key" },
});
| Option | Description |
|---|---|
| cacheTtl | Force cache TTL in seconds (overrides origin headers) |
| cacheTtlByStatus | TTL per status code range. Negative = don’t cache |
| cacheEverything | Cache all content types (default: only static assets) |
| cacheKey | Custom string key for cache lookup |
| cacheTags | Array of tags for tag-based purging |
Note: cf options only work for proxied fetch() requests, not for responses you generate in the Worker. For Worker-generated responses, use the Cache API.
Cache API
const cache = caches.default;
const cacheKey = new Request(url, request);

// Check cache first
let response = await cache.match(cacheKey);
if (response) {
  return response;
}

// Fetch and cache
response = await fetch(request);
await cache.put(cacheKey, response.clone());
return response;
Custom cache keys for immutable data
Use a synthetic URL as the cache key when caching by a unique identifier (e.g., accession number, commit hash) rather than the request URL:
const cache = caches.default;
// Synthetic URL - doesn't need to be a real endpoint
const cacheKey = new Request(`https://example.com/cache/${accessionNumber}`);

const cached = await cache.match(cacheKey);
if (cached) {
  return cached;
}

const response = await fetch(upstreamUrl);
const immutable = new Response(response.body, {
  headers: { "Cache-Control": "public, max-age=31536000, immutable" },
});
cache.put(cacheKey, immutable.clone()); // fire-and-forget
return immutable;
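When a ctx (or Pages Functions context) object is in scope, wrapping the fire-and-forget put in waitUntil ensures the write is not cancelled once the response is returned; a sketch assuming ctx is available:
// Same as above, but the cache write survives the handler returning
ctx.waitUntil(cache.put(cacheKey, immutable.clone()));
return immutable;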
Auth-dependent caching
Cloudflare ignores Vary headers by default (except Accept-Encoding). To cache different responses for authenticated vs anonymous users, encode the variant into the cache key:
const cache = caches.default;
const isAuth = hasSessionCookie(request);
// Include auth state in cache key URL
const cacheKey = new Request(
  `https://example.com/cache/${path}?auth=${isAuth ? "1" : "0"}`
);

const cached = await cache.match(cacheKey);
if (cached) return cached;

const response = await fetchWithAuthLogic(request, isAuth);
cache.put(cacheKey, response.clone());
return response;
Note: Vary headers still work for browser caching and downstream proxies, just not Cloudflare’s edge cache.
When to use which:
| Use case | Approach |
|---|---|
| Proxy upstream API, control TTL | fetch() with cf.cacheTtl |
| Cache by status code | fetch() with cf.cacheTtlByStatus |
| Worker-generated responses | Cache API with caches.default |
| Custom cache key (immutable data) | Cache API with synthetic URL |
| Auth-dependent responses | Cache API with variant in key |
Rate Limiting
Options:
- Workers Rate Limit bindings - Built-in rate limiting for Workers
- Durable Objects rate limiter - Custom rate limiting with Durable Objects
- WAF rate limiting rules - Edge-level rate limiting
When to use which:
- WAF rate limiting: Traffic protection at the edge (DDoS, basic abuse). No worker code needed.
- Workers rate limiting (bindings): Simple per-key limits within workers. Built-in, fast, minimal setup.
- Durable Objects: Complex rate limiting logic, custom algorithms, or when you need per-user/per-IP state with geographic distribution.
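A minimal sketch of the built-in binding approach. The rate limiting binding is in open beta, so the exact config keys may change; the binding name, namespace_id, and limits below are arbitrary examples.
wrangler.toml:
[[unsafe.bindings]]
name = "API_RATE_LIMITER"
type = "ratelimit"
namespace_id = "1001"                   # arbitrary ID, unique per limiter in the account
simple = { limit = 100, period = 60 }   # 100 requests per 60 seconds per key
Worker:
export default {
  async fetch(request, env) {
    // Key the limit on the client IP; any string key works
    const ip = request.headers.get("CF-Connecting-IP") ?? "unknown";
    const { success } = await env.API_RATE_LIMITER.limit({ key: ip });
    if (!success) {
      return new Response("Too Many Requests", { status: 429 });
    }
    return new Response("OK");
  },
};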
Service Bindings
Service bindings enable Worker-to-Worker communication for building microservices without public URLs.
wrangler.toml:
[[services]]
binding = "AUTH_SERVICE"
service = "auth-worker"
HTTP-style:
export default {
  async fetch(request, env) {
    return env.AUTH_SERVICE.fetch(request);
  },
};
RPC-style:
// auth-worker
import { WorkerEntrypoint } from "cloudflare:workers";

export default class AuthService extends WorkerEntrypoint {
  async authenticate(token: string) {
    return { userId: "123", valid: true };
  }
}

// main-worker
export default {
  async fetch(request, env) {
    const token = request.headers.get("Authorization");
    const result = await env.AUTH_SERVICE.authenticate(token);
    return Response.json(result);
  },
};
Wrangler basics
Wrangler is the CLI to manage Workers, Pages, and Cloudflare services. Run commands with:
npx wrangler <COMMAND> <SUBCOMMAND> [PARAMETERS] [OPTIONS]
Wrangler CLI commands
- wrangler dev — Start local dev server.
- wrangler deploy — Deploy Worker to Cloudflare.
- wrangler tail — Stream Worker logs.
- wrangler pages — Configure Cloudflare Pages.
- wrangler kv namespace|key|bulk — Manage Workers KV.
- wrangler r2 bucket|object — Manage R2 storage.
- wrangler d1 — Manage D1 databases.
- wrangler vectorize — Manage Vectorize indexes.
- wrangler queues — Manage Queues.
- wrangler secret — Manage Worker secrets.
- wrangler login / logout — Authenticate with Cloudflare.
- wrangler whoami — Show authenticated user.
- wrangler check — Validate Worker configuration.
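A few common invocations (resource names are placeholders; subcommand syntax can differ between Wrangler versions):
npx wrangler secret put API_KEY                      # prompts for the secret value
npx wrangler r2 bucket create my-bucket
npx wrangler d1 execute my-db --command "SELECT 1"
npx wrangler kv namespace create MY_KV               # older versions use kv:namespace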
Cloudflare recommends local project install via npm, yarn, or pnpm:
npm install -D wrangler
npx wrangler dev
AWS
aws4fetch is an AWS client that is compatible with Cloudflare Workers: https://developers.cloudflare.com/r2/examples/aws/aws4fetch/
NOTE: aws-sdk is not compatible with Cloudflare Workers.
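A minimal sketch of signing an R2 (S3-compatible) request with aws4fetch; the account ID, bucket, object key, and secret binding names are placeholders:
import { AwsClient } from "aws4fetch";

export default {
  async fetch(request, env) {
    // Credentials stored as Worker secrets (placeholder binding names)
    const r2 = new AwsClient({
      accessKeyId: env.R2_ACCESS_KEY_ID,
      secretAccessKey: env.R2_SECRET_ACCESS_KEY,
      service: "s3",
      region: "auto",
    });
    // Signed GET against the R2 S3 API endpoint
    const url = `https://${env.ACCOUNT_ID}.r2.cloudflarestorage.com/my-bucket/my-object`;
    return r2.fetch(url);
  },
};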
Tags: Devops, Cloudflare, Edge-Computing, Cdn, Workers, Serverless