Why I Chose Cloudflare's Edge-First Stack for My B2B Marketplace
From Next.js + Vercel to Cloudflare Workers: Why I rebuilt my entire infrastructure
Building a B2B2C marketplace connecting event organizers with service providers, I needed a tech stack that could handle real-time interactions, scale across diverse regions, and keep costs predictable while growing. Here’s why I went all-in on Cloudflare’s edge ecosystem.
The Business Context First
The platform isn’t just another marketplace. It’s solving a real problem in Europe’s fragmented event industry, where finding verified caterers, photographers, or venues often happens through WhatsApp groups and personal connections.
The platform needs to handle:
Real-time quote negotiations between clients and vendors
Geographic distribution across all European regions
Subscription billing for users/vendors
File-heavy workflows (proposals, contracts, media)
Multi-language support (English primary, plus Italian, Spanish, and seven other languages)
The Stack Decision: Edge-First Architecture
Core Framework: React Router v7 + Vite
// Modern React with full SSR support (React Router v7 consolidates imports into "react-router")
import { createBrowserRouter, Outlet } from "react-router";
Why this choice:
SSR out of the box - Critical for SEO in competitive markets
Type safety with TypeScript everywhere
Vite’s speed - Sub-second rebuilds during development
No Next.js complexity - Router v7 gives me routing without framework lock-in
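To make the SSR point concrete, here’s a minimal sketch of what a typed, server-rendered route module looks like in React Router v7’s framework mode. The file path, generated +types import, and data shape are illustrative, not the platform’s actual code:
// app/routes/vendors.$region.tsx - illustrative route module
import type { Route } from "./+types/vendors.$region";

// The loader runs on the server before render, so vendor pages arrive fully rendered for SEO
export async function loader({ params }: Route.LoaderArgs) {
  const vendors: { id: string; companyName: string }[] = []; // queried from D1 in practice
  return { region: params.region, vendors };
}

export default function VendorsByRegion({ loaderData }: Route.ComponentProps) {
  return (
    <main>
      <h1>Vendors in {loaderData.region}</h1>
      <p>{loaderData.vendors.length} verified vendors</p>
    </main>
  );
}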
Runtime: Cloudflare Workers
The game-changer. Instead of traditional servers, every request runs on Cloudflare’s edge network.
export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    // Runs in 200+ cities on Cloudflare's global network
    // 0ms cold starts
    // Automatic scaling
    return new Response("Hello from the edge");
  },
};
Real benefits:
Sub-500ms response times Europe-wide - observed ~485ms average page load in the browser
Zero server management - No Docker, no Kubernetes, no AWS complexity
Automatic scaling - Handles traffic spikes during wedding season
Cost predictability - Pay per request, not idle server time (large free tiers, ideal for startups!)
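A note on how services attach: everything below (D1, KV, R2, Durable Objects) reaches the Worker through that env parameter as typed bindings. A minimal sketch, assuming binding names that must match your Wrangler configuration (the names here are illustrative):
// Types come from @cloudflare/workers-types; binding names are assumptions, not the real ones
interface Env {
  DB: D1Database;                             // structured business data (D1)
  KV: KVNamespace;                            // sessions, cache, small config
  R2: R2Bucket;                               // proposals, portfolios, contracts
  QUOTE_NEGOTIATION: DurableObjectNamespace;  // real-time negotiation rooms
}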
Database: Cloudflare D1 (SQLite at Edge)
-- D1: SQLite-compatible database on Cloudflare's network
CREATE TABLE vendors (
  id TEXT PRIMARY KEY,
  company_name TEXT NOT NULL,
  region TEXT NOT NULL,
  subscription_status TEXT CHECK(subscription_status IN ('active', 'trial', 'expired'))
);
Why D1 over traditional databases:
Geographic replication - Data close to users
SQLite familiarity - No new query language to learn
Built-in migrations - Schema changes deploy with code
Free tier generosity - substantial daily read and write allowances before paying anything
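Day to day, queries go through D1’s prepared-statement API from the Worker. A minimal sketch against the vendors table above, assuming the DB binding from the earlier Env sketch:
// Parameterized query against D1 - the same SQLite SQL, with values bound safely
async function activeVendorsInRegion(env: Env, region: string) {
  const { results } = await env.DB
    .prepare("SELECT id, company_name FROM vendors WHERE region = ?1 AND subscription_status = ?2")
    .bind(region, "active")
    .all<{ id: string; company_name: string }>();
  return results;
}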
Storage Strategy: KV + R2
Cloudflare KV for sessions and config:
// Lightning-fast key-value reads and simple writes
await env.KV.get(`session:${userId}`);
await env.KV.put(`vendor:${id}:subscription`, JSON.stringify(data));
Cloudflare R2 for files:
// S3-compatible but cheaper egress
await env.R2.put(`proposals/${eventId}.pdf`, fileData);
The storage split logic:
KV: User sessions, cache, small config data
R2: Event proposals, vendor portfolios, contract documents
D1: Structured business data and relationships
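In practice a single request often touches more than one of these. A hedged sketch of what a proposal download handler might look like under that split (the function name, cookie format, and key layout are illustrative):
// Session looked up in KV, the file itself streamed from R2
async function downloadProposal(request: Request, env: Env, eventId: string): Promise<Response> {
  const sessionId = request.headers.get("Cookie")?.match(/session=([^;]+)/)?.[1];
  const session = sessionId ? await env.KV.get(`session:${sessionId}`, "json") : null;
  if (!session) return new Response("Unauthorized", { status: 401 });

  const object = await env.R2.get(`proposals/${eventId}.pdf`);
  if (!object) return new Response("Not found", { status: 404 });

  return new Response(object.body, { headers: { "Content-Type": "application/pdf" } });
}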
Real-time: Durable Objects
For live features that traditional databases can’t handle:
import { DurableObject } from "cloudflare:workers";

export class QuoteNegotiation extends DurableObject {
  // Stateful real-time negotiations: each event gets its own "room"
  async fetch(request: Request): Promise<Response> {
    const { 0: client, 1: server } = new WebSocketPair();
    this.ctx.acceptWebSocket(server); // hibernation-friendly WebSocket handling
    return new Response(null, { status: 101, webSocket: client });
  }
}
Use cases:
Live quote updates - Vendors adjust prices in real-time
Chat systems - Direct client-vendor communication
Availability calendars - Real-time venue booking conflicts
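Routing is the key trick: every client interested in the same event is sent to the same Durable Object instance by deriving the object id from the event id. A minimal sketch, using the QUOTE_NEGOTIATION binding name assumed earlier:
// Deterministic id per event: all participants in a negotiation land in the same object
async function joinNegotiation(request: Request, env: Env, eventId: string): Promise<Response> {
  const id = env.QUOTE_NEGOTIATION.idFromName(eventId);
  const room = env.QUOTE_NEGOTIATION.get(id);
  return room.fetch(request); // the Durable Object upgrades and accepts the WebSocket
}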
The Practical Trade-offs
What I Gave Up
Familiar deployment patterns - No Heroku-style git push
Rich ORM ecosystem - Prisma support for D1 is still early
Large package ecosystem - Some Node.js packages don’t work in Workers
What I Gained
Operational simplicity - Zero servers to maintain
Geographic performance - Fast everywhere in Europe by default
Cost structure - Scales from €0 to profitable without big hosting jumps
Developer experience - Local development with Wrangler CLI matches production exactly
Real-World Performance Numbers
After 3 months in beta:
Average server response time: 0.5ms (Europe-wide)
Average browser load time: 485ms
P95 response time: 120ms
Monthly hosting costs: €0 for 30k requests
Zero downtime incidents - Edge redundancy just works
Compare this to my previous Next.js + Vercel setup:
Average response time: 180ms (single region)
Monthly costs: €89 for similar traffic
3 outages in the same period
The Developer Experience Reality
Local development with Wrangler:
npm run dev # Runs the entire stack locally
wrangler d1 execute DB --local --command "SELECT * FROM vendors"
wrangler r2 object get bucket-name/file.pdf --local
Deployment:
npm run deploy # One command, global deployment
Everything runs the same locally and in production. No Docker differences, no environment config hell.
When This Stack Makes Sense
Perfect for:
Geographic applications - Users spread across countries/regions
B2B SaaS - Predictable traffic patterns, subscription billing
Document-heavy workflows - Contracts, proposals, media files
Real-time features - Chat, live updates, collaborative editing
Avoid if:
Heavy computational workloads - ML training, video processing
Complex database requirements - Advanced PostgreSQL features
Large existing team - Switching costs might outweigh benefits
The Bottom Line
Six months in, this edge-first architecture proved itself. The platform handles peak wedding season traffic without breaking, costs stay predictable as we grow, and I spend zero time on infrastructure.
The European event market is competitive and relationship-driven. Having a platform that’s genuinely fast and reliable everywhere in Europe - from Greece to Norway - gives us a real competitive advantage.
Would I choose this stack again? Absolutely. The operational simplicity alone is worth it.
For other developers building B2B marketplaces: Consider whether your users are geographically distributed and your workflows are more CRUD + real-time than heavy computation. If yes, this edge-first approach might be your secret weapon too.
Building something similar or have questions about implementing this stack? Hit me up - always happy to discuss the real-world trade-offs.


