Caddy delivers serious performance with automatic HTTPS and zero config. Here's how it works under the hood.

Why Caddy might be the fastest Go web server you can run


I switched from nginx to Caddy about a year ago. The automatic HTTPS was the selling point. But the performance surprised me more.

Caddy is a web server written entirely in Go. It handles HTTP/1.1, HTTP/2, and HTTP/3 out of the box. No config files to fiddle with for TLS. It just works.

But how does it perform? And what makes it fast?

What Makes Caddy Fast

Caddy benefits from Go’s concurrency model. Each incoming request is handled on its own goroutine, which is far cheaper than an OS thread. That means thousands of concurrent connections without the per-thread overhead you’d see in traditional thread-per-connection servers.

The server uses Go’s net/http package as its foundation, but adds significant optimizations on top. Connection pooling, efficient buffer reuse, and smart timeout handling all contribute to performance.
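To make the goroutine-per-request point concrete, here’s a plain net/http sketch of my own (the timeout values are illustrative, not what Caddy uses internally): every handler invocation runs on its own goroutine, and the server-level timeouts are the kind of knobs Caddy tunes for you.

package main

import (
	"fmt"
	"log"
	"net/http"
	"time"
)

func main() {
	mux := http.NewServeMux()
	mux.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		// net/http runs each request on its own goroutine,
		// so a slow handler doesn't block other connections.
		fmt.Fprintln(w, "hello")
	})

	srv := &http.Server{
		Addr:         ":9090",
		Handler:      mux,
		ReadTimeout:  10 * time.Second, // illustrative values only
		WriteTimeout: 10 * time.Second,
		IdleTimeout:  60 * time.Second,
	}
	log.Fatal(srv.ListenAndServe())
}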

Here’s what a basic Caddy setup looks like using the Caddyfile:

example.com {
    reverse_proxy localhost:8080
}

That’s it. Caddy automatically:

  • Obtains a TLS certificate via ACME
  • Redirects HTTP to HTTPS
  • Enables HTTP/2 and HTTP/3
  • Handles certificate renewal

No certbot cron jobs. No nginx config blocks. The security and privacy benefits come free.
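To try it, put that block in a file named Caddyfile and run caddy run --config Caddyfile from the same directory (assuming the domain’s DNS already points at your machine, since the ACME challenge has to reach it). Caddy provisions the certificate on first start and keeps it renewed from then on.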

Benchmarking Caddy as a Reverse Proxy

I ran some tests with Caddy as a reverse proxy in front of a simple Go backend. The backend just returns JSON:

package main

import (
	"encoding/json"
	"log"
	"net/http"
)

type Response struct {
	Message string `json:"message"`
	Status  int    `json:"status"`
}

func main() {
	// Minimal JSON endpoint used as the upstream for the proxy tests.
	http.HandleFunc("/api", func(w http.ResponseWriter, r *http.Request) {
		resp := Response{
			Message: "Hello from backend",
			Status:  200,
		}
		w.Header().Set("Content-Type", "application/json")
		json.NewEncoder(w).Encode(resp)
	})
	
	log.Fatal(http.ListenAndServe(":8080", nil))
}

With Caddy proxying this backend, I consistently saw sub-millisecond proxy overhead. The proxy layer adds almost nothing to your latency.
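If you want to poke at this yourself, a minimal local setup is enough; Caddy issues a locally trusted certificate for a localhost site name, so no real domain is needed:

localhost {
    reverse_proxy localhost:8080
}

Then hit the backend directly and through the proxy with whatever HTTP load tool you prefer, and compare the latency distributions; the difference is the proxy overhead.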

Extending Caddy with Go

One of Caddy’s best features is its plugin system. You can write custom modules in Go that compile directly into the binary. This is much faster than external scripting.

Here’s a minimal middleware example:

package custommodule

import (
	"net/http"

	"github.com/caddyserver/caddy/v2"
	"github.com/caddyserver/caddy/v2/caddyconfig/caddyfile"
	"github.com/caddyserver/caddy/v2/modules/caddyhttp"
)

func init() {
	caddy.RegisterModule(Middleware{})
}

type Middleware struct{}

// CaddyModule tells Caddy how to identify and instantiate this handler.
func (Middleware) CaddyModule() caddy.ModuleInfo {
	return caddy.ModuleInfo{
		ID:  "http.handlers.custom",
		New: func() caddy.Module { return new(Middleware) },
	}
}

func (m Middleware) ServeHTTP(w http.ResponseWriter, r *http.Request, next caddyhttp.Handler) error {
	// Add custom header
	w.Header().Set("X-Custom-Header", "processed")
	return next.ServeHTTP(w, r)
}

// UnmarshalCaddyfile is a no-op here because this middleware takes no options.
func (m *Middleware) UnmarshalCaddyfile(d *caddyfile.Dispenser) error {
	return nil
}

This compiles into Caddy itself. No runtime overhead from plugin loading.
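To get a module like this into a Caddy binary, the usual route is xcaddy, e.g. xcaddy build --with github.com/yourname/custommodule (that module path is just a placeholder for wherever your code lives). Note that exposing it as a Caddyfile directive also requires registering it with httpcaddyfile.RegisterHandlerDirective, which I’ve left out of the minimal example above.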

Automatic HTTPS Performance

The automatic HTTPS feature uses the ACME protocol to obtain certificates. Caddy caches these intelligently and handles renewal in the background.

First-time certificate issuance adds latency to the initial request. But after that? Zero overhead. The TLS handshake is optimized, and HTTP/3 with QUIC reduces connection setup time significantly.
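One small, optional addition is a contact email in the Caddyfile’s global options block (the address below is just a placeholder), so the ACME CA has a way to reach you about the certificates it issues:

{
    email admin@example.com
}

example.com {
    reverse_proxy localhost:8080
}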

If you’re building a Go HTTP server and need a production-ready frontend, Caddy pairs well. You can focus on your application logic while Caddy handles the infrastructure.

When to Use Caddy

Caddy shines when you need:

  • Automatic HTTPS without configuration
  • A fast reverse proxy for Go backends
  • HTTP/3 support
  • Simple configuration via Caddyfile

For raw static file serving, nginx still edges ahead in some benchmarks. But for most Go applications where you’re proxying to a backend, Caddy’s performance is excellent.
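For completeness, static file serving in Caddy is a one-directive affair; the site root below is just a placeholder path:

example.com {
    root * /var/www/site
    encode gzip
    file_server
}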

Final Thoughts

Caddy proves that a web server written in Go can compete with C-based alternatives. The automatic certificate management alone saves hours of DevOps work. The performance is a bonus.

Check out the official Caddy documentation for more configuration options. And if you’re writing custom middleware, their extending Caddy guide is worth reading.