Build Speed Optimization
A slow Docker build means slow CI pipelines, slow developer iteration, and wasted compute cost. Most build slowness comes from cache invalidation -- small changes triggering expensive reinstalls. This lesson covers how to design cache-friendly Dockerfiles.
The #1 Rule: Dependencies Before Source Code
Docker caches each layer. When a layer changes, all subsequent layers are rebuilt. Put slow, stable steps first:
flowchart TD
A["COPY package.json<br/>(changes rarely)"] --> B["RUN npm ci<br/>(cached if lockfile unchanged)"]
B --> C["COPY . .<br/>(changes every build)"]
C --> D["RUN npm run build<br/>(always runs)"]
style A fill:#e8f5e9,stroke:#2e7d32
style B fill:#e8f5e9,stroke:#2e7d32
style C fill:#fff3e0,stroke:#ef6c00
style D fill:#fff3e0,stroke:#ef6c00
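The mechanics behind this rule can be sketched without Docker at all: a COPY layer's cache key is derived from the content hash of the files it copies, so editing source files leaves the lockfile's hash, and therefore the install layer, untouched. The paths and file contents below are hypothetical, not from a real project.

```shell
# Simulate Docker's cache-key check: the npm-ci layer depends only on the
# hash of the lockfile it copied, not on other files in the context.
workdir=$(mktemp -d)
cd "$workdir"
echo '{"lockfileVersion": 3}' > package-lock.json
echo 'console.log("v1")' > app.js

key_before=$(sha256sum package-lock.json | cut -d' ' -f1)
echo 'console.log("v2")' > app.js          # source change only
key_after=$(sha256sum package-lock.json | cut -d' ' -f1)

if [ "$key_before" = "$key_after" ]; then
  echo "npm ci layer: CACHED"              # lockfile hash unchanged
fi
```

This is why the Good ordering below keeps `npm ci` cached across ordinary code edits.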
Bad: Source First (Cache Broken Every Time)
# Every code change invalidates npm ci cache
COPY . .
RUN npm ci
RUN npm run build
Good: Dependencies First (Cache Preserved)
# Dependencies cached until package.json changes
COPY package.json package-lock.json ./
RUN npm ci
COPY . .
RUN npm run build
This pattern works for every language:
| Language | Copy First | Install Command |
|---|---|---|
| Node.js | package.json, package-lock.json | npm ci |
| Python | requirements.txt | pip install -r requirements.txt |
| Go | go.mod, go.sum | go mod download |
| Rust | Cargo.toml, Cargo.lock | cargo build --release |
| Java | pom.xml / build.gradle | mvn dependency:resolve |
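Applied to the Python row above, the pattern looks like this; a minimal sketch in which the image tag, paths, and entrypoint are illustrative assumptions:

```dockerfile
FROM python:3.12-slim
WORKDIR /app

# Dependency layer: rebuilt only when requirements.txt changes
COPY requirements.txt ./
RUN pip install -r requirements.txt

# Source layer: rebuilt on every code change, but cheap
COPY . .
CMD ["python", "main.py"]
```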
Use .dockerignore
A large build context slows every build. Exclude files Docker does not need:
.git
node_modules
dist
build
*.md
.env
.vscode
__pycache__
Check your context size:
# See how much data is sent to the Docker daemon (this first line is printed
# by the legacy builder; BuildKit reports "transferring context" instead when
# run with --progress=plain)
docker build . 2>&1 | head -1
# => Sending build context to Docker daemon  15.2MB
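Before reaching for the daemon output, you can spot-check which entries in the project directory are heaviest; the directory names and sizes below are hypothetical stand-ins:

```shell
# Find the heaviest entries in the build context before Docker sends them --
# the largest are prime candidates for .dockerignore.
mkdir -p demo/node_modules demo/src
head -c 1048576 /dev/zero > demo/node_modules/blob.bin   # ~1 MiB stand-in
echo 'console.log("app")' > demo/src/app.js

du -sk demo/* | sort -rn    # largest entries first, sizes in KiB
```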
Enable BuildKit
BuildKit is Docker's modern build engine with better caching, parallelism, and features. It is the default builder in Docker Engine 23.0 and later; on older versions, enable it explicitly:
# Enable for a single build
DOCKER_BUILDKIT=1 docker build -t my-app .
# Enable permanently in daemon.json
# { "features": { "buildkit": true } }
BuildKit Cache Mounts
Persist package manager caches across builds without storing them in the image. Cache mounts require BuildKit; on older Docker versions, also add # syntax=docker/dockerfile:1 as the first line of the Dockerfile:
# Node: cache npm modules
RUN --mount=type=cache,target=/root/.npm \
    npm ci
# Python: cache pip downloads
RUN --mount=type=cache,target=/root/.cache/pip \
    pip install -r requirements.txt
# Go: cache module downloads
RUN --mount=type=cache,target=/go/pkg/mod \
    go mod download
# APT: cache package downloads
RUN --mount=type=cache,target=/var/cache/apt,sharing=locked \
    --mount=type=cache,target=/var/lib/apt,sharing=locked \
    apt-get update && apt-get install -y curl
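Putting the pieces together, a complete Dockerfile combines the dependencies-first ordering with a cache mount; this is a sketch, with the image tag and cache path taken from npm's default cache location rather than from this lesson:

```dockerfile
# syntax=docker/dockerfile:1
FROM node:20-slim
WORKDIR /app

COPY package.json package-lock.json ./
# The mount persists /root/.npm between builds without baking it into the image
RUN --mount=type=cache,target=/root/.npm npm ci

COPY . .
RUN npm run build
```

On a lockfile change, only the download step repeats; the cache mount means even that step mostly reads from the persisted npm cache.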
Remote Cache for CI
CI runners often start with no local cache. Use registry-based caching:
# Build and push cache to registry
docker buildx build \
--cache-to type=registry,ref=registry.example.com/cache/my-app \
--cache-from type=registry,ref=registry.example.com/cache/my-app \
-t my-app:1.0.0 .
This lets every CI run benefit from layers built by previous runs.
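On GitHub Actions specifically, a common alternative to a registry cache is BuildKit's built-in gha cache backend. The workflow fragment below is an illustrative assumption, not part of this lesson; the job name and tag are placeholders:

```yaml
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: docker/setup-buildx-action@v3
      - uses: docker/build-push-action@v6
        with:
          context: .
          tags: my-app:1.0.0
          cache-from: type=gha   # read layers cached by earlier runs
          cache-to: type=gha,mode=max
```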
Build Speed Diagnostics
# Time a clean build (no cache)
time docker build --no-cache -t my-app:test .
# Time an incremental build (with cache)
time docker build -t my-app:test .
# See which layers are cached vs rebuilt
docker build -t my-app:test . 2>&1 | grep -E "CACHED|RUN"
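For a quick cache-hit summary, you can post-process a saved build log; the log content below is a hand-written stand-in for real `docker build --progress=plain` output:

```shell
# Count how many steps were cache hits in a saved BuildKit log.
cat > build.log <<'EOF'
#5 [2/5] COPY package.json package-lock.json ./
#5 CACHED
#6 [3/5] RUN npm ci
#6 CACHED
#7 [4/5] COPY . .
#7 DONE 0.1s
#8 [5/5] RUN npm run build
#8 DONE 12.3s
EOF

cached=$(grep -c 'CACHED$' build.log)
echo "cached steps: $cached"
```

If the count drops to zero after a change that only touched source files, revisit the layer ordering from the #1 Rule above.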
Build Speed Checklist
| Check | Fix |
|---|---|
| Dependencies rebuilt on every code change | Move COPY lockfile before RUN install |
| Context transfer takes seconds | Add .dockerignore |
| Clean builds always download packages | Use BuildKit cache mounts |
| CI builds start from scratch | Use remote cache (--cache-from) |
| Build takes minutes but only source changed | Verify layer ordering |
| Build args in early layers | Move ARG declarations after stable layers |
Key Takeaways
- Copy dependency files before source code -- this is the single most impactful optimization.
- Use .dockerignore to reduce context transfer time.
- Enable BuildKit for cache mounts, parallelism, and remote cache support.
- Use --cache-from in CI to share cache across builds.
- Always measure: time clean builds and incremental builds separately.
What's Next
- Continue to Runtime Performance Optimization.