🤖 AUTOMATION
2024-07-12 · 11 min read

GitLab CI/CD: From 45 Minutes to 8 Minutes Build Time

How we made our CI/CD pipeline more than 5x faster using caching, parallel jobs, Docker layer optimization, and smart test splitting.

#GitLab#CI/CD#DevOps#Automation#Performance

The Problem

Our GitLab CI/CD pipeline was taking 45 minutes. Developers were context-switching, productivity was suffering, and merge request reviews were delayed.

Goal: Get builds under 10 minutes.

Optimization 1: Docker Layer Caching

Before (rebuilding everything):

dockerfile
FROM node:18
WORKDIR /app
COPY . .
RUN npm install
RUN npm run build

After (leveraging layer cache):

dockerfile
FROM node:18
WORKDIR /app

# Dependencies change less often than code
COPY package*.json ./
RUN npm ci --cache .npm

# Code changes frequently
COPY . .
RUN npm run build

Impact: -5 minutes
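
One more thing that helps the hit rate: COPY . . invalidates its layer whenever anything in the build context changes, so it pays to keep the context small with a .dockerignore. A minimal sketch (the dist/ entry assumes that's where your build output lands):

.dockerignore
# Keep the build context (and the COPY . . layer) lean
node_modules
dist
.git
*.log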

Optimization 2: GitLab CI Caching

yaml
variables:
  npm_config_cache: "$CI_PROJECT_DIR/.npm"

cache:
  key:
    files:
      - package-lock.json
  paths:
    - .npm/
    - node_modules/
  policy: pull-push

Impact: -8 minutes (npm install: 3min → 20sec)
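
The default pull-push policy also re-uploads the cache at the end of every job. For jobs that only consume dependencies, a job-level cache with policy: pull skips that upload. A sketch reusing the same key (the job name is illustrative):

yaml
test:
  cache:
    key:
      files:
        - package-lock.json
    paths:
      - .npm/
      - node_modules/
    policy: pull  # download only, skip the end-of-job upload
  script:
    - npm run test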

Optimization 3: Parallel Test Execution

Before (sequential):

yaml
test:
  script:
    - npm run test

After (parallel):

yaml
test:
  parallel: 4
  script:
    - npm run test -- --shard=$CI_NODE_INDEX/$CI_NODE_TOTAL

Impact: -12 minutes (tests: 15min → 4min)
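
If the test runner is configured to emit JUnit XML, GitLab will merge the reports from all four shards into a single test summary on the merge request. A sketch, assuming the reporter writes junit.xml at the project root:

yaml
test:
  parallel: 4
  script:
    - npm run test -- --shard=$CI_NODE_INDEX/$CI_NODE_TOTAL
  artifacts:
    when: always        # keep reports even when a shard fails
    reports:
      junit: junit.xml  # path depends on your reporter config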

Optimization 4: Build Matrix

Run independent jobs in parallel:

yaml
stages:
  - build
  - test
  - deploy

build:frontend:
  stage: build
  script:
    - npm run build:frontend

build:backend:
  stage: build
  script:
    - npm run build:backend

# Jobs in the same stage run in parallel
test:unit:
  stage: test
  script:
    - npm run test:unit

test:integration:
  stage: test
  script:
    - npm run test:integration

test:e2e:
  stage: test
  script:
    - npm run test:e2e

Impact: -10 minutes
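
Stages are still a barrier, though: every test job waits for every build job to finish. needs turns the pipeline into a DAG so jobs start as soon as their actual dependencies are done. A sketch, assuming unit tests run straight from source while e2e tests need both builds:

yaml
test:unit:
  stage: test
  needs: []   # no build artifacts required, start immediately
  script:
    - npm run test:unit

test:e2e:
  stage: test
  needs: ["build:frontend", "build:backend"]
  script:
    - npm run test:e2e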

Optimization 5: Docker BuildKit

yaml
build:
  variables:
    DOCKER_BUILDKIT: 1
  script:
    - docker build --cache-from $CI_REGISTRY_IMAGE:latest -t $CI_REGISTRY_IMAGE:$CI_COMMIT_SHA .
    - docker push $CI_REGISTRY_IMAGE:$CI_COMMIT_SHA

Impact: -3 minutes
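
One caveat: with BuildKit, --cache-from only helps if the referenced image carries inline cache metadata, which means building with BUILDKIT_INLINE_CACHE=1 and keeping :latest up to date. A sketch of the fuller job (registry login included, since the push needs it):

yaml
build:
  variables:
    DOCKER_BUILDKIT: 1
  script:
    # Log in with GitLab's predefined registry credentials
    - echo "$CI_REGISTRY_PASSWORD" | docker login -u "$CI_REGISTRY_USER" --password-stdin "$CI_REGISTRY"
    # Embed inline cache metadata so the next pipeline can reuse these layers
    - docker build --build-arg BUILDKIT_INLINE_CACHE=1 --cache-from $CI_REGISTRY_IMAGE:latest -t $CI_REGISTRY_IMAGE:$CI_COMMIT_SHA .
    - docker push $CI_REGISTRY_IMAGE:$CI_COMMIT_SHA
    # Refresh :latest so it stays a useful cache source
    - docker tag $CI_REGISTRY_IMAGE:$CI_COMMIT_SHA $CI_REGISTRY_IMAGE:latest
    - docker push $CI_REGISTRY_IMAGE:latest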

Optimization 6: Rules-Based Execution

Skip unnecessary jobs:

yaml
test:frontend:
  rules:
    - changes:
        - "frontend/**/*"
        - "package.json"
  script:
    - npm run test:frontend

test:backend:
  rules:
    - changes:
        - "backend/**/*"
        - "package.json"
  script:
    - npm run test:backend

Impact: -5 minutes (on average)
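
A refinement worth considering (a sketch, not our exact config): match the default branch explicitly so main always runs the full suite, and only fall back to the changes filter elsewhere:

yaml
test:backend:
  rules:
    # Always run on the default branch, whatever changed
    - if: $CI_COMMIT_BRANCH == $CI_DEFAULT_BRANCH
    # Otherwise only when backend files or package.json change
    - changes:
        - "backend/**/*"
        - "package.json"
  script:
    - npm run test:backend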

The Final Pipeline

yaml
stages:
  - prepare
  - build
  - test
  - deploy

# Parallel build stage
build:
  stage: build
  parallel:
    matrix:
      - COMPONENT: [frontend, backend, worker]
  cache:
    key: $COMPONENT-$CI_COMMIT_REF_SLUG
    paths:
      - node_modules/
  script:
    - npm ci --cache .npm
    - npm run build:$COMPONENT

# Parallel test stage
test:
  stage: test
  parallel: 4
  script:
    - npm run test -- --shard=$CI_NODE_INDEX/$CI_NODE_TOTAL

# Deploy only on main
deploy:
  stage: deploy
  rules:
    - if: $CI_COMMIT_BRANCH == "main"
  script:
    - ./deploy.sh
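
Not shown above: the prepare stage is where the cache-warming install from Optimization 2 slots in. Roughly (job name and cache key illustrative; the key should match whatever the downstream jobs pull):

yaml
prepare:deps:
  stage: prepare
  cache:
    key:
      files:
        - package-lock.json
    paths:
      - .npm/
      - node_modules/
    policy: push  # write the cache for the build and test jobs to pull
  script:
    - npm ci --cache .npm --prefer-offline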

Results

End to end, the pipeline dropped from 45 minutes to 8 minutes, a more than 5x speedup.

Key Takeaways

  • Cache everything - npm, Docker layers, build artifacts
  • Parallelize aggressively - Use parallel and matrix builds
  • Skip what you can - Use rules to avoid unnecessary work
  • Optimize Docker builds - Layer order matters, use BuildKit
  • Measure and iterate - Track pipeline analytics in GitLab

$ echo "Thanks for reading!"

Written by

Ayoub Zakaria

DevOps / Cloud / MLOps Engineer

Building reliable infrastructure for 400+ machines. Sharing real-world DevOps challenges, solutions, and lessons learned.
