Reference

Version Compatibility


Overview

This document specifies version compatibility between GitLab BDA components. Always verify compatibility before upgrading.

Current versions (as of deployment):

  • GitLab: v18.5.1

  • GitLab Runner: alpine-v18.5.0

  • Harbor: v2.14.0

  • PostgreSQL: 16.6

  • Redis: 7-alpine

  • Kubernetes: 1.31+


GitLab Version Compatibility

GitLab Core vs Runner

Rule: GitLab Runner version MUST match GitLab major.minor version.

Compatibility matrix:

GitLab Version | Compatible Runner Versions | Notes
18.5.x         | alpine-v18.5.x             | Exact minor match required
18.4.x         | alpine-v18.4.x             | Patch differences OK
18.3.x         | alpine-v18.3.x             | -

Why strict matching?

  • GitLab API changes between minor versions

  • Runner uses GitLab internal API for job processing

  • Mismatched versions cause job failures

Upgrade strategy:

  1. Upgrade GitLab first

  2. Wait for GitLab to stabilize (check all pods running)

  3. Upgrade Runner to matching version

  4. Test CI/CD pipeline
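
A quick way to confirm that the deployed GitLab and Runner versions actually match is to list the container images running in the namespace. This is a minimal sketch; it assumes the gitlabbda namespace used elsewhere in this document:

# List unique container images in the namespace and inspect the gitlab/runner tags
kubectl get pods -n gitlabbda -o jsonpath='{.items[*].spec.containers[*].image}' \
  | tr ' ' '\n' | sort -u | grep -Ei 'gitlab|runner'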

GitLab vs PostgreSQL

Supported PostgreSQL versions:

GitLab Version | Min PostgreSQL | Max PostgreSQL | Recommended
18.5.x         | 14.9           | 17.x           | 16.6
18.0-18.4      | 14.0           | 16.x           | 16.4
17.x           | 13.6           | 15.x           | 15.7

Why PostgreSQL 16?

  • GitLab 18.5 supports PostgreSQL 14-17

  • PostgreSQL 16 receives upstream community support until late 2028

  • The CNPG operator has mature support for PostgreSQL 16

Upgrade path:

  • PostgreSQL 14 → 15 → 16 (major version upgrades)

  • Use CNPG initdb method for clean upgrade

  • Backup before upgrading!
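
Before planning the upgrade, confirm which PostgreSQL version is actually running. A minimal check, assuming the gitlab-postgres CNPG cluster used in this deployment (the -1 suffix is CNPG's default instance naming):

# Show the CNPG cluster and the PostgreSQL version reported by the primary
kubectl get cluster gitlab-postgres -n gitlabbda
kubectl exec -n gitlabbda gitlab-postgres-1 -- psql -c 'SELECT version();'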

GitLab vs Redis

Supported Redis versions:

GitLab Version | Min Redis | Max Redis | Recommended
18.5.x         | 6.2       | 7.x       | 7.2
17.x           | 6.0       | 7.x       | 7.0

Why Redis 7?

  • Performance improvements (reportedly up to ~30% faster than Redis 6 for some workloads)

  • Better memory efficiency

  • Improved ACLs (planned for future use)

Upgrade strategy:

  • Redis minor upgrades are safe (7.0 → 7.2)

  • For major upgrades (6 → 7), test in staging first
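
To verify the Redis version actually running before relying on the matrix above, a minimal check; it assumes Redis runs as a deployment named redis in the gitlabbda namespace (adjust the resource name to your deployment):

# Print the server version reported by the running Redis instance
kubectl exec -n gitlabbda deploy/redis -- redis-cli INFO server | grep redis_version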

GitLab vs Gitaly

Rule: Gitaly version MUST exactly match GitLab version.

Compatibility:

  • GitLab 18.5.1 requires Gitaly 18.5.1 (same version)

  • Gitaly is part of GitLab distribution (bundled in Helm chart)

Why exact match?

  • Gitaly uses internal GitLab API

  • Protocol changes between versions

  • Mismatched versions cause git operation failures


Harbor Version Compatibility

Harbor vs PostgreSQL

Supported PostgreSQL versions:

Harbor Version | Min PostgreSQL | Max PostgreSQL | Recommended
2.14.x         | 13.0           | 16.x           | 16.6
2.13.x         | 12.0           | 15.x           | 15.7

Sharing PostgreSQL with GitLab:

  • Safe: GitLab and Harbor can share same PostgreSQL cluster

  • Efficient: Reuse CNPG infrastructure

  • Separate databases: gitlab and harbor databases isolated

Current setup: GitLab and Harbor both use the gitlab-postgres CNPG cluster, with separate databases.
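
With CNPG 1.25.x, the extra harbor database can be managed declaratively. A hedged sketch using CNPG's Database resource (resource and role names here are assumptions, and the owner role plus its credentials still need to exist, e.g. via managed roles or manual creation):

apiVersion: postgresql.cnpg.io/v1
kind: Database
metadata:
  name: harbor-db            # placeholder resource name
  namespace: gitlabbda
spec:
  name: harbor               # database created inside the shared cluster
  owner: harbor              # role must already exist
  cluster:
    name: gitlab-postgres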

Harbor vs Redis

Supported Redis versions:

Harbor Version | Min Redis | Max Redis | Recommended
2.14.x         | 6.0       | 7.x       | 7.2
2.13.x         | 5.0       | 7.x       | 7.0

Sharing Redis with GitLab:

  • Safe: Use separate Redis databases (DB 0 for GitLab, DB 2 for Harbor)

  • Current setup: single Redis instance with database-level isolation
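
Database isolation can be spot-checked from the Redis side. Again a minimal sketch, assuming a deployment named redis in the gitlabbda namespace:

# Key counts per logical database: DB 0 (GitLab) and DB 2 (Harbor)
kubectl exec -n gitlabbda deploy/redis -- redis-cli -n 0 DBSIZE
kubectl exec -n gitlabbda deploy/redis -- redis-cli -n 2 DBSIZE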

Harbor vs GitLab (OAuth)

OAuth2/OIDC compatibility:

Harbor Version | GitLab OAuth Support | Notes
2.14.x         | GitLab 14.x+         | Full OIDC support
2.13.x         | GitLab 13.x+         | Requires openid scope

OAuth configuration requirements:

  • GitLab: OAuth Application with openid, profile, email scopes

  • Harbor: OIDC auth mode with GitLab endpoint

  • Redirect URI: https://<harbor-domain>/c/oidc/callback

For OAuth setup, see Harbor Integration.


Kubernetes Version Compatibility

GitLab vs Kubernetes

Supported Kubernetes versions:

GitLab Version | Min K8S | Max K8S | Recommended
18.5.x         | 1.28    | 1.32    | 1.31
18.0-18.4      | 1.27    | 1.31    | 1.30
17.x           | 1.26    | 1.30    | 1.29

KUP6S cluster: Kubernetes 1.31 (K3S distribution)

Kubernetes API deprecations:

  • GitLab Helm chart uses stable APIs (no beta dependencies)

  • Safe to upgrade Kubernetes within supported range
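
A quick pre-upgrade check that the cluster sits inside this range:

# serverVersion must fall within the supported range for the target GitLab version
kubectl version --output=yaml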

Harbor vs Kubernetes

Supported Kubernetes versions:

Harbor Version | Min K8S | Max K8S | Recommended
2.14.x         | 1.25    | 1.32    | 1.31
2.13.x         | 1.24    | 1.31    | 1.30

Note: Harbor requires AMD64 nodes (no ARM64 support as of 2.14).

CNPG vs Kubernetes

Supported Kubernetes versions:

CNPG Version | Min K8S | Max K8S | Notes
1.25.x       | 1.26    | 1.32    | Current KUP6S
1.24.x       | 1.25    | 1.31    | Older version

PostgreSQL versions:

  • CNPG 1.25.x supports PostgreSQL 13, 14, 15, 16, 17

  • CNPG images: ghcr.io/cloudnative-pg/postgresql:16.6
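
Pinning the PostgreSQL minor version happens on the Cluster resource itself. A minimal sketch of the relevant fields (instance count and storage size are illustrative, not taken from the actual manifests):

apiVersion: postgresql.cnpg.io/v1
kind: Cluster
metadata:
  name: gitlab-postgres
  namespace: gitlabbda
spec:
  instances: 2                                        # illustrative
  imageName: ghcr.io/cloudnative-pg/postgresql:16.6   # pinned PostgreSQL minor version
  storage:
    size: 20Gi                                        # illustrative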


CDK8S Version Compatibility

CDK8S vs Kubernetes

Supported Kubernetes versions:

CDK8S Version | K8S API Version | Imports
2.73.x        | 1.31            | cdk8s-plus-28
2.72.x        | 1.30            | cdk8s-plus-27

Current setup: CDK8S 2.73.x with cdk8s-plus-28 (Kubernetes 1.31 API)

Upgrading CDK8S:

cd argoapps
npm run upgrade  # Updates CDK8S + imports
npm run import   # Regenerates K8S API types

ArgoCD vs Kubernetes

Supported Kubernetes versions:

ArgoCD Version

Min K8S

Max K8S

Notes

2.13.x

1.26

1.32

Current KUP6S

2.12.x

1.25

1.31

Older version

ArgoCD Application CRD: schema stable since 2020 (still served as v1alpha1)


Storage Version Compatibility

Longhorn vs Kubernetes

Supported Kubernetes versions:

Longhorn Version

Min K8S

Max K8S

Notes

1.7.x

1.25

1.32

Current KUP6S

1.6.x

1.24

1.30

Older version

Storage classes used:

  • longhorn-redundant-app (1 replica)

  • longhorn (2 replicas, default)

  • longhorn-ha (3 replicas)
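
Selecting one of these classes is just a storageClassName on the PVC. An illustrative example (claim name and size are placeholders):

apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: example-data              # placeholder
  namespace: gitlabbda
spec:
  accessModes: [ReadWriteOnce]
  storageClassName: longhorn-ha   # 3 replicas, per the list above
  resources:
    requests:
      storage: 10Gi               # placeholder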

Hetzner CSI Driver vs Kubernetes

Supported Kubernetes versions:

CSI Driver Version | Min K8S | Max K8S | Storage Class
2.10.x             | 1.26    | 1.32    | hcloud-volumes

Used by: Gitaly (20Gi hcloud-volumes PVC)
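
In the upstream GitLab Helm chart this is typically set via the Gitaly persistence values. A hedged sketch (key names follow the upstream chart and may be wrapped differently by the CDK8S configuration in this repo):

# values.yaml fragment for the GitLab chart
gitlab:
  gitaly:
    persistence:
      storageClass: hcloud-volumes
      size: 20Gi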


Upgrade Paths

GitLab Major Version Upgrade (17.x → 18.x)

Process:

  1. Backup everything (GitLab backup + PostgreSQL backup)

  2. Check deprecations: Review GitLab upgrade path

  3. Upgrade to latest 17.x first (e.g., 17.12.5)

  4. Upgrade to 18.0.0 (first major version)

  5. Upgrade to 18.5.1 (latest minor)

Why staged upgrade?

  • GitLab requires upgrading to the latest minor release before jumping to the next major version

  • Database migrations run between versions

  • Skipping versions causes migration failures

Downtime: ~10-30 minutes (database migration time)

GitLab Minor Version Upgrade (18.4.x → 18.5.x)

Process:

  1. Optional backup (low risk, but recommended)

  2. Update config.yaml: versions.gitlab: v18.5.1

  3. Rebuild manifests: npm run build in argoapps/

  4. Apply: kubectl apply -f dist/gitlab-bda.k8s.yaml

  5. Wait for rollout: kubectl rollout status -n gitlabbda

  6. Update Runner: versions.gitlabRunner: alpine-v18.5.0

  7. Rebuild and apply

Downtime: None (rolling update)
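
The steps above, condensed into commands (paths and config keys as documented in this repo; the final command simply watches pods rather than a specific deployment):

# 1-2. Edit config.yaml: versions.gitlab: v18.5.1 (later: versions.gitlabRunner: alpine-v18.5.0)
# 3. Rebuild manifests
cd argoapps && npm run build
# 4. Apply the generated manifests
kubectl apply -f dist/gitlab-bda.k8s.yaml
# 5. Watch the rollout
kubectl get pods -n gitlabbda -w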

Harbor Version Upgrade (2.13.x → 2.14.x)

Process:

  1. Backup Harbor database

  2. Update config.yaml: versions.harbor: v2.14.0

  3. Rebuild and apply

  4. Wait for pods: kubectl get pods -n gitlabbda | grep harbor

  5. Verify UI: Access Harbor web UI

Downtime: ~5 minutes (database schema upgrade)

PostgreSQL Major Version Upgrade (15 → 16)

Process (CNPG initdb method):

  1. Create full backup via CNPG backup

  2. Create new Cluster with PostgreSQL 16

  3. Restore from backup using recovery.source

  4. Update application configs (point to new cluster)

  5. Delete old cluster after verification

Downtime: ~30-60 minutes (full database restore)
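
A hedged sketch of the new-cluster manifest. Step 3 above mentions recovery.source; because physical backups do not restore across PostgreSQL major versions, the sketch below uses the import variant of the initdb bootstrap instead (a logical dump/restore from the old cluster). All names, credentials, and sizes are placeholders; the backup from step 1 remains the safety net, and the linked CNPG Cluster Upgrade page is authoritative.

apiVersion: postgresql.cnpg.io/v1
kind: Cluster
metadata:
  name: gitlab-postgres-16          # new cluster, placeholder name
  namespace: gitlabbda
spec:
  instances: 2
  imageName: ghcr.io/cloudnative-pg/postgresql:16.6
  storage:
    size: 20Gi                      # placeholder
  bootstrap:
    initdb:
      import:
        type: microservice
        databases: [gitlab]         # one database per microservice-type import
        source:
          externalCluster: old-cluster
  externalClusters:
    - name: old-cluster
      connectionParameters:
        host: gitlab-postgres-rw.gitlabbda.svc   # read-write service of the old cluster
        user: postgres
        dbname: postgres
      password:
        name: gitlab-postgres-superuser          # CNPG default superuser secret (assumed)
        key: password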

For detailed steps, see CNPG Cluster Upgrade.


Version Pinning Strategy

Why Pin Versions?

Pros:

  • ✅ Predictable deployments (no surprise updates)

  • ✅ Tested compatibility (all components verified together)

  • ✅ Rollback safety (exact version to restore)

Cons:

  • ❌ Manual upgrade work (must update config.yaml)

  • ❌ Security patches delayed (unless actively monitored)

Current strategy: Pin all versions in config.yaml, upgrade manually with testing.

Automated Version Updates (Future)

Option 1: Dependabot (GitHub)

# .github/dependabot.yml
version: 2
updates:
  - package-ecosystem: docker
    directory: /dp-infra/gitlabbda
    schedule:
      interval: weekly

Option 2: Renovate Bot

  • Scans config.yaml for version fields

  • Creates PRs for updates

  • Runs tests before merging


Compatibility Testing

Before Upgrading

Checklist:

  1. ✅ Read GitLab/Harbor release notes for breaking changes

  2. ✅ Check compatibility matrix above

  3. ✅ Test upgrade in staging environment (if available)

  4. ✅ Create full backups

  5. ✅ Plan rollback strategy

Staging Environment (Future)

Recommended setup:

# Staging deployment with same versions
namespace: gitlabbda-staging
domains:
  gitlab: gitlab-staging.example.com
  harbor: registry-staging.example.com

versions:
  gitlab: v18.6.0-rc1  # Test release candidates
  gitlabRunner: alpine-v18.6.0-rc1
  harbor: v2.15.0-rc1

Version History

Current Deployment

versions:
  gitlab: v18.5.1
  gitlabRunner: alpine-v18.5.0
  harbor: v2.14.0
  postgresql: "16.6"
  redis: "7-alpine"
  kubernetes: "1.31"

Past Versions (Example)

Date       | GitLab | Runner        | Harbor | PostgreSQL | Notes
2025-10-27 | 18.5.1 | alpine-18.5.0 | 2.14.0 | 16.6       | Current
2025-09-15 | 18.3.2 | alpine-18.3.2 | 2.13.1 | 16.4       | Previous
2025-07-10 | 18.1.0 | alpine-18.1.0 | 2.12.5 | 15.7       | Initial


Summary

Strict compatibility requirements:

  • GitLab ↔ Runner: Exact major.minor match

  • GitLab ↔ Gitaly: Exact version match

  • PostgreSQL: Within supported range (14-17 for GitLab 18.5)

  • Kubernetes: Within supported range (1.28-1.32 for GitLab 18.5)

Flexible compatibility:

  • Redis: Any version within range (6.2-7.x)

  • Harbor: Independent versioning (shares PostgreSQL/Redis safely)

Upgrade order:

  1. Backup everything

  2. Upgrade PostgreSQL (if needed)

  3. Upgrade GitLab

  4. Upgrade GitLab Runner (match GitLab version)

  5. Upgrade Harbor (independent timeline)

For configuration, see Configuration Reference. For upgrades, see How-To: Upgrade Components.