# Contributing to Hive
Thank you for your interest in contributing to OpenShift Hive! We welcome contributions from the community.
## Project Overview
OpenShift Hive is an operator that runs as a service on top of Kubernetes/OpenShift. The Hive service can be used to provision and perform initial configuration of OpenShift clusters. Hive uses the OpenShift installer for cluster provisioning.
For a detailed design overview and usage instructions, refer to the README.md and the documentation in `docs/`.
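As a brief illustration, provisioning in Hive is driven declaratively: you create a `ClusterDeployment` resource and the operator runs the installer for you. The manifest below is only a sketch; every name is a placeholder, the referenced Secrets and ClusterImageSet would have to exist first, and exact fields vary by platform and Hive version:

```yaml
# Illustrative ClusterDeployment sketch (AWS). All names are placeholders;
# the referenced Secrets and ClusterImageSet must be created beforehand.
apiVersion: hive.openshift.io/v1
kind: ClusterDeployment
metadata:
  name: mycluster
  namespace: mynamespace
spec:
  baseDomain: clusters.example.com
  clusterName: mycluster
  platform:
    aws:
      credentialsSecretRef:
        name: mycluster-aws-creds
      region: us-east-1
  provisioning:
    imageSetRef:
      name: openshift-v4-imageset
    installConfigSecretRef:
      name: mycluster-install-config
  pullSecretRef:
    name: mycluster-pull-secret
```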
## Getting Started
### Prerequisites
- Go (version specified in `go.mod`)
- Make
### Development Environment
You can build the project binaries using the provided Make targets:

```shell
# Update generated code
make update
# Compile the project binaries
make build
# Clean up build artifacts
make clean
```
See the Developing Hive guide for detailed setup instructions.
## Testing
Before submitting a Pull Request, ensure that all tests pass and the code is verified.
```shell
make verify
# Run unit tests (excludes e2e tests)
make test
```
### Dependency Management
```shell
make modcheck  # Check module dependencies
make modfix    # Fix module dependencies
```
### Test Types
- **Unit Tests** (`make test`): Runs unit tests for `./pkg/...`, `./cmd/...`, `./contrib/...`, and submodules. This excludes e2e tests.
- **E2E Tests**: End-to-end tests require a running cluster and are run separately:
  - `make test-e2e`: Run the full e2e test suite
  - `make test-e2e-pool`: Run cluster pool e2e tests
  - `make test-e2e-postdeploy`: Run post-deployment e2e tests
  - `make test-e2e-postinstall`: Run post-installation e2e tests
## Project Structure
Understanding the project structure will help you navigate the codebase:
- `apis/`: API definitions (separate Go submodule)
  - `hive/v1/`: Hive v1 APIs (ClusterDeployment, SyncSet, etc.)
  - `hiveinternal/v1alpha1/`: Internal APIs
  - `hivecontracts/v1alpha1/`: Contract APIs
- `cmd/`: Binary entry points
  - `cmd/manager/`: Main entry point for Hive controllers
  - `cmd/operator/`: Main entry point for the Hive operator
  - `cmd/hiveadmission/`: Admission webhook server
- `pkg/`: Package source code
  - `pkg/controller/`: Operator controllers
  - `pkg/install/`: Installation logic and OpenShift installer integration
  - `pkg/installmanager/`: Manages the cluster installation process
  - `pkg/operator/`: Hive operator logic
  - `pkg/resource/`: Utilities for applying resources to remote clusters
  - `pkg/remoteclient/`: Client for connecting to remote clusters
  - `pkg/{awsclient,azureclient,gcpclient,ibmclient}/`: Cloud provider-specific client implementations
- `config/`: Kubernetes YAML manifests for deploying the operator
- `docs/`: Developer and user documentation
- `hack/`: Developer scripts and tools
- `test/e2e/`: End-to-end tests
## Pull Requests
### Commit Messages
All git commits should follow a standard format to ensure clarity and traceability.
Title format: `HIVE-XXXX: Short description of the change`, where `HIVE-XXXX` is the relevant Jira issue.
Example:

```
HIVE-2980: How to refresh ClusterPool cloud creds

Add doc content describing different ways to rotate a ClusterPool's
cloud credentials.

Add a script, `hack/refresh-clusterpool-creds.sh`, to nondisruptively
update the (currently AWS; other platforms TODO) cloud credentials for
all existing ClusterDeployments associated with a given ClusterPool.
- Accepts two args: the clusterpool namespace and name.
- Discovers the current AWS creds Secret from the clusterpool.
- Discovers all existing ClusterDeployments associated with the
  clusterpool.
- Discovers the AWS creds Secret for each CD.
- Patches that Secret with the `.data` of the clusterpool's Secret.
```
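The convention above can be exercised locally. The sketch below (using `HIVE-1234` as a placeholder Jira key and a throwaway repository) shows a title-plus-body commit in that shape:

```shell
# Sketch: make a commit whose title follows the "HIVE-XXXX: summary" convention.
# HIVE-1234 and the committed file are placeholders; this runs in a temp repo.
set -euo pipefail
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email "dev@example.com"
git config user.name "Dev"
echo "notes" > notes.txt
git add notes.txt
git commit -q -m "HIVE-1234: Add example notes" \
  -m "Describe what the change does and why it is needed."
git log -1 --format=%s   # prints: HIVE-1234: Add example notes
```

Passing `-m` twice produces a separate title and body, matching the title/body structure shown in the example.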