It's quite straightforward to push Docker images to Google Container Registry and deploy to GKE clusters with rok8s-scripts.
To connect to Google Cloud from a CI workflow, a GCP service account is recommended. To create a service account:
- Go to the service accounts page in the Google Cloud Console
- Choose a name for the account (e.g. `rok8s-scripts`) and hit "Create"
- For "Service account permissions", choose `Kubernetes Engine Developer` and hit "Continue"
- Click "Create Key" and choose "JSON"
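If you prefer to script this step, the same account can be created with the gcloud CLI. This is only a sketch: the project ID, account name, and key file name below are placeholders, and `roles/container.developer` is the IAM role corresponding to "Kubernetes Engine Developer".

```bash
# Create the service account (project ID and account name are placeholders)
gcloud iam service-accounts create rok8s-scripts \
  --project my-gcp-project \
  --display-name "rok8s-scripts CI"

# Grant the Kubernetes Engine Developer role to the new account
gcloud projects add-iam-policy-binding my-gcp-project \
  --member "serviceAccount:rok8s-scripts@my-gcp-project.iam.gserviceaccount.com" \
  --role "roles/container.developer"

# Download a JSON key for the account
gcloud iam service-accounts keys create downloaded_google_credentials.json \
  --iam-account "rok8s-scripts@my-gcp-project.iam.gserviceaccount.com"
```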
To load the JSON credentials into a rok8s-scripts CI workflow, they'll need to be base64 encoded. This can be accomplished with a command like this:
```bash
cat downloaded_google_credentials.json | base64 -w 0
```
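Note that `-w 0` (disable line wrapping) is a GNU coreutils flag; the `base64` that ships with macOS may not accept it, in which case piping through `tr` produces the same single-line output:

```bash
# macOS / BSD base64: strip newlines instead of using -w 0
cat downloaded_google_credentials.json | base64 | tr -d '\n'
```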
With those credentials in base64 format, you'll need to add them as a protected environment variable in your CI tool of choice. This environment variable needs to be named `GCLOUD_KEY` and contain the base64-encoded copy of the GCP service account credentials. It's important that this value is not checked into your codebase, as the credentials could provide a great deal of access to your systems.
With the credentials properly configured, there are a few more environment variables that need to be set:
- `GCP_PROJECT` - The Google Cloud project that you'll be working with.
- `GOOGLE_APPLICATION_CREDENTIALS` - A path for the decoded `$GCLOUD_KEY` to be stored. A simple value like `/tmp/gcloud_key.json` is generally sufficient here.
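In a CI job these typically end up as exported environment variables; the project name below is a placeholder:

```bash
# Placeholder values; set these in your CI tool's environment configuration
export GCP_PROJECT=my-gcp-project
export GOOGLE_APPLICATION_CREDENTIALS=/tmp/gcloud_key.json
```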
The configuration above is sufficient to connect to a GCP project and pull or push images, but more configuration is required to connect to a GKE cluster.
- `CLUSTER_NAME` - The name of the GKE cluster to connect to.
- `GCP_REGIONAL_CLUSTER` - Set this variable to `true` if the cluster to connect to is regional.
- `GCP_REGION` - The GCP region the cluster is in (for regional clusters).
- `GCP_ZONE` - The GCP zone the cluster is in (for zonal clusters).
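For example, connecting to a regional cluster might look like the following (cluster name, region, and zone are placeholders); a zonal cluster would set `GCP_ZONE` instead:

```bash
# Regional cluster (placeholder values)
export CLUSTER_NAME=my-cluster
export GCP_REGIONAL_CLUSTER=true
export GCP_REGION=us-central1

# Zonal cluster (placeholder values)
# export CLUSTER_NAME=my-cluster
# export GCP_ZONE=us-central1-a
```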
With the above environment variables in place, it's time to run a script to pull it all together and configure the Google Cloud CLI:

```bash
prepare-gcloud
```
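Putting it all together, a deploy step in CI might look roughly like the sketch below. The `docker-build`, `docker-push`, and `k8s-deploy` helpers and the config file paths are assumptions about a typical rok8s-scripts pipeline; substitute the commands and config files your project actually uses.

```bash
#!/bin/bash
# Sketch of a CI deploy step; assumes the environment variables described
# above (GCLOUD_KEY, GCP_PROJECT, CLUSTER_NAME, etc.) are already set.
set -euo pipefail

# Decode the credentials, configure the Google Cloud CLI, and connect to the cluster
prepare-gcloud

# Hypothetical follow-on steps using rok8s-scripts helpers; config file
# paths are placeholders for your own rok8s-scripts configuration.
docker-build -f deploy/build.config
docker-push -f deploy/build.config
k8s-deploy -f deploy/production.config
```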