Cloud Build is everywhere

Antoine Castex
Google Cloud - Community
4 min read · Jan 18, 2021


One of the things we like most about cloud providers is that you write a few lines of code and the rest of the story is handled for you.

Cloud Build, the serverless CI/CD product of Google Cloud, is amazing because it’s simple, smart and efficient.

Today I’m going to show you:

  • How to use Cloud Build
  • How to secure your Cloud Build pipelines
  • How to deploy a Cloud Function with Cloud Build
  • How to call cURL from Cloud Build
  • How to use Terraform in Cloud Build
  • How to trigger a build from anywhere
  • How to trigger a build from another build
  • How to add IAM permissions from Cloud Build

The trigger functionality of Cloud Build is easy to configure.

Here I have a Cloud Source Repository where my code is hosted, and I want my Cloud Build configuration (whatever it does) to run every time I push a new version of my code:

Every time I push a modification to the master branch, the trigger automatically executes my build.

My configuration is a YAML file, to which I can add substitution variables. These are very helpful when you have variables to import (don’t put sensitive information there; prefer Secret Manager for secrets).

Very recently, Google launched a new feature to secure your Cloud Build pipeline. As with other serverless products (Cloud Run or Cloud Functions), you can now specify the service account used by your Cloud Build.

That means you can give each pipeline its own identity, following the principle of least privilege.

To do so, don’t forget to add this line at the end of your YAML:

serviceAccount: 'projects/PROJECT_NAME/serviceAccounts/SERVICE_ACCOUNT'
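The resource name above follows a fixed pattern. As a small illustration, here is a hypothetical Python helper (the project and service account names are placeholders) that builds it, so you don’t mistakenly pass just the service account email:

```python
def service_account_resource(project: str, sa_email: str) -> str:
    # The serviceAccount field expects the fully qualified resource name,
    # not just the service account email address.
    return f"projects/{project}/serviceAccounts/{sa_email}"

# Example with placeholder names:
print(service_account_resource(
    "my-project", "builder@my-project.iam.gserviceaccount.com"))
# -> projects/my-project/serviceAccounts/builder@my-project.iam.gserviceaccount.com
```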

Now let’s talk about the YAML configuration itself. For example, I can use Cloud Build to deploy my Cloud Function from code hosted in a Cloud Source Repository, using gcloud:

steps:
- name: "gcr.io/cloud-builders/gcloud"
  args:
    [
      "functions",
      "deploy",
      "$_GCF",
      "--region",
      "europe-west2",
      "--trigger-http",
      "--runtime",
      "python38",
      "--entry-point",
      "manager",
      "--timeout",
      "540s",
      "--service-account",
      "$_SA",
      "--env-vars-file",
      "./.env.yaml",
      "--update-labels",
      "name=$_GCF",
    ]

Here you can see that the build deploys my GCF using gcloud, with parameters passed as substitution variables ($_xxx).
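To make the mechanism concrete, here is a rough local sketch in plain Python (not Cloud Build itself) of how substitution variables like $_GCF get expanded into the step arguments; the variable names and values are placeholders:

```python
def expand_substitutions(args, substitutions):
    """Replace each $_NAME placeholder in the args with its mapped value."""
    expanded = []
    for arg in args:
        for name, value in substitutions.items():
            arg = arg.replace("$" + name, value)
        expanded.append(arg)
    return expanded

args = ["functions", "deploy", "$_GCF", "--service-account", "$_SA"]
subs = {"_GCF": "my-function",
        "_SA": "deployer@my-project.iam.gserviceaccount.com"}
print(expand_substitutions(args, subs))
# -> ['functions', 'deploy', 'my-function', '--service-account',
#     'deployer@my-project.iam.gserviceaccount.com']
```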

I can also use Cloud Build to call cURL, and this is very easy to do:

steps:
- name: "launcher.gcr.io/google/ubuntu1604"
  entrypoint: "curl"
  args:
    ["-d", '"{\"id\":\"$BUILD_ID\"}"', "-X", "POST", "http://www.example.com"]

The next step is to use Cloud Build to run a Terraform script. The idea is to host your main.tf file in a Cloud Source Repository, like this:

resource "google_project" "project" {
  provider        = google-beta
  name            = var.PROJECT_NAME
  project_id      = var.PROJECT_ID
  folder_id       = var.FOLDER_ID
  billing_account = var.BILLING_ACCOUNT
  labels          = var.LABELS
}

To run this code, your build configuration looks like this:

steps:
- name: "hashicorp/terraform"
  args:
    [
      "init",
      "-backend=true",
      "-lock=false",
      "-backend-config=prefix=project",
    ]
- name: "hashicorp/terraform"
  args:
    [
      "apply",
      "-var=PROJECT_ID=xxx",
      "-var=PROJECT_NAME=xxx",
      "-var=FOLDER_ID=xxx",
      "-var=BILLING_ACCOUNT=xxx",
      "-auto-approve",
      "-lock=false",
    ]
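To avoid hard-coding each -var flag in the apply step, the arguments can be generated. A minimal Python sketch (the variable names and values are placeholders):

```python
def tf_var_args(variables):
    """Turn a dict of Terraform variables into -var=NAME=value flags."""
    return [f"-var={name}={value}" for name, value in variables.items()]

# Build the args list for the apply step:
apply_args = ["apply",
              *tf_var_args({"PROJECT_ID": "xxx", "PROJECT_NAME": "xxx"}),
              "-auto-approve", "-lock=false"]
print(apply_args)
# -> ['apply', '-var=PROJECT_ID=xxx', '-var=PROJECT_NAME=xxx',
#     '-auto-approve', '-lock=false']
```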

And what if I want to trigger a build from outside the tool?

curl --request POST \
'https://cloudbuild.googleapis.com/v1/projects/xxxx/triggers/a809f71a:run?key=[YOUR_API_KEY]' \
--header 'Authorization: Bearer [YOUR_ACCESS_TOKEN]' \
--header 'Accept: application/json' \
--header 'Content-Type: application/json' \
--data '{}' \
--compressed
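The endpoint follows a fixed pattern. Here is a small Python sketch that just builds the URL and the empty JSON body (the project id and trigger id are placeholders); you would still POST it with your favourite HTTP client and a valid access token:

```python
import json

def trigger_run_url(project_id: str, trigger_id: str) -> str:
    # The triggers.run method of the Cloud Build REST API.
    return (f"https://cloudbuild.googleapis.com/v1/"
            f"projects/{project_id}/triggers/{trigger_id}:run")

url = trigger_run_url("xxxx", "a809f71a")
body = json.dumps({})  # the run call just takes an empty JSON object here
print(url, body)
```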

If you want to trigger a build from a Cloud Build step, you just have to do something like this (token generation and the cURL call have to be in the same command):

steps:
- name: 'gcr.io/cloud-builders/gcloud'
  entrypoint: 'bash'
  args:
  - '-c'
  - |-
    apt update && apt install jq -y
    TOKEN=$$(curl -H "Metadata-Flavor: Google" "http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/token" | jq -r .access_token)
    curl --request POST 'https://cloudbuild.googleapis.com/v1/projects/xxxx/triggers/a809f71a:run' \
      --header "Authorization: Bearer $${TOKEN}" \
      --header 'Accept: application/json' \
      --header 'Content-Type: application/json' \
      --compressed

This will change very soon; check the conclusion of this article to see what will shortly be possible for triggering a build.

Finally, I can use Cloud Build to call almost anything. And good news: Cloud Build is the serverless product on GCP with the longest timeout offered: 24 hours!

That means you will probably use it for more than what it was originally designed for…

Here is an example of usage. I get tired of adding multiple roles to someone on a GCP project: of course I can use gcloud projects add-iam-policy-binding, but that is one command per role!

So why not generate the commands in a Cloud Build config and run it? Yeah!

Suppose the list of roles we want to add is:

permissions = [
    "roles/servicemanagement.admin",
    "roles/iap.admin",
    "roles/iap.settingsAdmin",
    "roles/iam.serviceAccountCreator",
    "roles/compute.instanceAdmin.v1",
    "roles/compute.loadBalancerAdmin",
]

We can loop over this list to generate a build configuration with one step per role:

for role in permissions:
    data["steps"].append(
        {
            "name": "gcr.io/cloud-builders/gcloud",
            "entrypoint": "bash",
            "args": [
                "-c",
                'gcloud projects add-iam-policy-binding projectA '
                '--member="user:my_user@domain.com" --role=' + role,
            ],
        }
    )

Conclusion

Today a build can be triggered by a commit on a repo and, since recently, launched manually. Tomorrow, I hope it will be possible to trigger it from Pub/Sub, one of the most important GCP services, used across nearly every architecture deployed on the platform. Keep it simple and have fun!
