
How to create a GCP Credential to integrate with Cloud8

Follow the step-by-step guide for creating a credential to integrate your GCP cloud with Cloud8.

We need a JSON file that contains the service account:

IMPORTANT: enable the required APIs – check that the correct project is selected and enable the “Cloud Resource Manager API”, “Compute Engine API”, “Cloud SQL Admin API” and “Cloud Billing API” (the last one only if you are going to use cost analysis through Cloud8);

  • Click “Manage” and enable each API;
  • Create the Service Account:  (if you have more than one project, check which one GCP selected by default!);
  • “Create Service Account”;
  • Roles – assign them depending on the actions you want Cloud8 to perform – viewing and/or backups, cost-reduction schedules, etc.:
  • Browser
  • Viewer
  • View Service Accounts
  • Compute Engine -> Compute Admin or Compute Viewer;
  • Cloud SQL -> Cloud SQL Admin or Cloud SQL Viewer;
  • Monitoring -> Monitoring Viewer – metrics;
  • BigQuery -> BigQuery Data Viewer and BigQuery Job User – cost analysis;
  • Kubernetes -> Kubernetes Engine Cluster Viewer;
  • Cloud Asset Viewer
  • Compute Recommender Viewer
  • Cloud Functions Viewer
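As a sketch, the viewer-level roles listed above could also be granted with gcloud instead of the console. The project ID, service-account name and the exact role-ID list below are assumptions – adjust them to your environment and to the admin/viewer choice you made:

```shell
# Hypothetical values -- replace with your own project and service account.
PROJECT_ID="my-project"
SA_EMAIL="cloud8-integration@${PROJECT_ID}.iam.gserviceaccount.com"

# Assumed role IDs matching the viewer options above.
ROLES="roles/browser roles/viewer roles/compute.viewer roles/cloudsql.viewer \
roles/monitoring.viewer roles/bigquery.dataViewer roles/bigquery.jobUser \
roles/container.clusterViewer roles/cloudasset.viewer roles/cloudfunctions.viewer"

# Print one binding command per role; remove 'echo' to actually run them.
for role in $ROLES; do
  echo gcloud projects add-iam-policy-binding "$PROJECT_ID" \
    --member "serviceAccount:${SA_EMAIL}" --role "$role"
done
```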

Choose a name and create. After creation, click on the ‘3 dots’ on the right and create a new JSON key.
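The console steps above (create the account, then a JSON key) also have a gcloud equivalent. This is a sketch with an assumed account name and key path; commands are printed rather than executed:

```shell
# Hypothetical names -- adjust before use.
PROJECT_ID="my-project"
SA_NAME="cloud8-integration"
SA_EMAIL="${SA_NAME}@${PROJECT_ID}.iam.gserviceaccount.com"

# Create the service account, then a JSON key for it.
# Remove the leading 'echo' to actually execute.
echo gcloud iam service-accounts create "$SA_NAME" --project "$PROJECT_ID"
echo gcloud iam service-accounts keys create ./cloud8-key.json \
  --iam-account "$SA_EMAIL" --project "$PROJECT_ID"
```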

Note: if the project is a member of an “Organization”, the Service Account must also have the same roles at the organization level; otherwise you will receive the message “User is not Authorized”.

After creating the key, open the .json file and paste its content into Cloud8 for synchronization.

Support for exporting data from Cloud SQL to a bucket

If you are going to use the Cloud SQL data export workflow, one of the following permissions is required:

  • Role: “Storage Admin”; or
  • In “Cloud Storage”, select the bucket and add the service account of the “Cloud SQL” instance as “Storage Object Creator”
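The bucket-level option above can be sketched with gsutil. The bucket name and the Cloud SQL instance's service-account address (shown on the instance's overview page) are placeholders:

```shell
# Hypothetical bucket and Cloud SQL instance service account -- replace both.
BUCKET="my-export-bucket"
SQL_SA="p12345-abcdef@gcp-sa-cloud-sql.iam.gserviceaccount.com"

# Grant the Cloud SQL service account object-creation rights on the bucket.
# Remove 'echo' to execute with real credentials.
echo gsutil iam ch "serviceAccount:${SQL_SA}:objectCreator" "gs://${BUCKET}"
```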


If you prefer to configure the permissions as a custom role, you can create a YAML file with the following content:

title: Automation
description: ""
stage: "GA"
includedPermissions:
- cloudsql.instances.get
- cloudsql.instances.list
- cloudsql.instances.update  # Cloud SQL has no setLabels; labels are changed via update
- cloudsql.instances.restart
- cloudsql.instances.export
- compute.autoscalers.get
- compute.autoscalers.list
- compute.autoscalers.update
- compute.instances.start
- compute.instances.startWithEncryptionKey
- compute.instances.stop
- compute.instances.get
- compute.instances.list
- compute.addresses.list
- compute.instances.setLabels
- compute.instanceGroupManagers.get
- compute.instanceGroupManagers.list
- compute.instanceGroupManagers.update
- compute.instanceGroupManagers.use
- compute.disks.list
- compute.disks.get
- compute.disks.createSnapshot
- compute.disks.setLabels
- compute.zones.get
- compute.zones.list
- compute.snapshots.list
- compute.snapshots.get
- compute.snapshots.setLabels
- monitoring.groups.get
- monitoring.groups.list
- monitoring.metricDescriptors.get
- monitoring.metricDescriptors.list
- monitoring.monitoredResourceDescriptors.get
- monitoring.monitoredResourceDescriptors.list
- monitoring.timeSeries.list
- resourcemanager.projects.get
- compute.regions.list
- cloudfunctions.functions.list
- resourcemanager.projects.getIamPolicy
- iam.serviceAccounts.get
- cloudasset.assets.searchAllResources
- recommender.computeInstanceIdleResourceRecommendations.list
- recommender.computeInstanceMachineTypeRecommendations.list

Then create the custom role and bind it to the Service Account:

gcloud iam roles create cloudautomation --project <project_id> --file ./cloud8.yaml
gcloud projects add-iam-policy-binding <project_id> --role projects/<project_id>/roles/cloudautomation --member serviceAccount:<service_account_email>
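To confirm the custom role was created with the intended permission list, a describe call can be sketched (the project ID is a placeholder; remove `echo` to run it with real credentials):

```shell
# Hypothetical project -- replace with your own.
PROJECT_ID="my-project"
echo gcloud iam roles describe cloudautomation --project "$PROJECT_ID"
```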

Integration with costs

Important: Cloud8 needs access to the project that contains the BigQuery billing export. On the Service Account in this project, we need the roles: Viewer, Browser, BigQuery Data Viewer and BigQuery Job User.
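As a sketch, those four roles could be granted on the billing-export project with gcloud. The billing project ID, service-account email and role IDs below are assumptions to adjust:

```shell
# Hypothetical billing project and service account -- replace both.
BILLING_PROJECT_ID="my-billing-project"
SA_EMAIL="cloud8-integration@my-project.iam.gserviceaccount.com"

# Assumed role IDs for Viewer, Browser, BigQuery Data Viewer, BigQuery Job User.
# Remove 'echo' to actually apply the bindings.
for role in roles/viewer roles/browser roles/bigquery.dataViewer roles/bigquery.jobUser; do
  echo gcloud projects add-iam-policy-binding "$BILLING_PROJECT_ID" \
    --member "serviceAccount:${SA_EMAIL}" --role "$role"
done
```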

In the GCP console: