GCP Terraform Cheat Sheet

Prerequisites

Before starting, ensure you have:

  1. GCP Account - Sign up at cloud.google.com
  2. Billing Account - Credit card required (even for free tier)
  3. gcloud CLI - Install: brew install google-cloud-sdk (macOS) or download
  4. Terraform - Install: brew install terraform (macOS) or download

Finding Your Billing Account ID

  1. Go to GCP Billing Console
  2. Select your billing account
  3. Copy the ID from the page (format: XXXXXX-XXXXXX-XXXXXX)

You need this ID for linking projects to billing.

Project Structure Basics

Essential Files

project/
├── main.tf           # Main infrastructure definitions
├── variables.tf      # Input variables
├── outputs.tf        # Output values
├── terraform.tfvars  # Variable values (add to .gitignore!)
└── .gitignore        # Exclude sensitive files

Critical: Always add terraform.tfvars and *.tfstate to .gitignore to avoid committing credentials.

Creating Your First Project

  1. Go to GCP Console
  2. Click “Select a project” → “New Project”
  3. Enter project name and note the auto-generated Project ID
  4. Click “Create”
  5. Link billing: Navigation Menu → Billing → Link a billing account

Then reference it in Terraform using a data source (see below).

Authentication Setup

Step-by-Step Authentication

# 1. Login to gcloud (opens browser)
gcloud auth login

# 2. Set up credentials for Terraform
gcloud auth application-default login

# 3. Set quota project (IMPORTANT - prevents API quota errors)
gcloud auth application-default set-quota-project YOUR_PROJECT_ID

# 4. Set default project for gcloud commands
gcloud config set project YOUR_PROJECT_ID

# 5. Verify authentication
gcloud auth list
gcloud config get-value project

Why both login commands?

  • gcloud auth login - For CLI commands (gcloud commands)
  • gcloud auth application-default login - For SDKs and tools like Terraform

Understanding GCP Project Concepts

Important distinction:

  • Project ID: String you choose (e.g., my-project-123) - Used in most Terraform resources
  • Project Number: Auto-generated integer (e.g., 450148906793) - Required for budget filters
  • Project Name: Display name (e.g., “My Project”) - Just for UI

Find your project number:

gcloud projects describe PROJECT_ID --format="value(projectNumber)"
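
In Terraform you can also read the project number from a data source instead of hard-coding it. A minimal sketch (the data source and output names are illustrative):

data "google_project" "current" {
  project_id = var.project_id
}

output "project_number" {
  description = "Auto-generated project number (needed for budget filters)"
  value       = data.google_project.current.number
}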

Terraform Provider Configuration

Basic Provider Setup

terraform {
  required_version = ">= 1.5"
  required_providers {
    google = {
      source  = "hashicorp/google"
      version = "~> 6.0"
    }
  }
}

provider "google" {
  project               = var.project_id
  region                = var.region
  user_project_override = true
  billing_project       = var.project_id
}

Key settings:

  • user_project_override = true - Makes Terraform explicitly specify your project on every API call, preventing “quota project not set” errors
  • billing_project - The project billed for API calls; it only takes effect when user_project_override is enabled
  • ~> 6.0 - Allows minor version upgrades (6.1, 6.2, etc.) but not major (7.0)

Working with Projects

If you created a project manually in GCP Console:

# Reference existing project
data "google_project" "my_project" {
  project_id = var.project_id
}

# Use it in resources
resource "google_storage_bucket" "example" {
  project  = data.google_project.my_project.project_id
  name     = "my-bucket"
  location = "US"
}

Creating New Project (Advanced)

Only use if automating project creation:

1resource "google_project" "new_project" {
2  name            = "My Project"
3  project_id      = "unique-id-123"
4  billing_account = var.billing_account_id
5  org_id          = var.org_id  # Optional
6}

Gotcha: If the project already exists, Terraform will try to create it again and the apply will fail. Always use a data source for existing projects, or import them (see the sketch below).
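
If you do want Terraform to manage a project that already exists, one option is to adopt it into state with an import block (available since Terraform 1.5, which the provider setup above already requires). A sketch using the example project ID above:

import {
  to = google_project.new_project
  id = "unique-id-123"  # ID of the existing project to adopt
}

On the next plan/apply, Terraform imports the project instead of trying to create it.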

Enabling APIs

 1resource "google_project_service" "apis" {
 2  for_each = toset([
 3    "run.googleapis.com",
 4    "artifactregistry.googleapis.com",
 5    "cloudfunctions.googleapis.com",
 6    "storage.googleapis.com",
 7  ])
 8
 9  project            = var.project_id
10  service            = each.key
11  disable_on_destroy = true  # Optional: disable when destroying
12}
13
14# Always use depends_on for resources that need APIs
15resource "google_storage_bucket" "example" {
16  name       = "example-bucket"
17  location   = "US"
18  depends_on = [google_project_service.apis]
19}

Important: APIs take 30-60 seconds to propagate after enabling. Use depends_on to avoid race conditions where Terraform tries to use an API before it’s ready.
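
If you still hit propagation races despite depends_on, one common workaround (not required by the google provider, just a pattern) is an explicit delay via the hashicorp/time provider. A sketch; the resource names and the 60s window are assumptions:

# Add hashicorp/time to required_providers before using this
resource "time_sleep" "wait_for_apis" {
  depends_on      = [google_project_service.apis]
  create_duration = "60s"  # rough propagation window
}

resource "google_storage_bucket" "delayed_example" {
  name       = "example-bucket-delayed"  # illustrative name
  location   = "US"
  depends_on = [time_sleep.wait_for_apis]
}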

Cloud Functions Regions

Define region:

1resource "google_cloudfunctions_function" "example" {
2  region = "europe-west1"
3}

Check available regions:

gcloud functions regions list
gcloud run regions list

Common supported regions:

  • US: us-central1, us-east1, us-west1
  • Europe: europe-west1, europe-west2, europe-west3
  • Asia: asia-east1, asia-northeast1

Budget Configuration

GCP budgets don’t enforce a hard spending cap out of the box; they only send alerts. But we can always implement a cap ourselves.

Architecture Overview

This event-driven cost control system uses GCP’s pub/sub service to monitor spending and trigger automated responses when budget thresholds are exceeded.

Components:

  1. Budget Monitor (google_billing_budget) - Tracks spending against a defined limit, monitors at the project level, and can define multiple alert thresholds
  2. Pub/Sub Topic (google_pubsub_topic) - Receives notifications when budget thresholds are crossed
  3. Event Handler (google_cloudfunctions_function) - Subscribes to the Pub/Sub topic and executes custom logic when budget alerts fire

Data Flow:

Budget Threshold Crossed
        ↓
Google Billing API detects overspend
        ↓
Publishes message to Pub/Sub topic
        ↓
Cloud Function triggered automatically
        ↓
Function receives budget data (cost, threshold %)
        ↓
Function executes actions
(e.g., disable billing, alert team, shut down resources)

Budget with Pub/Sub Notifications

# Create Pub/Sub topic for alerts
resource "google_pubsub_topic" "budget_alerts" {
  name    = "budget-alerts"
  project = var.project_id
}

# Budget with hard limit
resource "google_billing_budget" "monthly_limit" {
  billing_account = var.billing_account_id
  display_name    = "Monthly Budget Limit"

  budget_filter {
    # Use project NUMBER (not ID)
    projects = ["projects/${data.google_project.my_project.number}"]
  }

  amount {
    specified_amount {
      currency_code = "USD"
      units         = "50"
    }
  }

  # Multiple thresholds for early warning
  threshold_rules {
    threshold_percent = 0.5  # Alert at 50%
  }
  threshold_rules {
    threshold_percent = 0.9  # Alert at 90%
  }
  threshold_rules {
    threshold_percent = 1.0  # Hard limit at 100%
  }

  # Send to Pub/Sub for automation
  all_updates_rule {
    pubsub_topic = google_pubsub_topic.budget_alerts.id
  }
}

Important: Budget filters require the project number (an integer), not the project ID (a string).

Cloud Functions with Event Triggers

# Storage bucket for function code
resource "google_storage_bucket" "function_code" {
  name     = "${var.project_id}-functions"
  location = "US"
  project  = var.project_id
}

# Package function code
data "archive_file" "function_zip" {
  type        = "zip"
  source_dir  = "${path.module}/function"
  output_path = "${path.module}/function.zip"
}

# Upload to storage
resource "google_storage_bucket_object" "function_zip" {
  name   = "function-${data.archive_file.function_zip.output_md5}.zip"
  bucket = google_storage_bucket.function_code.name
  source = data.archive_file.function_zip.output_path
}

# Deploy function
resource "google_cloudfunctions_function" "budget_enforcer" {
  name                  = "budget-enforcer"
  project               = var.project_id
  region                = "europe-west1"
  runtime               = "python310"
  available_memory_mb   = 128
  timeout               = 60
  entry_point           = "main_function"

  source_archive_bucket = google_storage_bucket.function_code.name
  source_archive_object = google_storage_bucket_object.function_zip.name

  # Pub/Sub trigger
  event_trigger {
    event_type = "google.pubsub.topic.publish"
    resource   = google_pubsub_topic.budget_alerts.id
  }

  environment_variables = {
    PROJECT_ID = var.project_id
  }

  depends_on = [google_project_service.apis]
}

Minimal Function Example

Directory structure:

function/
├── main.py
└── requirements.txt

function/main.py:

import base64
import json

def main_function(event, context):
    """Triggered by Pub/Sub message."""

    # Decode Pub/Sub message
    if 'data' in event:
        message = base64.b64decode(event['data']).decode('utf-8')
        data = json.loads(message)
        print(f'Received: {data}')

    # Your logic here
    print('Processing budget alert...')

    return 'OK'

function/requirements.txt:

functions-framework==3.5.0

Tip: Including the MD5 hash in the zip object name means the name changes whenever the code changes, which forces Terraform to redeploy the function.

IAM Permissions

Grant Function Permissions

1resource "google_project_iam_member" "function_billing" {
2  project = var.project_id
3  role    = "roles/billing.projectManager"
4  member  = "serviceAccount:${google_cloudfunctions_function.budget_enforcer.service_account_email}"
5}

Common IAM roles:

  • roles/viewer - Read-only access to all resources
  • roles/editor - Read-write access (cannot change IAM)
  • roles/owner - Full access including IAM
  • roles/billing.projectManager - Link/unlink billing accounts
  • roles/cloudfunctions.invoker - Invoke Cloud Functions
  • roles/storage.objectViewer - Read objects from Cloud Storage

Best practice: Use least privilege - grant only the specific permissions needed.
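
One way to put least privilege into practice is a dedicated service account per workload with a narrow role. A sketch (the account ID and the roles/storage.objectViewer grant are illustrative, not from the original):

# Dedicated service account instead of the default compute/App Engine one
resource "google_service_account" "budget_fn" {
  project      = var.project_id
  account_id   = "budget-enforcer-fn"
  display_name = "Budget enforcer function"
}

# Narrow grant instead of roles/editor or roles/owner
resource "google_project_iam_member" "budget_fn_storage_read" {
  project = var.project_id
  role    = "roles/storage.objectViewer"
  member  = "serviceAccount:${google_service_account.budget_fn.email}"
}

The function can then run as this account via the service_account_email argument on google_cloudfunctions_function.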

Variables Best Practices

variables.tf

 1variable "project_id" {
 2  description = "GCP project ID"
 3  type        = string
 4}
 5
 6variable "billing_account_id" {
 7  description = "Billing account ID"
 8  type        = string
 9}
10
11variable "region" {
12  description = "Default region for resources"
13  type        = string
14  default     = "europe-west1"
15}
16
17variable "budget_amount" {
18  description = "Monthly budget limit"
19  type        = string
20  default     = "50"
21}
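
Variables can also carry validation blocks so bad values fail at plan time. A sketch of an alternative project_id declaration (don't declare the variable twice); the regex approximates GCP's project ID rules and is an assumption, not copied from the original:

variable "project_id" {
  description = "GCP project ID"
  type        = string

  validation {
    condition     = can(regex("^[a-z][a-z0-9-]{4,28}[a-z0-9]$", var.project_id))
    error_message = "project_id must be 6-30 characters: lowercase letters, digits, and hyphens, starting with a letter."
  }
}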

terraform.tfvars (NEVER COMMIT!)

project_id         = "my-project-123"
billing_account_id = "XXXXXX-XXXXXX-XXXXXX"
region             = "europe-west1"
budget_amount      = "100"

.gitignore (REQUIRED)

# Terraform state files
*.tfstate
*.tfstate.backup
.terraform/
.terraform.lock.hcl

# Variable files with secrets
terraform.tfvars
*.auto.tfvars

# Credentials
credentials.json
service-account-key.json

Critical: Committing terraform.tfvars or *.tfstate files exposes credentials and infrastructure details.
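
A complementary option (a sketch, not part of the original setup) is keeping state out of the repository entirely with a remote GCS backend; the bucket name below is hypothetical and must exist before terraform init:

terraform {
  backend "gcs" {
    bucket = "my-terraform-state-bucket"  # hypothetical; create it first
    prefix = "terraform/state"
  }
}

With remote state in GCS, no *.tfstate file needs to live in the repo at all, and the backend handles state locking for you.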

Terraform Workflow

Complete Deployment Process

# 1. Initialize - downloads providers and modules
terraform init

# 2. Validate syntax
terraform validate

# 3. Format code consistently
terraform fmt

# 4. Preview changes (ALWAYS do this!)
terraform plan

# 5. Review the plan output carefully
#    Look for: + (create), ~ (modify), - (destroy)

# 6. Apply changes
terraform apply
# Type "yes" when prompted

# 7. View outputs
terraform output

# 8. Check resource state
terraform state list

NOTE: NEVER run terraform apply -auto-approve without reviewing terraform plan first.

State Management

# List all resources in state
terraform state list

# Show details of specific resource
terraform state show google_storage_bucket.example

# Remove resource from state (doesn't delete in GCP!)
terraform state rm google_storage_bucket.example

# Move resource in state (rename)
terraform state mv google_storage_bucket.old google_storage_bucket.new

Warning: terraform state rm removes Terraform’s tracking but leaves the actual GCP resource running (and billing you!).
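
If you're on Terraform 1.7 or newer (an assumption beyond the >= 1.5 floor used here), the same "forget but don't delete" intent can be written declaratively with a removed block:

removed {
  from = google_storage_bucket.example

  lifecycle {
    destroy = false  # drop from state, keep the bucket running in GCP
  }
}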

Troubleshooting Common Issues

First-Time Setup Issues

Error: “Project not found”

Cause: Project doesn’t exist or wrong project ID

Fix:

# List all your projects
gcloud projects list

# Check current active project
gcloud config get-value project

Error: “You do not have permission to access project”

Cause: Not logged in or using wrong Google account

Fix:

# Check which account is active
gcloud auth list

# Switch to correct account
gcloud config set account YOUR_EMAIL

# Re-authenticate
gcloud auth login

API and Service Issues

Error: “API [service] not enabled on project”

Cause: Required API not enabled

Fix:

 1resource "google_project_service" "required_api" {
 2  project = var.project_id
 3  service = "servicename.googleapis.com"
 4}
 5
 6# Add to dependent resources
 7resource "google_storage_bucket" "example" {
 8  depends_on = [google_project_service.required_api]
 9  # ...
10}

Or enable manually:

gcloud services enable servicename.googleapis.com --project=PROJECT_ID

Error: “The API requires a quota project, which is not set”

Cause: ADC credentials missing quota project

Fix:

gcloud auth application-default set-quota-project PROJECT_ID

This is one of the most common errors - make sure to run this during setup.

Billing Issues

Error: “Billing account for project not found”

Cause: Project not linked to billing account

Fix:

# Link project to billing
gcloud billing projects link PROJECT_ID \
  --billing-account=BILLING_ACCOUNT_ID

# Verify billing is linked
gcloud billing projects describe PROJECT_ID

Region and Location Issues

Error: “Permission denied on 'locations/REGION'”

Cause: Region doesn’t support the service

Fix: Check supported regions and change your configuration

# For Cloud Functions
gcloud functions regions list

# For Cloud Run
gcloud run regions list

# For general compute regions
gcloud compute regions list

Service Dependencies

Error: “Service X depends on service Y”

Cause: Trying to disable API that other services depend on

Fix when destroying:

1resource "google_project_service" "api" {
2  service                    = "storage.googleapis.com"
3  disable_on_destroy         = true
4  disable_dependent_services = true  # Add this
5}

Warning: This will disable ALL dependent services when you run terraform destroy.

Verification Steps

After deploying infrastructure, verify everything works:

# Verify APIs are enabled
gcloud services list --enabled --project=PROJECT_ID

# Check Cloud Function deployed
gcloud functions list --project=PROJECT_ID
gcloud functions describe FUNCTION_NAME --region=REGION --project=PROJECT_ID

# Verify budget created
gcloud billing budgets list --billing-account=BILLING_ACCOUNT_ID

# Check Pub/Sub topics
gcloud pubsub topics list --project=PROJECT_ID

# View recent logs
gcloud logging read --limit=10 --project=PROJECT_ID

# Test Cloud Function
gcloud functions call FUNCTION_NAME \
  --region=REGION \
  --data='{"test": "data"}'

Cost Management

1. Set Budget Alerts Early

Configure budgets BEFORE deploying expensive resources:

# Early warning alerts
threshold_rules {
  threshold_percent = 0.5  # Alert at 50%
}
threshold_rules {
  threshold_percent = 0.9  # Alert at 90%
}
threshold_rules {
  threshold_percent = 1.0  # Hard limit at 100%
}

2. Understand Free Tier Limits

  • Cloud Functions: 2M invocations/month, 400k GB-seconds compute
  • Cloud Storage: 5GB, 5k Class A operations, 50k Class B operations
  • Cloud Run: 2M requests/month, 360k GB-seconds compute
  • Artifact Registry: 0.5GB storage

Monitor usage:

gcloud billing accounts list
gcloud billing projects describe PROJECT_ID

3. Clean Up Properly

# Preview what will be destroyed
terraform plan -destroy

# Destroy all managed resources
terraform destroy

# Verify nothing remains
gcloud functions list --project=PROJECT_ID
gcloud run services list --project=PROJECT_ID
gcloud storage buckets list --project=PROJECT_ID

Important gotchas:

  • Budget resources are NOT deleted by terraform destroy
  • Storage buckets with objects may require force deletion (see the force_destroy sketch after this list)
  • Some resources have deletion protection enabled by default
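
For the bucket case above, google_storage_bucket has a force_destroy argument (it defaults to false); a sketch with an illustrative bucket name:

resource "google_storage_bucket" "scratch" {
  name          = "${var.project_id}-scratch"  # illustrative name
  location      = "US"
  force_destroy = true  # delete remaining objects when the bucket is destroyed
}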

Security Best Practices

1. Never Commit Credentials

Essential .gitignore entries:

*.tfvars
*.tfstate*
.terraform/
credentials.json
service-account-key.json
.env

Check before committing:

# View what will be committed
git status
git diff

# If you accidentally committed secrets:
git reset HEAD~1  # Undo last commit (if not pushed)

2. Use Least Privilege IAM

# AVOID - too permissive
role = "roles/owner"

# PREFER - specific permissions
role = "roles/cloudfunctions.invoker"

IAM Best Practices:

  • Grant minimum permissions needed
  • Use service accounts instead of user accounts
  • Review permissions regularly
  • Enable audit logging

3. Enable Audit Logging

 1resource "google_project_iam_audit_config" "audit" {
 2  project = var.project_id
 3  service = "allServices"
 4
 5  audit_log_config {
 6    log_type = "ADMIN_READ"
 7  }
 8  audit_log_config {
 9    log_type = "DATA_WRITE"
10  }
11  audit_log_config {
12    log_type = "DATA_READ"
13  }
14}

Note: Admin activity is logged automatically; this adds data access logging.

Complete Working Example

This example creates a minimal Cloud Run service:

Directory Structure

my-gcp-project/
├── main.tf
├── variables.tf
├── terraform.tfvars  (create this, add to .gitignore!)
└── .gitignore

main.tf

terraform {
  required_version = ">= 1.5"
  required_providers {
    google = {
      source  = "hashicorp/google"
      version = "~> 6.0"
    }
  }
}

provider "google" {
  project               = var.project_id
  region                = var.region
  user_project_override = true
  billing_project       = var.project_id
}

# Reference existing project
data "google_project" "my_project" {
  project_id = var.project_id
}

# Enable Cloud Run API
resource "google_project_service" "run" {
  project = var.project_id
  service = "run.googleapis.com"
}

# Deploy sample container
resource "google_cloud_run_service" "hello" {
  name     = "hello-world"
  location = var.region

  template {
    spec {
      containers {
        image = "gcr.io/cloudrun/hello"
      }
    }
  }

  traffic {
    percent         = 100
    latest_revision = true
  }

  depends_on = [google_project_service.run]
}

# Make service publicly accessible
resource "google_cloud_run_service_iam_member" "public" {
  service  = google_cloud_run_service.hello.name
  location = google_cloud_run_service.hello.location
  role     = "roles/run.invoker"
  member   = "allUsers"
}

output "service_url" {
  description = "URL of deployed Cloud Run service"
  value       = google_cloud_run_service.hello.status[0].url
}

variables.tf

 1variable "project_id" {
 2  description = "GCP project ID"
 3  type        = string
 4}
 5
 6variable "region" {
 7  description = "GCP region"
 8  type        = string
 9  default     = "europe-west1"
10}

terraform.tfvars

1project_id = "your-project-id-here"
2region     = "europe-west1"

.gitignore

*.tfstate
*.tfstate.backup
.terraform/
terraform.tfvars

Deploy

# Initialize
terraform init

# Plan
terraform plan

# Deploy
terraform apply

# Visit the output URL to see your service

Useful Commands Reference

# Project management
gcloud projects list
gcloud projects describe PROJECT_ID
gcloud config set project PROJECT_ID

# API management
gcloud services list --enabled
gcloud services enable SERVICE_NAME.googleapis.com

# Billing
gcloud billing accounts list
gcloud billing projects link PROJECT_ID --billing-account=ACCOUNT_ID
gcloud billing budgets list --billing-account=ACCOUNT_ID

# Cloud Functions
gcloud functions list --project=PROJECT_ID
gcloud functions deploy FUNCTION_NAME \
  --runtime python310 \
  --trigger-topic TOPIC_NAME \
  --region europe-west1

# Logging
gcloud logging read "resource.type=cloud_function" --limit 50
gcloud logging read "severity>=ERROR" --limit 20

# Storage
gcloud storage buckets list
gcloud storage buckets delete gs://BUCKET_NAME

# Cleanup
gcloud projects delete PROJECT_ID

Tags: Devops, Terraform, Gcp, Infrastructure-as-Code, Cloud-Computing, Automation