# SLO Pipeline

The SLO pipeline submodule creates the following resources (see the usage sketch after this list):

- A Pub/Sub topic that receives each SLO report.
- A Cloud Function that processes each SLO report and sends it to a destination (BigQuery / Stackdriver Monitoring).
- A JSON configuration file (needed by the slo-generator) indicating which export destinations are configured (`exporters.json`).
- A JSON configuration file (needed by the slo-generator) describing which error budgets to calculate (`error-budget-policy.json`).
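As a sketch, a minimal invocation of the submodule might look like the following. The module source path, project ID, and exporter settings are illustrative placeholders, and the exporter fields (`class`, `dataset_id`, `table_id`, ...) follow common slo-generator conventions rather than values taken from this document:

```hcl
module "slo-pipeline" {
  # Source path shown here is an assumption; adjust to where the module lives.
  source     = "terraform-google-modules/slo/google//modules/slo-pipeline"
  project_id = "my-slo-project" # placeholder project ID

  # Send each SLO report to both BigQuery and Stackdriver Monitoring.
  exporters = [
    {
      class      = "Bigquery"
      project_id = "my-slo-project"
      dataset_id = "slo_reports"
      table_id   = "reports"
    },
    {
      class      = "Stackdriver"
      project_id = "my-slo-project"
    },
  ]
}
```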

## Architecture

## Pre-requisites

To use this module, you will need:

- A GCP project with the following APIs enabled:
  - App Engine API: `appengine.googleapis.com`
  - Cloud Functions API: `cloudfunctions.googleapis.com`
  - Monitoring API: `monitoring.googleapis.com`
  - Logging API: `logging.googleapis.com`
  - Pub/Sub API: `pubsub.googleapis.com`
  - Storage API: `storage.googleapis.com`
  - BigQuery API: `bigquery-json.googleapis.com`
- The following IAM roles on the project, for the service account running Terraform:
  - App Engine Admin (`roles/appengine.appAdmin`)
  - Cloud Functions Admin (`roles/cloudfunctions.admin`)
  - Pub/Sub Admin (`roles/pubsub.admin`)
  - Storage Admin (`roles/storage.admin`)
  - Service Account Admin (`roles/iam.serviceAccountAdmin`)
- For the BigQuery exporter:
  - BigQuery Admin (`roles/bigquery.admin`)
See the fixture project for an example of creating this project and enabling the App Engine application using Terraform and the Project Factory module.
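If you are not using the Project Factory, the required APIs can also be enabled directly with `google_project_service`; a minimal sketch, assuming an existing project with a placeholder ID:

```hcl
# APIs the slo-pipeline submodule depends on (see the list above).
locals {
  slo_pipeline_apis = [
    "appengine.googleapis.com",
    "cloudfunctions.googleapis.com",
    "monitoring.googleapis.com",
    "logging.googleapis.com",
    "pubsub.googleapis.com",
    "storage.googleapis.com",
    "bigquery-json.googleapis.com",
  ]
}

resource "google_project_service" "slo_pipeline" {
  for_each = toset(local.slo_pipeline_apis)

  project = "my-slo-project" # placeholder project ID
  service = each.value

  # Leave the API enabled if this resource is ever destroyed.
  disable_on_destroy = false
}
```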

## Inputs

| Name | Description | Type | Default | Required |
|------|-------------|------|---------|:--------:|
| dataset_create | Whether to create the BigQuery dataset | bool | `"true"` | no |
| dataset_default_table_expiration_ms | Default lifetime of the SLO table in the dataset, in milliseconds. Defaults to never expiring (recommended) | number | `"-1"` | no |
| exporters | SLO export destinations config | any | n/a | yes |
| extra_files | Extra files to add to the Cloud Function code | object | `<list>` | no |
| function_bucket_name | Name of the bucket to create to store the Cloud Function code | string | `"slo-pipeline"` | no |
| function_environment_variables | Cloud Function environment variables | map | `<map>` | no |
| function_memory | Memory in MB for the Cloud Function (increase with the number of SLOs) | string | `"128"` | no |
| function_name | Cloud Function name | string | `"slo-pipeline"` | no |
| function_source_directory | The contents of this directory will be archived and used as the function source (defaults to the standard SLO generator code) | string | `""` | no |
| function_timeout | Cloud Function timeout (in seconds) | string | `"90"` | no |
| grant_iam_roles | Whether to grant IAM roles to the created service accounts | string | `"true"` | no |
| labels | Labels to apply to all created resources | map | `<map>` | no |
| project_id | Project ID where the SLO infrastructure is created | string | n/a | yes |
| pubsub_topic_name | Pub/Sub topic name | string | `"slo-export-topic"` | no |
| region | Region for the App Engine app | string | `"us-east1"` | no |
| service_account_email | Service account email (optional) | string | `""` | no |
| service_account_name | Name of the service account to create | string | `"slo-pipeline"` | no |
| slo_generator_version | SLO generator library version | string | `"1.4.0"` | no |
| storage_bucket_class | Storage class of the new bucket. Supported values: STANDARD, MULTI_REGIONAL, REGIONAL, NEARLINE, COLDLINE | string | `"STANDARD"` | no |
| storage_bucket_location | The GCS bucket location | string | `"US"` | no |
| use_custom_service_account | Use a custom service account (pass `service_account_email` if true) | bool | `"false"` | no |
| vpc_connector | VPC connector. The format of this field is `projects/*/locations/*/connectors/*` | string | `"null"` | no |
| vpc_connector_egress_settings | VPC connector egress settings. Allowed values: ALL_TRAFFIC and PRIVATE_RANGES_ONLY | string | `"null"` | no |
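Two of these inputs work as a pair: `use_custom_service_account` and `service_account_email`. A sketch of running the Cloud Function as a pre-existing service account (the email and source path are placeholders):

```hcl
module "slo-pipeline" {
  source     = "terraform-google-modules/slo/google//modules/slo-pipeline"
  project_id = "my-slo-project"
  exporters  = var.exporters

  # Run the function as an existing service account instead of letting the
  # module create one named after service_account_name.
  use_custom_service_account = true
  service_account_email      = "slo-pipeline@my-slo-project.iam.gserviceaccount.com"
}
```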

## Outputs

| Name | Description |
|------|-------------|
| exporters | Exporter config |
| function_bucket_name | Cloud Function bucket name |
| function_name | Cloud Function name |
| project_id | Project ID |
| pubsub_topic_name | Ingress Pub/Sub topic to the SLO pipeline |
| service_account_email | Service account email used to run the Cloud Function |
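Downstream Terraform can consume these outputs directly. For example, a minimal sketch re-exporting the ingress topic name so that SLO configurations know where to publish reports (the output name is illustrative):

```hcl
# Expose the pipeline's ingress Pub/Sub topic for other configurations.
output "slo_pipeline_topic_name" {
  description = "Pub/Sub topic that receives SLO reports"
  value       = module.slo-pipeline.pubsub_topic_name
}
```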