
Add BigQuery, Source Repositories #113

Merged
merged 57 commits on Mar 1, 2019

Commits
e87bd0f
Add InSpec support for backend service
slevenick Jan 25, 2019
6a8728b
Merge pull request #87 from modular-magician/codegen-pr-1300
slevenick Jan 25, 2019
9ecfaaa
Add HTTP health check for InSpec
slevenick Jan 25, 2019
0942279
Merge pull request #88 from modular-magician/codegen-pr-1303
slevenick Jan 25, 2019
98a2d36
Add HTTPS health check to InSpec
slevenick Jan 25, 2019
f0be43a
Merge pull request #89 from modular-magician/codegen-pr-1305
slevenick Jan 25, 2019
e4f5d3e
Add compute instance template for InSpec
slevenick Jan 26, 2019
2e55d45
Merge pull request #90 from modular-magician/codegen-pr-1308
slevenick Jan 28, 2019
0a10a32
Add compute global address to InSpec
slevenick Jan 28, 2019
752a463
Merge pull request #91 from modular-magician/codegen-pr-1309
slevenick Jan 28, 2019
2f6ded5
Inspec url map
slevenick Jan 28, 2019
9d387eb
Merge pull request #92 from modular-magician/codegen-pr-1310
slevenick Jan 28, 2019
adb5a42
Add InSpec support for HTTP proxy
slevenick Jan 28, 2019
226fac4
Merge pull request #94 from modular-magician/codegen-pr-1314
slevenick Jan 28, 2019
67a8582
Add global forwarding rule generation to InSpec
slevenick Jan 29, 2019
1753ef9
Merge pull request #95 from modular-magician/codegen-pr-1319
slevenick Jan 29, 2019
9ce89f7
Add support for target TCP proxy in InSpec
slevenick Jan 29, 2019
2c51de4
Merge pull request #96 from modular-magician/codegen-pr-1321
slevenick Jan 30, 2019
bf0e504
Inspec regional cluster
slevenick Jan 30, 2019
a562de7
Merge pull request #97 from modular-magician/codegen-pr-1295
slevenick Jan 30, 2019
ec04e25
Add InSpec support for compute routes
slevenick Jan 30, 2019
c18dd70
Merge pull request #98 from modular-magician/codegen-pr-1331
slevenick Jan 31, 2019
a167f4c
Update InSpec doc template to use underscored name in title box
slevenick Jan 31, 2019
7aceed0
Merge pull request #100 from modular-magician/codegen-pr-1333
slevenick Jan 31, 2019
73aaadb
Add router support in InSpec
slevenick Jan 31, 2019
df79fb9
Merge pull request #99 from modular-magician/codegen-pr-1332
slevenick Jan 31, 2019
507ad5c
Add support for InSpec disk snapshot
slevenick Feb 1, 2019
c3d9a69
Merge pull request #101 from modular-magician/codegen-pr-1343
slevenick Feb 1, 2019
858fa89
Inspec ssl certificate
slevenick Feb 2, 2019
55558ec
Merge pull request #102 from modular-magician/codegen-pr-1347
slevenick Feb 5, 2019
280de46
Fix InSpec pubsub subscription test
slevenick Feb 6, 2019
3608612
Merge pull request #103 from modular-magician/codegen-pr-1357
slevenick Feb 6, 2019
ed63fb1
InSpec add support for BigQuery Dataset
slevenick Feb 6, 2019
e12467d
Merge pull request #104 from modular-magician/codegen-pr-1358
slevenick Feb 8, 2019
a3bbe4b
Retrieve SOA record using DNS zone instead of building it from record…
matco Feb 12, 2019
ac3d1fd
Inspec nested refactor
slevenick Feb 13, 2019
8360494
Merge pull request #105 from modular-magician/codegen-pr-1368
slevenick Feb 13, 2019
009f814
Remove old nested objects with bad namespaces
slevenick Feb 13, 2019
c268f98
Add VCR back for unit testing in InSpec
slevenick Feb 13, 2019
a8cc444
Merge branch 'master' of https://github.com/inspec/inspec-gcp
slevenick Feb 13, 2019
e24b30c
Merge pull request #107 from modular-magician/codegen-pr-1373
slevenick Feb 13, 2019
519ebca
Add terraform upgrade to Rakefile
slevenick Feb 15, 2019
bf0cbf2
Templates, inspec.yaml for bigquery table
slevenick Feb 15, 2019
a9b2537
Merge pull request #110 from modular-magician/codegen-pr-1399
slevenick Feb 15, 2019
2688372
Retrieve SOA record using DNS zone instead of building it from record…
rambleraptor Feb 15, 2019
fb2b900
Add InSpec support for source repositories
slevenick Feb 19, 2019
28ec6a7
Add labels to Pubsub Subscription/Topics (#109)
modular-magician Feb 19, 2019
f5b6860
Update display names across products based on cloud.google.com (#106)
modular-magician Feb 19, 2019
3cf9d74
Merge branch 'master' into codegen-pr-1411
slevenick Feb 20, 2019
d627f42
Merge pull request #112 from modular-magician/codegen-pr-1411
slevenick Feb 20, 2019
b68cb8b
Add convenience outputs for public/private IP in Cloud SQL
rileykarson Feb 20, 2019
c5b2dec
Merge pull request #116 from modular-magician/codegen-pr-1417
nat-henderson Feb 20, 2019
1f43702
Merge remote-tracking branch 'origin/master' into gcp-master
slevenick Feb 25, 2019
e591cf4
Reset merge issues
slevenick Feb 25, 2019
bce4ef4
Add notes on API requirements to markdown docs for InSpec generated r…
slevenick Feb 28, 2019
a7b11d4
Merge pull request #119 from modular-magician/codegen-pr-1449
slevenick Feb 28, 2019
0c000e9
Improve docs for Cloud Build (#118)
modular-magician Mar 1, 2019
1 change: 1 addition & 0 deletions Gemfile
@@ -13,5 +13,6 @@ group :development do
gem 'passgen'
gem 'pry-coolline'
gem 'rake'
gem 'vcr'
gem 'webmock'
end
2 changes: 1 addition & 1 deletion Rakefile
@@ -43,7 +43,7 @@ namespace :test do

task :init_workspace do
# Initialize terraform workspace
-cmd = format("cd %s/build/ && terraform init", integration_dir)
+cmd = format("cd %s/build/ && terraform init -upgrade", integration_dir)
sh(cmd)
end

6 changes: 6 additions & 0 deletions docs/resources/google_bigquery_dataset.md
@@ -71,3 +71,9 @@ Properties that can be accessed from the `google_bigquery_dataset` resource:
* `last_modified_time`: The date when this dataset or any of its tables was last modified, in milliseconds since the epoch.

* `location`: The geographic location where the dataset should reside. Possible values include EU and US. The default value is US.



## GCP Permissions

Ensure the [BigQuery API](https://console.cloud.google.com/apis/library/bigquery-json.googleapis.com/) is enabled for the current project.
4 changes: 4 additions & 0 deletions docs/resources/google_bigquery_datasets.md
@@ -28,3 +28,7 @@ See [google_bigquery_dataset.md](google_bigquery_dataset.md) for more detailed i
## Filter Criteria
This resource supports all of the above properties as filter criteria, which can be used
with `where` as a block or a method.
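
Both `where` forms can be sketched as follows (a minimal illustration; the `location` criterion and the 'chef-gcp-inspec' project name are assumptions based on the examples elsewhere in these docs, not part of this diff):

```
# Block form: keep only datasets located in the US
describe google_bigquery_datasets(project: 'chef-gcp-inspec').where { location == 'US' } do
  it { should exist }
end

# Method form: the same filter expressed as a keyword argument
describe google_bigquery_datasets(project: 'chef-gcp-inspec').where(location: 'US') do
  its('friendly_names') { should_not be_empty }
end
```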

## GCP Permissions

Ensure the [BigQuery API](https://console.cloud.google.com/apis/library/bigquery-json.googleapis.com/) is enabled for the current project.
117 changes: 117 additions & 0 deletions docs/resources/google_bigquery_table.md
@@ -0,0 +1,117 @@
---
title: About the google_bigquery_table resource
platform: gcp
---

## Syntax
A `google_bigquery_table` is used to test a Google BigQuery Table resource

## Examples
```
describe google_bigquery_table(project: 'chef-gcp-inspec', dataset: 'inspec_gcp_dataset', name: 'inspec_gcp_bigquery_table') do
it { should exist }

its('expiration_time') { should cmp '1738882264000' }
its('time_partitioning.type') { should eq 'DAY' }
its('description') { should eq 'A BigQuery table' }
end

describe google_bigquery_table(project: 'chef-gcp-inspec', dataset: 'inspec_gcp_dataset', name: 'nonexistent') do
it { should_not exist }
end
```

## Properties
Properties that can be accessed from the `google_bigquery_table` resource:

* `table_reference`: Reference describing the ID of this table

* `datasetId`: The ID of the dataset containing this table

* `projectId`: The ID of the project containing this table

* `tableId`: The ID of the table

* `creation_time`: The time when this table was created, in milliseconds since the epoch.

* `description`: A user-friendly description of this table

* `friendly_name`: A descriptive name for this table

* `id`: An opaque ID uniquely identifying the table.

* `labels`: The labels associated with this table. You can use these to organize and group your tables

* `last_modified_time`: The time when this table was last modified, in milliseconds since the epoch.

* `location`: The geographic location where the table resides. This value is inherited from the dataset.

* `name`: Name of the table

* `num_bytes`: The size of this table in bytes, excluding any data in the streaming buffer.

* `num_long_term_bytes`: The number of bytes in the table that are considered "long-term storage".

* `num_rows`: The number of rows of data in this table, excluding any data in the streaming buffer.

* `type`: Describes the table type

* `view`: The view definition.

* `useLegacySql`: Specifies whether to use BigQuery's legacy SQL for this view

* `userDefinedFunctionResources`: Describes user-defined function resources used in the query.

* `time_partitioning`: If specified, configures time-based partitioning for this table.

* `expirationMs`: Number of milliseconds for which to keep the storage for a partition.

* `type`: The only type supported is DAY, which will generate one partition per day.

* `streaming_buffer`: Contains information regarding this table's streaming buffer, if one is present. This field will be absent if the table is not being streamed to or if there is no data in the streaming buffer.

* `estimatedBytes`: A lower-bound estimate of the number of bytes currently in the streaming buffer.

* `estimatedRows`: A lower-bound estimate of the number of rows currently in the streaming buffer.

* `oldestEntryTime`: Contains the timestamp of the oldest entry in the streaming buffer, in milliseconds since the epoch, if the streaming buffer is available.

* `schema`: Describes the schema of this table

* `fields`: Describes the fields in a table.

* `encryption_configuration`: Custom encryption configuration

* `kmsKeyName`: Describes the Cloud KMS encryption key that will be used to protect destination BigQuery table. The BigQuery Service Account associated with your project requires access to this encryption key.

* `expiration_time`: The time when this table expires, in milliseconds since the epoch. If not present, the table will persist indefinitely.

* `external_data_configuration`: Describes the data format, location, and other properties of a table stored outside of BigQuery. By defining these properties, the data source can then be queried as if it were a standard BigQuery table.

* `autodetect`: Try to detect schema and format options automatically. Any option specified explicitly will be honored.

* `compression`: The compression type of the data source

* `ignoreUnknownValues`: Indicates if BigQuery should allow extra values that are not represented in the table schema

* `maxBadRecords`: The maximum number of bad records that BigQuery can ignore when reading data

* `sourceFormat`: The data format

* `sourceUris`: The fully-qualified URIs that point to your data in Google Cloud. For Google Cloud Storage URIs: Each URI can contain one '*' wildcard character and it must come after the 'bucket' name. Size limits related to load jobs apply to external data sources. For Google Cloud Bigtable URIs: Exactly one URI can be specified and it has to be a fully specified and valid HTTPS URL for a Google Cloud Bigtable table. For Google Cloud Datastore backups, exactly one URI can be specified. Also, the '*' wildcard character is not allowed.

* `schema`: The schema for the data. Schema is required for CSV and JSON formats

* `googleSheetsOptions`: Additional options if sourceFormat is set to GOOGLE_SHEETS.

* `csvOptions`: Additional properties to set if sourceFormat is set to CSV.

* `bigtableOptions`: Additional options if sourceFormat is set to BIGTABLE.

* `dataset`: Name of the dataset
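
Nested properties such as `table_reference` and `time_partitioning` can be read with InSpec's dot notation; a minimal sketch (the project, dataset, and table names reuse the example values above, and the snake_case nested accessors are an assumption):

```
describe google_bigquery_table(project: 'chef-gcp-inspec', dataset: 'inspec_gcp_dataset', name: 'inspec_gcp_bigquery_table') do
  # Nested table_reference fields are reached with dot notation
  its('table_reference.dataset_id') { should eq 'inspec_gcp_dataset' }
  # DAY is the only supported partitioning type
  its('time_partitioning.type') { should eq 'DAY' }
end
```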



## GCP Permissions

Ensure the [BigQuery API](https://console.cloud.google.com/apis/library/bigquery-json.googleapis.com/) is enabled for the current project.
51 changes: 51 additions & 0 deletions docs/resources/google_bigquery_tables.md
@@ -0,0 +1,51 @@
---
title: About the google_bigquery_tables resource
platform: gcp
---

## Syntax
A `google_bigquery_tables` is used to test Google BigQuery Table resources in bulk

## Examples
```
describe.one do
google_bigquery_tables(project: 'chef-gcp-inspec', dataset: 'inspec_gcp_dataset').table_references.each do |table_reference|
describe google_bigquery_table(project: 'chef-gcp-inspec', dataset: 'inspec_gcp_dataset', name: table_reference.table_id) do
its('expiration_time') { should cmp '1738882264000' }
its('description') { should eq 'A BigQuery table' }
end
end
end
```

## Properties
Properties that can be accessed from the `google_bigquery_tables` resource:

See [google_bigquery_table.md](google_bigquery_table.md) for more detailed information
* `table_references`: an array of `google_bigquery_table` table_reference
* `creation_times`: an array of `google_bigquery_table` creation_time
* `friendly_names`: an array of `google_bigquery_table` friendly_name
* `ids`: an array of `google_bigquery_table` id
* `labels`: an array of `google_bigquery_table` labels
* `last_modified_times`: an array of `google_bigquery_table` last_modified_time
* `locations`: an array of `google_bigquery_table` location
* `num_bytes`: an array of `google_bigquery_table` num_bytes
* `num_long_term_bytes`: an array of `google_bigquery_table` num_long_term_bytes
* `num_rows`: an array of `google_bigquery_table` num_rows
* `types`: an array of `google_bigquery_table` type
* `views`: an array of `google_bigquery_table` view
* `time_partitionings`: an array of `google_bigquery_table` time_partitioning
* `streaming_buffers`: an array of `google_bigquery_table` streaming_buffer
* `schemas`: an array of `google_bigquery_table` schema
* `encryption_configurations`: an array of `google_bigquery_table` encryption_configuration
* `expiration_times`: an array of `google_bigquery_table` expiration_time
* `external_data_configurations`: an array of `google_bigquery_table` external_data_configuration
* `datasets`: an array of `google_bigquery_table` dataset

## Filter Criteria
This resource supports all of the above properties as filter criteria, which can be used
with `where` as a block or a method.

## GCP Permissions

Ensure the [BigQuery API](https://console.cloud.google.com/apis/library/bigquery-json.googleapis.com/) is enabled for the current project.
16 changes: 11 additions & 5 deletions docs/resources/google_cloudbuild_trigger.md
@@ -35,7 +35,7 @@ Properties that can be accessed from the `google_cloudbuild_trigger` resource:

* `substitutions`: Substitutions data for Build resource.

-* `filename`: Path, from the source root, to a file whose contents is used for the template.
+* `filename`: Path, from the source root, to a file whose contents is used for the template. Either a filename or build template must be provided.

* `ignored_files`: ignoredFiles and includedFiles are file glob matches using https://godoc.org/pkg/path/filepath#Match extended with support for `**`. If ignoredFiles and changed files are both empty, then they are not used to determine whether or not to trigger a build. If ignoredFiles is not empty, then we ignore any files that match any of the ignored_file globs. If the change has no files that are outside of the ignoredFiles globs, then we do not trigger a build.

@@ -49,16 +49,22 @@ Properties that can be accessed from the `google_cloudbuild_trigger` resource:

* `dir`: Directory, relative to the source root, in which to run the build. This must be a relative path. If a step's dir is specified and is an absolute path, this value is ignored for that step's execution.

-* `branchName`: Name of the branch to build.
+* `branchName`: Name of the branch to build. Exactly one of a branch name, tag, or commit SHA must be provided.

-* `tagName`: Name of the tag to build.
+* `tagName`: Name of the tag to build. Exactly one of a branch name, tag, or commit SHA must be provided.

-* `commitSha`: Explicit commit SHA to build.
+* `commitSha`: Explicit commit SHA to build. Exactly one of a branch name, tag, or commit SHA must be provided.

-* `build`: Contents of the build template.
+* `build`: Contents of the build template. Either a filename or build template must be provided.

* `tags`: Tags for annotation of a Build. These are not docker tags.

* `images`: A list of images to be pushed upon the successful completion of all build steps. The images are pushed using the builder service account's credentials. The digests of the pushed images will be stored in the Build resource's results field. If any of the images fail to be pushed, the build status is marked FAILURE.

* `steps`: The operations to be performed on the workspace.
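
A hedged sketch of testing these properties (the `id` parameter name and the 'my-trigger-id' value are assumptions for illustration; substitute a real trigger ID from your project):

```
describe google_cloudbuild_trigger(project: 'chef-gcp-inspec', id: 'my-trigger-id') do
  it { should exist }
  # A trigger defines either a filename or an inline build template, not both
  its('filename') { should eq 'cloudbuild.yaml' }
end
```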



## GCP Permissions

Ensure the [Cloud Build API](https://console.cloud.google.com/apis/library/cloudbuild.googleapis.com/) is enabled for the current project.
4 changes: 4 additions & 0 deletions docs/resources/google_cloudbuild_triggers.md
@@ -40,3 +40,7 @@ See [google_cloudbuild_trigger.md](google_cloudbuild_trigger.md) for more detail
## Filter Criteria
This resource supports all of the above properties as filter criteria, which can be used
with `where` as a block or a method.

## GCP Permissions

Ensure the [Cloud Build API](https://console.cloud.google.com/apis/library/cloudbuild.googleapis.com/) is enabled for the current project.
6 changes: 6 additions & 0 deletions docs/resources/google_compute_autoscaler.md
@@ -51,3 +51,9 @@ Properties that can be accessed from the `google_compute_autoscaler` resource:
* `target`: URL of the managed instance group that this autoscaler will scale.

* `zone`: URL of the zone where the instance group resides.



## GCP Permissions

Ensure the [Compute Engine API](https://console.cloud.google.com/apis/library/compute.googleapis.com/) is enabled for the current project.
4 changes: 4 additions & 0 deletions docs/resources/google_compute_autoscalers.md
@@ -36,3 +36,7 @@ See [google_compute_autoscaler.md](google_compute_autoscaler.md) for more detail
## Filter Criteria
This resource supports all of the above properties as filter criteria, which can be used
with `where` as a block or a method.

## GCP Permissions

Ensure the [Compute Engine API](https://console.cloud.google.com/apis/library/compute.googleapis.com/) is enabled for the current project.
6 changes: 6 additions & 0 deletions docs/resources/google_compute_backend_service.md
@@ -88,3 +88,9 @@ Properties that can be accessed from the `google_compute_backend_service` resour
* `session_affinity`: Type of session affinity to use. The default is NONE. When the load balancing scheme is EXTERNAL, can be NONE, CLIENT_IP, or GENERATED_COOKIE. When the load balancing scheme is INTERNAL, can be NONE, CLIENT_IP, CLIENT_IP_PROTO, or CLIENT_IP_PORT_PROTO. When the protocol is UDP, this field is not used.

* `timeout_sec`: How many seconds to wait for the backend before considering it a failed request. Default is 30 seconds. Valid range is [1, 86400].



## GCP Permissions

Ensure the [Compute Engine API](https://console.cloud.google.com/apis/library/compute.googleapis.com/) is enabled for the current project.
4 changes: 4 additions & 0 deletions docs/resources/google_compute_backend_services.md
@@ -42,3 +42,7 @@ See [google_compute_backend_service.md](google_compute_backend_service.md) for m
## Filter Criteria
This resource supports all of the above properties as filter criteria, which can be used
with `where` as a block or a method.

## GCP Permissions

Ensure the [Compute Engine API](https://console.cloud.google.com/apis/library/compute.googleapis.com/) is enabled for the current project.
6 changes: 6 additions & 0 deletions docs/resources/google_compute_disk.md
@@ -91,3 +91,9 @@ Properties that can be accessed from the `google_compute_disk` resource:
* `sha256`: The RFC 4648 base64 encoded SHA-256 hash of the customer-supplied encryption key that protects this resource.

* `source_snapshot_id`: The unique ID of the snapshot used to create this disk. This value identifies the exact snapshot that was used to create this persistent disk. For example, if you created the persistent disk from a snapshot that was later deleted and recreated under the same name, the source snapshot ID would identify the exact version of the snapshot that was used.



## GCP Permissions

Ensure the [Compute Engine API](https://console.cloud.google.com/apis/library/compute.googleapis.com/) is enabled for the current project.
4 changes: 4 additions & 0 deletions docs/resources/google_compute_disks.md
@@ -44,3 +44,7 @@ See [google_compute_disk.md](google_compute_disk.md) for more detailed informati
## Filter Criteria
This resource supports all of the above properties as filter criteria, which can be used
with `where` as a block or a method.

## GCP Permissions

Ensure the [Compute Engine API](https://console.cloud.google.com/apis/library/compute.googleapis.com/) is enabled for the current project.
6 changes: 6 additions & 0 deletions docs/resources/google_compute_global_address.md
@@ -36,3 +36,9 @@ Properties that can be accessed from the `google_compute_global_address` resourc
* `region`: A reference to the region where the regional address resides.

* `address_type`: The type of the address to reserve, default is EXTERNAL. * EXTERNAL indicates public/external single IP address. * INTERNAL indicates internal IP ranges belonging to some network.



## GCP Permissions

Ensure the [Compute Engine API](https://console.cloud.google.com/apis/library/compute.googleapis.com/) is enabled for the current project.
4 changes: 4 additions & 0 deletions docs/resources/google_compute_global_addresses.md
@@ -31,3 +31,7 @@ See [google_compute_global_address.md](google_compute_global_address.md) for mor
## Filter Criteria
This resource supports all of the above properties as filter criteria, which can be used
with `where` as a block or a method.

## GCP Permissions

Ensure the [Compute Engine API](https://console.cloud.google.com/apis/library/compute.googleapis.com/) is enabled for the current project.
6 changes: 6 additions & 0 deletions docs/resources/google_compute_global_forwarding_rule.md
@@ -51,3 +51,9 @@ Properties that can be accessed from the `google_compute_global_forwarding_rule`
* `region`: A reference to the region where the regional forwarding rule resides. This field is not applicable to global forwarding rules.

* `target`: This target must be a global load balancing resource. The forwarded traffic must be of a type appropriate to the target object. Valid types: HTTP_PROXY, HTTPS_PROXY, SSL_PROXY, TCP_PROXY



## GCP Permissions

Ensure the [Compute Engine API](https://console.cloud.google.com/apis/library/compute.googleapis.com/) is enabled for the current project.
4 changes: 4 additions & 0 deletions docs/resources/google_compute_global_forwarding_rules.md
@@ -37,3 +37,7 @@ See [google_compute_global_forwarding_rule.md](google_compute_global_forwarding_
## Filter Criteria
This resource supports all of the above properties as filter criteria, which can be used
with `where` as a block or a method.

## GCP Permissions

Ensure the [Compute Engine API](https://console.cloud.google.com/apis/library/compute.googleapis.com/) is enabled for the current project.
6 changes: 6 additions & 0 deletions docs/resources/google_compute_health_check.md
@@ -91,3 +91,9 @@ Properties that can be accessed from the `google_compute_health_check` resource:
* `portName`: Port name as defined in InstanceGroup#NamedPort#name. If both port and port_name are defined, port takes precedence.

* `proxyHeader`: Specifies the type of proxy header to append before sending data to the backend, either NONE or PROXY_V1. The default is NONE.



## GCP Permissions

Ensure the [Compute Engine API](https://console.cloud.google.com/apis/library/compute.googleapis.com/) is enabled for the current project.
4 changes: 4 additions & 0 deletions docs/resources/google_compute_health_checks.md
@@ -35,3 +35,7 @@ See [google_compute_health_check.md](google_compute_health_check.md) for more de
## Filter Criteria
This resource supports all of the above properties as filter criteria, which can be used
with `where` as a block or a method.

## GCP Permissions

Ensure the [Compute Engine API](https://console.cloud.google.com/apis/library/compute.googleapis.com/) is enabled for the current project.
6 changes: 6 additions & 0 deletions docs/resources/google_compute_http_health_check.md
@@ -44,3 +44,9 @@ Properties that can be accessed from the `google_compute_http_health_check` reso
* `timeout_sec`: How long (in seconds) to wait before claiming failure. The default value is 5 seconds. It is invalid for timeoutSec to have a greater value than checkIntervalSec.

* `unhealthy_threshold`: A so-far healthy instance will be marked unhealthy after this many consecutive failures. The default value is 2.



## GCP Permissions

Ensure the [Compute Engine API](https://console.cloud.google.com/apis/library/compute.googleapis.com/) is enabled for the current project.
4 changes: 4 additions & 0 deletions docs/resources/google_compute_http_health_checks.md
@@ -34,3 +34,7 @@ See [google_compute_http_health_check.md](google_compute_http_health_check.md) f
## Filter Criteria
This resource supports all of the above properties as filter criteria, which can be used
with `where` as a block or a method.

## GCP Permissions

Ensure the [Compute Engine API](https://console.cloud.google.com/apis/library/compute.googleapis.com/) is enabled for the current project.
6 changes: 6 additions & 0 deletions docs/resources/google_compute_https_health_check.md
@@ -45,3 +45,9 @@ Properties that can be accessed from the `google_compute_https_health_check` res
* `timeout_sec`: How long (in seconds) to wait before claiming failure. The default value is 5 seconds. It is invalid for timeoutSec to have a greater value than checkIntervalSec.

* `unhealthy_threshold`: A so-far healthy instance will be marked unhealthy after this many consecutive failures. The default value is 2.



## GCP Permissions

Ensure the [Compute Engine API](https://console.cloud.google.com/apis/library/compute.googleapis.com/) is enabled for the current project.
4 changes: 4 additions & 0 deletions docs/resources/google_compute_https_health_checks.md
@@ -34,3 +34,7 @@ See [google_compute_https_health_check.md](google_compute_https_health_check.md)
## Filter Criteria
This resource supports all of the above properties as filter criteria, which can be used
with `where` as a block or a method.

## GCP Permissions

Ensure the [Compute Engine API](https://console.cloud.google.com/apis/library/compute.googleapis.com/) is enabled for the current project.