This repository has been archived by the owner on Nov 14, 2024. It is now read-only.

InSpec add support for BigQuery Dataset #104

Merged · 1 commit · Feb 8, 2019
82 changes: 82 additions & 0 deletions docs/resources/google_bigquery_dataset.md
@@ -0,0 +1,82 @@
---
title: About the google_bigquery_dataset resource
platform: gcp
---

## Syntax
A `google_bigquery_dataset` is used to test a Google BigQuery Dataset resource.

## Examples
```
describe google_bigquery_dataset(project: 'chef-gcp-inspec', name: 'inspec_gcp_dataset') do
  it { should exist }

  its('friendly_name') { should eq 'A BigQuery dataset test' }
  its('location') { should eq 'EU' }
  its('description') { should eq 'Test BigQuery dataset description' }
  its('default_table_expiration_ms') { should cmp '3600000' }
end

describe.one do
  google_bigquery_dataset(project: 'chef-gcp-inspec', name: 'inspec_gcp_dataset').access.each do |dataset_access|
    describe dataset_access do
      its('role') { should eq 'READER' }
      its('domain') { should eq 'example.com' }
    end
  end
end

describe.one do
  google_bigquery_dataset(project: 'chef-gcp-inspec', name: 'inspec_gcp_dataset').access.each do |dataset_access|
    describe dataset_access do
      its('role') { should eq 'WRITER' }
      its('special_group') { should eq 'projectWriters' }
    end
  end
end

describe google_bigquery_dataset(project: 'chef-gcp-inspec', name: 'nonexistent') do
  it { should_not exist }
end
```

## Properties
Properties that can be accessed from the `google_bigquery_dataset` resource:

* `name`: Dataset name.

* `access`: Access controls on the dataset.

* `domain`: A domain to grant access to. Any users signed in with the specified domain will be granted the specified access.

* `groupByEmail`: An email address of a Google Group to grant access to.

* `role`: Describes the rights granted to the user specified by the other member of the access object.

* `specialGroup`: A special group to grant access to.

* `userByEmail`: An email address of a user to grant access to. For example: [email protected]

* `view`: A view from a different dataset to grant access to. Queries executed against that view will have read access to tables in this dataset. The role field is not required when this field is set. If that view is updated by any user, access to the view needs to be granted again via an update operation.

* `creation_time`: The time when this dataset was created, in milliseconds since the epoch.

* `dataset_reference`: A reference that identifies the dataset.

* `datasetId`: A unique ID for this dataset, without the project name. The ID must contain only letters (a-z, A-Z), numbers (0-9), or underscores (_). The maximum length is 1,024 characters.

* `projectId`: The ID of the project containing this dataset.

* `default_table_expiration_ms`: The default lifetime of all tables in the dataset, in milliseconds.

* `description`: A user-friendly description of the dataset.

* `friendly_name`: A descriptive name for the dataset.

* `id`: The fully-qualified unique name of the dataset in the format projectId:datasetId. The dataset name without the project name is given in the datasetId field.

* `labels`: The labels associated with this dataset. You can use these to organize and group your datasets.

* `last_modified_time`: The date when this dataset or any of its tables was last modified, in milliseconds since the epoch.

* `location`: The geographic location where the dataset should reside. Possible values include EU and US. The default value is US.
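
The nested `dataset_reference` property is returned as an object rather than a plain hash, so its fields can be checked by chaining the reader methods. A minimal sketch, assuming the same test dataset used in the Examples section above (iterating the `access` array is already shown there):

```
dataset = google_bigquery_dataset(project: 'chef-gcp-inspec', name: 'inspec_gcp_dataset')

# dataset_reference exposes dataset_id and project_id readers
describe dataset.dataset_reference do
  its('dataset_id') { should eq 'inspec_gcp_dataset' }
  its('project_id') { should eq 'chef-gcp-inspec' }
end
```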
30 changes: 30 additions & 0 deletions docs/resources/google_bigquery_datasets.md
@@ -0,0 +1,30 @@
---
title: About the google_bigquery_datasets resource
platform: gcp
---

## Syntax
A `google_bigquery_datasets` is used to test Google BigQuery Dataset resources in bulk.

## Examples
```
describe google_bigquery_datasets(project: 'chef-gcp-inspec') do
  its('count') { should be >= 1 }
  its('friendly_names') { should include 'A BigQuery dataset test' }
  its('locations') { should include 'EU' }
end
```

## Properties
Properties that can be accessed from the `google_bigquery_datasets` resource:

See [google_bigquery_dataset.md](google_bigquery_dataset.md) for more detailed information.

* `dataset_references`: an array of `google_bigquery_dataset` dataset_reference
* `friendly_names`: an array of `google_bigquery_dataset` friendly_name
* `ids`: an array of `google_bigquery_dataset` id
* `labels`: an array of `google_bigquery_dataset` labels
* `locations`: an array of `google_bigquery_dataset` location

## Filter Criteria
This resource supports all of the above properties as filter criteria, which can be used
with `where` as a block or a method.
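
For example, filtering on the `locations` criterion might look like the following. This is a minimal sketch using the method form of `where` (the block form, e.g. `where { locations == 'EU' }`, is equivalent), with the same test project as above:

```
# Keep only datasets whose location is EU, then assert on the filtered set
describe google_bigquery_datasets(project: 'chef-gcp-inspec').where(locations: 'EU') do
  its('count') { should be >= 1 }
  its('friendly_names') { should include 'A BigQuery dataset test' }
end
```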
53 changes: 53 additions & 0 deletions libraries/google/bigquery/property/dataset_access.rb
@@ -0,0 +1,53 @@
# frozen_string_literal: false
Collaborator:
I looked through inspec-gcp and there's currently 48 of these property files (51 after this commit).

Do these need to be separate files? It might be easier to parse (and no require logic!) just to inline these directly into the controls. The property files aren't shared between resources (they're namespaced separately), so there's no real gain to having them as separate files that I can see.

Collaborator:
They don't need to be separate files. I have a personal preference for having a single class in a file, but there is no technical reason they have to be separate. Do you believe it would make it easier to consolidate these within the resource?

I think it could be overwhelming on some of the larger, more nested resources.

Collaborator:
I personally think it would be easier to manage the InSpec codebase at scale with just controls, as opposed to controls + property files.

If you think it'll be quick (and you have time), maybe mock something up so we can see what it'll look like?

Collaborator:
Refactor of nested object structure happening in next PR due to unrelated namespacing issues, can explore there.


# ----------------------------------------------------------------------------
#
# *** AUTO GENERATED CODE *** AUTO GENERATED CODE ***
#
# ----------------------------------------------------------------------------
#
# This file is automatically generated by Magic Modules and manual
# changes will be clobbered when the file is regenerated.
#
# Please read more about how to change this file in README.md and
# CONTRIBUTING.md located at the root of this package.
#
# ----------------------------------------------------------------------------
require 'google/bigquery/property/dataset_view'
module GoogleInSpec
  module BigQuery
    module Property
      class DatasetAccess
        attr_reader :domain

        attr_reader :group_by_email

        attr_reader :role

        attr_reader :special_group

        attr_reader :user_by_email

        attr_reader :view

        def initialize(args = nil)
          return if args.nil?
          @domain = args['domain']
          @group_by_email = args['groupByEmail']
          @role = args['role']
          @special_group = args['specialGroup']
          @user_by_email = args['userByEmail']
          @view = GoogleInSpec::BigQuery::Property::DatasetView.new(args['view'])
        end
      end

      class DatasetAccessArray
        def self.parse(value)
          return if value.nil?
          return DatasetAccess.new(value) unless value.is_a?(::Array)
          value.map { |v| DatasetAccess.new(v) }
        end
      end
    end
  end
end
32 changes: 32 additions & 0 deletions libraries/google/bigquery/property/dataset_dataset_reference.rb
@@ -0,0 +1,32 @@
# frozen_string_literal: false

# ----------------------------------------------------------------------------
#
# *** AUTO GENERATED CODE *** AUTO GENERATED CODE ***
#
# ----------------------------------------------------------------------------
#
# This file is automatically generated by Magic Modules and manual
# changes will be clobbered when the file is regenerated.
#
# Please read more about how to change this file in README.md and
# CONTRIBUTING.md located at the root of this package.
#
# ----------------------------------------------------------------------------
module GoogleInSpec
  module BigQuery
    module Property
      class DatasetDatasetreference
        attr_reader :dataset_id

        attr_reader :project_id

        def initialize(args = nil)
          return if args.nil?
          @dataset_id = args['datasetId']
          @project_id = args['projectId']
        end
      end
    end
  end
end
35 changes: 35 additions & 0 deletions libraries/google/bigquery/property/dataset_view.rb
@@ -0,0 +1,35 @@
# frozen_string_literal: false

# ----------------------------------------------------------------------------
#
# *** AUTO GENERATED CODE *** AUTO GENERATED CODE ***
#
# ----------------------------------------------------------------------------
#
# This file is automatically generated by Magic Modules and manual
# changes will be clobbered when the file is regenerated.
#
# Please read more about how to change this file in README.md and
# CONTRIBUTING.md located at the root of this package.
#
# ----------------------------------------------------------------------------
module GoogleInSpec
  module BigQuery
    module Property
      class DatasetView
        attr_reader :dataset_id

        attr_reader :project_id

        attr_reader :table_id

        def initialize(args = nil)
          return if args.nil?
          @dataset_id = args['datasetId']
          @project_id = args['projectId']
          @table_id = args['tableId']
        end
      end
    end
  end
end
73 changes: 73 additions & 0 deletions libraries/google_bigquery_dataset.rb
@@ -0,0 +1,73 @@
# frozen_string_literal: false

# ----------------------------------------------------------------------------
#
# *** AUTO GENERATED CODE *** AUTO GENERATED CODE ***
#
# ----------------------------------------------------------------------------
#
# This file is automatically generated by Magic Modules and manual
# changes will be clobbered when the file is regenerated.
#
# Please read more about how to change this file in README.md and
# CONTRIBUTING.md located at the root of this package.
#
# ----------------------------------------------------------------------------
require 'gcp_backend'
require 'google/bigquery/property/dataset_access'
require 'google/bigquery/property/dataset_dataset_reference'

# A provider to manage Google Cloud BigQuery resources.
class Dataset < GcpResourceBase
  name 'google_bigquery_dataset'
  desc 'Dataset'
  supports platform: 'gcp'

  attr_reader :name
  attr_reader :access
  attr_reader :creation_time
  attr_reader :dataset_reference
  attr_reader :default_table_expiration_ms
  attr_reader :description
  attr_reader :friendly_name
  attr_reader :id
  attr_reader :labels
  attr_reader :last_modified_time
  attr_reader :location

  def base
    'https://www.googleapis.com/bigquery/v2/'
  end

  def url
    'projects/{{project}}/datasets/{{name}}'
  end

  def initialize(params)
    super(params.merge({ use_http_transport: true }))
    @fetched = @connection.fetch(base, url, params)
    parse unless @fetched.nil?
  end

  def parse
    @name = @fetched['name']
    @access = GoogleInSpec::BigQuery::Property::DatasetAccessArray.parse(@fetched['access'])
    @creation_time = @fetched['creationTime']
    @dataset_reference = GoogleInSpec::BigQuery::Property::DatasetDatasetreference.new(@fetched['datasetReference'])
    @default_table_expiration_ms = @fetched['defaultTableExpirationMs']
    @description = @fetched['description']
    @friendly_name = @fetched['friendlyName']
    @id = @fetched['id']
    @labels = @fetched['labels']
    @last_modified_time = @fetched['lastModifiedTime']
    @location = @fetched['location']
  end

  # Handles parsing RFC3339 time string
  def parse_time_string(time_string)
    time_string ? Time.parse(time_string) : nil
  end

  def exists?
    !@fetched.nil?
  end
end