Snowflake Schema Terraform Module


Terraform module for Snowflake schema management.

  • Creates a Snowflake schema
  • Can create custom Snowflake database roles with role-to-role assignments
  • Can create a set of default database roles to simplify access management:
    • READONLY - granted SELECT on all tables and views and USAGE on some objects in the schema
    • READWRITE - granted write-type privileges on tables and stages; additionally allows calling procedures and tasks in the schema
    • TRANSFORMER - allows creating tables and views on top of the READWRITE role privileges
    • ADMIN - full access, including setting schema options like data_retention_time_in_days
    • Each role can be disabled separately by setting enabled to false - see the Complete example
    • Each role can be additionally modified - see the Complete example and the sketch below
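
A minimal, hedged sketch of enabling and adjusting the default roles. The role keys and the enabled attribute follow the description above and are assumptions; the Complete example remains the authoritative reference.

```hcl
# Hypothetical sketch - confirm attribute support against the Complete example.
module "snowflake_schema" {
  source = "github.com/getindata/terraform-snowflake-schema"

  name     = "MY_SCHEMA"
  database = "MY_DB"

  create_default_roles = true

  roles = {
    transformer = { enabled = false }                        # disable one default role
    readonly    = { granted_to_roles = ["MY_ANALYST_ROLE"] } # extend another with an extra grant
  }
}
```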

USAGE

module "snowflake_schema" {
  source = "github.com/getindata/terraform-snowflake-schema"
  # version = "x.x.x"

  name     = "MY_SCHEMA"
  database = "MY_DB"

  is_managed          = false
  is_transient        = false
  data_retention_days = 1

  create_default_database_roles = true
}

EXAMPLES

  • Complete - Advanced usage of the module
  • Simple - Basic usage of the module

Breaking changes in v2.x of the module

Due to breaking changes in the Snowflake provider and additional code optimizations, breaking changes were introduced in v2.0.0 of this module.

List of code and variable (API) changes:

  • Switched to the snowflake_database_role module to leverage the new database roles mechanism
  • The stage module was updated (to v2.1.0) to use the changes newly introduced by the Snowflake provider
  • Overhauled the roles and stages variables
  • The add_grants_to_existing_objects variable was removed, as it is no longer needed

When upgrading from v1.x, expect most of the resources to be recreated. If recreation is not an option, some of the existing resources can be imported instead, as sketched below.
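
A hedged sketch of importing an existing schema into the module's state address using a Terraform import block (requires Terraform >= 1.5). The resource address matches the module's snowflake_schema.this resource listed under Resources, but the import ID format depends on the Snowflake provider version, so verify it against the provider documentation before running.

```hcl
# Hypothetical example - verify the import ID format for your provider version.
import {
  to = module.snowflake_schema.snowflake_schema.this
  id = "\"MY_DB\".\"MY_SCHEMA\""
}
```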

For more information, refer to variables.tf, the list of inputs below, and the Snowflake provider documentation.

Breaking changes in v3.x of the module

Due to the replacement of null-label (context.tf) with the context provider, some breaking changes were introduced in v3.0.0 of this module.

List of code and variable (API) changes:

  • Removed the context.tf file (a single-file module with additional variables), which implied the removal of all its variables (except name):
    • descriptor_formats
    • label_value_case
    • label_key_case
    • id_length_limit
    • regex_replace_chars
    • label_order
    • additional_tag_map
    • tags
    • labels_as_tags
    • attributes
    • delimiter
    • stage
    • environment
    • tenant
    • namespace
    • enabled
    • context
  • Removed support for the enabled flag - this might cause some backward compatibility issues with the Terraform state (appropriate moved blocks were added to minimize the impact), but proceed with caution
  • Additional context provider configuration
  • New variables were added to allow naming configuration via the context provider (see the sketch below):
    • context_templates
    • name_scheme
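
A minimal, hedged sketch of the new naming configuration. The property names used in properties and extra_values are illustrative assumptions; see the context provider documentation for the full behaviour.

```hcl
# Hypothetical sketch of name generation via the context provider variables.
module "snowflake_schema" {
  source = "github.com/getindata/terraform-snowflake-schema"

  name     = "raw"
  database = "MY_DB"

  name_scheme = {
    properties            = ["environment", "name"]
    delimiter             = "_"
    context_template_name = "snowflake-schema"
    replace_chars_regex   = "[^a-zA-Z0-9_]"
    extra_values = {
      environment = "dev"
    }
  }
}
```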

Inputs

Name Description Type Default Required
catalog Parameter that specifies the default catalog to use for Iceberg tables. string null no
comment Specifies a comment for the schema string null no
context_templates Map of context templates used for naming conventions - this variable supersedes the name_scheme.properties and name_scheme.delimiter configuration map(string) {} no
create_default_roles Whether the default database roles should be created bool false no
data_retention_time_in_days Specifies the number of days for which Time Travel actions (CLONE and UNDROP) can be performed on the schema,
as well as specifying the default Time Travel retention time for all tables created in the schema
number 1 no
database Database where the schema should be created string n/a yes
default_ddl_collation Specifies a default collation specification for all schemas and tables added to the database.
It can be overridden on schema or table level.
string null no
enable_console_output Enables console output for user tasks. bool null no
external_volume Parameter that specifies the default external volume to use for Iceberg tables. string null no
is_transient Specifies a schema as transient.
Transient schemas do not have a Fail-safe period so they do not incur additional storage costs once they leave Time Travel;
however, this means they are also not protected by Fail-safe in the event of a data loss.
bool false no
log_level Specifies the severity level of messages that should be ingested and made available in the active event table.
Valid options are: [TRACE DEBUG INFO WARN ERROR FATAL OFF].
Messages at the specified level (and at more severe levels) are ingested.
string null no
max_data_extension_time_in_days Object parameter that specifies the maximum number of days for which Snowflake can extend the data retention period
for tables in the database to prevent streams on the tables from becoming stale.
number null no
name Name of the resource string n/a yes
name_scheme Naming scheme configuration for the resource. This configuration is used to generate names using context provider:
- properties - list of properties to use when creating the name - is superseded by var.context_templates
- delimiter - delimiter used to create the name from properties - is superseded by var.context_templates
- context_template_name - name of the context template used to create the name
- replace_chars_regex - regex to use for replacing characters in property-values created by the provider - any characters that match the regex will be removed from the name
- extra_values - map of extra label-value pairs, used to create a name
object({
properties = optional(list(string), ["name"])
delimiter = optional(string, "_")
context_template_name = optional(string, "snowflake-schema")
replace_chars_regex = optional(string, "[^a-zA-Z0-9_]")
extra_values = optional(map(string))
})
{} no
pipe_execution_paused Pauses the execution of a pipe. bool null no
quoted_identifiers_ignore_case If true, the case of quoted identifiers is ignored. bool null no
replace_invalid_characters Specifies whether to replace invalid UTF-8 characters with the Unicode replacement character (�) in query results for an Iceberg table.
You can only set this parameter for tables that use an external Iceberg catalog.
bool null no
roles Database roles created in the schema scope
map(object({
name_scheme = optional(object({
properties = optional(list(string))
delimiter = optional(string)
context_template_name = optional(string)
replace_chars_regex = optional(string)
extra_labels = optional(map(string))
}))
role_ownership_grant = optional(string)
granted_to_roles = optional(list(string))
granted_to_database_roles = optional(list(string))
granted_database_roles = optional(list(string))
schema_grants = optional(list(object({
all_privileges = optional(bool)
with_grant_option = optional(bool, false)
privileges = optional(list(string), null)
})))
schema_objects_grants = optional(map(list(object({
all_privileges = optional(bool)
with_grant_option = optional(bool)
privileges = optional(list(string), null)
object_name = optional(string)
on_all = optional(bool, false)
on_future = optional(bool, false)
}))), {})
}))
{} no
skip_schema_creation Should schema creation be skipped but allow all other resources to be created.
Useful if the schema already exists but you want to add e.g. access roles.
bool false no
stages Stages to be created in the schema
map(object({
name_scheme = optional(object({
properties = optional(list(string))
delimiter = optional(string)
context_template_name = optional(string)
replace_chars_regex = optional(string)
extra_labels = optional(map(string))
}))
aws_external_id = optional(string)
comment = optional(string)
copy_options = optional(string)
credentials = optional(string)
directory = optional(string)
encryption = optional(string)
file_format = optional(string)
snowflake_iam_user = optional(string)
storage_integration = optional(string)
url = optional(string)
create_default_roles = optional(bool)
roles = optional(map(object({
name_scheme = optional(object({
properties = optional(list(string))
delimiter = optional(string)
context_template_name = optional(string)
replace_chars_regex = optional(string)
extra_labels = optional(map(string))
}))
with_grant_option = optional(bool)
granted_to_roles = optional(list(string))
granted_to_database_roles = optional(list(string))
granted_database_roles = optional(list(string))
stage_grants = optional(list(string))
all_privileges = optional(bool)
})), {})
}))
{} no
storage_serialization_policy The storage serialization policy for Iceberg tables that use Snowflake as the catalog.
Valid options are: [COMPATIBLE OPTIMIZED].
string null no
suspend_task_after_num_failures How many times a task must fail in a row before it is automatically suspended. 0 disables auto-suspending. number null no
task_auto_retry_attempts Maximum automatic retries allowed for a user task. number null no
trace_level Controls how trace events are ingested into the event table.
Valid options are: [ALWAYS ON_EVENT OFF].
string null no
user_task_managed_initial_warehouse_size The initial size of warehouse to use for managed warehouses in the absence of history. string null no
user_task_minimum_trigger_interval_in_seconds Minimum amount of time between Triggered Task executions in seconds. number null no
user_task_timeout_ms User task execution timeout in milliseconds. number null no
with_managed_access Specifies a managed schema. Managed access schemas centralize privilege management with the schema owner bool false no
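
A hedged sketch of the roles and stages variables described above. The role and stage names, the grant lists, and the storage integration are placeholders, and the object-type keys of schema_objects_grants (e.g. "TABLE") are an assumption.

```hcl
# Illustrative only - adjust grants and stage settings to your environment.
module "snowflake_schema" {
  source = "github.com/getindata/terraform-snowflake-schema"

  name     = "MY_SCHEMA"
  database = "MY_DB"

  roles = {
    analyst = {
      granted_to_roles = ["MY_ACCOUNT_ROLE"]
      schema_grants = [{
        privileges = ["USAGE"]
      }]
      schema_objects_grants = {
        "TABLE" = [{
          privileges = ["SELECT"]
          on_all     = true
          on_future  = true
        }]
      }
    }
  }

  stages = {
    raw_data = {
      url                  = "s3://my-bucket/raw/"
      storage_integration  = "MY_S3_INTEGRATION"
      create_default_roles = true
    }
  }
}
```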

Modules

| Name | Source | Version |
|------|--------|---------|
| roles_deep_merge | Invicton-Labs/deepmerge/null | 0.1.5 |
| snowflake_custom_role | getindata/database-role/snowflake | 2.0.1 |
| snowflake_default_role | getindata/database-role/snowflake | 2.0.1 |
| snowflake_stage | getindata/stage/snowflake | 3.0.0 |

Outputs

| Name | Description |
|------|-------------|
| database | Database where the schema is deployed to |
| database_roles | Snowflake Database Roles |
| name | Name of the schema |
| schema_catalog | Catalog for the schema |
| schema_comment | Comment of the schema |
| schema_data_retention_time_in_days | Data retention time in days for the schema |
| schema_database | Database where the schema is deployed to |
| schema_default_ddl_collation | Default DDL collation for the schema |
| schema_enable_console_output | Whether to enable console output for the schema |
| schema_external_volume | External volume for the schema |
| schema_is_transient | Is the schema transient |
| schema_log_level | Log level for the schema |
| schema_max_data_extension_time_in_days | Maximum data extension time in days for the schema |
| schema_pipe_execution_paused | Whether to pause pipe execution for the schema |
| schema_quoted_identifiers_ignore_case | Whether to ignore case for quoted identifiers for the schema |
| schema_replace_invalid_characters | Whether to replace invalid characters for the schema |
| schema_storage_serialization_policy | Storage serialization policy for the schema |
| schema_suspend_task_after_num_failures | Number of task failures after which to suspend the task for the schema |
| schema_task_auto_retry_attempts | Number of task auto retry attempts for the schema |
| schema_trace_level | Trace level for the schema |
| schema_user_task_managed_initial_warehouse_size | User task managed initial warehouse size for the schema |
| schema_user_task_minimum_trigger_interval_in_seconds | User task minimum trigger interval in seconds for the schema |
| schema_user_task_timeout_ms | User task timeout in milliseconds for the schema |
| schema_with_managed_access | Whether the schema has managed access |
| stages | Schema stages |

Providers

| Name | Version |
|------|---------|
| context | >=0.4.0 |
| snowflake | ~> 0.95 |

Requirements

| Name | Version |
|------|---------|
| terraform | >= 1.3 |
| context | >=0.4.0 |
| snowflake | ~> 0.95 |
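
A hedged sketch of the matching provider requirements in the calling configuration. The provider source addresses shown here are assumptions; the module itself pins only the version constraints listed above, so check its versions.tf for the exact sources.

```hcl
terraform {
  required_version = ">= 1.3"

  required_providers {
    # Source addresses are assumptions - verify against the module's versions.tf.
    context = {
      source  = "cloudposse/context"
      version = ">= 0.4.0"
    }
    snowflake = {
      source  = "Snowflake-Labs/snowflake"
      version = "~> 0.95"
    }
  }
}
```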

Resources

| Name | Type |
|------|------|
| snowflake_schema.this | resource |
| context_label.this | data source |

CONTRIBUTING

Contributions are very welcome!

Start by reviewing the contribution guide and our code of conduct. After that, start coding and ship your changes by creating a new PR.

LICENSE

Apache 2 Licensed. See LICENSE for full details.

AUTHORS

Made with contrib.rocks.