feat: View rework part 2 (#3021)
- adjust SDK
    - support modifying data metric functions
- adjust data metric schedule handling - previously it was generated in an incorrect format
- add `data_metric_schedule` and `data_metric_functions` to the resource (see the configuration sketch below)
- remove `or_replace` (it is applied automatically when `copy_grants` is set) and `tag`, with a state upgrader and migration test
- use new id handlers
- add data metric function references to the SDK and use them in the acceptance test helpers
- remove `USE WAREHOUSE` setup from tests
- use `sdk.PolicyKind` for handling policy kinds in policy references
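
A minimal configuration sketch of the new blocks (identifiers are placeholders; values mirror the example in the updated resource documentation below):

```terraform
resource "snowflake_view" "example" {
  database  = "EXAMPLE_DB" # placeholder identifiers
  schema    = "EXAMPLE_SCHEMA"
  name      = "EXAMPLE_VIEW"
  statement = "select id from example_table"

  # attach a data metric function to the view's columns
  data_metric_function {
    function_name = "data_metric_function"
    on            = ["id"]
  }

  # run the attached data metric functions on a cron schedule
  data_metric_schedule {
    using_cron = "15 * * * * UTC"
  }
}
```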

## Test Plan
* [x] acceptance tests

## References

https://docs.snowflake.com/en/sql-reference/sql/create-view#examples

https://docs.snowflake.com/en/sql-reference/functions/data_metric_function_references

## TODO (next pr)
- add `columns`
- handle data metric function's schedule status
- adjust views data source

---------

Co-authored-by: Jan Cieślak <[email protected]>
sfc-gh-jmichalak and sfc-gh-jcieslak authored Aug 29, 2024
1 parent 1772387 commit e05377d
Showing 40 changed files with 1,307 additions and 506 deletions.
7 changes: 5 additions & 2 deletions MIGRATION_GUIDE.md
@@ -30,13 +30,16 @@ New fields:
- `change_tracking`
- `is_recursive`
- `is_temporary`
- `data_metric_schedule`
- `data_metric_function`
- added `show_output` field that holds the response from SHOW VIEWS.
- added `describe_output` field that holds the response from DESCRIBE VIEW. Note that one needs to grant sufficient privileges, e.g. with [grant_ownership](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/latest/docs/resources/grant_ownership), on the tables used in this view. Otherwise, this field is not filled.

#### *(breaking change)* Removed fields from snowflake_view resource
Removed fields:
- `tag`
The value of this field will be removed from the state automatically. Please, use [tag_association](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/latest/docs/resources/tag_association) instead.
- `or_replace` - `OR REPLACE` is added by the provider automatically when `copy_grants` is set to `"true"`
- `tag` - Please, use [tag_association](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/latest/docs/resources/tag_association) instead.
The value of these fields will be removed from the state automatically.
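
As an illustrative sketch (placeholder identifiers): a view that previously set `or_replace = true` together with `copy_grants` now only keeps `copy_grants`, since the provider adds `OR REPLACE` on its own.

```terraform
resource "snowflake_view" "example" {
  database    = "EXAMPLE_DB" # placeholder identifiers
  schema      = "EXAMPLE_SCHEMA"
  name        = "EXAMPLE_VIEW"
  copy_grants = true # OR REPLACE is applied automatically by the provider
  statement   = "select id from example_table"
}
```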

#### *(breaking change)* Required warehouse
For this resource, the provider now uses [policy references](https://docs.snowflake.com/en/sql-reference/functions/policy_references), which requires a warehouse in the connection. Please make sure you have either set a DEFAULT_WAREHOUSE for the user or specified a warehouse in the provider configuration.
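
For example, a warehouse can be specified in the provider configuration (a sketch; `EXAMPLE_WH` is a placeholder, and setting a DEFAULT_WAREHOUSE on the user works as well):

```terraform
provider "snowflake" {
  # connection settings omitted; the warehouse is used for metadata queries
  # such as the POLICY_REFERENCES table function
  warehouse = "EXAMPLE_WH"
}
```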
37 changes: 31 additions & 6 deletions docs/resources/view.md
@@ -38,7 +38,7 @@ resource "snowflake_view" "view" {
select * from foo;
SQL
}
# resource with attached policies
# resource with attached policies and data metric functions
resource "snowflake_view" "test" {
database = "database"
schema = "schema"
@@ -55,8 +55,15 @@ resource "snowflake_view" "test" {
policy_name = "aggregation_policy"
entity_key = ["id"]
}
data_metric_function {
function_name = "data_metric_function"
on = ["id"]
}
data_metric_schedule {
using_cron = "15 * * * * UTC"
}
statement = <<-SQL
select id from foo;
SELECT id FROM TABLE;
SQL
}
```
@@ -78,11 +85,12 @@ SQL
- `aggregation_policy` (Block List, Max: 1) Specifies the aggregation policy to set on a view. (see [below for nested schema](#nestedblock--aggregation_policy))
- `change_tracking` (String) Specifies to enable or disable change tracking on the table. Available options are: "true" or "false". When the value is not set in the configuration the provider will put "default" there which means to use the Snowflake default for this value.
- `comment` (String) Specifies a comment for the view.
- `copy_grants` (Boolean) Retains the access permissions from the original view when a new view is created using the OR REPLACE clause. OR REPLACE must be set when COPY GRANTS is set.
- `copy_grants` (Boolean) Retains the access permissions from the original view when a new view is created using the OR REPLACE clause.
- `data_metric_function` (Block Set) Data metric functions used for the view. (see [below for nested schema](#nestedblock--data_metric_function))
- `data_metric_schedule` (Block List, Max: 1) Specifies the schedule to run the data metric functions periodically. (see [below for nested schema](#nestedblock--data_metric_schedule))
- `is_recursive` (String) Specifies that the view can refer to itself using recursive syntax without necessarily using a CTE (common table expression). Available options are: "true" or "false". When the value is not set in the configuration the provider will put "default" there which means to use the Snowflake default for this value.
- `is_secure` (String) Specifies that the view is secure. By design, Snowflake's `SHOW VIEWS` command does not provide information about secure views (consult [view usage notes](https://docs.snowflake.com/en/sql-reference/sql/create-view#usage-notes)), which is essential to manage/import a view with Terraform. Use the role owning the view while managing secure views. Available options are: "true" or "false". When the value is not set in the configuration the provider will put "default" there which means to use the Snowflake default for this value.
- `is_temporary` (String) Specifies that the view persists only for the duration of the session that you created it in. A temporary view and all its contents are dropped at the end of the session. In the context of this provider, it means that the view is dropped after a Terraform operation. This results in a permanent plan with object creation. Available options are: "true" or "false". When the value is not set in the configuration the provider will put "default" there which means to use the Snowflake default for this value.
- `or_replace` (Boolean) Overwrites the View if it exists.
- `row_access_policy` (Block List, Max: 1) Specifies the row access policy to set on a view. (see [below for nested schema](#nestedblock--row_access_policy))

### Read-Only
@@ -104,6 +112,24 @@ Optional:
- `entity_key` (Set of String) Defines which columns uniquely identify an entity within the view.


<a id="nestedblock--data_metric_function"></a>
### Nested Schema for `data_metric_function`

Required:

- `function_name` (String) Identifier of the data metric function to add to the table or view or drop from the table or view. This function identifier must be provided without arguments in parentheses.
- `on` (Set of String) The table or view columns on which to associate the data metric function. The data types of the columns must match the data types of the columns specified in the data metric function definition.


<a id="nestedblock--data_metric_schedule"></a>
### Nested Schema for `data_metric_schedule`

Optional:

- `minutes` (Number) Specifies an interval (in minutes) of wait time inserted between runs of the data metric function. Conflicts with `using_cron`. Valid values are: `5` | `15` | `30` | `60` | `720` | `1440`. Due to Snowflake limitations, changes in this field are not managed by the provider. Please consider using the [taint](https://developer.hashicorp.com/terraform/cli/commands/taint) command, the `using_cron` field, or the [replace_triggered_by](https://developer.hashicorp.com/terraform/language/meta-arguments/lifecycle#replace_triggered_by) metadata argument.
- `using_cron` (String) Specifies a cron expression and time zone for periodically running the data metric function. Supports a subset of standard cron utility syntax. Conflicts with `minutes`.
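
For instance, an interval-based schedule (instead of cron) might look like this sketch, using one of the listed valid values:

```terraform
data_metric_schedule {
  minutes = 15 # valid values: 5, 15, 30, 60, 720, 1440; conflicts with using_cron
}
```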


<a id="nestedblock--row_access_policy"></a>
### Nested Schema for `row_access_policy`

@@ -156,6 +182,5 @@ Read-Only:
Import is supported using the following syntax:

```shell
# format is database name | schema name | view name
terraform import snowflake_view.example 'dbName|schemaName|viewName'
terraform import snowflake_view.example '"<database_name>"."<schema_name>"."<view_name>"'
```
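
Equivalently, on Terraform 1.5 or newer the new quoted identifier format can be used in an `import` block (a sketch with placeholder identifiers):

```terraform
import {
  to = snowflake_view.example
  id = "\"EXAMPLE_DB\".\"EXAMPLE_SCHEMA\".\"EXAMPLE_VIEW\""
}
```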
3 changes: 1 addition & 2 deletions examples/resources/snowflake_view/import.sh
@@ -1,2 +1 @@
# format is database name | schema name | view name
terraform import snowflake_view.example 'dbName|schemaName|viewName'
terraform import snowflake_view.example '"<database_name>"."<schema_name>"."<view_name>"'
11 changes: 9 additions & 2 deletions examples/resources/snowflake_view/resource.tf
@@ -18,7 +18,7 @@ resource "snowflake_view" "view" {
select * from foo;
SQL
}
# resource with attached policies
# resource with attached policies and data metric functions
resource "snowflake_view" "test" {
database = "database"
schema = "schema"
@@ -35,7 +35,14 @@ resource "snowflake_view" "test" {
policy_name = "aggregation_policy"
entity_key = ["id"]
}
data_metric_function {
function_name = "data_metric_function"
on = ["id"]
}
data_metric_schedule {
using_cron = "15 * * * * UTC"
}
statement = <<-SQL
select id from foo;
SELECT id FROM TABLE;
SQL
}

4 changes: 2 additions & 2 deletions pkg/acceptance/bettertestspoc/config/model/view_model_gen.go


27 changes: 5 additions & 22 deletions pkg/acceptance/helpers/data_metric_function_references_client.go
@@ -2,10 +2,10 @@ package helpers

import (
"context"
"fmt"
"testing"

"github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/sdk"
"github.com/stretchr/testify/require"
)

type DataMetricFunctionReferencesClient struct {
@@ -19,29 +19,12 @@ func NewDataMetricFunctionReferencesClient(context *TestClientContext) *DataMetr
}

// GetDataMetricFunctionReferences is based on https://docs.snowflake.com/en/sql-reference/functions/data_metric_function_references.
func (c *DataMetricFunctionReferencesClient) GetDataMetricFunctionReferences(t *testing.T, id sdk.SchemaObjectIdentifier, objectType sdk.ObjectType) ([]DataMetricFunctionReference, error) {
func (c *DataMetricFunctionReferencesClient) GetDataMetricFunctionReferences(t *testing.T, id sdk.SchemaObjectIdentifier, domain sdk.DataMetricFuncionRefEntityDomainOption) []sdk.DataMetricFunctionReference {
t.Helper()
ctx := context.Background()

s := []DataMetricFunctionReference{}
dmfReferencesId := sdk.NewSchemaObjectIdentifier(id.DatabaseName(), "INFORMATION_SCHEMA", "DATA_METRIC_FUNCTION_REFERENCES")
err := c.context.client.QueryForTests(ctx, &s, fmt.Sprintf(`SELECT * FROM TABLE(%s(REF_ENTITY_NAME => '%s', REF_ENTITY_DOMAIN => '%v'))`, dmfReferencesId.FullyQualifiedName(), id.FullyQualifiedName(), objectType))
refs, err := c.context.client.DataMetricFunctionReferences.GetForEntity(ctx, sdk.NewGetForEntityDataMetricFunctionReferenceRequest(id, domain))
require.NoError(t, err)

return s, err
}

type DataMetricFunctionReference struct {
MetricDatabaseName string `db:"METRIC_DATABASE_NAME"`
MetricSchemaName string `db:"METRIC_SCHEMA_NAME"`
MetricName string `db:"METRIC_NAME"`
MetricSignature string `db:"METRIC_SIGNATURE"`
MetricDataType string `db:"METRIC_DATA_TYPE"`
RefEntityDatabaseName string `db:"REF_ENTITY_DATABASE_NAME"`
RefEntitySchemaName string `db:"REF_ENTITY_SCHEMA_NAME"`
RefEntityName string `db:"REF_ENTITY_NAME"`
RefEntityDomain string `db:"REF_ENTITY_DOMAIN"`
RefArguments string `db:"REF_ARGUMENTS"`
RefId string `db:"REF_ID"`
Schedule string `db:"SCHEDULE"`
ScheduleStatus string `db:"SCHEDULE_STATUS"`
return refs
}
2 changes: 1 addition & 1 deletion pkg/acceptance/importchecks/import_checks.go
@@ -46,7 +46,7 @@ func TestCheckResourceAttrInstanceState(id string, attributeName, attributeValue

if attrVal, ok := v.Attributes[attributeName]; ok {
if attrVal != attributeValue {
return fmt.Errorf("expected: %s, got: %s", attributeValue, attrVal)
return fmt.Errorf("invalid value for attribute %s - expected: %s, got: %s", attributeName, attributeValue, attrVal)
}

return nil
8 changes: 1 addition & 7 deletions pkg/datasources/views_acceptance_test.go
@@ -41,23 +41,17 @@ func TestAcc_Views(t *testing.T) {

func views(viewId sdk.SchemaObjectIdentifier) string {
return fmt.Sprintf(`
resource "snowflake_unsafe_execute" "use_warehouse" {
execute = "USE WAREHOUSE \"%v\""
revert = "SELECT 1"
}
resource snowflake_view "v"{
name = "%v"
schema = "%v"
database = "%v"
statement = "SELECT ROLE_NAME, ROLE_OWNER FROM INFORMATION_SCHEMA.APPLICABLE_ROLES where ROLE_OWNER like 'foo%%'"
depends_on = [snowflake_unsafe_execute.use_warehouse]
}
data snowflake_views "v" {
database = snowflake_view.v.database
schema = snowflake_view.v.schema
depends_on = [snowflake_view.v]
}
`, acc.TestWarehouseName, viewId.Name(), viewId.SchemaName(), viewId.DatabaseName())
`, viewId.Name(), viewId.SchemaName(), viewId.DatabaseName())
}
4 changes: 2 additions & 2 deletions pkg/resources/doc_helpers.go
@@ -5,10 +5,10 @@ import (
"strings"
)

func possibleValuesListed[T ~string](values []T) string {
func possibleValuesListed[T ~string | ~int](values []T) string {
valuesWrapped := make([]string, len(values))
for i, value := range values {
valuesWrapped[i] = fmt.Sprintf("`%s`", value)
valuesWrapped[i] = fmt.Sprintf("`%v`", value)
}
return strings.Join(valuesWrapped, " | ")
}
10 changes: 9 additions & 1 deletion pkg/resources/doc_helpers_test.go
@@ -6,14 +6,22 @@ import (
"github.com/stretchr/testify/assert"
)

func Test_PossibleValuesListed(t *testing.T) {
func Test_PossibleValuesListedStrings(t *testing.T) {
values := []string{"abc", "DEF"}

result := possibleValuesListed(values)

assert.Equal(t, "`abc` | `DEF`", result)
}

func Test_PossibleValuesListedInts(t *testing.T) {
values := []int{42, 21}

result := possibleValuesListed(values)

assert.Equal(t, "`42` | `21`", result)
}

func Test_PossibleValuesListed_empty(t *testing.T) {
var values []string

7 changes: 7 additions & 0 deletions pkg/resources/testdata/TestAcc_View/basic_update/test.tf
@@ -11,6 +11,13 @@ resource "snowflake_view" "test" {
policy_name = var.aggregation_policy
entity_key = var.aggregation_policy_entity_key
}
data_metric_function {
function_name = var.data_metric_function
on = var.data_metric_function_on
}
data_metric_schedule {
using_cron = var.data_metric_schedule_using_cron
}
statement = var.statement
comment = var.comment
}
12 changes: 12 additions & 0 deletions pkg/resources/testdata/TestAcc_View/basic_update/variables.tf
@@ -33,3 +33,15 @@ variable "aggregation_policy_entity_key" {
variable "comment" {
type = string
}

variable "data_metric_schedule_using_cron" {
type = string
}

variable "data_metric_function" {
type = string
}

variable "data_metric_function_on" {
type = list(string)
}
17 changes: 8 additions & 9 deletions pkg/resources/testdata/TestAcc_View/complete/test.tf
@@ -4,24 +4,23 @@ resource "snowflake_view" "test" {
database = var.database
schema = var.schema
is_secure = var.is_secure
or_replace = var.or_replace
copy_grants = var.copy_grants
change_tracking = var.change_tracking
is_temporary = var.is_temporary
data_metric_function {
function_name = var.data_metric_function
on = var.data_metric_function_on
}
data_metric_schedule {
using_cron = var.data_metric_schedule_using_cron
}
row_access_policy {
policy_name = var.row_access_policy
on = var.row_access_policy_on

}
aggregation_policy {
policy_name = var.aggregation_policy
entity_key = var.aggregation_policy_entity_key
}
statement = var.statement
depends_on = [snowflake_unsafe_execute.use_warehouse]
}

resource "snowflake_unsafe_execute" "use_warehouse" {
execute = "USE WAREHOUSE \"${var.warehouse}\""
revert = "SELECT 1"
statement = var.statement
}
16 changes: 12 additions & 4 deletions pkg/resources/testdata/TestAcc_View/complete/variables.tf
@@ -22,10 +22,6 @@ variable "change_tracking" {
type = string
}

variable "or_replace" {
type = bool
}

variable "copy_grants" {
type = bool
}
@@ -57,3 +53,15 @@ variable "statement" {
variable "warehouse" {
type = string
}

variable "data_metric_schedule_using_cron" {
type = string
}

variable "data_metric_function" {
type = string
}

variable "data_metric_function_on" {
type = list(string)
}
2 changes: 1 addition & 1 deletion pkg/resources/user_password_policy_attachment.go
@@ -80,7 +80,7 @@ func ReadUserPasswordPolicyAttachment(d *schema.ResourceData, meta any) error {

passwordPolicyReferences := make([]sdk.PolicyReference, 0)
for _, policyReference := range policyReferences {
if policyReference.PolicyKind == "PASSWORD_POLICY" {
if policyReference.PolicyKind == sdk.PolicyKindPasswordPolicy {
passwordPolicyReferences = append(passwordPolicyReferences, policyReference)
}
}