fix: go casting errors + additional fixes (#2743)
Done in this PR:
- Fixed a few Go casting errors (actually just two, in `external_oauth_integration`).
- Adjusted other casts, such as identifiers, by adding schema validation where a known object type is expected (e.g. `table_id`). Also introduced a parsing function that fails with an error in the Create operation (just in case).
- **Question:** The remaining casts were not adjusted, because they cast `d.Id()` in Update/Read/Delete operations and every single one of them was correct (I hesitated because those casts were not in Create). We could still replace the old parsing call with the new one, so that `id := helpers.DecodeSnowflakeID(d.Id()).(sdk.SomeIdentifier)` becomes `id, err := helpers.SafelyDecodeSnowflakeID[sdk.SomeIdentifier](d.Id())`. Should we do it now, or leave it as is until the identifier rework?
- While going through the resources and data sources, also fixed:
  - Failover groups and unsetting `replication_schedule`.
  - Table constraint, where a required field was marked `optional` (no actual behavior change, because without the `references` block it would crash anyway, but the error now makes more sense with the `required` flag).
  - Resource monitor (followed an IDE hint and added a nil check, as the value could be left unset in the previous lines).
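A minimal sketch of the safe-decoding idea mentioned above, using a stand-in identifier type instead of the provider's real `sdk` package (the function name and error message below are illustrative, not the actual `helpers.SafelyDecodeSnowflakeID` implementation):

```go
package main

import (
	"fmt"
	"strings"
)

// SchemaObjectIdentifier is a stand-in for sdk.SchemaObjectIdentifier:
// a fully qualified `"db"."schema"."name"` identifier.
type SchemaObjectIdentifier struct {
	Database, Schema, Name string
}

// safelyDecodeSchemaObjectID returns an error instead of panicking
// when the stored resource id does not have the expected shape.
func safelyDecodeSchemaObjectID(id string) (SchemaObjectIdentifier, error) {
	parts := strings.Split(strings.ReplaceAll(id, `"`, ""), ".")
	if len(parts) != 3 {
		return SchemaObjectIdentifier{}, fmt.Errorf("id %q is not a valid schema object identifier, expected `\"db\".\"schema\".\"name\"`", id)
	}
	return SchemaObjectIdentifier{Database: parts[0], Schema: parts[1], Name: parts[2]}, nil
}

func main() {
	id, err := safelyDecodeSchemaObjectID(`"db"."schema"."table"`)
	fmt.Println(id, err)
	_, err = safelyDecodeSchemaObjectID("not-an-identifier")
	fmt.Println(err != nil)
}
```

Returning an error instead of panicking lets Create fail with a readable message when the id has the wrong shape, rather than crashing the provider.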

## Test Plan
* [x] acceptance tests added for failover groups and table constraint
* [x] unit and integration tests added for failover groups and the new unset option
* [x] unit tests for the new "decoding" helper function
sfc-gh-jcieslak authored May 7, 2024
1 parent 68d0318 commit 66e380d
Showing 20 changed files with 314 additions and 106 deletions.
21 changes: 21 additions & 0 deletions MIGRATION_GUIDE.md
@@ -13,6 +13,27 @@ While solving issue [#2733](https://github.com/Snowflake-Labs/terraform-provider
#### *(behavior change)* Validation to arguments type added
Diff suppression for `arguments.type` is needed for the same reason as above for `snowflake_table` resource.

### tag_masking_policy_association resource changes
Now the `tag_masking_policy_association` resource accepts only fully qualified names separated by a dot (`.`) instead of a pipe (`|`).

Before
```terraform
resource "snowflake_tag_masking_policy_association" "name" {
tag_id = snowflake_tag.this.id
masking_policy_id = snowflake_masking_policy.example_masking_policy.id
}
```

After
```terraform
resource "snowflake_tag_masking_policy_association" "name" {
tag_id = "\"${snowflake_tag.this.database}\".\"${snowflake_tag.this.schema}\".\"${snowflake_tag.this.name}\""
masking_policy_id = "\"${snowflake_masking_policy.example_masking_policy.database}\".\"${snowflake_masking_policy.example_masking_policy.schema}\".\"${snowflake_masking_policy.example_masking_policy.name}\""
}
```

It's more verbose now, but after the identifier rework it should be similar to the previous form.

## v0.88.0 ➞ v0.89.0
#### *(behavior change)* ForceNew removed
The `ForceNew` field was removed in favor of in-place Update for `name` parameter in:
13 changes: 12 additions & 1 deletion docs/resources/stream.md
@@ -12,14 +12,25 @@ description: |-
## Example Usage

```terraform
resource "snowflake_table" "table" {
database = "database"
schema = "schema"
name = "name"
column {
type = "NUMBER(38,0)"
name = "id"
}
}
resource "snowflake_stream" "stream" {
comment = "A stream."
database = "database"
schema = "schema"
name = "stream"
on_table = "table"
on_table = snowflake_table.table.qualified_name
append_only = false
insert_only = false
}
```
5 changes: 4 additions & 1 deletion docs/resources/table_constraint.md
@@ -125,12 +125,15 @@ resource "snowflake_table_constraint" "unique" {
<a id="nestedblock--foreign_key_properties"></a>
### Nested Schema for `foreign_key_properties`

Required:

- `references` (Block List, Min: 1, Max: 1) The table and columns that the foreign key references. (see [below for nested schema](#nestedblock--foreign_key_properties--references))

Optional:

- `match` (String) The match type for the foreign key. Not applicable for primary/unique keys
- `on_delete` (String) Specifies the action performed when the primary/unique key for the foreign key is deleted. Not applicable for primary/unique keys
- `on_update` (String) Specifies the action performed when the primary/unique key for the foreign key is updated. Not applicable for primary/unique keys
- `references` (Block List, Max: 1) The table and columns that the foreign key references. Not applicable for primary/unique keys (see [below for nested schema](#nestedblock--foreign_key_properties--references))

<a id="nestedblock--foreign_key_properties--references"></a>
### Nested Schema for `foreign_key_properties.references`
4 changes: 2 additions & 2 deletions docs/resources/tag_masking_policy_association.md
@@ -53,8 +53,8 @@ resource "snowflake_masking_policy" "example_masking_policy" {
}
resource "snowflake_tag_masking_policy_association" "name" {
tag_id = snowflake_tag.this.id
masking_policy_id = snowflake_masking_policy.example_masking_policy.id
tag_id = "\"${snowflake_tag.this.database}\".\"${snowflake_tag.this.schema}\".\"${snowflake_tag.this.name}\""
masking_policy_id = "\"${snowflake_masking_policy.example_masking_policy.database}\".\"${snowflake_masking_policy.example_masking_policy.schema}\".\"${snowflake_masking_policy.example_masking_policy.name}\""
}
```

13 changes: 12 additions & 1 deletion examples/resources/snowflake_stream/resource.tf
@@ -1,11 +1,22 @@
resource "snowflake_table" "table" {
database = "database"
schema = "schema"
name = "name"

column {
type = "NUMBER(38,0)"
name = "id"
}
}

resource "snowflake_stream" "stream" {
comment = "A stream."

database = "database"
schema = "schema"
name = "stream"

on_table = "table"
on_table = snowflake_table.table.qualified_name
append_only = false
insert_only = false

}
@@ -39,6 +39,6 @@ resource "snowflake_masking_policy" "example_masking_policy" {
}

resource "snowflake_tag_masking_policy_association" "name" {
tag_id = snowflake_tag.this.id
masking_policy_id = snowflake_masking_policy.example_masking_policy.id
tag_id = "\"${snowflake_tag.this.database}\".\"${snowflake_tag.this.schema}\".\"${snowflake_tag.this.name}\""
masking_policy_id = "\"${snowflake_masking_policy.example_masking_policy.database}\".\"${snowflake_masking_policy.example_masking_policy.schema}\".\"${snowflake_masking_policy.example_masking_policy.name}\""
}
10 changes: 5 additions & 5 deletions pkg/resources/account_password_policy_attachment.go
@@ -13,10 +13,11 @@ import (

var accountPasswordPolicyAttachmentSchema = map[string]*schema.Schema{
"password_policy": {
Type: schema.TypeString,
Required: true,
ForceNew: true,
Description: "Qualified name (`\"db\".\"schema\".\"policy_name\"`) of the password policy to apply to the current account.",
Type: schema.TypeString,
Required: true,
ForceNew: true,
Description: "Qualified name (`\"db\".\"schema\".\"policy_name\"`) of the password policy to apply to the current account.",
ValidateDiagFunc: IsValidIdentifier[sdk.SchemaObjectIdentifier](),
},
}

@@ -45,7 +46,6 @@ func CreateAccountPasswordPolicyAttachment(d *schema.ResourceData, meta interfac
if !ok {
return fmt.Errorf("password_policy %s is not a valid password policy qualified name, expected format: `\"db\".\"schema\".\"policy\"`", d.Get("password_policy"))
}
// passwordPolicy := sdk.NewAccountObjectIdentifier(d.Get("password_policy").(string))

err := client.Accounts.Alter(ctx, &sdk.AlterAccountOptions{
Set: &sdk.AccountSet{
4 changes: 2 additions & 2 deletions pkg/resources/external_oauth_integration.go
@@ -402,7 +402,7 @@ func UpdateExternalOauthIntegration(d *schema.ResourceData, meta interface{}) er
if d.HasChange("snowflake_user_mapping_attribute") {
val, ok := d.GetOk("snowflake_user_mapping_attribute")
if ok {
alterInput.ExternalOauthSnowflakeUserMappingAttribute = val.(snowflake.SFUserMappingAttribute)
alterInput.ExternalOauthSnowflakeUserMappingAttribute = snowflake.SFUserMappingAttribute(val.(string))
alterInput.ExternalOauthSnowflakeUserMappingAttributeOk = true
runAlter = true
} else {
@@ -479,7 +479,7 @@ func UpdateExternalOauthIntegration(d *schema.ResourceData, meta interface{}) er
if d.HasChange("any_role_mode") {
val, ok := d.GetOk("any_role_mode")
if ok {
alterInput.ExternalOauthAnyRoleMode = val.(snowflake.AnyRoleMode)
alterInput.ExternalOauthAnyRoleMode = snowflake.AnyRoleMode(val.(string))
alterInput.ExternalOauthAnyRoleModeOk = true
runAlter = true
} else {
56 changes: 33 additions & 23 deletions pkg/resources/failover_group.go
@@ -438,32 +438,42 @@ func UpdateFailoverGroup(d *schema.ResourceData, meta interface{}) error {
}

if d.HasChange("replication_schedule") {
_, n := d.GetChange("replication_schedule")
replicationSchedule := n.([]interface{})[0].(map[string]interface{})
c := replicationSchedule["cron"].([]interface{})
var updatedReplicationSchedule string
if len(c) > 0 {
cron := c[0].(map[string]interface{})
cronExpression := cron["expression"].(string)
cronExpression = "USING CRON " + cronExpression
if v, ok := cron["time_zone"]; ok {
timeZone := v.(string)
if timeZone != "" {
cronExpression = cronExpression + " " + timeZone
replicationSchedules := d.Get("replication_schedule").([]any)
if len(replicationSchedules) > 0 {
replicationSchedule := replicationSchedules[0].(map[string]any)
crons := replicationSchedule["cron"].([]any)
var updatedReplicationSchedule string
if len(crons) > 0 {
cron := crons[0].(map[string]any)
cronExpression := cron["expression"].(string)
cronExpression = "USING CRON " + cronExpression
if v, ok := cron["time_zone"]; ok {
timeZone := v.(string)
if timeZone != "" {
cronExpression = cronExpression + " " + timeZone
}
}
updatedReplicationSchedule = cronExpression
} else {
updatedReplicationSchedule = fmt.Sprintf("%d MINUTE", replicationSchedule["interval"].(int))
}
err := client.FailoverGroups.AlterSource(ctx, id, &sdk.AlterSourceFailoverGroupOptions{
Set: &sdk.FailoverGroupSet{
ReplicationSchedule: sdk.String(updatedReplicationSchedule),
},
})
if err != nil {
return err
}
updatedReplicationSchedule = cronExpression
} else {
updatedReplicationSchedule = fmt.Sprintf("%d MINUTE", replicationSchedule["interval"].(int))
}

err := client.FailoverGroups.AlterSource(ctx, id, &sdk.AlterSourceFailoverGroupOptions{
Set: &sdk.FailoverGroupSet{
ReplicationSchedule: sdk.String(updatedReplicationSchedule),
},
})
if err != nil {
return err
err := client.FailoverGroups.AlterSource(ctx, id, &sdk.AlterSourceFailoverGroupOptions{
Unset: &sdk.FailoverGroupUnset{
ReplicationSchedule: sdk.Bool(true),
},
})
if err != nil {
return err
}
}
}

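The branching introduced in the hunk above (Set when a `replication_schedule` block is present, Unset when the list is empty) can be sketched in isolation; the types below are stand-ins for the provider's `sdk` structs, not the real API:

```go
package main

import "fmt"

// Stand-ins for the provider's sdk types, for illustration only.
type FailoverGroupSet struct{ ReplicationSchedule *string }
type FailoverGroupUnset struct{ ReplicationSchedule *bool }
type AlterOptions struct {
	Set   *FailoverGroupSet
	Unset *FailoverGroupUnset
}

// buildAlterOptions mirrors the diff's logic: a non-empty
// replication_schedule list produces a Set, an empty one an Unset.
func buildAlterOptions(schedules []map[string]any) AlterOptions {
	if len(schedules) > 0 {
		schedule := fmt.Sprintf("%d MINUTE", schedules[0]["interval"].(int))
		return AlterOptions{Set: &FailoverGroupSet{ReplicationSchedule: &schedule}}
	}
	unset := true
	return AlterOptions{Unset: &FailoverGroupUnset{ReplicationSchedule: &unset}}
}

func main() {
	fmt.Println(buildAlterOptions(nil).Unset != nil)                              // true: empty list unsets
	fmt.Println(buildAlterOptions([]map[string]any{{"interval": 10}}).Set != nil) // true: non-empty list sets
}
```

Before this fix, indexing `n.([]interface{})[0]` on an empty list panicked, so removing the schedule block was impossible.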
24 changes: 24 additions & 0 deletions pkg/resources/failover_group_acceptance_test.go
@@ -169,6 +169,18 @@ func TestAcc_FailoverGroupInterval(t *testing.T) {
resource.TestCheckResourceAttr("snowflake_failover_group.fg", "replication_schedule.0.cron.0.time_zone", "UTC"),
),
},
// Remove replication schedule
{
Config: failoverGroupWithoutReplicationSchedule(randomCharacters, accountName, acc.TestDatabaseName),
Check: resource.ComposeTestCheckFunc(
resource.TestCheckResourceAttr("snowflake_failover_group.fg", "name", randomCharacters),
resource.TestCheckResourceAttr("snowflake_failover_group.fg", "object_types.#", "4"),
resource.TestCheckResourceAttr("snowflake_failover_group.fg", "allowed_accounts.#", "1"),
resource.TestCheckResourceAttr("snowflake_failover_group.fg", "allowed_databases.#", "1"),
resource.TestCheckResourceAttr("snowflake_failover_group.fg", "allowed_integration_types.#", "1"),
resource.TestCheckResourceAttr("snowflake_failover_group.fg", "replication_schedule.#", "0"),
),
},
// Change to Interval
{
Config: failoverGroupWithInterval(randomCharacters, accountName, 10, acc.TestDatabaseName),
@@ -289,6 +301,18 @@ resource "snowflake_failover_group" "fg" {
`, randomCharacters, accountName, databaseName, interval)
}

func failoverGroupWithoutReplicationSchedule(randomCharacters, accountName string, databaseName string) string {
return fmt.Sprintf(`
resource "snowflake_failover_group" "fg" {
name = "%s"
object_types = ["WAREHOUSES","DATABASES", "INTEGRATIONS", "ROLES"]
allowed_accounts= ["%s"]
allowed_databases = ["%s"]
allowed_integration_types = ["SECURITY INTEGRATIONS"]
}
`, randomCharacters, accountName, databaseName)
}

func failoverGroupWithNoWarehouse(randomCharacters, accountName string, interval int) string {
return fmt.Sprintf(`
resource "snowflake_failover_group" "fg" {
2 changes: 1 addition & 1 deletion pkg/resources/resource_monitor.go
@@ -464,7 +464,7 @@ func collectResourceMonitorTriggers(d *schema.ResourceData) []sdk.TriggerDefinit
if v, ok := d.GetOk("suspend_immediate_triggers"); ok {
siTrigs := expandIntList(v.(*schema.Set).List())
for _, threshold := range siTrigs {
if suspendImmediateTrigger == nil || suspendTrigger.Threshold > threshold {
if suspendImmediateTrigger == nil || (suspendTrigger != nil && suspendTrigger.Threshold > threshold) {
suspendImmediateTrigger = &sdk.TriggerDefinition{
Threshold: threshold,
TriggerAction: sdk.TriggerActionSuspendImmediate,
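The guarded comparison added in the hunk above can be isolated into a small predicate; the type here is a stand-in for `sdk.TriggerDefinition`:

```go
package main

import "fmt"

// TriggerDefinition is a stand-in for sdk.TriggerDefinition.
type TriggerDefinition struct{ Threshold int }

// shouldReplaceImmediate mirrors the fixed condition: the immediate
// trigger is (re)assigned when none exists yet, or when the plain
// suspend trigger exists and sits above the candidate threshold.
// The nil check on suspendTrigger is what the fix adds, preventing
// a nil-pointer dereference when no suspend trigger was configured.
func shouldReplaceImmediate(suspendImmediateTrigger, suspendTrigger *TriggerDefinition, threshold int) bool {
	return suspendImmediateTrigger == nil || (suspendTrigger != nil && suspendTrigger.Threshold > threshold)
}

func main() {
	// Previously this case dereferenced a nil suspendTrigger and panicked.
	fmt.Println(shouldReplaceImmediate(&TriggerDefinition{Threshold: 90}, nil, 80))
}
```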
3 changes: 3 additions & 0 deletions pkg/resources/stream.go
@@ -44,6 +44,7 @@ var streamSchema = map[string]*schema.Schema{
Description: "Specifies an identifier for the table the stream will monitor.",
ExactlyOneOf: []string{"on_table", "on_view", "on_stage"},
DiffSuppressFunc: suppressIdentifierQuoting,
ValidateDiagFunc: IsValidIdentifier[sdk.SchemaObjectIdentifier](),
},
"on_view": {
Type: schema.TypeString,
@@ -52,6 +53,7 @@ var streamSchema = map[string]*schema.Schema{
Description: "Specifies an identifier for the view the stream will monitor.",
ExactlyOneOf: []string{"on_table", "on_view", "on_stage"},
DiffSuppressFunc: suppressIdentifierQuoting,
ValidateDiagFunc: IsValidIdentifier[sdk.SchemaObjectIdentifier](),
},
"on_stage": {
Type: schema.TypeString,
@@ -63,6 +65,7 @@ var streamSchema = map[string]*schema.Schema{
// Suppress diff if the stage name is the same, even if database and schema are not specified
return strings.Trim(strings.Split(old, ".")[len(strings.Split(old, "."))-1], "\"") == strings.Trim(strings.Split(new, ".")[len(strings.Split(new, "."))-1], "\"")
},
ValidateDiagFunc: IsValidIdentifier[sdk.SchemaObjectIdentifier](),
},
"append_only": {
Type: schema.TypeBool,
13 changes: 7 additions & 6 deletions pkg/resources/table_constraint.go
@@ -38,10 +38,11 @@ var tableConstraintSchema = map[string]*schema.Schema{
},
},
"table_id": {
Type: schema.TypeString,
Required: true,
ForceNew: true,
Description: `Identifier for table to create constraint on. Format must follow: "\"&lt;db_name&gt;\".\"&lt;schema_name&gt;\".\"&lt;table_name&gt;\"" or "&lt;db_name&gt;.&lt;schema_name&gt;.&lt;table_name&gt;" (snowflake_table.my_table.id)`,
Type: schema.TypeString,
Required: true,
ForceNew: true,
Description: `Identifier for table to create constraint on. Format must follow: "\"&lt;db_name&gt;\".\"&lt;schema_name&gt;\".\"&lt;table_name&gt;\"" or "&lt;db_name&gt;.&lt;schema_name&gt;.&lt;table_name&gt;" (snowflake_table.my_table.id)`,
ValidateDiagFunc: IsValidIdentifier[sdk.SchemaObjectIdentifier](),
},
"columns": {
Type: schema.TypeList,
@@ -112,10 +113,10 @@ var tableConstraintSchema = map[string]*schema.Schema{
Schema: map[string]*schema.Schema{
"references": {
Type: schema.TypeList,
Optional: true,
Required: true,
ForceNew: true,
MaxItems: 1,
Description: "The table and columns that the foreign key references. Not applicable for primary/unique keys",
Description: "The table and columns that the foreign key references.",
Elem: &schema.Resource{
Schema: map[string]*schema.Schema{
"table_id": {
Expand Down
