
Commit

upd to main
tanmay-db committed Jul 17, 2024
2 parents 26e0384 + 65d1570 commit 37abfa5
Showing 17 changed files with 206 additions and 38 deletions.
30 changes: 30 additions & 0 deletions .github/workflows/message.yml
@@ -0,0 +1,30 @@
name: Validate Commit Message

on:
  pull_request:
    types: [opened, synchronize, edited]
  merge_group:
    types: [checks_requested]

jobs:
  validate:
    runs-on: ubuntu-latest
    # GitHub required checks are shared between PRs and the Merge Queue.
    # Since there is no PR title on Merge Queue, we need to trigger and
    # skip this test for Merge Queue to succeed.
    if: github.event_name == 'pull_request'
    steps:
      - name: Checkout
        uses: actions/checkout@v3
        with:
          fetch-depth: 0

      - name: Validate Tag
        run: |
          TAG=$(echo ${{ github.event.pull_request.title }} | sed -ne 's/\[\(.*\)\].*/\1/p')
          if grep -q "tag: \"\[$TAG\]\"" .codegen/changelog_config.yml; then
            echo "Valid tag found: [$TAG]"
          else
            echo "Invalid or missing tag in commit message: [$TAG]"
            exit 1
          fi
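The `Validate Tag` step extracts the bracketed tag from the PR title with `sed` and checks it against `.codegen/changelog_config.yml`. As a rough sketch, the extraction behaves like this hypothetical Go helper (`extractTag` is illustrative only, not part of the repository):

```go
package main

import (
	"fmt"
	"regexp"
)

// extractTag mimics the workflow's sed expression s/\[\(.*\)\].*/\1/p:
// it captures the text between the first '[' and the last ']' of a PR
// title (greedy match, like sed), returning "" when no tag is present.
func extractTag(title string) string {
	re := regexp.MustCompile(`\[(.*)\]`)
	m := re.FindStringSubmatch(title)
	if m == nil {
		return ""
	}
	return m[1]
}

func main() {
	fmt.Println(extractTag("[Feature] Add dashboard permissions")) // Feature
	fmt.Println(extractTag("no tag here") == "")                   // true
}
```

An empty extraction then fails the `grep` lookup, which is what makes the job exit non-zero for untagged PR titles.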
19 changes: 0 additions & 19 deletions .github/workflows/push.yml
@@ -54,22 +54,3 @@ jobs:
      run: |
        # Exit with status code 1 if there are differences (i.e. unformatted files)
        git diff --exit-code
  commit-message:
    runs-on: ubuntu-latest
    if: ${{ github.event_name == 'pull_request' }}
    steps:
      - name: Checkout
        uses: actions/checkout@v3
        with:
          fetch-depth: 0

      - name: Validate Tag
        run: |
          TAG=$(echo ${{ github.event.pull_request.title }} | sed -ne 's/\[\(.*\)\].*/\1/p')
          if grep -q "tag: \"\[$TAG\]\"" .codegen/changelog_config.yml; then
            echo "Valid tag found: [$TAG]"
          else
            echo "Invalid or missing tag in commit message: [$TAG]"
            exit 1
          fi
8 changes: 6 additions & 2 deletions CHANGELOG.md
@@ -27,15 +27,19 @@
### Internal Changes
* Add Release tag ([#3748](https://github.com/databricks/terraform-provider-databricks/pull/3748)).
* Improve Changelog by grouping changes ([#3747](https://github.com/databricks/terraform-provider-databricks/pull/3747)).
* Upgrade Go SDK to v0.43.2 ([#3750](https://github.com/databricks/terraform-provider-databricks/pull/3750)).
* Add new APIErrorBody struct and update deps ([#3745](https://github.com/databricks/terraform-provider-databricks/pull/3745)).
* Change TF registry ownership ([#3736](https://github.com/databricks/terraform-provider-databricks/pull/3736)).
* Refactored `databricks_cluster(s)` data sources to Go SDK ([#3685](https://github.com/databricks/terraform-provider-databricks/pull/3685)).
* Upgrade databricks-sdk-go ([#3743](https://github.com/databricks/terraform-provider-databricks/pull/3743)).
* Run goreleaser action in snapshot mode from merge queue ([#3646](https://github.com/databricks/terraform-provider-databricks/pull/3646)).
* Make `dashboard_name` random in integration tests for `databricks_dashboard` resource ([#3763](https://github.com/databricks/terraform-provider-databricks/pull/3763)).


## 1.48.3

### Internal Changes
* Bump Go SDK to `0.43.2` ([#3750](https://github.com/databricks/terraform-provider-databricks/pull/3750))
* Added new `APIErrorBody` struct and update deps ([#3745](https://github.com/databricks/terraform-provider-databricks/pull/3745))

## 1.48.2

### New Features and Improvements
Binary file modified docs/resources.png
14 changes: 13 additions & 1 deletion docs/resources/dashboard.md
@@ -57,6 +57,18 @@ In addition to all arguments above, the following attributes are exported:

* `id` - The unique ID of the dashboard.

## Access Control

[databricks_permissions](permissions.md#dashboard-usage) can control which groups or individual users can *Manage*, *Edit*, *Read* or *Run* individual dashboards.

## Import

You can import a `databricks_dashboard` resource using its ID, for example:

```bash
terraform import databricks_dashboard.this <dashboard-id>
```

## Notes
* Only one of `serialized_dashboard` or `file_path` can be used throughout the lifecycle of the dashboard. If you want to switch from one to the other, you must first destroy the dashboard resource and then recreate it with the new attribute.
* Dashboards managed by Terraform will be published automatically.
42 changes: 38 additions & 4 deletions docs/resources/permissions.md
@@ -662,9 +662,9 @@ resource "databricks_permissions" "endpoint_usage" {
}
```

## SQL Dashboard usage
## Dashboard usage

[SQL dashboards](https://docs.databricks.com/sql/user/security/access-control/dashboard-acl.html) have three possible permissions: `CAN_VIEW`, `CAN_RUN` and `CAN_MANAGE`:
[Dashboards](https://docs.databricks.com/en/dashboards/tutorials/manage-permissions.html) have four possible permissions: `CAN_READ`, `CAN_RUN`, `CAN_EDIT` and `CAN_MANAGE`:

```hcl
resource "databricks_group" "auto" {
@@ -675,7 +675,41 @@ resource "databricks_group" "eng" {
display_name = "Engineering"
}
resource "databricks_permissions" "endpoint_usage" {
resource "databricks_dashboard" "dashboard" {
display_name = "TF New Dashboard"
# ...
}
resource "databricks_permissions" "dashboard_usage" {
dashboard_id = databricks_dashboard.dashboard.id
access_control {
group_name = databricks_group.auto.display_name
permission_level = "CAN_RUN"
}
access_control {
group_name = databricks_group.eng.display_name
permission_level = "CAN_MANAGE"
}
}
```

## Legacy SQL Dashboard usage

[Legacy SQL dashboards](https://docs.databricks.com/sql/user/security/access-control/dashboard-acl.html) have three possible permissions: `CAN_VIEW`, `CAN_RUN` and `CAN_MANAGE`:

```hcl
resource "databricks_group" "auto" {
display_name = "Automation"
}
resource "databricks_group" "eng" {
display_name = "Engineering"
}
resource "databricks_permissions" "sql_dashboard_usage" {
sql_dashboard_id = "3244325"
access_control {
@@ -819,7 +853,7 @@ Exactly one of the below arguments is required:

In addition to all arguments above, the following attributes are exported:

- `id` - Canonical unique identifier for the permissions in form of `/object_type/object_id`.
- `id` - Canonical unique identifier for the permissions in form of `/<object type>/<object id>`.
- `object_type` - type of permissions.

## Import
2 changes: 1 addition & 1 deletion exporter/context.go
@@ -1651,7 +1651,7 @@ func (ic *importContext) dataToHcl(i importable, path []string,
// If we have a zero value but there is a non-zero default, we still need to produce it
shouldSkip = false
}
if shouldSkip {
if shouldSkip && (i.ShouldGenerateField == nil || !i.ShouldGenerateField(ic, pathString, as, d)) {
continue
}
switch as.Type {
7 changes: 7 additions & 0 deletions exporter/exporter_test.go
@@ -1917,6 +1917,13 @@ func TestImportingSqlObjects(t *testing.T) {

err := ic.Run()
assert.NoError(t, err)

content, err := os.ReadFile(tmpDir + "/sql-endpoints.tf")
assert.NoError(t, err)
contentStr := string(content)
assert.True(t, strings.Contains(contentStr, `enable_serverless_compute = false`))
assert.True(t, strings.Contains(contentStr, `resource "databricks_sql_endpoint" "test" {`))
assert.False(t, strings.Contains(contentStr, `tags {`))
})
}

25 changes: 25 additions & 0 deletions exporter/importables.go
@@ -1713,6 +1713,25 @@ var resourcesMap map[string]importable = map[string]importable{
return nil
},
Ignore: generateIgnoreObjectWithoutName("databricks_sql_endpoint"),
		ShouldOmitField: func(ic *importContext, pathString string, as *schema.Schema, d *schema.ResourceData) bool {
			switch pathString {
			case "enable_serverless_compute":
				return false
			case "tags":
				return d.Get("tags.0.custom_tags.#").(int) == 0
			case "channel.0.name":
				channelName := d.Get(pathString).(string)
				return channelName == "" || channelName == "CHANNEL_NAME_CURRENT"
			case "channel":
				channelName := d.Get(pathString + ".0.name").(string)
				return channelName == "" || channelName == "CHANNEL_NAME_CURRENT"
			}
			return defaultShouldOmitFieldFunc(ic, pathString, as, d)
		},
		ShouldGenerateField: func(ic *importContext, pathString string, as *schema.Schema, d *schema.ResourceData) bool {
			// We need to generate it even if it's false...
			return pathString == "enable_serverless_compute"
		},
	},
"databricks_sql_global_config": {
WorkspaceLevel: true,
@@ -1739,6 +1758,12 @@
}
return nil
},
		ShouldOmitField: func(ic *importContext, pathString string, as *schema.Schema, d *schema.ResourceData) bool {
			if pathString == "enable_serverless_compute" {
				return false
			}
			return defaultShouldOmitFieldFunc(ic, pathString, as, d)
		},
Depends: []reference{
{Path: "instance_profile_arn", Resource: "databricks_instance_profile"},
},
2 changes: 2 additions & 0 deletions exporter/model.go
@@ -153,6 +153,8 @@ type importable struct {
Ignore func(ic *importContext, r *resource) bool
// Function to check if the field in the given resource should be omitted or not
ShouldOmitField func(ic *importContext, pathString string, as *schema.Schema, d *schema.ResourceData) bool
// Function to check if the field in the given resource should be generated or not independently of the value
ShouldGenerateField func(ic *importContext, pathString string, as *schema.Schema, d *schema.ResourceData) bool
// Defines which API version should be used for this specific resource
ApiVersion common.ApiVersion
// Defines if specific service is account level resource
2 changes: 0 additions & 2 deletions go.sum
@@ -26,8 +26,6 @@ github.com/cloudflare/circl v1.3.7/go.mod h1:sRTcRWXGLrKw6yIGJ+l7amYJFfAXbZG0kBS
github.com/cncf/udpa/go v0.0.0-20191209042840-269d4d468f6f/go.mod h1:M8M6+tZqaGXZJjfX53e64911xZQV5JYwmTeXPW+k8Sc=
github.com/cyphar/filepath-securejoin v0.2.4 h1:Ugdm7cg7i6ZK6x3xDF1oEu1nfkyfH53EtKeQYTC3kyg=
github.com/cyphar/filepath-securejoin v0.2.4/go.mod h1:aPGpWjXOXUn2NCNjFvBE6aRxGGx79pTxQpKOJNYHHl4=
github.com/databricks/databricks-sdk-go v0.43.1 h1:JJJ0S5yiDLQF8dzo6V1O2jKsOAkULtNqrnmFcvHstLg=
github.com/databricks/databricks-sdk-go v0.43.1/go.mod h1:nlzeOEgJ1Tmb5HyknBJ3GEorCZKWqEBoHprvPmTSNq8=
github.com/databricks/databricks-sdk-go v0.43.2 h1:4B+sHAYO5kFqwZNQRmsF70eecqsFX6i/0KfXoDFQT/E=
github.com/databricks/databricks-sdk-go v0.43.2/go.mod h1:nlzeOEgJ1Tmb5HyknBJ3GEorCZKWqEBoHprvPmTSNq8=
github.com/davecgh/go-spew v1.1.0/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
5 changes: 4 additions & 1 deletion internal/acceptance/cluster_test.go
@@ -88,7 +88,10 @@ func awsClusterTemplate(availability string) string {
num_workers = 1
autotermination_minutes = 10
aws_attributes {
availability = "%s"
availability = "%s"
}
custom_tags = {
"Owner" = "[email protected]"
}
node_type_id = "i3.xlarge"
}
12 changes: 11 additions & 1 deletion internal/acceptance/dashboard_test.go
@@ -44,7 +44,17 @@ func makeTemplate(template templateStruct) string {
`, template.EmbedCredentials)
}
templateString += `}
`
resource "databricks_permissions" "dashboard_usage" {
dashboard_id = databricks_dashboard.d1.id
access_control {
group_name = "users"
permission_level = "CAN_READ"
}
}
`

return templateString
}

2 changes: 1 addition & 1 deletion internal/acceptance/restrict_workspace_admins_test.go
@@ -21,7 +21,7 @@ func TestAccRestrictWorkspaceAdminsSetting(t *testing.T) {
}
}
`
workspaceLevel(t, step{
unityWorkspaceLevel(t, step{
Template: template,
Check: resourceCheckWithState("databricks_restrict_workspace_admins_setting.this",
func(ctx context.Context, client *common.DatabricksClient, state *terraform.InstanceState) error {
26 changes: 21 additions & 5 deletions internal/acceptance/sql_endpoint_test.go
@@ -12,18 +12,34 @@ import (

func TestAccSQLEndpoint(t *testing.T) {
workspaceLevel(t, step{
Template: `resource "databricks_sql_endpoint" "this" {
name = "tf-{var.RANDOM}"
cluster_size = "2X-Small"
max_num_clusters = 1
}`,
Template: `
resource "databricks_sql_endpoint" "this" {
name = "tf-{var.RANDOM}"
cluster_size = "2X-Small"
max_num_clusters = 1
tags {
custom_tags {
key = "Owner"
value = "[email protected]"
}
}
}`,
}, step{
Template: `
resource "databricks_sql_endpoint" "that" {
name = "tf-{var.RANDOM}"
cluster_size = "2X-Small"
max_num_clusters = 1
enable_serverless_compute = false
tags {
custom_tags {
key = "Owner"
value = "[email protected]"
}
}
}`,
Check: func(s *terraform.State) error {
w, err := databricks.NewWorkspaceClient()
10 changes: 9 additions & 1 deletion permissions/resource_permissions.go
@@ -4,6 +4,7 @@ import (
"context"
"errors"
"fmt"
"log"
"path"
"strconv"
"strings"
@@ -163,6 +164,7 @@ func (a PermissionsAPI) put(objectID string, objectACL AccessControlChangeList)
// SQLA entities use POST for permission updates.
return a.client.Post(a.context, urlPathForObjectID(objectID), objectACL, nil)
}
log.Printf("[DEBUG] PUT %s %v", objectID, objectACL)
return a.client.Put(a.context, urlPathForObjectID(objectID), objectACL)
}

@@ -262,6 +264,10 @@ func (a PermissionsAPI) Read(objectID string) (objectACL ObjectACL, err error) {
err = apiErr
return
}
if strings.HasPrefix(objectID, "/dashboards/") {
// workaround for inconsistent API response returning object ID of file in the workspace
objectACL.ObjectID = objectID
}
return
}

@@ -306,6 +312,7 @@ func permissionsResourceIDFields() []permissionsIDFieldMapping {
{"sql_dashboard_id", "dashboard", "sql/dashboards", []string{"CAN_EDIT", "CAN_RUN", "CAN_MANAGE", "CAN_VIEW"}, SIMPLE},
{"sql_alert_id", "alert", "sql/alerts", []string{"CAN_EDIT", "CAN_RUN", "CAN_MANAGE", "CAN_VIEW"}, SIMPLE},
{"sql_query_id", "query", "sql/queries", []string{"CAN_EDIT", "CAN_RUN", "CAN_MANAGE", "CAN_VIEW"}, SIMPLE},
{"dashboard_id", "dashboard", "dashboards", []string{"CAN_EDIT", "CAN_RUN", "CAN_MANAGE", "CAN_READ"}, SIMPLE},
{"experiment_id", "mlflowExperiment", "experiments", []string{"CAN_READ", "CAN_EDIT", "CAN_MANAGE"}, SIMPLE},
{"registered_model_id", "registered-model", "registered-models", []string{
"CAN_READ", "CAN_EDIT", "CAN_MANAGE_STAGING_VERSIONS", "CAN_MANAGE_PRODUCTION_VERSIONS", "CAN_MANAGE"}, SIMPLE},
@@ -335,9 +342,10 @@ func (oa *ObjectACL) ToPermissionsEntity(d *schema.ResourceData, me string) (Per
}
}
for _, mapping := range permissionsResourceIDFields() {
if mapping.objectType != oa.ObjectType {
if mapping.objectType != oa.ObjectType || !strings.HasPrefix(oa.ObjectID[1:], mapping.resourceType) {
continue
}
log.Printf("[DEBUG] mapping %v for object %v", mapping, oa)
entity.ObjectType = mapping.objectType
var pathVariant any
if mapping.objectType == "file" {
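Both the legacy `sql_dashboard_id` mapping and the new `dashboard_id` mapping share the object type `dashboard`, which is why `ToPermissionsEntity` now also matches the resource-type prefix of the object ID (`sql/dashboards` vs `dashboards`). A simplified, self-contained sketch of that selection logic (types and names here are illustrative, not the provider's actual API):

```go
package main

import (
	"fmt"
	"strings"
)

// mapping is a stripped-down stand-in for permissionsIDFieldMapping.
type mapping struct {
	field        string // e.g. "dashboard_id"
	objectType   string // e.g. "dashboard"
	resourceType string // e.g. "dashboards" or "sql/dashboards"
}

// pickMapping mirrors the updated loop: an object type match alone is not
// enough; the object ID (minus its leading slash) must also start with the
// mapping's resource type.
func pickMapping(objectID, objectType string, mappings []mapping) (mapping, bool) {
	for _, m := range mappings {
		if m.objectType != objectType || !strings.HasPrefix(objectID[1:], m.resourceType) {
			continue
		}
		return m, true
	}
	return mapping{}, false
}

func main() {
	ms := []mapping{
		{"sql_dashboard_id", "dashboard", "sql/dashboards"},
		{"dashboard_id", "dashboard", "dashboards"},
	}
	m, _ := pickMapping("/dashboards/abc123", "dashboard", ms)
	fmt.Println(m.field) // dashboard_id
	m, _ = pickMapping("/sql/dashboards/3244325", "dashboard", ms)
	fmt.Println(m.field) // sql_dashboard_id
}
```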
