
[Internal] Refactored databricks_zones and databricks_spark_versions data sources to Go SDK #3687

Merged: 5 commits merged into main from refactor/data_zones_spark on Jul 23, 2024

Conversation

@nkvuong (Contributor) commented Jun 20, 2024

Changes

  • LatestSparkVersionOrDefault now returns 11.3 LTS, as 7.3 LTS is deprecated
  • Refactored databricks_zones to Go SDK
  • Refactored databricks_spark_versions to Go SDK. This refactoring requires one additional change to resource.go:
    • Added a new method WorkspaceDataWithCustomizeFunc to allow customization of the data source schema
  • Removed Spark-version-related methods, as these have now moved to the Go SDK. This requires migrating the function LatestSparkVersionOrDefault to a Go SDK method, which in turn requires replacing existing structs in the Terraform provider with their Go SDK equivalents (clusters.SparkVersionsList to compute.GetSparkVersionsResponse, etc.)
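As context for the LatestSparkVersionOrDefault change, the selection logic can be pictured as "newest LTS runtime, else fall back to 11.3 LTS". The sketch below is a minimal, self-contained illustration of that idea; the struct and function names are stand-ins for illustration only, not the provider's or the Go SDK's actual API.

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// SparkVersion is an illustrative stand-in mirroring the shape of the
// Go SDK's compute.SparkVersion (a runtime key plus a display name).
type SparkVersion struct {
	Key  string // e.g. "11.3.x-scala2.12"
	Name string // e.g. "11.3 LTS (includes Apache Spark 3.3.0, Scala 2.12)"
}

// parseKey extracts major and minor numbers from a key like "11.3.x-scala2.12".
func parseKey(key string) (int, int) {
	parts := strings.SplitN(key, ".", 3)
	major, _ := strconv.Atoi(parts[0])
	minor := 0
	if len(parts) > 1 {
		minor, _ = strconv.Atoi(parts[1])
	}
	return major, minor
}

// latestLTSOrDefault picks the newest LTS runtime by (major, minor),
// falling back to 11.3 LTS — the new default replacing the deprecated 7.3 LTS.
func latestLTSOrDefault(versions []SparkVersion) string {
	best := ""
	bestMaj, bestMin := -1, -1
	for _, v := range versions {
		// Only consider plain LTS runtimes (skip ML variants).
		if !strings.Contains(v.Name, "LTS") || strings.Contains(v.Name, "ML") {
			continue
		}
		maj, min := parseKey(v.Key)
		if maj > bestMaj || (maj == bestMaj && min > bestMin) {
			best, bestMaj, bestMin = v.Key, maj, min
		}
	}
	if best == "" {
		return "11.3.x-scala2.12" // fallback default
	}
	return best
}

func main() {
	versions := []SparkVersion{
		{Key: "7.3.x-scala2.12", Name: "7.3 LTS (includes Apache Spark 3.0.1, Scala 2.12)"},
		{Key: "11.3.x-scala2.12", Name: "11.3 LTS (includes Apache Spark 3.3.0, Scala 2.12)"},
		{Key: "13.0.x-scala2.12", Name: "13.0 (includes Apache Spark 3.4.0, Scala 2.12)"},
	}
	fmt.Println(latestLTSOrDefault(versions)) // 11.3.x-scala2.12 (13.0 is not LTS)
	fmt.Println(latestLTSOrDefault(nil))      // 11.3.x-scala2.12 (fallback)
}
```

Numeric comparison of (major, minor) matters here: a plain string sort would order "9.1" above "11.3".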

Tests

  • make test run locally
  • covered with integration tests in internal/acceptance
  • relevant acceptance tests are passing
  • using Go SDK
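The WorkspaceDataWithCustomizeFunc change described above follows a common pattern: generate the data source schema, then hand it to a caller-supplied function for tweaking before registration. The sketch below illustrates that pattern only; all types and names are stand-ins (the real provider builds on terraform-plugin-sdk schema types, not the toy `Schema` struct here).

```go
package main

import "fmt"

// Schema is a toy stand-in for terraform-plugin-sdk's *schema.Schema.
type Schema struct {
	Computed bool
	Required bool
}

// DataSource is a toy stand-in for a registered Terraform data source.
type DataSource struct {
	Schema map[string]*Schema
}

// workspaceData builds a data source from a generated schema map.
func workspaceData(base map[string]*Schema) DataSource {
	return DataSource{Schema: base}
}

// workspaceDataWithCustomizeFunc illustrates the new hook: build the
// schema as usual, then let the caller customize it before use.
func workspaceDataWithCustomizeFunc(base map[string]*Schema, customize func(map[string]*Schema)) DataSource {
	ds := workspaceData(base)
	customize(ds.Schema)
	return ds
}

func main() {
	ds := workspaceDataWithCustomizeFunc(
		map[string]*Schema{"id": {Computed: true}},
		func(m map[string]*Schema) {
			// A data source like databricks_spark_versions could add
			// extra filter fields here without a new constructor.
			m["latest"] = &Schema{Required: false}
		},
	)
	fmt.Println(len(ds.Schema)) // 2
}
```

The design choice is that one constructor serves both the common case (pass a no-op customize func) and data sources that need schema adjustments, instead of duplicating the construction logic.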

@nkvuong nkvuong requested review from a team as code owners June 20, 2024 17:06
@nkvuong nkvuong requested review from tanmay-db and removed request for a team June 20, 2024 17:06
@alexott (Contributor) left a comment


lgtm, small comment

Review comment on clusters/clusters_api_sdk.go (outdated, resolved)
@nkvuong force-pushed the refactor/data_zones_spark branch from 6eff9c9 to dd78ab8 on July 2, 2024 15:59
@alexott alexott changed the title Refactored databricks_zones and databricks_spark_versions data sources to Go SDK [Internal] Refactored databricks_zones and databricks_spark_versions data sources to Go SDK Jul 19, 2024
@nkvuong nkvuong added this pull request to the merge queue Jul 23, 2024
Merged via the queue into main with commit 34bac74 Jul 23, 2024
7 checks passed
@nkvuong nkvuong deleted the refactor/data_zones_spark branch July 23, 2024 10:36
tanmay-db added a commit that referenced this pull request Jul 23, 2024
### Bug Fixes

 * Fixed reading of permissions for SQL objects ([#3800](#3800)).
 * don't update `databricks_metastore` during creation if not required ([#3783](#3783)).

### Documentation

 * Clarified schedule block in `databricks_job` ([#3805](#3805)).
 * Use correct names for isolation mode for storage credentials and external locations ([#3804](#3804)).

### Internal Changes

 * Refactored `databricks_zones` and `databricks_spark_versions` data sources to Go SDK ([#3687](#3687)).

### Exporter

 * Add support for exporting of Lakeview dashboards ([#3779](#3779)).
github-merge-queue bot pushed a commit that referenced this pull request Jul 24, 2024
## 1.49.1

### Bug Fixes
* Fixed reading of permissions for SQL objects ([#3800](#3800)).
* don't update `databricks_metastore` during creation if not required ([#3783](#3783)).

### Documentation
* Clarified schedule block in `databricks_job` ([#3805](#3805)).
* Use correct names for isolation mode for storage credentials and external locations ([#3804](#3804)).
* Fix incomplete note in `databricks_workspace_binding` resource ([#3806](#3806)).

### Internal Changes
* Refactored `databricks_zones` and `databricks_spark_versions` data sources to Go SDK ([#3687](#3687)).

### Exporter
* Add support for exporting of Lakeview dashboards ([#3779](#3779)).
* Adding more retries for SCIM API calls ([#3807](#3807)).
3 participants