Add a databricks_table data source #3170
Conversation
I fixed a formatting error and added some basic unit testing. I am not really set up for running integration tests yet. Setting that up seems like a lot of overhead for committing a change that just passes an SDK result through unmodified, so if someone who is already set up for integration tests could help me out, that would be appreciated.
Please add at least a unit test...
Sorry, I did, but forgot to commit it. Should be there now.
Codecov Report
All modified and coverable lines are covered by tests ✅
Additional details and impacted files
@@ Coverage Diff @@
## main #3170 +/- ##
==========================================
+ Coverage 83.43% 83.47% +0.04%
==========================================
Files 176 177 +1
Lines 16223 16258 +35
==========================================
+ Hits 13536 13572 +36
+ Misses 1865 1864 -1
Partials 822 822
Please add a
Please rebase to the latest
conflict should be fixed and switched to
you can always run
Oh, I missed that documentation isn't part of the PR - please add the documentation as
I added a doc file - mostly copy/pasted from the Databricks SDK for Go docs.
@jdavidheiser can you please resolve conflicts in the provider?
fixed
Hi - just circling back on this, hoping it could get merged soon to prevent more conflicts.
	Read:        true,
	NonWritable: true,
	ID:          "_",
}.Apply(t)
you could rewrite this as
ApplyAndExpectData(t, map[string]any{
"catalog_name": "a",
...
})
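To make that suggestion concrete, here is a hedged sketch of how the fixture's closing call could change; the attribute names and expected values are placeholders, since the full test body isn't shown in this diff:

```go
// Before (assumed shape): apply the fixture and assert fields by hand.
//   d, err := qa.ResourceFixture{ /* ... */ }.Apply(t)
//   assert.NoError(t, err)
//   assert.Equal(t, "a", d.Get("catalog_name"))

// After: the fixture applies the data source and checks the resulting state in one call.
qa.ResourceFixture{
	// ... same fixtures and resource as before ...
	Read:        true,
	NonWritable: true,
	ID:          "_",
}.ApplyAndExpectData(t, map[string]any{
	"catalog_name": "a",
	"schema_name":  "b",
	"name":         "c",
})
```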
func TestTableData(t *testing.T) {
	d, err := qa.ResourceFixture{
		Fixtures: []qa.HTTPFixture{
should use MockWorkspaceClientFunc instead of Fixtures
MockWorkspaceClientFunc: func(w *mocks.MockWorkspaceClient) {
	e := w.GetMockTablesAPI().EXPECT()
	e.GetByFullName(mock.Anything, "a.b.c").Return(&catalog.TableInfo{}, nil)
},
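Putting both review suggestions together, a rough sketch of what the whole test could look like follows; the constructor name DataSourceTable and the HCL input attribute are assumptions, since the PR's source file isn't shown here:

```go
func TestTableData(t *testing.T) {
	qa.ResourceFixture{
		MockWorkspaceClientFunc: func(w *mocks.MockWorkspaceClient) {
			e := w.GetMockTablesAPI().EXPECT()
			// Return a minimal TableInfo for the requested three-level name.
			e.GetByFullName(mock.Anything, "a.b.c").Return(&catalog.TableInfo{
				CatalogName: "a",
				SchemaName:  "b",
				Name:        "c",
				TableType:   catalog.TableTypeManaged,
			}, nil)
		},
		Resource:    DataSourceTable(), // assumed constructor name for the new data source
		Read:        true,
		NonWritable: true,
		ID:          "_",
		HCL:         `name = "a.b.c"`, // assumed input attribute
	}.ApplyAndExpectData(t, map[string]any{
		"catalog_name": "a",
		"schema_name":  "b",
		"name":         "c",
	})
}
```

The mocked workspace client avoids spelling out raw HTTP fixtures for the Unity Catalog endpoint.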
I pretty much copy-pasted patterns I saw elsewhere in this repo - I don't really work in golang, so things like this are outside of my comfort zone. I wonder if it might be a lot faster for someone with more familiarity to clean up the test style in a separate pass?
* `storage_location` - Storage root URL for table (for **MANAGED**, **EXTERNAL** tables)
* `table_constraints` - List of table constraints
* `table_id` - Name of table, relative to parent schema
* `table_type` - Table or View
the enum is listed here - https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#TableType
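As a quick illustration of that enum, a hypothetical helper that branches on it is sketched below; only a few of the values are handled, and the full list is in the linked package docs:

```go
// describeTable is a hypothetical helper illustrating a few catalog.TableType values.
// Assumes: import "github.com/databricks/databricks-sdk-go/service/catalog"
func describeTable(tbl *catalog.TableInfo) string {
	switch tbl.TableType {
	case catalog.TableTypeManaged, catalog.TableTypeExternal:
		return "table stored at " + tbl.StorageLocation
	case catalog.TableTypeView:
		return "view defined as " + tbl.ViewDefinition
	default:
		return "other table type: " + string(tbl.TableType)
	}
}
```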
Thank you for this PR. A couple more changes are suggested.
Co-authored-by: vuong-nguyen <[email protected]>
Superseded by #3571
Changes
This adds a new data source to describe a Databricks table, to address the request in #3148. I am not particularly familiar with the Databricks Terraform Provider repo, and not sure which styles are best to mimic. I opted to re-use the struct from the Databricks SDK, rather than implementing a new one. I don't see this pattern a lot in the repo, but it seems like recent changes to the SQL data source may have taken this approach.
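For reference, a minimal sketch of the SDK call this data source passes through is shown below; the table name is a placeholder and authentication is assumed to come from the environment, as it does in the provider:

```go
package main

import (
	"context"
	"fmt"

	"github.com/databricks/databricks-sdk-go"
)

func main() {
	ctx := context.Background()

	// Credentials come from the environment or ~/.databrickscfg, as in the provider.
	w, err := databricks.NewWorkspaceClient()
	if err != nil {
		panic(err)
	}

	// The data source essentially surfaces this catalog.TableInfo struct unmodified.
	tbl, err := w.Tables.GetByFullName(ctx, "main.default.my_table") // placeholder name
	if err != nil {
		panic(err)
	}
	fmt.Println(tbl.TableType, tbl.StorageLocation)
}
```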
Tests
I wanted to see if this approach of using the SDK structs is acceptable before writing tests. Go is not a language I normally work in, so it might take a bit of doing for me to figure out that part.
I did manually test it against a real Databricks workspace and confirmed the return data shows up in my `terraform.tfstate`.
- `make test` run locally
- relevant change in `docs/` folder
- covered with integration tests in `internal/acceptance`