---
subcategory: "Deployment"
---
-> **Note** This data source can only be used with an account-level provider!
This data source constructs the necessary AWS cross-account policy for you, based on the official documentation.
For more detailed usage, please see the `databricks_aws_assume_role_policy` or `databricks_aws_s3_mount` pages.
data "databricks_aws_crossaccount_policy" "this" {}
- `policy_type` - (Optional) The type of cross-account policy to generate: `managed` for a Databricks-managed VPC, `customer` for a customer-managed VPC, or `restricted` for a customer-managed VPC with policy restrictions.
- `pass_roles` - (Optional) (List) List of Data IAM role ARNs that are explicitly granted the `iam:PassRole` action.

The arguments below are only valid for the `restricted` policy type (see the sketch after this list):

- `aws_account_id` - Your AWS account ID, which is a number.
- `aws_partition` - (Optional) AWS partition. The options are `aws` or `aws-us-gov`. Defaults to `aws`.
- `vpc_id` - ID of the AWS VPC where you want to launch workspaces.
- `region` - AWS Region name for your VPC deployment, for example `us-west-2`.
- `security_group_id` - ID of your AWS security group. When you add a security group restriction, you cannot reuse the cross-account IAM role or reference a credentials ID (`credentials_id`) for any other workspaces. For those other workspaces, you must create separate roles, policies, and credentials objects.
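For reference, here is a minimal sketch of the `restricted` policy type using the arguments above; every ID is a placeholder you would replace with your own values:

```hcl
# Sketch only: all IDs below are placeholders, not real resources.
data "databricks_aws_crossaccount_policy" "restricted" {
  policy_type       = "restricted"
  aws_account_id    = "123456789012"
  vpc_id            = "vpc-0123456789abcdef0"
  region            = "us-west-2"
  security_group_id = "sg-0123456789abcdef0"
}
```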
In addition to all arguments above, the following attributes are exported:
- `json` - AWS IAM Policy JSON document
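As an illustration (a sketch with a hypothetical policy name, not an official example), the exported `json` document can be wrapped in an AWS IAM policy for the cross-account role:

```hcl
# Sketch: wrap the generated policy document in an IAM policy;
# the policy name is a hypothetical placeholder.
resource "aws_iam_policy" "cross_account" {
  name   = "databricks-crossaccount-policy"
  policy = data.databricks_aws_crossaccount_policy.this.json
}
```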
The following resources are used in the same context:
- Provisioning AWS Databricks workspaces with a Hub & Spoke firewall for data exfiltration protection guide
- `databricks_aws_assume_role_policy` data to construct the necessary AWS STS assume role policy.
- `databricks_aws_bucket_policy` data to configure a simple access policy for AWS S3 buckets, so that Databricks can access data in them.
- `databricks_instance_profile` to manage AWS EC2 instance profiles with which users can launch `databricks_cluster` and access data, for example through `databricks_mount`.