Represents a {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.14.1/docs/resources/google_dataflow_job google_dataflow_job}.
```java
import com.hashicorp.cdktf.providers.google_beta.google_dataflow_job.GoogleDataflowJob;

GoogleDataflowJob.Builder.create(Construct scope, java.lang.String id)
//  .connection(SSHProvisionerConnection)
//  .connection(WinrmProvisionerConnection)
//  .count(java.lang.Number)
//  .count(TerraformCount)
//  .dependsOn(java.util.List<ITerraformDependable>)
//  .forEach(ITerraformIterator)
//  .lifecycle(TerraformResourceLifecycle)
//  .provider(TerraformProvider)
//  .provisioners(java.util.List<FileProvisioner)
//  .provisioners(LocalExecProvisioner)
//  .provisioners(RemoteExecProvisioner>)
    .name(java.lang.String)
    .tempGcsLocation(java.lang.String)
    .templateGcsPath(java.lang.String)
//  .additionalExperiments(java.util.List<java.lang.String>)
//  .enableStreamingEngine(java.lang.Boolean)
//  .enableStreamingEngine(IResolvable)
//  .id(java.lang.String)
//  .ipConfiguration(java.lang.String)
//  .kmsKeyName(java.lang.String)
//  .labels(java.util.Map<java.lang.String, java.lang.String>)
//  .machineType(java.lang.String)
//  .maxWorkers(java.lang.Number)
//  .network(java.lang.String)
//  .onDelete(java.lang.String)
//  .parameters(java.util.Map<java.lang.String, java.lang.String>)
//  .project(java.lang.String)
//  .region(java.lang.String)
//  .serviceAccountEmail(java.lang.String)
//  .skipWaitOnJobTermination(java.lang.Boolean)
//  .skipWaitOnJobTermination(IResolvable)
//  .subnetwork(java.lang.String)
//  .timeouts(GoogleDataflowJobTimeouts)
//  .transformNameMapping(java.util.Map<java.lang.String, java.lang.String>)
//  .zone(java.lang.String)
    .build();
```
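For orientation, here is a minimal usage sketch. Only `name`, `tempGcsLocation`, and `templateGcsPath` are required; everything else falls back to provider or Dataflow defaults. The project, bucket, and template values below are placeholders, not values from this document:

```java
import software.constructs.Construct;
import com.hashicorp.cdktf.TerraformStack;
import com.hashicorp.cdktf.providers.google_beta.google_dataflow_job.GoogleDataflowJob;

public class DataflowStack extends TerraformStack {
    public DataflowStack(Construct scope, String id) {
        super(scope, id);
        // Hypothetical bucket and template paths; substitute your own.
        GoogleDataflowJob.Builder.create(this, "wordcount")
                .name("wordcount-job")                                   // required: unique job name
                .templateGcsPath("gs://my-bucket/templates/word_count")  // required: GCS path to the template
                .tempGcsLocation("gs://my-bucket/tmp")                   // required: writeable temp location
                .onDelete("drain")                                       // drain rather than cancel on destroy
                .maxWorkers(4)
                .build();
    }
}
```

The commented-out builder calls above are optional; uncomment and fill in only the ones your job needs.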
Name | Type | Description |
---|---|---|
scope | software.constructs.Construct | The scope in which to define this construct. |
id | java.lang.String | The scoped construct ID. |
connection | com.hashicorp.cdktf.SSHProvisionerConnection OR com.hashicorp.cdktf.WinrmProvisionerConnection | No description. |
count | java.lang.Number OR com.hashicorp.cdktf.TerraformCount | No description. |
dependsOn | java.util.List<com.hashicorp.cdktf.ITerraformDependable> | No description. |
forEach | com.hashicorp.cdktf.ITerraformIterator | No description. |
lifecycle | com.hashicorp.cdktf.TerraformResourceLifecycle | No description. |
provider | com.hashicorp.cdktf.TerraformProvider | No description. |
provisioners | java.util.List<com.hashicorp.cdktf.FileProvisioner OR com.hashicorp.cdktf.LocalExecProvisioner OR com.hashicorp.cdktf.RemoteExecProvisioner> | No description. |
name | java.lang.String | A unique name for the resource, required by Dataflow. |
tempGcsLocation | java.lang.String | A writeable location on Google Cloud Storage for the Dataflow job to dump its temporary data. |
templateGcsPath | java.lang.String | The Google Cloud Storage path to the Dataflow job template. |
additionalExperiments | java.util.List<java.lang.String> | List of experiments that should be used by the job. An example value is ["enable_stackdriver_agent_metrics"]. |
enableStreamingEngine | java.lang.Boolean OR com.hashicorp.cdktf.IResolvable | Indicates if the job should use the streaming engine feature. |
id | java.lang.String | Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.14.1/docs/resources/google_dataflow_job#id GoogleDataflowJob#id}. |
ipConfiguration | java.lang.String | The configuration for VM IPs. Options are "WORKER_IP_PUBLIC" or "WORKER_IP_PRIVATE". |
kmsKeyName | java.lang.String | The name for the Cloud KMS key for the job. Key format is: projects/PROJECT_ID/locations/LOCATION/keyRings/KEY_RING/cryptoKeys/KEY. |
labels | java.util.Map<java.lang.String, java.lang.String> | User labels to be specified for the job. |
machineType | java.lang.String | The machine type to use for the job. |
maxWorkers | java.lang.Number | The number of workers permitted to work on the job. More workers may improve processing speed at additional cost. |
network | java.lang.String | The network to which VMs will be assigned. If it is not provided, "default" will be used. |
onDelete | java.lang.String | One of "drain" or "cancel". Specifies behavior of deletion during terraform destroy. |
parameters | java.util.Map<java.lang.String, java.lang.String> | Key/Value pairs to be passed to the Dataflow job (as used in the template). |
project | java.lang.String | The project in which the resource belongs. |
region | java.lang.String | The region in which the created job should run. |
serviceAccountEmail | java.lang.String | The Service Account email used to create the job. |
skipWaitOnJobTermination | java.lang.Boolean OR com.hashicorp.cdktf.IResolvable | If true, treat DRAINING and CANCELLING as terminal job states and do not wait for further changes before removing from terraform state and moving on. |
subnetwork | java.lang.String | The subnetwork to which VMs will be assigned. Should be of the form "regions/REGION/subnetworks/SUBNETWORK". |
timeouts | GoogleDataflowJobTimeouts | timeouts block. |
transformNameMapping | java.util.Map<java.lang.String, java.lang.String> | Only applicable when updating a pipeline. |
zone | java.lang.String | The zone in which the created job should run. If it is not provided, the provider zone is used. |
- Type: software.constructs.Construct
The scope in which to define this construct.
- Type: java.lang.String
The scoped construct ID.
Must be unique amongst siblings in the same scope
- Type: com.hashicorp.cdktf.SSHProvisionerConnection OR com.hashicorp.cdktf.WinrmProvisionerConnection
- Type: java.lang.Number OR com.hashicorp.cdktf.TerraformCount
- Type: java.util.List<com.hashicorp.cdktf.ITerraformDependable>
- Type: com.hashicorp.cdktf.ITerraformIterator
- Type: com.hashicorp.cdktf.TerraformResourceLifecycle
- Type: com.hashicorp.cdktf.TerraformProvider
- Type: java.util.List<com.hashicorp.cdktf.FileProvisioner OR com.hashicorp.cdktf.LocalExecProvisioner OR com.hashicorp.cdktf.RemoteExecProvisioner>
- Type: java.lang.String
A unique name for the resource, required by Dataflow.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.14.1/docs/resources/google_dataflow_job#name GoogleDataflowJob#name}
- Type: java.lang.String
A writeable location on Google Cloud Storage for the Dataflow job to dump its temporary data.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.14.1/docs/resources/google_dataflow_job#temp_gcs_location GoogleDataflowJob#temp_gcs_location}
- Type: java.lang.String
The Google Cloud Storage path to the Dataflow job template.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.14.1/docs/resources/google_dataflow_job#template_gcs_path GoogleDataflowJob#template_gcs_path}
- Type: java.util.List<java.lang.String>
List of experiments that should be used by the job. An example value is ["enable_stackdriver_agent_metrics"].
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.14.1/docs/resources/google_dataflow_job#additional_experiments GoogleDataflowJob#additional_experiments}
- Type: java.lang.Boolean OR com.hashicorp.cdktf.IResolvable
Indicates if the job should use the streaming engine feature.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.14.1/docs/resources/google_dataflow_job#enable_streaming_engine GoogleDataflowJob#enable_streaming_engine}
- Type: java.lang.String
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.14.1/docs/resources/google_dataflow_job#id GoogleDataflowJob#id}.
Please be aware that the id field is automatically added to all resources in Terraform providers using a Terraform provider SDK version below 2. If you experience problems setting this value it might not be settable. Please take a look at the provider documentation to ensure it should be settable.
- Type: java.lang.String
The configuration for VM IPs. Options are "WORKER_IP_PUBLIC" or "WORKER_IP_PRIVATE".
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.14.1/docs/resources/google_dataflow_job#ip_configuration GoogleDataflowJob#ip_configuration}
- Type: java.lang.String
The name for the Cloud KMS key for the job. Key format is: projects/PROJECT_ID/locations/LOCATION/keyRings/KEY_RING/cryptoKeys/KEY.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.14.1/docs/resources/google_dataflow_job#kms_key_name GoogleDataflowJob#kms_key_name}
- Type: java.util.Map<java.lang.String, java.lang.String>
User labels to be specified for the job.
Keys and values should follow the restrictions specified in the labeling restrictions page. NOTE: This field is non-authoritative, and will only manage the labels present in your configuration. Please refer to the field 'effective_labels' for all of the labels present on the resource.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.14.1/docs/resources/google_dataflow_job#labels GoogleDataflowJob#labels}
- Type: java.lang.String
The machine type to use for the job.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.14.1/docs/resources/google_dataflow_job#machine_type GoogleDataflowJob#machine_type}
- Type: java.lang.Number
The number of workers permitted to work on the job. More workers may improve processing speed at additional cost.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.14.1/docs/resources/google_dataflow_job#max_workers GoogleDataflowJob#max_workers}
- Type: java.lang.String
The network to which VMs will be assigned. If it is not provided, "default" will be used.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.14.1/docs/resources/google_dataflow_job#network GoogleDataflowJob#network}
- Type: java.lang.String
One of "drain" or "cancel". Specifies behavior of deletion during terraform destroy.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.14.1/docs/resources/google_dataflow_job#on_delete GoogleDataflowJob#on_delete}
- Type: java.util.Map<java.lang.String, java.lang.String>
Key/Value pairs to be passed to the Dataflow job (as used in the template).
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.14.1/docs/resources/google_dataflow_job#parameters GoogleDataflowJob#parameters}
- Type: java.lang.String
The project in which the resource belongs.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.14.1/docs/resources/google_dataflow_job#project GoogleDataflowJob#project}
- Type: java.lang.String
The region in which the created job should run.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.14.1/docs/resources/google_dataflow_job#region GoogleDataflowJob#region}
- Type: java.lang.String
The Service Account email used to create the job.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.14.1/docs/resources/google_dataflow_job#service_account_email GoogleDataflowJob#service_account_email}
- Type: java.lang.Boolean OR com.hashicorp.cdktf.IResolvable
If true, treat DRAINING and CANCELLING as terminal job states and do not wait for further changes before removing from terraform state and moving on.
WARNING: this will lead to job name conflicts if you do not ensure that the job names are different, e.g. by embedding a release ID or by using a random_id.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.14.1/docs/resources/google_dataflow_job#skip_wait_on_job_termination GoogleDataflowJob#skip_wait_on_job_termination}
- Type: java.lang.String
The subnetwork to which VMs will be assigned. Should be of the form "regions/REGION/subnetworks/SUBNETWORK".
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.14.1/docs/resources/google_dataflow_job#subnetwork GoogleDataflowJob#subnetwork}
- Type: GoogleDataflowJobTimeouts
timeouts block.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.14.1/docs/resources/google_dataflow_job#timeouts GoogleDataflowJob#timeouts}
- Type: java.util.Map<java.lang.String, java.lang.String>
Only applicable when updating a pipeline.
Map of transform name prefixes of the job to be replaced with the corresponding name prefixes of the new job.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.14.1/docs/resources/google_dataflow_job#transform_name_mapping GoogleDataflowJob#transform_name_mapping}
- Type: java.lang.String
The zone in which the created job should run. If it is not provided, the provider zone is used.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.14.1/docs/resources/google_dataflow_job#zone GoogleDataflowJob#zone}
Name | Description |
---|---|
toString | Returns a string representation of this construct. |
addOverride | No description. |
overrideLogicalId | Overrides the auto-generated logical ID with a specific ID. |
resetOverrideLogicalId | Resets a previously passed logical Id to use the auto-generated logical id again. |
toHclTerraform | No description. |
toMetadata | No description. |
toTerraform | Adds this resource to the terraform JSON output. |
addMoveTarget | Adds a user defined moveTarget string to this resource to be later used in .moveTo(moveTarget) to resolve the location of the move. |
getAnyMapAttribute | No description. |
getBooleanAttribute | No description. |
getBooleanMapAttribute | No description. |
getListAttribute | No description. |
getNumberAttribute | No description. |
getNumberListAttribute | No description. |
getNumberMapAttribute | No description. |
getStringAttribute | No description. |
getStringMapAttribute | No description. |
hasResourceMove | No description. |
importFrom | No description. |
interpolationForAttribute | No description. |
moveFromId | Move the resource corresponding to "id" to this resource. |
moveTo | Moves this resource to the target resource given by moveTarget. |
moveToId | Moves this resource to the resource corresponding to "id". |
putTimeouts | No description. |
resetAdditionalExperiments | No description. |
resetEnableStreamingEngine | No description. |
resetId | No description. |
resetIpConfiguration | No description. |
resetKmsKeyName | No description. |
resetLabels | No description. |
resetMachineType | No description. |
resetMaxWorkers | No description. |
resetNetwork | No description. |
resetOnDelete | No description. |
resetParameters | No description. |
resetProject | No description. |
resetRegion | No description. |
resetServiceAccountEmail | No description. |
resetSkipWaitOnJobTermination | No description. |
resetSubnetwork | No description. |
resetTimeouts | No description. |
resetTransformNameMapping | No description. |
resetZone | No description. |
public java.lang.String toString()
Returns a string representation of this construct.
public void addOverride(java.lang.String path, java.lang.Object value)
- Type: java.lang.String
- Type: java.lang.Object
public void overrideLogicalId(java.lang.String newLogicalId)
Overrides the auto-generated logical ID with a specific ID.
- Type: java.lang.String
The new logical ID to use for this stack element.
public void resetOverrideLogicalId()
Resets a previously passed logical Id to use the auto-generated logical id again.
public java.lang.Object toHclTerraform()
public java.lang.Object toMetadata()
public java.lang.Object toTerraform()
Adds this resource to the terraform JSON output.
public void addMoveTarget(java.lang.String moveTarget)
Adds a user defined moveTarget string to this resource to be later used in .moveTo(moveTarget) to resolve the location of the move.
- Type: java.lang.String
The string move target that will correspond to this resource.
public java.util.Map<java.lang.String, java.lang.Object> getAnyMapAttribute(java.lang.String terraformAttribute)
- Type: java.lang.String
public IResolvable getBooleanAttribute(java.lang.String terraformAttribute)
- Type: java.lang.String
public java.util.Map<java.lang.String, java.lang.Boolean> getBooleanMapAttribute(java.lang.String terraformAttribute)
- Type: java.lang.String
public java.util.List<java.lang.String> getListAttribute(java.lang.String terraformAttribute)
- Type: java.lang.String
public java.lang.Number getNumberAttribute(java.lang.String terraformAttribute)
- Type: java.lang.String
public java.util.List<java.lang.Number> getNumberListAttribute(java.lang.String terraformAttribute)
- Type: java.lang.String
public java.util.Map<java.lang.String, java.lang.Number> getNumberMapAttribute(java.lang.String terraformAttribute)
- Type: java.lang.String
public java.lang.String getStringAttribute(java.lang.String terraformAttribute)
- Type: java.lang.String
public java.util.Map<java.lang.String, java.lang.String> getStringMapAttribute(java.lang.String terraformAttribute)
- Type: java.lang.String
public TerraformResourceMoveByTarget OR TerraformResourceMoveById hasResourceMove()
public void importFrom(java.lang.String id)
public void importFrom(java.lang.String id, TerraformProvider provider)
- Type: java.lang.String
- Type: com.hashicorp.cdktf.TerraformProvider
public IResolvable interpolationForAttribute(java.lang.String terraformAttribute)
- Type: java.lang.String
public void moveFromId(java.lang.String id)
Move the resource corresponding to "id" to this resource.
Note that the resource being moved from must be marked as moved using its instance function.
- Type: java.lang.String
Full id of resource being moved from, e.g. "aws_s3_bucket.example".
public void moveTo(java.lang.String moveTarget)
public void moveTo(java.lang.String moveTarget, java.lang.String OR java.lang.Number index)
Moves this resource to the target resource given by moveTarget.
- Type: java.lang.String
The previously set user defined string set by .addMoveTarget() corresponding to the resource to move to.
- Type: java.lang.String OR java.lang.Number
Optional. The index corresponding to the key the resource is to appear in the foreach of a resource to move to.
public void moveToId(java.lang.String id)
Moves this resource to the resource corresponding to "id".
- Type: java.lang.String
Full id of resource to move to, e.g. "aws_s3_bucket.example".
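Taken together, the move helpers above support refactoring without destroying the underlying job. A sketch of the workflow (this assumes `stack` is an existing TerraformStack and `oldJob` is the GoogleDataflowJob instance being renamed; all names are illustrative):

```java
// Define the destination resource under its new construct id.
GoogleDataflowJob target = GoogleDataflowJob.Builder.create(stack, "job-v2")
        .name("pipeline-v2")                                    // hypothetical values
        .templateGcsPath("gs://my-bucket/templates/pipeline")
        .tempGcsLocation("gs://my-bucket/tmp")
        .build();

// 1. Mark the destination with a user-defined move target...
target.addMoveTarget("pipeline-move");

// 2. ...then point the old resource at that target, so Terraform
//    records a move instead of a destroy-and-recreate.
oldJob.moveTo("pipeline-move");
```

Alternatively, `moveFromId`/`moveToId` accept the full Terraform resource address (e.g. "google_dataflow_job.example") directly, without a named move target.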
public void putTimeouts(GoogleDataflowJobTimeouts value)
public void resetAdditionalExperiments()
public void resetEnableStreamingEngine()
public void resetId()
public void resetIpConfiguration()
public void resetKmsKeyName()
public void resetLabels()
public void resetMachineType()
public void resetMaxWorkers()
public void resetNetwork()
public void resetOnDelete()
public void resetParameters()
public void resetProject()
public void resetRegion()
public void resetServiceAccountEmail()
public void resetSkipWaitOnJobTermination()
public void resetSubnetwork()
public void resetTimeouts()
public void resetTransformNameMapping()
public void resetZone()
Name | Description |
---|---|
isConstruct | Checks if x is a construct. |
isTerraformElement | No description. |
isTerraformResource | No description. |
generateConfigForImport | Generates CDKTF code for importing a GoogleDataflowJob resource upon running "cdktf plan <stack-name>". |
```java
import com.hashicorp.cdktf.providers.google_beta.google_dataflow_job.GoogleDataflowJob;

GoogleDataflowJob.isConstruct(java.lang.Object x)
```
Checks if `x` is a construct.

Use this method instead of `instanceof` to properly detect `Construct` instances, even when the construct library is symlinked.

Explanation: in JavaScript, multiple copies of the `constructs` library on disk are seen as independent, completely different libraries. As a consequence, the class `Construct` in each copy of the `constructs` library is seen as a different class, and an instance of one class will not test as `instanceof` the other class. `npm install` will not create installations like this, but users may manually symlink construct libraries together or use a monorepo tool: in those cases, multiple copies of the `constructs` library can be accidentally installed, and `instanceof` will behave unpredictably. It is safest to avoid using `instanceof`, and to use this type-testing method instead.
- Type: java.lang.Object
Any object.
```java
import com.hashicorp.cdktf.providers.google_beta.google_dataflow_job.GoogleDataflowJob;

GoogleDataflowJob.isTerraformElement(java.lang.Object x)
```
- Type: java.lang.Object
```java
import com.hashicorp.cdktf.providers.google_beta.google_dataflow_job.GoogleDataflowJob;

GoogleDataflowJob.isTerraformResource(java.lang.Object x)
```
- Type: java.lang.Object
```java
import com.hashicorp.cdktf.providers.google_beta.google_dataflow_job.GoogleDataflowJob;

GoogleDataflowJob.generateConfigForImport(Construct scope, java.lang.String importToId, java.lang.String importFromId)
GoogleDataflowJob.generateConfigForImport(Construct scope, java.lang.String importToId, java.lang.String importFromId, TerraformProvider provider)
```
Generates CDKTF code for importing a GoogleDataflowJob resource upon running "cdktf plan <stack-name>".
- Type: software.constructs.Construct
The scope in which to define this construct.
- Type: java.lang.String
The construct id used in the generated config for the GoogleDataflowJob to import.
- Type: java.lang.String
The id of the existing GoogleDataflowJob that should be imported.
Refer to the {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.14.1/docs/resources/google_dataflow_job#import import section} in the documentation of this resource for the id to use
- Type: com.hashicorp.cdktf.TerraformProvider
Optional instance of the provider where the GoogleDataflowJob to import is found.
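A usage sketch, called from inside a stack. The construct id is illustrative, and the existing-job id is a placeholder whose exact format is given in the import section of the resource documentation:

```java
import com.hashicorp.cdktf.providers.google_beta.google_dataflow_job.GoogleDataflowJob;

// Placeholder: look up the real id format in the resource's import docs.
String existingJobId = "...";

GoogleDataflowJob.generateConfigForImport(
        this,             // scope: the stack (or other construct) to define this in
        "imported-job",   // construct id used in the generated config
        existingJobId);   // id of the existing GoogleDataflowJob to import
```

On the next `cdktf plan`, CDKTF emits the import configuration for the existing job rather than planning to create a new one.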
Name | Type | Description |
---|---|---|
node | software.constructs.Node | The tree node. |
cdktfStack | com.hashicorp.cdktf.TerraformStack | No description. |
fqn | java.lang.String | No description. |
friendlyUniqueId | java.lang.String | No description. |
terraformMetaArguments | java.util.Map<java.lang.String, java.lang.Object> | No description. |
terraformResourceType | java.lang.String | No description. |
terraformGeneratorMetadata | com.hashicorp.cdktf.TerraformProviderGeneratorMetadata | No description. |
connection | com.hashicorp.cdktf.SSHProvisionerConnection OR com.hashicorp.cdktf.WinrmProvisionerConnection | No description. |
count | java.lang.Number OR com.hashicorp.cdktf.TerraformCount | No description. |
dependsOn | java.util.List<java.lang.String> | No description. |
forEach | com.hashicorp.cdktf.ITerraformIterator | No description. |
lifecycle | com.hashicorp.cdktf.TerraformResourceLifecycle | No description. |
provider | com.hashicorp.cdktf.TerraformProvider | No description. |
provisioners | java.util.List<com.hashicorp.cdktf.FileProvisioner OR com.hashicorp.cdktf.LocalExecProvisioner OR com.hashicorp.cdktf.RemoteExecProvisioner> | No description. |
effectiveLabels | com.hashicorp.cdktf.StringMap | No description. |
jobId | java.lang.String | No description. |
state | java.lang.String | No description. |
terraformLabels | com.hashicorp.cdktf.StringMap | No description. |
timeouts | GoogleDataflowJobTimeoutsOutputReference | No description. |
type | java.lang.String | No description. |
additionalExperimentsInput | java.util.List<java.lang.String> | No description. |
enableStreamingEngineInput | java.lang.Boolean OR com.hashicorp.cdktf.IResolvable | No description. |
idInput | java.lang.String | No description. |
ipConfigurationInput | java.lang.String | No description. |
kmsKeyNameInput | java.lang.String | No description. |
labelsInput | java.util.Map<java.lang.String, java.lang.String> | No description. |
machineTypeInput | java.lang.String | No description. |
maxWorkersInput | java.lang.Number | No description. |
nameInput | java.lang.String | No description. |
networkInput | java.lang.String | No description. |
onDeleteInput | java.lang.String | No description. |
parametersInput | java.util.Map<java.lang.String, java.lang.String> | No description. |
projectInput | java.lang.String | No description. |
regionInput | java.lang.String | No description. |
serviceAccountEmailInput | java.lang.String | No description. |
skipWaitOnJobTerminationInput | java.lang.Boolean OR com.hashicorp.cdktf.IResolvable | No description. |
subnetworkInput | java.lang.String | No description. |
tempGcsLocationInput | java.lang.String | No description. |
templateGcsPathInput | java.lang.String | No description. |
timeoutsInput | com.hashicorp.cdktf.IResolvable OR GoogleDataflowJobTimeouts | No description. |
transformNameMappingInput | java.util.Map<java.lang.String, java.lang.String> | No description. |
zoneInput | java.lang.String | No description. |
additionalExperiments | java.util.List<java.lang.String> | No description. |
enableStreamingEngine | java.lang.Boolean OR com.hashicorp.cdktf.IResolvable | No description. |
id | java.lang.String | No description. |
ipConfiguration | java.lang.String | No description. |
kmsKeyName | java.lang.String | No description. |
labels | java.util.Map<java.lang.String, java.lang.String> | No description. |
machineType | java.lang.String | No description. |
maxWorkers | java.lang.Number | No description. |
name | java.lang.String | No description. |
network | java.lang.String | No description. |
onDelete | java.lang.String | No description. |
parameters | java.util.Map<java.lang.String, java.lang.String> | No description. |
project | java.lang.String | No description. |
region | java.lang.String | No description. |
serviceAccountEmail | java.lang.String | No description. |
skipWaitOnJobTermination | java.lang.Boolean OR com.hashicorp.cdktf.IResolvable | No description. |
subnetwork | java.lang.String | No description. |
tempGcsLocation | java.lang.String | No description. |
templateGcsPath | java.lang.String | No description. |
transformNameMapping | java.util.Map<java.lang.String, java.lang.String> | No description. |
zone | java.lang.String | No description. |
public Node getNode();
- Type: software.constructs.Node
The tree node.
public TerraformStack getCdktfStack();
- Type: com.hashicorp.cdktf.TerraformStack
public java.lang.String getFqn();
- Type: java.lang.String
public java.lang.String getFriendlyUniqueId();
- Type: java.lang.String
public java.util.Map<java.lang.String, java.lang.Object> getTerraformMetaArguments();
- Type: java.util.Map<java.lang.String, java.lang.Object>
public java.lang.String getTerraformResourceType();
- Type: java.lang.String
public TerraformProviderGeneratorMetadata getTerraformGeneratorMetadata();
- Type: com.hashicorp.cdktf.TerraformProviderGeneratorMetadata
public java.lang.Object getConnection();
- Type: com.hashicorp.cdktf.SSHProvisionerConnection OR com.hashicorp.cdktf.WinrmProvisionerConnection
public java.lang.Object getCount();
- Type: java.lang.Number OR com.hashicorp.cdktf.TerraformCount
public java.util.List<java.lang.String> getDependsOn();
- Type: java.util.List<java.lang.String>
public ITerraformIterator getForEach();
- Type: com.hashicorp.cdktf.ITerraformIterator
public TerraformResourceLifecycle getLifecycle();
- Type: com.hashicorp.cdktf.TerraformResourceLifecycle
public TerraformProvider getProvider();
- Type: com.hashicorp.cdktf.TerraformProvider
public java.lang.Object getProvisioners();
- Type: java.util.List<com.hashicorp.cdktf.FileProvisioner OR com.hashicorp.cdktf.LocalExecProvisioner OR com.hashicorp.cdktf.RemoteExecProvisioner>
public StringMap getEffectiveLabels();
- Type: com.hashicorp.cdktf.StringMap
public java.lang.String getJobId();
- Type: java.lang.String
public java.lang.String getState();
- Type: java.lang.String
public StringMap getTerraformLabels();
- Type: com.hashicorp.cdktf.StringMap
public GoogleDataflowJobTimeoutsOutputReference getTimeouts();
- Type: GoogleDataflowJobTimeoutsOutputReference
public java.lang.String getType();
- Type: java.lang.String
public java.util.List<java.lang.String> getAdditionalExperimentsInput();
- Type: java.util.List<java.lang.String>
public java.lang.Object getEnableStreamingEngineInput();
- Type: java.lang.Boolean OR com.hashicorp.cdktf.IResolvable
public java.lang.String getIdInput();
- Type: java.lang.String
public java.lang.String getIpConfigurationInput();
- Type: java.lang.String
public java.lang.String getKmsKeyNameInput();
- Type: java.lang.String
public java.util.Map<java.lang.String, java.lang.String> getLabelsInput();
- Type: java.util.Map<java.lang.String, java.lang.String>
public java.lang.String getMachineTypeInput();
- Type: java.lang.String
public java.lang.Number getMaxWorkersInput();
- Type: java.lang.Number
public java.lang.String getNameInput();
- Type: java.lang.String
public java.lang.String getNetworkInput();
- Type: java.lang.String
public java.lang.String getOnDeleteInput();
- Type: java.lang.String
public java.util.Map<java.lang.String, java.lang.String> getParametersInput();
- Type: java.util.Map<java.lang.String, java.lang.String>
public java.lang.String getProjectInput();
- Type: java.lang.String
public java.lang.String getRegionInput();
- Type: java.lang.String
public java.lang.String getServiceAccountEmailInput();
- Type: java.lang.String
public java.lang.Object getSkipWaitOnJobTerminationInput();
- Type: java.lang.Boolean OR com.hashicorp.cdktf.IResolvable
public java.lang.String getSubnetworkInput();
- Type: java.lang.String
public java.lang.String getTempGcsLocationInput();
- Type: java.lang.String
public java.lang.String getTemplateGcsPathInput();
- Type: java.lang.String
public java.lang.Object getTimeoutsInput();
- Type: com.hashicorp.cdktf.IResolvable OR GoogleDataflowJobTimeouts
public java.util.Map<java.lang.String, java.lang.String> getTransformNameMappingInput();
- Type: java.util.Map<java.lang.String, java.lang.String>
public java.lang.String getZoneInput();
- Type: java.lang.String
public java.util.List<java.lang.String> getAdditionalExperiments();
- Type: java.util.List<java.lang.String>
public java.lang.Object getEnableStreamingEngine();
- Type: java.lang.Boolean OR com.hashicorp.cdktf.IResolvable
public java.lang.String getId();
- Type: java.lang.String
public java.lang.String getIpConfiguration();
- Type: java.lang.String
public java.lang.String getKmsKeyName();
- Type: java.lang.String
public java.util.Map<java.lang.String, java.lang.String> getLabels();
- Type: java.util.Map<java.lang.String, java.lang.String>
public java.lang.String getMachineType();
- Type: java.lang.String
public java.lang.Number getMaxWorkers();
- Type: java.lang.Number
public java.lang.String getName();
- Type: java.lang.String
public java.lang.String getNetwork();
- Type: java.lang.String
public java.lang.String getOnDelete();
- Type: java.lang.String
public java.util.Map<java.lang.String, java.lang.String> getParameters();
- Type: java.util.Map<java.lang.String, java.lang.String>
public java.lang.String getProject();
- Type: java.lang.String
public java.lang.String getRegion();
- Type: java.lang.String
public java.lang.String getServiceAccountEmail();
- Type: java.lang.String
public java.lang.Object getSkipWaitOnJobTermination();
- Type: java.lang.Boolean OR com.hashicorp.cdktf.IResolvable
public java.lang.String getSubnetwork();
- Type: java.lang.String
public java.lang.String getTempGcsLocation();
- Type: java.lang.String
public java.lang.String getTemplateGcsPath();
- Type: java.lang.String
public java.util.Map<java.lang.String, java.lang.String> getTransformNameMapping();
- Type: java.util.Map<java.lang.String, java.lang.String>
public java.lang.String getZone();
- Type: java.lang.String
Name | Type | Description |
---|---|---|
tfResourceType | java.lang.String | No description. |
public java.lang.String getTfResourceType();
- Type: java.lang.String
```java
import com.hashicorp.cdktf.providers.google_beta.google_dataflow_job.GoogleDataflowJobConfig;

GoogleDataflowJobConfig.builder()
//  .connection(SSHProvisionerConnection)
//  .connection(WinrmProvisionerConnection)
//  .count(java.lang.Number)
//  .count(TerraformCount)
//  .dependsOn(java.util.List<ITerraformDependable>)
//  .forEach(ITerraformIterator)
//  .lifecycle(TerraformResourceLifecycle)
//  .provider(TerraformProvider)
//  .provisioners(java.util.List<FileProvisioner)
//  .provisioners(LocalExecProvisioner)
//  .provisioners(RemoteExecProvisioner>)
    .name(java.lang.String)
    .tempGcsLocation(java.lang.String)
    .templateGcsPath(java.lang.String)
//  .additionalExperiments(java.util.List<java.lang.String>)
//  .enableStreamingEngine(java.lang.Boolean)
//  .enableStreamingEngine(IResolvable)
//  .id(java.lang.String)
//  .ipConfiguration(java.lang.String)
//  .kmsKeyName(java.lang.String)
//  .labels(java.util.Map<java.lang.String, java.lang.String>)
//  .machineType(java.lang.String)
//  .maxWorkers(java.lang.Number)
//  .network(java.lang.String)
//  .onDelete(java.lang.String)
//  .parameters(java.util.Map<java.lang.String, java.lang.String>)
//  .project(java.lang.String)
//  .region(java.lang.String)
//  .serviceAccountEmail(java.lang.String)
//  .skipWaitOnJobTermination(java.lang.Boolean)
//  .skipWaitOnJobTermination(IResolvable)
//  .subnetwork(java.lang.String)
//  .timeouts(GoogleDataflowJobTimeouts)
//  .transformNameMapping(java.util.Map<java.lang.String, java.lang.String>)
//  .zone(java.lang.String)
    .build();
```
Name | Type | Description |
---|---|---|
connection | com.hashicorp.cdktf.SSHProvisionerConnection OR com.hashicorp.cdktf.WinrmProvisionerConnection | No description. |
count | java.lang.Number OR com.hashicorp.cdktf.TerraformCount | No description. |
dependsOn | java.util.List<com.hashicorp.cdktf.ITerraformDependable> | No description. |
forEach | com.hashicorp.cdktf.ITerraformIterator | No description. |
lifecycle | com.hashicorp.cdktf.TerraformResourceLifecycle | No description. |
provider | com.hashicorp.cdktf.TerraformProvider | No description. |
provisioners | java.util.List<com.hashicorp.cdktf.FileProvisioner OR com.hashicorp.cdktf.LocalExecProvisioner OR com.hashicorp.cdktf.RemoteExecProvisioner> | No description. |
name | java.lang.String | A unique name for the resource, required by Dataflow. |
tempGcsLocation | java.lang.String | A writeable location on Google Cloud Storage for the Dataflow job to dump its temporary data. |
templateGcsPath | java.lang.String | The Google Cloud Storage path to the Dataflow job template. |
additionalExperiments | java.util.List<java.lang.String> | List of experiments that should be used by the job. An example value is ["enable_stackdriver_agent_metrics"]. |
enableStreamingEngine | java.lang.Boolean OR com.hashicorp.cdktf.IResolvable | Indicates if the job should use the streaming engine feature. |
id | java.lang.String | Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.14.1/docs/resources/google_dataflow_job#id GoogleDataflowJob#id}. |
ipConfiguration | java.lang.String | The configuration for VM IPs. Options are "WORKER_IP_PUBLIC" or "WORKER_IP_PRIVATE". |
kmsKeyName | java.lang.String | The name for the Cloud KMS key for the job. Key format is: projects/PROJECT_ID/locations/LOCATION/keyRings/KEY_RING/cryptoKeys/KEY. |
labels | java.util.Map<java.lang.String, java.lang.String> | User labels to be specified for the job. |
machineType | java.lang.String | The machine type to use for the job. |
maxWorkers | java.lang.Number | The number of workers permitted to work on the job. More workers may improve processing speed at additional cost. |
network | java.lang.String | The network to which VMs will be assigned. If it is not provided, "default" will be used. |
onDelete | java.lang.String | One of "drain" or "cancel". Specifies behavior of deletion during terraform destroy. |
parameters | java.util.Map<java.lang.String, java.lang.String> | Key/Value pairs to be passed to the Dataflow job (as used in the template). |
project | java.lang.String | The project in which the resource belongs. |
region | java.lang.String | The region in which the created job should run. |
serviceAccountEmail | java.lang.String | The Service Account email used to create the job. |
skipWaitOnJobTermination | java.lang.Boolean OR com.hashicorp.cdktf.IResolvable | If true, treat DRAINING and CANCELLING as terminal job states and do not wait for further changes before removing from terraform state and moving on. |
subnetwork | java.lang.String | The subnetwork to which VMs will be assigned. Should be of the form "regions/REGION/subnetworks/SUBNETWORK". |
timeouts | GoogleDataflowJobTimeouts | timeouts block. |
transformNameMapping | java.util.Map<java.lang.String, java.lang.String> | Only applicable when updating a pipeline. |
zone | java.lang.String | The zone in which the created job should run. If it is not provided, the provider zone is used. |
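The required arguments above (name, tempGcsLocation, templateGcsPath) plus a few common optional ones can be wired together as follows. This is a minimal sketch, not the library's documented example: it assumes a CDKTF Java project with the google-beta provider bindings generated, and the job name, bucket paths, and template parameters are hypothetical placeholders.

```java
import software.constructs.Construct;
import com.hashicorp.cdktf.TerraformStack;
import com.hashicorp.cdktf.providers.google_beta.google_dataflow_job.GoogleDataflowJob;
import java.util.Map;

public class DataflowStack extends TerraformStack {
    public DataflowStack(Construct scope, String id) {
        super(scope, id);
        // name, tempGcsLocation, and templateGcsPath are the required arguments.
        GoogleDataflowJob.Builder.create(this, "word-count-job")
            .name("example-dataflow-job")               // hypothetical job name
            .tempGcsLocation("gs://my-bucket/tmp")      // hypothetical bucket
            .templateGcsPath("gs://dataflow-templates/latest/Word_Count")
            .parameters(Map.of(                         // keys are template-specific
                "inputFile", "gs://my-bucket/input.txt",
                "output", "gs://my-bucket/output"))
            .onDelete("drain")  // drain rather than cancel on terraform destroy
            .build();
    }
}
```

The parameters map is passed through to the template unchanged, so its keys must match whatever the chosen template declares.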
public java.lang.Object getConnection();
- Type: com.hashicorp.cdktf.SSHProvisionerConnection OR com.hashicorp.cdktf.WinrmProvisionerConnection
public java.lang.Object getCount();
- Type: java.lang.Number OR com.hashicorp.cdktf.TerraformCount
public java.util.List<ITerraformDependable> getDependsOn();
- Type: java.util.List<com.hashicorp.cdktf.ITerraformDependable>
public ITerraformIterator getForEach();
- Type: com.hashicorp.cdktf.ITerraformIterator
public TerraformResourceLifecycle getLifecycle();
- Type: com.hashicorp.cdktf.TerraformResourceLifecycle
public TerraformProvider getProvider();
- Type: com.hashicorp.cdktf.TerraformProvider
public java.lang.Object getProvisioners();
- Type: java.util.List<com.hashicorp.cdktf.FileProvisioner OR com.hashicorp.cdktf.LocalExecProvisioner OR com.hashicorp.cdktf.RemoteExecProvisioner>
public java.lang.String getName();
- Type: java.lang.String
A unique name for the resource, required by Dataflow.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.14.1/docs/resources/google_dataflow_job#name GoogleDataflowJob#name}
public java.lang.String getTempGcsLocation();
- Type: java.lang.String
A writeable location on Google Cloud Storage for the Dataflow job to dump its temporary data.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.14.1/docs/resources/google_dataflow_job#temp_gcs_location GoogleDataflowJob#temp_gcs_location}
public java.lang.String getTemplateGcsPath();
- Type: java.lang.String
The Google Cloud Storage path to the Dataflow job template.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.14.1/docs/resources/google_dataflow_job#template_gcs_path GoogleDataflowJob#template_gcs_path}
public java.util.List<java.lang.String> getAdditionalExperiments();
- Type: java.util.List<java.lang.String>
List of experiments that should be used by the job. An example value is ["enable_stackdriver_agent_metrics"].
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.14.1/docs/resources/google_dataflow_job#additional_experiments GoogleDataflowJob#additional_experiments}
public java.lang.Object getEnableStreamingEngine();
- Type: java.lang.Boolean OR com.hashicorp.cdktf.IResolvable
Indicates if the job should use the streaming engine feature.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.14.1/docs/resources/google_dataflow_job#enable_streaming_engine GoogleDataflowJob#enable_streaming_engine}
public java.lang.String getId();
- Type: java.lang.String
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.14.1/docs/resources/google_dataflow_job#id GoogleDataflowJob#id}.
Please be aware that the id field is automatically added to all resources in Terraform providers using a Terraform provider SDK version below 2. If you experience problems setting this value it might not be settable. Please take a look at the provider documentation to ensure it should be settable.
public java.lang.String getIpConfiguration();
- Type: java.lang.String
The configuration for VM IPs. Options are "WORKER_IP_PUBLIC" or "WORKER_IP_PRIVATE".
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.14.1/docs/resources/google_dataflow_job#ip_configuration GoogleDataflowJob#ip_configuration}
public java.lang.String getKmsKeyName();
- Type: java.lang.String
The name for the Cloud KMS key for the job. Key format is: projects/PROJECT_ID/locations/LOCATION/keyRings/KEY_RING/cryptoKeys/KEY.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.14.1/docs/resources/google_dataflow_job#kms_key_name GoogleDataflowJob#kms_key_name}
public java.util.Map<java.lang.String, java.lang.String> getLabels();
- Type: java.util.Map<java.lang.String, java.lang.String>
User labels to be specified for the job.
Keys and values should follow the restrictions specified in the labeling restrictions page. NOTE: This field is non-authoritative, and will only manage the labels present in your configuration. Please refer to the field 'effective_labels' for all of the labels present on the resource.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.14.1/docs/resources/google_dataflow_job#labels GoogleDataflowJob#labels}
public java.lang.String getMachineType();
- Type: java.lang.String
The machine type to use for the job.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.14.1/docs/resources/google_dataflow_job#machine_type GoogleDataflowJob#machine_type}
public java.lang.Number getMaxWorkers();
- Type: java.lang.Number
The number of workers permitted to work on the job. More workers may improve processing speed at additional cost.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.14.1/docs/resources/google_dataflow_job#max_workers GoogleDataflowJob#max_workers}
public java.lang.String getNetwork();
- Type: java.lang.String
The network to which VMs will be assigned. If it is not provided, "default" will be used.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.14.1/docs/resources/google_dataflow_job#network GoogleDataflowJob#network}
public java.lang.String getOnDelete();
- Type: java.lang.String
One of "drain" or "cancel". Specifies behavior of deletion during terraform destroy.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.14.1/docs/resources/google_dataflow_job#on_delete GoogleDataflowJob#on_delete}
public java.util.Map<java.lang.String, java.lang.String> getParameters();
- Type: java.util.Map<java.lang.String, java.lang.String>
Key/Value pairs to be passed to the Dataflow job (as used in the template).
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.14.1/docs/resources/google_dataflow_job#parameters GoogleDataflowJob#parameters}
public java.lang.String getProject();
- Type: java.lang.String
The project in which the resource belongs.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.14.1/docs/resources/google_dataflow_job#project GoogleDataflowJob#project}
public java.lang.String getRegion();
- Type: java.lang.String
The region in which the created job should run.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.14.1/docs/resources/google_dataflow_job#region GoogleDataflowJob#region}
public java.lang.String getServiceAccountEmail();
- Type: java.lang.String
The Service Account email used to create the job.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.14.1/docs/resources/google_dataflow_job#service_account_email GoogleDataflowJob#service_account_email}
public java.lang.Object getSkipWaitOnJobTermination();
- Type: java.lang.Boolean OR com.hashicorp.cdktf.IResolvable
If true, treat DRAINING and CANCELLING as terminal job states and do not wait for further changes before removing from terraform state and moving on.
WARNING: this will lead to job name conflicts if you do not ensure that the job names are different, e.g. by embedding a release ID or by using a random_id.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.14.1/docs/resources/google_dataflow_job#skip_wait_on_job_termination GoogleDataflowJob#skip_wait_on_job_termination}
public java.lang.String getSubnetwork();
- Type: java.lang.String
The subnetwork to which VMs will be assigned. Should be of the form "regions/REGION/subnetworks/SUBNETWORK".
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.14.1/docs/resources/google_dataflow_job#subnetwork GoogleDataflowJob#subnetwork}
public GoogleDataflowJobTimeouts getTimeouts();
- Type: GoogleDataflowJobTimeouts
timeouts block.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.14.1/docs/resources/google_dataflow_job#timeouts GoogleDataflowJob#timeouts}
public java.util.Map<java.lang.String, java.lang.String> getTransformNameMapping();
- Type: java.util.Map<java.lang.String, java.lang.String>
Only applicable when updating a pipeline.
Map of transform name prefixes of the job to be replaced with the corresponding name prefixes of the new job.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.14.1/docs/resources/google_dataflow_job#transform_name_mapping GoogleDataflowJob#transform_name_mapping}
public java.lang.String getZone();
- Type: java.lang.String
The zone in which the created job should run. If it is not provided, the provider zone is used.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.14.1/docs/resources/google_dataflow_job#zone GoogleDataflowJob#zone}
import com.hashicorp.cdktf.providers.google_beta.google_dataflow_job.GoogleDataflowJobTimeouts;
GoogleDataflowJobTimeouts.builder()
// .update(java.lang.String)
.build();
Name | Type | Description |
---|---|---|
update | java.lang.String | Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.14.1/docs/resources/google_dataflow_job#update GoogleDataflowJob#update}. |
public java.lang.String getUpdate();
- Type: java.lang.String
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.14.1/docs/resources/google_dataflow_job#update GoogleDataflowJob#update}.
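A sketch of building a timeouts block and attaching it to the job, assuming the same generated bindings as above; the "40m" duration is an arbitrary placeholder in Terraform's duration-string format.

```java
import com.hashicorp.cdktf.providers.google_beta.google_dataflow_job.GoogleDataflowJobTimeouts;

// Only the update timeout is configurable on this resource.
GoogleDataflowJobTimeouts timeouts = GoogleDataflowJobTimeouts.builder()
    .update("40m")  // how long Terraform waits for an in-place job update
    .build();

// Attach via the resource builder, e.g.:
//   GoogleDataflowJob.Builder.create(this, "job")
//       ... required arguments ...
//       .timeouts(timeouts)
//       .build();
```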
import com.hashicorp.cdktf.providers.google_beta.google_dataflow_job.GoogleDataflowJobTimeoutsOutputReference;
new GoogleDataflowJobTimeoutsOutputReference(IInterpolatingParent terraformResource, java.lang.String terraformAttribute);
Name | Type | Description |
---|---|---|
terraformResource | com.hashicorp.cdktf.IInterpolatingParent | The parent resource. |
terraformAttribute | java.lang.String | The attribute on the parent resource this class is referencing. |
- Type: com.hashicorp.cdktf.IInterpolatingParent
The parent resource.
- Type: java.lang.String
The attribute on the parent resource this class is referencing.
Name | Description |
---|---|
computeFqn | No description. |
getAnyMapAttribute | No description. |
getBooleanAttribute | No description. |
getBooleanMapAttribute | No description. |
getListAttribute | No description. |
getNumberAttribute | No description. |
getNumberListAttribute | No description. |
getNumberMapAttribute | No description. |
getStringAttribute | No description. |
getStringMapAttribute | No description. |
interpolationForAttribute | No description. |
resolve | Produce the Token's value at resolution time. |
toString | Return a string representation of this resolvable object. |
resetUpdate | No description. |
public java.lang.String computeFqn()
public java.util.Map<java.lang.String, java.lang.Object> getAnyMapAttribute(java.lang.String terraformAttribute)
- Type: java.lang.String
public IResolvable getBooleanAttribute(java.lang.String terraformAttribute)
- Type: java.lang.String
public java.util.Map<java.lang.String, java.lang.Boolean> getBooleanMapAttribute(java.lang.String terraformAttribute)
- Type: java.lang.String
public java.util.List<java.lang.String> getListAttribute(java.lang.String terraformAttribute)
- Type: java.lang.String
public java.lang.Number getNumberAttribute(java.lang.String terraformAttribute)
- Type: java.lang.String
public java.util.List<java.lang.Number> getNumberListAttribute(java.lang.String terraformAttribute)
- Type: java.lang.String
public java.util.Map<java.lang.String, java.lang.Number> getNumberMapAttribute(java.lang.String terraformAttribute)
- Type: java.lang.String
public java.lang.String getStringAttribute(java.lang.String terraformAttribute)
- Type: java.lang.String
public java.util.Map<java.lang.String, java.lang.String> getStringMapAttribute(java.lang.String terraformAttribute)
- Type: java.lang.String
public IResolvable interpolationForAttribute(java.lang.String property)
- Type: java.lang.String
public java.lang.Object resolve(IResolveContext _context)
Produce the Token's value at resolution time.
- Type: com.hashicorp.cdktf.IResolveContext
public java.lang.String toString()
Return a string representation of this resolvable object.
Returns a reversible string representation.
public void resetUpdate()
Name | Type | Description |
---|---|---|
creationStack | java.util.List<java.lang.String> | The creation stack of this resolvable which will be appended to errors thrown during resolution. |
fqn | java.lang.String | No description. |
updateInput | java.lang.String | No description. |
update | java.lang.String | No description. |
internalValue | com.hashicorp.cdktf.IResolvable OR GoogleDataflowJobTimeouts | No description. |
public java.util.List<java.lang.String> getCreationStack();
- Type: java.util.List<java.lang.String>
The creation stack of this resolvable which will be appended to errors thrown during resolution.
If this returns an empty array the stack will not be attached.
public java.lang.String getFqn();
- Type: java.lang.String
public java.lang.String getUpdateInput();
- Type: java.lang.String
public java.lang.String getUpdate();
- Type: java.lang.String
public java.lang.Object getInternalValue();
- Type: com.hashicorp.cdktf.IResolvable OR GoogleDataflowJobTimeouts