Represents a {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job google_dataflow_flex_template_job}.
```python
from cdktf_cdktf_provider_google_beta import google_dataflow_flex_template_job

google_dataflow_flex_template_job.GoogleDataflowFlexTemplateJob(
  scope: Construct,
  id: str,
  connection: typing.Union[SSHProvisionerConnection, WinrmProvisionerConnection] = None,
  count: typing.Union[typing.Union[int, float], TerraformCount] = None,
  depends_on: typing.List[ITerraformDependable] = None,
  for_each: ITerraformIterator = None,
  lifecycle: TerraformResourceLifecycle = None,
  provider: TerraformProvider = None,
  provisioners: typing.List[typing.Union[FileProvisioner, LocalExecProvisioner, RemoteExecProvisioner]] = None,
  container_spec_gcs_path: str,
  name: str,
  additional_experiments: typing.List[str] = None,
  autoscaling_algorithm: str = None,
  enable_streaming_engine: typing.Union[bool, IResolvable] = None,
  id: str = None,
  ip_configuration: str = None,
  kms_key_name: str = None,
  labels: typing.Mapping[str] = None,
  launcher_machine_type: str = None,
  machine_type: str = None,
  max_workers: typing.Union[int, float] = None,
  network: str = None,
  num_workers: typing.Union[int, float] = None,
  on_delete: str = None,
  parameters: typing.Mapping[str] = None,
  project: str = None,
  region: str = None,
  sdk_container_image: str = None,
  service_account_email: str = None,
  skip_wait_on_job_termination: typing.Union[bool, IResolvable] = None,
  staging_location: str = None,
  subnetwork: str = None,
  temp_location: str = None,
  transform_name_mapping: typing.Mapping[str] = None
)
```
Name | Type | Description |
---|---|---|
scope | constructs.Construct | The scope in which to define this construct. |
id | str | The scoped construct ID. |
connection | typing.Union[cdktf.SSHProvisionerConnection, cdktf.WinrmProvisionerConnection] | No description. |
count | typing.Union[typing.Union[int, float], cdktf.TerraformCount] | No description. |
depends_on | typing.List[cdktf.ITerraformDependable] | No description. |
for_each | cdktf.ITerraformIterator | No description. |
lifecycle | cdktf.TerraformResourceLifecycle | No description. |
provider | cdktf.TerraformProvider | No description. |
provisioners | typing.List[typing.Union[cdktf.FileProvisioner, cdktf.LocalExecProvisioner, cdktf.RemoteExecProvisioner]] | No description. |
container_spec_gcs_path | str | Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#container_spec_gcs_path GoogleDataflowFlexTemplateJob#container_spec_gcs_path}. |
name | str | Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#name GoogleDataflowFlexTemplateJob#name}. |
additional_experiments | typing.List[str] | List of experiments that should be used by the job. An example value is ["enable_stackdriver_agent_metrics"]. |
autoscaling_algorithm | str | The algorithm to use for autoscaling. |
enable_streaming_engine | typing.Union[bool, cdktf.IResolvable] | Indicates if the job should use the streaming engine feature. |
id | str | Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#id GoogleDataflowFlexTemplateJob#id}. |
ip_configuration | str | The configuration for VM IPs. Options are "WORKER_IP_PUBLIC" or "WORKER_IP_PRIVATE". |
kms_key_name | str | The name for the Cloud KMS key for the job. Key format is: projects/PROJECT_ID/locations/LOCATION/keyRings/KEY_RING/cryptoKeys/KEY. |
labels | typing.Mapping[str] | User labels to be specified for the job. |
launcher_machine_type | str | The machine type to use for launching the job. The default is n1-standard-1. |
machine_type | str | The machine type to use for the job. |
max_workers | typing.Union[int, float] | The maximum number of Google Compute Engine instances to be made available to your pipeline during execution, from 1 to 1000. |
network | str | The network to which VMs will be assigned. If it is not provided, "default" will be used. |
num_workers | typing.Union[int, float] | The initial number of Google Compute Engine instances for the job. |
on_delete | str | Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#on_delete GoogleDataflowFlexTemplateJob#on_delete}. |
parameters | typing.Mapping[str] | Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#parameters GoogleDataflowFlexTemplateJob#parameters}. |
project | str | Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#project GoogleDataflowFlexTemplateJob#project}. |
region | str | The region in which the created job should run. |
sdk_container_image | str | Docker registry location of the container image to use for the worker harness. |
service_account_email | str | The Service Account email used to create the job. |
skip_wait_on_job_termination | typing.Union[bool, cdktf.IResolvable] | If true, treat DRAINING and CANCELLING as terminal job states and do not wait for further changes before removing from terraform state and moving on. |
staging_location | str | The Cloud Storage path to use for staging files. Must be a valid Cloud Storage URL, beginning with gs://. |
subnetwork | str | The subnetwork to which VMs will be assigned. Should be of the form "regions/REGION/subnetworks/SUBNETWORK". |
temp_location | str | The Cloud Storage path to use for temporary files. Must be a valid Cloud Storage URL, beginning with gs://. |
transform_name_mapping | typing.Mapping[str] | Only applicable when updating a pipeline. |
- Type: constructs.Construct
The scope in which to define this construct.
- Type: str
The scoped construct ID.
Must be unique amongst siblings in the same scope.
- Type: typing.Union[cdktf.SSHProvisionerConnection, cdktf.WinrmProvisionerConnection]
- Type: typing.Union[typing.Union[int, float], cdktf.TerraformCount]
- Type: typing.List[cdktf.ITerraformDependable]
- Type: cdktf.ITerraformIterator
- Type: cdktf.TerraformResourceLifecycle
- Type: cdktf.TerraformProvider
- Type: typing.List[typing.Union[cdktf.FileProvisioner, cdktf.LocalExecProvisioner, cdktf.RemoteExecProvisioner]]
- Type: str
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#container_spec_gcs_path GoogleDataflowFlexTemplateJob#container_spec_gcs_path}.
- Type: str
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#name GoogleDataflowFlexTemplateJob#name}.
- Type: typing.List[str]
List of experiments that should be used by the job. An example value is ["enable_stackdriver_agent_metrics"].
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#additional_experiments GoogleDataflowFlexTemplateJob#additional_experiments}
- Type: str
The algorithm to use for autoscaling.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#autoscaling_algorithm GoogleDataflowFlexTemplateJob#autoscaling_algorithm}
- Type: typing.Union[bool, cdktf.IResolvable]
Indicates if the job should use the streaming engine feature.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#enable_streaming_engine GoogleDataflowFlexTemplateJob#enable_streaming_engine}
- Type: str
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#id GoogleDataflowFlexTemplateJob#id}.
Please be aware that the id field is automatically added to all resources in Terraform providers using a Terraform provider SDK version below 2. If you experience problems setting this value it might not be settable. Please take a look at the provider documentation to ensure it should be settable.
- Type: str
The configuration for VM IPs. Options are "WORKER_IP_PUBLIC" or "WORKER_IP_PRIVATE".
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#ip_configuration GoogleDataflowFlexTemplateJob#ip_configuration}
- Type: str
The name for the Cloud KMS key for the job. Key format is: projects/PROJECT_ID/locations/LOCATION/keyRings/KEY_RING/cryptoKeys/KEY.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#kms_key_name GoogleDataflowFlexTemplateJob#kms_key_name}
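The expected key format can be assembled with a small helper. This is an illustrative sketch only; the function and argument names are ours, not part of the provider API:

```python
def kms_key_name(project_id: str, location: str, key_ring: str, key: str) -> str:
    """Build a Cloud KMS key name in the format the kms_key_name argument expects:
    projects/PROJECT_ID/locations/LOCATION/keyRings/KEY_RING/cryptoKeys/KEY."""
    return (
        f"projects/{project_id}/locations/{location}"
        f"/keyRings/{key_ring}/cryptoKeys/{key}"
    )

# Example (hypothetical project and key names):
# kms_key_name("my-project", "us-central1", "dataflow-ring", "job-key")
# → "projects/my-project/locations/us-central1/keyRings/dataflow-ring/cryptoKeys/job-key"
```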
- Type: typing.Mapping[str]
User labels to be specified for the job.
Keys and values should follow the restrictions specified in the labeling restrictions page. NOTE: This field is non-authoritative, and will only manage the labels present in your configuration. Please refer to the field 'effective_labels' for all of the labels present on the resource.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#labels GoogleDataflowFlexTemplateJob#labels}
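As a rough illustration of those restrictions, a pre-flight check could look like the sketch below. The regexes encode our reading of GCP's general labeling rules (lowercase letters, digits, underscores and hyphens; keys start with a lowercase letter; 63-character limit) — verify against the labeling restrictions page before relying on them:

```python
import re

# Assumed label rules (our summary, not taken from this resource's docs):
# - key: starts with a lowercase letter; lowercase letters, digits, '_' and '-'; 1-63 chars
# - value: lowercase letters, digits, '_' and '-'; 0-63 chars
_KEY_RE = re.compile(r"^[a-z][a-z0-9_-]{0,62}$")
_VALUE_RE = re.compile(r"^[a-z0-9_-]{0,63}$")


def validate_labels(labels: dict) -> list:
    """Return a list of human-readable problems; an empty list means the labels look valid."""
    problems = []
    for key, value in labels.items():
        if not _KEY_RE.match(key):
            problems.append(f"invalid key: {key!r}")
        if not _VALUE_RE.match(value):
            problems.append(f"invalid value for {key!r}: {value!r}")
    return problems
```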
- Type: str
The machine type to use for launching the job. The default is n1-standard-1.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#launcher_machine_type GoogleDataflowFlexTemplateJob#launcher_machine_type}
- Type: str
The machine type to use for the job.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#machine_type GoogleDataflowFlexTemplateJob#machine_type}
- Type: typing.Union[int, float]
The maximum number of Google Compute Engine instances to be made available to your pipeline during execution, from 1 to 1000.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#max_workers GoogleDataflowFlexTemplateJob#max_workers}
- Type: str
The network to which VMs will be assigned. If it is not provided, "default" will be used.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#network GoogleDataflowFlexTemplateJob#network}
- Type: typing.Union[int, float]
The initial number of Google Compute Engine instances for the job.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#num_workers GoogleDataflowFlexTemplateJob#num_workers}
- Type: str
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#on_delete GoogleDataflowFlexTemplateJob#on_delete}.
- Type: typing.Mapping[str]
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#parameters GoogleDataflowFlexTemplateJob#parameters}.
- Type: str
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#project GoogleDataflowFlexTemplateJob#project}.
- Type: str
The region in which the created job should run.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#region GoogleDataflowFlexTemplateJob#region}
- Type: str
Docker registry location of the container image to use for the worker harness.
Default is the container for the version of the SDK. Note this field is only valid for portable pipelines.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#sdk_container_image GoogleDataflowFlexTemplateJob#sdk_container_image}
- Type: str
The Service Account email used to create the job.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#service_account_email GoogleDataflowFlexTemplateJob#service_account_email}
- Type: typing.Union[bool, cdktf.IResolvable]
If true, treat DRAINING and CANCELLING as terminal job states and do not wait for further changes before removing from terraform state and moving on.
WARNING: this will lead to job name conflicts if you do not ensure that the job names are different, e.g. by embedding a release ID or by using a random_id.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#skip_wait_on_job_termination GoogleDataflowFlexTemplateJob#skip_wait_on_job_termination}
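One way to avoid the name-conflict pitfall mentioned above is to embed a unique suffix in the job name before passing it to the resource. A minimal sketch (the helper name is ours; in Terraform itself a random_id resource plays the same role):

```python
import uuid


def unique_job_name(base: str) -> str:
    """Append a short random suffix so successive jobs never collide on name."""
    return f"{base}-{uuid.uuid4().hex[:8]}"

# unique_job_name("nightly-etl") → e.g. "nightly-etl-3f9c2a1b"
```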
- Type: str
The Cloud Storage path to use for staging files. Must be a valid Cloud Storage URL, beginning with gs://.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#staging_location GoogleDataflowFlexTemplateJob#staging_location}
- Type: str
The subnetwork to which VMs will be assigned. Should be of the form "regions/REGION/subnetworks/SUBNETWORK".
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#subnetwork GoogleDataflowFlexTemplateJob#subnetwork}
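The short-form subnetwork reference can be built with a trivial helper (illustrative only; the function name is ours):

```python
def subnetwork_path(region: str, subnetwork: str) -> str:
    """Build the short-form reference ("regions/REGION/subnetworks/SUBNETWORK")
    expected by the subnetwork argument."""
    return f"regions/{region}/subnetworks/{subnetwork}"

# subnetwork_path("us-central1", "dataflow-subnet")
# → "regions/us-central1/subnetworks/dataflow-subnet"
```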
- Type: str
The Cloud Storage path to use for temporary files. Must be a valid Cloud Storage URL, beginning with gs://.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#temp_location GoogleDataflowFlexTemplateJob#temp_location}
- Type: typing.Mapping[str]
Only applicable when updating a pipeline.
Map of transform name prefixes of the job to be replaced with the corresponding name prefixes of the new job.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#transform_name_mapping GoogleDataflowFlexTemplateJob#transform_name_mapping}
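Putting the required arguments and a few common optional ones together, a minimal stack might look as follows. This is an illustrative configuration sketch: the stack name, project, bucket paths and template spec path are placeholders, and the parameters map depends entirely on the template being launched.

```python
from constructs import Construct
from cdktf import App, TerraformStack
from cdktf_cdktf_provider_google_beta.provider import GoogleBetaProvider
from cdktf_cdktf_provider_google_beta.google_dataflow_flex_template_job import (
    GoogleDataflowFlexTemplateJob,
)


class DataflowStack(TerraformStack):
    def __init__(self, scope: Construct, id: str):
        super().__init__(scope, id)
        GoogleBetaProvider(self, "google-beta", project="my-project", region="us-central1")
        GoogleDataflowFlexTemplateJob(
            self,
            "flex-job",
            name="example-flex-job",
            # Required: GCS path to the Flex Template container spec (placeholder).
            container_spec_gcs_path="gs://my-bucket/templates/spec.json",
            region="us-central1",
            temp_location="gs://my-bucket/tmp",
            # Template-specific parameters (entirely dependent on your template).
            parameters={"inputSubscription": "projects/my-project/subscriptions/in"},
            on_delete="drain",
        )


app = App()
DataflowStack(app, "dataflow-flex")
app.synth()
```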
Name | Description |
---|---|
to_string | Returns a string representation of this construct. |
add_override | No description. |
override_logical_id | Overrides the auto-generated logical ID with a specific ID. |
reset_override_logical_id | Resets a previously passed logical Id to use the auto-generated logical id again. |
to_hcl_terraform | No description. |
to_metadata | No description. |
to_terraform | Adds this resource to the terraform JSON output. |
add_move_target | Adds a user defined moveTarget string to this resource to be later used in .moveTo(moveTarget) to resolve the location of the move. |
get_any_map_attribute | No description. |
get_boolean_attribute | No description. |
get_boolean_map_attribute | No description. |
get_list_attribute | No description. |
get_number_attribute | No description. |
get_number_list_attribute | No description. |
get_number_map_attribute | No description. |
get_string_attribute | No description. |
get_string_map_attribute | No description. |
has_resource_move | No description. |
import_from | No description. |
interpolation_for_attribute | No description. |
move_from_id | Move the resource corresponding to "id" to this resource. |
move_to | Moves this resource to the target resource given by moveTarget. |
move_to_id | Moves this resource to the resource corresponding to "id". |
reset_additional_experiments | No description. |
reset_autoscaling_algorithm | No description. |
reset_enable_streaming_engine | No description. |
reset_id | No description. |
reset_ip_configuration | No description. |
reset_kms_key_name | No description. |
reset_labels | No description. |
reset_launcher_machine_type | No description. |
reset_machine_type | No description. |
reset_max_workers | No description. |
reset_network | No description. |
reset_num_workers | No description. |
reset_on_delete | No description. |
reset_parameters | No description. |
reset_project | No description. |
reset_region | No description. |
reset_sdk_container_image | No description. |
reset_service_account_email | No description. |
reset_skip_wait_on_job_termination | No description. |
reset_staging_location | No description. |
reset_subnetwork | No description. |
reset_temp_location | No description. |
reset_transform_name_mapping | No description. |
def to_string() -> str
Returns a string representation of this construct.
def add_override(
path: str,
value: typing.Any
) -> None
- Type: str
- Type: typing.Any
def override_logical_id(
new_logical_id: str
) -> None
Overrides the auto-generated logical ID with a specific ID.
- Type: str
The new logical ID to use for this stack element.
def reset_override_logical_id() -> None
Resets a previously passed logical Id to use the auto-generated logical id again.
def to_hcl_terraform() -> typing.Any
def to_metadata() -> typing.Any
def to_terraform() -> typing.Any
Adds this resource to the terraform JSON output.
def add_move_target(
move_target: str
) -> None
Adds a user defined moveTarget string to this resource to be later used in .moveTo(moveTarget) to resolve the location of the move.
- Type: str
The string move target that will correspond to this resource.
def get_any_map_attribute(
terraform_attribute: str
) -> typing.Mapping[typing.Any]
- Type: str
def get_boolean_attribute(
terraform_attribute: str
) -> IResolvable
- Type: str
def get_boolean_map_attribute(
terraform_attribute: str
) -> typing.Mapping[bool]
- Type: str
def get_list_attribute(
terraform_attribute: str
) -> typing.List[str]
- Type: str
def get_number_attribute(
terraform_attribute: str
) -> typing.Union[int, float]
- Type: str
def get_number_list_attribute(
terraform_attribute: str
) -> typing.List[typing.Union[int, float]]
- Type: str
def get_number_map_attribute(
terraform_attribute: str
) -> typing.Mapping[typing.Union[int, float]]
- Type: str
def get_string_attribute(
terraform_attribute: str
) -> str
- Type: str
def get_string_map_attribute(
terraform_attribute: str
) -> typing.Mapping[str]
- Type: str
def has_resource_move() -> typing.Union[TerraformResourceMoveByTarget, TerraformResourceMoveById]
def import_from(
id: str,
provider: TerraformProvider = None
) -> None
- Type: str
- Type: cdktf.TerraformProvider
def interpolation_for_attribute(
terraform_attribute: str
) -> IResolvable
- Type: str
def move_from_id(
id: str
) -> None
Move the resource corresponding to "id" to this resource.
Note that the resource being moved from must be marked as moved using its instance function.
- Type: str
Full id of resource being moved from, e.g. "aws_s3_bucket.example".
def move_to(
move_target: str,
index: typing.Union[str, typing.Union[int, float]] = None
) -> None
Moves this resource to the target resource given by moveTarget.
- Type: str
The previously set user defined string set by .addMoveTarget() corresponding to the resource to move to.
- Type: typing.Union[str, typing.Union[int, float]]
Optional The index corresponding to the key the resource is to appear in the foreach of a resource to move to.
def move_to_id(
id: str
) -> None
Moves this resource to the resource corresponding to "id".
- Type: str
Full id of resource to move to, e.g. "aws_s3_bucket.example".
def reset_additional_experiments() -> None
def reset_autoscaling_algorithm() -> None
def reset_enable_streaming_engine() -> None
def reset_id() -> None
def reset_ip_configuration() -> None
def reset_kms_key_name() -> None
def reset_labels() -> None
def reset_launcher_machine_type() -> None
def reset_machine_type() -> None
def reset_max_workers() -> None
def reset_network() -> None
def reset_num_workers() -> None
def reset_on_delete() -> None
def reset_parameters() -> None
def reset_project() -> None
def reset_region() -> None
def reset_sdk_container_image() -> None
def reset_service_account_email() -> None
def reset_skip_wait_on_job_termination() -> None
def reset_staging_location() -> None
def reset_subnetwork() -> None
def reset_temp_location() -> None
def reset_transform_name_mapping() -> None
Name | Description |
---|---|
is_construct | Checks if x is a construct. |
is_terraform_element | No description. |
is_terraform_resource | No description. |
generate_config_for_import | Generates CDKTF code for importing a GoogleDataflowFlexTemplateJob resource upon running "cdktf plan <stack-name>". |
```python
from cdktf_cdktf_provider_google_beta import google_dataflow_flex_template_job

google_dataflow_flex_template_job.GoogleDataflowFlexTemplateJob.is_construct(
  x: typing.Any
)
```
Checks if `x` is a construct.

Use this method instead of `instanceof` to properly detect `Construct` instances, even when the construct library is symlinked.

Explanation: in JavaScript, multiple copies of the `constructs` library on disk are seen as independent, completely different libraries. As a consequence, the class `Construct` in each copy of the `constructs` library is seen as a different class, and an instance of one class will not test as `instanceof` the other class. `npm install` will not create installations like this, but users may manually symlink construct libraries together or use a monorepo tool: in those cases, multiple copies of the `constructs` library can be accidentally installed, and `instanceof` will behave unpredictably. It is safest to avoid using `instanceof`, and to use this type-testing method instead.
- Type: typing.Any
Any object.
```python
from cdktf_cdktf_provider_google_beta import google_dataflow_flex_template_job

google_dataflow_flex_template_job.GoogleDataflowFlexTemplateJob.is_terraform_element(
  x: typing.Any
)
```
- Type: typing.Any
```python
from cdktf_cdktf_provider_google_beta import google_dataflow_flex_template_job

google_dataflow_flex_template_job.GoogleDataflowFlexTemplateJob.is_terraform_resource(
  x: typing.Any
)
```
- Type: typing.Any
```python
from cdktf_cdktf_provider_google_beta import google_dataflow_flex_template_job

google_dataflow_flex_template_job.GoogleDataflowFlexTemplateJob.generate_config_for_import(
  scope: Construct,
  import_to_id: str,
  import_from_id: str,
  provider: TerraformProvider = None
)
```
Generates CDKTF code for importing a GoogleDataflowFlexTemplateJob resource upon running "cdktf plan <stack-name>".
- Type: constructs.Construct
The scope in which to define this construct.
- Type: str
The construct id used in the generated config for the GoogleDataflowFlexTemplateJob to import.
- Type: str
The id of the existing GoogleDataflowFlexTemplateJob that should be imported.
Refer to the {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#import import section} in the documentation of this resource for the id to use
- Type: cdktf.TerraformProvider
Optional instance of the provider where the GoogleDataflowFlexTemplateJob to import is found.
Name | Type | Description |
---|---|---|
node | constructs.Node | The tree node. |
cdktf_stack | cdktf.TerraformStack | No description. |
fqn | str | No description. |
friendly_unique_id | str | No description. |
terraform_meta_arguments | typing.Mapping[typing.Any] | No description. |
terraform_resource_type | str | No description. |
terraform_generator_metadata | cdktf.TerraformProviderGeneratorMetadata | No description. |
connection | typing.Union[cdktf.SSHProvisionerConnection, cdktf.WinrmProvisionerConnection] | No description. |
count | typing.Union[typing.Union[int, float], cdktf.TerraformCount] | No description. |
depends_on | typing.List[str] | No description. |
for_each | cdktf.ITerraformIterator | No description. |
lifecycle | cdktf.TerraformResourceLifecycle | No description. |
provider | cdktf.TerraformProvider | No description. |
provisioners | typing.List[typing.Union[cdktf.FileProvisioner, cdktf.LocalExecProvisioner, cdktf.RemoteExecProvisioner]] | No description. |
effective_labels | cdktf.StringMap | No description. |
job_id | str | No description. |
state | str | No description. |
terraform_labels | cdktf.StringMap | No description. |
type | str | No description. |
additional_experiments_input | typing.List[str] | No description. |
autoscaling_algorithm_input | str | No description. |
container_spec_gcs_path_input | str | No description. |
enable_streaming_engine_input | typing.Union[bool, cdktf.IResolvable] | No description. |
id_input | str | No description. |
ip_configuration_input | str | No description. |
kms_key_name_input | str | No description. |
labels_input | typing.Mapping[str] | No description. |
launcher_machine_type_input | str | No description. |
machine_type_input | str | No description. |
max_workers_input | typing.Union[int, float] | No description. |
name_input | str | No description. |
network_input | str | No description. |
num_workers_input | typing.Union[int, float] | No description. |
on_delete_input | str | No description. |
parameters_input | typing.Mapping[str] | No description. |
project_input | str | No description. |
region_input | str | No description. |
sdk_container_image_input | str | No description. |
service_account_email_input | str | No description. |
skip_wait_on_job_termination_input | typing.Union[bool, cdktf.IResolvable] | No description. |
staging_location_input | str | No description. |
subnetwork_input | str | No description. |
temp_location_input | str | No description. |
transform_name_mapping_input | typing.Mapping[str] | No description. |
additional_experiments | typing.List[str] | No description. |
autoscaling_algorithm | str | No description. |
container_spec_gcs_path | str | No description. |
enable_streaming_engine | typing.Union[bool, cdktf.IResolvable] | No description. |
id | str | No description. |
ip_configuration | str | No description. |
kms_key_name | str | No description. |
labels | typing.Mapping[str] | No description. |
launcher_machine_type | str | No description. |
machine_type | str | No description. |
max_workers | typing.Union[int, float] | No description. |
name | str | No description. |
network | str | No description. |
num_workers | typing.Union[int, float] | No description. |
on_delete | str | No description. |
parameters | typing.Mapping[str] | No description. |
project | str | No description. |
region | str | No description. |
sdk_container_image | str | No description. |
service_account_email | str | No description. |
skip_wait_on_job_termination | typing.Union[bool, cdktf.IResolvable] | No description. |
staging_location | str | No description. |
subnetwork | str | No description. |
temp_location | str | No description. |
transform_name_mapping | typing.Mapping[str] | No description. |
node: Node
- Type: constructs.Node
The tree node.
cdktf_stack: TerraformStack
- Type: cdktf.TerraformStack
fqn: str
- Type: str
friendly_unique_id: str
- Type: str
terraform_meta_arguments: typing.Mapping[typing.Any]
- Type: typing.Mapping[typing.Any]
terraform_resource_type: str
- Type: str
terraform_generator_metadata: TerraformProviderGeneratorMetadata
- Type: cdktf.TerraformProviderGeneratorMetadata
connection: typing.Union[SSHProvisionerConnection, WinrmProvisionerConnection]
- Type: typing.Union[cdktf.SSHProvisionerConnection, cdktf.WinrmProvisionerConnection]
count: typing.Union[typing.Union[int, float], TerraformCount]
- Type: typing.Union[typing.Union[int, float], cdktf.TerraformCount]
depends_on: typing.List[str]
- Type: typing.List[str]
for_each: ITerraformIterator
- Type: cdktf.ITerraformIterator
lifecycle: TerraformResourceLifecycle
- Type: cdktf.TerraformResourceLifecycle
provider: TerraformProvider
- Type: cdktf.TerraformProvider
provisioners: typing.List[typing.Union[FileProvisioner, LocalExecProvisioner, RemoteExecProvisioner]]
- Type: typing.List[typing.Union[cdktf.FileProvisioner, cdktf.LocalExecProvisioner, cdktf.RemoteExecProvisioner]]
effective_labels: StringMap
- Type: cdktf.StringMap
job_id: str
- Type: str
state: str
- Type: str
terraform_labels: StringMap
- Type: cdktf.StringMap
type: str
- Type: str
additional_experiments_input: typing.List[str]
- Type: typing.List[str]
autoscaling_algorithm_input: str
- Type: str
container_spec_gcs_path_input: str
- Type: str
enable_streaming_engine_input: typing.Union[bool, IResolvable]
- Type: typing.Union[bool, cdktf.IResolvable]
id_input: str
- Type: str
ip_configuration_input: str
- Type: str
kms_key_name_input: str
- Type: str
labels_input: typing.Mapping[str]
- Type: typing.Mapping[str]
launcher_machine_type_input: str
- Type: str
machine_type_input: str
- Type: str
max_workers_input: typing.Union[int, float]
- Type: typing.Union[int, float]
name_input: str
- Type: str
network_input: str
- Type: str
num_workers_input: typing.Union[int, float]
- Type: typing.Union[int, float]
on_delete_input: str
- Type: str
parameters_input: typing.Mapping[str]
- Type: typing.Mapping[str]
project_input: str
- Type: str
region_input: str
- Type: str
sdk_container_image_input: str
- Type: str
service_account_email_input: str
- Type: str
skip_wait_on_job_termination_input: typing.Union[bool, IResolvable]
- Type: typing.Union[bool, cdktf.IResolvable]
staging_location_input: str
- Type: str
subnetwork_input: str
- Type: str
temp_location_input: str
- Type: str
transform_name_mapping_input: typing.Mapping[str]
- Type: typing.Mapping[str]
additional_experiments: typing.List[str]
- Type: typing.List[str]
autoscaling_algorithm: str
- Type: str
container_spec_gcs_path: str
- Type: str
enable_streaming_engine: typing.Union[bool, IResolvable]
- Type: typing.Union[bool, cdktf.IResolvable]
id: str
- Type: str
ip_configuration: str
- Type: str
kms_key_name: str
- Type: str
labels: typing.Mapping[str]
- Type: typing.Mapping[str]
launcher_machine_type: str
- Type: str
machine_type: str
- Type: str
max_workers: typing.Union[int, float]
- Type: typing.Union[int, float]
name: str
- Type: str
network: str
- Type: str
num_workers: typing.Union[int, float]
- Type: typing.Union[int, float]
on_delete: str
- Type: str
parameters: typing.Mapping[str]
- Type: typing.Mapping[str]
project: str
- Type: str
region: str
- Type: str
sdk_container_image: str
- Type: str
service_account_email: str
- Type: str
skip_wait_on_job_termination: typing.Union[bool, IResolvable]
- Type: typing.Union[bool, cdktf.IResolvable]
staging_location: str
- Type: str
subnetwork: str
- Type: str
temp_location: str
- Type: str
transform_name_mapping: typing.Mapping[str]
- Type: typing.Mapping[str]
Name | Type | Description |
---|---|---|
tfResourceType | str | No description. |
tfResourceType: str
- Type: str
```python
from cdktf_cdktf_provider_google_beta import google_dataflow_flex_template_job

google_dataflow_flex_template_job.GoogleDataflowFlexTemplateJobConfig(
  connection: typing.Union[SSHProvisionerConnection, WinrmProvisionerConnection] = None,
  count: typing.Union[typing.Union[int, float], TerraformCount] = None,
  depends_on: typing.List[ITerraformDependable] = None,
  for_each: ITerraformIterator = None,
  lifecycle: TerraformResourceLifecycle = None,
  provider: TerraformProvider = None,
  provisioners: typing.List[typing.Union[FileProvisioner, LocalExecProvisioner, RemoteExecProvisioner]] = None,
  container_spec_gcs_path: str,
  name: str,
  additional_experiments: typing.List[str] = None,
  autoscaling_algorithm: str = None,
  enable_streaming_engine: typing.Union[bool, IResolvable] = None,
  id: str = None,
  ip_configuration: str = None,
  kms_key_name: str = None,
  labels: typing.Mapping[str] = None,
  launcher_machine_type: str = None,
  machine_type: str = None,
  max_workers: typing.Union[int, float] = None,
  network: str = None,
  num_workers: typing.Union[int, float] = None,
  on_delete: str = None,
  parameters: typing.Mapping[str] = None,
  project: str = None,
  region: str = None,
  sdk_container_image: str = None,
  service_account_email: str = None,
  skip_wait_on_job_termination: typing.Union[bool, IResolvable] = None,
  staging_location: str = None,
  subnetwork: str = None,
  temp_location: str = None,
  transform_name_mapping: typing.Mapping[str] = None
)
```
Name | Type | Description |
---|---|---|
connection | typing.Union[cdktf.SSHProvisionerConnection, cdktf.WinrmProvisionerConnection] | No description. |
count | typing.Union[typing.Union[int, float], cdktf.TerraformCount] | No description. |
depends_on | typing.List[cdktf.ITerraformDependable] | No description. |
for_each | cdktf.ITerraformIterator | No description. |
lifecycle | cdktf.TerraformResourceLifecycle | No description. |
provider | cdktf.TerraformProvider | No description. |
provisioners | typing.List[typing.Union[cdktf.FileProvisioner, cdktf.LocalExecProvisioner, cdktf.RemoteExecProvisioner]] | No description. |
container_spec_gcs_path | str | Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#container_spec_gcs_path GoogleDataflowFlexTemplateJob#container_spec_gcs_path}. |
name | str | Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#name GoogleDataflowFlexTemplateJob#name}. |
additional_experiments | typing.List[str] | List of experiments that should be used by the job. An example value is ["enable_stackdriver_agent_metrics"]. |
autoscaling_algorithm | str | The algorithm to use for autoscaling. |
enable_streaming_engine | typing.Union[bool, cdktf.IResolvable] | Indicates if the job should use the streaming engine feature. |
id | str | Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#id GoogleDataflowFlexTemplateJob#id}. |
ip_configuration | str | The configuration for VM IPs. Options are "WORKER_IP_PUBLIC" or "WORKER_IP_PRIVATE". |
kms_key_name | str | The name for the Cloud KMS key for the job. Key format is: projects/PROJECT_ID/locations/LOCATION/keyRings/KEY_RING/cryptoKeys/KEY. |
labels | typing.Mapping[str] | User labels to be specified for the job. |
launcher_machine_type | str | The machine type to use for launching the job. The default is n1-standard-1. |
machine_type | str | The machine type to use for the job. |
max_workers | typing.Union[int, float] | The maximum number of Google Compute Engine instances to be made available to your pipeline during execution, from 1 to 1000. |
network | str | The network to which VMs will be assigned. If it is not provided, "default" will be used. |
num_workers | typing.Union[int, float] | The initial number of Google Compute Engine instances for the job. |
on_delete | str | Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#on_delete GoogleDataflowFlexTemplateJob#on_delete}. |
parameters | typing.Mapping[str] | Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#parameters GoogleDataflowFlexTemplateJob#parameters}. |
project | str | Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#project GoogleDataflowFlexTemplateJob#project}. |
region | str | The region in which the created job should run. |
sdk_container_image | str | Docker registry location of the container image to use for the worker harness. |
service_account_email | str | The Service Account email used to create the job. |
skip_wait_on_job_termination | typing.Union[bool, cdktf.IResolvable] | If true, treat DRAINING and CANCELLING as terminal job states and do not wait for further changes before removing from Terraform state and moving on. |
staging_location | str | The Cloud Storage path to use for staging files. Must be a valid Cloud Storage URL, beginning with gs://. |
subnetwork | str | The subnetwork to which VMs will be assigned. Should be of the form "regions/REGION/subnetworks/SUBNETWORK". |
temp_location | str | The Cloud Storage path to use for temporary files. Must be a valid Cloud Storage URL, beginning with gs://. |
transform_name_mapping | typing.Mapping[str] | Only applicable when updating a pipeline. |
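The properties above can be combined into a complete stack. The following is a minimal sketch, not a definitive implementation: the project ID, region, bucket names, template path, and parameter keys are all placeholder assumptions, and the GoogleBetaProvider import path is assumed to follow the provider package's usual module layout.

```python
from constructs import Construct
from cdktf import App, TerraformStack
# Provider class import path is assumed from the package's standard layout.
from cdktf_cdktf_provider_google_beta.provider import GoogleBetaProvider
from cdktf_cdktf_provider_google_beta import google_dataflow_flex_template_job


class DataflowStack(TerraformStack):
    def __init__(self, scope: Construct, id: str):
        super().__init__(scope, id)

        # Placeholder project/region; substitute your own values.
        GoogleBetaProvider(self, "google-beta",
                           project="my-project", region="us-central1")

        google_dataflow_flex_template_job.GoogleDataflowFlexTemplateJob(
            self,
            "example-job",
            # Required arguments:
            name="example-flex-job",
            container_spec_gcs_path="gs://my-bucket/templates/template.json",
            # Optional arguments shown for illustration:
            region="us-central1",
            temp_location="gs://my-bucket/tmp",
            parameters={"inputTopic": "projects/my-project/topics/in"},
            on_delete="drain",
        )


app = App()
DataflowStack(app, "dataflow")
app.synth()
```

Running `cdktf synth` against a stack like this emits the corresponding `google_dataflow_flex_template_job` resource in the synthesized Terraform configuration.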
connection: typing.Union[SSHProvisionerConnection, WinrmProvisionerConnection]
- Type: typing.Union[cdktf.SSHProvisionerConnection, cdktf.WinrmProvisionerConnection]
count: typing.Union[typing.Union[int, float], TerraformCount]
- Type: typing.Union[typing.Union[int, float], cdktf.TerraformCount]
depends_on: typing.List[ITerraformDependable]
- Type: typing.List[cdktf.ITerraformDependable]
for_each: ITerraformIterator
- Type: cdktf.ITerraformIterator
lifecycle: TerraformResourceLifecycle
- Type: cdktf.TerraformResourceLifecycle
provider: TerraformProvider
- Type: cdktf.TerraformProvider
provisioners: typing.List[typing.Union[FileProvisioner, LocalExecProvisioner, RemoteExecProvisioner]]
- Type: typing.List[typing.Union[cdktf.FileProvisioner, cdktf.LocalExecProvisioner, cdktf.RemoteExecProvisioner]]
container_spec_gcs_path: str
- Type: str
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#container_spec_gcs_path GoogleDataflowFlexTemplateJob#container_spec_gcs_path}.
name: str
- Type: str
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#name GoogleDataflowFlexTemplateJob#name}.
additional_experiments: typing.List[str]
- Type: typing.List[str]
List of experiments that should be used by the job. An example value is ["enable_stackdriver_agent_metrics"].
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#additional_experiments GoogleDataflowFlexTemplateJob#additional_experiments}
autoscaling_algorithm: str
- Type: str
The algorithm to use for autoscaling.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#autoscaling_algorithm GoogleDataflowFlexTemplateJob#autoscaling_algorithm}
enable_streaming_engine: typing.Union[bool, IResolvable]
- Type: typing.Union[bool, cdktf.IResolvable]
Indicates if the job should use the streaming engine feature.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#enable_streaming_engine GoogleDataflowFlexTemplateJob#enable_streaming_engine}
id: str
- Type: str
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#id GoogleDataflowFlexTemplateJob#id}.
Note that the id field is automatically added to all resources in Terraform providers built with a Terraform provider SDK version below 2. If you experience problems setting this value, it might not be settable; consult the provider documentation to confirm whether it should be.
ip_configuration: str
- Type: str
The configuration for VM IPs. Options are "WORKER_IP_PUBLIC" or "WORKER_IP_PRIVATE".
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#ip_configuration GoogleDataflowFlexTemplateJob#ip_configuration}
kms_key_name: str
- Type: str
The name for the Cloud KMS key for the job. Key format is: projects/PROJECT_ID/locations/LOCATION/keyRings/KEY_RING/cryptoKeys/KEY.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#kms_key_name GoogleDataflowFlexTemplateJob#kms_key_name}
labels: typing.Mapping[str]
- Type: typing.Mapping[str]
User labels to be specified for the job.
Keys and values should follow the restrictions specified in the labeling restrictions page. NOTE: This field is non-authoritative, and will only manage the labels present in your configuration. Please refer to the field 'effective_labels' for all of the labels present on the resource.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#labels GoogleDataflowFlexTemplateJob#labels}
launcher_machine_type: str
- Type: str
The machine type to use for launching the job. The default is n1-standard-1.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#launcher_machine_type GoogleDataflowFlexTemplateJob#launcher_machine_type}
machine_type: str
- Type: str
The machine type to use for the job.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#machine_type GoogleDataflowFlexTemplateJob#machine_type}
max_workers: typing.Union[int, float]
- Type: typing.Union[int, float]
The maximum number of Google Compute Engine instances to be made available to your pipeline during execution, from 1 to 1000.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#max_workers GoogleDataflowFlexTemplateJob#max_workers}
network: str
- Type: str
The network to which VMs will be assigned. If it is not provided, "default" will be used.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#network GoogleDataflowFlexTemplateJob#network}
num_workers: typing.Union[int, float]
- Type: typing.Union[int, float]
The initial number of Google Compute Engine instances for the job.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#num_workers GoogleDataflowFlexTemplateJob#num_workers}
on_delete: str
- Type: str
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#on_delete GoogleDataflowFlexTemplateJob#on_delete}.
parameters: typing.Mapping[str]
- Type: typing.Mapping[str]
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#parameters GoogleDataflowFlexTemplateJob#parameters}.
project: str
- Type: str
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#project GoogleDataflowFlexTemplateJob#project}.
region: str
- Type: str
The region in which the created job should run.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#region GoogleDataflowFlexTemplateJob#region}
sdk_container_image: str
- Type: str
Docker registry location of the container image to use for the worker harness.
Default is the container for the version of the SDK. Note this field is only valid for portable pipelines.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#sdk_container_image GoogleDataflowFlexTemplateJob#sdk_container_image}
service_account_email: str
- Type: str
The Service Account email used to create the job.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#service_account_email GoogleDataflowFlexTemplateJob#service_account_email}
skip_wait_on_job_termination: typing.Union[bool, IResolvable]
- Type: typing.Union[bool, cdktf.IResolvable]
If true, treat DRAINING and CANCELLING as terminal job states and do not wait for further changes before removing from terraform state and moving on.
WARNING: this will lead to job name conflicts if you do not ensure that the job names are different, e.g. by embedding a release ID or by using a random_id.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#skip_wait_on_job_termination GoogleDataflowFlexTemplateJob#skip_wait_on_job_termination}
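One way to avoid the name-conflict warning above is to give each deployment a unique job name. The helper below is a hypothetical illustration (the function name and suffix length are my own choices, not part of this API); the resulting string would be passed as the construct's name argument.

```python
import uuid


def unique_job_name(base: str) -> str:
    """Suffix a Dataflow job name with a short random ID so a still-draining
    predecessor (left behind when skip_wait_on_job_termination is true)
    cannot collide with its replacement."""
    return f"{base}-{uuid.uuid4().hex[:8]}"


name = unique_job_name("streaming-job")
```

The provider docs suggest the same effect can be achieved with a release ID or Terraform's random_id resource instead of generating the suffix in Python.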
staging_location: str
- Type: str
The Cloud Storage path to use for staging files. Must be a valid Cloud Storage URL, beginning with gs://.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#staging_location GoogleDataflowFlexTemplateJob#staging_location}
subnetwork: str
- Type: str
The subnetwork to which VMs will be assigned. Should be of the form "regions/REGION/subnetworks/SUBNETWORK".
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#subnetwork GoogleDataflowFlexTemplateJob#subnetwork}
temp_location: str
- Type: str
The Cloud Storage path to use for temporary files. Must be a valid Cloud Storage URL, beginning with gs://.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#temp_location GoogleDataflowFlexTemplateJob#temp_location}
transform_name_mapping: typing.Mapping[str]
- Type: typing.Mapping[str]
Only applicable when updating a pipeline.
Map of transform name prefixes of the job to be replaced with the corresponding name prefixes of the new job.
Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_flex_template_job#transform_name_mapping GoogleDataflowFlexTemplateJob#transform_name_mapping}
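As a sketch of how transform_name_mapping might look in practice, the fragment below (intended to sit inside a TerraformStack, with transform names and paths that are purely illustrative) maps an old transform name prefix to its renamed counterpart during an in-place pipeline update:

```python
google_dataflow_flex_template_job.GoogleDataflowFlexTemplateJob(
    self,
    "streaming-job",
    name="streaming-job",
    container_spec_gcs_path="gs://my-bucket/templates/template-v2.json",
    # Hypothetical mapping: the old job's "ReadInput" transform prefix is
    # carried over to the new job's "ReadFromPubSub" prefix so Dataflow can
    # transfer state across the update.
    transform_name_mapping={"ReadInput": "ReadFromPubSub"},
)
```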