2525 lines (1603 loc) · 113 KB

googleDataflowJob.python.md

googleDataflowJob Submodule

Constructs

GoogleDataflowJob

Represents a {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_job google_dataflow_job}.

Initializers

from cdktf_cdktf_provider_google_beta import google_dataflow_job

googleDataflowJob.GoogleDataflowJob(
  scope: Construct,
  id: str,
  connection: typing.Union[SSHProvisionerConnection, WinrmProvisionerConnection] = None,
  count: typing.Union[typing.Union[int, float], TerraformCount] = None,
  depends_on: typing.List[ITerraformDependable] = None,
  for_each: ITerraformIterator = None,
  lifecycle: TerraformResourceLifecycle = None,
  provider: TerraformProvider = None,
  provisioners: typing.List[typing.Union[FileProvisioner, LocalExecProvisioner, RemoteExecProvisioner]] = None,
  name: str,
  temp_gcs_location: str,
  template_gcs_path: str,
  additional_experiments: typing.List[str] = None,
  enable_streaming_engine: typing.Union[bool, IResolvable] = None,
  id: str = None,
  ip_configuration: str = None,
  kms_key_name: str = None,
  labels: typing.Mapping[str] = None,
  machine_type: str = None,
  max_workers: typing.Union[int, float] = None,
  network: str = None,
  on_delete: str = None,
  parameters: typing.Mapping[str] = None,
  project: str = None,
  region: str = None,
  service_account_email: str = None,
  skip_wait_on_job_termination: typing.Union[bool, IResolvable] = None,
  subnetwork: str = None,
  timeouts: GoogleDataflowJobTimeouts = None,
  transform_name_mapping: typing.Mapping[str] = None,
  zone: str = None
)
Name Type Description
scope constructs.Construct The scope in which to define this construct.
id str The scoped construct ID.
connection typing.Union[cdktf.SSHProvisionerConnection, cdktf.WinrmProvisionerConnection] No description.
count typing.Union[typing.Union[int, float], cdktf.TerraformCount] No description.
depends_on typing.List[cdktf.ITerraformDependable] No description.
for_each cdktf.ITerraformIterator No description.
lifecycle cdktf.TerraformResourceLifecycle No description.
provider cdktf.TerraformProvider No description.
provisioners typing.List[typing.Union[cdktf.FileProvisioner, cdktf.LocalExecProvisioner, cdktf.RemoteExecProvisioner]] No description.
name str A unique name for the resource, required by Dataflow.
temp_gcs_location str A writeable location on Google Cloud Storage for the Dataflow job to dump its temporary data.
template_gcs_path str The Google Cloud Storage path to the Dataflow job template.
additional_experiments typing.List[str] List of experiments that should be used by the job. An example value is ["enable_stackdriver_agent_metrics"].
enable_streaming_engine typing.Union[bool, cdktf.IResolvable] Indicates if the job should use the streaming engine feature.
id str Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_job#id GoogleDataflowJob#id}.
ip_configuration str The configuration for VM IPs. Options are "WORKER_IP_PUBLIC" or "WORKER_IP_PRIVATE".
kms_key_name str The name for the Cloud KMS key for the job. Key format is: projects/PROJECT_ID/locations/LOCATION/keyRings/KEY_RING/cryptoKeys/KEY.
labels typing.Mapping[str] User labels to be specified for the job.
machine_type str The machine type to use for the job.
max_workers typing.Union[int, float] The number of workers permitted to work on the job. More workers may improve processing speed at additional cost.
network str The network to which VMs will be assigned. If it is not provided, "default" will be used.
on_delete str One of "drain" or "cancel". Specifies behavior of deletion during terraform destroy.
parameters typing.Mapping[str] Key/Value pairs to be passed to the Dataflow job (as used in the template).
project str The project in which the resource belongs.
region str The region in which the created job should run.
service_account_email str The Service Account email used to create the job.
skip_wait_on_job_termination typing.Union[bool, cdktf.IResolvable] If true, treat DRAINING and CANCELLING as terminal job states and do not wait for further changes before removing from terraform state and moving on.
subnetwork str The subnetwork to which VMs will be assigned. Should be of the form "regions/REGION/subnetworks/SUBNETWORK".
timeouts GoogleDataflowJobTimeouts timeouts block.
transform_name_mapping typing.Mapping[str] Only applicable when updating a pipeline.
zone str The zone in which the created job should run. If it is not provided, the provider zone is used.
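
The three required arguments above (name, temp_gcs_location, template_gcs_path) are enough to launch a job from a classic template. A minimal sketch follows; the project, region, bucket paths, and stack wiring are illustrative placeholders, not values prescribed by this API:

```python
from constructs import Construct
from cdktf import App, TerraformStack
from cdktf_cdktf_provider_google_beta.provider import GoogleBetaProvider
from cdktf_cdktf_provider_google_beta import google_dataflow_job


class DataflowStack(TerraformStack):
    def __init__(self, scope: Construct, id: str):
        super().__init__(scope, id)
        # Placeholder project/region; substitute your own values.
        GoogleBetaProvider(self, "google_beta", project="my-project", region="us-central1")

        google_dataflow_job.GoogleDataflowJob(
            self, "word_count",
            name="word-count-job",
            # A GCS location the Dataflow workers can write scratch data to.
            temp_gcs_location="gs://my-bucket/tmp",
            # Path to a classic (non-Flex) job template.
            template_gcs_path="gs://dataflow-templates/latest/Word_Count",
            # Template-defined parameters; keys depend on the template used.
            parameters={
                "inputFile": "gs://dataflow-samples/shakespeare/kinglear.txt",
                "output": "gs://my-bucket/output",
            },
            on_delete="cancel",
        )


app = App()
DataflowStack(app, "dataflow-example")
app.synth()
```

Note that in Python code the module is addressed as `google_dataflow_job` (snake_case), even though the generated headings above render it as `googleDataflowJob`.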

scopeRequired
  • Type: constructs.Construct

The scope in which to define this construct.


idRequired
  • Type: str

The scoped construct ID.

Must be unique amongst siblings in the same scope.


connectionOptional
  • Type: typing.Union[cdktf.SSHProvisionerConnection, cdktf.WinrmProvisionerConnection]

countOptional
  • Type: typing.Union[typing.Union[int, float], cdktf.TerraformCount]

depends_onOptional
  • Type: typing.List[cdktf.ITerraformDependable]

for_eachOptional
  • Type: cdktf.ITerraformIterator

lifecycleOptional
  • Type: cdktf.TerraformResourceLifecycle

providerOptional
  • Type: cdktf.TerraformProvider

provisionersOptional
  • Type: typing.List[typing.Union[cdktf.FileProvisioner, cdktf.LocalExecProvisioner, cdktf.RemoteExecProvisioner]]

nameRequired
  • Type: str

A unique name for the resource, required by Dataflow.

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_job#name GoogleDataflowJob#name}


temp_gcs_locationRequired
  • Type: str

A writeable location on Google Cloud Storage for the Dataflow job to dump its temporary data.

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_job#temp_gcs_location GoogleDataflowJob#temp_gcs_location}


template_gcs_pathRequired
  • Type: str

The Google Cloud Storage path to the Dataflow job template.

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_job#template_gcs_path GoogleDataflowJob#template_gcs_path}


additional_experimentsOptional
  • Type: typing.List[str]

List of experiments that should be used by the job. An example value is ["enable_stackdriver_agent_metrics"].

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_job#additional_experiments GoogleDataflowJob#additional_experiments}


enable_streaming_engineOptional
  • Type: typing.Union[bool, cdktf.IResolvable]

Indicates if the job should use the streaming engine feature.

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_job#enable_streaming_engine GoogleDataflowJob#enable_streaming_engine}


idOptional
  • Type: str

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_job#id GoogleDataflowJob#id}.

Please be aware that the id field is automatically added to all resources in Terraform providers using a Terraform provider SDK version below 2. If you experience problems setting this value it might not be settable. Please take a look at the provider documentation to ensure it should be settable.


ip_configurationOptional
  • Type: str

The configuration for VM IPs. Options are "WORKER_IP_PUBLIC" or "WORKER_IP_PRIVATE".

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_job#ip_configuration GoogleDataflowJob#ip_configuration}


kms_key_nameOptional
  • Type: str

The name for the Cloud KMS key for the job. Key format is: projects/PROJECT_ID/locations/LOCATION/keyRings/KEY_RING/cryptoKeys/KEY.

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_job#kms_key_name GoogleDataflowJob#kms_key_name}


labelsOptional
  • Type: typing.Mapping[str]

User labels to be specified for the job.

Keys and values should follow the restrictions specified in the labeling restrictions page. NOTE: This field is non-authoritative, and will only manage the labels present in your configuration. Please refer to the field 'effective_labels' for all of the labels present on the resource.

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_job#labels GoogleDataflowJob#labels}


machine_typeOptional
  • Type: str

The machine type to use for the job.

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_job#machine_type GoogleDataflowJob#machine_type}


max_workersOptional
  • Type: typing.Union[int, float]

The number of workers permitted to work on the job. More workers may improve processing speed at additional cost.

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_job#max_workers GoogleDataflowJob#max_workers}


networkOptional
  • Type: str

The network to which VMs will be assigned. If it is not provided, "default" will be used.

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_job#network GoogleDataflowJob#network}


on_deleteOptional
  • Type: str

One of "drain" or "cancel". Specifies behavior of deletion during terraform destroy.

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_job#on_delete GoogleDataflowJob#on_delete}


parametersOptional
  • Type: typing.Mapping[str]

Key/Value pairs to be passed to the Dataflow job (as used in the template).

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_job#parameters GoogleDataflowJob#parameters}


projectOptional
  • Type: str

The project in which the resource belongs.

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_job#project GoogleDataflowJob#project}


regionOptional
  • Type: str

The region in which the created job should run.

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_job#region GoogleDataflowJob#region}


service_account_emailOptional
  • Type: str

The Service Account email used to create the job.

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_job#service_account_email GoogleDataflowJob#service_account_email}


skip_wait_on_job_terminationOptional
  • Type: typing.Union[bool, cdktf.IResolvable]

If true, treat DRAINING and CANCELLING as terminal job states and do not wait for further changes before removing from terraform state and moving on.

WARNING: this will lead to job name conflicts if you do not ensure that the job names are different, e.g. by embedding a release ID or by using a random_id.

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_job#skip_wait_on_job_termination GoogleDataflowJob#skip_wait_on_job_termination}
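
Per the warning above, one way to avoid name conflicts when this flag is set is to derive each job's name from a release ID or random suffix. The helper below is illustrative, not part of this provider:

```python
import uuid


def unique_job_name(base: str) -> str:
    """Append a short random suffix so successive jobs never share a name."""
    return f"{base}-{uuid.uuid4().hex[:8]}"


# Pass the result as the `name` argument alongside
# skip_wait_on_job_termination=True.
name = unique_job_name("word-count")
```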


subnetworkOptional
  • Type: str

The subnetwork to which VMs will be assigned. Should be of the form "regions/REGION/subnetworks/SUBNETWORK".

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_job#subnetwork GoogleDataflowJob#subnetwork}


timeoutsOptional
  • Type: GoogleDataflowJobTimeouts

timeouts block.

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_job#timeouts GoogleDataflowJob#timeouts}


transform_name_mappingOptional
  • Type: typing.Mapping[str]

Only applicable when updating a pipeline.

Map of transform name prefixes of the job to be replaced with the corresponding name prefixes of the new job.

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_job#transform_name_mapping GoogleDataflowJob#transform_name_mapping}


zoneOptional
  • Type: str

The zone in which the created job should run. If it is not provided, the provider zone is used.

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_job#zone GoogleDataflowJob#zone}


Methods

Name Description
to_string Returns a string representation of this construct.
add_override No description.
override_logical_id Overrides the auto-generated logical ID with a specific ID.
reset_override_logical_id Resets a previously passed logical Id to use the auto-generated logical id again.
to_hcl_terraform No description.
to_metadata No description.
to_terraform Adds this resource to the terraform JSON output.
add_move_target Adds a user defined moveTarget string to this resource to be later used in .moveTo(moveTarget) to resolve the location of the move.
get_any_map_attribute No description.
get_boolean_attribute No description.
get_boolean_map_attribute No description.
get_list_attribute No description.
get_number_attribute No description.
get_number_list_attribute No description.
get_number_map_attribute No description.
get_string_attribute No description.
get_string_map_attribute No description.
has_resource_move No description.
import_from No description.
interpolation_for_attribute No description.
move_from_id Move the resource corresponding to "id" to this resource.
move_to Moves this resource to the target resource given by moveTarget.
move_to_id Moves this resource to the resource corresponding to "id".
put_timeouts No description.
reset_additional_experiments No description.
reset_enable_streaming_engine No description.
reset_id No description.
reset_ip_configuration No description.
reset_kms_key_name No description.
reset_labels No description.
reset_machine_type No description.
reset_max_workers No description.
reset_network No description.
reset_on_delete No description.
reset_parameters No description.
reset_project No description.
reset_region No description.
reset_service_account_email No description.
reset_skip_wait_on_job_termination No description.
reset_subnetwork No description.
reset_timeouts No description.
reset_transform_name_mapping No description.
reset_zone No description.

to_string
def to_string() -> str

Returns a string representation of this construct.

add_override
def add_override(
  path: str,
  value: typing.Any
) -> None
pathRequired
  • Type: str

valueRequired
  • Type: typing.Any

override_logical_id
def override_logical_id(
  new_logical_id: str
) -> None

Overrides the auto-generated logical ID with a specific ID.

new_logical_idRequired
  • Type: str

The new logical ID to use for this stack element.


reset_override_logical_id
def reset_override_logical_id() -> None

Resets a previously passed logical Id to use the auto-generated logical id again.

to_hcl_terraform
def to_hcl_terraform() -> typing.Any
to_metadata
def to_metadata() -> typing.Any
to_terraform
def to_terraform() -> typing.Any

Adds this resource to the terraform JSON output.

add_move_target
def add_move_target(
  move_target: str
) -> None

Adds a user defined moveTarget string to this resource to be later used in .moveTo(moveTarget) to resolve the location of the move.

move_targetRequired
  • Type: str

The string move target that will correspond to this resource.


get_any_map_attribute
def get_any_map_attribute(
  terraform_attribute: str
) -> typing.Mapping[typing.Any]
terraform_attributeRequired
  • Type: str

get_boolean_attribute
def get_boolean_attribute(
  terraform_attribute: str
) -> IResolvable
terraform_attributeRequired
  • Type: str

get_boolean_map_attribute
def get_boolean_map_attribute(
  terraform_attribute: str
) -> typing.Mapping[bool]
terraform_attributeRequired
  • Type: str

get_list_attribute
def get_list_attribute(
  terraform_attribute: str
) -> typing.List[str]
terraform_attributeRequired
  • Type: str

get_number_attribute
def get_number_attribute(
  terraform_attribute: str
) -> typing.Union[int, float]
terraform_attributeRequired
  • Type: str

get_number_list_attribute
def get_number_list_attribute(
  terraform_attribute: str
) -> typing.List[typing.Union[int, float]]
terraform_attributeRequired
  • Type: str

get_number_map_attribute
def get_number_map_attribute(
  terraform_attribute: str
) -> typing.Mapping[typing.Union[int, float]]
terraform_attributeRequired
  • Type: str

get_string_attribute
def get_string_attribute(
  terraform_attribute: str
) -> str
terraform_attributeRequired
  • Type: str

get_string_map_attribute
def get_string_map_attribute(
  terraform_attribute: str
) -> typing.Mapping[str]
terraform_attributeRequired
  • Type: str

has_resource_move
def has_resource_move() -> typing.Union[TerraformResourceMoveByTarget, TerraformResourceMoveById]
import_from
def import_from(
  id: str,
  provider: TerraformProvider = None
) -> None
idRequired
  • Type: str

providerOptional
  • Type: cdktf.TerraformProvider

interpolation_for_attribute
def interpolation_for_attribute(
  terraform_attribute: str
) -> IResolvable
terraform_attributeRequired
  • Type: str

move_from_id
def move_from_id(
  id: str
) -> None

Move the resource corresponding to "id" to this resource.

Note that the resource being moved from must be marked as moved using its instance function.

idRequired
  • Type: str

Full id of resource being moved from, e.g. "aws_s3_bucket.example".


move_to
def move_to(
  move_target: str,
  index: typing.Union[str, typing.Union[int, float]] = None
) -> None

Moves this resource to the target resource given by moveTarget.

move_targetRequired
  • Type: str

The previously set user defined string set by .addMoveTarget() corresponding to the resource to move to.


indexOptional
  • Type: typing.Union[str, typing.Union[int, float]]

The index corresponding to the key under which this resource should appear in the for_each of the resource being moved to.
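
The add_move_target / move_to pair can be sketched as follows, assuming `old_job` and `new_job` are both GoogleDataflowJob instances in the same stack (the target string is arbitrary):

```python
# 1. Mark the destination resource with a user-defined move target.
new_job.add_move_target("dataflow-job-move")

# 2. Tell the source resource to move its state to that target, so
#    `terraform apply` performs a state move instead of destroy/create.
old_job.move_to("dataflow-job-move")
```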


move_to_id
def move_to_id(
  id: str
) -> None

Moves this resource to the resource corresponding to "id".

idRequired
  • Type: str

Full id of resource to move to, e.g. "aws_s3_bucket.example".


put_timeouts
def put_timeouts(
  update: str = None
) -> None
updateOptional
  • Type: str

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_job#update GoogleDataflowJob#update}.
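
put_timeouts accepts the fields of GoogleDataflowJobTimeouts. An illustrative fragment, assuming `job` is an existing GoogleDataflowJob instance:

```python
# Allow Terraform up to 40 minutes for in-place updates of this job
# before timing out (the duration value is illustrative).
job.put_timeouts(update="40m")
```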


reset_additional_experiments
def reset_additional_experiments() -> None
reset_enable_streaming_engine
def reset_enable_streaming_engine() -> None
reset_id
def reset_id() -> None
reset_ip_configuration
def reset_ip_configuration() -> None
reset_kms_key_name
def reset_kms_key_name() -> None
reset_labels
def reset_labels() -> None
reset_machine_type
def reset_machine_type() -> None
reset_max_workers
def reset_max_workers() -> None
reset_network
def reset_network() -> None
reset_on_delete
def reset_on_delete() -> None
reset_parameters
def reset_parameters() -> None
reset_project
def reset_project() -> None
reset_region
def reset_region() -> None
reset_service_account_email
def reset_service_account_email() -> None
reset_skip_wait_on_job_termination
def reset_skip_wait_on_job_termination() -> None
reset_subnetwork
def reset_subnetwork() -> None
reset_timeouts
def reset_timeouts() -> None
reset_transform_name_mapping
def reset_transform_name_mapping() -> None
reset_zone
def reset_zone() -> None

Static Functions

Name Description
is_construct Checks if x is a construct.
is_terraform_element No description.
is_terraform_resource No description.
generate_config_for_import Generates CDKTF code for importing a GoogleDataflowJob resource upon running "cdktf plan <stack-name>".

is_construct
from cdktf_cdktf_provider_google_beta import google_dataflow_job

googleDataflowJob.GoogleDataflowJob.is_construct(
  x: typing.Any
)

Checks if x is a construct.

Use this method instead of instanceof to properly detect Construct instances, even when the construct library is symlinked.

Explanation: in JavaScript, multiple copies of the constructs library on disk are seen as independent, completely different libraries. As a consequence, the class Construct in each copy of the constructs library is seen as a different class, and an instance of one class will not test as instanceof the other class. npm install will not create installations like this, but users may manually symlink construct libraries together or use a monorepo tool: in those cases, multiple copies of the constructs library can be accidentally installed, and instanceof will behave unpredictably. It is safest to avoid using instanceof, and using this type-testing method instead.

xRequired
  • Type: typing.Any

Any object.
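
A brief sketch of the distinction (assumes the cdktf packages are installed):

```python
from cdktf import App
from cdktf_cdktf_provider_google_beta import google_dataflow_job

app = App()

# Any construct in the tree passes the check; plain objects do not.
assert google_dataflow_job.GoogleDataflowJob.is_construct(app)
assert not google_dataflow_job.GoogleDataflowJob.is_construct("not a construct")
```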


is_terraform_element
from cdktf_cdktf_provider_google_beta import google_dataflow_job

googleDataflowJob.GoogleDataflowJob.is_terraform_element(
  x: typing.Any
)
xRequired
  • Type: typing.Any

is_terraform_resource
from cdktf_cdktf_provider_google_beta import google_dataflow_job

googleDataflowJob.GoogleDataflowJob.is_terraform_resource(
  x: typing.Any
)
xRequired
  • Type: typing.Any

generate_config_for_import
from cdktf_cdktf_provider_google_beta import google_dataflow_job

googleDataflowJob.GoogleDataflowJob.generate_config_for_import(
  scope: Construct,
  import_to_id: str,
  import_from_id: str,
  provider: TerraformProvider = None
)

Generates CDKTF code for importing a GoogleDataflowJob resource upon running "cdktf plan <stack-name>".

scopeRequired
  • Type: constructs.Construct

The scope in which to define this construct.


import_to_idRequired
  • Type: str

The construct id used in the generated config for the GoogleDataflowJob to import.


import_from_idRequired
  • Type: str

The id of the existing GoogleDataflowJob that should be imported.

Refer to the {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_job#import import section} in the documentation of this resource for the id to use


providerOptional
  • Type: cdktf.TerraformProvider

Optional instance of the provider where the GoogleDataflowJob to import is found.
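
Sketch of a typical call, made inside a stack's `__init__` (so `self` is the stack). The import id below is a placeholder; consult the resource's import section for the real id format:

```python
from cdktf_cdktf_provider_google_beta import google_dataflow_job

# Emits import configuration for an existing job when "cdktf plan" runs.
# "my-existing-job-id" is a placeholder for the real resource id.
google_dataflow_job.GoogleDataflowJob.generate_config_for_import(
    self, "imported_job", "my-existing-job-id"
)
```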


Properties

Name Type Description
node constructs.Node The tree node.
cdktf_stack cdktf.TerraformStack No description.
fqn str No description.
friendly_unique_id str No description.
terraform_meta_arguments typing.Mapping[typing.Any] No description.
terraform_resource_type str No description.
terraform_generator_metadata cdktf.TerraformProviderGeneratorMetadata No description.
connection typing.Union[cdktf.SSHProvisionerConnection, cdktf.WinrmProvisionerConnection] No description.
count typing.Union[typing.Union[int, float], cdktf.TerraformCount] No description.
depends_on typing.List[str] No description.
for_each cdktf.ITerraformIterator No description.
lifecycle cdktf.TerraformResourceLifecycle No description.
provider cdktf.TerraformProvider No description.
provisioners typing.List[typing.Union[cdktf.FileProvisioner, cdktf.LocalExecProvisioner, cdktf.RemoteExecProvisioner]] No description.
effective_labels cdktf.StringMap No description.
job_id str No description.
state str No description.
terraform_labels cdktf.StringMap No description.
timeouts GoogleDataflowJobTimeoutsOutputReference No description.
type str No description.
additional_experiments_input typing.List[str] No description.
enable_streaming_engine_input typing.Union[bool, cdktf.IResolvable] No description.
id_input str No description.
ip_configuration_input str No description.
kms_key_name_input str No description.
labels_input typing.Mapping[str] No description.
machine_type_input str No description.
max_workers_input typing.Union[int, float] No description.
name_input str No description.
network_input str No description.
on_delete_input str No description.
parameters_input typing.Mapping[str] No description.
project_input str No description.
region_input str No description.
service_account_email_input str No description.
skip_wait_on_job_termination_input typing.Union[bool, cdktf.IResolvable] No description.
subnetwork_input str No description.
temp_gcs_location_input str No description.
template_gcs_path_input str No description.
timeouts_input typing.Union[cdktf.IResolvable, GoogleDataflowJobTimeouts] No description.
transform_name_mapping_input typing.Mapping[str] No description.
zone_input str No description.
additional_experiments typing.List[str] No description.
enable_streaming_engine typing.Union[bool, cdktf.IResolvable] No description.
id str No description.
ip_configuration str No description.
kms_key_name str No description.
labels typing.Mapping[str] No description.
machine_type str No description.
max_workers typing.Union[int, float] No description.
name str No description.
network str No description.
on_delete str No description.
parameters typing.Mapping[str] No description.
project str No description.
region str No description.
service_account_email str No description.
skip_wait_on_job_termination typing.Union[bool, cdktf.IResolvable] No description.
subnetwork str No description.
temp_gcs_location str No description.
template_gcs_path str No description.
transform_name_mapping typing.Mapping[str] No description.
zone str No description.
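
Computed attributes such as job_id and state can be surfaced as stack outputs. A small fragment, assuming `job` is a GoogleDataflowJob in the current stack (`self`):

```python
from cdktf import TerraformOutput

# Server-assigned values become known after `terraform apply`.
TerraformOutput(self, "dataflow_job_id", value=job.job_id)
TerraformOutput(self, "dataflow_job_state", value=job.state)
```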

nodeRequired
node: Node
  • Type: constructs.Node

The tree node.


cdktf_stackRequired
cdktf_stack: TerraformStack
  • Type: cdktf.TerraformStack

fqnRequired
fqn: str
  • Type: str

friendly_unique_idRequired
friendly_unique_id: str
  • Type: str

terraform_meta_argumentsRequired
terraform_meta_arguments: typing.Mapping[typing.Any]
  • Type: typing.Mapping[typing.Any]

terraform_resource_typeRequired
terraform_resource_type: str
  • Type: str

terraform_generator_metadataOptional
terraform_generator_metadata: TerraformProviderGeneratorMetadata
  • Type: cdktf.TerraformProviderGeneratorMetadata

connectionOptional
connection: typing.Union[SSHProvisionerConnection, WinrmProvisionerConnection]
  • Type: typing.Union[cdktf.SSHProvisionerConnection, cdktf.WinrmProvisionerConnection]

countOptional
count: typing.Union[typing.Union[int, float], TerraformCount]
  • Type: typing.Union[typing.Union[int, float], cdktf.TerraformCount]

depends_onOptional
depends_on: typing.List[str]
  • Type: typing.List[str]

for_eachOptional
for_each: ITerraformIterator
  • Type: cdktf.ITerraformIterator

lifecycleOptional
lifecycle: TerraformResourceLifecycle
  • Type: cdktf.TerraformResourceLifecycle

providerOptional
provider: TerraformProvider
  • Type: cdktf.TerraformProvider

provisionersOptional
provisioners: typing.List[typing.Union[FileProvisioner, LocalExecProvisioner, RemoteExecProvisioner]]
  • Type: typing.List[typing.Union[cdktf.FileProvisioner, cdktf.LocalExecProvisioner, cdktf.RemoteExecProvisioner]]

effective_labelsRequired
effective_labels: StringMap
  • Type: cdktf.StringMap

job_idRequired
job_id: str
  • Type: str

stateRequired
state: str
  • Type: str

terraform_labelsRequired
terraform_labels: StringMap
  • Type: cdktf.StringMap

timeoutsRequired
timeouts: GoogleDataflowJobTimeoutsOutputReference
  • Type: GoogleDataflowJobTimeoutsOutputReference

typeRequired
type: str
  • Type: str

additional_experiments_inputOptional
additional_experiments_input: typing.List[str]
  • Type: typing.List[str]

enable_streaming_engine_inputOptional
enable_streaming_engine_input: typing.Union[bool, IResolvable]
  • Type: typing.Union[bool, cdktf.IResolvable]

id_inputOptional
id_input: str
  • Type: str

ip_configuration_inputOptional
ip_configuration_input: str
  • Type: str

kms_key_name_inputOptional
kms_key_name_input: str
  • Type: str

labels_inputOptional
labels_input: typing.Mapping[str]
  • Type: typing.Mapping[str]

machine_type_inputOptional
machine_type_input: str
  • Type: str

max_workers_inputOptional
max_workers_input: typing.Union[int, float]
  • Type: typing.Union[int, float]

name_inputOptional
name_input: str
  • Type: str

network_inputOptional
network_input: str
  • Type: str

on_delete_inputOptional
on_delete_input: str
  • Type: str

parameters_inputOptional
parameters_input: typing.Mapping[str]
  • Type: typing.Mapping[str]

project_inputOptional
project_input: str
  • Type: str

region_inputOptional
region_input: str
  • Type: str

service_account_email_inputOptional
service_account_email_input: str
  • Type: str

skip_wait_on_job_termination_inputOptional
skip_wait_on_job_termination_input: typing.Union[bool, IResolvable]
  • Type: typing.Union[bool, cdktf.IResolvable]

subnetwork_inputOptional
subnetwork_input: str
  • Type: str

temp_gcs_location_inputOptional
temp_gcs_location_input: str
  • Type: str

template_gcs_path_inputOptional
template_gcs_path_input: str
  • Type: str

timeouts_inputOptional
timeouts_input: typing.Union[IResolvable, GoogleDataflowJobTimeouts]
  • Type: typing.Union[cdktf.IResolvable, GoogleDataflowJobTimeouts]

transform_name_mapping_inputOptional
transform_name_mapping_input: typing.Mapping[str]
  • Type: typing.Mapping[str]

zone_inputOptional
zone_input: str
  • Type: str

additional_experimentsRequired
additional_experiments: typing.List[str]
  • Type: typing.List[str]

enable_streaming_engineRequired
enable_streaming_engine: typing.Union[bool, IResolvable]
  • Type: typing.Union[bool, cdktf.IResolvable]

idRequired
id: str
  • Type: str

ip_configurationRequired
ip_configuration: str
  • Type: str

kms_key_nameRequired
kms_key_name: str
  • Type: str

labelsRequired
labels: typing.Mapping[str]
  • Type: typing.Mapping[str]

machine_typeRequired
machine_type: str
  • Type: str

max_workersRequired
max_workers: typing.Union[int, float]
  • Type: typing.Union[int, float]

nameRequired
name: str
  • Type: str

networkRequired
network: str
  • Type: str

on_deleteRequired
on_delete: str
  • Type: str

parametersRequired
parameters: typing.Mapping[str]
  • Type: typing.Mapping[str]

projectRequired
project: str
  • Type: str

regionRequired
region: str
  • Type: str

service_account_emailRequired
service_account_email: str
  • Type: str

skip_wait_on_job_terminationRequired
skip_wait_on_job_termination: typing.Union[bool, IResolvable]
  • Type: typing.Union[bool, cdktf.IResolvable]

subnetworkRequired
subnetwork: str
  • Type: str

temp_gcs_locationRequired
temp_gcs_location: str
  • Type: str

template_gcs_pathRequired
template_gcs_path: str
  • Type: str

transform_name_mappingRequired
transform_name_mapping: typing.Mapping[str]
  • Type: typing.Mapping[str]

zoneRequired
zone: str
  • Type: str

Constants

Name Type Description
tfResourceType str No description.

tfResourceTypeRequired
tfResourceType: str
  • Type: str

Structs

GoogleDataflowJobConfig

Initializer

from cdktf_cdktf_provider_google_beta import google_dataflow_job

googleDataflowJob.GoogleDataflowJobConfig(
  connection: typing.Union[SSHProvisionerConnection, WinrmProvisionerConnection] = None,
  count: typing.Union[typing.Union[int, float], TerraformCount] = None,
  depends_on: typing.List[ITerraformDependable] = None,
  for_each: ITerraformIterator = None,
  lifecycle: TerraformResourceLifecycle = None,
  provider: TerraformProvider = None,
  provisioners: typing.List[typing.Union[FileProvisioner, LocalExecProvisioner, RemoteExecProvisioner]] = None,
  name: str,
  temp_gcs_location: str,
  template_gcs_path: str,
  additional_experiments: typing.List[str] = None,
  enable_streaming_engine: typing.Union[bool, IResolvable] = None,
  id: str = None,
  ip_configuration: str = None,
  kms_key_name: str = None,
  labels: typing.Mapping[str] = None,
  machine_type: str = None,
  max_workers: typing.Union[int, float] = None,
  network: str = None,
  on_delete: str = None,
  parameters: typing.Mapping[str] = None,
  project: str = None,
  region: str = None,
  service_account_email: str = None,
  skip_wait_on_job_termination: typing.Union[bool, IResolvable] = None,
  subnetwork: str = None,
  timeouts: GoogleDataflowJobTimeouts = None,
  transform_name_mapping: typing.Mapping[str] = None,
  zone: str = None
)

Properties

Name Type Description
connection typing.Union[cdktf.SSHProvisionerConnection, cdktf.WinrmProvisionerConnection] No description.
count typing.Union[typing.Union[int, float], cdktf.TerraformCount] No description.
depends_on typing.List[cdktf.ITerraformDependable] No description.
for_each cdktf.ITerraformIterator No description.
lifecycle cdktf.TerraformResourceLifecycle No description.
provider cdktf.TerraformProvider No description.
provisioners typing.List[typing.Union[cdktf.FileProvisioner, cdktf.LocalExecProvisioner, cdktf.RemoteExecProvisioner]] No description.
name str A unique name for the resource, required by Dataflow.
temp_gcs_location str A writeable location on Google Cloud Storage for the Dataflow job to dump its temporary data.
template_gcs_path str The Google Cloud Storage path to the Dataflow job template.
additional_experiments typing.List[str] List of experiments that should be used by the job. An example value is ["enable_stackdriver_agent_metrics"].
enable_streaming_engine typing.Union[bool, cdktf.IResolvable] Indicates if the job should use the streaming engine feature.
id str Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_job#id GoogleDataflowJob#id}.
ip_configuration str The configuration for VM IPs. Options are "WORKER_IP_PUBLIC" or "WORKER_IP_PRIVATE".
kms_key_name str The name for the Cloud KMS key for the job. Key format is: projects/PROJECT_ID/locations/LOCATION/keyRings/KEY_RING/cryptoKeys/KEY.
labels typing.Mapping[str] User labels to be specified for the job.
machine_type str The machine type to use for the job.
max_workers typing.Union[int, float] The number of workers permitted to work on the job. More workers may improve processing speed at additional cost.
network str The network to which VMs will be assigned. If it is not provided, "default" will be used.
on_delete str One of "drain" or "cancel". Specifies behavior of deletion during terraform destroy.
parameters typing.Mapping[str] Key/Value pairs to be passed to the Dataflow job (as used in the template).
project str The project in which the resource belongs.
region str The region in which the created job should run.
service_account_email str The Service Account email used to create the job.
skip_wait_on_job_termination typing.Union[bool, cdktf.IResolvable] If true, treat DRAINING and CANCELLING as terminal job states and do not wait for further changes before removing from terraform state and moving on.
subnetwork str The subnetwork to which VMs will be assigned. Should be of the form "regions/REGION/subnetworks/SUBNETWORK".
timeouts GoogleDataflowJobTimeouts timeouts block.
transform_name_mapping typing.Mapping[str] Only applicable when updating a pipeline.
zone str The zone in which the created job should run. If it is not provided, the provider zone is used.

connectionOptional
connection: typing.Union[SSHProvisionerConnection, WinrmProvisionerConnection]
  • Type: typing.Union[cdktf.SSHProvisionerConnection, cdktf.WinrmProvisionerConnection]

countOptional
count: typing.Union[typing.Union[int, float], TerraformCount]
  • Type: typing.Union[typing.Union[int, float], cdktf.TerraformCount]

depends_onOptional
depends_on: typing.List[ITerraformDependable]
  • Type: typing.List[cdktf.ITerraformDependable]

for_eachOptional
for_each: ITerraformIterator
  • Type: cdktf.ITerraformIterator

lifecycleOptional
lifecycle: TerraformResourceLifecycle
  • Type: cdktf.TerraformResourceLifecycle

providerOptional
provider: TerraformProvider
  • Type: cdktf.TerraformProvider

provisionersOptional
provisioners: typing.List[typing.Union[FileProvisioner, LocalExecProvisioner, RemoteExecProvisioner]]
  • Type: typing.List[typing.Union[cdktf.FileProvisioner, cdktf.LocalExecProvisioner, cdktf.RemoteExecProvisioner]]

nameRequired
name: str
  • Type: str

A unique name for the resource, required by Dataflow.

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_job#name GoogleDataflowJob#name}


temp_gcs_locationRequired
temp_gcs_location: str
  • Type: str

A writeable location on Google Cloud Storage for the Dataflow job to dump its temporary data.

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_job#temp_gcs_location GoogleDataflowJob#temp_gcs_location}


template_gcs_pathRequired
template_gcs_path: str
  • Type: str

The Google Cloud Storage path to the Dataflow job template.

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_job#template_gcs_path GoogleDataflowJob#template_gcs_path}


additional_experimentsOptional
additional_experiments: typing.List[str]
  • Type: typing.List[str]

List of experiments that should be used by the job. An example value is ["enable_stackdriver_agent_metrics"].

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_job#additional_experiments GoogleDataflowJob#additional_experiments}


enable_streaming_engineOptional
enable_streaming_engine: typing.Union[bool, IResolvable]
  • Type: typing.Union[bool, cdktf.IResolvable]

Indicates if the job should use the streaming engine feature.

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_job#enable_streaming_engine GoogleDataflowJob#enable_streaming_engine}


idOptional
id: str
  • Type: str

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_job#id GoogleDataflowJob#id}.

Please be aware that the id field is automatically added to all resources in Terraform providers using a Terraform provider SDK version below 2. If you experience problems setting this value it might not be settable. Please take a look at the provider documentation to ensure it should be settable.


ip_configurationOptional
ip_configuration: str
  • Type: str

The configuration for VM IPs. Options are "WORKER_IP_PUBLIC" or "WORKER_IP_PRIVATE".

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_job#ip_configuration GoogleDataflowJob#ip_configuration}


kms_key_nameOptional
kms_key_name: str
  • Type: str

The name for the Cloud KMS key for the job. Key format is: projects/PROJECT_ID/locations/LOCATION/keyRings/KEY_RING/cryptoKeys/KEY.

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_job#kms_key_name GoogleDataflowJob#kms_key_name}


labelsOptional
labels: typing.Mapping[str]
  • Type: typing.Mapping[str]

User labels to be specified for the job.

Keys and values should follow the restrictions specified in the labeling restrictions page. NOTE: This field is non-authoritative, and will only manage the labels present in your configuration. Please refer to the field 'effective_labels' for all of the labels present on the resource.

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_job#labels GoogleDataflowJob#labels}


machine_typeOptional
machine_type: str
  • Type: str

The machine type to use for the job.

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_job#machine_type GoogleDataflowJob#machine_type}


max_workersOptional
max_workers: typing.Union[int, float]
  • Type: typing.Union[int, float]

The number of workers permitted to work on the job. More workers may improve processing speed at additional cost.

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_job#max_workers GoogleDataflowJob#max_workers}


networkOptional
network: str
  • Type: str

The network to which VMs will be assigned. If it is not provided, "default" will be used.

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_job#network GoogleDataflowJob#network}


on_deleteOptional
on_delete: str
  • Type: str

One of "drain" or "cancel". Specifies behavior of deletion during terraform destroy.

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_job#on_delete GoogleDataflowJob#on_delete}


parametersOptional
parameters: typing.Mapping[str]
  • Type: typing.Mapping[str]

Key/Value pairs to be passed to the Dataflow job (as used in the template).

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_job#parameters GoogleDataflowJob#parameters}


projectOptional
project: str
  • Type: str

The project in which the resource belongs.

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_job#project GoogleDataflowJob#project}


regionOptional
region: str
  • Type: str

The region in which the created job should run.

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_job#region GoogleDataflowJob#region}


service_account_emailOptional
service_account_email: str
  • Type: str

The Service Account email used to create the job.

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_job#service_account_email GoogleDataflowJob#service_account_email}


skip_wait_on_job_terminationOptional
skip_wait_on_job_termination: typing.Union[bool, IResolvable]
  • Type: typing.Union[bool, cdktf.IResolvable]

If true, treat DRAINING and CANCELLING as terminal job states and do not wait for further changes before removing from terraform state and moving on.

WARNING: this will lead to job name conflicts if you do not ensure that the job names are different, e.g. by embedding a release ID or by using a random_id.

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_job#skip_wait_on_job_termination GoogleDataflowJob#skip_wait_on_job_termination}

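The name-conflict warning above can be avoided by making each job name unique on the Python side before passing it to the resource. A minimal sketch (the base name `my-dataflow-job` is a placeholder):

```python
import uuid


def unique_job_name(base: str) -> str:
    # Append a short random suffix so successive deployments never
    # collide on job name when skip_wait_on_job_termination is set.
    return f"{base}-{uuid.uuid4().hex[:8]}"


name = unique_job_name("my-dataflow-job")
```

The resulting `name` can then be passed to the `name` argument of the resource. Terraform-native alternatives such as a `random_id` resource work equally well.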

subnetworkOptional
subnetwork: str
  • Type: str

The subnetwork to which VMs will be assigned. Should be of the form "regions/REGION/subnetworks/SUBNETWORK".

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_job#subnetwork GoogleDataflowJob#subnetwork}


timeoutsOptional
timeouts: GoogleDataflowJobTimeouts
  • Type: GoogleDataflowJobTimeouts
timeouts block.

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_job#timeouts GoogleDataflowJob#timeouts}


transform_name_mappingOptional
transform_name_mapping: typing.Mapping[str]
  • Type: typing.Mapping[str]

Only applicable when updating a pipeline.

Map of transform name prefixes of the job to be replaced with the corresponding name prefixes of the new job.

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_job#transform_name_mapping GoogleDataflowJob#transform_name_mapping}


zoneOptional
zone: str
  • Type: str

The zone in which the created job should run. If it is not provided, the provider zone is used.

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_job#zone GoogleDataflowJob#zone}

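Putting the required and common optional arguments together, a minimal instantiation of the resource might look like the following sketch. The bucket names, template path, and parameter values are placeholders, not values from this documentation:

```python
from constructs import Construct
from cdktf import App, TerraformStack
from cdktf_cdktf_provider_google_beta import google_dataflow_job


class DataflowStack(TerraformStack):
    def __init__(self, scope: Construct, id: str):
        super().__init__(scope, id)
        # All bucket names and the template path below are placeholders.
        google_dataflow_job.GoogleDataflowJob(
            self,
            "example-job",
            name="example-dataflow-job",
            temp_gcs_location="gs://example-bucket/tmp",
            template_gcs_path="gs://dataflow-templates/latest/Word_Count",
            parameters={
                "inputFile": "gs://dataflow-samples/shakespeare/kinglear.txt",
                "output": "gs://example-bucket/output",
            },
            on_delete="drain",
        )


app = App()
DataflowStack(app, "dataflow-example")
app.synth()
```

`cdktf synth` would render this stack to Terraform JSON; the provider block and credentials are assumed to be configured elsewhere in the stack.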

GoogleDataflowJobTimeouts

Initializer

from cdktf_cdktf_provider_google_beta import google_dataflow_job

googleDataflowJob.GoogleDataflowJobTimeouts(
  update: str = None
)

Properties

Name Type Description
update str Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_job#update GoogleDataflowJob#update}.

updateOptional
update: str
  • Type: str

Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.12.0/docs/resources/google_dataflow_job#update GoogleDataflowJob#update}.

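As a sketch, the struct can be used to extend the update timeout for a long-running pipeline update (the `"40m"` duration is an illustrative placeholder):

```python
from cdktf_cdktf_provider_google_beta import google_dataflow_job

# Allow in-place pipeline updates up to 40 minutes before Terraform
# reports a timeout.
timeouts = google_dataflow_job.GoogleDataflowJobTimeouts(
    update="40m"
)
```

The resulting value is passed to the resource via its `timeouts` argument.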

Classes

GoogleDataflowJobTimeoutsOutputReference

Initializers

from cdktf_cdktf_provider_google_beta import google_dataflow_job

googleDataflowJob.GoogleDataflowJobTimeoutsOutputReference(
  terraform_resource: IInterpolatingParent,
  terraform_attribute: str
)
Name Type Description
terraform_resource cdktf.IInterpolatingParent The parent resource.
terraform_attribute str The attribute on the parent resource this class is referencing.

terraform_resourceRequired
  • Type: cdktf.IInterpolatingParent

The parent resource.


terraform_attributeRequired
  • Type: str

The attribute on the parent resource this class is referencing.


Methods

Name Description
compute_fqn No description.
get_any_map_attribute No description.
get_boolean_attribute No description.
get_boolean_map_attribute No description.
get_list_attribute No description.
get_number_attribute No description.
get_number_list_attribute No description.
get_number_map_attribute No description.
get_string_attribute No description.
get_string_map_attribute No description.
interpolation_for_attribute No description.
resolve Produce the Token's value at resolution time.
to_string Return a string representation of this resolvable object.
reset_update No description.

compute_fqn
def compute_fqn() -> str
get_any_map_attribute
def get_any_map_attribute(
  terraform_attribute: str
) -> typing.Mapping[typing.Any]
terraform_attributeRequired
  • Type: str

get_boolean_attribute
def get_boolean_attribute(
  terraform_attribute: str
) -> IResolvable
terraform_attributeRequired
  • Type: str

get_boolean_map_attribute
def get_boolean_map_attribute(
  terraform_attribute: str
) -> typing.Mapping[bool]
terraform_attributeRequired
  • Type: str

get_list_attribute
def get_list_attribute(
  terraform_attribute: str
) -> typing.List[str]
terraform_attributeRequired
  • Type: str

get_number_attribute
def get_number_attribute(
  terraform_attribute: str
) -> typing.Union[int, float]
terraform_attributeRequired
  • Type: str

get_number_list_attribute
def get_number_list_attribute(
  terraform_attribute: str
) -> typing.List[typing.Union[int, float]]
terraform_attributeRequired
  • Type: str

get_number_map_attribute
def get_number_map_attribute(
  terraform_attribute: str
) -> typing.Mapping[typing.Union[int, float]]
terraform_attributeRequired
  • Type: str

get_string_attribute
def get_string_attribute(
  terraform_attribute: str
) -> str
terraform_attributeRequired
  • Type: str

get_string_map_attribute
def get_string_map_attribute(
  terraform_attribute: str
) -> typing.Mapping[str]
terraform_attributeRequired
  • Type: str

interpolation_for_attribute
def interpolation_for_attribute(
  property: str
) -> IResolvable
propertyRequired
  • Type: str

resolve
def resolve(
  _context: IResolveContext
) -> typing.Any

Produce the Token's value at resolution time.

_contextRequired
  • Type: cdktf.IResolveContext

to_string
def to_string() -> str

Return a string representation of this resolvable object.

Returns a reversible string representation.

reset_update
def reset_update() -> None

Properties

Name Type Description
creation_stack typing.List[str] The creation stack of this resolvable which will be appended to errors thrown during resolution.
fqn str No description.
update_input str No description.
update str No description.
internal_value typing.Union[cdktf.IResolvable, GoogleDataflowJobTimeouts] No description.

creation_stackRequired
creation_stack: typing.List[str]
  • Type: typing.List[str]

The creation stack of this resolvable which will be appended to errors thrown during resolution.

If this returns an empty array the stack will not be attached.


fqnRequired
fqn: str
  • Type: str

update_inputOptional
update_input: str
  • Type: str

updateRequired
update: str
  • Type: str

internal_valueOptional
internal_value: typing.Union[IResolvable, GoogleDataflowJobTimeouts]
  • Type: typing.Union[cdktf.IResolvable, GoogleDataflowJobTimeouts]