Documentation ¶
Index ¶
- func GoogleDataprocBatch_GenerateConfigForImport(scope constructs.Construct, importToId *string, importFromId *string, ...) cdktf.ImportableResource
- func GoogleDataprocBatch_IsConstruct(x interface{}) *bool
- func GoogleDataprocBatch_IsTerraformElement(x interface{}) *bool
- func GoogleDataprocBatch_IsTerraformResource(x interface{}) *bool
- func GoogleDataprocBatch_TfResourceType() *string
- func NewGoogleDataprocBatchEnvironmentConfigExecutionConfigOutputReference_Override(g GoogleDataprocBatchEnvironmentConfigExecutionConfigOutputReference, ...)
- func NewGoogleDataprocBatchEnvironmentConfigOutputReference_Override(g GoogleDataprocBatchEnvironmentConfigOutputReference, ...)
- func NewGoogleDataprocBatchEnvironmentConfigPeripheralsConfigOutputReference_Override(g GoogleDataprocBatchEnvironmentConfigPeripheralsConfigOutputReference, ...)
- func NewGoogleDataprocBatchEnvironmentConfigPeripheralsConfigSparkHistoryServerConfigOutputReference_Override(...)
- func NewGoogleDataprocBatchPysparkBatchOutputReference_Override(g GoogleDataprocBatchPysparkBatchOutputReference, ...)
- func NewGoogleDataprocBatchRuntimeConfigOutputReference_Override(g GoogleDataprocBatchRuntimeConfigOutputReference, ...)
- func NewGoogleDataprocBatchRuntimeInfoApproximateUsageList_Override(g GoogleDataprocBatchRuntimeInfoApproximateUsageList, ...)
- func NewGoogleDataprocBatchRuntimeInfoApproximateUsageOutputReference_Override(g GoogleDataprocBatchRuntimeInfoApproximateUsageOutputReference, ...)
- func NewGoogleDataprocBatchRuntimeInfoCurrentUsageList_Override(g GoogleDataprocBatchRuntimeInfoCurrentUsageList, ...)
- func NewGoogleDataprocBatchRuntimeInfoCurrentUsageOutputReference_Override(g GoogleDataprocBatchRuntimeInfoCurrentUsageOutputReference, ...)
- func NewGoogleDataprocBatchRuntimeInfoList_Override(g GoogleDataprocBatchRuntimeInfoList, ...)
- func NewGoogleDataprocBatchRuntimeInfoOutputReference_Override(g GoogleDataprocBatchRuntimeInfoOutputReference, ...)
- func NewGoogleDataprocBatchSparkBatchOutputReference_Override(g GoogleDataprocBatchSparkBatchOutputReference, ...)
- func NewGoogleDataprocBatchSparkRBatchOutputReference_Override(g GoogleDataprocBatchSparkRBatchOutputReference, ...)
- func NewGoogleDataprocBatchSparkSqlBatchOutputReference_Override(g GoogleDataprocBatchSparkSqlBatchOutputReference, ...)
- func NewGoogleDataprocBatchStateHistoryList_Override(g GoogleDataprocBatchStateHistoryList, ...)
- func NewGoogleDataprocBatchStateHistoryOutputReference_Override(g GoogleDataprocBatchStateHistoryOutputReference, ...)
- func NewGoogleDataprocBatchTimeoutsOutputReference_Override(g GoogleDataprocBatchTimeoutsOutputReference, ...)
- func NewGoogleDataprocBatch_Override(g GoogleDataprocBatch, scope constructs.Construct, id *string, ...)
- type GoogleDataprocBatch
- type GoogleDataprocBatchConfig
- type GoogleDataprocBatchEnvironmentConfig
- type GoogleDataprocBatchEnvironmentConfigExecutionConfig
- type GoogleDataprocBatchEnvironmentConfigExecutionConfigOutputReference
- type GoogleDataprocBatchEnvironmentConfigOutputReference
- type GoogleDataprocBatchEnvironmentConfigPeripheralsConfig
- type GoogleDataprocBatchEnvironmentConfigPeripheralsConfigOutputReference
- type GoogleDataprocBatchEnvironmentConfigPeripheralsConfigSparkHistoryServerConfig
- type GoogleDataprocBatchEnvironmentConfigPeripheralsConfigSparkHistoryServerConfigOutputReference
- type GoogleDataprocBatchPysparkBatch
- type GoogleDataprocBatchPysparkBatchOutputReference
- type GoogleDataprocBatchRuntimeConfig
- type GoogleDataprocBatchRuntimeConfigOutputReference
- type GoogleDataprocBatchRuntimeInfo
- type GoogleDataprocBatchRuntimeInfoApproximateUsage
- type GoogleDataprocBatchRuntimeInfoApproximateUsageList
- type GoogleDataprocBatchRuntimeInfoApproximateUsageOutputReference
- type GoogleDataprocBatchRuntimeInfoCurrentUsage
- type GoogleDataprocBatchRuntimeInfoCurrentUsageList
- type GoogleDataprocBatchRuntimeInfoCurrentUsageOutputReference
- type GoogleDataprocBatchRuntimeInfoList
- type GoogleDataprocBatchRuntimeInfoOutputReference
- type GoogleDataprocBatchSparkBatch
- type GoogleDataprocBatchSparkBatchOutputReference
- type GoogleDataprocBatchSparkRBatch
- type GoogleDataprocBatchSparkRBatchOutputReference
- type GoogleDataprocBatchSparkSqlBatch
- type GoogleDataprocBatchSparkSqlBatchOutputReference
- type GoogleDataprocBatchStateHistory
- type GoogleDataprocBatchStateHistoryList
- type GoogleDataprocBatchStateHistoryOutputReference
- type GoogleDataprocBatchTimeouts
- type GoogleDataprocBatchTimeoutsOutputReference
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
func GoogleDataprocBatch_GenerateConfigForImport ¶
func GoogleDataprocBatch_GenerateConfigForImport(scope constructs.Construct, importToId *string, importFromId *string, provider cdktf.TerraformProvider) cdktf.ImportableResource
Generates CDKTF code for importing a GoogleDataprocBatch resource upon running "cdktf plan <stack-name>".
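In a Go CDKTF project this is typically called from inside a stack before planning. A minimal sketch, assuming `batch` aliases your generated `googledataprocbatch` bindings and with an illustrative construct ID and resource name (neither comes from this page):

```go
// Sketch: stage an import of an existing batch; running
// "cdktf plan <stack-name>" then generates the resource configuration.
func addBatchImport(stack cdktf.TerraformStack) cdktf.ImportableResource {
	return batch.GoogleDataprocBatch_GenerateConfigForImport(
		stack,
		jsii.String("imported-batch"), // construct ID to import to (illustrative)
		jsii.String("projects/my-project/locations/us-central1/batches/my-batch"), // existing resource ID (illustrative)
		nil, // provider: nil uses the stack's default provider
	)
}
```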
func GoogleDataprocBatch_IsConstruct ¶
func GoogleDataprocBatch_IsConstruct(x interface{}) *bool
Checks if `x` is a construct.
Use this method instead of `instanceof` to properly detect `Construct` instances, even when the construct library is symlinked.
Explanation: in JavaScript, multiple copies of the `constructs` library on disk are seen as independent, completely different libraries. As a consequence, the class `Construct` in each copy of the `constructs` library is seen as a different class, and an instance of one class will not test as `instanceof` the other class. `npm install` will not create installations like this, but users may manually symlink construct libraries together or use a monorepo tool: in those cases, multiple copies of the `constructs` library can be accidentally installed, and `instanceof` will behave unpredictably. It is safest to avoid using `instanceof`, and using this type-testing method instead.
Returns: true if `x` is an object created from a class which extends `Construct`.
func GoogleDataprocBatch_IsTerraformElement ¶
func GoogleDataprocBatch_IsTerraformElement(x interface{}) *bool
Experimental.
func GoogleDataprocBatch_IsTerraformResource ¶
func GoogleDataprocBatch_IsTerraformResource(x interface{}) *bool
Experimental.
func GoogleDataprocBatch_TfResourceType ¶
func GoogleDataprocBatch_TfResourceType() *string
func NewGoogleDataprocBatchEnvironmentConfigExecutionConfigOutputReference_Override ¶
func NewGoogleDataprocBatchEnvironmentConfigExecutionConfigOutputReference_Override(g GoogleDataprocBatchEnvironmentConfigExecutionConfigOutputReference, terraformResource cdktf.IInterpolatingParent, terraformAttribute *string)
func NewGoogleDataprocBatchEnvironmentConfigOutputReference_Override ¶
func NewGoogleDataprocBatchEnvironmentConfigOutputReference_Override(g GoogleDataprocBatchEnvironmentConfigOutputReference, terraformResource cdktf.IInterpolatingParent, terraformAttribute *string)
func NewGoogleDataprocBatchEnvironmentConfigPeripheralsConfigOutputReference_Override ¶
func NewGoogleDataprocBatchEnvironmentConfigPeripheralsConfigOutputReference_Override(g GoogleDataprocBatchEnvironmentConfigPeripheralsConfigOutputReference, terraformResource cdktf.IInterpolatingParent, terraformAttribute *string)
func NewGoogleDataprocBatchEnvironmentConfigPeripheralsConfigSparkHistoryServerConfigOutputReference_Override ¶
func NewGoogleDataprocBatchEnvironmentConfigPeripheralsConfigSparkHistoryServerConfigOutputReference_Override(g GoogleDataprocBatchEnvironmentConfigPeripheralsConfigSparkHistoryServerConfigOutputReference, terraformResource cdktf.IInterpolatingParent, terraformAttribute *string)
func NewGoogleDataprocBatchPysparkBatchOutputReference_Override ¶
func NewGoogleDataprocBatchPysparkBatchOutputReference_Override(g GoogleDataprocBatchPysparkBatchOutputReference, terraformResource cdktf.IInterpolatingParent, terraformAttribute *string)
func NewGoogleDataprocBatchRuntimeConfigOutputReference_Override ¶
func NewGoogleDataprocBatchRuntimeConfigOutputReference_Override(g GoogleDataprocBatchRuntimeConfigOutputReference, terraformResource cdktf.IInterpolatingParent, terraformAttribute *string)
func NewGoogleDataprocBatchRuntimeInfoApproximateUsageList_Override ¶
func NewGoogleDataprocBatchRuntimeInfoApproximateUsageList_Override(g GoogleDataprocBatchRuntimeInfoApproximateUsageList, terraformResource cdktf.IInterpolatingParent, terraformAttribute *string, wrapsSet *bool)
func NewGoogleDataprocBatchRuntimeInfoApproximateUsageOutputReference_Override ¶
func NewGoogleDataprocBatchRuntimeInfoApproximateUsageOutputReference_Override(g GoogleDataprocBatchRuntimeInfoApproximateUsageOutputReference, terraformResource cdktf.IInterpolatingParent, terraformAttribute *string, complexObjectIndex *float64, complexObjectIsFromSet *bool)
func NewGoogleDataprocBatchRuntimeInfoCurrentUsageList_Override ¶
func NewGoogleDataprocBatchRuntimeInfoCurrentUsageList_Override(g GoogleDataprocBatchRuntimeInfoCurrentUsageList, terraformResource cdktf.IInterpolatingParent, terraformAttribute *string, wrapsSet *bool)
func NewGoogleDataprocBatchRuntimeInfoCurrentUsageOutputReference_Override ¶
func NewGoogleDataprocBatchRuntimeInfoCurrentUsageOutputReference_Override(g GoogleDataprocBatchRuntimeInfoCurrentUsageOutputReference, terraformResource cdktf.IInterpolatingParent, terraformAttribute *string, complexObjectIndex *float64, complexObjectIsFromSet *bool)
func NewGoogleDataprocBatchRuntimeInfoList_Override ¶
func NewGoogleDataprocBatchRuntimeInfoList_Override(g GoogleDataprocBatchRuntimeInfoList, terraformResource cdktf.IInterpolatingParent, terraformAttribute *string, wrapsSet *bool)
func NewGoogleDataprocBatchRuntimeInfoOutputReference_Override ¶
func NewGoogleDataprocBatchRuntimeInfoOutputReference_Override(g GoogleDataprocBatchRuntimeInfoOutputReference, terraformResource cdktf.IInterpolatingParent, terraformAttribute *string, complexObjectIndex *float64, complexObjectIsFromSet *bool)
func NewGoogleDataprocBatchSparkBatchOutputReference_Override ¶
func NewGoogleDataprocBatchSparkBatchOutputReference_Override(g GoogleDataprocBatchSparkBatchOutputReference, terraformResource cdktf.IInterpolatingParent, terraformAttribute *string)
func NewGoogleDataprocBatchSparkRBatchOutputReference_Override ¶
func NewGoogleDataprocBatchSparkRBatchOutputReference_Override(g GoogleDataprocBatchSparkRBatchOutputReference, terraformResource cdktf.IInterpolatingParent, terraformAttribute *string)
func NewGoogleDataprocBatchSparkSqlBatchOutputReference_Override ¶
func NewGoogleDataprocBatchSparkSqlBatchOutputReference_Override(g GoogleDataprocBatchSparkSqlBatchOutputReference, terraformResource cdktf.IInterpolatingParent, terraformAttribute *string)
func NewGoogleDataprocBatchStateHistoryList_Override ¶
func NewGoogleDataprocBatchStateHistoryList_Override(g GoogleDataprocBatchStateHistoryList, terraformResource cdktf.IInterpolatingParent, terraformAttribute *string, wrapsSet *bool)
func NewGoogleDataprocBatchStateHistoryOutputReference_Override ¶
func NewGoogleDataprocBatchStateHistoryOutputReference_Override(g GoogleDataprocBatchStateHistoryOutputReference, terraformResource cdktf.IInterpolatingParent, terraformAttribute *string, complexObjectIndex *float64, complexObjectIsFromSet *bool)
func NewGoogleDataprocBatchTimeoutsOutputReference_Override ¶
func NewGoogleDataprocBatchTimeoutsOutputReference_Override(g GoogleDataprocBatchTimeoutsOutputReference, terraformResource cdktf.IInterpolatingParent, terraformAttribute *string)
func NewGoogleDataprocBatch_Override ¶
func NewGoogleDataprocBatch_Override(g GoogleDataprocBatch, scope constructs.Construct, id *string, config *GoogleDataprocBatchConfig)
Create a new {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.7.0/docs/resources/google_dataproc_batch google_dataproc_batch} Resource.
Types ¶
type GoogleDataprocBatch ¶
type GoogleDataprocBatch interface {
	cdktf.TerraformResource
	BatchId() *string
	SetBatchId(val *string)
	BatchIdInput() *string
	// Experimental.
	CdktfStack() cdktf.TerraformStack
	// Experimental.
	Connection() interface{}
	// Experimental.
	SetConnection(val interface{})
	// Experimental.
	ConstructNodeMetadata() *map[string]interface{}
	// Experimental.
	Count() interface{}
	// Experimental.
	SetCount(val interface{})
	CreateTime() *string
	Creator() *string
	// Experimental.
	DependsOn() *[]*string
	// Experimental.
	SetDependsOn(val *[]*string)
	EffectiveLabels() cdktf.StringMap
	EnvironmentConfig() GoogleDataprocBatchEnvironmentConfigOutputReference
	EnvironmentConfigInput() *GoogleDataprocBatchEnvironmentConfig
	// Experimental.
	ForEach() cdktf.ITerraformIterator
	// Experimental.
	SetForEach(val cdktf.ITerraformIterator)
	// Experimental.
	Fqn() *string
	// Experimental.
	FriendlyUniqueId() *string
	Id() *string
	SetId(val *string)
	IdInput() *string
	Labels() *map[string]*string
	SetLabels(val *map[string]*string)
	LabelsInput() *map[string]*string
	// Experimental.
	Lifecycle() *cdktf.TerraformResourceLifecycle
	// Experimental.
	SetLifecycle(val *cdktf.TerraformResourceLifecycle)
	Location() *string
	SetLocation(val *string)
	LocationInput() *string
	Name() *string
	// The tree node.
	Node() constructs.Node
	Operation() *string
	Project() *string
	SetProject(val *string)
	ProjectInput() *string
	// Experimental.
	Provider() cdktf.TerraformProvider
	// Experimental.
	SetProvider(val cdktf.TerraformProvider)
	// Experimental.
	Provisioners() *[]interface{}
	// Experimental.
	SetProvisioners(val *[]interface{})
	PysparkBatch() GoogleDataprocBatchPysparkBatchOutputReference
	PysparkBatchInput() *GoogleDataprocBatchPysparkBatch
	// Experimental.
	RawOverrides() interface{}
	RuntimeConfig() GoogleDataprocBatchRuntimeConfigOutputReference
	RuntimeConfigInput() *GoogleDataprocBatchRuntimeConfig
	RuntimeInfo() GoogleDataprocBatchRuntimeInfoList
	SparkBatch() GoogleDataprocBatchSparkBatchOutputReference
	SparkBatchInput() *GoogleDataprocBatchSparkBatch
	SparkRBatch() GoogleDataprocBatchSparkRBatchOutputReference
	SparkRBatchInput() *GoogleDataprocBatchSparkRBatch
	SparkSqlBatch() GoogleDataprocBatchSparkSqlBatchOutputReference
	SparkSqlBatchInput() *GoogleDataprocBatchSparkSqlBatch
	State() *string
	StateHistory() GoogleDataprocBatchStateHistoryList
	StateMessage() *string
	StateTime() *string
	// Experimental.
	TerraformGeneratorMetadata() *cdktf.TerraformProviderGeneratorMetadata
	TerraformLabels() cdktf.StringMap
	// Experimental.
	TerraformMetaArguments() *map[string]interface{}
	// Experimental.
	TerraformResourceType() *string
	Timeouts() GoogleDataprocBatchTimeoutsOutputReference
	TimeoutsInput() interface{}
	Uuid() *string
	// Adds a user defined moveTarget string to this resource to be later used in .moveTo(moveTarget) to resolve the location of the move.
	// Experimental.
	AddMoveTarget(moveTarget *string)
	// Experimental.
	AddOverride(path *string, value interface{})
	// Experimental.
	GetAnyMapAttribute(terraformAttribute *string) *map[string]interface{}
	// Experimental.
	GetBooleanAttribute(terraformAttribute *string) cdktf.IResolvable
	// Experimental.
	GetBooleanMapAttribute(terraformAttribute *string) *map[string]*bool
	// Experimental.
	GetListAttribute(terraformAttribute *string) *[]*string
	// Experimental.
	GetNumberAttribute(terraformAttribute *string) *float64
	// Experimental.
	GetNumberListAttribute(terraformAttribute *string) *[]*float64
	// Experimental.
	GetNumberMapAttribute(terraformAttribute *string) *map[string]*float64
	// Experimental.
	GetStringAttribute(terraformAttribute *string) *string
	// Experimental.
	GetStringMapAttribute(terraformAttribute *string) *map[string]*string
	// Experimental.
	HasResourceMove() interface{}
	// Experimental.
	ImportFrom(id *string, provider cdktf.TerraformProvider)
	// Experimental.
	InterpolationForAttribute(terraformAttribute *string) cdktf.IResolvable
	// Move the resource corresponding to "id" to this resource.
	//
	// Note that the resource being moved from must be marked as moved using its instance function.
	// Experimental.
	MoveFromId(id *string)
	// Moves this resource to the target resource given by moveTarget.
	// Experimental.
	MoveTo(moveTarget *string, index interface{})
	// Moves this resource to the resource corresponding to "id".
	// Experimental.
	MoveToId(id *string)
	// Overrides the auto-generated logical ID with a specific ID.
	// Experimental.
	OverrideLogicalId(newLogicalId *string)
	PutEnvironmentConfig(value *GoogleDataprocBatchEnvironmentConfig)
	PutPysparkBatch(value *GoogleDataprocBatchPysparkBatch)
	PutRuntimeConfig(value *GoogleDataprocBatchRuntimeConfig)
	PutSparkBatch(value *GoogleDataprocBatchSparkBatch)
	PutSparkRBatch(value *GoogleDataprocBatchSparkRBatch)
	PutSparkSqlBatch(value *GoogleDataprocBatchSparkSqlBatch)
	PutTimeouts(value *GoogleDataprocBatchTimeouts)
	ResetBatchId()
	ResetEnvironmentConfig()
	ResetId()
	ResetLabels()
	ResetLocation()
	// Resets a previously passed logical Id to use the auto-generated logical id again.
	// Experimental.
	ResetOverrideLogicalId()
	ResetProject()
	ResetPysparkBatch()
	ResetRuntimeConfig()
	ResetSparkBatch()
	ResetSparkRBatch()
	ResetSparkSqlBatch()
	ResetTimeouts()
	SynthesizeAttributes() *map[string]interface{}
	SynthesizeHclAttributes() *map[string]interface{}
	// Experimental.
	ToHclTerraform() interface{}
	// Experimental.
	ToMetadata() interface{}
	// Returns a string representation of this construct.
	ToString() *string
	// Adds this resource to the terraform JSON output.
	// Experimental.
	ToTerraform() interface{}
}
Represents a {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.7.0/docs/resources/google_dataproc_batch google_dataproc_batch}.
func NewGoogleDataprocBatch ¶
func NewGoogleDataprocBatch(scope constructs.Construct, id *string, config *GoogleDataprocBatchConfig) GoogleDataprocBatch
Create a new {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.7.0/docs/resources/google_dataproc_batch google_dataproc_batch} Resource.
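A minimal stack sketch using only fields documented on this page; the provider-bindings import path, stack name, and all configuration values are assumptions to adapt to your project:

```go
package main

import (
	"github.com/aws/jsii-runtime-go"
	"github.com/hashicorp/terraform-cdk-go/cdktf"

	// Assumed import path; point this at your generated google-beta bindings.
	batch "cdk.tf/go/stack/generated/hashicorp/google_beta/googledataprocbatch"
)

func main() {
	app := cdktf.NewApp(nil)
	stack := cdktf.NewTerraformStack(app, jsii.String("dataproc-batch-stack"))

	batch.NewGoogleDataprocBatch(stack, jsii.String("batch"), &batch.GoogleDataprocBatchConfig{
		BatchId:  jsii.String("example-batch"), // 4-63 chars matching /[a-z][0-9]-/
		Location: jsii.String("us-central1"),
		Labels:   &map[string]*string{"env": jsii.String("dev")},
		EnvironmentConfig: &batch.GoogleDataprocBatchEnvironmentConfig{
			ExecutionConfig: &batch.GoogleDataprocBatchEnvironmentConfigExecutionConfig{
				// Terminate the workload unconditionally after one hour.
				Ttl: jsii.String("3600s"),
			},
		},
	})

	app.Synth()
}
```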
type GoogleDataprocBatchConfig ¶
type GoogleDataprocBatchConfig struct {
	// Experimental.
	Connection interface{} `field:"optional" json:"connection" yaml:"connection"`
	// Experimental.
	Count interface{} `field:"optional" json:"count" yaml:"count"`
	// Experimental.
	DependsOn *[]cdktf.ITerraformDependable `field:"optional" json:"dependsOn" yaml:"dependsOn"`
	// Experimental.
	ForEach cdktf.ITerraformIterator `field:"optional" json:"forEach" yaml:"forEach"`
	// Experimental.
	Lifecycle *cdktf.TerraformResourceLifecycle `field:"optional" json:"lifecycle" yaml:"lifecycle"`
	// Experimental.
	Provider cdktf.TerraformProvider `field:"optional" json:"provider" yaml:"provider"`
	// Experimental.
	Provisioners *[]interface{} `field:"optional" json:"provisioners" yaml:"provisioners"`
	// The ID to use for the batch, which will become the final component of the batch's resource name.
	//
	// This value must be 4-63 characters. Valid characters are /[a-z][0-9]-/.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.7.0/docs/resources/google_dataproc_batch#batch_id GoogleDataprocBatch#batch_id}
	BatchId *string `field:"optional" json:"batchId" yaml:"batchId"`
	// environment_config block.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.7.0/docs/resources/google_dataproc_batch#environment_config GoogleDataprocBatch#environment_config}
	EnvironmentConfig *GoogleDataprocBatchEnvironmentConfig `field:"optional" json:"environmentConfig" yaml:"environmentConfig"`
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.7.0/docs/resources/google_dataproc_batch#id GoogleDataprocBatch#id}.
	//
	// Please be aware that the id field is automatically added to all resources in Terraform providers using a Terraform provider SDK version below 2.
	// If you experience problems setting this value it might not be settable.
	// Please take a look at the provider documentation to ensure it should be settable.
	Id *string `field:"optional" json:"id" yaml:"id"`
	// The labels to associate with this batch.
	//
	// **Note**: This field is non-authoritative, and will only manage the labels present in your configuration.
	// Please refer to the field 'effective_labels' for all of the labels present on the resource.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.7.0/docs/resources/google_dataproc_batch#labels GoogleDataprocBatch#labels}
	Labels *map[string]*string `field:"optional" json:"labels" yaml:"labels"`
	// The location in which the batch will be created.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.7.0/docs/resources/google_dataproc_batch#location GoogleDataprocBatch#location}
	Location *string `field:"optional" json:"location" yaml:"location"`
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.7.0/docs/resources/google_dataproc_batch#project GoogleDataprocBatch#project}.
	Project *string `field:"optional" json:"project" yaml:"project"`
	// pyspark_batch block.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.7.0/docs/resources/google_dataproc_batch#pyspark_batch GoogleDataprocBatch#pyspark_batch}
	PysparkBatch *GoogleDataprocBatchPysparkBatch `field:"optional" json:"pysparkBatch" yaml:"pysparkBatch"`
	// runtime_config block.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.7.0/docs/resources/google_dataproc_batch#runtime_config GoogleDataprocBatch#runtime_config}
	RuntimeConfig *GoogleDataprocBatchRuntimeConfig `field:"optional" json:"runtimeConfig" yaml:"runtimeConfig"`
	// spark_batch block.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.7.0/docs/resources/google_dataproc_batch#spark_batch GoogleDataprocBatch#spark_batch}
	SparkBatch *GoogleDataprocBatchSparkBatch `field:"optional" json:"sparkBatch" yaml:"sparkBatch"`
	// spark_r_batch block.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.7.0/docs/resources/google_dataproc_batch#spark_r_batch GoogleDataprocBatch#spark_r_batch}
	SparkRBatch *GoogleDataprocBatchSparkRBatch `field:"optional" json:"sparkRBatch" yaml:"sparkRBatch"`
	// spark_sql_batch block.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.7.0/docs/resources/google_dataproc_batch#spark_sql_batch GoogleDataprocBatch#spark_sql_batch}
	SparkSqlBatch *GoogleDataprocBatchSparkSqlBatch `field:"optional" json:"sparkSqlBatch" yaml:"sparkSqlBatch"`
	// timeouts block.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.7.0/docs/resources/google_dataproc_batch#timeouts GoogleDataprocBatch#timeouts}
	Timeouts *GoogleDataprocBatchTimeouts `field:"optional" json:"timeouts" yaml:"timeouts"`
}
type GoogleDataprocBatchEnvironmentConfig ¶
type GoogleDataprocBatchEnvironmentConfig struct {
	// execution_config block.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.7.0/docs/resources/google_dataproc_batch#execution_config GoogleDataprocBatch#execution_config}
	ExecutionConfig *GoogleDataprocBatchEnvironmentConfigExecutionConfig `field:"optional" json:"executionConfig" yaml:"executionConfig"`
	// peripherals_config block.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.7.0/docs/resources/google_dataproc_batch#peripherals_config GoogleDataprocBatch#peripherals_config}
	PeripheralsConfig *GoogleDataprocBatchEnvironmentConfigPeripheralsConfig `field:"optional" json:"peripheralsConfig" yaml:"peripheralsConfig"`
}
type GoogleDataprocBatchEnvironmentConfigExecutionConfig ¶
type GoogleDataprocBatchEnvironmentConfigExecutionConfig struct {
	// The Cloud KMS key to use for encryption.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.7.0/docs/resources/google_dataproc_batch#kms_key GoogleDataprocBatch#kms_key}
	KmsKey *string `field:"optional" json:"kmsKey" yaml:"kmsKey"`
	// Tags used for network traffic control.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.7.0/docs/resources/google_dataproc_batch#network_tags GoogleDataprocBatch#network_tags}
	NetworkTags *[]*string `field:"optional" json:"networkTags" yaml:"networkTags"`
	// Network configuration for workload execution.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.7.0/docs/resources/google_dataproc_batch#network_uri GoogleDataprocBatch#network_uri}
	NetworkUri *string `field:"optional" json:"networkUri" yaml:"networkUri"`
	// Service account used to execute the workload.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.7.0/docs/resources/google_dataproc_batch#service_account GoogleDataprocBatch#service_account}
	ServiceAccount *string `field:"optional" json:"serviceAccount" yaml:"serviceAccount"`
	// A Cloud Storage bucket used to stage workload dependencies, config files, and store workload output and other ephemeral data, such as Spark history files.
	//
	// If you do not specify a staging bucket,
	// Cloud Dataproc will determine a Cloud Storage location according to the region where your workload is running,
	// and then create and manage project-level, per-location staging and temporary buckets.
	// This field requires a Cloud Storage bucket name, not a gs://... URI to a Cloud Storage bucket.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.7.0/docs/resources/google_dataproc_batch#staging_bucket GoogleDataprocBatch#staging_bucket}
	StagingBucket *string `field:"optional" json:"stagingBucket" yaml:"stagingBucket"`
	// Subnetwork configuration for workload execution.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.7.0/docs/resources/google_dataproc_batch#subnetwork_uri GoogleDataprocBatch#subnetwork_uri}
	SubnetworkUri *string `field:"optional" json:"subnetworkUri" yaml:"subnetworkUri"`
	// The duration after which the workload will be terminated.
	//
	// When the workload exceeds this duration, it will be unconditionally terminated without waiting for ongoing
	// work to finish. If ttl is not specified for a batch workload, the workload will be allowed to run until it
	// exits naturally (or run forever without exiting). If ttl is not specified for an interactive session,
	// it defaults to 24 hours. If ttl is not specified for a batch that uses 2.1+ runtime version, it defaults to 4 hours.
	// Minimum value is 10 minutes; maximum value is 14 days. If both ttl and idleTtl are specified (for an interactive session),
	// the conditions are treated as OR conditions: the workload will be terminated when it has been idle for idleTtl or
	// when ttl has been exceeded, whichever occurs first.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.7.0/docs/resources/google_dataproc_batch#ttl GoogleDataprocBatch#ttl}
	Ttl *string `field:"optional" json:"ttl" yaml:"ttl"`
}
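The staging-bucket and ttl constraints above can be illustrated with a hedged literal; the bucket, service account, and tag values are placeholders, and `batch` is an assumed alias for the generated bindings:

```go
execCfg := &batch.GoogleDataprocBatchEnvironmentConfigExecutionConfig{
	// A bucket name, not a gs://... URI.
	StagingBucket:  jsii.String("my-staging-bucket"),
	ServiceAccount: jsii.String("batch-runner@my-project.iam.gserviceaccount.com"),
	NetworkTags:    jsii.Strings("dataproc", "batch"),
	// Duration string; minimum 10 minutes, maximum 14 days.
	Ttl: jsii.String("600s"),
}
```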
type GoogleDataprocBatchEnvironmentConfigExecutionConfigOutputReference ¶
type GoogleDataprocBatchEnvironmentConfigExecutionConfigOutputReference interface {
	cdktf.ComplexObject
	// The index of the complex object in a list.
	// Experimental.
	ComplexObjectIndex() interface{}
	// Experimental.
	SetComplexObjectIndex(val interface{})
	// Set to true if this item is from inside a set and needs tolist() for accessing it; set to "0" for single list items.
	// Experimental.
	ComplexObjectIsFromSet() *bool
	// Experimental.
	SetComplexObjectIsFromSet(val *bool)
	// The creation stack of this resolvable which will be appended to errors thrown during resolution.
	//
	// If this returns an empty array the stack will not be attached.
	// Experimental.
	CreationStack() *[]*string
	// Experimental.
	Fqn() *string
	InternalValue() *GoogleDataprocBatchEnvironmentConfigExecutionConfig
	SetInternalValue(val *GoogleDataprocBatchEnvironmentConfigExecutionConfig)
	KmsKey() *string
	SetKmsKey(val *string)
	KmsKeyInput() *string
	NetworkTags() *[]*string
	SetNetworkTags(val *[]*string)
	NetworkTagsInput() *[]*string
	NetworkUri() *string
	SetNetworkUri(val *string)
	NetworkUriInput() *string
	ServiceAccount() *string
	SetServiceAccount(val *string)
	ServiceAccountInput() *string
	StagingBucket() *string
	SetStagingBucket(val *string)
	StagingBucketInput() *string
	SubnetworkUri() *string
	SetSubnetworkUri(val *string)
	SubnetworkUriInput() *string
	// Experimental.
	TerraformAttribute() *string
	// Experimental.
	SetTerraformAttribute(val *string)
	// Experimental.
	TerraformResource() cdktf.IInterpolatingParent
	// Experimental.
	SetTerraformResource(val cdktf.IInterpolatingParent)
	Ttl() *string
	SetTtl(val *string)
	TtlInput() *string
	// Experimental.
	ComputeFqn() *string
	// Experimental.
	GetAnyMapAttribute(terraformAttribute *string) *map[string]interface{}
	// Experimental.
	GetBooleanAttribute(terraformAttribute *string) cdktf.IResolvable
	// Experimental.
	GetBooleanMapAttribute(terraformAttribute *string) *map[string]*bool
	// Experimental.
	GetListAttribute(terraformAttribute *string) *[]*string
	// Experimental.
	GetNumberAttribute(terraformAttribute *string) *float64
	// Experimental.
	GetNumberListAttribute(terraformAttribute *string) *[]*float64
	// Experimental.
	GetNumberMapAttribute(terraformAttribute *string) *map[string]*float64
	// Experimental.
	GetStringAttribute(terraformAttribute *string) *string
	// Experimental.
	GetStringMapAttribute(terraformAttribute *string) *map[string]*string
	// Experimental.
	InterpolationAsList() cdktf.IResolvable
	// Experimental.
	InterpolationForAttribute(property *string) cdktf.IResolvable
	ResetKmsKey()
	ResetNetworkTags()
	ResetNetworkUri()
	ResetServiceAccount()
	ResetStagingBucket()
	ResetSubnetworkUri()
	ResetTtl()
	// Produce the Token's value at resolution time.
	// Experimental.
	Resolve(_context cdktf.IResolveContext) interface{}
	// Return a string representation of this resolvable object.
	//
	// Returns a reversible string representation.
	// Experimental.
	ToString() *string
}
func NewGoogleDataprocBatchEnvironmentConfigExecutionConfigOutputReference ¶
func NewGoogleDataprocBatchEnvironmentConfigExecutionConfigOutputReference(terraformResource cdktf.IInterpolatingParent, terraformAttribute *string) GoogleDataprocBatchEnvironmentConfigExecutionConfigOutputReference
type GoogleDataprocBatchEnvironmentConfigOutputReference ¶
type GoogleDataprocBatchEnvironmentConfigOutputReference interface {
	cdktf.ComplexObject
	// The index of the complex object in a list.
	// Experimental.
	ComplexObjectIndex() interface{}
	// Experimental.
	SetComplexObjectIndex(val interface{})
	// Set to true if this item is from inside a set and needs tolist() for accessing it; set to "0" for single list items.
	// Experimental.
	ComplexObjectIsFromSet() *bool
	// Experimental.
	SetComplexObjectIsFromSet(val *bool)
	// The creation stack of this resolvable which will be appended to errors thrown during resolution.
	//
	// If this returns an empty array the stack will not be attached.
	// Experimental.
	CreationStack() *[]*string
	ExecutionConfig() GoogleDataprocBatchEnvironmentConfigExecutionConfigOutputReference
	ExecutionConfigInput() *GoogleDataprocBatchEnvironmentConfigExecutionConfig
	// Experimental.
	Fqn() *string
	InternalValue() *GoogleDataprocBatchEnvironmentConfig
	SetInternalValue(val *GoogleDataprocBatchEnvironmentConfig)
	PeripheralsConfig() GoogleDataprocBatchEnvironmentConfigPeripheralsConfigOutputReference
	PeripheralsConfigInput() *GoogleDataprocBatchEnvironmentConfigPeripheralsConfig
	// Experimental.
	TerraformAttribute() *string
	// Experimental.
	SetTerraformAttribute(val *string)
	// Experimental.
	TerraformResource() cdktf.IInterpolatingParent
	// Experimental.
	SetTerraformResource(val cdktf.IInterpolatingParent)
	// Experimental.
	ComputeFqn() *string
	// Experimental.
	GetAnyMapAttribute(terraformAttribute *string) *map[string]interface{}
	// Experimental.
	GetBooleanAttribute(terraformAttribute *string) cdktf.IResolvable
	// Experimental.
	GetBooleanMapAttribute(terraformAttribute *string) *map[string]*bool
	// Experimental.
	GetListAttribute(terraformAttribute *string) *[]*string
	// Experimental.
	GetNumberAttribute(terraformAttribute *string) *float64
	// Experimental.
	GetNumberListAttribute(terraformAttribute *string) *[]*float64
	// Experimental.
	GetNumberMapAttribute(terraformAttribute *string) *map[string]*float64
	// Experimental.
	GetStringAttribute(terraformAttribute *string) *string
	// Experimental.
	GetStringMapAttribute(terraformAttribute *string) *map[string]*string
	// Experimental.
	InterpolationAsList() cdktf.IResolvable
	// Experimental.
	InterpolationForAttribute(property *string) cdktf.IResolvable
	PutExecutionConfig(value *GoogleDataprocBatchEnvironmentConfigExecutionConfig)
	PutPeripheralsConfig(value *GoogleDataprocBatchEnvironmentConfigPeripheralsConfig)
	ResetExecutionConfig()
	ResetPeripheralsConfig()
	// Produce the Token's value at resolution time.
	// Experimental.
	Resolve(_context cdktf.IResolveContext) interface{}
	// Return a string representation of this resolvable object.
	//
	// Returns a reversible string representation.
	// Experimental.
	ToString() *string
}
func NewGoogleDataprocBatchEnvironmentConfigOutputReference ¶
func NewGoogleDataprocBatchEnvironmentConfigOutputReference(terraformResource cdktf.IInterpolatingParent, terraformAttribute *string) GoogleDataprocBatchEnvironmentConfigOutputReference
type GoogleDataprocBatchEnvironmentConfigPeripheralsConfig ¶
type GoogleDataprocBatchEnvironmentConfigPeripheralsConfig struct {
	// Resource name of an existing Dataproc Metastore service.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.7.0/docs/resources/google_dataproc_batch#metastore_service GoogleDataprocBatch#metastore_service}
	MetastoreService *string `field:"optional" json:"metastoreService" yaml:"metastoreService"`
	// spark_history_server_config block.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.7.0/docs/resources/google_dataproc_batch#spark_history_server_config GoogleDataprocBatch#spark_history_server_config}
	SparkHistoryServerConfig *GoogleDataprocBatchEnvironmentConfigPeripheralsConfigSparkHistoryServerConfig `field:"optional" json:"sparkHistoryServerConfig" yaml:"sparkHistoryServerConfig"`
}
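Every field in these generated structs is a pointer, so Go callers typically wrap literals with `jsii.String` (or a small local helper). The following stand-alone sketch populates this shape using local stand-in types and hypothetical resource names; it mirrors the generated struct's layout but is not the real generated type:

```go
package main

import "fmt"

// Local stand-ins mirroring the generated structs' shapes
// (pointer fields, everything optional). Assumption: these mimic
// GoogleDataprocBatchEnvironmentConfigPeripheralsConfig and its
// nested SparkHistoryServerConfig; they are not the real types.
type sparkHistoryServerConfig struct {
	DataprocCluster *string
}

type peripheralsConfig struct {
	MetastoreService         *string
	SparkHistoryServerConfig *sparkHistoryServerConfig
}

// str mimics jsii.String: the bindings take *string, not string.
func str(s string) *string { return &s }

func main() {
	// Hypothetical resource names, for illustration only.
	cfg := peripheralsConfig{
		MetastoreService: str("projects/my-project/locations/us-central1/services/my-metastore"),
		SparkHistoryServerConfig: &sparkHistoryServerConfig{
			DataprocCluster: str("projects/my-project/regions/us-central1/clusters/my-phs-cluster"),
		},
	}
	fmt.Println(*cfg.MetastoreService)
	fmt.Println(*cfg.SparkHistoryServerConfig.DataprocCluster)
}
```

With the real bindings the same literal style applies, substituting the generated type names and `jsii.String`.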
type GoogleDataprocBatchEnvironmentConfigPeripheralsConfigOutputReference ¶
type GoogleDataprocBatchEnvironmentConfigPeripheralsConfigOutputReference interface {
	cdktf.ComplexObject
	// the index of the complex object in a list.
	// Experimental.
	ComplexObjectIndex() interface{}
	// Experimental.
	SetComplexObjectIndex(val interface{})
	// set to true if this item is from inside a set and needs tolist() for accessing it; set to "0" for single list items.
	// Experimental.
	ComplexObjectIsFromSet() *bool
	// Experimental.
	SetComplexObjectIsFromSet(val *bool)
	// The creation stack of this resolvable which will be appended to errors thrown during resolution.
	//
	// If this returns an empty array the stack will not be attached.
	// Experimental.
	CreationStack() *[]*string
	// Experimental.
	Fqn() *string
	InternalValue() *GoogleDataprocBatchEnvironmentConfigPeripheralsConfig
	SetInternalValue(val *GoogleDataprocBatchEnvironmentConfigPeripheralsConfig)
	MetastoreService() *string
	SetMetastoreService(val *string)
	MetastoreServiceInput() *string
	SparkHistoryServerConfig() GoogleDataprocBatchEnvironmentConfigPeripheralsConfigSparkHistoryServerConfigOutputReference
	SparkHistoryServerConfigInput() *GoogleDataprocBatchEnvironmentConfigPeripheralsConfigSparkHistoryServerConfig
	// Experimental.
	TerraformAttribute() *string
	// Experimental.
	SetTerraformAttribute(val *string)
	// Experimental.
	TerraformResource() cdktf.IInterpolatingParent
	// Experimental.
	SetTerraformResource(val cdktf.IInterpolatingParent)
	// Experimental.
	ComputeFqn() *string
	// Experimental.
	GetAnyMapAttribute(terraformAttribute *string) *map[string]interface{}
	// Experimental.
	GetBooleanAttribute(terraformAttribute *string) cdktf.IResolvable
	// Experimental.
	GetBooleanMapAttribute(terraformAttribute *string) *map[string]*bool
	// Experimental.
	GetListAttribute(terraformAttribute *string) *[]*string
	// Experimental.
	GetNumberAttribute(terraformAttribute *string) *float64
	// Experimental.
	GetNumberListAttribute(terraformAttribute *string) *[]*float64
	// Experimental.
	GetNumberMapAttribute(terraformAttribute *string) *map[string]*float64
	// Experimental.
	GetStringAttribute(terraformAttribute *string) *string
	// Experimental.
	GetStringMapAttribute(terraformAttribute *string) *map[string]*string
	// Experimental.
	InterpolationAsList() cdktf.IResolvable
	// Experimental.
	InterpolationForAttribute(property *string) cdktf.IResolvable
	PutSparkHistoryServerConfig(value *GoogleDataprocBatchEnvironmentConfigPeripheralsConfigSparkHistoryServerConfig)
	ResetMetastoreService()
	ResetSparkHistoryServerConfig()
	// Produce the Token's value at resolution time.
	// Experimental.
	Resolve(_context cdktf.IResolveContext) interface{}
	// Return a string representation of this resolvable object.
	//
	// Returns a reversible string representation.
	// Experimental.
	ToString() *string
}
func NewGoogleDataprocBatchEnvironmentConfigPeripheralsConfigOutputReference ¶
func NewGoogleDataprocBatchEnvironmentConfigPeripheralsConfigOutputReference(terraformResource cdktf.IInterpolatingParent, terraformAttribute *string) GoogleDataprocBatchEnvironmentConfigPeripheralsConfigOutputReference
type GoogleDataprocBatchEnvironmentConfigPeripheralsConfigSparkHistoryServerConfig ¶
type GoogleDataprocBatchEnvironmentConfigPeripheralsConfigSparkHistoryServerConfig struct {
	// Resource name of an existing Dataproc Cluster to act as a Spark History Server for the workload.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.7.0/docs/resources/google_dataproc_batch#dataproc_cluster GoogleDataprocBatch#dataproc_cluster}
	DataprocCluster *string `field:"optional" json:"dataprocCluster" yaml:"dataprocCluster"`
}
type GoogleDataprocBatchEnvironmentConfigPeripheralsConfigSparkHistoryServerConfigOutputReference ¶
type GoogleDataprocBatchEnvironmentConfigPeripheralsConfigSparkHistoryServerConfigOutputReference interface {
	cdktf.ComplexObject
	// the index of the complex object in a list.
	// Experimental.
	ComplexObjectIndex() interface{}
	// Experimental.
	SetComplexObjectIndex(val interface{})
	// set to true if this item is from inside a set and needs tolist() for accessing it; set to "0" for single list items.
	// Experimental.
	ComplexObjectIsFromSet() *bool
	// Experimental.
	SetComplexObjectIsFromSet(val *bool)
	// The creation stack of this resolvable which will be appended to errors thrown during resolution.
	//
	// If this returns an empty array the stack will not be attached.
	// Experimental.
	CreationStack() *[]*string
	DataprocCluster() *string
	SetDataprocCluster(val *string)
	DataprocClusterInput() *string
	// Experimental.
	Fqn() *string
	InternalValue() *GoogleDataprocBatchEnvironmentConfigPeripheralsConfigSparkHistoryServerConfig
	SetInternalValue(val *GoogleDataprocBatchEnvironmentConfigPeripheralsConfigSparkHistoryServerConfig)
	// Experimental.
	TerraformAttribute() *string
	// Experimental.
	SetTerraformAttribute(val *string)
	// Experimental.
	TerraformResource() cdktf.IInterpolatingParent
	// Experimental.
	SetTerraformResource(val cdktf.IInterpolatingParent)
	// Experimental.
	ComputeFqn() *string
	// Experimental.
	GetAnyMapAttribute(terraformAttribute *string) *map[string]interface{}
	// Experimental.
	GetBooleanAttribute(terraformAttribute *string) cdktf.IResolvable
	// Experimental.
	GetBooleanMapAttribute(terraformAttribute *string) *map[string]*bool
	// Experimental.
	GetListAttribute(terraformAttribute *string) *[]*string
	// Experimental.
	GetNumberAttribute(terraformAttribute *string) *float64
	// Experimental.
	GetNumberListAttribute(terraformAttribute *string) *[]*float64
	// Experimental.
	GetNumberMapAttribute(terraformAttribute *string) *map[string]*float64
	// Experimental.
	GetStringAttribute(terraformAttribute *string) *string
	// Experimental.
	GetStringMapAttribute(terraformAttribute *string) *map[string]*string
	// Experimental.
	InterpolationAsList() cdktf.IResolvable
	// Experimental.
	InterpolationForAttribute(property *string) cdktf.IResolvable
	ResetDataprocCluster()
	// Produce the Token's value at resolution time.
	// Experimental.
	Resolve(_context cdktf.IResolveContext) interface{}
	// Return a string representation of this resolvable object.
	//
	// Returns a reversible string representation.
	// Experimental.
	ToString() *string
}
func NewGoogleDataprocBatchEnvironmentConfigPeripheralsConfigSparkHistoryServerConfigOutputReference ¶
func NewGoogleDataprocBatchEnvironmentConfigPeripheralsConfigSparkHistoryServerConfigOutputReference(terraformResource cdktf.IInterpolatingParent, terraformAttribute *string) GoogleDataprocBatchEnvironmentConfigPeripheralsConfigSparkHistoryServerConfigOutputReference
type GoogleDataprocBatchPysparkBatch ¶
type GoogleDataprocBatchPysparkBatch struct {
	// HCFS URIs of archives to be extracted into the working directory of each executor.
	//
	// Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.7.0/docs/resources/google_dataproc_batch#archive_uris GoogleDataprocBatch#archive_uris}
	ArchiveUris *[]*string `field:"optional" json:"archiveUris" yaml:"archiveUris"`
	// The arguments to pass to the driver.
	//
	// Do not include arguments that can be set as batch
	// properties, such as --conf, since a collision can occur that causes an incorrect batch submission.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.7.0/docs/resources/google_dataproc_batch#args GoogleDataprocBatch#args}
	Args *[]*string `field:"optional" json:"args" yaml:"args"`
	// HCFS URIs of files to be placed in the working directory of each executor.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.7.0/docs/resources/google_dataproc_batch#file_uris GoogleDataprocBatch#file_uris}
	FileUris *[]*string `field:"optional" json:"fileUris" yaml:"fileUris"`
	// HCFS URIs of jar files to add to the classpath of the Spark driver and tasks.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.7.0/docs/resources/google_dataproc_batch#jar_file_uris GoogleDataprocBatch#jar_file_uris}
	JarFileUris *[]*string `field:"optional" json:"jarFileUris" yaml:"jarFileUris"`
	// The HCFS URI of the main Python file to use as the Spark driver. Must be a .py file.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.7.0/docs/resources/google_dataproc_batch#main_python_file_uri GoogleDataprocBatch#main_python_file_uri}
	MainPythonFileUri *string `field:"optional" json:"mainPythonFileUri" yaml:"mainPythonFileUri"`
	// HCFS file URIs of Python files to pass to the PySpark framework. Supported file types: .py, .egg, and .zip.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.7.0/docs/resources/google_dataproc_batch#python_file_uris GoogleDataprocBatch#python_file_uris}
	PythonFileUris *[]*string `field:"optional" json:"pythonFileUris" yaml:"pythonFileUris"`
}
type GoogleDataprocBatchPysparkBatchOutputReference ¶
type GoogleDataprocBatchPysparkBatchOutputReference interface {
	cdktf.ComplexObject
	ArchiveUris() *[]*string
	SetArchiveUris(val *[]*string)
	ArchiveUrisInput() *[]*string
	Args() *[]*string
	SetArgs(val *[]*string)
	ArgsInput() *[]*string
	// the index of the complex object in a list.
	// Experimental.
	ComplexObjectIndex() interface{}
	// Experimental.
	SetComplexObjectIndex(val interface{})
	// set to true if this item is from inside a set and needs tolist() for accessing it; set to "0" for single list items.
	// Experimental.
	ComplexObjectIsFromSet() *bool
	// Experimental.
	SetComplexObjectIsFromSet(val *bool)
	// The creation stack of this resolvable which will be appended to errors thrown during resolution.
	//
	// If this returns an empty array the stack will not be attached.
	// Experimental.
	CreationStack() *[]*string
	FileUris() *[]*string
	SetFileUris(val *[]*string)
	FileUrisInput() *[]*string
	// Experimental.
	Fqn() *string
	InternalValue() *GoogleDataprocBatchPysparkBatch
	SetInternalValue(val *GoogleDataprocBatchPysparkBatch)
	JarFileUris() *[]*string
	SetJarFileUris(val *[]*string)
	JarFileUrisInput() *[]*string
	MainPythonFileUri() *string
	SetMainPythonFileUri(val *string)
	MainPythonFileUriInput() *string
	PythonFileUris() *[]*string
	SetPythonFileUris(val *[]*string)
	PythonFileUrisInput() *[]*string
	// Experimental.
	TerraformAttribute() *string
	// Experimental.
	SetTerraformAttribute(val *string)
	// Experimental.
	TerraformResource() cdktf.IInterpolatingParent
	// Experimental.
	SetTerraformResource(val cdktf.IInterpolatingParent)
	// Experimental.
	ComputeFqn() *string
	// Experimental.
	GetAnyMapAttribute(terraformAttribute *string) *map[string]interface{}
	// Experimental.
	GetBooleanAttribute(terraformAttribute *string) cdktf.IResolvable
	// Experimental.
	GetBooleanMapAttribute(terraformAttribute *string) *map[string]*bool
	// Experimental.
	GetListAttribute(terraformAttribute *string) *[]*string
	// Experimental.
	GetNumberAttribute(terraformAttribute *string) *float64
	// Experimental.
	GetNumberListAttribute(terraformAttribute *string) *[]*float64
	// Experimental.
	GetNumberMapAttribute(terraformAttribute *string) *map[string]*float64
	// Experimental.
	GetStringAttribute(terraformAttribute *string) *string
	// Experimental.
	GetStringMapAttribute(terraformAttribute *string) *map[string]*string
	// Experimental.
	InterpolationAsList() cdktf.IResolvable
	// Experimental.
	InterpolationForAttribute(property *string) cdktf.IResolvable
	ResetArchiveUris()
	ResetArgs()
	ResetFileUris()
	ResetJarFileUris()
	ResetMainPythonFileUri()
	ResetPythonFileUris()
	// Produce the Token's value at resolution time.
	// Experimental.
	Resolve(_context cdktf.IResolveContext) interface{}
	// Return a string representation of this resolvable object.
	//
	// Returns a reversible string representation.
	// Experimental.
	ToString() *string
}
func NewGoogleDataprocBatchPysparkBatchOutputReference ¶
func NewGoogleDataprocBatchPysparkBatchOutputReference(terraformResource cdktf.IInterpolatingParent, terraformAttribute *string) GoogleDataprocBatchPysparkBatchOutputReference
type GoogleDataprocBatchRuntimeConfig ¶
type GoogleDataprocBatchRuntimeConfig struct {
	// Optional custom container image for the job runtime environment. If not specified, a default container image will be used.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.7.0/docs/resources/google_dataproc_batch#container_image GoogleDataprocBatch#container_image}
	ContainerImage *string `field:"optional" json:"containerImage" yaml:"containerImage"`
	// A mapping of property names to values, which are used to configure workload execution.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.7.0/docs/resources/google_dataproc_batch#properties GoogleDataprocBatch#properties}
	Properties *map[string]*string `field:"optional" json:"properties" yaml:"properties"`
	// Version of the batch runtime.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.7.0/docs/resources/google_dataproc_batch#version GoogleDataprocBatch#version}
	Version *string `field:"optional" json:"version" yaml:"version"`
}
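The `Properties` field expects the somewhat awkward `*map[string]*string` shape. A small helper that converts an ordinary Go map into that shape keeps call sites readable; this is a stand-alone sketch (the helper name and the Spark property keys are illustrative, not part of the bindings):

```go
package main

import "fmt"

// strMapPtr converts a plain map[string]string into the
// *map[string]*string shape used by fields such as Properties.
func strMapPtr(m map[string]string) *map[string]*string {
	out := make(map[string]*string, len(m))
	for k := range m {
		v := m[k] // copy so each entry gets its own address
		out[k] = &v
	}
	return &out
}

func main() {
	// Ordinary Spark configuration keys, shown for illustration.
	props := strMapPtr(map[string]string{
		"spark.executor.instances": "4",
		"spark.driver.memory":      "4g",
	})
	fmt.Println(*(*props)["spark.executor.instances"])
}
```

The per-iteration copy inside the loop matters: taking the address of the loop variable directly would alias every entry to the same string.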
type GoogleDataprocBatchRuntimeConfigOutputReference ¶
type GoogleDataprocBatchRuntimeConfigOutputReference interface {
	cdktf.ComplexObject
	// the index of the complex object in a list.
	// Experimental.
	ComplexObjectIndex() interface{}
	// Experimental.
	SetComplexObjectIndex(val interface{})
	// set to true if this item is from inside a set and needs tolist() for accessing it; set to "0" for single list items.
	// Experimental.
	ComplexObjectIsFromSet() *bool
	// Experimental.
	SetComplexObjectIsFromSet(val *bool)
	ContainerImage() *string
	SetContainerImage(val *string)
	ContainerImageInput() *string
	// The creation stack of this resolvable which will be appended to errors thrown during resolution.
	//
	// If this returns an empty array the stack will not be attached.
	// Experimental.
	CreationStack() *[]*string
	EffectiveProperties() cdktf.StringMap
	// Experimental.
	Fqn() *string
	InternalValue() *GoogleDataprocBatchRuntimeConfig
	SetInternalValue(val *GoogleDataprocBatchRuntimeConfig)
	Properties() *map[string]*string
	SetProperties(val *map[string]*string)
	PropertiesInput() *map[string]*string
	// Experimental.
	TerraformAttribute() *string
	// Experimental.
	SetTerraformAttribute(val *string)
	// Experimental.
	TerraformResource() cdktf.IInterpolatingParent
	// Experimental.
	SetTerraformResource(val cdktf.IInterpolatingParent)
	Version() *string
	SetVersion(val *string)
	VersionInput() *string
	// Experimental.
	ComputeFqn() *string
	// Experimental.
	GetAnyMapAttribute(terraformAttribute *string) *map[string]interface{}
	// Experimental.
	GetBooleanAttribute(terraformAttribute *string) cdktf.IResolvable
	// Experimental.
	GetBooleanMapAttribute(terraformAttribute *string) *map[string]*bool
	// Experimental.
	GetListAttribute(terraformAttribute *string) *[]*string
	// Experimental.
	GetNumberAttribute(terraformAttribute *string) *float64
	// Experimental.
	GetNumberListAttribute(terraformAttribute *string) *[]*float64
	// Experimental.
	GetNumberMapAttribute(terraformAttribute *string) *map[string]*float64
	// Experimental.
	GetStringAttribute(terraformAttribute *string) *string
	// Experimental.
	GetStringMapAttribute(terraformAttribute *string) *map[string]*string
	// Experimental.
	InterpolationAsList() cdktf.IResolvable
	// Experimental.
	InterpolationForAttribute(property *string) cdktf.IResolvable
	ResetContainerImage()
	ResetProperties()
	ResetVersion()
	// Produce the Token's value at resolution time.
	// Experimental.
	Resolve(_context cdktf.IResolveContext) interface{}
	// Return a string representation of this resolvable object.
	//
	// Returns a reversible string representation.
	// Experimental.
	ToString() *string
}
func NewGoogleDataprocBatchRuntimeConfigOutputReference ¶
func NewGoogleDataprocBatchRuntimeConfigOutputReference(terraformResource cdktf.IInterpolatingParent, terraformAttribute *string) GoogleDataprocBatchRuntimeConfigOutputReference
type GoogleDataprocBatchRuntimeInfo ¶
type GoogleDataprocBatchRuntimeInfo struct { }
type GoogleDataprocBatchRuntimeInfoApproximateUsage ¶
type GoogleDataprocBatchRuntimeInfoApproximateUsage struct { }
type GoogleDataprocBatchRuntimeInfoApproximateUsageList ¶
type GoogleDataprocBatchRuntimeInfoApproximateUsageList interface {
	cdktf.ComplexList
	// The creation stack of this resolvable which will be appended to errors thrown during resolution.
	//
	// If this returns an empty array the stack will not be attached.
	// Experimental.
	CreationStack() *[]*string
	// Experimental.
	Fqn() *string
	// The attribute on the parent resource this class is referencing.
	TerraformAttribute() *string
	SetTerraformAttribute(val *string)
	// The parent resource.
	TerraformResource() cdktf.IInterpolatingParent
	SetTerraformResource(val cdktf.IInterpolatingParent)
	// whether the list is wrapping a set (will add tolist() to be able to access an item via an index).
	WrapsSet() *bool
	SetWrapsSet(val *bool)
	// Creating an iterator for this complex list.
	//
	// The list will be converted into a map with the mapKeyAttributeName as the key.
	// Experimental.
	AllWithMapKey(mapKeyAttributeName *string) cdktf.DynamicListTerraformIterator
	// Experimental.
	ComputeFqn() *string
	Get(index *float64) GoogleDataprocBatchRuntimeInfoApproximateUsageOutputReference
	// Produce the Token's value at resolution time.
	// Experimental.
	Resolve(_context cdktf.IResolveContext) interface{}
	// Return a string representation of this resolvable object.
	//
	// Returns a reversible string representation.
	// Experimental.
	ToString() *string
}
func NewGoogleDataprocBatchRuntimeInfoApproximateUsageList ¶
func NewGoogleDataprocBatchRuntimeInfoApproximateUsageList(terraformResource cdktf.IInterpolatingParent, terraformAttribute *string, wrapsSet *bool) GoogleDataprocBatchRuntimeInfoApproximateUsageList
type GoogleDataprocBatchRuntimeInfoApproximateUsageOutputReference ¶
type GoogleDataprocBatchRuntimeInfoApproximateUsageOutputReference interface {
	cdktf.ComplexObject
	AcceleratorType() *string
	// the index of the complex object in a list.
	// Experimental.
	ComplexObjectIndex() interface{}
	// Experimental.
	SetComplexObjectIndex(val interface{})
	// set to true if this item is from inside a set and needs tolist() for accessing it; set to "0" for single list items.
	// Experimental.
	ComplexObjectIsFromSet() *bool
	// Experimental.
	SetComplexObjectIsFromSet(val *bool)
	// The creation stack of this resolvable which will be appended to errors thrown during resolution.
	//
	// If this returns an empty array the stack will not be attached.
	// Experimental.
	CreationStack() *[]*string
	// Experimental.
	Fqn() *string
	InternalValue() *GoogleDataprocBatchRuntimeInfoApproximateUsage
	SetInternalValue(val *GoogleDataprocBatchRuntimeInfoApproximateUsage)
	MilliAcceleratorSeconds() *string
	MilliDcuSeconds() *string
	ShuffleStorageGbSeconds() *string
	// Experimental.
	TerraformAttribute() *string
	// Experimental.
	SetTerraformAttribute(val *string)
	// Experimental.
	TerraformResource() cdktf.IInterpolatingParent
	// Experimental.
	SetTerraformResource(val cdktf.IInterpolatingParent)
	// Experimental.
	ComputeFqn() *string
	// Experimental.
	GetAnyMapAttribute(terraformAttribute *string) *map[string]interface{}
	// Experimental.
	GetBooleanAttribute(terraformAttribute *string) cdktf.IResolvable
	// Experimental.
	GetBooleanMapAttribute(terraformAttribute *string) *map[string]*bool
	// Experimental.
	GetListAttribute(terraformAttribute *string) *[]*string
	// Experimental.
	GetNumberAttribute(terraformAttribute *string) *float64
	// Experimental.
	GetNumberListAttribute(terraformAttribute *string) *[]*float64
	// Experimental.
	GetNumberMapAttribute(terraformAttribute *string) *map[string]*float64
	// Experimental.
	GetStringAttribute(terraformAttribute *string) *string
	// Experimental.
	GetStringMapAttribute(terraformAttribute *string) *map[string]*string
	// Experimental.
	InterpolationAsList() cdktf.IResolvable
	// Experimental.
	InterpolationForAttribute(property *string) cdktf.IResolvable
	// Produce the Token's value at resolution time.
	// Experimental.
	Resolve(_context cdktf.IResolveContext) interface{}
	// Return a string representation of this resolvable object.
	//
	// Returns a reversible string representation.
	// Experimental.
	ToString() *string
}
func NewGoogleDataprocBatchRuntimeInfoApproximateUsageOutputReference ¶
func NewGoogleDataprocBatchRuntimeInfoApproximateUsageOutputReference(terraformResource cdktf.IInterpolatingParent, terraformAttribute *string, complexObjectIndex *float64, complexObjectIsFromSet *bool) GoogleDataprocBatchRuntimeInfoApproximateUsageOutputReference
type GoogleDataprocBatchRuntimeInfoCurrentUsage ¶
type GoogleDataprocBatchRuntimeInfoCurrentUsage struct { }
type GoogleDataprocBatchRuntimeInfoCurrentUsageList ¶
type GoogleDataprocBatchRuntimeInfoCurrentUsageList interface {
	cdktf.ComplexList
	// The creation stack of this resolvable which will be appended to errors thrown during resolution.
	//
	// If this returns an empty array the stack will not be attached.
	// Experimental.
	CreationStack() *[]*string
	// Experimental.
	Fqn() *string
	// The attribute on the parent resource this class is referencing.
	TerraformAttribute() *string
	SetTerraformAttribute(val *string)
	// The parent resource.
	TerraformResource() cdktf.IInterpolatingParent
	SetTerraformResource(val cdktf.IInterpolatingParent)
	// whether the list is wrapping a set (will add tolist() to be able to access an item via an index).
	WrapsSet() *bool
	SetWrapsSet(val *bool)
	// Creating an iterator for this complex list.
	//
	// The list will be converted into a map with the mapKeyAttributeName as the key.
	// Experimental.
	AllWithMapKey(mapKeyAttributeName *string) cdktf.DynamicListTerraformIterator
	// Experimental.
	ComputeFqn() *string
	Get(index *float64) GoogleDataprocBatchRuntimeInfoCurrentUsageOutputReference
	// Produce the Token's value at resolution time.
	// Experimental.
	Resolve(_context cdktf.IResolveContext) interface{}
	// Return a string representation of this resolvable object.
	//
	// Returns a reversible string representation.
	// Experimental.
	ToString() *string
}
func NewGoogleDataprocBatchRuntimeInfoCurrentUsageList ¶
func NewGoogleDataprocBatchRuntimeInfoCurrentUsageList(terraformResource cdktf.IInterpolatingParent, terraformAttribute *string, wrapsSet *bool) GoogleDataprocBatchRuntimeInfoCurrentUsageList
type GoogleDataprocBatchRuntimeInfoCurrentUsageOutputReference ¶
type GoogleDataprocBatchRuntimeInfoCurrentUsageOutputReference interface {
	cdktf.ComplexObject
	AcceleratorType() *string
	// the index of the complex object in a list.
	// Experimental.
	ComplexObjectIndex() interface{}
	// Experimental.
	SetComplexObjectIndex(val interface{})
	// set to true if this item is from inside a set and needs tolist() for accessing it; set to "0" for single list items.
	// Experimental.
	ComplexObjectIsFromSet() *bool
	// Experimental.
	SetComplexObjectIsFromSet(val *bool)
	// The creation stack of this resolvable which will be appended to errors thrown during resolution.
	//
	// If this returns an empty array the stack will not be attached.
	// Experimental.
	CreationStack() *[]*string
	// Experimental.
	Fqn() *string
	InternalValue() *GoogleDataprocBatchRuntimeInfoCurrentUsage
	SetInternalValue(val *GoogleDataprocBatchRuntimeInfoCurrentUsage)
	MilliAccelerator() *string
	MilliDcu() *string
	MilliDcuPremium() *string
	ShuffleStorageGb() *string
	ShuffleStorageGbPremium() *string
	SnapshotTime() *string
	// Experimental.
	TerraformAttribute() *string
	// Experimental.
	SetTerraformAttribute(val *string)
	// Experimental.
	TerraformResource() cdktf.IInterpolatingParent
	// Experimental.
	SetTerraformResource(val cdktf.IInterpolatingParent)
	// Experimental.
	ComputeFqn() *string
	// Experimental.
	GetAnyMapAttribute(terraformAttribute *string) *map[string]interface{}
	// Experimental.
	GetBooleanAttribute(terraformAttribute *string) cdktf.IResolvable
	// Experimental.
	GetBooleanMapAttribute(terraformAttribute *string) *map[string]*bool
	// Experimental.
	GetListAttribute(terraformAttribute *string) *[]*string
	// Experimental.
	GetNumberAttribute(terraformAttribute *string) *float64
	// Experimental.
	GetNumberListAttribute(terraformAttribute *string) *[]*float64
	// Experimental.
	GetNumberMapAttribute(terraformAttribute *string) *map[string]*float64
	// Experimental.
	GetStringAttribute(terraformAttribute *string) *string
	// Experimental.
	GetStringMapAttribute(terraformAttribute *string) *map[string]*string
	// Experimental.
	InterpolationAsList() cdktf.IResolvable
	// Experimental.
	InterpolationForAttribute(property *string) cdktf.IResolvable
	// Produce the Token's value at resolution time.
	// Experimental.
	Resolve(_context cdktf.IResolveContext) interface{}
	// Return a string representation of this resolvable object.
	//
	// Returns a reversible string representation.
	// Experimental.
	ToString() *string
}
func NewGoogleDataprocBatchRuntimeInfoCurrentUsageOutputReference ¶
func NewGoogleDataprocBatchRuntimeInfoCurrentUsageOutputReference(terraformResource cdktf.IInterpolatingParent, terraformAttribute *string, complexObjectIndex *float64, complexObjectIsFromSet *bool) GoogleDataprocBatchRuntimeInfoCurrentUsageOutputReference
type GoogleDataprocBatchRuntimeInfoList ¶
type GoogleDataprocBatchRuntimeInfoList interface {
	cdktf.ComplexList
	// The creation stack of this resolvable which will be appended to errors thrown during resolution.
	//
	// If this returns an empty array the stack will not be attached.
	// Experimental.
	CreationStack() *[]*string
	// Experimental.
	Fqn() *string
	// The attribute on the parent resource this class is referencing.
	TerraformAttribute() *string
	SetTerraformAttribute(val *string)
	// The parent resource.
	TerraformResource() cdktf.IInterpolatingParent
	SetTerraformResource(val cdktf.IInterpolatingParent)
	// whether the list is wrapping a set (will add tolist() to be able to access an item via an index).
	WrapsSet() *bool
	SetWrapsSet(val *bool)
	// Creating an iterator for this complex list.
	//
	// The list will be converted into a map with the mapKeyAttributeName as the key.
	// Experimental.
	AllWithMapKey(mapKeyAttributeName *string) cdktf.DynamicListTerraformIterator
	// Experimental.
	ComputeFqn() *string
	Get(index *float64) GoogleDataprocBatchRuntimeInfoOutputReference
	// Produce the Token's value at resolution time.
	// Experimental.
	Resolve(_context cdktf.IResolveContext) interface{}
	// Return a string representation of this resolvable object.
	//
	// Returns a reversible string representation.
	// Experimental.
	ToString() *string
}
func NewGoogleDataprocBatchRuntimeInfoList ¶
func NewGoogleDataprocBatchRuntimeInfoList(terraformResource cdktf.IInterpolatingParent, terraformAttribute *string, wrapsSet *bool) GoogleDataprocBatchRuntimeInfoList
type GoogleDataprocBatchRuntimeInfoOutputReference ¶
type GoogleDataprocBatchRuntimeInfoOutputReference interface {
	cdktf.ComplexObject
	ApproximateUsage() GoogleDataprocBatchRuntimeInfoApproximateUsageList
	// the index of the complex object in a list.
	// Experimental.
	ComplexObjectIndex() interface{}
	// Experimental.
	SetComplexObjectIndex(val interface{})
	// set to true if this item is from inside a set and needs tolist() for accessing it; set to "0" for single list items.
	// Experimental.
	ComplexObjectIsFromSet() *bool
	// Experimental.
	SetComplexObjectIsFromSet(val *bool)
	// The creation stack of this resolvable which will be appended to errors thrown during resolution.
	//
	// If this returns an empty array the stack will not be attached.
	// Experimental.
	CreationStack() *[]*string
	CurrentUsage() GoogleDataprocBatchRuntimeInfoCurrentUsageList
	DiagnosticOutputUri() *string
	Endpoints() cdktf.StringMap
	// Experimental.
	Fqn() *string
	InternalValue() *GoogleDataprocBatchRuntimeInfo
	SetInternalValue(val *GoogleDataprocBatchRuntimeInfo)
	OutputUri() *string
	// Experimental.
	TerraformAttribute() *string
	// Experimental.
	SetTerraformAttribute(val *string)
	// Experimental.
	TerraformResource() cdktf.IInterpolatingParent
	// Experimental.
	SetTerraformResource(val cdktf.IInterpolatingParent)
	// Experimental.
	ComputeFqn() *string
	// Experimental.
	GetAnyMapAttribute(terraformAttribute *string) *map[string]interface{}
	// Experimental.
	GetBooleanAttribute(terraformAttribute *string) cdktf.IResolvable
	// Experimental.
	GetBooleanMapAttribute(terraformAttribute *string) *map[string]*bool
	// Experimental.
	GetListAttribute(terraformAttribute *string) *[]*string
	// Experimental.
	GetNumberAttribute(terraformAttribute *string) *float64
	// Experimental.
	GetNumberListAttribute(terraformAttribute *string) *[]*float64
	// Experimental.
	GetNumberMapAttribute(terraformAttribute *string) *map[string]*float64
	// Experimental.
	GetStringAttribute(terraformAttribute *string) *string
	// Experimental.
	GetStringMapAttribute(terraformAttribute *string) *map[string]*string
	// Experimental.
	InterpolationAsList() cdktf.IResolvable
	// Experimental.
	InterpolationForAttribute(property *string) cdktf.IResolvable
	// Produce the Token's value at resolution time.
	// Experimental.
	Resolve(_context cdktf.IResolveContext) interface{}
	// Return a string representation of this resolvable object.
	//
	// Returns a reversible string representation.
	// Experimental.
	ToString() *string
}
func NewGoogleDataprocBatchRuntimeInfoOutputReference ¶
func NewGoogleDataprocBatchRuntimeInfoOutputReference(terraformResource cdktf.IInterpolatingParent, terraformAttribute *string, complexObjectIndex *float64, complexObjectIsFromSet *bool) GoogleDataprocBatchRuntimeInfoOutputReference
type GoogleDataprocBatchSparkBatch ¶
type GoogleDataprocBatchSparkBatch struct {
    // HCFS URIs of archives to be extracted into the working directory of each executor.
    //
    // Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.7.0/docs/resources/google_dataproc_batch#archive_uris GoogleDataprocBatch#archive_uris}
    ArchiveUris *[]*string `field:"optional" json:"archiveUris" yaml:"archiveUris"`
    // The arguments to pass to the driver.
    //
    // Do not include arguments that can be set as batch
    // properties, such as --conf, since a collision can occur that causes an incorrect batch submission.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.7.0/docs/resources/google_dataproc_batch#args GoogleDataprocBatch#args}
    Args *[]*string `field:"optional" json:"args" yaml:"args"`
    // HCFS URIs of files to be placed in the working directory of each executor.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.7.0/docs/resources/google_dataproc_batch#file_uris GoogleDataprocBatch#file_uris}
    FileUris *[]*string `field:"optional" json:"fileUris" yaml:"fileUris"`
    // HCFS URIs of jar files to add to the classpath of the Spark driver and tasks.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.7.0/docs/resources/google_dataproc_batch#jar_file_uris GoogleDataprocBatch#jar_file_uris}
    JarFileUris *[]*string `field:"optional" json:"jarFileUris" yaml:"jarFileUris"`
    // The name of the driver main class.
    //
    // The jar file that contains the class must be in the
    // classpath or specified in jarFileUris.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.7.0/docs/resources/google_dataproc_batch#main_class GoogleDataprocBatch#main_class}
    MainClass *string `field:"optional" json:"mainClass" yaml:"mainClass"`
    // The HCFS URI of the jar file that contains the main class.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.7.0/docs/resources/google_dataproc_batch#main_jar_file_uri GoogleDataprocBatch#main_jar_file_uri}
    MainJarFileUri *string `field:"optional" json:"mainJarFileUri" yaml:"mainJarFileUri"`
}
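In these jsii-generated bindings every field is a pointer (`*string`, `*[]*string`), so plain Go literals need conversion before they can populate the struct. The sketch below illustrates that convention with a local mirror of the struct and hypothetical helpers `strPtr`/`strSlicePtr` (real code would typically use the jsii runtime's `jsii.String` and `jsii.Strings` and the generated type itself); the bucket path and class name are made up for illustration.

```go
package main

import "fmt"

// sparkBatch mirrors the shape of the generated GoogleDataprocBatchSparkBatch
// struct so this sketch compiles stand-alone; the real type lives in the
// generated provider package.
type sparkBatch struct {
	JarFileUris    *[]*string
	MainClass      *string
	MainJarFileUri *string
}

// strPtr returns a pointer to its argument.
func strPtr(s string) *string { return &s }

// strSlicePtr converts plain strings into the *[]*string shape the
// generated fields expect.
func strSlicePtr(ss ...string) *[]*string {
	out := make([]*string, len(ss))
	for i := range ss {
		out[i] = &ss[i]
	}
	return &out
}

func main() {
	// Per the field docs, either name a MainClass (with its jar reachable
	// via JarFileUris) or point MainJarFileUri at the jar directly.
	batch := &sparkBatch{
		JarFileUris: strSlicePtr("gs://my-bucket/jobs/wordcount.jar"),
		MainClass:   strPtr("org.example.WordCount"),
	}
	fmt.Println(*batch.MainClass, len(*batch.JarFileUris))
}
```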
type GoogleDataprocBatchSparkBatchOutputReference ¶
type GoogleDataprocBatchSparkBatchOutputReference interface {
    cdktf.ComplexObject
    ArchiveUris() *[]*string
    SetArchiveUris(val *[]*string)
    ArchiveUrisInput() *[]*string
    Args() *[]*string
    SetArgs(val *[]*string)
    ArgsInput() *[]*string
    // The index of the complex object in a list.
    // Experimental.
    ComplexObjectIndex() interface{}
    // Experimental.
    SetComplexObjectIndex(val interface{})
    // Set to true if this item is from inside a set and needs tolist() for accessing it; set to "0" for single list items.
    // Experimental.
    ComplexObjectIsFromSet() *bool
    // Experimental.
    SetComplexObjectIsFromSet(val *bool)
    // The creation stack of this resolvable, which will be appended to errors thrown during resolution.
    //
    // If this returns an empty array, the stack will not be attached.
    // Experimental.
    CreationStack() *[]*string
    FileUris() *[]*string
    SetFileUris(val *[]*string)
    FileUrisInput() *[]*string
    // Experimental.
    Fqn() *string
    InternalValue() *GoogleDataprocBatchSparkBatch
    SetInternalValue(val *GoogleDataprocBatchSparkBatch)
    JarFileUris() *[]*string
    SetJarFileUris(val *[]*string)
    JarFileUrisInput() *[]*string
    MainClass() *string
    SetMainClass(val *string)
    MainClassInput() *string
    MainJarFileUri() *string
    SetMainJarFileUri(val *string)
    MainJarFileUriInput() *string
    // Experimental.
    TerraformAttribute() *string
    // Experimental.
    SetTerraformAttribute(val *string)
    // Experimental.
    TerraformResource() cdktf.IInterpolatingParent
    // Experimental.
    SetTerraformResource(val cdktf.IInterpolatingParent)
    // Experimental.
    ComputeFqn() *string
    // Experimental.
    GetAnyMapAttribute(terraformAttribute *string) *map[string]interface{}
    // Experimental.
    GetBooleanAttribute(terraformAttribute *string) cdktf.IResolvable
    // Experimental.
    GetBooleanMapAttribute(terraformAttribute *string) *map[string]*bool
    // Experimental.
    GetListAttribute(terraformAttribute *string) *[]*string
    // Experimental.
    GetNumberAttribute(terraformAttribute *string) *float64
    // Experimental.
    GetNumberListAttribute(terraformAttribute *string) *[]*float64
    // Experimental.
    GetNumberMapAttribute(terraformAttribute *string) *map[string]*float64
    // Experimental.
    GetStringAttribute(terraformAttribute *string) *string
    // Experimental.
    GetStringMapAttribute(terraformAttribute *string) *map[string]*string
    // Experimental.
    InterpolationAsList() cdktf.IResolvable
    // Experimental.
    InterpolationForAttribute(property *string) cdktf.IResolvable
    ResetArchiveUris()
    ResetArgs()
    ResetFileUris()
    ResetJarFileUris()
    ResetMainClass()
    ResetMainJarFileUri()
    // Produce the Token's value at resolution time.
    // Experimental.
    Resolve(_context cdktf.IResolveContext) interface{}
    // Return a string representation of this resolvable object.
    //
    // Returns a reversible string representation.
    // Experimental.
    ToString() *string
}
func NewGoogleDataprocBatchSparkBatchOutputReference ¶
func NewGoogleDataprocBatchSparkBatchOutputReference(terraformResource cdktf.IInterpolatingParent, terraformAttribute *string) GoogleDataprocBatchSparkBatchOutputReference
type GoogleDataprocBatchSparkRBatch ¶
type GoogleDataprocBatchSparkRBatch struct {
    // HCFS URIs of archives to be extracted into the working directory of each executor.
    //
    // Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.7.0/docs/resources/google_dataproc_batch#archive_uris GoogleDataprocBatch#archive_uris}
    ArchiveUris *[]*string `field:"optional" json:"archiveUris" yaml:"archiveUris"`
    // The arguments to pass to the driver.
    //
    // Do not include arguments that can be set as batch
    // properties, such as --conf, since a collision can occur that causes an incorrect batch submission.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.7.0/docs/resources/google_dataproc_batch#args GoogleDataprocBatch#args}
    Args *[]*string `field:"optional" json:"args" yaml:"args"`
    // HCFS URIs of files to be placed in the working directory of each executor.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.7.0/docs/resources/google_dataproc_batch#file_uris GoogleDataprocBatch#file_uris}
    FileUris *[]*string `field:"optional" json:"fileUris" yaml:"fileUris"`
    // The HCFS URI of the main R file to use as the driver.
    //
    // Must be a .R or .r file.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.7.0/docs/resources/google_dataproc_batch#main_r_file_uri GoogleDataprocBatch#main_r_file_uri}
    MainRFileUri *string `field:"optional" json:"mainRFileUri" yaml:"mainRFileUri"`
}
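The MainRFileUri docs state the driver must be a .R or .r file. A minimal stand-alone sketch of that constraint as a pre-flight check; `validMainRFileUri` is a hypothetical helper, not part of the generated API, and the bucket paths are made up:

```go
package main

import (
	"fmt"
	"strings"
)

// validMainRFileUri checks the documented constraint on
// GoogleDataprocBatchSparkRBatch.MainRFileUri: the driver
// must be a .R or .r file.
func validMainRFileUri(uri string) bool {
	return strings.HasSuffix(uri, ".R") || strings.HasSuffix(uri, ".r")
}

func main() {
	fmt.Println(validMainRFileUri("gs://my-bucket/jobs/analysis.R")) // true
	fmt.Println(validMainRFileUri("gs://my-bucket/jobs/analysis.py")) // false
}
```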
type GoogleDataprocBatchSparkRBatchOutputReference ¶
type GoogleDataprocBatchSparkRBatchOutputReference interface {
    cdktf.ComplexObject
    ArchiveUris() *[]*string
    SetArchiveUris(val *[]*string)
    ArchiveUrisInput() *[]*string
    Args() *[]*string
    SetArgs(val *[]*string)
    ArgsInput() *[]*string
    // The index of the complex object in a list.
    // Experimental.
    ComplexObjectIndex() interface{}
    // Experimental.
    SetComplexObjectIndex(val interface{})
    // Set to true if this item is from inside a set and needs tolist() for accessing it; set to "0" for single list items.
    // Experimental.
    ComplexObjectIsFromSet() *bool
    // Experimental.
    SetComplexObjectIsFromSet(val *bool)
    // The creation stack of this resolvable, which will be appended to errors thrown during resolution.
    //
    // If this returns an empty array, the stack will not be attached.
    // Experimental.
    CreationStack() *[]*string
    FileUris() *[]*string
    SetFileUris(val *[]*string)
    FileUrisInput() *[]*string
    // Experimental.
    Fqn() *string
    InternalValue() *GoogleDataprocBatchSparkRBatch
    SetInternalValue(val *GoogleDataprocBatchSparkRBatch)
    MainRFileUri() *string
    SetMainRFileUri(val *string)
    MainRFileUriInput() *string
    // Experimental.
    TerraformAttribute() *string
    // Experimental.
    SetTerraformAttribute(val *string)
    // Experimental.
    TerraformResource() cdktf.IInterpolatingParent
    // Experimental.
    SetTerraformResource(val cdktf.IInterpolatingParent)
    // Experimental.
    ComputeFqn() *string
    // Experimental.
    GetAnyMapAttribute(terraformAttribute *string) *map[string]interface{}
    // Experimental.
    GetBooleanAttribute(terraformAttribute *string) cdktf.IResolvable
    // Experimental.
    GetBooleanMapAttribute(terraformAttribute *string) *map[string]*bool
    // Experimental.
    GetListAttribute(terraformAttribute *string) *[]*string
    // Experimental.
    GetNumberAttribute(terraformAttribute *string) *float64
    // Experimental.
    GetNumberListAttribute(terraformAttribute *string) *[]*float64
    // Experimental.
    GetNumberMapAttribute(terraformAttribute *string) *map[string]*float64
    // Experimental.
    GetStringAttribute(terraformAttribute *string) *string
    // Experimental.
    GetStringMapAttribute(terraformAttribute *string) *map[string]*string
    // Experimental.
    InterpolationAsList() cdktf.IResolvable
    // Experimental.
    InterpolationForAttribute(property *string) cdktf.IResolvable
    ResetArchiveUris()
    ResetArgs()
    ResetFileUris()
    ResetMainRFileUri()
    // Produce the Token's value at resolution time.
    // Experimental.
    Resolve(_context cdktf.IResolveContext) interface{}
    // Return a string representation of this resolvable object.
    //
    // Returns a reversible string representation.
    // Experimental.
    ToString() *string
}
func NewGoogleDataprocBatchSparkRBatchOutputReference ¶
func NewGoogleDataprocBatchSparkRBatchOutputReference(terraformResource cdktf.IInterpolatingParent, terraformAttribute *string) GoogleDataprocBatchSparkRBatchOutputReference
type GoogleDataprocBatchSparkSqlBatch ¶
type GoogleDataprocBatchSparkSqlBatch struct {
    // HCFS URIs of jar files to be added to the Spark CLASSPATH.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.7.0/docs/resources/google_dataproc_batch#jar_file_uris GoogleDataprocBatch#jar_file_uris}
    JarFileUris *[]*string `field:"optional" json:"jarFileUris" yaml:"jarFileUris"`
    // The HCFS URI of the script that contains Spark SQL queries to execute.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.7.0/docs/resources/google_dataproc_batch#query_file_uri GoogleDataprocBatch#query_file_uri}
    QueryFileUri *string `field:"optional" json:"queryFileUri" yaml:"queryFileUri"`
    // Mapping of query variable names to values (equivalent to the Spark SQL command: SET name="value";).
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.7.0/docs/resources/google_dataproc_batch#query_variables GoogleDataprocBatch#query_variables}
    QueryVariables *map[string]*string `field:"optional" json:"queryVariables" yaml:"queryVariables"`
}
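QueryVariables is typed `*map[string]*string`, and each entry is equivalent to the Spark SQL command `SET name="value";`. A minimal sketch of building that pointer-heavy shape from an ordinary map, assuming a hypothetical helper `queryVars` (the generated package provides no such helper, and the variable names below are illustrative):

```go
package main

import "fmt"

// queryVars converts a plain map into the *map[string]*string shape
// the generated QueryVariables field expects.
func queryVars(kv map[string]string) *map[string]*string {
	out := make(map[string]*string, len(kv))
	for k, v := range kv {
		v := v // take the address of a per-entry copy
		out[k] = &v
	}
	return &out
}

func main() {
	vars := queryVars(map[string]string{
		"env":      "prod",
		"run_date": "2024-01-01",
	})
	fmt.Println(*(*vars)["env"]) // prod
}
```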
type GoogleDataprocBatchSparkSqlBatchOutputReference ¶
type GoogleDataprocBatchSparkSqlBatchOutputReference interface {
    cdktf.ComplexObject
    // The index of the complex object in a list.
    // Experimental.
    ComplexObjectIndex() interface{}
    // Experimental.
    SetComplexObjectIndex(val interface{})
    // Set to true if this item is from inside a set and needs tolist() for accessing it; set to "0" for single list items.
    // Experimental.
    ComplexObjectIsFromSet() *bool
    // Experimental.
    SetComplexObjectIsFromSet(val *bool)
    // The creation stack of this resolvable, which will be appended to errors thrown during resolution.
    //
    // If this returns an empty array, the stack will not be attached.
    // Experimental.
    CreationStack() *[]*string
    // Experimental.
    Fqn() *string
    InternalValue() *GoogleDataprocBatchSparkSqlBatch
    SetInternalValue(val *GoogleDataprocBatchSparkSqlBatch)
    JarFileUris() *[]*string
    SetJarFileUris(val *[]*string)
    JarFileUrisInput() *[]*string
    QueryFileUri() *string
    SetQueryFileUri(val *string)
    QueryFileUriInput() *string
    QueryVariables() *map[string]*string
    SetQueryVariables(val *map[string]*string)
    QueryVariablesInput() *map[string]*string
    // Experimental.
    TerraformAttribute() *string
    // Experimental.
    SetTerraformAttribute(val *string)
    // Experimental.
    TerraformResource() cdktf.IInterpolatingParent
    // Experimental.
    SetTerraformResource(val cdktf.IInterpolatingParent)
    // Experimental.
    ComputeFqn() *string
    // Experimental.
    GetAnyMapAttribute(terraformAttribute *string) *map[string]interface{}
    // Experimental.
    GetBooleanAttribute(terraformAttribute *string) cdktf.IResolvable
    // Experimental.
    GetBooleanMapAttribute(terraformAttribute *string) *map[string]*bool
    // Experimental.
    GetListAttribute(terraformAttribute *string) *[]*string
    // Experimental.
    GetNumberAttribute(terraformAttribute *string) *float64
    // Experimental.
    GetNumberListAttribute(terraformAttribute *string) *[]*float64
    // Experimental.
    GetNumberMapAttribute(terraformAttribute *string) *map[string]*float64
    // Experimental.
    GetStringAttribute(terraformAttribute *string) *string
    // Experimental.
    GetStringMapAttribute(terraformAttribute *string) *map[string]*string
    // Experimental.
    InterpolationAsList() cdktf.IResolvable
    // Experimental.
    InterpolationForAttribute(property *string) cdktf.IResolvable
    ResetJarFileUris()
    ResetQueryFileUri()
    ResetQueryVariables()
    // Produce the Token's value at resolution time.
    // Experimental.
    Resolve(_context cdktf.IResolveContext) interface{}
    // Return a string representation of this resolvable object.
    //
    // Returns a reversible string representation.
    // Experimental.
    ToString() *string
}
func NewGoogleDataprocBatchSparkSqlBatchOutputReference ¶
func NewGoogleDataprocBatchSparkSqlBatchOutputReference(terraformResource cdktf.IInterpolatingParent, terraformAttribute *string) GoogleDataprocBatchSparkSqlBatchOutputReference
type GoogleDataprocBatchStateHistory ¶
type GoogleDataprocBatchStateHistory struct { }
type GoogleDataprocBatchStateHistoryList ¶
type GoogleDataprocBatchStateHistoryList interface {
    cdktf.ComplexList
    // The creation stack of this resolvable, which will be appended to errors thrown during resolution.
    //
    // If this returns an empty array, the stack will not be attached.
    // Experimental.
    CreationStack() *[]*string
    // Experimental.
    Fqn() *string
    // The attribute on the parent resource this class is referencing.
    TerraformAttribute() *string
    SetTerraformAttribute(val *string)
    // The parent resource.
    TerraformResource() cdktf.IInterpolatingParent
    SetTerraformResource(val cdktf.IInterpolatingParent)
    // Whether the list is wrapping a set (will add tolist() to be able to access an item via an index).
    WrapsSet() *bool
    SetWrapsSet(val *bool)
    // Creates an iterator for this complex list.
    //
    // The list will be converted into a map with the mapKeyAttributeName as the key.
    // Experimental.
    AllWithMapKey(mapKeyAttributeName *string) cdktf.DynamicListTerraformIterator
    // Experimental.
    ComputeFqn() *string
    Get(index *float64) GoogleDataprocBatchStateHistoryOutputReference
    // Produce the Token's value at resolution time.
    // Experimental.
    Resolve(_context cdktf.IResolveContext) interface{}
    // Return a string representation of this resolvable object.
    //
    // Returns a reversible string representation.
    // Experimental.
    ToString() *string
}
func NewGoogleDataprocBatchStateHistoryList ¶
func NewGoogleDataprocBatchStateHistoryList(terraformResource cdktf.IInterpolatingParent, terraformAttribute *string, wrapsSet *bool) GoogleDataprocBatchStateHistoryList
type GoogleDataprocBatchStateHistoryOutputReference ¶
type GoogleDataprocBatchStateHistoryOutputReference interface {
    cdktf.ComplexObject
    // The index of the complex object in a list.
    // Experimental.
    ComplexObjectIndex() interface{}
    // Experimental.
    SetComplexObjectIndex(val interface{})
    // Set to true if this item is from inside a set and needs tolist() for accessing it; set to "0" for single list items.
    // Experimental.
    ComplexObjectIsFromSet() *bool
    // Experimental.
    SetComplexObjectIsFromSet(val *bool)
    // The creation stack of this resolvable, which will be appended to errors thrown during resolution.
    //
    // If this returns an empty array, the stack will not be attached.
    // Experimental.
    CreationStack() *[]*string
    // Experimental.
    Fqn() *string
    InternalValue() *GoogleDataprocBatchStateHistory
    SetInternalValue(val *GoogleDataprocBatchStateHistory)
    State() *string
    StateMessage() *string
    StateStartTime() *string
    // Experimental.
    TerraformAttribute() *string
    // Experimental.
    SetTerraformAttribute(val *string)
    // Experimental.
    TerraformResource() cdktf.IInterpolatingParent
    // Experimental.
    SetTerraformResource(val cdktf.IInterpolatingParent)
    // Experimental.
    ComputeFqn() *string
    // Experimental.
    GetAnyMapAttribute(terraformAttribute *string) *map[string]interface{}
    // Experimental.
    GetBooleanAttribute(terraformAttribute *string) cdktf.IResolvable
    // Experimental.
    GetBooleanMapAttribute(terraformAttribute *string) *map[string]*bool
    // Experimental.
    GetListAttribute(terraformAttribute *string) *[]*string
    // Experimental.
    GetNumberAttribute(terraformAttribute *string) *float64
    // Experimental.
    GetNumberListAttribute(terraformAttribute *string) *[]*float64
    // Experimental.
    GetNumberMapAttribute(terraformAttribute *string) *map[string]*float64
    // Experimental.
    GetStringAttribute(terraformAttribute *string) *string
    // Experimental.
    GetStringMapAttribute(terraformAttribute *string) *map[string]*string
    // Experimental.
    InterpolationAsList() cdktf.IResolvable
    // Experimental.
    InterpolationForAttribute(property *string) cdktf.IResolvable
    // Produce the Token's value at resolution time.
    // Experimental.
    Resolve(_context cdktf.IResolveContext) interface{}
    // Return a string representation of this resolvable object.
    //
    // Returns a reversible string representation.
    // Experimental.
    ToString() *string
}
func NewGoogleDataprocBatchStateHistoryOutputReference ¶
func NewGoogleDataprocBatchStateHistoryOutputReference(terraformResource cdktf.IInterpolatingParent, terraformAttribute *string, complexObjectIndex *float64, complexObjectIsFromSet *bool) GoogleDataprocBatchStateHistoryOutputReference
type GoogleDataprocBatchTimeouts ¶
type GoogleDataprocBatchTimeouts struct {
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.7.0/docs/resources/google_dataproc_batch#create GoogleDataprocBatch#create}.
    Create *string `field:"optional" json:"create" yaml:"create"`
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.7.0/docs/resources/google_dataproc_batch#delete GoogleDataprocBatch#delete}.
    Delete *string `field:"optional" json:"delete" yaml:"delete"`
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.7.0/docs/resources/google_dataproc_batch#update GoogleDataprocBatch#update}.
    Update *string `field:"optional" json:"update" yaml:"update"`
}
type GoogleDataprocBatchTimeoutsOutputReference ¶
type GoogleDataprocBatchTimeoutsOutputReference interface {
    cdktf.ComplexObject
    // The index of the complex object in a list.
    // Experimental.
    ComplexObjectIndex() interface{}
    // Experimental.
    SetComplexObjectIndex(val interface{})
    // Set to true if this item is from inside a set and needs tolist() for accessing it; set to "0" for single list items.
    // Experimental.
    ComplexObjectIsFromSet() *bool
    // Experimental.
    SetComplexObjectIsFromSet(val *bool)
    Create() *string
    SetCreate(val *string)
    CreateInput() *string
    // The creation stack of this resolvable, which will be appended to errors thrown during resolution.
    //
    // If this returns an empty array, the stack will not be attached.
    // Experimental.
    CreationStack() *[]*string
    Delete() *string
    SetDelete(val *string)
    DeleteInput() *string
    // Experimental.
    Fqn() *string
    InternalValue() interface{}
    SetInternalValue(val interface{})
    // Experimental.
    TerraformAttribute() *string
    // Experimental.
    SetTerraformAttribute(val *string)
    // Experimental.
    TerraformResource() cdktf.IInterpolatingParent
    // Experimental.
    SetTerraformResource(val cdktf.IInterpolatingParent)
    Update() *string
    SetUpdate(val *string)
    UpdateInput() *string
    // Experimental.
    ComputeFqn() *string
    // Experimental.
    GetAnyMapAttribute(terraformAttribute *string) *map[string]interface{}
    // Experimental.
    GetBooleanAttribute(terraformAttribute *string) cdktf.IResolvable
    // Experimental.
    GetBooleanMapAttribute(terraformAttribute *string) *map[string]*bool
    // Experimental.
    GetListAttribute(terraformAttribute *string) *[]*string
    // Experimental.
    GetNumberAttribute(terraformAttribute *string) *float64
    // Experimental.
    GetNumberListAttribute(terraformAttribute *string) *[]*float64
    // Experimental.
    GetNumberMapAttribute(terraformAttribute *string) *map[string]*float64
    // Experimental.
    GetStringAttribute(terraformAttribute *string) *string
    // Experimental.
    GetStringMapAttribute(terraformAttribute *string) *map[string]*string
    // Experimental.
    InterpolationAsList() cdktf.IResolvable
    // Experimental.
    InterpolationForAttribute(property *string) cdktf.IResolvable
    ResetCreate()
    ResetDelete()
    ResetUpdate()
    // Produce the Token's value at resolution time.
    // Experimental.
    Resolve(_context cdktf.IResolveContext) interface{}
    // Return a string representation of this resolvable object.
    //
    // Returns a reversible string representation.
    // Experimental.
    ToString() *string
}
func NewGoogleDataprocBatchTimeoutsOutputReference ¶
func NewGoogleDataprocBatchTimeoutsOutputReference(terraformResource cdktf.IInterpolatingParent, terraformAttribute *string) GoogleDataprocBatchTimeoutsOutputReference
Source Files ¶
- GoogleDataprocBatch.go
- GoogleDataprocBatchConfig.go
- GoogleDataprocBatchEnvironmentConfig.go
- GoogleDataprocBatchEnvironmentConfigExecutionConfig.go
- GoogleDataprocBatchEnvironmentConfigExecutionConfigOutputReference.go
- GoogleDataprocBatchEnvironmentConfigExecutionConfigOutputReference__checks.go
- GoogleDataprocBatchEnvironmentConfigOutputReference.go
- GoogleDataprocBatchEnvironmentConfigOutputReference__checks.go
- GoogleDataprocBatchEnvironmentConfigPeripheralsConfig.go
- GoogleDataprocBatchEnvironmentConfigPeripheralsConfigOutputReference.go
- GoogleDataprocBatchEnvironmentConfigPeripheralsConfigOutputReference__checks.go
- GoogleDataprocBatchEnvironmentConfigPeripheralsConfigSparkHistoryServerConfig.go
- GoogleDataprocBatchEnvironmentConfigPeripheralsConfigSparkHistoryServerConfigOutputReference.go
- GoogleDataprocBatchEnvironmentConfigPeripheralsConfigSparkHistoryServerConfigOutputReference__checks.go
- GoogleDataprocBatchPysparkBatch.go
- GoogleDataprocBatchPysparkBatchOutputReference.go
- GoogleDataprocBatchPysparkBatchOutputReference__checks.go
- GoogleDataprocBatchRuntimeConfig.go
- GoogleDataprocBatchRuntimeConfigOutputReference.go
- GoogleDataprocBatchRuntimeConfigOutputReference__checks.go
- GoogleDataprocBatchRuntimeInfo.go
- GoogleDataprocBatchRuntimeInfoApproximateUsage.go
- GoogleDataprocBatchRuntimeInfoApproximateUsageList.go
- GoogleDataprocBatchRuntimeInfoApproximateUsageList__checks.go
- GoogleDataprocBatchRuntimeInfoApproximateUsageOutputReference.go
- GoogleDataprocBatchRuntimeInfoApproximateUsageOutputReference__checks.go
- GoogleDataprocBatchRuntimeInfoCurrentUsage.go
- GoogleDataprocBatchRuntimeInfoCurrentUsageList.go
- GoogleDataprocBatchRuntimeInfoCurrentUsageList__checks.go
- GoogleDataprocBatchRuntimeInfoCurrentUsageOutputReference.go
- GoogleDataprocBatchRuntimeInfoCurrentUsageOutputReference__checks.go
- GoogleDataprocBatchRuntimeInfoList.go
- GoogleDataprocBatchRuntimeInfoList__checks.go
- GoogleDataprocBatchRuntimeInfoOutputReference.go
- GoogleDataprocBatchRuntimeInfoOutputReference__checks.go
- GoogleDataprocBatchSparkBatch.go
- GoogleDataprocBatchSparkBatchOutputReference.go
- GoogleDataprocBatchSparkBatchOutputReference__checks.go
- GoogleDataprocBatchSparkRBatch.go
- GoogleDataprocBatchSparkRBatchOutputReference.go
- GoogleDataprocBatchSparkRBatchOutputReference__checks.go
- GoogleDataprocBatchSparkSqlBatch.go
- GoogleDataprocBatchSparkSqlBatchOutputReference.go
- GoogleDataprocBatchSparkSqlBatchOutputReference__checks.go
- GoogleDataprocBatchStateHistory.go
- GoogleDataprocBatchStateHistoryList.go
- GoogleDataprocBatchStateHistoryList__checks.go
- GoogleDataprocBatchStateHistoryOutputReference.go
- GoogleDataprocBatchStateHistoryOutputReference__checks.go
- GoogleDataprocBatchTimeouts.go
- GoogleDataprocBatchTimeoutsOutputReference.go
- GoogleDataprocBatchTimeoutsOutputReference__checks.go
- GoogleDataprocBatch__checks.go
- main.go