Documentation
Index ¶
- func GoogleDataprocGdcSparkApplication_GenerateConfigForImport(scope constructs.Construct, importToId *string, importFromId *string, ...) cdktf.ImportableResource
- func GoogleDataprocGdcSparkApplication_IsConstruct(x interface{}) *bool
- func GoogleDataprocGdcSparkApplication_IsTerraformElement(x interface{}) *bool
- func GoogleDataprocGdcSparkApplication_IsTerraformResource(x interface{}) *bool
- func GoogleDataprocGdcSparkApplication_TfResourceType() *string
- func NewGoogleDataprocGdcSparkApplicationPysparkApplicationConfigOutputReference_Override(g GoogleDataprocGdcSparkApplicationPysparkApplicationConfigOutputReference, ...)
- func NewGoogleDataprocGdcSparkApplicationSparkApplicationConfigOutputReference_Override(g GoogleDataprocGdcSparkApplicationSparkApplicationConfigOutputReference, ...)
- func NewGoogleDataprocGdcSparkApplicationSparkRApplicationConfigOutputReference_Override(g GoogleDataprocGdcSparkApplicationSparkRApplicationConfigOutputReference, ...)
- func NewGoogleDataprocGdcSparkApplicationSparkSqlApplicationConfigOutputReference_Override(g GoogleDataprocGdcSparkApplicationSparkSqlApplicationConfigOutputReference, ...)
- func NewGoogleDataprocGdcSparkApplicationSparkSqlApplicationConfigQueryListStructOutputReference_Override(...)
- func NewGoogleDataprocGdcSparkApplicationTimeoutsOutputReference_Override(g GoogleDataprocGdcSparkApplicationTimeoutsOutputReference, ...)
- func NewGoogleDataprocGdcSparkApplication_Override(g GoogleDataprocGdcSparkApplication, scope constructs.Construct, id *string, ...)
- type GoogleDataprocGdcSparkApplication
- type GoogleDataprocGdcSparkApplicationConfig
- type GoogleDataprocGdcSparkApplicationPysparkApplicationConfig
- type GoogleDataprocGdcSparkApplicationPysparkApplicationConfigOutputReference
- type GoogleDataprocGdcSparkApplicationSparkApplicationConfig
- type GoogleDataprocGdcSparkApplicationSparkApplicationConfigOutputReference
- type GoogleDataprocGdcSparkApplicationSparkRApplicationConfig
- type GoogleDataprocGdcSparkApplicationSparkRApplicationConfigOutputReference
- type GoogleDataprocGdcSparkApplicationSparkSqlApplicationConfig
- type GoogleDataprocGdcSparkApplicationSparkSqlApplicationConfigOutputReference
- type GoogleDataprocGdcSparkApplicationSparkSqlApplicationConfigQueryListStruct
- type GoogleDataprocGdcSparkApplicationSparkSqlApplicationConfigQueryListStructOutputReference
- type GoogleDataprocGdcSparkApplicationTimeouts
- type GoogleDataprocGdcSparkApplicationTimeoutsOutputReference
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
func GoogleDataprocGdcSparkApplication_GenerateConfigForImport ¶
func GoogleDataprocGdcSparkApplication_GenerateConfigForImport(scope constructs.Construct, importToId *string, importFromId *string, provider cdktf.TerraformProvider) cdktf.ImportableResource
Generates CDKTF code for importing a GoogleDataprocGdcSparkApplication resource upon running "cdktf plan <stack-name>".
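As an illustration only (the package alias `gdc`, its import path, the construct ID, and the import ID format below are assumptions, not taken from this page), the helper is typically called from inside a stack definition:

```go
package main

import (
	"github.com/aws/jsii-runtime-go"
	"github.com/hashicorp/terraform-cdk-go/cdktf"

	gdc "example.com/generated/googledataprocgdcsparkapplication" // placeholder path
)

// addImport registers an existing Spark application so that running
// "cdktf plan <stack-name>" generates importable resource configuration.
func addImport(stack cdktf.TerraformStack) {
	gdc.GoogleDataprocGdcSparkApplication_GenerateConfigForImport(
		stack,                       // scope
		jsii.String("imported-app"), // importToId: construct ID to import to
		// importFromId: the existing resource's ID (format assumed; check the registry docs)
		jsii.String("projects/my-project/locations/us-central1/serviceInstances/my-instance/sparkApplications/my-app"),
		nil, // provider: optional; defaults to the stack's google-beta provider
	)
}
```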
func GoogleDataprocGdcSparkApplication_IsConstruct ¶
func GoogleDataprocGdcSparkApplication_IsConstruct(x interface{}) *bool
Checks if `x` is a construct.
Use this method instead of `instanceof` to properly detect `Construct` instances, even when the construct library is symlinked.
Explanation: in JavaScript, multiple copies of the `constructs` library on disk are seen as independent, completely different libraries. As a consequence, the class `Construct` in each copy of the `constructs` library is treated as a distinct class, and an instance of one class will not test as `instanceof` the other. `npm install` will not create installations like this, but users may manually symlink construct libraries together or use a monorepo tool; in those cases, multiple copies of the `constructs` library can be accidentally installed, and `instanceof` will behave unpredictably. It is safest to avoid `instanceof` and to use this type-testing method instead.
Returns: true if `x` is an object created from a class which extends `Construct`.
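A short sketch of the intended usage (the package alias `gdc` and the value `obj` are illustrative assumptions):

```go
// Prefer the type-testing helper over a Go type assertion when a value may
// have crossed package or jsii boundaries.
if ok := gdc.GoogleDataprocGdcSparkApplication_IsConstruct(obj); ok != nil && *ok {
	// obj was created from a class that extends Construct
}
```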
func GoogleDataprocGdcSparkApplication_IsTerraformElement ¶
func GoogleDataprocGdcSparkApplication_IsTerraformElement(x interface{}) *bool
Experimental.
func GoogleDataprocGdcSparkApplication_IsTerraformResource ¶
func GoogleDataprocGdcSparkApplication_IsTerraformResource(x interface{}) *bool
Experimental.
func GoogleDataprocGdcSparkApplication_TfResourceType ¶
func GoogleDataprocGdcSparkApplication_TfResourceType() *string
func NewGoogleDataprocGdcSparkApplicationPysparkApplicationConfigOutputReference_Override ¶
func NewGoogleDataprocGdcSparkApplicationPysparkApplicationConfigOutputReference_Override(g GoogleDataprocGdcSparkApplicationPysparkApplicationConfigOutputReference, terraformResource cdktf.IInterpolatingParent, terraformAttribute *string)
func NewGoogleDataprocGdcSparkApplicationSparkApplicationConfigOutputReference_Override ¶
func NewGoogleDataprocGdcSparkApplicationSparkApplicationConfigOutputReference_Override(g GoogleDataprocGdcSparkApplicationSparkApplicationConfigOutputReference, terraformResource cdktf.IInterpolatingParent, terraformAttribute *string)
func NewGoogleDataprocGdcSparkApplicationSparkRApplicationConfigOutputReference_Override ¶
func NewGoogleDataprocGdcSparkApplicationSparkRApplicationConfigOutputReference_Override(g GoogleDataprocGdcSparkApplicationSparkRApplicationConfigOutputReference, terraformResource cdktf.IInterpolatingParent, terraformAttribute *string)
func NewGoogleDataprocGdcSparkApplicationSparkSqlApplicationConfigOutputReference_Override ¶
func NewGoogleDataprocGdcSparkApplicationSparkSqlApplicationConfigOutputReference_Override(g GoogleDataprocGdcSparkApplicationSparkSqlApplicationConfigOutputReference, terraformResource cdktf.IInterpolatingParent, terraformAttribute *string)
func NewGoogleDataprocGdcSparkApplicationSparkSqlApplicationConfigQueryListStructOutputReference_Override ¶
func NewGoogleDataprocGdcSparkApplicationSparkSqlApplicationConfigQueryListStructOutputReference_Override(g GoogleDataprocGdcSparkApplicationSparkSqlApplicationConfigQueryListStructOutputReference, terraformResource cdktf.IInterpolatingParent, terraformAttribute *string)
func NewGoogleDataprocGdcSparkApplicationTimeoutsOutputReference_Override ¶
func NewGoogleDataprocGdcSparkApplicationTimeoutsOutputReference_Override(g GoogleDataprocGdcSparkApplicationTimeoutsOutputReference, terraformResource cdktf.IInterpolatingParent, terraformAttribute *string)
func NewGoogleDataprocGdcSparkApplication_Override ¶
func NewGoogleDataprocGdcSparkApplication_Override(g GoogleDataprocGdcSparkApplication, scope constructs.Construct, id *string, config *GoogleDataprocGdcSparkApplicationConfig)
Create a new {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.18.0/docs/resources/google_dataproc_gdc_spark_application google_dataproc_gdc_spark_application} Resource.
Types ¶
type GoogleDataprocGdcSparkApplication ¶
type GoogleDataprocGdcSparkApplication interface {
    cdktf.TerraformResource
    Annotations() *map[string]*string
    SetAnnotations(val *map[string]*string)
    AnnotationsInput() *map[string]*string
    ApplicationEnvironment() *string
    SetApplicationEnvironment(val *string)
    ApplicationEnvironmentInput() *string
    // Experimental.
    CdktfStack() cdktf.TerraformStack
    // Experimental.
    Connection() interface{}
    // Experimental.
    SetConnection(val interface{})
    // Experimental.
    ConstructNodeMetadata() *map[string]interface{}
    // Experimental.
    Count() interface{}
    // Experimental.
    SetCount(val interface{})
    CreateTime() *string
    DependencyImages() *[]*string
    SetDependencyImages(val *[]*string)
    DependencyImagesInput() *[]*string
    // Experimental.
    DependsOn() *[]*string
    // Experimental.
    SetDependsOn(val *[]*string)
    DisplayName() *string
    SetDisplayName(val *string)
    DisplayNameInput() *string
    EffectiveAnnotations() cdktf.StringMap
    EffectiveLabels() cdktf.StringMap
    // Experimental.
    ForEach() cdktf.ITerraformIterator
    // Experimental.
    SetForEach(val cdktf.ITerraformIterator)
    // Experimental.
    Fqn() *string
    // Experimental.
    FriendlyUniqueId() *string
    Id() *string
    SetId(val *string)
    IdInput() *string
    Labels() *map[string]*string
    SetLabels(val *map[string]*string)
    LabelsInput() *map[string]*string
    // Experimental.
    Lifecycle() *cdktf.TerraformResourceLifecycle
    // Experimental.
    SetLifecycle(val *cdktf.TerraformResourceLifecycle)
    Location() *string
    SetLocation(val *string)
    LocationInput() *string
    MonitoringEndpoint() *string
    Name() *string
    Namespace() *string
    SetNamespace(val *string)
    NamespaceInput() *string
    // The tree node.
    Node() constructs.Node
    OutputUri() *string
    Project() *string
    SetProject(val *string)
    ProjectInput() *string
    Properties() *map[string]*string
    SetProperties(val *map[string]*string)
    PropertiesInput() *map[string]*string
    // Experimental.
    Provider() cdktf.TerraformProvider
    // Experimental.
    SetProvider(val cdktf.TerraformProvider)
    // Experimental.
    Provisioners() *[]interface{}
    // Experimental.
    SetProvisioners(val *[]interface{})
    PysparkApplicationConfig() GoogleDataprocGdcSparkApplicationPysparkApplicationConfigOutputReference
    PysparkApplicationConfigInput() *GoogleDataprocGdcSparkApplicationPysparkApplicationConfig
    // Experimental.
    RawOverrides() interface{}
    Reconciling() cdktf.IResolvable
    Serviceinstance() *string
    SetServiceinstance(val *string)
    ServiceinstanceInput() *string
    SparkApplicationConfig() GoogleDataprocGdcSparkApplicationSparkApplicationConfigOutputReference
    SparkApplicationConfigInput() *GoogleDataprocGdcSparkApplicationSparkApplicationConfig
    SparkApplicationId() *string
    SetSparkApplicationId(val *string)
    SparkApplicationIdInput() *string
    SparkRApplicationConfig() GoogleDataprocGdcSparkApplicationSparkRApplicationConfigOutputReference
    SparkRApplicationConfigInput() *GoogleDataprocGdcSparkApplicationSparkRApplicationConfig
    SparkSqlApplicationConfig() GoogleDataprocGdcSparkApplicationSparkSqlApplicationConfigOutputReference
    SparkSqlApplicationConfigInput() *GoogleDataprocGdcSparkApplicationSparkSqlApplicationConfig
    State() *string
    StateMessage() *string
    // Experimental.
    TerraformGeneratorMetadata() *cdktf.TerraformProviderGeneratorMetadata
    TerraformLabels() cdktf.StringMap
    // Experimental.
    TerraformMetaArguments() *map[string]interface{}
    // Experimental.
    TerraformResourceType() *string
    Timeouts() GoogleDataprocGdcSparkApplicationTimeoutsOutputReference
    TimeoutsInput() interface{}
    Uid() *string
    UpdateTime() *string
    Version() *string
    SetVersion(val *string)
    VersionInput() *string
    // Adds a user defined moveTarget string to this resource to be later used in .moveTo(moveTarget) to resolve the location of the move.
    // Experimental.
    AddMoveTarget(moveTarget *string)
    // Experimental.
    AddOverride(path *string, value interface{})
    // Experimental.
    GetAnyMapAttribute(terraformAttribute *string) *map[string]interface{}
    // Experimental.
    GetBooleanAttribute(terraformAttribute *string) cdktf.IResolvable
    // Experimental.
    GetBooleanMapAttribute(terraformAttribute *string) *map[string]*bool
    // Experimental.
    GetListAttribute(terraformAttribute *string) *[]*string
    // Experimental.
    GetNumberAttribute(terraformAttribute *string) *float64
    // Experimental.
    GetNumberListAttribute(terraformAttribute *string) *[]*float64
    // Experimental.
    GetNumberMapAttribute(terraformAttribute *string) *map[string]*float64
    // Experimental.
    GetStringAttribute(terraformAttribute *string) *string
    // Experimental.
    GetStringMapAttribute(terraformAttribute *string) *map[string]*string
    // Experimental.
    HasResourceMove() interface{}
    // Experimental.
    ImportFrom(id *string, provider cdktf.TerraformProvider)
    // Experimental.
    InterpolationForAttribute(terraformAttribute *string) cdktf.IResolvable
    // Move the resource corresponding to "id" to this resource.
    //
    // Note that the resource being moved from must be marked as moved using its instance function.
    // Experimental.
    MoveFromId(id *string)
    // Moves this resource to the target resource given by moveTarget.
    // Experimental.
    MoveTo(moveTarget *string, index interface{})
    // Moves this resource to the resource corresponding to "id".
    // Experimental.
    MoveToId(id *string)
    // Overrides the auto-generated logical ID with a specific ID.
    // Experimental.
    OverrideLogicalId(newLogicalId *string)
    PutPysparkApplicationConfig(value *GoogleDataprocGdcSparkApplicationPysparkApplicationConfig)
    PutSparkApplicationConfig(value *GoogleDataprocGdcSparkApplicationSparkApplicationConfig)
    PutSparkRApplicationConfig(value *GoogleDataprocGdcSparkApplicationSparkRApplicationConfig)
    PutSparkSqlApplicationConfig(value *GoogleDataprocGdcSparkApplicationSparkSqlApplicationConfig)
    PutTimeouts(value *GoogleDataprocGdcSparkApplicationTimeouts)
    ResetAnnotations()
    ResetApplicationEnvironment()
    ResetDependencyImages()
    ResetDisplayName()
    ResetId()
    ResetLabels()
    ResetNamespace()
    // Resets a previously passed logical Id to use the auto-generated logical id again.
    // Experimental.
    ResetOverrideLogicalId()
    ResetProject()
    ResetProperties()
    ResetPysparkApplicationConfig()
    ResetSparkApplicationConfig()
    ResetSparkRApplicationConfig()
    ResetSparkSqlApplicationConfig()
    ResetTimeouts()
    ResetVersion()
    SynthesizeAttributes() *map[string]interface{}
    SynthesizeHclAttributes() *map[string]interface{}
    // Experimental.
    ToHclTerraform() interface{}
    // Experimental.
    ToMetadata() interface{}
    // Returns a string representation of this construct.
    ToString() *string
    // Adds this resource to the terraform JSON output.
    // Experimental.
    ToTerraform() interface{}
}
Represents a {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.18.0/docs/resources/google_dataproc_gdc_spark_application google_dataproc_gdc_spark_application}.
func NewGoogleDataprocGdcSparkApplication ¶
func NewGoogleDataprocGdcSparkApplication(scope constructs.Construct, id *string, config *GoogleDataprocGdcSparkApplicationConfig) GoogleDataprocGdcSparkApplication
Create a new {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.18.0/docs/resources/google_dataproc_gdc_spark_application google_dataproc_gdc_spark_application} Resource.
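A minimal construction sketch (the module path, package alias `gdc`, and all literal values are assumptions; the field names come from GoogleDataprocGdcSparkApplicationConfig documented below):

```go
package main

import (
	"github.com/aws/jsii-runtime-go"
	"github.com/hashicorp/terraform-cdk-go/cdktf"

	gdc "example.com/generated/googledataprocgdcsparkapplication" // placeholder path
)

// newSparkApp creates the resource with its required fields plus one
// application-config block (here a JVM Spark job).
func newSparkApp(stack cdktf.TerraformStack) gdc.GoogleDataprocGdcSparkApplication {
	return gdc.NewGoogleDataprocGdcSparkApplication(stack, jsii.String("spark-app"),
		&gdc.GoogleDataprocGdcSparkApplicationConfig{
			// Required fields.
			Location:           jsii.String("us-central1"),
			Serviceinstance:    jsii.String("my-service-instance"),
			SparkApplicationId: jsii.String("my-spark-app"),
			// Optional: configuration for a Spark (JVM) application.
			SparkApplicationConfig: &gdc.GoogleDataprocGdcSparkApplicationSparkApplicationConfig{
				MainClass:   jsii.String("com.example.SparkJob"),
				JarFileUris: jsii.Strings("file:///opt/spark/jobs/job.jar"),
			},
		})
}
```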
type GoogleDataprocGdcSparkApplicationConfig ¶
type GoogleDataprocGdcSparkApplicationConfig struct {
    // Experimental.
    Connection interface{} `field:"optional" json:"connection" yaml:"connection"`
    // Experimental.
    Count interface{} `field:"optional" json:"count" yaml:"count"`
    // Experimental.
    DependsOn *[]cdktf.ITerraformDependable `field:"optional" json:"dependsOn" yaml:"dependsOn"`
    // Experimental.
    ForEach cdktf.ITerraformIterator `field:"optional" json:"forEach" yaml:"forEach"`
    // Experimental.
    Lifecycle *cdktf.TerraformResourceLifecycle `field:"optional" json:"lifecycle" yaml:"lifecycle"`
    // Experimental.
    Provider cdktf.TerraformProvider `field:"optional" json:"provider" yaml:"provider"`
    // Experimental.
    Provisioners *[]interface{} `field:"optional" json:"provisioners" yaml:"provisioners"`
    // The location of the spark application.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.18.0/docs/resources/google_dataproc_gdc_spark_application#location GoogleDataprocGdcSparkApplication#location}
    Location *string `field:"required" json:"location" yaml:"location"`
    // The id of the service instance to which this spark application belongs.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.18.0/docs/resources/google_dataproc_gdc_spark_application#serviceinstance GoogleDataprocGdcSparkApplication#serviceinstance}
    Serviceinstance *string `field:"required" json:"serviceinstance" yaml:"serviceinstance"`
    // The id of the application.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.18.0/docs/resources/google_dataproc_gdc_spark_application#spark_application_id GoogleDataprocGdcSparkApplication#spark_application_id}
    SparkApplicationId *string `field:"required" json:"sparkApplicationId" yaml:"sparkApplicationId"`
    // The annotations to associate with this application.
    //
    // Annotations may be used to store client information, but are not used by the server.
    //
    // **Note**: This field is non-authoritative, and will only manage the annotations present in your configuration.
    // Please refer to the field 'effective_annotations' for all of the annotations present on the resource.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.18.0/docs/resources/google_dataproc_gdc_spark_application#annotations GoogleDataprocGdcSparkApplication#annotations}
    Annotations *map[string]*string `field:"optional" json:"annotations" yaml:"annotations"`
    // An ApplicationEnvironment from which to inherit configuration properties.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.18.0/docs/resources/google_dataproc_gdc_spark_application#application_environment GoogleDataprocGdcSparkApplication#application_environment}
    ApplicationEnvironment *string `field:"optional" json:"applicationEnvironment" yaml:"applicationEnvironment"`
    // List of container image uris for additional file dependencies.
    //
    // Dependent files are sequentially copied from each image. If a file with the same name exists in 2 images, the file from the later image is used.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.18.0/docs/resources/google_dataproc_gdc_spark_application#dependency_images GoogleDataprocGdcSparkApplication#dependency_images}
    DependencyImages *[]*string `field:"optional" json:"dependencyImages" yaml:"dependencyImages"`
    // User-provided human-readable name to be used in user interfaces.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.18.0/docs/resources/google_dataproc_gdc_spark_application#display_name GoogleDataprocGdcSparkApplication#display_name}
    DisplayName *string `field:"optional" json:"displayName" yaml:"displayName"`
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.18.0/docs/resources/google_dataproc_gdc_spark_application#id GoogleDataprocGdcSparkApplication#id}.
    //
    // Please be aware that the id field is automatically added to all resources in Terraform providers using a Terraform provider SDK version below 2.
    // If you experience problems setting this value it might not be settable. Please take a look at the provider documentation to ensure it should be settable.
    Id *string `field:"optional" json:"id" yaml:"id"`
    // The labels to associate with this application. Labels may be used for filtering and billing tracking.
    //
    // **Note**: This field is non-authoritative, and will only manage the labels present in your configuration.
    // Please refer to the field 'effective_labels' for all of the labels present on the resource.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.18.0/docs/resources/google_dataproc_gdc_spark_application#labels GoogleDataprocGdcSparkApplication#labels}
    Labels *map[string]*string `field:"optional" json:"labels" yaml:"labels"`
    // The Kubernetes namespace in which to create the application. This namespace must already exist on the cluster.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.18.0/docs/resources/google_dataproc_gdc_spark_application#namespace GoogleDataprocGdcSparkApplication#namespace}
    Namespace *string `field:"optional" json:"namespace" yaml:"namespace"`
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.18.0/docs/resources/google_dataproc_gdc_spark_application#project GoogleDataprocGdcSparkApplication#project}.
    Project *string `field:"optional" json:"project" yaml:"project"`
    // Application-specific properties.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.18.0/docs/resources/google_dataproc_gdc_spark_application#properties GoogleDataprocGdcSparkApplication#properties}
    Properties *map[string]*string `field:"optional" json:"properties" yaml:"properties"`
    // pyspark_application_config block.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.18.0/docs/resources/google_dataproc_gdc_spark_application#pyspark_application_config GoogleDataprocGdcSparkApplication#pyspark_application_config}
    PysparkApplicationConfig *GoogleDataprocGdcSparkApplicationPysparkApplicationConfig `field:"optional" json:"pysparkApplicationConfig" yaml:"pysparkApplicationConfig"`
    // spark_application_config block.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.18.0/docs/resources/google_dataproc_gdc_spark_application#spark_application_config GoogleDataprocGdcSparkApplication#spark_application_config}
    SparkApplicationConfig *GoogleDataprocGdcSparkApplicationSparkApplicationConfig `field:"optional" json:"sparkApplicationConfig" yaml:"sparkApplicationConfig"`
    // spark_r_application_config block.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.18.0/docs/resources/google_dataproc_gdc_spark_application#spark_r_application_config GoogleDataprocGdcSparkApplication#spark_r_application_config}
    SparkRApplicationConfig *GoogleDataprocGdcSparkApplicationSparkRApplicationConfig `field:"optional" json:"sparkRApplicationConfig" yaml:"sparkRApplicationConfig"`
    // spark_sql_application_config block.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.18.0/docs/resources/google_dataproc_gdc_spark_application#spark_sql_application_config GoogleDataprocGdcSparkApplication#spark_sql_application_config}
    SparkSqlApplicationConfig *GoogleDataprocGdcSparkApplicationSparkSqlApplicationConfig `field:"optional" json:"sparkSqlApplicationConfig" yaml:"sparkSqlApplicationConfig"`
    // timeouts block.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.18.0/docs/resources/google_dataproc_gdc_spark_application#timeouts GoogleDataprocGdcSparkApplication#timeouts}
    Timeouts *GoogleDataprocGdcSparkApplicationTimeouts `field:"optional" json:"timeouts" yaml:"timeouts"`
    // The Dataproc version of this application.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.18.0/docs/resources/google_dataproc_gdc_spark_application#version GoogleDataprocGdcSparkApplication#version}
    Version *string `field:"optional" json:"version" yaml:"version"`
}
type GoogleDataprocGdcSparkApplicationPysparkApplicationConfig ¶
type GoogleDataprocGdcSparkApplicationPysparkApplicationConfig struct {
    // The HCFS URI of the main Python file to use as the driver. Must be a .py file.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.18.0/docs/resources/google_dataproc_gdc_spark_application#main_python_file_uri GoogleDataprocGdcSparkApplication#main_python_file_uri}
    MainPythonFileUri *string `field:"required" json:"mainPythonFileUri" yaml:"mainPythonFileUri"`
    // HCFS URIs of archives to be extracted into the working directory of each executor.
    //
    // Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.18.0/docs/resources/google_dataproc_gdc_spark_application#archive_uris GoogleDataprocGdcSparkApplication#archive_uris}
    ArchiveUris *[]*string `field:"optional" json:"archiveUris" yaml:"archiveUris"`
    // The arguments to pass to the driver.
    //
    // Do not include arguments, such as '--conf', that can be set as job properties, since a collision may occur that causes an incorrect job submission.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.18.0/docs/resources/google_dataproc_gdc_spark_application#args GoogleDataprocGdcSparkApplication#args}
    Args *[]*string `field:"optional" json:"args" yaml:"args"`
    // HCFS URIs of files to be placed in the working directory of each executor. Useful for naively parallel tasks.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.18.0/docs/resources/google_dataproc_gdc_spark_application#file_uris GoogleDataprocGdcSparkApplication#file_uris}
    FileUris *[]*string `field:"optional" json:"fileUris" yaml:"fileUris"`
    // HCFS URIs of jar files to add to the CLASSPATHs of the Python driver and tasks.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.18.0/docs/resources/google_dataproc_gdc_spark_application#jar_file_uris GoogleDataprocGdcSparkApplication#jar_file_uris}
    JarFileUris *[]*string `field:"optional" json:"jarFileUris" yaml:"jarFileUris"`
    // HCFS file URIs of Python files to pass to the PySpark framework. Supported file types: .py, .egg, and .zip.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.18.0/docs/resources/google_dataproc_gdc_spark_application#python_file_uris GoogleDataprocGdcSparkApplication#python_file_uris}
    PythonFileUris *[]*string `field:"optional" json:"pythonFileUris" yaml:"pythonFileUris"`
}
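For illustration, a PySpark block built from the fields above might look like this (the package alias `gdc` and the bucket URIs are assumptions):

```go
// Illustrative only: field names come from the struct above; the package
// alias `gdc` and the gs:// URIs are assumptions.
pyspark := &gdc.GoogleDataprocGdcSparkApplicationPysparkApplicationConfig{
	MainPythonFileUri: jsii.String("gs://my-bucket/jobs/main.py"), // required; must be a .py file
	PythonFileUris:    jsii.Strings("gs://my-bucket/jobs/helpers.zip"),
	Args:              jsii.Strings("--input", "gs://my-bucket/data/"),
}
```

The block can then be passed as the PysparkApplicationConfig field of GoogleDataprocGdcSparkApplicationConfig, or applied later via PutPysparkApplicationConfig.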
type GoogleDataprocGdcSparkApplicationPysparkApplicationConfigOutputReference ¶
type GoogleDataprocGdcSparkApplicationPysparkApplicationConfigOutputReference interface {
    cdktf.ComplexObject
    ArchiveUris() *[]*string
    SetArchiveUris(val *[]*string)
    ArchiveUrisInput() *[]*string
    Args() *[]*string
    SetArgs(val *[]*string)
    ArgsInput() *[]*string
    // The index of the complex object in a list.
    // Experimental.
    ComplexObjectIndex() interface{}
    // Experimental.
    SetComplexObjectIndex(val interface{})
    // Set to true if this item is from inside a set and needs tolist() for accessing it; set to "0" for single list items.
    // Experimental.
    ComplexObjectIsFromSet() *bool
    // Experimental.
    SetComplexObjectIsFromSet(val *bool)
    // The creation stack of this resolvable which will be appended to errors thrown during resolution.
    //
    // If this returns an empty array the stack will not be attached.
    // Experimental.
    CreationStack() *[]*string
    FileUris() *[]*string
    SetFileUris(val *[]*string)
    FileUrisInput() *[]*string
    // Experimental.
    Fqn() *string
    InternalValue() *GoogleDataprocGdcSparkApplicationPysparkApplicationConfig
    SetInternalValue(val *GoogleDataprocGdcSparkApplicationPysparkApplicationConfig)
    JarFileUris() *[]*string
    SetJarFileUris(val *[]*string)
    JarFileUrisInput() *[]*string
    MainPythonFileUri() *string
    SetMainPythonFileUri(val *string)
    MainPythonFileUriInput() *string
    PythonFileUris() *[]*string
    SetPythonFileUris(val *[]*string)
    PythonFileUrisInput() *[]*string
    // Experimental.
    TerraformAttribute() *string
    // Experimental.
    SetTerraformAttribute(val *string)
    // Experimental.
    TerraformResource() cdktf.IInterpolatingParent
    // Experimental.
    SetTerraformResource(val cdktf.IInterpolatingParent)
    // Experimental.
    ComputeFqn() *string
    // Experimental.
    GetAnyMapAttribute(terraformAttribute *string) *map[string]interface{}
    // Experimental.
    GetBooleanAttribute(terraformAttribute *string) cdktf.IResolvable
    // Experimental.
    GetBooleanMapAttribute(terraformAttribute *string) *map[string]*bool
    // Experimental.
    GetListAttribute(terraformAttribute *string) *[]*string
    // Experimental.
    GetNumberAttribute(terraformAttribute *string) *float64
    // Experimental.
    GetNumberListAttribute(terraformAttribute *string) *[]*float64
    // Experimental.
    GetNumberMapAttribute(terraformAttribute *string) *map[string]*float64
    // Experimental.
    GetStringAttribute(terraformAttribute *string) *string
    // Experimental.
    GetStringMapAttribute(terraformAttribute *string) *map[string]*string
    // Experimental.
    InterpolationAsList() cdktf.IResolvable
    // Experimental.
    InterpolationForAttribute(property *string) cdktf.IResolvable
    ResetArchiveUris()
    ResetArgs()
    ResetFileUris()
    ResetJarFileUris()
    ResetPythonFileUris()
    // Produce the Token's value at resolution time.
    // Experimental.
    Resolve(_context cdktf.IResolveContext) interface{}
    // Return a string representation of this resolvable object.
    //
    // Returns a reversible string representation.
    // Experimental.
    ToString() *string
}
func NewGoogleDataprocGdcSparkApplicationPysparkApplicationConfigOutputReference ¶
func NewGoogleDataprocGdcSparkApplicationPysparkApplicationConfigOutputReference(terraformResource cdktf.IInterpolatingParent, terraformAttribute *string) GoogleDataprocGdcSparkApplicationPysparkApplicationConfigOutputReference
type GoogleDataprocGdcSparkApplicationSparkApplicationConfig ¶
type GoogleDataprocGdcSparkApplicationSparkApplicationConfig struct {
    // HCFS URIs of archives to be extracted into the working directory of each executor.
    //
    // Supported file types: '.jar', '.tar', '.tar.gz', '.tgz', and '.zip'.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.18.0/docs/resources/google_dataproc_gdc_spark_application#archive_uris GoogleDataprocGdcSparkApplication#archive_uris}
    ArchiveUris *[]*string `field:"optional" json:"archiveUris" yaml:"archiveUris"`
    // The arguments to pass to the driver.
    //
    // Do not include arguments that can be set as application properties, such as '--conf', since a collision can occur that causes an incorrect application submission.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.18.0/docs/resources/google_dataproc_gdc_spark_application#args GoogleDataprocGdcSparkApplication#args}
    Args *[]*string `field:"optional" json:"args" yaml:"args"`
    // HCFS URIs of files to be placed in the working directory of each executor.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.18.0/docs/resources/google_dataproc_gdc_spark_application#file_uris GoogleDataprocGdcSparkApplication#file_uris}
    FileUris *[]*string `field:"optional" json:"fileUris" yaml:"fileUris"`
    // HCFS URIs of jar files to add to the classpath of the Spark driver and tasks.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.18.0/docs/resources/google_dataproc_gdc_spark_application#jar_file_uris GoogleDataprocGdcSparkApplication#jar_file_uris}
    JarFileUris *[]*string `field:"optional" json:"jarFileUris" yaml:"jarFileUris"`
    // The name of the driver main class.
    //
    // The jar file that contains the class must be in the classpath or specified in 'jar_file_uris'.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.18.0/docs/resources/google_dataproc_gdc_spark_application#main_class GoogleDataprocGdcSparkApplication#main_class}
    MainClass *string `field:"optional" json:"mainClass" yaml:"mainClass"`
    // The HCFS URI of the jar file that contains the main class.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.18.0/docs/resources/google_dataproc_gdc_spark_application#main_jar_file_uri GoogleDataprocGdcSparkApplication#main_jar_file_uri}
    MainJarFileUri *string `field:"optional" json:"mainJarFileUri" yaml:"mainJarFileUri"`
}
type GoogleDataprocGdcSparkApplicationSparkApplicationConfigOutputReference ¶
type GoogleDataprocGdcSparkApplicationSparkApplicationConfigOutputReference interface {
    cdktf.ComplexObject
    ArchiveUris() *[]*string
    SetArchiveUris(val *[]*string)
    ArchiveUrisInput() *[]*string
    Args() *[]*string
    SetArgs(val *[]*string)
    ArgsInput() *[]*string
    // The index of the complex object in a list.
    // Experimental.
    ComplexObjectIndex() interface{}
    // Experimental.
    SetComplexObjectIndex(val interface{})
    // Set to true if this item is from inside a set and needs tolist() for accessing it; set to "0" for single list items.
    // Experimental.
    ComplexObjectIsFromSet() *bool
    // Experimental.
    SetComplexObjectIsFromSet(val *bool)
    // The creation stack of this resolvable which will be appended to errors thrown during resolution.
    //
    // If this returns an empty array the stack will not be attached.
    // Experimental.
    CreationStack() *[]*string
    FileUris() *[]*string
    SetFileUris(val *[]*string)
    FileUrisInput() *[]*string
    // Experimental.
    Fqn() *string
    InternalValue() *GoogleDataprocGdcSparkApplicationSparkApplicationConfig
    SetInternalValue(val *GoogleDataprocGdcSparkApplicationSparkApplicationConfig)
    JarFileUris() *[]*string
    SetJarFileUris(val *[]*string)
    JarFileUrisInput() *[]*string
    MainClass() *string
    SetMainClass(val *string)
    MainClassInput() *string
    MainJarFileUri() *string
    SetMainJarFileUri(val *string)
    MainJarFileUriInput() *string
    // Experimental.
    TerraformAttribute() *string
    // Experimental.
    SetTerraformAttribute(val *string)
    // Experimental.
    TerraformResource() cdktf.IInterpolatingParent
    // Experimental.
    SetTerraformResource(val cdktf.IInterpolatingParent)
    // Experimental.
    ComputeFqn() *string
    // Experimental.
    GetAnyMapAttribute(terraformAttribute *string) *map[string]interface{}
    // Experimental.
    GetBooleanAttribute(terraformAttribute *string) cdktf.IResolvable
    // Experimental.
    GetBooleanMapAttribute(terraformAttribute *string) *map[string]*bool
    // Experimental.
    GetListAttribute(terraformAttribute *string) *[]*string
    // Experimental.
    GetNumberAttribute(terraformAttribute *string) *float64
    // Experimental.
    GetNumberListAttribute(terraformAttribute *string) *[]*float64
    // Experimental.
    GetNumberMapAttribute(terraformAttribute *string) *map[string]*float64
    // Experimental.
    GetStringAttribute(terraformAttribute *string) *string
    // Experimental.
    GetStringMapAttribute(terraformAttribute *string) *map[string]*string
    // Experimental.
    InterpolationAsList() cdktf.IResolvable
    // Experimental.
    InterpolationForAttribute(property *string) cdktf.IResolvable
    ResetArchiveUris()
    ResetArgs()
    ResetFileUris()
    ResetJarFileUris()
    ResetMainClass()
    ResetMainJarFileUri()
    // Produce the Token's value at resolution time.
    // Experimental.
    Resolve(_context cdktf.IResolveContext) interface{}
    // Return a string representation of this resolvable object.
    //
    // Returns a reversible string representation.
    // Experimental.
    ToString() *string
}
func NewGoogleDataprocGdcSparkApplicationSparkApplicationConfigOutputReference ¶
func NewGoogleDataprocGdcSparkApplicationSparkApplicationConfigOutputReference(terraformResource cdktf.IInterpolatingParent, terraformAttribute *string) GoogleDataprocGdcSparkApplicationSparkApplicationConfigOutputReference
type GoogleDataprocGdcSparkApplicationSparkRApplicationConfig ¶
type GoogleDataprocGdcSparkApplicationSparkRApplicationConfig struct {
	// The HCFS URI of the main R file to use as the driver. Must be a .R file.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.18.0/docs/resources/google_dataproc_gdc_spark_application#main_r_file_uri GoogleDataprocGdcSparkApplication#main_r_file_uri}
	MainRFileUri *string `field:"required" json:"mainRFileUri" yaml:"mainRFileUri"`
	// HCFS URIs of archives to be extracted into the working directory of each executor.
	//
	// Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.18.0/docs/resources/google_dataproc_gdc_spark_application#archive_uris GoogleDataprocGdcSparkApplication#archive_uris}
	ArchiveUris *[]*string `field:"optional" json:"archiveUris" yaml:"archiveUris"`
	// The arguments to pass to the driver.
	//
	// Do not include arguments, such as '--conf', that can be set as job properties, since a collision may occur that causes an incorrect job submission.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.18.0/docs/resources/google_dataproc_gdc_spark_application#args GoogleDataprocGdcSparkApplication#args}
	Args *[]*string `field:"optional" json:"args" yaml:"args"`
	// HCFS URIs of files to be placed in the working directory of each executor. Useful for naively parallel tasks.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.18.0/docs/resources/google_dataproc_gdc_spark_application#file_uris GoogleDataprocGdcSparkApplication#file_uris}
	FileUris *[]*string `field:"optional" json:"fileUris" yaml:"fileUris"`
}
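Every field in these generated structs is a pointer (`*string`, `*[]*string`), following the jsii convention of the cdktf Go bindings, so plain Go literals need conversion before assignment. A small helper along these lines (hypothetical, not part of this package) keeps struct construction readable:

```go
package main

import "fmt"

// strSlicePtr converts ordinary string arguments into the *[]*string
// shape that the generated cdktf structs use for list fields such as
// ArchiveUris, Args, and FileUris.
func strSlicePtr(vals ...string) *[]*string {
	out := make([]*string, len(vals))
	for i := range vals {
		v := vals[i] // copy so each element gets a distinct pointer
		out[i] = &v
	}
	return &out
}

func main() {
	archiveUris := strSlicePtr("hdfs:///deps.tar.gz", "hdfs:///conf.zip")
	for _, p := range *archiveUris {
		fmt.Println(*p)
	}
}
```

With such a helper, an `ArchiveUris` field can be populated as `ArchiveUris: strSlicePtr("hdfs:///deps.tar.gz")` instead of building the pointer slice by hand. (cdktf also ships a `jsii.Strings` helper with similar intent in its own module.)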
type GoogleDataprocGdcSparkApplicationSparkRApplicationConfigOutputReference ¶
type GoogleDataprocGdcSparkApplicationSparkRApplicationConfigOutputReference interface {
	cdktf.ComplexObject
	ArchiveUris() *[]*string
	SetArchiveUris(val *[]*string)
	ArchiveUrisInput() *[]*string
	Args() *[]*string
	SetArgs(val *[]*string)
	ArgsInput() *[]*string
	// the index of the complex object in a list.
	// Experimental.
	ComplexObjectIndex() interface{}
	// Experimental.
	SetComplexObjectIndex(val interface{})
	// set to true if this item is from inside a set and needs tolist() for accessing it set to "0" for single list items.
	// Experimental.
	ComplexObjectIsFromSet() *bool
	// Experimental.
	SetComplexObjectIsFromSet(val *bool)
	// The creation stack of this resolvable which will be appended to errors thrown during resolution.
	//
	// If this returns an empty array the stack will not be attached.
	// Experimental.
	CreationStack() *[]*string
	FileUris() *[]*string
	SetFileUris(val *[]*string)
	FileUrisInput() *[]*string
	// Experimental.
	Fqn() *string
	InternalValue() *GoogleDataprocGdcSparkApplicationSparkRApplicationConfig
	SetInternalValue(val *GoogleDataprocGdcSparkApplicationSparkRApplicationConfig)
	MainRFileUri() *string
	SetMainRFileUri(val *string)
	MainRFileUriInput() *string
	// Experimental.
	TerraformAttribute() *string
	// Experimental.
	SetTerraformAttribute(val *string)
	// Experimental.
	TerraformResource() cdktf.IInterpolatingParent
	// Experimental.
	SetTerraformResource(val cdktf.IInterpolatingParent)
	// Experimental.
	ComputeFqn() *string
	// Experimental.
	GetAnyMapAttribute(terraformAttribute *string) *map[string]interface{}
	// Experimental.
	GetBooleanAttribute(terraformAttribute *string) cdktf.IResolvable
	// Experimental.
	GetBooleanMapAttribute(terraformAttribute *string) *map[string]*bool
	// Experimental.
	GetListAttribute(terraformAttribute *string) *[]*string
	// Experimental.
	GetNumberAttribute(terraformAttribute *string) *float64
	// Experimental.
	GetNumberListAttribute(terraformAttribute *string) *[]*float64
	// Experimental.
	GetNumberMapAttribute(terraformAttribute *string) *map[string]*float64
	// Experimental.
	GetStringAttribute(terraformAttribute *string) *string
	// Experimental.
	GetStringMapAttribute(terraformAttribute *string) *map[string]*string
	// Experimental.
	InterpolationAsList() cdktf.IResolvable
	// Experimental.
	InterpolationForAttribute(property *string) cdktf.IResolvable
	ResetArchiveUris()
	ResetArgs()
	ResetFileUris()
	// Produce the Token's value at resolution time.
	// Experimental.
	Resolve(_context cdktf.IResolveContext) interface{}
	// Return a string representation of this resolvable object.
	//
	// Returns a reversible string representation.
	// Experimental.
	ToString() *string
}
func NewGoogleDataprocGdcSparkApplicationSparkRApplicationConfigOutputReference ¶
func NewGoogleDataprocGdcSparkApplicationSparkRApplicationConfigOutputReference(terraformResource cdktf.IInterpolatingParent, terraformAttribute *string) GoogleDataprocGdcSparkApplicationSparkRApplicationConfigOutputReference
type GoogleDataprocGdcSparkApplicationSparkSqlApplicationConfig ¶
type GoogleDataprocGdcSparkApplicationSparkSqlApplicationConfig struct {
	// HCFS URIs of jar files to be added to the Spark CLASSPATH.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.18.0/docs/resources/google_dataproc_gdc_spark_application#jar_file_uris GoogleDataprocGdcSparkApplication#jar_file_uris}
	JarFileUris *[]*string `field:"optional" json:"jarFileUris" yaml:"jarFileUris"`
	// The HCFS URI of the script that contains SQL queries.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.18.0/docs/resources/google_dataproc_gdc_spark_application#query_file_uri GoogleDataprocGdcSparkApplication#query_file_uri}
	QueryFileUri *string `field:"optional" json:"queryFileUri" yaml:"queryFileUri"`
	// query_list block.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.18.0/docs/resources/google_dataproc_gdc_spark_application#query_list GoogleDataprocGdcSparkApplication#query_list}
	QueryList *GoogleDataprocGdcSparkApplicationSparkSqlApplicationConfigQueryListStruct `field:"optional" json:"queryList" yaml:"queryList"`
	// Mapping of query variable names to values (equivalent to the Spark SQL command: SET 'name="value";').
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.18.0/docs/resources/google_dataproc_gdc_spark_application#script_variables GoogleDataprocGdcSparkApplication#script_variables}
	ScriptVariables *map[string]*string `field:"optional" json:"scriptVariables" yaml:"scriptVariables"`
}
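`ScriptVariables` carries Spark SQL substitution variables; per the field docs, each entry behaves like a `SET name="value";` statement before the queries run. Its `*map[string]*string` shape can be built from an ordinary Go map with a small conversion helper (illustrative, not part of this package):

```go
package main

import "fmt"

// strMapPtr converts a plain map[string]string into the *map[string]*string
// shape the generated ScriptVariables field expects.
func strMapPtr(kv map[string]string) *map[string]*string {
	out := make(map[string]*string, len(kv))
	for k, v := range kv {
		v := v // copy so each value gets a distinct pointer
		out[k] = &v
	}
	return &out
}

func main() {
	vars := strMapPtr(map[string]string{
		"env": "prod",
		"day": "2024-01-01",
	})
	fmt.Println(*(*vars)["env"]) // prod
}
```

Note that `QueryFileUri` and `QueryList` are both optional here; in the underlying Terraform resource they are alternative ways to supply the SQL, so a config would normally set one or the other.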
type GoogleDataprocGdcSparkApplicationSparkSqlApplicationConfigOutputReference ¶
type GoogleDataprocGdcSparkApplicationSparkSqlApplicationConfigOutputReference interface {
	cdktf.ComplexObject
	// the index of the complex object in a list.
	// Experimental.
	ComplexObjectIndex() interface{}
	// Experimental.
	SetComplexObjectIndex(val interface{})
	// set to true if this item is from inside a set and needs tolist() for accessing it set to "0" for single list items.
	// Experimental.
	ComplexObjectIsFromSet() *bool
	// Experimental.
	SetComplexObjectIsFromSet(val *bool)
	// The creation stack of this resolvable which will be appended to errors thrown during resolution.
	//
	// If this returns an empty array the stack will not be attached.
	// Experimental.
	CreationStack() *[]*string
	// Experimental.
	Fqn() *string
	InternalValue() *GoogleDataprocGdcSparkApplicationSparkSqlApplicationConfig
	SetInternalValue(val *GoogleDataprocGdcSparkApplicationSparkSqlApplicationConfig)
	JarFileUris() *[]*string
	SetJarFileUris(val *[]*string)
	JarFileUrisInput() *[]*string
	QueryFileUri() *string
	SetQueryFileUri(val *string)
	QueryFileUriInput() *string
	QueryList() GoogleDataprocGdcSparkApplicationSparkSqlApplicationConfigQueryListStructOutputReference
	QueryListInput() *GoogleDataprocGdcSparkApplicationSparkSqlApplicationConfigQueryListStruct
	ScriptVariables() *map[string]*string
	SetScriptVariables(val *map[string]*string)
	ScriptVariablesInput() *map[string]*string
	// Experimental.
	TerraformAttribute() *string
	// Experimental.
	SetTerraformAttribute(val *string)
	// Experimental.
	TerraformResource() cdktf.IInterpolatingParent
	// Experimental.
	SetTerraformResource(val cdktf.IInterpolatingParent)
	// Experimental.
	ComputeFqn() *string
	// Experimental.
	GetAnyMapAttribute(terraformAttribute *string) *map[string]interface{}
	// Experimental.
	GetBooleanAttribute(terraformAttribute *string) cdktf.IResolvable
	// Experimental.
	GetBooleanMapAttribute(terraformAttribute *string) *map[string]*bool
	// Experimental.
	GetListAttribute(terraformAttribute *string) *[]*string
	// Experimental.
	GetNumberAttribute(terraformAttribute *string) *float64
	// Experimental.
	GetNumberListAttribute(terraformAttribute *string) *[]*float64
	// Experimental.
	GetNumberMapAttribute(terraformAttribute *string) *map[string]*float64
	// Experimental.
	GetStringAttribute(terraformAttribute *string) *string
	// Experimental.
	GetStringMapAttribute(terraformAttribute *string) *map[string]*string
	// Experimental.
	InterpolationAsList() cdktf.IResolvable
	// Experimental.
	InterpolationForAttribute(property *string) cdktf.IResolvable
	PutQueryList(value *GoogleDataprocGdcSparkApplicationSparkSqlApplicationConfigQueryListStruct)
	ResetJarFileUris()
	ResetQueryFileUri()
	ResetQueryList()
	ResetScriptVariables()
	// Produce the Token's value at resolution time.
	// Experimental.
	Resolve(_context cdktf.IResolveContext) interface{}
	// Return a string representation of this resolvable object.
	//
	// Returns a reversible string representation.
	// Experimental.
	ToString() *string
}
func NewGoogleDataprocGdcSparkApplicationSparkSqlApplicationConfigOutputReference ¶
func NewGoogleDataprocGdcSparkApplicationSparkSqlApplicationConfigOutputReference(terraformResource cdktf.IInterpolatingParent, terraformAttribute *string) GoogleDataprocGdcSparkApplicationSparkSqlApplicationConfigOutputReference
type GoogleDataprocGdcSparkApplicationSparkSqlApplicationConfigQueryListStruct ¶
type GoogleDataprocGdcSparkApplicationSparkSqlApplicationConfigQueryListStruct struct {
	// The queries to run.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.18.0/docs/resources/google_dataproc_gdc_spark_application#queries GoogleDataprocGdcSparkApplication#queries}
	Queries *[]*string `field:"required" json:"queries" yaml:"queries"`
}
type GoogleDataprocGdcSparkApplicationSparkSqlApplicationConfigQueryListStructOutputReference ¶
type GoogleDataprocGdcSparkApplicationSparkSqlApplicationConfigQueryListStructOutputReference interface {
	cdktf.ComplexObject
	// the index of the complex object in a list.
	// Experimental.
	ComplexObjectIndex() interface{}
	// Experimental.
	SetComplexObjectIndex(val interface{})
	// set to true if this item is from inside a set and needs tolist() for accessing it set to "0" for single list items.
	// Experimental.
	ComplexObjectIsFromSet() *bool
	// Experimental.
	SetComplexObjectIsFromSet(val *bool)
	// The creation stack of this resolvable which will be appended to errors thrown during resolution.
	//
	// If this returns an empty array the stack will not be attached.
	// Experimental.
	CreationStack() *[]*string
	// Experimental.
	Fqn() *string
	InternalValue() *GoogleDataprocGdcSparkApplicationSparkSqlApplicationConfigQueryListStruct
	SetInternalValue(val *GoogleDataprocGdcSparkApplicationSparkSqlApplicationConfigQueryListStruct)
	Queries() *[]*string
	SetQueries(val *[]*string)
	QueriesInput() *[]*string
	// Experimental.
	TerraformAttribute() *string
	// Experimental.
	SetTerraformAttribute(val *string)
	// Experimental.
	TerraformResource() cdktf.IInterpolatingParent
	// Experimental.
	SetTerraformResource(val cdktf.IInterpolatingParent)
	// Experimental.
	ComputeFqn() *string
	// Experimental.
	GetAnyMapAttribute(terraformAttribute *string) *map[string]interface{}
	// Experimental.
	GetBooleanAttribute(terraformAttribute *string) cdktf.IResolvable
	// Experimental.
	GetBooleanMapAttribute(terraformAttribute *string) *map[string]*bool
	// Experimental.
	GetListAttribute(terraformAttribute *string) *[]*string
	// Experimental.
	GetNumberAttribute(terraformAttribute *string) *float64
	// Experimental.
	GetNumberListAttribute(terraformAttribute *string) *[]*float64
	// Experimental.
	GetNumberMapAttribute(terraformAttribute *string) *map[string]*float64
	// Experimental.
	GetStringAttribute(terraformAttribute *string) *string
	// Experimental.
	GetStringMapAttribute(terraformAttribute *string) *map[string]*string
	// Experimental.
	InterpolationAsList() cdktf.IResolvable
	// Experimental.
	InterpolationForAttribute(property *string) cdktf.IResolvable
	// Produce the Token's value at resolution time.
	// Experimental.
	Resolve(_context cdktf.IResolveContext) interface{}
	// Return a string representation of this resolvable object.
	//
	// Returns a reversible string representation.
	// Experimental.
	ToString() *string
}
func NewGoogleDataprocGdcSparkApplicationSparkSqlApplicationConfigQueryListStructOutputReference ¶
func NewGoogleDataprocGdcSparkApplicationSparkSqlApplicationConfigQueryListStructOutputReference(terraformResource cdktf.IInterpolatingParent, terraformAttribute *string) GoogleDataprocGdcSparkApplicationSparkSqlApplicationConfigQueryListStructOutputReference
type GoogleDataprocGdcSparkApplicationTimeouts ¶
type GoogleDataprocGdcSparkApplicationTimeouts struct {
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.18.0/docs/resources/google_dataproc_gdc_spark_application#create GoogleDataprocGdcSparkApplication#create}.
	Create *string `field:"optional" json:"create" yaml:"create"`
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.18.0/docs/resources/google_dataproc_gdc_spark_application#delete GoogleDataprocGdcSparkApplication#delete}.
	Delete *string `field:"optional" json:"delete" yaml:"delete"`
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/6.18.0/docs/resources/google_dataproc_gdc_spark_application#update GoogleDataprocGdcSparkApplication#update}.
	Update *string `field:"optional" json:"update" yaml:"update"`
}
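The `Create`, `Delete`, and `Update` fields take Terraform operation-timeout strings such as `"30m"` or `"1h30m"`. Terraform parses these with Go's duration syntax, so `time.ParseDuration` can sanity-check a value before synthesis; the helper below is a sketch under that assumption, not part of the generated package:

```go
package main

import (
	"fmt"
	"time"
)

// validTimeout reports whether s parses as a positive Go-style duration
// string, the format Terraform accepts for resource timeouts.
func validTimeout(s string) bool {
	d, err := time.ParseDuration(s)
	return err == nil && d > 0
}

func main() {
	for _, s := range []string{"30m", "1h30m", "soon"} {
		fmt.Printf("%q valid: %v\n", s, validTimeout(s))
	}
}
```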
type GoogleDataprocGdcSparkApplicationTimeoutsOutputReference ¶
type GoogleDataprocGdcSparkApplicationTimeoutsOutputReference interface {
	cdktf.ComplexObject
	// the index of the complex object in a list.
	// Experimental.
	ComplexObjectIndex() interface{}
	// Experimental.
	SetComplexObjectIndex(val interface{})
	// set to true if this item is from inside a set and needs tolist() for accessing it set to "0" for single list items.
	// Experimental.
	ComplexObjectIsFromSet() *bool
	// Experimental.
	SetComplexObjectIsFromSet(val *bool)
	Create() *string
	SetCreate(val *string)
	CreateInput() *string
	// The creation stack of this resolvable which will be appended to errors thrown during resolution.
	//
	// If this returns an empty array the stack will not be attached.
	// Experimental.
	CreationStack() *[]*string
	Delete() *string
	SetDelete(val *string)
	DeleteInput() *string
	// Experimental.
	Fqn() *string
	InternalValue() interface{}
	SetInternalValue(val interface{})
	// Experimental.
	TerraformAttribute() *string
	// Experimental.
	SetTerraformAttribute(val *string)
	// Experimental.
	TerraformResource() cdktf.IInterpolatingParent
	// Experimental.
	SetTerraformResource(val cdktf.IInterpolatingParent)
	Update() *string
	SetUpdate(val *string)
	UpdateInput() *string
	// Experimental.
	ComputeFqn() *string
	// Experimental.
	GetAnyMapAttribute(terraformAttribute *string) *map[string]interface{}
	// Experimental.
	GetBooleanAttribute(terraformAttribute *string) cdktf.IResolvable
	// Experimental.
	GetBooleanMapAttribute(terraformAttribute *string) *map[string]*bool
	// Experimental.
	GetListAttribute(terraformAttribute *string) *[]*string
	// Experimental.
	GetNumberAttribute(terraformAttribute *string) *float64
	// Experimental.
	GetNumberListAttribute(terraformAttribute *string) *[]*float64
	// Experimental.
	GetNumberMapAttribute(terraformAttribute *string) *map[string]*float64
	// Experimental.
	GetStringAttribute(terraformAttribute *string) *string
	// Experimental.
	GetStringMapAttribute(terraformAttribute *string) *map[string]*string
	// Experimental.
	InterpolationAsList() cdktf.IResolvable
	// Experimental.
	InterpolationForAttribute(property *string) cdktf.IResolvable
	ResetCreate()
	ResetDelete()
	ResetUpdate()
	// Produce the Token's value at resolution time.
	// Experimental.
	Resolve(_context cdktf.IResolveContext) interface{}
	// Return a string representation of this resolvable object.
	//
	// Returns a reversible string representation.
	// Experimental.
	ToString() *string
}
func NewGoogleDataprocGdcSparkApplicationTimeoutsOutputReference ¶
func NewGoogleDataprocGdcSparkApplicationTimeoutsOutputReference(terraformResource cdktf.IInterpolatingParent, terraformAttribute *string) GoogleDataprocGdcSparkApplicationTimeoutsOutputReference
Source Files ¶
- GoogleDataprocGdcSparkApplication.go
- GoogleDataprocGdcSparkApplicationConfig.go
- GoogleDataprocGdcSparkApplicationPysparkApplicationConfig.go
- GoogleDataprocGdcSparkApplicationPysparkApplicationConfigOutputReference.go
- GoogleDataprocGdcSparkApplicationPysparkApplicationConfigOutputReference__checks.go
- GoogleDataprocGdcSparkApplicationSparkApplicationConfig.go
- GoogleDataprocGdcSparkApplicationSparkApplicationConfigOutputReference.go
- GoogleDataprocGdcSparkApplicationSparkApplicationConfigOutputReference__checks.go
- GoogleDataprocGdcSparkApplicationSparkRApplicationConfig.go
- GoogleDataprocGdcSparkApplicationSparkRApplicationConfigOutputReference.go
- GoogleDataprocGdcSparkApplicationSparkRApplicationConfigOutputReference__checks.go
- GoogleDataprocGdcSparkApplicationSparkSqlApplicationConfig.go
- GoogleDataprocGdcSparkApplicationSparkSqlApplicationConfigOutputReference.go
- GoogleDataprocGdcSparkApplicationSparkSqlApplicationConfigOutputReference__checks.go
- GoogleDataprocGdcSparkApplicationSparkSqlApplicationConfigQueryListStruct.go
- GoogleDataprocGdcSparkApplicationSparkSqlApplicationConfigQueryListStructOutputReference.go
- GoogleDataprocGdcSparkApplicationSparkSqlApplicationConfigQueryListStructOutputReference__checks.go
- GoogleDataprocGdcSparkApplicationTimeouts.go
- GoogleDataprocGdcSparkApplicationTimeoutsOutputReference.go
- GoogleDataprocGdcSparkApplicationTimeoutsOutputReference__checks.go
- GoogleDataprocGdcSparkApplication__checks.go
- main.go