Documentation ¶
Index ¶
- func GoogleDataPipelinePipeline_GenerateConfigForImport(scope constructs.Construct, importToId *string, importFromId *string, ...) cdktf.ImportableResource
- func GoogleDataPipelinePipeline_IsConstruct(x interface{}) *bool
- func GoogleDataPipelinePipeline_IsTerraformElement(x interface{}) *bool
- func GoogleDataPipelinePipeline_IsTerraformResource(x interface{}) *bool
- func GoogleDataPipelinePipeline_TfResourceType() *string
- func NewGoogleDataPipelinePipelineScheduleInfoOutputReference_Override(g GoogleDataPipelinePipelineScheduleInfoOutputReference, ...)
- func NewGoogleDataPipelinePipelineTimeoutsOutputReference_Override(g GoogleDataPipelinePipelineTimeoutsOutputReference, ...)
- func NewGoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestLaunchParameterEnvironmentOutputReference_Override(...)
- func NewGoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestLaunchParameterOutputReference_Override(...)
- func NewGoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestOutputReference_Override(g GoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestOutputReference, ...)
- func NewGoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironmentOutputReference_Override(...)
- func NewGoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersOutputReference_Override(...)
- func NewGoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestOutputReference_Override(...)
- func NewGoogleDataPipelinePipelineWorkloadOutputReference_Override(g GoogleDataPipelinePipelineWorkloadOutputReference, ...)
- func NewGoogleDataPipelinePipeline_Override(g GoogleDataPipelinePipeline, scope constructs.Construct, id *string, ...)
- type GoogleDataPipelinePipeline
- type GoogleDataPipelinePipelineConfig
- type GoogleDataPipelinePipelineScheduleInfo
- type GoogleDataPipelinePipelineScheduleInfoOutputReference
- type GoogleDataPipelinePipelineTimeouts
- type GoogleDataPipelinePipelineTimeoutsOutputReference
- type GoogleDataPipelinePipelineWorkload
- type GoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequest
- type GoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestLaunchParameter
- type GoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestLaunchParameterEnvironment
- type GoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestLaunchParameterEnvironmentOutputReference
- type GoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestLaunchParameterOutputReference
- type GoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestOutputReference
- type GoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequest
- type GoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParameters
- type GoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment
- type GoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironmentOutputReference
- type GoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersOutputReference
- type GoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestOutputReference
- type GoogleDataPipelinePipelineWorkloadOutputReference
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
func GoogleDataPipelinePipeline_GenerateConfigForImport ¶
func GoogleDataPipelinePipeline_GenerateConfigForImport(scope constructs.Construct, importToId *string, importFromId *string, provider cdktf.TerraformProvider) cdktf.ImportableResource
Generates CDKTF code for importing a GoogleDataPipelinePipeline resource upon running "cdktf plan <stack-name>".
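As a sketch of how this is typically invoked (the construct ID and import ID below are placeholders, and `stack` is assumed to be an existing `cdktf.TerraformStack` with this generated package's identifiers in scope):

```go
// Sketch: register import configuration so that running
// "cdktf plan <stack-name>" generates CDKTF code for the existing resource.
// The importFromId format is a placeholder; check the resource's import
// documentation for the exact ID shape. Passing nil as the provider uses
// the default provider configuration.
GoogleDataPipelinePipeline_GenerateConfigForImport(
    stack,                            // scope: the stack to import into
    jsii.String("imported-pipeline"), // importToId: construct ID to generate
    jsii.String("projects/my-project/locations/us-central1/pipelines/my-pipeline"), // importFromId (placeholder)
    nil,
)
```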
func GoogleDataPipelinePipeline_IsConstruct ¶
func GoogleDataPipelinePipeline_IsConstruct(x interface{}) *bool
Checks if `x` is a construct.
Use this method instead of `instanceof` to properly detect `Construct` instances, even when the construct library is symlinked.
Explanation: in JavaScript, multiple copies of the `constructs` library on disk are seen as independent, completely different libraries. As a consequence, the class `Construct` in each copy of the `constructs` library is seen as a different class, and an instance of one class will not test as `instanceof` the other class. `npm install` will not create installations like this, but users may manually symlink construct libraries together or use a monorepo tool: in those cases, multiple copies of the `constructs` library can be accidentally installed, and `instanceof` will behave unpredictably. It is safest to avoid using `instanceof` and to use this type-testing method instead.
Returns: true if `x` is an object created from a class which extends `Construct`.
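In the Go bindings the check returns a `*bool`, so the result must be dereferenced; a minimal sketch, assuming `x` holds a value to test and this package's identifiers are in scope:

```go
// Sketch: prefer this helper over a Go type assertion when multiple copies
// of the constructs library may be loaded (symlinks, monorepo tooling).
if *GoogleDataPipelinePipeline_IsConstruct(x) {
    // x was created from a class extending Construct, even if the
    // constructs library is duplicated on disk.
}
```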
func GoogleDataPipelinePipeline_IsTerraformElement ¶
func GoogleDataPipelinePipeline_IsTerraformElement(x interface{}) *bool
Experimental.
func GoogleDataPipelinePipeline_IsTerraformResource ¶
func GoogleDataPipelinePipeline_IsTerraformResource(x interface{}) *bool
Experimental.
func GoogleDataPipelinePipeline_TfResourceType ¶
func GoogleDataPipelinePipeline_TfResourceType() *string
func NewGoogleDataPipelinePipelineScheduleInfoOutputReference_Override ¶
func NewGoogleDataPipelinePipelineScheduleInfoOutputReference_Override(g GoogleDataPipelinePipelineScheduleInfoOutputReference, terraformResource cdktf.IInterpolatingParent, terraformAttribute *string)
func NewGoogleDataPipelinePipelineTimeoutsOutputReference_Override ¶
func NewGoogleDataPipelinePipelineTimeoutsOutputReference_Override(g GoogleDataPipelinePipelineTimeoutsOutputReference, terraformResource cdktf.IInterpolatingParent, terraformAttribute *string)
func NewGoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestLaunchParameterEnvironmentOutputReference_Override ¶
func NewGoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestLaunchParameterEnvironmentOutputReference_Override(g GoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestLaunchParameterEnvironmentOutputReference, terraformResource cdktf.IInterpolatingParent, terraformAttribute *string)
func NewGoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestLaunchParameterOutputReference_Override ¶
func NewGoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestLaunchParameterOutputReference_Override(g GoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestLaunchParameterOutputReference, terraformResource cdktf.IInterpolatingParent, terraformAttribute *string)
func NewGoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestOutputReference_Override ¶
func NewGoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestOutputReference_Override(g GoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestOutputReference, terraformResource cdktf.IInterpolatingParent, terraformAttribute *string)
func NewGoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironmentOutputReference_Override ¶
func NewGoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironmentOutputReference_Override(g GoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironmentOutputReference, terraformResource cdktf.IInterpolatingParent, terraformAttribute *string)
func NewGoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersOutputReference_Override ¶
func NewGoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersOutputReference_Override(g GoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersOutputReference, terraformResource cdktf.IInterpolatingParent, terraformAttribute *string)
func NewGoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestOutputReference_Override ¶
func NewGoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestOutputReference_Override(g GoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestOutputReference, terraformResource cdktf.IInterpolatingParent, terraformAttribute *string)
func NewGoogleDataPipelinePipelineWorkloadOutputReference_Override ¶
func NewGoogleDataPipelinePipelineWorkloadOutputReference_Override(g GoogleDataPipelinePipelineWorkloadOutputReference, terraformResource cdktf.IInterpolatingParent, terraformAttribute *string)
func NewGoogleDataPipelinePipeline_Override ¶
func NewGoogleDataPipelinePipeline_Override(g GoogleDataPipelinePipeline, scope constructs.Construct, id *string, config *GoogleDataPipelinePipelineConfig)
Create a new {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline google_data_pipeline_pipeline} Resource.
Types ¶
type GoogleDataPipelinePipeline ¶
type GoogleDataPipelinePipeline interface {
    cdktf.TerraformResource
    // Experimental.
    CdktfStack() cdktf.TerraformStack
    // Experimental.
    Connection() interface{}
    // Experimental.
    SetConnection(val interface{})
    // Experimental.
    ConstructNodeMetadata() *map[string]interface{}
    // Experimental.
    Count() interface{}
    // Experimental.
    SetCount(val interface{})
    CreateTime() *string
    // Experimental.
    DependsOn() *[]*string
    // Experimental.
    SetDependsOn(val *[]*string)
    DisplayName() *string
    SetDisplayName(val *string)
    DisplayNameInput() *string
    // Experimental.
    ForEach() cdktf.ITerraformIterator
    // Experimental.
    SetForEach(val cdktf.ITerraformIterator)
    // Experimental.
    Fqn() *string
    // Experimental.
    FriendlyUniqueId() *string
    Id() *string
    SetId(val *string)
    IdInput() *string
    JobCount() *float64
    LastUpdateTime() *string
    // Experimental.
    Lifecycle() *cdktf.TerraformResourceLifecycle
    // Experimental.
    SetLifecycle(val *cdktf.TerraformResourceLifecycle)
    Name() *string
    SetName(val *string)
    NameInput() *string
    // The tree node.
    Node() constructs.Node
    PipelineSources() *map[string]*string
    SetPipelineSources(val *map[string]*string)
    PipelineSourcesInput() *map[string]*string
    Project() *string
    SetProject(val *string)
    ProjectInput() *string
    // Experimental.
    Provider() cdktf.TerraformProvider
    // Experimental.
    SetProvider(val cdktf.TerraformProvider)
    // Experimental.
    Provisioners() *[]interface{}
    // Experimental.
    SetProvisioners(val *[]interface{})
    // Experimental.
    RawOverrides() interface{}
    Region() *string
    SetRegion(val *string)
    RegionInput() *string
    ScheduleInfo() GoogleDataPipelinePipelineScheduleInfoOutputReference
    ScheduleInfoInput() *GoogleDataPipelinePipelineScheduleInfo
    SchedulerServiceAccountEmail() *string
    SetSchedulerServiceAccountEmail(val *string)
    SchedulerServiceAccountEmailInput() *string
    State() *string
    SetState(val *string)
    StateInput() *string
    // Experimental.
    TerraformGeneratorMetadata() *cdktf.TerraformProviderGeneratorMetadata
    // Experimental.
    TerraformMetaArguments() *map[string]interface{}
    // Experimental.
    TerraformResourceType() *string
    Timeouts() GoogleDataPipelinePipelineTimeoutsOutputReference
    TimeoutsInput() interface{}
    Type() *string
    SetType(val *string)
    TypeInput() *string
    Workload() GoogleDataPipelinePipelineWorkloadOutputReference
    WorkloadInput() *GoogleDataPipelinePipelineWorkload
    // Adds a user defined moveTarget string to this resource to be later used in .moveTo(moveTarget) to resolve the location of the move.
    // Experimental.
    AddMoveTarget(moveTarget *string)
    // Experimental.
    AddOverride(path *string, value interface{})
    // Experimental.
    GetAnyMapAttribute(terraformAttribute *string) *map[string]interface{}
    // Experimental.
    GetBooleanAttribute(terraformAttribute *string) cdktf.IResolvable
    // Experimental.
    GetBooleanMapAttribute(terraformAttribute *string) *map[string]*bool
    // Experimental.
    GetListAttribute(terraformAttribute *string) *[]*string
    // Experimental.
    GetNumberAttribute(terraformAttribute *string) *float64
    // Experimental.
    GetNumberListAttribute(terraformAttribute *string) *[]*float64
    // Experimental.
    GetNumberMapAttribute(terraformAttribute *string) *map[string]*float64
    // Experimental.
    GetStringAttribute(terraformAttribute *string) *string
    // Experimental.
    GetStringMapAttribute(terraformAttribute *string) *map[string]*string
    // Experimental.
    ImportFrom(id *string, provider cdktf.TerraformProvider)
    // Experimental.
    InterpolationForAttribute(terraformAttribute *string) cdktf.IResolvable
    // Moves this resource to the target resource given by moveTarget.
    // Experimental.
    MoveTo(moveTarget *string, index interface{})
    // Overrides the auto-generated logical ID with a specific ID.
    // Experimental.
    OverrideLogicalId(newLogicalId *string)
    PutScheduleInfo(value *GoogleDataPipelinePipelineScheduleInfo)
    PutTimeouts(value *GoogleDataPipelinePipelineTimeouts)
    PutWorkload(value *GoogleDataPipelinePipelineWorkload)
    ResetDisplayName()
    ResetId()
    // Resets a previously passed logical Id to use the auto-generated logical id again.
    // Experimental.
    ResetOverrideLogicalId()
    ResetPipelineSources()
    ResetProject()
    ResetRegion()
    ResetScheduleInfo()
    ResetSchedulerServiceAccountEmail()
    ResetTimeouts()
    ResetWorkload()
    SynthesizeAttributes() *map[string]interface{}
    // Experimental.
    ToMetadata() interface{}
    // Returns a string representation of this construct.
    ToString() *string
    // Adds this resource to the terraform JSON output.
    // Experimental.
    ToTerraform() interface{}
}
Represents a {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline google_data_pipeline_pipeline}.
func NewGoogleDataPipelinePipeline ¶
func NewGoogleDataPipelinePipeline(scope constructs.Construct, id *string, config *GoogleDataPipelinePipelineConfig) GoogleDataPipelinePipeline
Create a new {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline google_data_pipeline_pipeline} Resource.
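As a sketch of typical usage (the provider import path, module version, and all resource values are assumptions; adjust them to match your generated bindings and project):

```go
package main

import (
    "github.com/aws/constructs-go/constructs/v10"
    "github.com/aws/jsii-runtime-go"
    "github.com/hashicorp/terraform-cdk-go/cdktf"

    // Assumed import path for the generated google-beta provider bindings;
    // the actual path and major version depend on how the bindings were generated.
    pipeline "github.com/cdktf/cdktf-provider-googlebeta-go/googlebeta/v9/googledatapipelinepipeline"
)

func NewPipelineStack(scope constructs.Construct, id string) cdktf.TerraformStack {
    stack := cdktf.NewTerraformStack(scope, &id)

    // Name, State, and Type are the required fields of
    // GoogleDataPipelinePipelineConfig; everything else is optional.
    // The project, location, and pipeline IDs below are placeholders.
    pipeline.NewGoogleDataPipelinePipeline(stack, jsii.String("pipeline"),
        &pipeline.GoogleDataPipelinePipelineConfig{
            Name:  jsii.String("projects/my-project/locations/us-central1/pipelines/my-pipeline"),
            State: jsii.String("STATE_ACTIVE"),
            Type:  jsii.String("PIPELINE_TYPE_BATCH"),
        })
    return stack
}

func main() {
    app := cdktf.NewApp(nil)
    NewPipelineStack(app, "data-pipeline-stack")
    app.Synth()
}
```

Running `cdktf synth` on such a program would emit the corresponding `google_data_pipeline_pipeline` resource into the synthesized Terraform JSON.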
type GoogleDataPipelinePipelineConfig ¶
type GoogleDataPipelinePipelineConfig struct {
    // Experimental.
    Connection interface{} `field:"optional" json:"connection" yaml:"connection"`
    // Experimental.
    Count interface{} `field:"optional" json:"count" yaml:"count"`
    // Experimental.
    DependsOn *[]cdktf.ITerraformDependable `field:"optional" json:"dependsOn" yaml:"dependsOn"`
    // Experimental.
    ForEach cdktf.ITerraformIterator `field:"optional" json:"forEach" yaml:"forEach"`
    // Experimental.
    Lifecycle *cdktf.TerraformResourceLifecycle `field:"optional" json:"lifecycle" yaml:"lifecycle"`
    // Experimental.
    Provider cdktf.TerraformProvider `field:"optional" json:"provider" yaml:"provider"`
    // Experimental.
    Provisioners *[]interface{} `field:"optional" json:"provisioners" yaml:"provisioners"`
    // The pipeline name. For example: projects/PROJECT_ID/locations/LOCATION_ID/pipelines/PIPELINE_ID.
    //
    // - PROJECT_ID can contain letters ([A-Za-z]), numbers ([0-9]), hyphens (-), colons (:), and periods (.). For more information, see Identifying projects.
    // - LOCATION_ID is the canonical ID for the pipeline's location. The list of available locations can be obtained by calling google.cloud.location.Locations.ListLocations. Note that the Data Pipelines service is not available in all regions; it depends on Cloud Scheduler, an App Engine application, so it is only available in App Engine regions.
    // - PIPELINE_ID is the ID of the pipeline. Must be unique for the selected project and location.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#name GoogleDataPipelinePipeline#name}
    Name *string `field:"required" json:"name" yaml:"name"`
    // The state of the pipeline.
    //
    // When the pipeline is created, the state is set to 'PIPELINE_STATE_ACTIVE' by default. State changes can be requested by setting the state to stopping, paused, or resuming. State cannot be changed through pipelines.patch requests.
    // https://cloud.google.com/dataflow/docs/reference/data-pipelines/rest/v1/projects.locations.pipelines#state Possible values: ["STATE_UNSPECIFIED", "STATE_RESUMING", "STATE_ACTIVE", "STATE_STOPPING", "STATE_ARCHIVED", "STATE_PAUSED"]
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#state GoogleDataPipelinePipeline#state}
    State *string `field:"required" json:"state" yaml:"state"`
    // The type of the pipeline.
    //
    // This field affects the scheduling of the pipeline and the type of metrics to show for the pipeline.
    // https://cloud.google.com/dataflow/docs/reference/data-pipelines/rest/v1/projects.locations.pipelines#pipelinetype Possible values: ["PIPELINE_TYPE_UNSPECIFIED", "PIPELINE_TYPE_BATCH", "PIPELINE_TYPE_STREAMING"]
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#type GoogleDataPipelinePipeline#type}
    Type *string `field:"required" json:"type" yaml:"type"`
    // The display name of the pipeline. It can contain only letters ([A-Za-z]), numbers ([0-9]), hyphens (-), and underscores (_).
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#display_name GoogleDataPipelinePipeline#display_name}
    DisplayName *string `field:"optional" json:"displayName" yaml:"displayName"`
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#id GoogleDataPipelinePipeline#id}.
    //
    // Please be aware that the id field is automatically added to all resources in Terraform providers using a Terraform provider SDK version below 2.
    // If you experience problems setting this value it might not be settable. Please take a look at the provider documentation to ensure it should be settable.
    Id *string `field:"optional" json:"id" yaml:"id"`
    // The sources of the pipeline (for example, Dataplex).
    //
    // The keys and values are set by the corresponding sources during pipeline creation.
    // An object containing a list of "key": value pairs. Example: { "name": "wrench", "mass": "1.3kg", "count": "3" }.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#pipeline_sources GoogleDataPipelinePipeline#pipeline_sources}
    PipelineSources *map[string]*string `field:"optional" json:"pipelineSources" yaml:"pipelineSources"`
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#project GoogleDataPipelinePipeline#project}.
    Project *string `field:"optional" json:"project" yaml:"project"`
    // A reference to the region.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#region GoogleDataPipelinePipeline#region}
    Region *string `field:"optional" json:"region" yaml:"region"`
    // schedule_info block.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#schedule_info GoogleDataPipelinePipeline#schedule_info}
    ScheduleInfo *GoogleDataPipelinePipelineScheduleInfo `field:"optional" json:"scheduleInfo" yaml:"scheduleInfo"`
    // Optional.
    //
    // A service account email to be used with the Cloud Scheduler job. If not specified, the default compute engine service account will be used.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#scheduler_service_account_email GoogleDataPipelinePipeline#scheduler_service_account_email}
    SchedulerServiceAccountEmail *string `field:"optional" json:"schedulerServiceAccountEmail" yaml:"schedulerServiceAccountEmail"`
    // timeouts block.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#timeouts GoogleDataPipelinePipeline#timeouts}
    Timeouts *GoogleDataPipelinePipelineTimeouts `field:"optional" json:"timeouts" yaml:"timeouts"`
    // workload block.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#workload GoogleDataPipelinePipeline#workload}
    Workload *GoogleDataPipelinePipelineWorkload `field:"optional" json:"workload" yaml:"workload"`
}
type GoogleDataPipelinePipelineScheduleInfo ¶
type GoogleDataPipelinePipelineScheduleInfo struct {
    // Unix-cron format of the schedule. This information is retrieved from the linked Cloud Scheduler.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#schedule GoogleDataPipelinePipeline#schedule}
    Schedule *string `field:"optional" json:"schedule" yaml:"schedule"`
    // Timezone ID. This matches the timezone IDs used by the Cloud Scheduler API. If empty, UTC time is assumed.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#time_zone GoogleDataPipelinePipeline#time_zone}
    TimeZone *string `field:"optional" json:"timeZone" yaml:"timeZone"`
}
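Both fields are optional strings; a schedule block can be sketched like this (the cron expression and time zone are placeholders, `jsii.String` comes from github.com/aws/jsii-runtime-go, and this package's identifiers are assumed in scope):

```go
// Sketch: a schedule_info input value for GoogleDataPipelinePipelineConfig.
scheduleInfo := &GoogleDataPipelinePipelineScheduleInfo{
    Schedule: jsii.String("0 */6 * * *"),       // unix-cron: every six hours (placeholder)
    TimeZone: jsii.String("America/New_York"), // Cloud Scheduler timezone ID; UTC if empty
}
```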
type GoogleDataPipelinePipelineScheduleInfoOutputReference ¶
type GoogleDataPipelinePipelineScheduleInfoOutputReference interface {
    cdktf.ComplexObject
    // the index of the complex object in a list.
    // Experimental.
    ComplexObjectIndex() interface{}
    // Experimental.
    SetComplexObjectIndex(val interface{})
    // set to true if this item is from inside a set and needs tolist() for accessing it set to "0" for single list items.
    // Experimental.
    ComplexObjectIsFromSet() *bool
    // Experimental.
    SetComplexObjectIsFromSet(val *bool)
    // The creation stack of this resolvable which will be appended to errors thrown during resolution.
    //
    // If this returns an empty array the stack will not be attached.
    // Experimental.
    CreationStack() *[]*string
    // Experimental.
    Fqn() *string
    InternalValue() *GoogleDataPipelinePipelineScheduleInfo
    SetInternalValue(val *GoogleDataPipelinePipelineScheduleInfo)
    NextJobTime() *string
    Schedule() *string
    SetSchedule(val *string)
    ScheduleInput() *string
    // Experimental.
    TerraformAttribute() *string
    // Experimental.
    SetTerraformAttribute(val *string)
    // Experimental.
    TerraformResource() cdktf.IInterpolatingParent
    // Experimental.
    SetTerraformResource(val cdktf.IInterpolatingParent)
    TimeZone() *string
    SetTimeZone(val *string)
    TimeZoneInput() *string
    // Experimental.
    ComputeFqn() *string
    // Experimental.
    GetAnyMapAttribute(terraformAttribute *string) *map[string]interface{}
    // Experimental.
    GetBooleanAttribute(terraformAttribute *string) cdktf.IResolvable
    // Experimental.
    GetBooleanMapAttribute(terraformAttribute *string) *map[string]*bool
    // Experimental.
    GetListAttribute(terraformAttribute *string) *[]*string
    // Experimental.
    GetNumberAttribute(terraformAttribute *string) *float64
    // Experimental.
    GetNumberListAttribute(terraformAttribute *string) *[]*float64
    // Experimental.
    GetNumberMapAttribute(terraformAttribute *string) *map[string]*float64
    // Experimental.
    GetStringAttribute(terraformAttribute *string) *string
    // Experimental.
    GetStringMapAttribute(terraformAttribute *string) *map[string]*string
    // Experimental.
    InterpolationAsList() cdktf.IResolvable
    // Experimental.
    InterpolationForAttribute(property *string) cdktf.IResolvable
    ResetSchedule()
    ResetTimeZone()
    // Produce the Token's value at resolution time.
    // Experimental.
    Resolve(_context cdktf.IResolveContext) interface{}
    // Return a string representation of this resolvable object.
    //
    // Returns a reversible string representation.
    // Experimental.
    ToString() *string
}
func NewGoogleDataPipelinePipelineScheduleInfoOutputReference ¶
func NewGoogleDataPipelinePipelineScheduleInfoOutputReference(terraformResource cdktf.IInterpolatingParent, terraformAttribute *string) GoogleDataPipelinePipelineScheduleInfoOutputReference
type GoogleDataPipelinePipelineTimeouts ¶
type GoogleDataPipelinePipelineTimeouts struct {
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#create GoogleDataPipelinePipeline#create}.
    Create *string `field:"optional" json:"create" yaml:"create"`
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#delete GoogleDataPipelinePipeline#delete}.
    Delete *string `field:"optional" json:"delete" yaml:"delete"`
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#update GoogleDataPipelinePipeline#update}.
    Update *string `field:"optional" json:"update" yaml:"update"`
}
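The timeout fields take Terraform duration strings; a sketch (the values are placeholders, and this package's identifiers are assumed in scope):

```go
// Sketch: operation timeouts for the resource, as Terraform duration
// strings such as "10m" or "1h". The values below are placeholders.
timeouts := &GoogleDataPipelinePipelineTimeouts{
    Create: jsii.String("20m"),
    Delete: jsii.String("20m"),
    Update: jsii.String("20m"),
}
```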
type GoogleDataPipelinePipelineTimeoutsOutputReference ¶
type GoogleDataPipelinePipelineTimeoutsOutputReference interface {
    cdktf.ComplexObject
    // the index of the complex object in a list.
    // Experimental.
    ComplexObjectIndex() interface{}
    // Experimental.
    SetComplexObjectIndex(val interface{})
    // set to true if this item is from inside a set and needs tolist() for accessing it set to "0" for single list items.
    // Experimental.
    ComplexObjectIsFromSet() *bool
    // Experimental.
    SetComplexObjectIsFromSet(val *bool)
    Create() *string
    SetCreate(val *string)
    CreateInput() *string
    // The creation stack of this resolvable which will be appended to errors thrown during resolution.
    //
    // If this returns an empty array the stack will not be attached.
    // Experimental.
    CreationStack() *[]*string
    Delete() *string
    SetDelete(val *string)
    DeleteInput() *string
    // Experimental.
    Fqn() *string
    InternalValue() interface{}
    SetInternalValue(val interface{})
    // Experimental.
    TerraformAttribute() *string
    // Experimental.
    SetTerraformAttribute(val *string)
    // Experimental.
    TerraformResource() cdktf.IInterpolatingParent
    // Experimental.
    SetTerraformResource(val cdktf.IInterpolatingParent)
    Update() *string
    SetUpdate(val *string)
    UpdateInput() *string
    // Experimental.
    ComputeFqn() *string
    // Experimental.
    GetAnyMapAttribute(terraformAttribute *string) *map[string]interface{}
    // Experimental.
    GetBooleanAttribute(terraformAttribute *string) cdktf.IResolvable
    // Experimental.
    GetBooleanMapAttribute(terraformAttribute *string) *map[string]*bool
    // Experimental.
    GetListAttribute(terraformAttribute *string) *[]*string
    // Experimental.
    GetNumberAttribute(terraformAttribute *string) *float64
    // Experimental.
    GetNumberListAttribute(terraformAttribute *string) *[]*float64
    // Experimental.
    GetNumberMapAttribute(terraformAttribute *string) *map[string]*float64
    // Experimental.
    GetStringAttribute(terraformAttribute *string) *string
    // Experimental.
    GetStringMapAttribute(terraformAttribute *string) *map[string]*string
    // Experimental.
    InterpolationAsList() cdktf.IResolvable
    // Experimental.
    InterpolationForAttribute(property *string) cdktf.IResolvable
    ResetCreate()
    ResetDelete()
    ResetUpdate()
    // Produce the Token's value at resolution time.
    // Experimental.
    Resolve(_context cdktf.IResolveContext) interface{}
    // Return a string representation of this resolvable object.
    //
    // Returns a reversible string representation.
    // Experimental.
    ToString() *string
}
func NewGoogleDataPipelinePipelineTimeoutsOutputReference ¶
func NewGoogleDataPipelinePipelineTimeoutsOutputReference(terraformResource cdktf.IInterpolatingParent, terraformAttribute *string) GoogleDataPipelinePipelineTimeoutsOutputReference
type GoogleDataPipelinePipelineWorkload ¶
type GoogleDataPipelinePipelineWorkload struct {
    // dataflow_flex_template_request block.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#dataflow_flex_template_request GoogleDataPipelinePipeline#dataflow_flex_template_request}
    DataflowFlexTemplateRequest *GoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequest `field:"optional" json:"dataflowFlexTemplateRequest" yaml:"dataflowFlexTemplateRequest"`
    // dataflow_launch_template_request block.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#dataflow_launch_template_request GoogleDataPipelinePipeline#dataflow_launch_template_request}
    DataflowLaunchTemplateRequest *GoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequest `field:"optional" json:"dataflowLaunchTemplateRequest" yaml:"dataflowLaunchTemplateRequest"`
}
type GoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequest ¶
type GoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequest struct {
    // launch_parameter block.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#launch_parameter GoogleDataPipelinePipeline#launch_parameter}
    LaunchParameter *GoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestLaunchParameter `field:"required" json:"launchParameter" yaml:"launchParameter"`
    // The regional endpoint to which to direct the request. For example, us-central1, us-west1.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#location GoogleDataPipelinePipeline#location}
    Location *string `field:"required" json:"location" yaml:"location"`
    // The ID of the Cloud Platform project that the job belongs to.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#project_id GoogleDataPipelinePipeline#project_id}
    ProjectId *string `field:"required" json:"projectId" yaml:"projectId"`
    // If true, the request is validated but not actually executed. Defaults to false.
    //
    // Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#validate_only GoogleDataPipelinePipeline#validate_only}
    ValidateOnly interface{} `field:"optional" json:"validateOnly" yaml:"validateOnly"`
}
type GoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestLaunchParameter ¶
type GoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestLaunchParameter struct {
	// The job name to use for the created job.
	//
	// For an update job request, the job name should be the same as the existing running job.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#job_name GoogleDataPipelinePipeline#job_name}
	JobName *string `field:"required" json:"jobName" yaml:"jobName"`
	// Cloud Storage path to a file with a JSON-serialized ContainerSpec as content.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#container_spec_gcs_path GoogleDataPipelinePipeline#container_spec_gcs_path}
	ContainerSpecGcsPath *string `field:"optional" json:"containerSpecGcsPath" yaml:"containerSpecGcsPath"`
	// environment block.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#environment GoogleDataPipelinePipeline#environment}
	Environment *GoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestLaunchParameterEnvironment `field:"optional" json:"environment" yaml:"environment"`
	// Launch options for this Flex Template job.
	//
	// This is a common set of options across languages and templates. This should not be used to pass job parameters.
	// An object containing a list of "key": value pairs. Example: { "name": "wrench", "mass": "1.3kg", "count": "3" }.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#launch_options GoogleDataPipelinePipeline#launch_options}
	LaunchOptions *map[string]*string `field:"optional" json:"launchOptions" yaml:"launchOptions"`
	// The parameters for the Flex Template. Example: {"numWorkers":"5"}.
	//
	// An object containing a list of "key": value pairs. Example: { "name": "wrench", "mass": "1.3kg", "count": "3" }.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#parameters GoogleDataPipelinePipeline#parameters}
	Parameters *map[string]*string `field:"optional" json:"parameters" yaml:"parameters"`
	// Use this to pass transform name mappings for streaming update jobs. Example: {"oldTransformName":"newTransformName",...}.
	//
	// An object containing a list of "key": value pairs. Example: { "name": "wrench", "mass": "1.3kg", "count": "3" }.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#transform_name_mappings GoogleDataPipelinePipeline#transform_name_mappings}
	TransformNameMappings *map[string]*string `field:"optional" json:"transformNameMappings" yaml:"transformNameMappings"`
	// Set this to true if you are sending a request to update a running streaming job.
	//
	// When set, the job name should be the same as the running job.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#update GoogleDataPipelinePipeline#update}
	Update interface{} `field:"optional" json:"update" yaml:"update"`
}
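Every map-typed field above (LaunchOptions, Parameters, TransformNameMappings) is declared as `*map[string]*string`, so an ordinary Go string map has to be converted to pointer form before assignment. A minimal, self-contained sketch of such a conversion (the `strMapPtr` helper name is ours, not part of the generated bindings):

```go
package main

import "fmt"

// strMapPtr converts an ordinary Go string map into the
// *map[string]*string shape the generated CDKTF bindings expect
// for fields like Parameters and TransformNameMappings.
func strMapPtr(m map[string]string) *map[string]*string {
	out := make(map[string]*string, len(m))
	for k, v := range m {
		v := v // copy so each entry gets its own pointer
		out[k] = &v
	}
	return &out
}

func main() {
	params := strMapPtr(map[string]string{"numWorkers": "5"})
	fmt.Println(*(*params)["numWorkers"]) // prints 5
}
```

The returned value can be assigned directly to Parameters in the struct literal above.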
type GoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestLaunchParameterEnvironment
type GoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestLaunchParameterEnvironment struct {
	// Additional experiment flags for the job.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#additional_experiments GoogleDataPipelinePipeline#additional_experiments}
	AdditionalExperiments *[]*string `field:"optional" json:"additionalExperiments" yaml:"additionalExperiments"`
	// Additional user labels to be specified for the job.
	//
	// Keys and values should follow the restrictions specified in the labeling restrictions page. An object containing a list of key/value pairs.
	// Example: { "name": "wrench", "mass": "1kg", "count": "3" }.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#additional_user_labels GoogleDataPipelinePipeline#additional_user_labels}
	AdditionalUserLabels *map[string]*string `field:"optional" json:"additionalUserLabels" yaml:"additionalUserLabels"`
	// Whether to enable Streaming Engine for the job.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#enable_streaming_engine GoogleDataPipelinePipeline#enable_streaming_engine}
	EnableStreamingEngine interface{} `field:"optional" json:"enableStreamingEngine" yaml:"enableStreamingEngine"`
	// Set FlexRS goal for the job. https://cloud.google.com/dataflow/docs/guides/flexrs https://cloud.google.com/dataflow/docs/reference/data-pipelines/rest/v1/projects.locations.pipelines#FlexResourceSchedulingGoal Possible values: ["FLEXRS_UNSPECIFIED", "FLEXRS_SPEED_OPTIMIZED", "FLEXRS_COST_OPTIMIZED"].
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#flexrs_goal GoogleDataPipelinePipeline#flexrs_goal}
	FlexrsGoal *string `field:"optional" json:"flexrsGoal" yaml:"flexrsGoal"`
	// Configuration for VM IPs. https://cloud.google.com/dataflow/docs/reference/data-pipelines/rest/v1/projects.locations.pipelines#WorkerIPAddressConfiguration Possible values: ["WORKER_IP_UNSPECIFIED", "WORKER_IP_PUBLIC", "WORKER_IP_PRIVATE"].
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#ip_configuration GoogleDataPipelinePipeline#ip_configuration}
	IpConfiguration *string `field:"optional" json:"ipConfiguration" yaml:"ipConfiguration"`
	// Name for the Cloud KMS key for the job. The key format is: projects//locations//keyRings//cryptoKeys/.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#kms_key_name GoogleDataPipelinePipeline#kms_key_name}
	KmsKeyName *string `field:"optional" json:"kmsKeyName" yaml:"kmsKeyName"`
	// The machine type to use for the job. Defaults to the value from the template if not specified.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#machine_type GoogleDataPipelinePipeline#machine_type}
	MachineType *string `field:"optional" json:"machineType" yaml:"machineType"`
	// The maximum number of Compute Engine instances to be made available to your pipeline during execution, from 1 to 1000.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#max_workers GoogleDataPipelinePipeline#max_workers}
	MaxWorkers *float64 `field:"optional" json:"maxWorkers" yaml:"maxWorkers"`
	// Network to which VMs will be assigned. If empty or unspecified, the service will use the network "default".
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#network GoogleDataPipelinePipeline#network}
	Network *string `field:"optional" json:"network" yaml:"network"`
	// The initial number of Compute Engine instances for the job.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#num_workers GoogleDataPipelinePipeline#num_workers}
	NumWorkers *float64 `field:"optional" json:"numWorkers" yaml:"numWorkers"`
	// The email address of the service account to run the job as.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#service_account_email GoogleDataPipelinePipeline#service_account_email}
	ServiceAccountEmail *string `field:"optional" json:"serviceAccountEmail" yaml:"serviceAccountEmail"`
	// Subnetwork to which VMs will be assigned, if desired.
	//
	// You can specify a subnetwork using either a complete URL or an abbreviated path. Expected to be of the form "https://www.googleapis.com/compute/v1/projects/HOST_PROJECT_ID/regions/REGION/subnetworks/SUBNETWORK" or "regions/REGION/subnetworks/SUBNETWORK". If the subnetwork is located in a Shared VPC network, you must use the complete URL.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#subnetwork GoogleDataPipelinePipeline#subnetwork}
	Subnetwork *string `field:"optional" json:"subnetwork" yaml:"subnetwork"`
	// The Cloud Storage path to use for temporary files. Must be a valid Cloud Storage URL, beginning with gs://.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#temp_location GoogleDataPipelinePipeline#temp_location}
	TempLocation *string `field:"optional" json:"tempLocation" yaml:"tempLocation"`
	// The Compute Engine region (https://cloud.google.com/compute/docs/regions-zones/regions-zones) in which worker processing should occur, e.g. "us-west1". Mutually exclusive with workerZone. If neither workerRegion nor workerZone is specified, defaults to the control plane's region.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#worker_region GoogleDataPipelinePipeline#worker_region}
	WorkerRegion *string `field:"optional" json:"workerRegion" yaml:"workerRegion"`
	// The Compute Engine zone (https://cloud.google.com/compute/docs/regions-zones/regions-zones) in which worker processing should occur, e.g. "us-west1-a". Mutually exclusive with workerRegion. If neither workerRegion nor workerZone is specified, a zone in the control plane's region is chosen based on available capacity. If both workerZone and zone are set, workerZone takes precedence.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#worker_zone GoogleDataPipelinePipeline#worker_zone}
	WorkerZone *string `field:"optional" json:"workerZone" yaml:"workerZone"`
	// The Compute Engine availability zone for launching worker instances to run your pipeline.
	//
	// In the future, workerZone will take precedence.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#zone GoogleDataPipelinePipeline#zone}
	Zone *string `field:"optional" json:"zone" yaml:"zone"`
}
type GoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestLaunchParameterEnvironmentOutputReference
type GoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestLaunchParameterEnvironmentOutputReference interface {
	cdktf.ComplexObject
	AdditionalExperiments() *[]*string
	SetAdditionalExperiments(val *[]*string)
	AdditionalExperimentsInput() *[]*string
	AdditionalUserLabels() *map[string]*string
	SetAdditionalUserLabels(val *map[string]*string)
	AdditionalUserLabelsInput() *map[string]*string
	// the index of the complex object in a list.
	// Experimental.
	ComplexObjectIndex() interface{}
	// Experimental.
	SetComplexObjectIndex(val interface{})
	// set to true if this item is from inside a set and needs tolist() for accessing it set to "0" for single list items.
	// Experimental.
	ComplexObjectIsFromSet() *bool
	// Experimental.
	SetComplexObjectIsFromSet(val *bool)
	// The creation stack of this resolvable which will be appended to errors thrown during resolution.
	//
	// If this returns an empty array the stack will not be attached.
	// Experimental.
	CreationStack() *[]*string
	EnableStreamingEngine() interface{}
	SetEnableStreamingEngine(val interface{})
	EnableStreamingEngineInput() interface{}
	FlexrsGoal() *string
	SetFlexrsGoal(val *string)
	FlexrsGoalInput() *string
	// Experimental.
	Fqn() *string
	InternalValue() *GoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestLaunchParameterEnvironment
	SetInternalValue(val *GoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestLaunchParameterEnvironment)
	IpConfiguration() *string
	SetIpConfiguration(val *string)
	IpConfigurationInput() *string
	KmsKeyName() *string
	SetKmsKeyName(val *string)
	KmsKeyNameInput() *string
	MachineType() *string
	SetMachineType(val *string)
	MachineTypeInput() *string
	MaxWorkers() *float64
	SetMaxWorkers(val *float64)
	MaxWorkersInput() *float64
	Network() *string
	SetNetwork(val *string)
	NetworkInput() *string
	NumWorkers() *float64
	SetNumWorkers(val *float64)
	NumWorkersInput() *float64
	ServiceAccountEmail() *string
	SetServiceAccountEmail(val *string)
	ServiceAccountEmailInput() *string
	Subnetwork() *string
	SetSubnetwork(val *string)
	SubnetworkInput() *string
	TempLocation() *string
	SetTempLocation(val *string)
	TempLocationInput() *string
	// Experimental.
	TerraformAttribute() *string
	// Experimental.
	SetTerraformAttribute(val *string)
	// Experimental.
	TerraformResource() cdktf.IInterpolatingParent
	// Experimental.
	SetTerraformResource(val cdktf.IInterpolatingParent)
	WorkerRegion() *string
	SetWorkerRegion(val *string)
	WorkerRegionInput() *string
	WorkerZone() *string
	SetWorkerZone(val *string)
	WorkerZoneInput() *string
	Zone() *string
	SetZone(val *string)
	ZoneInput() *string
	// Experimental.
	ComputeFqn() *string
	// Experimental.
	GetAnyMapAttribute(terraformAttribute *string) *map[string]interface{}
	// Experimental.
	GetBooleanAttribute(terraformAttribute *string) cdktf.IResolvable
	// Experimental.
	GetBooleanMapAttribute(terraformAttribute *string) *map[string]*bool
	// Experimental.
	GetListAttribute(terraformAttribute *string) *[]*string
	// Experimental.
	GetNumberAttribute(terraformAttribute *string) *float64
	// Experimental.
	GetNumberListAttribute(terraformAttribute *string) *[]*float64
	// Experimental.
	GetNumberMapAttribute(terraformAttribute *string) *map[string]*float64
	// Experimental.
	GetStringAttribute(terraformAttribute *string) *string
	// Experimental.
	GetStringMapAttribute(terraformAttribute *string) *map[string]*string
	// Experimental.
	InterpolationAsList() cdktf.IResolvable
	// Experimental.
	InterpolationForAttribute(property *string) cdktf.IResolvable
	ResetAdditionalExperiments()
	ResetAdditionalUserLabels()
	ResetEnableStreamingEngine()
	ResetFlexrsGoal()
	ResetIpConfiguration()
	ResetKmsKeyName()
	ResetMachineType()
	ResetMaxWorkers()
	ResetNetwork()
	ResetNumWorkers()
	ResetServiceAccountEmail()
	ResetSubnetwork()
	ResetTempLocation()
	ResetWorkerRegion()
	ResetWorkerZone()
	ResetZone()
	// Produce the Token's value at resolution time.
	// Experimental.
	Resolve(_context cdktf.IResolveContext) interface{}
	// Return a string representation of this resolvable object.
	//
	// Returns a reversible string representation.
	// Experimental.
	ToString() *string
}
func NewGoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestLaunchParameterEnvironmentOutputReference
func NewGoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestLaunchParameterEnvironmentOutputReference(terraformResource cdktf.IInterpolatingParent, terraformAttribute *string) GoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestLaunchParameterEnvironmentOutputReference
type GoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestLaunchParameterOutputReference
type GoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestLaunchParameterOutputReference interface {
	cdktf.ComplexObject
	// the index of the complex object in a list.
	// Experimental.
	ComplexObjectIndex() interface{}
	// Experimental.
	SetComplexObjectIndex(val interface{})
	// set to true if this item is from inside a set and needs tolist() for accessing it set to "0" for single list items.
	// Experimental.
	ComplexObjectIsFromSet() *bool
	// Experimental.
	SetComplexObjectIsFromSet(val *bool)
	ContainerSpecGcsPath() *string
	SetContainerSpecGcsPath(val *string)
	ContainerSpecGcsPathInput() *string
	// The creation stack of this resolvable which will be appended to errors thrown during resolution.
	//
	// If this returns an empty array the stack will not be attached.
	// Experimental.
	CreationStack() *[]*string
	Environment() GoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestLaunchParameterEnvironmentOutputReference
	EnvironmentInput() *GoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestLaunchParameterEnvironment
	// Experimental.
	Fqn() *string
	InternalValue() *GoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestLaunchParameter
	SetInternalValue(val *GoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestLaunchParameter)
	JobName() *string
	SetJobName(val *string)
	JobNameInput() *string
	LaunchOptions() *map[string]*string
	SetLaunchOptions(val *map[string]*string)
	LaunchOptionsInput() *map[string]*string
	Parameters() *map[string]*string
	SetParameters(val *map[string]*string)
	ParametersInput() *map[string]*string
	// Experimental.
	TerraformAttribute() *string
	// Experimental.
	SetTerraformAttribute(val *string)
	// Experimental.
	TerraformResource() cdktf.IInterpolatingParent
	// Experimental.
	SetTerraformResource(val cdktf.IInterpolatingParent)
	TransformNameMappings() *map[string]*string
	SetTransformNameMappings(val *map[string]*string)
	TransformNameMappingsInput() *map[string]*string
	Update() interface{}
	SetUpdate(val interface{})
	UpdateInput() interface{}
	// Experimental.
	ComputeFqn() *string
	// Experimental.
	GetAnyMapAttribute(terraformAttribute *string) *map[string]interface{}
	// Experimental.
	GetBooleanAttribute(terraformAttribute *string) cdktf.IResolvable
	// Experimental.
	GetBooleanMapAttribute(terraformAttribute *string) *map[string]*bool
	// Experimental.
	GetListAttribute(terraformAttribute *string) *[]*string
	// Experimental.
	GetNumberAttribute(terraformAttribute *string) *float64
	// Experimental.
	GetNumberListAttribute(terraformAttribute *string) *[]*float64
	// Experimental.
	GetNumberMapAttribute(terraformAttribute *string) *map[string]*float64
	// Experimental.
	GetStringAttribute(terraformAttribute *string) *string
	// Experimental.
	GetStringMapAttribute(terraformAttribute *string) *map[string]*string
	// Experimental.
	InterpolationAsList() cdktf.IResolvable
	// Experimental.
	InterpolationForAttribute(property *string) cdktf.IResolvable
	PutEnvironment(value *GoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestLaunchParameterEnvironment)
	ResetContainerSpecGcsPath()
	ResetEnvironment()
	ResetLaunchOptions()
	ResetParameters()
	ResetTransformNameMappings()
	ResetUpdate()
	// Produce the Token's value at resolution time.
	// Experimental.
	Resolve(_context cdktf.IResolveContext) interface{}
	// Return a string representation of this resolvable object.
	//
	// Returns a reversible string representation.
	// Experimental.
	ToString() *string
}
func NewGoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestLaunchParameterOutputReference
func NewGoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestLaunchParameterOutputReference(terraformResource cdktf.IInterpolatingParent, terraformAttribute *string) GoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestLaunchParameterOutputReference
type GoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestOutputReference
type GoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestOutputReference interface {
	cdktf.ComplexObject
	// the index of the complex object in a list.
	// Experimental.
	ComplexObjectIndex() interface{}
	// Experimental.
	SetComplexObjectIndex(val interface{})
	// set to true if this item is from inside a set and needs tolist() for accessing it set to "0" for single list items.
	// Experimental.
	ComplexObjectIsFromSet() *bool
	// Experimental.
	SetComplexObjectIsFromSet(val *bool)
	// The creation stack of this resolvable which will be appended to errors thrown during resolution.
	//
	// If this returns an empty array the stack will not be attached.
	// Experimental.
	CreationStack() *[]*string
	// Experimental.
	Fqn() *string
	InternalValue() *GoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequest
	SetInternalValue(val *GoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequest)
	LaunchParameter() GoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestLaunchParameterOutputReference
	LaunchParameterInput() *GoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestLaunchParameter
	Location() *string
	SetLocation(val *string)
	LocationInput() *string
	ProjectId() *string
	SetProjectId(val *string)
	ProjectIdInput() *string
	// Experimental.
	TerraformAttribute() *string
	// Experimental.
	SetTerraformAttribute(val *string)
	// Experimental.
	TerraformResource() cdktf.IInterpolatingParent
	// Experimental.
	SetTerraformResource(val cdktf.IInterpolatingParent)
	ValidateOnly() interface{}
	SetValidateOnly(val interface{})
	ValidateOnlyInput() interface{}
	// Experimental.
	ComputeFqn() *string
	// Experimental.
	GetAnyMapAttribute(terraformAttribute *string) *map[string]interface{}
	// Experimental.
	GetBooleanAttribute(terraformAttribute *string) cdktf.IResolvable
	// Experimental.
	GetBooleanMapAttribute(terraformAttribute *string) *map[string]*bool
	// Experimental.
	GetListAttribute(terraformAttribute *string) *[]*string
	// Experimental.
	GetNumberAttribute(terraformAttribute *string) *float64
	// Experimental.
	GetNumberListAttribute(terraformAttribute *string) *[]*float64
	// Experimental.
	GetNumberMapAttribute(terraformAttribute *string) *map[string]*float64
	// Experimental.
	GetStringAttribute(terraformAttribute *string) *string
	// Experimental.
	GetStringMapAttribute(terraformAttribute *string) *map[string]*string
	// Experimental.
	InterpolationAsList() cdktf.IResolvable
	// Experimental.
	InterpolationForAttribute(property *string) cdktf.IResolvable
	PutLaunchParameter(value *GoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestLaunchParameter)
	ResetValidateOnly()
	// Produce the Token's value at resolution time.
	// Experimental.
	Resolve(_context cdktf.IResolveContext) interface{}
	// Return a string representation of this resolvable object.
	//
	// Returns a reversible string representation.
	// Experimental.
	ToString() *string
}
func NewGoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestOutputReference
func NewGoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestOutputReference(terraformResource cdktf.IInterpolatingParent, terraformAttribute *string) GoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestOutputReference
type GoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequest
type GoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequest struct {
	// The ID of the Cloud Platform project that the job belongs to.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#project_id GoogleDataPipelinePipeline#project_id}
	ProjectId *string `field:"required" json:"projectId" yaml:"projectId"`
	// A Cloud Storage path to the template from which to create the job.
	//
	// Must be a valid Cloud Storage URL, beginning with 'gs://'.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#gcs_path GoogleDataPipelinePipeline#gcs_path}
	GcsPath *string `field:"optional" json:"gcsPath" yaml:"gcsPath"`
	// launch_parameters block.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#launch_parameters GoogleDataPipelinePipeline#launch_parameters}
	LaunchParameters *GoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParameters `field:"optional" json:"launchParameters" yaml:"launchParameters"`
	// The regional endpoint to which to direct the request.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#location GoogleDataPipelinePipeline#location}
	Location *string `field:"optional" json:"location" yaml:"location"`
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#validate_only GoogleDataPipelinePipeline#validate_only}.
	ValidateOnly interface{} `field:"optional" json:"validateOnly" yaml:"validateOnly"`
}
type GoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParameters
type GoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParameters struct {
	// The job name to use for the created job.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#job_name GoogleDataPipelinePipeline#job_name}
	JobName *string `field:"required" json:"jobName" yaml:"jobName"`
	// environment block.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#environment GoogleDataPipelinePipeline#environment}
	Environment *GoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment `field:"optional" json:"environment" yaml:"environment"`
	// The runtime parameters to pass to the job.
	//
	// An object containing a list of "key": value pairs. Example: { "name": "wrench", "mass": "1.3kg", "count": "3" }.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#parameters GoogleDataPipelinePipeline#parameters}
	Parameters *map[string]*string `field:"optional" json:"parameters" yaml:"parameters"`
	// Map of transform name prefixes of the job to be replaced to the corresponding name prefixes of the new job.
	//
	// Only applicable when updating a pipeline.
	// An object containing a list of "key": value pairs. Example: { "name": "wrench", "mass": "1.3kg", "count": "3" }.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#transform_name_mapping GoogleDataPipelinePipeline#transform_name_mapping}
	TransformNameMapping *map[string]*string `field:"optional" json:"transformNameMapping" yaml:"transformNameMapping"`
	// If set, replace the existing pipeline with the name specified by jobName with this pipeline, preserving state.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#update GoogleDataPipelinePipeline#update}
	Update interface{} `field:"optional" json:"update" yaml:"update"`
}
type GoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment
type GoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment struct {
	// Additional experiment flags for the job.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#additional_experiments GoogleDataPipelinePipeline#additional_experiments}
	AdditionalExperiments *[]*string `field:"optional" json:"additionalExperiments" yaml:"additionalExperiments"`
	// Additional user labels to be specified for the job.
	//
	// Keys and values should follow the restrictions specified in the labeling restrictions page. An object containing a list of key/value pairs.
	// Example: { "name": "wrench", "mass": "1kg", "count": "3" }.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#additional_user_labels GoogleDataPipelinePipeline#additional_user_labels}
	AdditionalUserLabels *map[string]*string `field:"optional" json:"additionalUserLabels" yaml:"additionalUserLabels"`
	// Whether to bypass the safety checks for the job's temporary directory. Use with caution.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#bypass_temp_dir_validation GoogleDataPipelinePipeline#bypass_temp_dir_validation}
	BypassTempDirValidation interface{} `field:"optional" json:"bypassTempDirValidation" yaml:"bypassTempDirValidation"`
	// Whether to enable Streaming Engine for the job.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#enable_streaming_engine GoogleDataPipelinePipeline#enable_streaming_engine}
	EnableStreamingEngine interface{} `field:"optional" json:"enableStreamingEngine" yaml:"enableStreamingEngine"`
	// Configuration for VM IPs. https://cloud.google.com/dataflow/docs/reference/data-pipelines/rest/v1/projects.locations.pipelines#WorkerIPAddressConfiguration Possible values: ["WORKER_IP_UNSPECIFIED", "WORKER_IP_PUBLIC", "WORKER_IP_PRIVATE"].
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#ip_configuration GoogleDataPipelinePipeline#ip_configuration}
	IpConfiguration *string `field:"optional" json:"ipConfiguration" yaml:"ipConfiguration"`
	// Name for the Cloud KMS key for the job. The key format is: projects//locations//keyRings//cryptoKeys/.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#kms_key_name GoogleDataPipelinePipeline#kms_key_name}
	KmsKeyName *string `field:"optional" json:"kmsKeyName" yaml:"kmsKeyName"`
	// The machine type to use for the job. Defaults to the value from the template if not specified.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#machine_type GoogleDataPipelinePipeline#machine_type}
	MachineType *string `field:"optional" json:"machineType" yaml:"machineType"`
	// The maximum number of Compute Engine instances to be made available to your pipeline during execution, from 1 to 1000.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#max_workers GoogleDataPipelinePipeline#max_workers}
	MaxWorkers *float64 `field:"optional" json:"maxWorkers" yaml:"maxWorkers"`
	// Network to which VMs will be assigned. If empty or unspecified, the service will use the network "default".
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#network GoogleDataPipelinePipeline#network}
	Network *string `field:"optional" json:"network" yaml:"network"`
	// The initial number of Compute Engine instances for the job.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#num_workers GoogleDataPipelinePipeline#num_workers}
	NumWorkers *float64 `field:"optional" json:"numWorkers" yaml:"numWorkers"`
	// The email address of the service account to run the job as.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#service_account_email GoogleDataPipelinePipeline#service_account_email}
	ServiceAccountEmail *string `field:"optional" json:"serviceAccountEmail" yaml:"serviceAccountEmail"`
	// Subnetwork to which VMs will be assigned, if desired.
	//
	// You can specify a subnetwork using either a complete URL or an abbreviated path. Expected to be of the form "https://www.googleapis.com/compute/v1/projects/HOST_PROJECT_ID/regions/REGION/subnetworks/SUBNETWORK" or "regions/REGION/subnetworks/SUBNETWORK". If the subnetwork is located in a Shared VPC network, you must use the complete URL.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#subnetwork GoogleDataPipelinePipeline#subnetwork}
	Subnetwork *string `field:"optional" json:"subnetwork" yaml:"subnetwork"`
	// The Cloud Storage path to use for temporary files. Must be a valid Cloud Storage URL, beginning with gs://.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#temp_location GoogleDataPipelinePipeline#temp_location}
	TempLocation *string `field:"optional" json:"tempLocation" yaml:"tempLocation"`
	// The Compute Engine region (https://cloud.google.com/compute/docs/regions-zones/regions-zones) in which worker processing should occur, e.g. "us-west1". Mutually exclusive with workerZone. If neither workerRegion nor workerZone is specified, defaults to the control plane's region.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#worker_region GoogleDataPipelinePipeline#worker_region}
	WorkerRegion *string `field:"optional" json:"workerRegion" yaml:"workerRegion"`
	// The Compute Engine zone (https://cloud.google.com/compute/docs/regions-zones/regions-zones) in which worker processing should occur, e.g. "us-west1-a". Mutually exclusive with workerRegion. If neither workerRegion nor workerZone is specified, a zone in the control plane's region is chosen based on available capacity. If both workerZone and zone are set, workerZone takes precedence.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#worker_zone GoogleDataPipelinePipeline#worker_zone}
	WorkerZone *string `field:"optional" json:"workerZone" yaml:"workerZone"`
	// The Compute Engine availability zone for launching worker instances to run your pipeline.
	//
	// In the future, workerZone will take precedence.
	//
	// Docs at Terraform Registry: {@link https://registry.terraform.io/providers/hashicorp/google-beta/5.4.0/docs/resources/google_data_pipeline_pipeline#zone GoogleDataPipelinePipeline#zone}
	Zone *string `field:"optional" json:"zone" yaml:"zone"`
}
type GoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironmentOutputReference ¶
type GoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironmentOutputReference interface {
    cdktf.ComplexObject
    AdditionalExperiments() *[]*string
    SetAdditionalExperiments(val *[]*string)
    AdditionalExperimentsInput() *[]*string
    AdditionalUserLabels() *map[string]*string
    SetAdditionalUserLabels(val *map[string]*string)
    AdditionalUserLabelsInput() *map[string]*string
    BypassTempDirValidation() interface{}
    SetBypassTempDirValidation(val interface{})
    BypassTempDirValidationInput() interface{}
    // The index of the complex object in a list.
    // Experimental.
    ComplexObjectIndex() interface{}
    // Experimental.
    SetComplexObjectIndex(val interface{})
    // Set to true if this item is from inside a set and needs tolist() for accessing it; set to "0" for single list items.
    // Experimental.
    ComplexObjectIsFromSet() *bool
    // Experimental.
    SetComplexObjectIsFromSet(val *bool)
    // The creation stack of this resolvable, which will be appended to errors thrown during resolution.
    //
    // If this returns an empty array, the stack will not be attached.
    // Experimental.
    CreationStack() *[]*string
    EnableStreamingEngine() interface{}
    SetEnableStreamingEngine(val interface{})
    EnableStreamingEngineInput() interface{}
    // Experimental.
    Fqn() *string
    InternalValue() *GoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment
    SetInternalValue(val *GoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment)
    IpConfiguration() *string
    SetIpConfiguration(val *string)
    IpConfigurationInput() *string
    KmsKeyName() *string
    SetKmsKeyName(val *string)
    KmsKeyNameInput() *string
    MachineType() *string
    SetMachineType(val *string)
    MachineTypeInput() *string
    MaxWorkers() *float64
    SetMaxWorkers(val *float64)
    MaxWorkersInput() *float64
    Network() *string
    SetNetwork(val *string)
    NetworkInput() *string
    NumWorkers() *float64
    SetNumWorkers(val *float64)
    NumWorkersInput() *float64
    ServiceAccountEmail() *string
    SetServiceAccountEmail(val *string)
    ServiceAccountEmailInput() *string
    Subnetwork() *string
    SetSubnetwork(val *string)
    SubnetworkInput() *string
    TempLocation() *string
    SetTempLocation(val *string)
    TempLocationInput() *string
    // Experimental.
    TerraformAttribute() *string
    // Experimental.
    SetTerraformAttribute(val *string)
    // Experimental.
    TerraformResource() cdktf.IInterpolatingParent
    // Experimental.
    SetTerraformResource(val cdktf.IInterpolatingParent)
    WorkerRegion() *string
    SetWorkerRegion(val *string)
    WorkerRegionInput() *string
    WorkerZone() *string
    SetWorkerZone(val *string)
    WorkerZoneInput() *string
    Zone() *string
    SetZone(val *string)
    ZoneInput() *string
    // Experimental.
    ComputeFqn() *string
    // Experimental.
    GetAnyMapAttribute(terraformAttribute *string) *map[string]interface{}
    // Experimental.
    GetBooleanAttribute(terraformAttribute *string) cdktf.IResolvable
    // Experimental.
    GetBooleanMapAttribute(terraformAttribute *string) *map[string]*bool
    // Experimental.
    GetListAttribute(terraformAttribute *string) *[]*string
    // Experimental.
    GetNumberAttribute(terraformAttribute *string) *float64
    // Experimental.
    GetNumberListAttribute(terraformAttribute *string) *[]*float64
    // Experimental.
    GetNumberMapAttribute(terraformAttribute *string) *map[string]*float64
    // Experimental.
    GetStringAttribute(terraformAttribute *string) *string
    // Experimental.
    GetStringMapAttribute(terraformAttribute *string) *map[string]*string
    // Experimental.
    InterpolationAsList() cdktf.IResolvable
    // Experimental.
    InterpolationForAttribute(property *string) cdktf.IResolvable
    ResetAdditionalExperiments()
    ResetAdditionalUserLabels()
    ResetBypassTempDirValidation()
    ResetEnableStreamingEngine()
    ResetIpConfiguration()
    ResetKmsKeyName()
    ResetMachineType()
    ResetMaxWorkers()
    ResetNetwork()
    ResetNumWorkers()
    ResetServiceAccountEmail()
    ResetSubnetwork()
    ResetTempLocation()
    ResetWorkerRegion()
    ResetWorkerZone()
    ResetZone()
    // Produce the Token's value at resolution time.
    // Experimental.
    Resolve(_context cdktf.IResolveContext) interface{}
    // Return a string representation of this resolvable object.
    //
    // Returns a reversible string representation.
    // Experimental.
    ToString() *string
}
func NewGoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironmentOutputReference ¶
func NewGoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironmentOutputReference(terraformResource cdktf.IInterpolatingParent, terraformAttribute *string) GoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironmentOutputReference
type GoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersOutputReference ¶
type GoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersOutputReference interface {
    cdktf.ComplexObject
    // The index of the complex object in a list.
    // Experimental.
    ComplexObjectIndex() interface{}
    // Experimental.
    SetComplexObjectIndex(val interface{})
    // Set to true if this item is from inside a set and needs tolist() for accessing it; set to "0" for single list items.
    // Experimental.
    ComplexObjectIsFromSet() *bool
    // Experimental.
    SetComplexObjectIsFromSet(val *bool)
    // The creation stack of this resolvable, which will be appended to errors thrown during resolution.
    //
    // If this returns an empty array, the stack will not be attached.
    // Experimental.
    CreationStack() *[]*string
    Environment() GoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironmentOutputReference
    EnvironmentInput() *GoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment
    // Experimental.
    Fqn() *string
    InternalValue() *GoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParameters
    SetInternalValue(val *GoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParameters)
    JobName() *string
    SetJobName(val *string)
    JobNameInput() *string
    Parameters() *map[string]*string
    SetParameters(val *map[string]*string)
    ParametersInput() *map[string]*string
    // Experimental.
    TerraformAttribute() *string
    // Experimental.
    SetTerraformAttribute(val *string)
    // Experimental.
    TerraformResource() cdktf.IInterpolatingParent
    // Experimental.
    SetTerraformResource(val cdktf.IInterpolatingParent)
    TransformNameMapping() *map[string]*string
    SetTransformNameMapping(val *map[string]*string)
    TransformNameMappingInput() *map[string]*string
    Update() interface{}
    SetUpdate(val interface{})
    UpdateInput() interface{}
    // Experimental.
    ComputeFqn() *string
    // Experimental.
    GetAnyMapAttribute(terraformAttribute *string) *map[string]interface{}
    // Experimental.
    GetBooleanAttribute(terraformAttribute *string) cdktf.IResolvable
    // Experimental.
    GetBooleanMapAttribute(terraformAttribute *string) *map[string]*bool
    // Experimental.
    GetListAttribute(terraformAttribute *string) *[]*string
    // Experimental.
    GetNumberAttribute(terraformAttribute *string) *float64
    // Experimental.
    GetNumberListAttribute(terraformAttribute *string) *[]*float64
    // Experimental.
    GetNumberMapAttribute(terraformAttribute *string) *map[string]*float64
    // Experimental.
    GetStringAttribute(terraformAttribute *string) *string
    // Experimental.
    GetStringMapAttribute(terraformAttribute *string) *map[string]*string
    // Experimental.
    InterpolationAsList() cdktf.IResolvable
    // Experimental.
    InterpolationForAttribute(property *string) cdktf.IResolvable
    PutEnvironment(value *GoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment)
    ResetEnvironment()
    ResetParameters()
    ResetTransformNameMapping()
    ResetUpdate()
    // Produce the Token's value at resolution time.
    // Experimental.
    Resolve(_context cdktf.IResolveContext) interface{}
    // Return a string representation of this resolvable object.
    //
    // Returns a reversible string representation.
    // Experimental.
    ToString() *string
}
func NewGoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersOutputReference ¶
func NewGoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersOutputReference(terraformResource cdktf.IInterpolatingParent, terraformAttribute *string) GoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersOutputReference
type GoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestOutputReference ¶
type GoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestOutputReference interface {
    cdktf.ComplexObject
    // The index of the complex object in a list.
    // Experimental.
    ComplexObjectIndex() interface{}
    // Experimental.
    SetComplexObjectIndex(val interface{})
    // Set to true if this item is from inside a set and needs tolist() for accessing it; set to "0" for single list items.
    // Experimental.
    ComplexObjectIsFromSet() *bool
    // Experimental.
    SetComplexObjectIsFromSet(val *bool)
    // The creation stack of this resolvable, which will be appended to errors thrown during resolution.
    //
    // If this returns an empty array, the stack will not be attached.
    // Experimental.
    CreationStack() *[]*string
    // Experimental.
    Fqn() *string
    GcsPath() *string
    SetGcsPath(val *string)
    GcsPathInput() *string
    InternalValue() *GoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequest
    SetInternalValue(val *GoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequest)
    LaunchParameters() GoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersOutputReference
    LaunchParametersInput() *GoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParameters
    Location() *string
    SetLocation(val *string)
    LocationInput() *string
    ProjectId() *string
    SetProjectId(val *string)
    ProjectIdInput() *string
    // Experimental.
    TerraformAttribute() *string
    // Experimental.
    SetTerraformAttribute(val *string)
    // Experimental.
    TerraformResource() cdktf.IInterpolatingParent
    // Experimental.
    SetTerraformResource(val cdktf.IInterpolatingParent)
    ValidateOnly() interface{}
    SetValidateOnly(val interface{})
    ValidateOnlyInput() interface{}
    // Experimental.
    ComputeFqn() *string
    // Experimental.
    GetAnyMapAttribute(terraformAttribute *string) *map[string]interface{}
    // Experimental.
    GetBooleanAttribute(terraformAttribute *string) cdktf.IResolvable
    // Experimental.
    GetBooleanMapAttribute(terraformAttribute *string) *map[string]*bool
    // Experimental.
    GetListAttribute(terraformAttribute *string) *[]*string
    // Experimental.
    GetNumberAttribute(terraformAttribute *string) *float64
    // Experimental.
    GetNumberListAttribute(terraformAttribute *string) *[]*float64
    // Experimental.
    GetNumberMapAttribute(terraformAttribute *string) *map[string]*float64
    // Experimental.
    GetStringAttribute(terraformAttribute *string) *string
    // Experimental.
    GetStringMapAttribute(terraformAttribute *string) *map[string]*string
    // Experimental.
    InterpolationAsList() cdktf.IResolvable
    // Experimental.
    InterpolationForAttribute(property *string) cdktf.IResolvable
    PutLaunchParameters(value *GoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParameters)
    ResetGcsPath()
    ResetLaunchParameters()
    ResetLocation()
    ResetValidateOnly()
    // Produce the Token's value at resolution time.
    // Experimental.
    Resolve(_context cdktf.IResolveContext) interface{}
    // Return a string representation of this resolvable object.
    //
    // Returns a reversible string representation.
    // Experimental.
    ToString() *string
}
func NewGoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestOutputReference ¶
func NewGoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestOutputReference(terraformResource cdktf.IInterpolatingParent, terraformAttribute *string) GoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestOutputReference
type GoogleDataPipelinePipelineWorkloadOutputReference ¶
type GoogleDataPipelinePipelineWorkloadOutputReference interface {
    cdktf.ComplexObject
    // The index of the complex object in a list.
    // Experimental.
    ComplexObjectIndex() interface{}
    // Experimental.
    SetComplexObjectIndex(val interface{})
    // Set to true if this item is from inside a set and needs tolist() for accessing it; set to "0" for single list items.
    // Experimental.
    ComplexObjectIsFromSet() *bool
    // Experimental.
    SetComplexObjectIsFromSet(val *bool)
    // The creation stack of this resolvable, which will be appended to errors thrown during resolution.
    //
    // If this returns an empty array, the stack will not be attached.
    // Experimental.
    CreationStack() *[]*string
    DataflowFlexTemplateRequest() GoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestOutputReference
    DataflowFlexTemplateRequestInput() *GoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequest
    DataflowLaunchTemplateRequest() GoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestOutputReference
    DataflowLaunchTemplateRequestInput() *GoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequest
    // Experimental.
    Fqn() *string
    InternalValue() *GoogleDataPipelinePipelineWorkload
    SetInternalValue(val *GoogleDataPipelinePipelineWorkload)
    // Experimental.
    TerraformAttribute() *string
    // Experimental.
    SetTerraformAttribute(val *string)
    // Experimental.
    TerraformResource() cdktf.IInterpolatingParent
    // Experimental.
    SetTerraformResource(val cdktf.IInterpolatingParent)
    // Experimental.
    ComputeFqn() *string
    // Experimental.
    GetAnyMapAttribute(terraformAttribute *string) *map[string]interface{}
    // Experimental.
    GetBooleanAttribute(terraformAttribute *string) cdktf.IResolvable
    // Experimental.
    GetBooleanMapAttribute(terraformAttribute *string) *map[string]*bool
    // Experimental.
    GetListAttribute(terraformAttribute *string) *[]*string
    // Experimental.
    GetNumberAttribute(terraformAttribute *string) *float64
    // Experimental.
    GetNumberListAttribute(terraformAttribute *string) *[]*float64
    // Experimental.
    GetNumberMapAttribute(terraformAttribute *string) *map[string]*float64
    // Experimental.
    GetStringAttribute(terraformAttribute *string) *string
    // Experimental.
    GetStringMapAttribute(terraformAttribute *string) *map[string]*string
    // Experimental.
    InterpolationAsList() cdktf.IResolvable
    // Experimental.
    InterpolationForAttribute(property *string) cdktf.IResolvable
    PutDataflowFlexTemplateRequest(value *GoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequest)
    PutDataflowLaunchTemplateRequest(value *GoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequest)
    ResetDataflowFlexTemplateRequest()
    ResetDataflowLaunchTemplateRequest()
    // Produce the Token's value at resolution time.
    // Experimental.
    Resolve(_context cdktf.IResolveContext) interface{}
    // Return a string representation of this resolvable object.
    //
    // Returns a reversible string representation.
    // Experimental.
    ToString() *string
}
func NewGoogleDataPipelinePipelineWorkloadOutputReference ¶
func NewGoogleDataPipelinePipelineWorkloadOutputReference(terraformResource cdktf.IInterpolatingParent, terraformAttribute *string) GoogleDataPipelinePipelineWorkloadOutputReference
Source Files ¶
- GoogleDataPipelinePipeline.go
- GoogleDataPipelinePipelineConfig.go
- GoogleDataPipelinePipelineScheduleInfo.go
- GoogleDataPipelinePipelineScheduleInfoOutputReference.go
- GoogleDataPipelinePipelineScheduleInfoOutputReference__checks.go
- GoogleDataPipelinePipelineTimeouts.go
- GoogleDataPipelinePipelineTimeoutsOutputReference.go
- GoogleDataPipelinePipelineTimeoutsOutputReference__checks.go
- GoogleDataPipelinePipelineWorkload.go
- GoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequest.go
- GoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestLaunchParameter.go
- GoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestLaunchParameterEnvironment.go
- GoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestLaunchParameterEnvironmentOutputReference.go
- GoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestLaunchParameterEnvironmentOutputReference__checks.go
- GoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestLaunchParameterOutputReference.go
- GoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestLaunchParameterOutputReference__checks.go
- GoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestOutputReference.go
- GoogleDataPipelinePipelineWorkloadDataflowFlexTemplateRequestOutputReference__checks.go
- GoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequest.go
- GoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParameters.go
- GoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironment.go
- GoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironmentOutputReference.go
- GoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersEnvironmentOutputReference__checks.go
- GoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersOutputReference.go
- GoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestLaunchParametersOutputReference__checks.go
- GoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestOutputReference.go
- GoogleDataPipelinePipelineWorkloadDataflowLaunchTemplateRequestOutputReference__checks.go
- GoogleDataPipelinePipelineWorkloadOutputReference.go
- GoogleDataPipelinePipelineWorkloadOutputReference__checks.go
- GoogleDataPipelinePipeline__checks.go
- main.go