package dataflow

v0.18.3
Published: Apr 5, 2019 License: Apache-2.0 Imports: 2 Imported by: 0

Documentation

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type Job

type Job struct {
	// contains filtered or unexported fields
}

Creates a job on Dataflow, Google's managed service for running Apache Beam pipelines on Google Compute Engine. For more information see the official documentation for [Beam](https://beam.apache.org) and [Dataflow](https://cloud.google.com/dataflow/).

## Note on "destroy" / "apply"

There are many types of Dataflow jobs. Some Dataflow jobs run constantly, getting new data from (e.g.) a GCS bucket, and outputting data continuously. Some jobs process a set amount of data then terminate. All jobs can fail while running due to programming errors or other issues. In this way, Dataflow jobs are different from most other Terraform / Google resources.

The Dataflow resource is considered 'existing' while it is in a non-terminal state. If it reaches a terminal state (e.g. 'FAILED', 'COMPLETE', 'CANCELLED'), it will be recreated on the next 'apply'. This is as expected for jobs which run continuously, but may surprise users who use this resource for other kinds of Dataflow jobs.

A Dataflow job which is 'destroyed' may be "cancelled" or "drained". If "cancelled", the job terminates immediately: any data already written remains where it is, but no new data will be processed. If "drained", no new data will enter the pipeline, but any data currently in the pipeline will finish being processed. The default is "cancelled", but if you set `on_delete` to `"drain"` in the configuration, you may experience a long wait for your `terraform destroy` to complete.
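
As a sketch of creating a job with this type (not an official example from the package), the following Pulumi program launches a templated job and opts into draining on destroy. The bucket paths, template location, and parameter names are placeholders, and the import paths assume the pre-1.0 SDK layout this release was published under.

package main

import (
	"github.com/pulumi/pulumi-gcp/sdk/go/gcp/dataflow"
	"github.com/pulumi/pulumi/sdk/go/pulumi"
)

func main() {
	pulumi.Run(func(ctx *pulumi.Context) error {
		// Launch a job from a template stored in GCS; all paths and
		// parameters here are placeholder values.
		_, err := dataflow.NewJob(ctx, "big-data-job", &dataflow.JobArgs{
			TemplateGcsPath: "gs://my-bucket/templates/template_file",
			TempGcsLocation: "gs://my-bucket/tmp",
			Parameters: map[string]interface{}{
				"foo": "bar",
				"baz": "qux",
			},
			// "drain" lets in-flight data finish processing on destroy;
			// the default behavior cancels the job immediately.
			OnDelete: "drain",
		})
		return err
	})
}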

func GetJob

func GetJob(ctx *pulumi.Context,
	name string, id pulumi.ID, state *JobState, opts ...pulumi.ResourceOpt) (*Job, error)

GetJob gets an existing Job resource's state with the given name, ID, and optional state properties that are used to uniquely qualify the lookup (nil if not required).
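
A minimal sketch of adopting an existing job and exporting its state; the resource name and job ID below are placeholders, and passing nil for state relies on the "nil if not required" behavior described above.

package main

import (
	"github.com/pulumi/pulumi-gcp/sdk/go/gcp/dataflow"
	"github.com/pulumi/pulumi/sdk/go/pulumi"
)

func main() {
	pulumi.Run(func(ctx *pulumi.Context) error {
		// Look up an existing Dataflow job by its provider-assigned ID
		// (placeholder value shown here).
		job, err := dataflow.GetJob(ctx, "existing-job",
			pulumi.ID("2019-04-05_00_00_00-1234567890"), nil)
		if err != nil {
			return err
		}
		// Export the job's current state as a stack output.
		ctx.Export("jobState", job.State())
		return nil
	})
}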

func NewJob

func NewJob(ctx *pulumi.Context,
	name string, args *JobArgs, opts ...pulumi.ResourceOpt) (*Job, error)

NewJob registers a new resource with the given unique name, arguments, and options.

func (*Job) ID

func (r *Job) ID() *pulumi.IDOutput

ID is this resource's unique identifier assigned by its provider.

func (*Job) MaxWorkers

func (r *Job) MaxWorkers() *pulumi.IntOutput

The number of workers permitted to work on the job. More workers may improve processing speed at additional cost.

func (*Job) Name

func (r *Job) Name() *pulumi.StringOutput

A unique name for the resource, required by Dataflow.

func (*Job) OnDelete

func (r *Job) OnDelete() *pulumi.StringOutput

One of "drain" or "cancel". Specifies behavior of deletion during `terraform destroy`. See above note.

func (*Job) Parameters

func (r *Job) Parameters() *pulumi.MapOutput

Key/Value pairs to be passed to the Dataflow job (as used in the template).

func (*Job) Project

func (r *Job) Project() *pulumi.StringOutput

The project in which the resource belongs. If it is not provided, the provider project is used.

func (*Job) Region added in v0.16.0

func (r *Job) Region() *pulumi.StringOutput

func (*Job) ServiceAccountEmail added in v0.18.1

func (r *Job) ServiceAccountEmail() *pulumi.StringOutput

The Service Account email used to create the job.

func (*Job) State

func (r *Job) State() *pulumi.StringOutput

The current state of the resource, selected from the [JobState enum](https://cloud.google.com/dataflow/docs/reference/rest/v1b3/projects.jobs#Job.JobState)

func (*Job) TempGcsLocation

func (r *Job) TempGcsLocation() *pulumi.StringOutput

A writeable location on GCS for the Dataflow job to dump its temporary data.

func (*Job) TemplateGcsPath

func (r *Job) TemplateGcsPath() *pulumi.StringOutput

The GCS path to the Dataflow job template.

func (*Job) URN

func (r *Job) URN() *pulumi.URNOutput

URN is this resource's unique name assigned by Pulumi.

func (*Job) Zone

func (r *Job) Zone() *pulumi.StringOutput

The zone in which the created job should run. If it is not provided, the provider zone is used.

type JobArgs

type JobArgs struct {
	// The number of workers permitted to work on the job.  More workers may improve processing speed at additional cost.
	MaxWorkers interface{}
	// A unique name for the resource, required by Dataflow.
	Name interface{}
	// One of "drain" or "cancel".  Specifies behavior of deletion during `terraform destroy`.  See above note.
	OnDelete interface{}
	// Key/Value pairs to be passed to the Dataflow job (as used in the template).
	Parameters interface{}
	// The project in which the resource belongs. If it is not provided, the provider project is used.
	Project interface{}
	Region  interface{}
	// The Service Account email used to create the job.
	ServiceAccountEmail interface{}
	// A writeable location on GCS for the Dataflow job to dump its temporary data.
	TempGcsLocation interface{}
	// The GCS path to the Dataflow job template.
	TemplateGcsPath interface{}
	// The zone in which the created job should run. If it is not provided, the provider zone is used.
	Zone interface{}
}

The set of arguments for constructing a Job resource.
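
As a sketch of how these arguments compose with other resources: because every field is typed interface{} in this SDK, it can be given either a plain value or an output from another resource. The bucket, template path, and the bucket's Url() output used here are illustrative assumptions rather than part of this package's documentation.

package main

import (
	"github.com/pulumi/pulumi-gcp/sdk/go/gcp/dataflow"
	"github.com/pulumi/pulumi-gcp/sdk/go/gcp/storage"
	"github.com/pulumi/pulumi/sdk/go/pulumi"
)

func main() {
	pulumi.Run(func(ctx *pulumi.Context) error {
		// Bucket for the job's temporary files (name is a placeholder).
		bucket, err := storage.NewBucket(ctx, "dataflow-tmp", &storage.BucketArgs{})
		if err != nil {
			return err
		}
		// JobArgs fields accept plain values or outputs from other
		// resources; here TempGcsLocation is fed from the bucket's URL
		// output (assumed to be exposed as Url() by the storage SDK).
		_, err = dataflow.NewJob(ctx, "templated-job", &dataflow.JobArgs{
			TemplateGcsPath: "gs://my-bucket/templates/template_file",
			TempGcsLocation: bucket.Url(),
			MaxWorkers:      5,
		})
		return err
	})
}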

type JobState

type JobState struct {
	// The number of workers permitted to work on the job.  More workers may improve processing speed at additional cost.
	MaxWorkers interface{}
	// A unique name for the resource, required by Dataflow.
	Name interface{}
	// One of "drain" or "cancel".  Specifies behavior of deletion during `terraform destroy`.  See above note.
	OnDelete interface{}
	// Key/Value pairs to be passed to the Dataflow job (as used in the template).
	Parameters interface{}
	// The project in which the resource belongs. If it is not provided, the provider project is used.
	Project interface{}
	Region  interface{}
	// The Service Account email used to create the job.
	ServiceAccountEmail interface{}
	// The current state of the resource, selected from the [JobState enum](https://cloud.google.com/dataflow/docs/reference/rest/v1b3/projects.jobs#Job.JobState)
	State interface{}
	// A writeable location on GCS for the Dataflow job to dump its temporary data.
	TempGcsLocation interface{}
	// The GCS path to the Dataflow job template.
	TemplateGcsPath interface{}
	// The zone in which the created job should run. If it is not provided, the provider zone is used.
	Zone interface{}
}

Input properties used for looking up and filtering Job resources.
