batches


Documentation

Index

Constants

const (
	// StateStarting indicates that the batch processing job is being started.
	StateStarting = "starting"
	// StateRunning indicates that the batch processing job is executing a task.
	StateRunning = "running"
	// StateDead indicates that the batch processing job has failed to execute.
	StateDead = "dead"
	// StateSuccess indicates that the batch processing job was executed successfully.
	StateSuccess = "success"
	// StateRecovering indicates that the batch processing job is being restored.
	StateRecovering = "recovering"
)
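
As a small, hypothetical illustration of how callers can use these constants (the helper below is not part of the package), dead and success are the two terminal states:

// isTerminal reports whether a batch processing job has reached a final
// state; starting, running, and recovering are treated as still in progress.
func isTerminal(state string) bool {
	return state == StateDead || state == StateSuccess
}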

Variables

This section is empty.

Functions

func Delete

func Delete(c *golangsdk.ServiceClient, jobId string) *golangsdk.ErrResult

Delete is a method to cancel an unfinished Spark job.
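
A minimal usage sketch (not part of the package), assuming client is an already-configured *golangsdk.ServiceClient for the DLI endpoint, jobID holds the ID of an unfinished job, and the standard library log package is imported:

// Cancel an unfinished batch job; ExtractErr unwraps the returned ErrResult.
if err := batches.Delete(client, jobID).ExtractErr(); err != nil {
	log.Fatalf("failed to cancel Spark job %s: %v", jobID, err)
}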

Types

type CreateOpts

type CreateOpts struct {
	// Name of the package that is of the JAR or pyFile type and has been uploaded to the DLI resource management
	// system. You can also specify an OBS path, for example, obs://Bucket name/Package name.
	File string `json:"file" required:"true"`
	// Queue name. Set this parameter to the name of the created DLI queue.
	// NOTE: This parameter is compatible with the cluster_name parameter. That is, if cluster_name is used to specify a
	//       queue, the queue is still valid.
	// You are advised to use the queue parameter. The queue and cluster_name parameters cannot coexist.
	Queue string `json:"queue" required:"true"`
	// Java/Spark main class of the batch processing job.
	ClassName *string `json:"class_name,omitempty"`
	// Queue name. Set this parameter to the created DLI queue name.
	// NOTE: You are advised to use the queue parameter. The queue and cluster_name parameters cannot coexist.
	ClusterName string `json:"cluster_name,omitempty"`
	// Input parameters of the main class, that is, application parameters.
	Arguments []string `json:"args,omitempty"`
	// Compute resource type. Currently, resource types A, B, and C are available.
	// If this parameter is not specified, the minimum configuration (type A) is used.
	Specification string `json:"sc_type,omitempty"`
	// Name of the package that is of the JAR type and has been uploaded to the DLI resource management system.
	// You can also specify an OBS path, for example, obs://Bucket name/Package name.
	Jars []string `json:"jars,omitempty"`
	// Name of the package that is of the PyFile type and has been uploaded to the DLI resource management system.
	// You can also specify an OBS path, for example, obs://Bucket name/Package name.
	PythonFiles []string `json:"python_files,omitempty"`
	// Name of the package that is of the file type and has been uploaded to the DLI resource management system.
	// You can also specify an OBS path, for example, obs://Bucket name/Package name.
	Files []string `json:"files,omitempty"`
	// Name of the dependent system resource module. You can view the module name using the API related to Querying
	// Resource Packages in a Group. DLI provides dependencies for executing datasource jobs.
	// The following table lists the dependency modules corresponding to different services.
	//   CloudTable/MRS HBase: sys.datasource.hbase
	//   CloudTable/MRS OpenTSDB: sys.datasource.opentsdb
	//   RDS MySQL: sys.datasource.rds
	//   RDS PostgreSQL: preset
	//   DWS: preset
	//   CSS: sys.datasource.css
	Modules []string `json:"modules,omitempty"`
	// JSON object list, including the name and type of the JSON package that has been uploaded to the queue.
	Resources []Resource `json:"resources,omitempty"`
	// JSON object list, including the package group resource. For details about the format, see the request example.
	// If the type of a name in resources is not verified, a package with that name must already exist in the group.
	Groups []Group `json:"groups,omitempty"`
	// Batch configuration item. For details, see Spark Configuration.
	Configurations map[string]interface{} `json:"conf,omitempty"`
	// Batch processing task name. The value contains a maximum of 128 characters.
	Name string `json:"name,omitempty"`
	// Driver memory of the Spark application, for example, 2 GB or 2048 MB. This configuration item replaces the
	// default parameter in sc_type. The unit must be provided; otherwise, the startup fails.
	DriverMemory string `json:"driver_memory,omitempty"`
	// Number of CPU cores of the Spark application driver.
	// This configuration item replaces the default parameter in sc_type.
	DriverCores int `json:"driver_cores,omitempty"`
	// Executor memory of the Spark application, for example, 2 GB or 2048 MB. This configuration item replaces the
	// default parameter in sc_type. The unit must be provided; otherwise, the startup fails.
	ExecutorMemory string `json:"executor_memory,omitempty"`
	// Number of CPU cores of each Executor in the Spark application.
	// This configuration item replaces the default parameter in sc_type.
	ExecutorCores int `json:"executor_cores,omitempty"`
	// Number of Executors in a Spark application. This configuration item replaces the default parameter in sc_type.
	NumExecutors int `json:"num_executors,omitempty"`
	// OBS bucket for storing the Spark jobs. Set this parameter when you need to save jobs.
	ObsBucket string `json:"obs_bucket,omitempty"`
	// Whether to enable the retry function.
	// If enabled, Spark jobs will be automatically retried after an exception occurs. The default value is false.
	AutoRecovery bool `json:"auto_recovery,omitempty"`
	// Maximum number of retries. The maximum value is 100, and the default value is 20.
	MaxRetryTimes int `json:"max_retry_times,omitempty"`
	// Job feature. Type of the Spark image used by a job.
	// basic: indicates that the basic Spark image provided by DLI is used.
	// custom: indicates that the user-defined Spark image is used.
	// ai: indicates that the AI image provided by DLI is used.
	Feature string `json:"feature,omitempty"`
	// Version of the Spark component used by a job. Set this parameter when feature is set to basic or ai.
	// If this parameter is not set, the default Spark component version 2.3.2 is used.
	SparkVersion string `json:"spark_version,omitempty"`
	// Custom image. The format is Organization name/Image name:Image version.
	// This parameter is valid only when feature is set to custom.
	// You can use this parameter with the feature parameter to specify a user-defined Spark image for job running.
	Image string `json:"image,omitempty"`
	// To access metadata, set this parameter to DLI.
	CatalogName string `json:"catalog_name,omitempty"`
}

CreateOpts is the structure used to submit a Spark job.
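
A minimal construction sketch; the OBS path, queue name, and main class below are placeholders, and File and Queue are the only required fields:

mainClass := "org.apache.spark.examples.SparkPi" // hypothetical main class
opts := batches.CreateOpts{
	File:      "obs://my-bucket/spark-examples.jar", // hypothetical OBS path
	Queue:     "my-dli-queue",                       // hypothetical queue name
	ClassName: &mainClass,
	Arguments: []string{"10"},
	Name:      "sparkpi-demo",
}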

type CreateResp

type CreateResp struct {
	// ID of a batch processing job.
	ID string `json:"id"`
	// Back-end application ID of a batch processing job.
	AppId []string `json:"appId"`
	// Name of a batch processing job.
	Name string `json:"name"`
	// Owner of a batch processing job.
	Owner string `json:"owner"`
	// Proxy user (resource tenant) to which a batch processing job belongs.
	ProxyUser string `json:"proxyUser"`
	// Status of a batch processing job. For details, see Table 7 in Creating a Batch Processing Job.
	State string `json:"state"`
	// Type of a batch processing job. Only Spark parameters are supported.
	Kind string `json:"kind"`
	// Last 10 log records of the current batch processing job.
	Log []string `json:"log"`
	// Type of a computing resource. If the computing resource type is customized, value CUSTOMIZED is returned.
	ScType string `json:"sc_type"`
	// Queue where a batch processing job is located.
	ClusterName string `json:"cluster_name"`
	// Queue where a batch processing job is located.
	Queue string `json:"queue"`
	// Time when a batch processing job is created. The timestamp is expressed in milliseconds.
	CreateTime int `json:"create_time"`
	// Time when a batch processing job is updated. The timestamp is expressed in milliseconds.
	UpdateTime int `json:"update_time"`
	// Job feature. Type of the Spark image used by a job.
	// basic: indicates that the basic Spark image provided by DLI is used.
	// custom: indicates that the user-defined Spark image is used.
	// ai: indicates that the AI image provided by DLI is used.
	Feature string `json:"feature"`
	// Version of the Spark component used by a job. Set this parameter when feature is set to basic or ai.
	// If this parameter is not set, the default Spark component version 2.3.2 is used.
	SparkVersion string `json:"spark_version"`
	// Custom image. The format is Organization name/Image name:Image version.
	// This parameter is valid only when feature is set to custom. You can use this parameter with the feature parameter
	// to specify a user-defined Spark image for job running. For details about how to use custom images, see the Data
	// Lake Insight User Guide.
	Image string `json:"image"`
}

CreateResp represents a result of the Create method.

func Create

func Create(c *golangsdk.ServiceClient, opts CreateOpts) (*CreateResp, error)

Create is a method to submit a Spark job with the given parameters.
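
Continuing the CreateOpts sketch above, with client assumed to be a configured *golangsdk.ServiceClient and the standard library fmt and log packages imported:

resp, err := batches.Create(client, opts)
if err != nil {
	log.Fatalf("failed to submit Spark job: %v", err)
}
// The response carries the server-assigned job ID and the initial state.
fmt.Printf("submitted job %s, state: %s\n", resp.ID, resp.State)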

func Get

func Get(c *golangsdk.ServiceClient, jobId string) (*CreateResp, error)

Get is a method to obtain the specified Spark job by its job ID.
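
A usage sketch, with client and jobID assumed as above:

job, err := batches.Get(client, jobID)
if err != nil {
	log.Fatalf("failed to query Spark job %s: %v", jobID, err)
}
fmt.Printf("job %s (%s) is in state %s\n", job.ID, job.Name, job.State)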

type Group

type Group struct {
	// User group name.
	Name string `json:"name,omitempty"`
	// User group resource.
	Resources []Resource `json:"resources,omitempty"`
}

Group is an object that represents a package group.

type Resource

type Resource struct {
	// Resource name. You can also specify an OBS path, for example, obs://Bucket name/Package name.
	Name string `json:"name,omitempty"`
	// Resource type.
	Type string `json:"type,omitempty"`
}

Resource is an object that specifies a user group resource.
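
A sketch of attaching a package group to the CreateOpts value built earlier; the group and resource names are placeholders, and "jar" is assumed to be an accepted resource type:

opts.Groups = []batches.Group{
	{
		Name: "dependencies", // hypothetical group name
		Resources: []batches.Resource{
			{Name: "my-udf.jar", Type: "jar"}, // hypothetical package
		},
	},
}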

type StateResp

type StateResp struct {
	// ID of a batch processing job, which is in the universal unique identifier (UUID) format.
	ID string `json:"id"`
	// Status of a batch processing job.
	// The valid values are starting, running, dead, success, and recovering; see the constant definitions above.
	State string `json:"state"`
}

StateResp represents a result of the GetState method.

func GetState

func GetState(c *golangsdk.ServiceClient, jobId string) (*StateResp, error)

GetState is a method to obtain the state of a specified Spark job by its job ID.
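
A polling sketch built on GetState and the state constants above; the ten-second interval is arbitrary, a real caller would add a timeout, and client, jobID, and the standard library time package are assumed:

for {
	st, err := batches.GetState(client, jobID)
	if err != nil {
		log.Fatalf("failed to query state of job %s: %v", jobID, err)
	}
	if st.State == batches.StateSuccess || st.State == batches.StateDead {
		fmt.Printf("job %s finished in state %s\n", jobID, st.State)
		break
	}
	time.Sleep(10 * time.Second)
}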
