Package v1alpha1

Version: v0.0.0-...-965667f
Published: Jan 3, 2018 License: Apache-2.0 Imports: 5 Imported by: 0

Documentation


Constants

const GroupName = "spark-operator.k8s.io"

GroupName is the group name used in this package.

Variables

var (
	SchemeBuilder = runtime.NewSchemeBuilder(addKnownTypes)
	AddToScheme   = SchemeBuilder.AddToScheme
)
var SchemeGroupVersion = schema.GroupVersion{Group: GroupName, Version: "v1alpha1"}

SchemeGroupVersion is the group version used to register these objects.

Functions

func RegisterDeepCopies deprecated

func RegisterDeepCopies(scheme *runtime.Scheme) error

RegisterDeepCopies adds deep-copy functions to the given scheme. Public to allow building arbitrary schemes.

Deprecated: deepcopy registration will go away when static deepcopy is fully implemented.

func RegisterDefaults

func RegisterDefaults(scheme *runtime.Scheme) error

RegisterDefaults adds defaulter functions to the given scheme. Public to allow building arbitrary schemes. All generated defaulters are covering - they call all nested defaulters.

func Resource

func Resource(resource string) schema.GroupResource

Resource takes an unqualified resource and returns a Group-qualified GroupResource.

func SetDefaults_SparkApplication

func SetDefaults_SparkApplication(app *SparkApplication)

func SetObjectDefaults_SparkApplication

func SetObjectDefaults_SparkApplication(in *SparkApplication)

Types

type ApplicationState

type ApplicationState struct {
	State        ApplicationStateType `json:"state"`
	ErrorMessage string               `json:"errorMessage"`
}

ApplicationState captures the current state of the application, along with an error message in case of failure.

func (*ApplicationState) DeepCopy

func (in *ApplicationState) DeepCopy() *ApplicationState

DeepCopy is an autogenerated deepcopy function, copying the receiver, creating a new ApplicationState.

func (*ApplicationState) DeepCopyInto

func (in *ApplicationState) DeepCopyInto(out *ApplicationState)

DeepCopyInto is an autogenerated deepcopy function, copying the receiver, writing into out. in must be non-nil.

type ApplicationStateType

type ApplicationStateType string

ApplicationStateType represents the type of the current state of an application.

const (
	NewState       ApplicationStateType = "NEW"
	SubmittedState ApplicationStateType = "SUBMITTED"
	RunningState   ApplicationStateType = "RUNNING"
	CompletedState ApplicationStateType = "COMPLETED"
	FailedState    ApplicationStateType = "FAILED"
)

Different states an application may have.

type Dependencies

type Dependencies struct {
	// JarFiles is a list of JAR files the Spark application depends on.
	// Optional.
	JarFiles []string `json:"jarFiles,omitempty"`
	// Files is a list of files the Spark application depends on.
	// Optional.
	Files []string `json:"files,omitempty"`
	// PyFiles is a list of Python files the Spark application depends on.
	// Optional.
	PyFiles []string `json:"pyFiles,omitempty"`
}

Dependencies specifies all possible types of dependencies of a Spark application.

func (*Dependencies) DeepCopy

func (in *Dependencies) DeepCopy() *Dependencies

DeepCopy is an autogenerated deepcopy function, copying the receiver, creating a new Dependencies.

func (*Dependencies) DeepCopyInto

func (in *Dependencies) DeepCopyInto(out *Dependencies)

DeepCopyInto is an autogenerated deepcopy function, copying the receiver, writing into out. in must be non-nil.

type DeployMode

type DeployMode string

DeployMode describes the type of deployment of a Spark application.

const (
	ClusterMode         DeployMode = "cluster"
	ClientMode          DeployMode = "client"
	InClusterClientMode DeployMode = "in-cluster-client"
)

Different types of deployments.

type DriverInfo

type DriverInfo struct {
	WebUIServiceName string `json:"webUIServiceName"`
	WebUIPort        int32  `json:"webUIPort"`
	WebUIAddress     string `json:"webUIAddress"`
	PodName          string `json:"podName"`
}

DriverInfo captures information about the driver.

func (*DriverInfo) DeepCopy

func (in *DriverInfo) DeepCopy() *DriverInfo

DeepCopy is an autogenerated deepcopy function, copying the receiver, creating a new DriverInfo.

func (*DriverInfo) DeepCopyInto

func (in *DriverInfo) DeepCopyInto(out *DriverInfo)

DeepCopyInto is an autogenerated deepcopy function, copying the receiver, writing into out. in must be non-nil.

type DriverSpec

type DriverSpec struct {
	// PodName is the name of the driver pod that the user creates. This is used for the
	// in-cluster client mode in which the user creates a client pod where the driver of
	// the user application runs. It's an error to set this field if Mode is not
	// in-cluster-client.
	// Optional.
	PodName *string `json:"podName,omitempty"`
	// Cores is used to set spark.driver.cores.
	// Optional.
	Cores *string `json:"cores,omitempty"`
	// Memory is used to set spark.driver.memory.
	// Optional.
	Memory *string `json:"memory,omitempty"`
	// Image is the driver Docker image to use.
	Image string `json:"image"`
	// DriverConfigMaps carries information of other ConfigMaps to add to the driver Pod.
	// Optional.
	DriverConfigMaps []NamePath `json:"driverConigMaps,omitempty"`
	// DriverSecrets carries information of secrets to add to the driver Pod.
	// Optional.
	DriverSecrets []SecretInfo `json:"driverSecrets,omitempty"`
	// DriverEnvVars carries the environment variables to add to the driver Pod.
	// Optional.
	DriverEnvVars map[string]string `json:"driverEnvVars,omitempty"`
}

DriverSpec is the specification of the driver.

func (*DriverSpec) DeepCopy

func (in *DriverSpec) DeepCopy() *DriverSpec

DeepCopy is an autogenerated deepcopy function, copying the receiver, creating a new DriverSpec.

func (*DriverSpec) DeepCopyInto

func (in *DriverSpec) DeepCopyInto(out *DriverSpec)

DeepCopyInto is an autogenerated deepcopy function, copying the receiver, writing into out. in must be non-nil.

type ExecutorSpec

type ExecutorSpec struct {
	// Cores is used to set spark.executor.cores.
	// Optional.
	Cores *string `json:"cores,omitempty"`
	// Memory is used to set spark.executor.memory.
	// Optional.
	Memory *string `json:"memory,omitempty"`
	// Image is the executor Docker image to use.
	Image string `json:"image"`
	// Instances is the number of executor instances.
	Instances *int32 `json:"instances,omitempty"`
	// ExecutorConfigMaps carries information of other ConfigMaps to add to the executor Pods.
	// Optional.
	ExecutorConfigMaps []NamePath `json:"executorConigMaps,omitempty"`
	// ExecutorSecrets carries information of secrets to add to the executor Pods.
	// Optional.
	ExecutorSecrets []SecretInfo `json:"executorSecrets,omitempty"`
	// ExecutorEnvVars carries the environment variables to add to the executor Pods.
	// Optional.
	ExecutorEnvVars map[string]string `json:"executorEnvVars,omitempty"`
}

ExecutorSpec is the specification of the executor.

func (*ExecutorSpec) DeepCopy

func (in *ExecutorSpec) DeepCopy() *ExecutorSpec

DeepCopy is an autogenerated deepcopy function, copying the receiver, creating a new ExecutorSpec.

func (*ExecutorSpec) DeepCopyInto

func (in *ExecutorSpec) DeepCopyInto(out *ExecutorSpec)

DeepCopyInto is an autogenerated deepcopy function, copying the receiver, writing into out. in must be non-nil.

type ExecutorState

type ExecutorState string

ExecutorState tells the current state of an executor.

const (
	ExecutorPendingState   ExecutorState = "PENDING"
	ExecutorRunningState   ExecutorState = "RUNNING"
	ExecutorCompletedState ExecutorState = "COMPLETED"
	ExecutorFailedState    ExecutorState = "FAILED"
)

Different states an executor may have.

type NamePath

type NamePath struct {
	Name string `json:"name"`
	Path string `json:"path"`
}

NamePath is a pair of a name and a path to which the named objects should be mounted.

func (*NamePath) DeepCopy

func (in *NamePath) DeepCopy() *NamePath

DeepCopy is an autogenerated deepcopy function, copying the receiver, creating a new NamePath.

func (*NamePath) DeepCopyInto

func (in *NamePath) DeepCopyInto(out *NamePath)

DeepCopyInto is an autogenerated deepcopy function, copying the receiver, writing into out. in must be non-nil.

type SecretInfo

type SecretInfo struct {
	Name string     `json:"name"`
	Path string     `json:"path"`
	Type SecretType `json:"secretType"`
}

SecretInfo captures information of a secret.

func (*SecretInfo) DeepCopy

func (in *SecretInfo) DeepCopy() *SecretInfo

DeepCopy is an autogenerated deepcopy function, copying the receiver, creating a new SecretInfo.

func (*SecretInfo) DeepCopyInto

func (in *SecretInfo) DeepCopyInto(out *SecretInfo)

DeepCopyInto is an autogenerated deepcopy function, copying the receiver, writing into out. in must be non-nil.

type SecretType

type SecretType string

SecretType tells the type of a secret.

const (
	// GCPServiceAccountSecret is for secrets from a GCP service account JSON key file that requires
	// the environment variable GOOGLE_APPLICATION_CREDENTIALS.
	GCPServiceAccountSecret SecretType = "GCPServiceAccount"
	// HDFSDelegationTokenSecret is for secrets from an HDFS delegation token that requires the
	// environment variable HADOOP_TOKEN_FILE_LOCATION.
	HDFSDelegationTokenSecret SecretType = "HDFSDelegationToken"
	// GenericType is for secrets that need no special handling.
	GenericType SecretType = "Generic"
)

An enumeration of the supported secret types.

type SparkApplication

type SparkApplication struct {
	metav1.TypeMeta   `json:",inline"`
	metav1.ObjectMeta `json:"metadata"`
	Spec              SparkApplicationSpec   `json:"spec"`
	Status            SparkApplicationStatus `json:"status,omitempty"`
}

SparkApplication represents a Spark application running on and using Kubernetes as a cluster manager.

func (*SparkApplication) DeepCopy

func (in *SparkApplication) DeepCopy() *SparkApplication

DeepCopy is an autogenerated deepcopy function, copying the receiver, creating a new SparkApplication.

func (*SparkApplication) DeepCopyInto

func (in *SparkApplication) DeepCopyInto(out *SparkApplication)

DeepCopyInto is an autogenerated deepcopy function, copying the receiver, writing into out. in must be non-nil.

func (*SparkApplication) DeepCopyObject

func (in *SparkApplication) DeepCopyObject() runtime.Object

DeepCopyObject is an autogenerated deepcopy function, copying the receiver, creating a new runtime.Object.

type SparkApplicationList

type SparkApplicationList struct {
	metav1.TypeMeta `json:",inline"`
	metav1.ListMeta `json:"metadata,omitempty"`
	Items           []SparkApplication `json:"items,omitempty"`
}

SparkApplicationList carries a list of SparkApplication objects.

func (*SparkApplicationList) DeepCopy

func (in *SparkApplicationList) DeepCopy() *SparkApplicationList

DeepCopy is an autogenerated deepcopy function, copying the receiver, creating a new SparkApplicationList.

func (*SparkApplicationList) DeepCopyInto

func (in *SparkApplicationList) DeepCopyInto(out *SparkApplicationList)

DeepCopyInto is an autogenerated deepcopy function, copying the receiver, writing into out. in must be non-nil.

func (*SparkApplicationList) DeepCopyObject

func (in *SparkApplicationList) DeepCopyObject() runtime.Object

DeepCopyObject is an autogenerated deepcopy function, copying the receiver, creating a new runtime.Object.

type SparkApplicationSpec

type SparkApplicationSpec struct {
	// Type tells the type of the Spark application.
	Type SparkApplicationType `json:"type"`
	// Mode is the deployment mode of the Spark application.
	Mode DeployMode `json:"mode"`
	// MainClass is the fully-qualified main class of the Spark application.
	// This only applies to Java/Scala Spark applications.
	// Optional.
	MainClass *string `json:"mainClass,omitempty"`
	// MainApplicationFile is the path to a bundled JAR, Python, or R file of the application.
	MainApplicationFile string `json:"mainApplicationFile"`
	// Arguments is a list of arguments to be passed to the application.
	// Optional.
	Arguments []string `json:"arguments,omitempty"`
	// SparkConf carries user-specified Spark configuration properties as they would be set using the
	// "--conf" option in spark-submit.
	// Optional.
	SparkConf map[string]string `json:"sparkConf,omitempty"`
	// HadoopConf carries user-specified Hadoop configuration properties as they would be set using the
	// "--conf" option in spark-submit. The SparkApplication controller automatically adds the prefix
	// "spark.hadoop." to Hadoop configuration properties.
	// Optional.
	HadoopConf map[string]string `json:"hadoopConf,omitempty"`
	// SparkConfigMap carries the name of the ConfigMap containing Spark configuration files such as log4j.properties.
	// The controller will add environment variable SPARK_CONF_DIR to the path where the ConfigMap is mounted to.
	// Optional.
	SparkConfigMap *string `json:"sparkConfigMap,omitempty"`
	// HadoopConfigMap carries the name of the ConfigMap containing Hadoop configuration files such as core-site.xml.
	// The controller will add environment variable HADOOP_CONF_DIR to the path where the ConfigMap is mounted to.
	// Optional.
	HadoopConfigMap *string `json:"hadoopConfigMap,omitempty"`
	// Driver is the driver specification.
	Driver DriverSpec `json:"driver"`
	// Executor is the executor specification.
	Executor ExecutorSpec `json:"executor"`
	// Deps captures all possible types of dependencies of a Spark application.
	Deps Dependencies `json:"deps"`
	// SubmissionByUser indicates if the application is to be submitted by the user.
	// The custom controller should not submit the application on behalf of the user if this is true.
	// It defaults to false.
	SubmissionByUser bool `json:"submissionByUser"`
}

SparkApplicationSpec describes the specification of a Spark application using Kubernetes as a cluster manager. It carries every piece of information a spark-submit command takes and recognizes.

func (*SparkApplicationSpec) DeepCopy

func (in *SparkApplicationSpec) DeepCopy() *SparkApplicationSpec

DeepCopy is an autogenerated deepcopy function, copying the receiver, creating a new SparkApplicationSpec.

func (*SparkApplicationSpec) DeepCopyInto

func (in *SparkApplicationSpec) DeepCopyInto(out *SparkApplicationSpec)

DeepCopyInto is an autogenerated deepcopy function, copying the receiver, writing into out. in must be non-nil.

type SparkApplicationStatus

type SparkApplicationStatus struct {
	// AppID is the application ID that's also added as a label to the SparkApplication object
	// and the driver and executor Pods, and is used to group the objects for the same application.
	AppID string `json:"appId"`
	// SubmissionTime is the time when the application was submitted.
	SubmissionTime metav1.Time `json:"submissionTime"`
	// CompletionTime is the time when the application completes, if it does.
	CompletionTime metav1.Time `json:"completionTime"`
	// DriverInfo has information about the driver.
	DriverInfo DriverInfo `json:"driverInfo"`
	// AppState tells the overall application state.
	AppState ApplicationState `json:"applicationState"`
	// ExecutorState records the state of executors by executor Pod names.
	ExecutorState map[string]ExecutorState `json:"executorState,omitempty"`
}

SparkApplicationStatus describes the current status of a Spark application.

func (*SparkApplicationStatus) DeepCopy

func (in *SparkApplicationStatus) DeepCopy() *SparkApplicationStatus

DeepCopy is an autogenerated deepcopy function, copying the receiver, creating a new SparkApplicationStatus.

func (*SparkApplicationStatus) DeepCopyInto

func (in *SparkApplicationStatus) DeepCopyInto(out *SparkApplicationStatus)

DeepCopyInto is an autogenerated deepcopy function, copying the receiver, writing into out. in must be non-nil.

type SparkApplicationType

type SparkApplicationType string

SparkApplicationType describes the type of a Spark application.

const (
	JavaApplicationType   SparkApplicationType = "Java"
	ScalaApplicationType  SparkApplicationType = "Scala"
	PythonApplicationType SparkApplicationType = "Python"
	RApplocationType      SparkApplicationType = "R"
)

Different types of Spark applications.
