common

package
v0.1.0
Published: Oct 19, 2021 License: Apache-2.0 Imports: 0 Imported by: 0

Documentation

Index

Constants

const (
	// SparkSchedulerName is the name of the kube-scheduler instance that talks with the extender
	SparkSchedulerName = "spark-scheduler"
	// SparkRoleLabel represents the label key for the spark-role of a pod
	SparkRoleLabel = "spark-role"
	// SparkAppIDLabel represents the label key for the spark application ID on a pod
	SparkAppIDLabel = "spark-app-id" // TODO(onursatici): change this to a spark specific label when spark has one
	// Driver is the value of SparkRoleLabel that identifies a pod as a spark driver
	Driver = "driver"
	// Executor is the value of SparkRoleLabel that identifies a pod as a spark executor
	Executor = "executor"
)
const (
	// DriverCPU represents the key of an annotation that describes how much CPU a spark driver requires
	DriverCPU = "spark-driver-cpu"
	// DriverMemory represents the key of an annotation that describes how much memory a spark driver requires
	DriverMemory = "spark-driver-mem"
	// ExecutorCPU represents the key of an annotation that describes how much CPU a spark executor requires
	ExecutorCPU = "spark-executor-cpu"
	// ExecutorMemory represents the key of an annotation that describes how much memory a spark executor requires
	ExecutorMemory = "spark-executor-mem"
	// DynamicAllocationEnabled sets whether dynamic allocation is enabled for this spark application (false by default)
	DynamicAllocationEnabled = "spark-dynamic-allocation-enabled"
	// ExecutorCount represents the key of an annotation that describes how many executors a spark application requires (required if DynamicAllocationEnabled is false)
	ExecutorCount = "spark-executor-count"
	// DAMinExecutorCount represents the lower bound on the number of executors a spark application requires if dynamic allocation is enabled (required if DynamicAllocationEnabled is true)
	DAMinExecutorCount = "spark-dynamic-allocation-min-executor-count"
	// DAMaxExecutorCount represents the upper bound on the number of executors a spark application can have if dynamic allocation is enabled (required if DynamicAllocationEnabled is true)
	DAMaxExecutorCount = "spark-dynamic-allocation-max-executor-count"
)

Variables

This section is empty.

Functions

This section is empty.

Types

This section is empty.

Directories

This section is empty.
