Documentation ¶
Overview ¶
+groupName=datasciencecluster.opendatahub.io
Index ¶
- type Component
- func (c *Component) Cleanup(_ context.Context, _ client.Client, _ metav1.Object, ...) error
- func (c *Component) ConfigComponentLogger(logger logr.Logger, component string, dscispec *dsciv1.DSCInitializationSpec) logr.Logger
- func (in *Component) DeepCopy() *Component
- func (in *Component) DeepCopyInto(out *Component)
- func (c *Component) GetManagementState() operatorv1.ManagementState
- func (c *Component) UpdatePrometheusConfig(_ client.Client, logger logr.Logger, enable bool, component string) error
- type ComponentInterface
- type DevFlags
- type ManifestsConfig
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
This section is empty.
Types ¶
type Component ¶
type Component struct {
	// Set to one of the following values:
	//
	// - "Managed" : the operator is actively managing the component and trying to keep it active.
	//               It will only upgrade the component if it is safe to do so
	//
	// - "Removed" : the operator is actively managing the component and will not install it,
	//               or if it is installed, the operator will try to remove it
	//
	// +kubebuilder:validation:Enum=Managed;Removed
	ManagementState operatorv1.ManagementState `json:"managementState,omitempty"`

	// Add developer fields
	// +optional
	// +operator-sdk:csv:customresourcedefinitions:type=spec,order=2
	DevFlags *DevFlags `json:"devFlags,omitempty"`
}
Component struct defines the basis for each OpenDataHub component configuration. +kubebuilder:object:generate=true
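A minimal sketch, not taken from this package's docs, of how a concrete component type might embed Component and gate work on its management state; the Dashboard type and enabled helper are illustrative:

import (
	operatorv1 "github.com/openshift/api/operator/v1"
)

// Dashboard is a hypothetical component that embeds Component to pick up
// ManagementState, DevFlags, and the promoted helper methods.
type Dashboard struct {
	Component `json:""`
}

// enabled reports whether the operator should actively manage this component.
func (d *Dashboard) enabled() bool {
	return d.GetManagementState() == operatorv1.Managed
}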
func (*Component) ConfigComponentLogger ¶ added in v2.10.0
func (c *Component) ConfigComponentLogger(logger logr.Logger, component string, dscispec *dsciv1.DSCInitializationSpec) logr.Logger
ConfigComponentLogger extends the original ConfigLoggers to include the component name.
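A hedged sketch of typical usage at the start of a component reconcile; the helper name is illustrative and the dsciv1 import path is assumed from the opendatahub-operator module layout:

import (
	"github.com/go-logr/logr"

	dsciv1 "github.com/opendatahub-io/opendatahub-operator/v2/apis/dscinitialization/v1"
)

// logReconcileStart derives a component-scoped logger and records the start
// of a reconcile pass.
func logReconcileStart(c *Component, base logr.Logger, name string, dscispec *dsciv1.DSCInitializationSpec) {
	l := c.ConfigComponentLogger(base, name, dscispec)
	l.Info("starting reconcile", "component", name)
}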
func (*Component) DeepCopy ¶ added in v2.7.0
func (in *Component) DeepCopy() *Component

DeepCopy is an autogenerated deepcopy function, copying the receiver, creating a new Component.
func (*Component) DeepCopyInto ¶ added in v2.7.0
func (in *Component) DeepCopyInto(out *Component)

DeepCopyInto is an autogenerated deepcopy function, copying the receiver, writing into out. in must be non-nil.
func (*Component) GetManagementState ¶ added in v2.2.0
func (c *Component) GetManagementState() operatorv1.ManagementState
func (*Component) UpdatePrometheusConfig ¶ added in v2.7.0
func (c *Component) UpdatePrometheusConfig(_ client.Client, logger logr.Logger, enable bool, component string) error
UpdatePrometheusConfig updates prometheus-configs.yaml to include or exclude <component>.rules. When the enable parameter is true, the rules are added; when it is false, existing rules are removed.
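A hedged sketch tying this method to the management state: the rules are enabled while the component is Managed and removed otherwise. The syncPrometheusRules helper and the "dashboard" component name are illustrative:

import (
	"github.com/go-logr/logr"
	operatorv1 "github.com/openshift/api/operator/v1"
	"sigs.k8s.io/controller-runtime/pkg/client"
)

// syncPrometheusRules adds <component>.rules when the component is Managed
// and removes them otherwise.
func syncPrometheusRules(c *Component, cli client.Client, l logr.Logger) error {
	enable := c.GetManagementState() == operatorv1.Managed
	return c.UpdatePrometheusConfig(cli, l, enable, "dashboard")
}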
type ComponentInterface ¶
type ComponentInterface interface {
	ReconcileComponent(ctx context.Context, cli client.Client, logger logr.Logger,
		owner metav1.Object, DSCISpec *dsciv1.DSCInitializationSpec,
		platform cluster.Platform, currentComponentStatus bool) error
	Cleanup(ctx context.Context, cli client.Client, owner metav1.Object, DSCISpec *dsciv1.DSCInitializationSpec) error
	GetComponentName() string
	GetManagementState() operatorv1.ManagementState
	OverrideManifests(ctx context.Context, platform cluster.Platform) error
	UpdatePrometheusConfig(cli client.Client, logger logr.Logger, enable bool, component string) error
	ConfigComponentLogger(logger logr.Logger, component string, dscispec *dsciv1.DSCInitializationSpec) logr.Logger
}
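A hedged sketch of how a caller, such as a DataScienceCluster controller, might drive a set of components through this interface; the reconcileAll helper is illustrative and the cluster and dsciv1 import paths are assumed from the opendatahub-operator module layout:

import (
	"context"

	"github.com/go-logr/logr"
	operatorv1 "github.com/openshift/api/operator/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"sigs.k8s.io/controller-runtime/pkg/client"

	dsciv1 "github.com/opendatahub-io/opendatahub-operator/v2/apis/dscinitialization/v1"
	"github.com/opendatahub-io/opendatahub-operator/v2/pkg/cluster"
)

// reconcileAll cleans up components marked Removed and reconciles the rest,
// giving each a component-scoped logger.
func reconcileAll(ctx context.Context, cli client.Client, logger logr.Logger, owner metav1.Object,
	dscispec *dsciv1.DSCInitializationSpec, platform cluster.Platform, comps []ComponentInterface) error {
	for _, comp := range comps {
		l := comp.ConfigComponentLogger(logger, comp.GetComponentName(), dscispec)
		if comp.GetManagementState() == operatorv1.Removed {
			if err := comp.Cleanup(ctx, cli, owner, dscispec); err != nil {
				return err
			}
			continue
		}
		if err := comp.ReconcileComponent(ctx, cli, l, owner, dscispec, platform, false); err != nil {
			return err
		}
	}
	return nil
}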
type DevFlags ¶ added in v2.2.0
type DevFlags struct {
	// List of custom manifests for the given component
	// +optional
	Manifests []ManifestsConfig `json:"manifests,omitempty"`
}
DevFlags defines a list of fields that developers can use to test customizations. Using DevFlags in a production environment is not recommended. +kubebuilder:object:generate=true
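An illustrative DevFlags value that points a component at a custom manifest build; the tarball URL is a placeholder, not a published artifact:

// devDashboardFlags overrides the default manifests with a developer build.
// Remember that DevFlags is intended for testing, not production.
var devDashboardFlags = &DevFlags{
	Manifests: []ManifestsConfig{{
		URI:        "https://github.com/opendatahub-io/odh-dashboard/tarball/main",
		ContextDir: "manifests",
		SourcePath: "overlays/dev",
	}},
}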
func (*DevFlags) DeepCopy ¶ added in v2.7.0
func (in *DevFlags) DeepCopy() *DevFlags

DeepCopy is an autogenerated deepcopy function, copying the receiver, creating a new DevFlags.
func (*DevFlags) DeepCopyInto ¶ added in v2.7.0
func (in *DevFlags) DeepCopyInto(out *DevFlags)

DeepCopyInto is an autogenerated deepcopy function, copying the receiver, writing into out. in must be non-nil.
type ManifestsConfig ¶ added in v2.2.0
type ManifestsConfig struct {
	// uri is the URI pointing to a git repo with a tag/branch, e.g. https://github.com/org/repo/tarball/<tag/branch>
	// +optional
	// +kubebuilder:default:=""
	// +operator-sdk:csv:customresourcedefinitions:type=spec,order=1
	URI string `json:"uri,omitempty"`

	// contextDir is the relative path to the folder containing manifests in a repository; the default value is "manifests"
	// +optional
	// +kubebuilder:default:="manifests"
	// +operator-sdk:csv:customresourcedefinitions:type=spec,order=2
	ContextDir string `json:"contextDir,omitempty"`

	// sourcePath is the subpath within contextDir where kustomize builds start. Examples include any sub-folder or path: `base`, `overlays/dev`, `default`, `odh`, etc.
	// +optional
	// +kubebuilder:default:=""
	// +operator-sdk:csv:customresourcedefinitions:type=spec,order=3
	SourcePath string `json:"sourcePath,omitempty"`
}
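A hypothetical helper, not part of this package, showing how the three fields compose under the assumption that the tarball from uri is downloaded and unpacked first, and kustomize builds then start at contextDir/sourcePath beneath the unpack location:

import "path/filepath"

// manifestBuildPath is a hypothetical helper: given the directory where the
// tarball from URI was unpacked, it returns where kustomize builds begin.
func manifestBuildPath(downloadDir string, m ManifestsConfig) string {
	return filepath.Join(downloadDir, m.ContextDir, m.SourcePath)
}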
Directories ¶
Path | Synopsis |
---|---|
codeflare | Package codeflare provides utility functions to configure CodeFlare as part of the stack, which makes managing distributed compute infrastructure in the cloud easy and intuitive for data scientists. +groupName=datasciencecluster.opendatahub.io |
dashboard | Package dashboard provides utility functions to configure the Open Data Hub Dashboard: a web dashboard that displays installed Open Data Hub components with easy access to component UIs and documentation. +groupName=datasciencecluster.opendatahub.io |
datasciencepipelines | Package datasciencepipelines provides utility functions to configure Data Science Pipelines: a pipeline solution for end-to-end MLOps workflows that supports the Kubeflow Pipelines SDK, Tekton and Argo Workflows. |
kserve | Package kserve provides utility functions to configure Kserve as the controller for serving ML models on arbitrary frameworks. +groupName=datasciencecluster.opendatahub.io |
 | +groupName=datasciencecluster.opendatahub.io |
modelmeshserving | Package modelmeshserving provides utility functions to configure ModelMesh, a general-purpose model serving management/routing layer. +groupName=datasciencecluster.opendatahub.io |
modelregistry | Package modelregistry provides utility functions to configure ModelRegistry, an ML model metadata repository service. +groupName=datasciencecluster.opendatahub.io |
ray | Package ray provides utility functions to configure Ray as part of the stack, which makes managing distributed compute infrastructure in the cloud easy and intuitive for data scientists. +groupName=datasciencecluster.opendatahub.io |
trainingoperator | Package trainingoperator provides utility functions to configure trainingoperator as part of the stack, which makes managing distributed compute infrastructure in the cloud easy and intuitive for data scientists. +groupName=datasciencecluster.opendatahub.io |
trustyai | Package trustyai provides utility functions to configure TrustyAI, a bias/fairness and explainability toolkit. +groupName=datasciencecluster.opendatahub.io |
workbenches | Package workbenches provides utility functions to configure Workbenches to secure Jupyter Notebooks in Kubernetes environments with support for OAuth. +groupName=datasciencecluster.opendatahub.io |