labelselector

package
v3.10.0-0.25.0+incompa...
Published: Apr 20, 2018 License: Apache-2.0 Imports: 3 Imported by: 0

Documentation

Overview

labelselector is a trimmed-down version of k8s/pkg/labels/selector.go. It only accepts exact label matches. Example: "k1=v1, k2 = v2"
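
For illustration, a minimal sketch of parsing such a selector; the import path below is a placeholder and must be replaced with this module's actual path:

package main

import (
	"fmt"

	"example.com/labelselector" // placeholder import path; substitute this module's path
)

func main() {
	labels, err := labelselector.Parse("k1=v1, k2 = v2")
	if err != nil {
		fmt.Println("invalid selector:", err)
		return
	}
	fmt.Println(labels) // expected: map[k1:v1 k2:v2]
}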

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

func Conflicts

func Conflicts(labels1, labels2 map[string]string) bool

Conflicts takes two maps and returns true if a key present in both maps has values that do not match; it returns false in all other cases.
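
A short sketch of the expected behaviour, continuing with the placeholder import from the Overview:

func conflictsExample() {
	a := map[string]string{"app": "web", "tier": "frontend"}
	b := map[string]string{"app": "api", "env": "prod"}

	fmt.Println(labelselector.Conflicts(a, b))                                // true: "app" is present in both maps with different values
	fmt.Println(labelselector.Conflicts(a, map[string]string{"env": "prod"})) // false: no shared key has mismatched values
}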

func Equals

func Equals(labels1, labels2 map[string]string) bool

Equals returns true if the given maps are equal.
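
A similar sketch for Equals (same placeholder import assumed):

func equalsExample() {
	x := map[string]string{"app": "web"}
	y := map[string]string{"app": "web"}
	z := map[string]string{"app": "api"}

	fmt.Println(labelselector.Equals(x, y)) // true: identical keys and values
	fmt.Println(labelselector.Equals(x, z)) // false: value of "app" differs
}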

func Merge

func Merge(labels1, labels2 map[string]string) map[string]string

Merge combines the given maps. Note: it does not check for any conflicts between the maps.
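
A sketch of Merge with disjoint maps (the behaviour for overlapping keys is not documented here, so it is not shown):

func mergeExample() {
	merged := labelselector.Merge(
		map[string]string{"app": "web"},
		map[string]string{"tier": "frontend"},
	)
	fmt.Println(merged) // map[app:web tier:frontend]
	// Merge does not check for conflicts; callers that care can run Conflicts first.
}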

func Parse

func Parse(selector string) (map[string]string, error)

Parse takes a string representing a selector and returns map[key]value, or an error. The input will cause an error if it does not follow this form:

<selector-syntax> ::= [ <requirement> | <requirement> "," <selector-syntax> ]
<requirement>     ::= KEY "=" VALUE

KEY is a sequence of one or more characters following [ DNS_SUBDOMAIN "/" ] DNS_LABEL. VALUE is a sequence of zero or more characters in "([A-Za-z0-9_-\.])"; the maximum length is 64 characters. The delimiter is whitespace: (' ', '\t').
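
A sketch showing both an accepted selector and one rejected by the grammar above (placeholder import assumed):

func parseExample() {
	labels, err := labelselector.Parse("k8s-app=kube-dns, tier=node")
	if err == nil {
		fmt.Println(labels) // map[k8s-app:kube-dns tier:node]
	}

	// Set-based expressions are not part of the grammar, so an error is expected.
	if _, err := labelselector.Parse("env in (prod, dev)"); err != nil {
		fmt.Println("rejected:", err)
	}
}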

Types

type Lexer

type Lexer struct {
	// contains filtered or unexported fields
}

Lexer represents the Lexer struct for a label selector. It contains the necessary information to tokenize the input string.

func (*Lexer) Lex

func (l *Lexer) Lex() (tok Token, lit string)

Lex returns a pair of Token and the literal. The literal is meaningful only for the IdentifierToken token.
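
The documentation does not show an exported constructor for Lexer, so the following is only a sketch of how a caller already holding a *Lexer might drain it into identifier literals:

func drain(l *labelselector.Lexer) []string {
	var idents []string
	for {
		tok, lit := l.Lex()
		switch tok {
		case labelselector.EndOfStringToken, labelselector.ErrorToken:
			return idents
		case labelselector.IdentifierToken:
			idents = append(idents, lit) // the literal is meaningful only for IdentifierToken
		}
	}
}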

type Parser

type Parser struct {
	// contains filtered or unexported fields
}

Parser is the data structure for the label selector parser.

type ScannedItem

type ScannedItem struct {
	// contains filtered or unexported fields
}

ScannedItem is an item produced by the lexer. It contains the Token and the literal.

type Token

type Token int

Constant definitions for lexer tokens.

const (
	ErrorToken Token = iota
	EndOfStringToken
	CommaToken
	EqualsToken
	IdentifierToken // to represent keys and values
)
