tokenizer

package
v3.6.0-0.dev+incompatible
Warning

This package is not in the latest version of its module.
Published: Jan 9, 2019 License: Apache-2.0 Imports: 4 Imported by: 0

Documentation

Index

Constants

const (
	TokLabel tokenKind = iota + 1
	TokStringLiteral
	TokLBrace
	TokRBrace
	TokComma
	TokEq
	TokNe
	TokIn
	TokNot
	TokNotIn
	TokAll
	TokHas
	TokLParen
	TokRParen
	TokAnd
	TokOr
	TokEOF
)
const (
	// LabelKeyMatcher is the base regex for a valid label key.
	LabelKeyMatcher = `[a-zA-Z0-9_./-]{1,512}`
)

Variables

This section is empty.

Functions

This section is empty.

Types

type Token

type Token struct {
	Kind  tokenKind
	Value interface{}
}

Token is a single lexical token produced by Tokenize. Kind identifies the token type (one of the Tok* constants above) and Value carries any associated data, e.g. a label name or the contents of a string literal.

func Tokenize

func Tokenize(input string) (tokens []Token, err error)

Tokenize scans the input string and returns the corresponding slice of Tokens, terminated by a TokEOF token, or an error if the input contains invalid syntax.
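A minimal usage sketch. The import path below is an assumption based on the module version string (this looks like the selector tokenizer from projectcalico/libcalico-go), and the selector expression is only an illustrative input:

```go
package main

import (
	"fmt"

	// Assumed import path; adjust to wherever this package actually lives.
	"github.com/projectcalico/libcalico-go/lib/selector/tokenizer"
)

func main() {
	// Tokenize a small selector expression into its token stream.
	tokens, err := tokenizer.Tokenize(`env == "prod" && has(app)`)
	if err != nil {
		// Malformed input (e.g. an unterminated string literal)
		// is reported here rather than at parse time.
		fmt.Println("tokenize error:", err)
		return
	}

	// Each Token pairs a kind with an optional value,
	// such as the label name or string-literal contents.
	for _, tok := range tokens {
		fmt.Printf("kind=%v value=%v\n", tok.Kind, tok.Value)
	}
}
```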
