tokenizer

package
v1.1.0-rc1
Published: Feb 7, 2017 License: Apache-2.0 Imports: 4 Imported by: 90

Documentation

Index

Constants

const (
	TokLabel         tokenKind = iota + 1 // label name / identifier
	TokStringLiteral                      // quoted string literal
	TokLBrace                             // left brace '{'
	TokRBrace                             // right brace '}'
	TokComma                              // comma separator
	TokEq                                 // equality operator
	TokNe                                 // inequality operator
	TokIn                                 // "in" keyword
	TokNot                                // negation operator
	TokNotIn                              // "not in" keyword
	TokAll                                // "all" keyword
	TokHas                                // "has" keyword
	TokLParen                             // left parenthesis '('
	TokRParen                             // right parenthesis ')'
	TokAnd                                // logical AND
	TokOr                                 // logical OR
	TokEof                                // end of input
)
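
The Tok* constants are exported values of the unexported tokenKind type, so callers typically compare a Token's Kind field against them rather than naming the type. A minimal sketch of that pattern follows; the import path and the wording of each case are assumptions, not part of this package's API.

package main

import (
	"fmt"

	// Import path assumed; adjust it to wherever this tokenizer package
	// lives in your module graph.
	"github.com/projectcalico/libcalico-go/lib/selector/tokenizer"
)

// describe maps a token to a rough human-readable description. The wording
// of each case is illustrative only.
func describe(tok tokenizer.Token) string {
	switch tok.Kind {
	case tokenizer.TokLabel:
		return fmt.Sprintf("label %v", tok.Value)
	case tokenizer.TokStringLiteral:
		return fmt.Sprintf("string literal %v", tok.Value)
	case tokenizer.TokEof:
		return "end of input"
	default:
		return fmt.Sprintf("operator or keyword (kind %v)", tok.Kind)
	}
}

func main() {
	fmt.Println(describe(tokenizer.Token{Kind: tokenizer.TokEof}))
}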
const (
	// LabelKeyMatcher is the base regex for a valid label key.
	LabelKeyMatcher = `([a-zA-Z0-9_.-/]{0,253}/)?[a-zA-Z0-9]([a-zA-Z0-9_.-]{0,61}[a-zA-Z0-9])?`
)
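
LabelKeyMatcher is a bare sub-pattern rather than an anchored regular expression, so reusing it to validate a whole string requires adding anchors. A minimal sketch, assuming the import path above (adjust it to your module); how the tokenizer itself embeds this constant is not shown here.

package main

import (
	"fmt"
	"regexp"

	// Import path assumed; adjust to match your module.
	"github.com/projectcalico/libcalico-go/lib/selector/tokenizer"
)

func main() {
	// Anchor the pattern so it must match the whole candidate key.
	keyRe := regexp.MustCompile(`^` + tokenizer.LabelKeyMatcher + `$`)

	fmt.Println(keyRe.MatchString("app.kubernetes.io/name")) // true
	fmt.Println(keyRe.MatchString("!not a key"))             // false
}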

Variables

This section is empty.

Functions

This section is empty.

Types

type Token

type Token struct {
	Kind  tokenKind   // which of the Tok* constants this token is
	Value interface{} // the token's associated value, if any
}

func Tokenize

func Tokenize(input string) (tokens []Token, err error)
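
Tokenize scans the input string and returns the resulting slice of tokens, or an error if the input cannot be tokenized. A minimal usage sketch follows; the import path and the example expression are assumptions based on the token kinds listed in the Constants section, not a documented grammar.

package main

import (
	"fmt"

	// Import path assumed; adjust to match your module.
	"github.com/projectcalico/libcalico-go/lib/selector/tokenizer"
)

func main() {
	// Example input chosen to exercise label, string, operator and
	// keyword tokens; the exact accepted syntax is an assumption.
	tokens, err := tokenizer.Tokenize(`env == "prod" && has(role)`)
	if err != nil {
		fmt.Println("tokenize error:", err)
		return
	}
	for _, tok := range tokens {
		fmt.Printf("kind=%v value=%v\n", tok.Kind, tok.Value)
	}
}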
