Documentation
Index
Constants
const EOFRUNE = -1
const FlagExpiration = "expiration"
FlagExpiration indicates that `expiration` is supported as a first-class feature in the schema.
Variables
var Flags = map[string]transformer{
	FlagExpiration: func(lexeme Lexeme) (Lexeme, bool) {
		if lexeme.Kind == TokenTypeIdentifier && lexeme.Value == "expiration" {
			lexeme.Kind = TokenTypeKeyword
			return lexeme, true
		}
		if lexeme.Kind == TokenTypeIdentifier && lexeme.Value == "and" {
			lexeme.Kind = TokenTypeKeyword
			return lexeme, true
		}
		return lexeme, false
	},
}
Flags is a map of flag names to their corresponding transformers.
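A minimal sketch of invoking one of these transformers directly (not an example taken from the package itself); the import path is assumed from the repository layout, and the lexeme is constructed by hand rather than produced by a Lexer.

package main

import (
	"fmt"

	"github.com/authzed/spicedb/pkg/schemadsl/lexer" // assumed import path
)

func main() {
	// A hand-built identifier lexeme spelled "expiration"; the FlagExpiration
	// transformer promotes it to a keyword and reports that it changed it.
	tok := lexer.Lexeme{Kind: lexer.TokenTypeIdentifier, Value: "expiration"}
	if transformed, ok := lexer.Flags[lexer.FlagExpiration](tok); ok {
		fmt.Println(transformed.Kind == lexer.TokenTypeKeyword) // true
	}
}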
Functions
Types
type FlaggableLexler added in v1.40.1
type FlaggableLexler struct {
// contains filtered or unexported fields
}
FlaggableLexler wraps a lexer, automatically translating tokens based on flags, if any.
func NewFlaggableLexler added in v1.40.1
func NewFlaggableLexler(lex *Lexer) *FlaggableLexler
NewFlaggableLexler returns a new FlaggableLexler for the given lexer.
func (*FlaggableLexler) Close added in v1.40.1
func (l *FlaggableLexler) Close()
Close stops the lexer from running.
func (*FlaggableLexler) NextToken added in v1.40.1
func (l *FlaggableLexler) NextToken() Lexeme
NextToken returns the next token found in the lexer.
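As a usage sketch (assuming the same import path as above, and that a *Lexer has already been obtained from this package's constructor, which is not shown in this section), the wrapper can be drained token by token until TokenTypeEOF:

import (
	"fmt"

	"github.com/authzed/spicedb/pkg/schemadsl/lexer" // assumed import path
)

// collectLexemes is an illustrative helper, not part of the package: it wraps
// an existing *Lexer in a FlaggableLexler and reads tokens until EOF, turning
// an error token into a Go error.
func collectLexemes(lex *lexer.Lexer) ([]lexer.Lexeme, error) {
	flaggable := lexer.NewFlaggableLexler(lex)
	defer flaggable.Close() // stop the underlying lexer when done

	var lexemes []lexer.Lexeme
	for {
		tok := flaggable.NextToken()
		switch tok.Kind {
		case lexer.TokenTypeEOF:
			return lexemes, nil
		case lexer.TokenTypeError:
			return lexemes, fmt.Errorf("lex error at %v: %s", tok.Position, tok.Error)
		default:
			lexemes = append(lexemes, tok)
		}
	}
}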
type Lexeme
type Lexeme struct {
	Kind     TokenType          // The type of this lexeme.
	Position input.BytePosition // The starting position of this token in the input string.
	Value    string             // The textual value of this token.
	Error    string             // The error associated with the lexeme, if any.
}
Lexeme represents a token returned from scanning the contents of a file.
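A small formatting helper, purely illustrative and assuming the same import path as above, shows how the documented fields fit together when reporting a lexeme:

// describeLexeme renders a Lexeme for logs: error lexemes carry their message
// in Error, while ordinary lexemes carry their text in Value.
func describeLexeme(tok lexer.Lexeme) string {
	if tok.Kind == lexer.TokenTypeError {
		return fmt.Sprintf("error at byte offset %v: %s", tok.Position, tok.Error)
	}
	return fmt.Sprintf("token %v %q at byte offset %v", tok.Kind, tok.Value, tok.Position)
}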
type Lexer
Lexer holds the state of the scanner.
type TokenType
type TokenType int
TokenType identifies the type of lexer lexemes.
const (
	TokenTypeError TokenType = iota // error occurred; value is text of error

	// Synthetic semicolon
	TokenTypeSyntheticSemicolon

	TokenTypeEOF
	TokenTypeWhitespace
	TokenTypeSinglelineComment
	TokenTypeMultilineComment
	TokenTypeNewline

	TokenTypeKeyword    // interface
	TokenTypeIdentifier // helloworld
	TokenTypeNumber     // 123

	TokenTypeLeftBrace  // {
	TokenTypeRightBrace // }
	TokenTypeLeftParen  // (
	TokenTypeRightParen // )

	TokenTypePipe   // |
	TokenTypePlus   // +
	TokenTypeMinus  // -
	TokenTypeAnd    // &
	TokenTypeDiv    // /
	TokenTypeEquals // =

	TokenTypeColon     // :
	TokenTypeSemicolon // ;

	TokenTypeRightArrow // ->

	TokenTypeHash     // #
	TokenTypeEllipsis // ...
	TokenTypeStar     // *

	// Additional tokens for CEL: https://github.com/google/cel-spec/blob/master/doc/langdef.md#syntax
	TokenTypeQuestionMark       // ?
	TokenTypeConditionalOr      // ||
	TokenTypeConditionalAnd     // &&
	TokenTypeExclamationPoint   // !
	TokenTypeLeftBracket        // [
	TokenTypeRightBracket       // ]
	TokenTypePeriod             // .
	TokenTypeComma              // ,
	TokenTypePercent            // %
	TokenTypeLessThan           // <
	TokenTypeGreaterThan        // >
	TokenTypeLessThanOrEqual    // <=
	TokenTypeGreaterThanOrEqual // >=
	TokenTypeEqualEqual         // ==
	TokenTypeNotEqual           // !=
	TokenTypeString             // "...", '...', """...""", '''...'''
)
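For example, a caller that only wants semantically meaningful tokens might skip trivia kinds; this helper is a sketch (not part of the package) built only from the constants listed above:

// isTrivia reports whether a token kind is lexical trivia (whitespace or a
// comment). Newlines and synthetic semicolons are excluded here because the
// grammar may treat them as statement terminators.
func isTrivia(kind lexer.TokenType) bool {
	switch kind {
	case lexer.TokenTypeWhitespace,
		lexer.TokenTypeSinglelineComment,
		lexer.TokenTypeMultilineComment:
		return true
	default:
		return false
	}
}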