Documentation ¶
Overview ¶
Package lexer contains the LogQL lexer.
Index ¶
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
Types ¶
type TokenType ¶
type TokenType int
TokenType defines the LogQL token type.
const (
	Invalid TokenType = iota
	EOF
	Ident

	// Literals
	String
	Number
	Duration
	Bytes

	Comma
	Dot
	OpenBrace
	CloseBrace
	Eq
	NotEq
	Re
	NotRe
	PipeExact
	PipeMatch
	Pipe
	Unwrap
	OpenParen
	CloseParen
	By
	Without
	Bool
	OpenBracket
	CloseBracket
	Offset
	On
	Ignoring
	GroupLeft
	GroupRight

	// Binary operations
	Or
	And
	Unless
	Add
	Sub
	Mul
	Div
	Mod
	Pow

	// Comparison operations
	CmpEq
	Gt
	Gte
	Lt
	Lte

	JSON
	Regexp
	Logfmt
	Unpack
	Pattern
	LabelFormat
	LineFormat
	IP
	Decolorize
	Distinct
	Drop
	Keep

	Range
	Rate
	RateCounter
	CountOverTime
	BytesRate
	BytesOverTime
	AvgOverTime
	SumOverTime
	MinOverTime
	MaxOverTime
	StdvarOverTime
	StddevOverTime
	QuantileOverTime
	FirstOverTime
	LastOverTime
	AbsentOverTime

	Vector
	Sum
	Avg
	Max
	Min
	Count
	Stddev
	Stdvar
	Bottomk
	Topk
	Sort
	SortDesc

	LabelReplace

	BytesConv
	DurationConv
	DurationSecondsConv
)
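For illustration, here is a minimal sketch of grouping these token types at the call site. The import path is a placeholder (the page does not show the module path), and the kind helper is hypothetical, not part of the package.

package main

import (
	"fmt"

	"example.com/logql/lexer" // placeholder import path; substitute the package's real module path
)

// kind is a hypothetical helper that buckets a TokenType into a coarse category.
func kind(t lexer.TokenType) string {
	switch t {
	case lexer.String, lexer.Number, lexer.Duration, lexer.Bytes:
		return "literal"
	case lexer.Or, lexer.And, lexer.Unless, lexer.Add, lexer.Sub, lexer.Mul, lexer.Div, lexer.Mod, lexer.Pow:
		return "binary operation"
	case lexer.CmpEq, lexer.Gt, lexer.Gte, lexer.Lt, lexer.Lte:
		return "comparison"
	default:
		return "other"
	}
}

func main() {
	fmt.Println(kind(lexer.Duration)) // literal
	fmt.Println(kind(lexer.Pow))      // binary operation
}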
func (TokenType) IsFunction ¶
func (TokenType) IsFunction() bool
IsFunction returns true if the token is a function name.
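A hedged usage sketch follows; the import path is a placeholder, and which constants report true is an assumption based on LogQL, where rate(...) is a range aggregation function and a plain identifier is not.

package main

import (
	"fmt"

	"example.com/logql/lexer" // placeholder import path
)

func main() {
	fmt.Println(lexer.Rate.IsFunction())  // expected: true, rate(...) is a LogQL function
	fmt.Println(lexer.Ident.IsFunction()) // expected: false, an identifier is not a function name
}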
type TokenizeOptions ¶
type TokenizeOptions struct {
	// Filename sets the filename for the scanner.
	Filename string
	// AllowDots allows dots in identifiers.
	AllowDots bool
}
TokenizeOptions describes options for Tokenize.
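The doc comment implies a package-level Tokenize function that consumes these options, but its signature is not shown on this page. The sketch below therefore assumes a signature of the form Tokenize(input string, opts TokenizeOptions) ([]Token, error) and a Token with Type and Text fields; treat these names, and the import path, as assumptions rather than the confirmed API.

package main

import (
	"fmt"
	"log"

	"example.com/logql/lexer" // placeholder import path
)

func main() {
	opts := lexer.TokenizeOptions{
		Filename:  "query.logql", // reported by the scanner in positions and errors
		AllowDots: true,          // permit dots inside identifiers
	}
	// Assumed signature and Token fields; the page documents only TokenizeOptions.
	toks, err := lexer.Tokenize(`{job="app"} |= "error"`, opts)
	if err != nil {
		log.Fatal(err)
	}
	for _, tok := range toks {
		fmt.Println(tok.Type, tok.Text)
	}
}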