Documentation
Overview
Package lexer provides a Handlebars tokenizer.
Example
source := "You know {{nothing}} John Snow"
output := ""

lex := Scan(source)
for {
	// consume next token
	token := lex.NextToken()

	output += fmt.Sprintf(" %s", token)

	// stops when all tokens have been consumed, or on error
	if token.Kind == TokenEOF || token.Kind == TokenError {
		break
	}
}

fmt.Print(output)
Output: Content{"You know "} Open{"{{"} ID{"nothing"} Close{"}}"} Content{" John Snow"} EOF
Constants
const (
	// Mustaches detection
	ESCAPED_ESCAPED_OPEN_MUSTACHE  = "\\\\{{"
	ESCAPED_OPEN_MUSTACHE          = "\\{{"
	OPEN_MUSTACHE                  = "{{"
	CLOSE_MUSTACHE                 = "}}"
	CLOSE_STRIP_MUSTACHE           = "~}}"
	CLOSE_UNESCAPED_STRIP_MUSTACHE = "}~}}"
)
const (
	// Option to generate token position in its string representation
	DUMP_TOKEN_POS = false

	// Option to generate values for all token kinds in their string representations
	DUMP_ALL_TOKENS_VAL = true
)
Variables
This section is empty.
Functions
This section is empty.
Types
type Lexer
type Lexer struct {
// contains filtered or unexported fields
}
Lexer is a lexical analyzer.
func Scan
Scan scans the given input.

Tokens can then be fetched sequentially using the NextToken() method on the returned lexer.
type Token
type Token struct {
	Kind TokenKind // Token kind
	Val  string    // Token value
	Pos  int       // Byte position in input string
	Line int       // Line number in input string
}
Token represents a scanned token.
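The example output above suggests that a token stringifies as Kind{"Val"}, with value-less kinds such as EOF printed as a bare name. The following self-contained sketch illustrates that formatting with local stand-in types; it is not the package's actual String implementation, and the kind-name map and the EOF special case are assumptions inferred from the example output.

```go
package main

import "fmt"

// TokenKind stands in for the package's token kind enumeration (illustration only).
type TokenKind int

const (
	TokenEOF TokenKind = iota
	TokenContent
)

// names maps kinds to the labels seen in the example output.
var names = map[TokenKind]string{
	TokenEOF:     "EOF",
	TokenContent: "Content",
}

// Token mirrors the fields of the package's Token struct.
type Token struct {
	Kind TokenKind // Token kind
	Val  string    // Token value
	Pos  int       // Byte position in input string
	Line int       // Line number in input string
}

// String formats a token as Kind{"Val"}; value-less kinds
// (like EOF) are printed as the bare kind name.
func (t Token) String() string {
	if t.Kind == TokenEOF {
		return names[t.Kind]
	}
	return fmt.Sprintf("%s{%q}", names[t.Kind], t.Val)
}

func main() {
	fmt.Println(Token{Kind: TokenContent, Val: "You know ", Pos: 0, Line: 1})
	fmt.Println(Token{Kind: TokenEOF})
}
```

Running this prints Content{"You know "} followed by EOF, matching the shape of the example output.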
type TokenKind
type TokenKind int
TokenKind represents the kind of a Token.
const (
	TokenError TokenKind = iota
	TokenEOF

	// mustache delimiters
	TokenOpen             // OPEN
	TokenClose            // CLOSE
	TokenOpenRawBlock     // OPEN_RAW_BLOCK
	TokenCloseRawBlock    // CLOSE_RAW_BLOCK
	TokenOpenEndRawBlock  // END_RAW_BLOCK
	TokenOpenUnescaped    // OPEN_UNESCAPED
	TokenCloseUnescaped   // CLOSE_UNESCAPED
	TokenOpenBlock        // OPEN_BLOCK
	TokenOpenEndBlock     // OPEN_ENDBLOCK
	TokenInverse          // INVERSE
	TokenOpenInverse      // OPEN_INVERSE
	TokenOpenInverseChain // OPEN_INVERSE_CHAIN
	TokenOpenPartial      // OPEN_PARTIAL
	TokenComment          // COMMENT

	// inside mustaches
	TokenOpenSexpr        // OPEN_SEXPR
	TokenCloseSexpr       // CLOSE_SEXPR
	TokenEquals           // EQUALS
	TokenData             // DATA
	TokenSep              // SEP
	TokenOpenBlockParams  // OPEN_BLOCK_PARAMS
	TokenCloseBlockParams // CLOSE_BLOCK_PARAMS

	// tokens with content
	TokenContent // CONTENT
	TokenID      // ID
	TokenString  // STRING
	TokenNumber  // NUMBER
	TokenBoolean // BOOLEAN
)
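Because the kinds are declared with iota in grouped order, the grouping comments translate naturally into range checks. The sketch below shows one such check for the "tokens with content" group; IsContentKind is a hypothetical helper for illustration, not part of the package, and it assumes the declaration order shown above.

```go
package main

import "fmt"

type TokenKind int

// Kind values copied from the listing above; only the
// boundaries of each group matter for the range check.
const (
	TokenError TokenKind = iota
	TokenEOF

	// mustache delimiters
	TokenOpen
	TokenClose
	TokenOpenRawBlock
	TokenCloseRawBlock
	TokenOpenEndRawBlock
	TokenOpenUnescaped
	TokenCloseUnescaped
	TokenOpenBlock
	TokenOpenEndBlock
	TokenInverse
	TokenOpenInverse
	TokenOpenInverseChain
	TokenOpenPartial
	TokenComment

	// inside mustaches
	TokenOpenSexpr
	TokenCloseSexpr
	TokenEquals
	TokenData
	TokenSep
	TokenOpenBlockParams
	TokenCloseBlockParams

	// tokens with content
	TokenContent
	TokenID
	TokenString
	TokenNumber
	TokenBoolean
)

// IsContentKind reports whether a kind belongs to the
// "tokens with content" group at the end of the enumeration.
func IsContentKind(k TokenKind) bool {
	return k >= TokenContent && k <= TokenBoolean
}

func main() {
	fmt.Println(IsContentKind(TokenID))   // true
	fmt.Println(IsContentKind(TokenOpen)) // false
}
```

A range check like this stays correct only as long as the group remains contiguous in the const block, which is why keeping the grouping comments next to the declarations matters.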