Documentation

Overview
Package lexer is based on Rob Pike's talk: https://www.youtube.com/watch?v=HxaD_trXwRE
Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type Lexer
type Lexer interface {
	// Next returns the next available Token from the lexer.
	// If no more Tokens are available or the lexer was closed,
	// the Token has type EOF.
	Next() Token
}
The Lexer interface represents a lexer that lexes its input into Tokens.
type Token
Token is an item returned from the scanner.
func (Token) IsDelimiter
IsDelimiter returns true if the token is a delimiter.
type TokenType
type TokenType int
TokenType identifies the type of lex tokens.
const (
	ILLEGAL TokenType = iota
	EOF
	IDENT     // sendData
	INT       // 123
	RAWSTRING // `rawstring`
	BYTESIZE  // 1MB
	DURATION  // 80ns
	VERSION
	ERRORS
	ENUM
	TYPE
	SERVICE
	CALL
	STREAM
	ASYNC
	ARG
	RET
	MAXARGSIZE
	MAXRETSIZE
	TIMEOUT
	MAP
	LPAREN // (
	RPAREN // )
	LBRACE // {
	RBRACE // }
	LBRACK // [
	RBRACK // ]
	COLON  // :
	EQUAL  // =
	COMMA  // ,
)