Documentation ¶
Index ¶
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
Types ¶
type Lexer ¶
type Lexer Tokenizer
Lexer turns an input stream into a sequence of tokens. Whitespace and comments are skipped.
type TokenType ¶
type TokenType int
TokenType is a top-level token classification: word, space, comment, or unknown.
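The documentation lists the categories but not the constant names. A minimal sketch of what such an enumeration typically looks like in Go, with the constant names (`TokenUnknown`, `TokenWord`, etc.) assumed for illustration:

```go
package main

import "fmt"

// TokenType classifies a token. The constant names below are assumptions;
// the docs only name the categories: word, space, comment, unknown.
type TokenType int

const (
	TokenUnknown TokenType = iota // anything not otherwise classified
	TokenWord
	TokenSpace
	TokenComment
)

// String makes TokenType values readable when printed.
func (t TokenType) String() string {
	switch t {
	case TokenWord:
		return "word"
	case TokenSpace:
		return "space"
	case TokenComment:
		return "comment"
	default:
		return "unknown"
	}
}

func main() {
	fmt.Println(TokenWord) // prints "word"
}
```

Making the zero value the "unknown" case is a common Go convention, so an uninitialized TokenType is safely classified.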
type Tokenizer ¶
type Tokenizer struct {
// contains filtered or unexported fields
}
Tokenizer turns an input stream into a sequence of typed tokens.
func NewTokenizer ¶
NewTokenizer creates a new tokenizer from an input stream.