Documentation ¶
Overview ¶
Package tokenizer implements a rudimentary parser that extracts tokens from a buffered io.Reader while respecting quotes and parenthesis boundaries.
Example
tk := tokenizer.NewFromString("a, b, (c, d)")
result, _ := tk.ScanAll() // ["a", "b", "(c, d)"]
Index ¶
Constants ¶
This section is empty.
Variables ¶
var DefaultSeparators = []rune{','}
DefaultSeparators is a list with the default token separator characters.
Functions ¶
This section is empty.
Types ¶
type Tokenizer ¶
type Tokenizer struct {
// contains filtered or unexported fields
}
Tokenizer defines a struct that parses a reader into tokens while respecting quotes and parenthesis boundaries.
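The idea of boundary-aware splitting can be illustrated with a short, self-contained sketch. This is a simplified, hypothetical re-implementation over a plain string for illustration only; the actual Tokenizer streams from an io.Reader and its internals may differ, and the helper name splitRespectingBoundaries is made up here:

```go
package main

import (
	"fmt"
	"strings"
)

// splitRespectingBoundaries splits s on sep, but only when the separator
// appears outside of quotes and outside of parentheses. Empty tokens are
// skipped and results are trimmed, mirroring the default Scan() behavior.
func splitRespectingBoundaries(s string, sep rune) []string {
	var tokens []string
	var buf strings.Builder
	depth := 0     // current parenthesis nesting level
	var quote rune // active quote character, 0 if none

	for _, r := range s {
		switch {
		case quote != 0:
			if r == quote {
				quote = 0 // closing quote
			}
			buf.WriteRune(r)
		case r == '\'' || r == '"':
			quote = r
			buf.WriteRune(r)
		case r == '(':
			depth++
			buf.WriteRune(r)
		case r == ')':
			depth--
			buf.WriteRune(r)
		case r == sep && depth == 0:
			if t := strings.TrimSpace(buf.String()); t != "" {
				tokens = append(tokens, t)
			}
			buf.Reset()
		default:
			buf.WriteRune(r)
		}
	}
	if t := strings.TrimSpace(buf.String()); t != "" {
		tokens = append(tokens, t)
	}
	return tokens
}

func main() {
	fmt.Println(splitRespectingBoundaries(`a, "b, c", (d, e)`, ','))
	// prints: [a "b, c" (d, e)]
}
```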
func NewFromBytes ¶
NewFromBytes creates a new Tokenizer from the provided byte slice.
func NewFromString ¶
NewFromString creates a new Tokenizer from the provided string.
func (*Tokenizer) IgnoreParenthesis ¶
IgnoreParenthesis defines whether to ignore the parenthesis boundaries and to treat '(' and ')' as regular characters.
func (*Tokenizer) KeepEmptyTokens ¶
KeepEmptyTokens defines whether to keep empty tokens on Scan() (defaults to false).
func (*Tokenizer) KeepSeparator ¶
KeepSeparator defines whether to keep the separator rune as part of the token (defaults to false).
func (*Tokenizer) Scan ¶
Scan reads and returns the next available token from the Tokenizer's buffer (trimmed!).
Empty tokens are skipped if t.keepEmptyTokens is not set (which is the default).
Returns io.EOF error when there are no more tokens to scan.
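Because Scan signals exhaustion with io.EOF, a typical consuming loop can look like the following sketch (written in the same abbreviated style as the package example above; error handling is condensed):

tk := tokenizer.NewFromString(`title, "full name", (a, b)`)

for {
    token, err := tk.Scan()
    if err != nil {
        if err != io.EOF {
            // handle the unexpected error
        }
        break
    }
    fmt.Println(token)
}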
func (*Tokenizer) ScanAll ¶
ScanAll reads the entire Tokenizer's buffer and returns all found tokens.
func (*Tokenizer) Separators ¶
Separators defines the separators of the current Tokenizer.