Documentation
Overview ¶
The token package provides functions that tokenise strings and manipulate tokenised strings. The functions support multiple token separators and quote-aware tokenisation, giving greater flexibility than the simple strings.Split and strings.SplitN functions.
Index ¶
- func Tokenise_count(buf string, sepchrs string) (cmap map[string]int)
- func Tokenise_drop(buf string, sepchrs string) (ntokens int, tokens []string)
- func Tokenise_keep(buf string, sepchrs string) (int, []string)
- func Tokenise_populated(buf string, sepchrs string) (int, []string)
- func Tokenise_qpopulated(buf string, sepchrs string) (int, []string)
- func Tokenise_qsep(buf string, sepchrs string) (int, []string)
- func Tokenise_qsepu(buf string, sepchrs string) (int, []string)
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
func Tokenise_count ¶
Tokenises the string using Tokenise_qsep and then builds a map which counts each token (map[string]int). The caller can use the map as a simple de-duplication mechanism.
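The counting step can be sketched with the standard library. The helper below is illustrative only: it uses a plain (non quote-aware) split via strings.FieldsFunc rather than the package's Tokenise_qsep, and the function name is an assumption, not the package's API.

```go
package main

import (
	"fmt"
	"strings"
)

// tokeniseCount is an illustrative sketch of the documented behaviour:
// split buf at any character found in sepchrs, then count each token.
// The real package splits with Tokenise_qsep (quote aware); this sketch
// uses a plain strings.FieldsFunc split instead.
func tokeniseCount(buf, sepchrs string) map[string]int {
	cmap := make(map[string]int)
	for _, tok := range strings.FieldsFunc(buf, func(r rune) bool {
		return strings.ContainsRune(sepchrs, r)
	}) {
		cmap[tok]++
	}
	return cmap
}

func main() {
	fmt.Println(tokeniseCount("a,b,a,c", ",")) // map[a:2 b:1 c:1]
}
```

Iterating over the map's keys then yields the de-duplicated token set.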
func Tokenise_drop ¶
Takes a string and slices it into tokens using the characters in sepchrs as the breaking points. The separation characters are discarded.
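A minimal stdlib sketch of this behaviour follows. It assumes (since Tokenise_populated exists to drop empty fields) that Tokenise_drop keeps empty tokens; the function name is illustrative, not the package's implementation.

```go
package main

import (
	"fmt"
	"strings"
)

// tokeniseDrop sketches the documented behaviour: slice buf at any character
// in sepchrs and discard the separators themselves. Empty fields are kept
// here, on the assumption that dropping them is what distinguishes
// Tokenise_populated from this function.
func tokeniseDrop(buf, sepchrs string) (int, []string) {
	var tokens []string
	var cur strings.Builder
	for _, r := range buf {
		if strings.ContainsRune(sepchrs, r) {
			tokens = append(tokens, cur.String())
			cur.Reset()
		} else {
			cur.WriteRune(r)
		}
	}
	tokens = append(tokens, cur.String())
	return len(tokens), tokens
}

func main() {
	n, toks := tokeniseDrop("foo,bar||baz", ",|")
	fmt.Println(n, toks) // 4 [foo bar  baz]
}
```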
func Tokenise_keep ¶
Tokenise_keep takes a string and slices it into tokens using the characters in sepchrs as the breaking points. The separation characters are returned as individual tokens in the list. Separator characters escaped with a backslash are NOT treated as separators.
The return values are ntokens (int) and the list of tokens and separators.
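The escape handling can be sketched as below. This is an assumption-laden reimplementation, not the package's code: it assumes the backslash itself is consumed and that empty tokens between adjacent separators are not emitted.

```go
package main

import (
	"fmt"
	"strings"
)

// tokeniseKeep sketches the documented behaviour: separators become tokens
// of their own, and a separator escaped with a backslash is kept as literal
// text (the backslash is assumed to be consumed).
func tokeniseKeep(buf, sepchrs string) (int, []string) {
	var tokens []string
	var cur strings.Builder
	escaped := false
	for _, r := range buf {
		switch {
		case escaped: // previous rune was a backslash: take this rune literally
			cur.WriteRune(r)
			escaped = false
		case r == '\\':
			escaped = true
		case strings.ContainsRune(sepchrs, r):
			if cur.Len() > 0 {
				tokens = append(tokens, cur.String())
				cur.Reset()
			}
			tokens = append(tokens, string(r)) // separator returned as its own token
		default:
			cur.WriteRune(r)
		}
	}
	if cur.Len() > 0 {
		tokens = append(tokens, cur.String())
	}
	return len(tokens), tokens
}

func main() {
	n, toks := tokeniseKeep(`a,b\,c`, ",")
	fmt.Println(n, toks) // 3 [a , b,c]
}
```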
func Tokenise_populated ¶
Takes a string and slices it into tokens using the characters in sepchrs as the breaking points keeping only populated fields. The separation characters are discarded. Null (empty) tokens are dropped allowing a space separated record to treat multiple spaces as a single separator.
The return values are the number of tokens and the list of token strings.
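For a plain (non quote-aware) split, strings.FieldsFunc already provides exactly this drop-empty-fields behaviour; the wrapper below is an illustrative sketch with an assumed name, not the package's implementation.

```go
package main

import (
	"fmt"
	"strings"
)

// tokenisePopulated sketches the documented behaviour: split buf at any
// character in sepchrs and keep only non-empty fields, so runs of
// separators (e.g. multiple spaces) act as a single separator.
func tokenisePopulated(buf, sepchrs string) (int, []string) {
	tokens := strings.FieldsFunc(buf, func(r rune) bool {
		return strings.ContainsRune(sepchrs, r)
	})
	return len(tokens), tokens
}

func main() {
	n, toks := tokenisePopulated("ls   -l  file", " ")
	fmt.Println(n, toks) // 3 [ls -l file]
}
```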
func Tokenise_qpopulated ¶
Takes a string and slices it into tokens using the characters in sepchrs as the breaking points, but allowing double quotes to provide protection against separation. For example, if sepchrs is ",|", then the string
foo,bar,"hello,world",,"you|me"
would break into 4 tokens:
foo bar hello,world you|me
Similar to Tokenise_qsep, but this function removes empty tokens from the final result.
The return value is the number of tokens and the list of tokens.
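The quote-aware, drop-empties behaviour can be sketched as follows. This is an illustrative stdlib reimplementation under stated assumptions (quotes are stripped, quotes do not nest, a quoted empty string is also dropped), not the package's code.

```go
package main

import (
	"fmt"
	"strings"
)

// tokeniseQpopulated sketches the documented behaviour: split buf at any
// character in sepchrs, protecting text inside double quotes (the quotes
// themselves are stripped) and dropping empty tokens from the result.
func tokeniseQpopulated(buf, sepchrs string) (int, []string) {
	var tokens []string
	var cur strings.Builder
	inQuote := false
	flush := func() {
		if cur.Len() > 0 {
			tokens = append(tokens, cur.String())
		}
		cur.Reset()
	}
	for _, r := range buf {
		switch {
		case r == '"':
			inQuote = !inQuote
		case !inQuote && strings.ContainsRune(sepchrs, r):
			flush()
		default:
			cur.WriteRune(r)
		}
	}
	flush()
	return len(tokens), tokens
}

func main() {
	// The example from the documentation above: 4 tokens, empty field dropped.
	n, toks := tokeniseQpopulated(`foo,bar,"hello,world",,"you|me"`, ",|")
	fmt.Println(n, toks) // 4 [foo bar hello,world you|me]
}
```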
func Tokenise_qsep ¶
Takes a string and slices it into tokens using the characters in sepchrs as the breaking points, but allowing double quotes to provide protection against separation. For example, if sepchrs is ",|", then the string
foo,bar,"hello,world","you|me"
would break into 4 tokens:
foo bar hello,world you|me
If there are empty fields, they are returned as empty tokens.
The return values are the number of tokens and the list of tokens.
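The same quote-aware split, but with empty fields preserved, can be sketched as below. Again this is an illustrative reimplementation assuming quotes are stripped and do not nest; it is not the package's code.

```go
package main

import (
	"fmt"
	"strings"
)

// tokeniseQsep sketches the documented behaviour: split buf at any character
// in sepchrs, protecting text inside double quotes (the quotes themselves
// are stripped). Empty fields are returned as empty tokens.
func tokeniseQsep(buf, sepchrs string) (int, []string) {
	var tokens []string
	var cur strings.Builder
	inQuote := false
	for _, r := range buf {
		switch {
		case r == '"':
			inQuote = !inQuote
		case !inQuote && strings.ContainsRune(sepchrs, r):
			tokens = append(tokens, cur.String())
			cur.Reset()
		default:
			cur.WriteRune(r)
		}
	}
	tokens = append(tokens, cur.String())
	return len(tokens), tokens
}

func main() {
	// The example from the documentation above.
	n, toks := tokeniseQsep(`foo,bar,"hello,world","you|me"`, ",|")
	fmt.Println(n, toks) // 4 [foo bar hello,world you|me]
}
```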
Types ¶
This section is empty.