token

package
v1.0.2
Published: Jun 30, 2023 License: Apache-2.0 Imports: 1 Imported by: 1

Documentation

Overview

The token package provides functions that tokenise strings, or manipulate tokenised strings. The functions support multiple token separators and quote-aware tokenisation, giving greater flexibility than the standard library's strings.Split and strings.SplitN functions.

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

func Tokenise_count

func Tokenise_count(buf string, sepchrs string) (cmap map[string]int)

Tokenises the string using Tokenise_qsep and then builds a map which counts the occurrences of each token (map[string]int). Can be used by the caller as an easy de-duplication step.
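The package's own source isn't shown on this page, but the described behaviour (ignoring the quote handling of Tokenise_qsep for brevity) can be sketched with the standard library; the function name here is mine, not the package's:

```go
package main

import (
	"fmt"
	"strings"
)

// countTokens is a rough stdlib approximation of the described behaviour:
// split buf on any character in sepchrs, then tally each token in a map.
// Unlike Tokenise_count it is not quote-aware.
func countTokens(buf, sepchrs string) map[string]int {
	cmap := make(map[string]int)
	for _, tok := range strings.FieldsFunc(buf, func(r rune) bool {
		return strings.ContainsRune(sepchrs, r)
	}) {
		cmap[tok]++
	}
	return cmap
}

func main() {
	fmt.Println(countTokens("a,b,a,c", ",")) // → map[a:2 b:1 c:1]
}
```

Because the map values record how often each token occurred, iterating over the map keys yields the de-duplicated token set.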

func Tokenise_drop

func Tokenise_drop(buf string, sepchrs string) (ntokens int, tokens []string)

Takes a string and slices it into tokens using the characters in sepchrs as the breaking points. The separation characters are discarded.
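A minimal sketch of this splitting rule, assuming (since the source isn't shown here) that empty fields are kept; the function name is mine:

```go
package main

import (
	"fmt"
	"strings"
)

// splitAny slices buf at every character found in sepchrs, discarding
// the separator characters themselves and keeping empty fields.
func splitAny(buf, sepchrs string) (int, []string) {
	var tokens []string
	var cur strings.Builder
	for _, r := range buf {
		if strings.ContainsRune(sepchrs, r) {
			tokens = append(tokens, cur.String())
			cur.Reset()
		} else {
			cur.WriteRune(r)
		}
	}
	tokens = append(tokens, cur.String())
	return len(tokens), tokens
}

func main() {
	n, toks := splitAny("a,b||c", ",|")
	fmt.Println(n, toks) // 4 tokens: "a", "b", "", "c"
}
```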

func Tokenise_keep

func Tokenise_keep(buf string, sepchrs string) (int, []string)

Tokenise_keep takes a string and slices it into tokens using the characters in sepchrs as the breaking points. The separation characters are returned as individual tokens in the list. Separator characters escaped with a backslash are NOT treated as separators.

The return values are ntokens (int) and the list of tokens and separators.
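One way the keep-and-escape behaviour could work is sketched below; this is my reading of the description, not the package's code, and I have assumed the escaping backslash is consumed rather than kept in the token:

```go
package main

import (
	"fmt"
	"strings"
)

// splitKeep slices buf on the characters in sepchrs, emitting each
// separator as its own token. A separator preceded by a backslash is
// treated as ordinary text (the backslash itself is dropped here).
func splitKeep(buf, sepchrs string) (int, []string) {
	var tokens []string
	var cur strings.Builder
	escaped := false
	for _, r := range buf {
		switch {
		case escaped:
			cur.WriteRune(r)
			escaped = false
		case r == '\\':
			escaped = true
		case strings.ContainsRune(sepchrs, r):
			if cur.Len() > 0 {
				tokens = append(tokens, cur.String())
				cur.Reset()
			}
			tokens = append(tokens, string(r))
		default:
			cur.WriteRune(r)
		}
	}
	if cur.Len() > 0 {
		tokens = append(tokens, cur.String())
	}
	return len(tokens), tokens
}

func main() {
	n, toks := splitKeep(`a,b\,c`, ",")
	fmt.Println(n, toks) // 3 tokens: "a", ",", "b,c"
}
```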

func Tokenise_populated

func Tokenise_populated(buf string, sepchrs string) (int, []string)

Takes a string and slices it into tokens using the characters in sepchrs as the breaking points keeping only populated fields. The separation characters are discarded. Null (empty) tokens are dropped allowing a space separated record to treat multiple spaces as a single separator.

The return values are the number of tokens and the list of token strings.
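This drop-empty-fields behaviour maps closely onto strings.FieldsFunc, which also treats runs of separators as one; a sketch (function name mine):

```go
package main

import (
	"fmt"
	"strings"
)

// splitPopulated slices buf on any character in sepchrs and keeps only
// non-empty fields, so runs of separators collapse into one break.
func splitPopulated(buf, sepchrs string) (int, []string) {
	tokens := strings.FieldsFunc(buf, func(r rune) bool {
		return strings.ContainsRune(sepchrs, r)
	})
	return len(tokens), tokens
}

func main() {
	n, toks := splitPopulated("a  b   c", " ")
	fmt.Println(n, toks) // → 3 [a b c]
}
```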

func Tokenise_qpopulated

func Tokenise_qpopulated(buf string, sepchrs string) (int, []string)

Takes a string and slices it into tokens using the characters in sepchrs as the breaking points, but allowing double quotes to provide protection against separation. For example, if sepchrs is ",|", then the string

foo,bar,"hello,world",,"you|me"

would break into 4 tokens:

foo
bar
hello,world
you|me

Similar to Tokenise_qsep, but this function removes empty tokens from the final result.

The return value is the number of tokens and the list of tokens.

func Tokenise_qsep

func Tokenise_qsep(buf string, sepchrs string) (int, []string)

Takes a string and slices it into tokens using the characters in sepchrs as the breaking points, but allowing double quotes to provide protection against separation. For example, if sepchrs is ",|", then the string

foo,bar,"hello,world","you|me"

would break into 4 tokens:

foo
bar
hello,world
you|me

If there are empty fields, they are returned as empty tokens.

The return values are the number of tokens and the list of tokens.
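The quote-aware splitting described above can be sketched roughly as follows; this is a minimal approximation of the documented behaviour, not the package's implementation, and it assumes the double quotes themselves are stripped from the returned tokens:

```go
package main

import (
	"fmt"
	"strings"
)

// splitQuoted slices buf on any character in sepchrs, except that
// separators inside double quotes do not break the field. The quotes
// are stripped; empty fields are returned as empty tokens.
func splitQuoted(buf, sepchrs string) (int, []string) {
	var tokens []string
	var cur strings.Builder
	inQuote := false
	for _, r := range buf {
		switch {
		case r == '"':
			inQuote = !inQuote
		case !inQuote && strings.ContainsRune(sepchrs, r):
			tokens = append(tokens, cur.String())
			cur.Reset()
		default:
			cur.WriteRune(r)
		}
	}
	tokens = append(tokens, cur.String())
	return len(tokens), tokens
}

func main() {
	n, toks := splitQuoted(`foo,bar,"hello,world","you|me"`, ",|")
	fmt.Println(n, toks) // → 4 [foo bar hello,world you|me]
}
```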

func Tokenise_qsepu

func Tokenise_qsepu(buf string, sepchrs string) (int, []string)

Tokenises a string, but returns only an array of unique tokens. Empty tokens are discarded.
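The unique-tokens idea reduces to filtering a token list through a "seen" map; a stdlib sketch of that step (again ignoring the quote handling, name mine):

```go
package main

import (
	"fmt"
	"strings"
)

// uniqueTokens splits buf on any character in sepchrs and returns each
// non-empty token once, in order of first appearance.
func uniqueTokens(buf, sepchrs string) []string {
	seen := make(map[string]bool)
	var out []string
	for _, tok := range strings.FieldsFunc(buf, func(r rune) bool {
		return strings.ContainsRune(sepchrs, r)
	}) {
		if !seen[tok] {
			seen[tok] = true
			out = append(out, tok)
		}
	}
	return out
}

func main() {
	fmt.Println(uniqueTokens("a,b,a,,c,b", ",")) // → [a b c]
}
```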

Types

This section is empty.
