tokenizer

package
v0.0.0-...-cbcb865
Published: May 16, 2021 License: MIT Imports: 3 Imported by: 0

Documentation

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type SimpleTokenizer

type SimpleTokenizer struct {
	// contains filtered or unexported fields
}

SimpleTokenizer is an implementation of the Tokenizer interface.

func NewSimpleTokenizer

func NewSimpleTokenizer() *SimpleTokenizer

NewSimpleTokenizer is a constructor for SimpleTokenizer.
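
A minimal sketch of constructing a tokenizer. The import path below is a placeholder, since the full module path is not shown on this page; replace it with the real module path.

package main

import (
	"fmt"

	"example.com/yourmodule/pkg/tokenizer" // hypothetical import path; substitute the real module path
)

func main() {
	// Build a tokenizer with its default delimiters and heuristics.
	t := tokenizer.NewSimpleTokenizer()
	fmt.Printf("%T\n", t) // prints *tokenizer.SimpleTokenizer
}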

func (*SimpleTokenizer) DisableRegex

func (x *SimpleTokenizer) DisableRegex()

DisableRegex disables the heuristic regex patterns.

func (*SimpleTokenizer) EnableRegex

func (x *SimpleTokenizer) EnableRegex()

EnableRegex enables the heuristic regex patterns.
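
The documentation does not spell out what the heuristic patterns match, so the sketch below only shows the call sequence for turning them off and back on (reusing the import setup from the earlier sketch; the message strings are illustrative only).

t := tokenizer.NewSimpleTokenizer()

// Turn the heuristic regex patterns off before splitting.
t.DisableRegex()
plain := t.Split("example message split without heuristics")

// Turn the heuristic regex patterns back on.
t.EnableRegex()
withHeuristics := t.Split("example message split with heuristics")

_ = plain
_ = withHeuristics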

func (*SimpleTokenizer) SetDelim

func (x *SimpleTokenizer) SetDelim(d string)

SetDelim sets the characters that are treated as delimiters.
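
A sketch of customizing the delimiter set. It is assumed here that each character of the argument string is treated as a separate delimiter, and the characters chosen are only an example (reusing the earlier import setup).

t := tokenizer.NewSimpleTokenizer()

// Assumed behavior: every character in the string becomes a delimiter.
t.SetDelim(" ,=")

tokens := t.Split("key=value, other=1")
_ = tokens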

func (*SimpleTokenizer) Split

func (x *SimpleTokenizer) Split(msg string) []*Token

Split splits a log message into tokens.
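
A sketch of splitting a message and walking over the returned tokens; the sample log line and the resulting token boundaries are illustrative only (reusing the earlier import setup).

t := tokenizer.NewSimpleTokenizer()

for _, tok := range t.Split("status=200 path=/index.html") {
	fmt.Printf("%q (delim=%v)\n", tok.Data, tok.IsDelim)
}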

type Token

type Token struct {
	Data    string
	IsDelim bool
	// contains filtered or unexported fields
}

Token is a part of a log message.

func (*Token) IsSpace

func (x *Token) IsSpace() bool

IsSpace reports whether all runes in the token's Data are whitespace.
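
A sketch that keeps only the non-delimiter, non-whitespace tokens, assuming IsSpace returns true when every rune in Data is a whitespace character (reusing the earlier import setup; the input line is illustrative only).

t := tokenizer.NewSimpleTokenizer()

var words []string
for _, tok := range t.Split("GET  /health  200") {
	// Assumed: IsSpace is true for tokens made up entirely of whitespace runes.
	if tok.IsDelim || tok.IsSpace() {
		continue
	}
	words = append(words, tok.Data)
}
fmt.Println(words)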

type Tokenizer

type Tokenizer interface {
	Split(msg string) []*Token
}

Tokenizer splits a log message string into tokens.
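
Since *SimpleTokenizer provides Split with this signature, it satisfies Tokenizer, so code can be written against the interface instead of a concrete type. countTokens below is a hypothetical helper, not part of the package.

// countTokens is a hypothetical helper that accepts any Tokenizer implementation.
func countTokens(t tokenizer.Tokenizer, msg string) int {
	return len(t.Split(msg))
}

// Usage: n := countTokens(tokenizer.NewSimpleTokenizer(), "some log line")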
