tokenizer

package v0.0.0-...-a7eb69b

Published: Aug 22, 2024 License: MIT Imports: 2 Imported by: 0

Documentation

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type Tokenizer

type Tokenizer struct {
	// contains filtered or unexported fields
}

Tokenizer is a wrapper around kagome's tokenizer.Tokenizer.

func NewTokenizer

func NewTokenizer(opt ...string) *Tokenizer

NewTokenizer initializes a kagome tokenizer and returns it wrapped in a *Tokenizer.

func (*Tokenizer) Tokenize

func (m *Tokenizer) Tokenize(text string) Tokens

Tokenize tokenizes text with kagome and returns the resulting tokens as Tokens ([]tokenizer.Token).

type Tokens

type Tokens []tokenizer.Token

Tokens is a slice of tokenizer.Token with helper methods.

func (Tokens) DistinctByNoun

func (m Tokens) DistinctByNoun() (tokens Tokens)

DistinctByNoun returns the Tokens that are nouns, de-duplicated.
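
A likely shape for such a method is a map-based de-duplication keyed on the token's surface form. The sketch below is self-contained and hypothetical: the POS field is an assumption standing in for kagome's part-of-speech features, and "noun" stands in for the real tag.

```go
package main

import "fmt"

// Token is a stand-in for kagome's tokenizer.Token; the POS field is
// an assumption (kagome exposes part of speech via token features).
type Token struct {
	Surface string
	POS     string
}

type Tokens []Token

// DistinctByNoun sketches the likely approach: keep only noun tokens,
// skipping any surface form already seen.
func (m Tokens) DistinctByNoun() (tokens Tokens) {
	seen := make(map[string]bool)
	for _, t := range m {
		if t.POS != "noun" || seen[t.Surface] {
			continue
		}
		seen[t.Surface] = true
		tokens = append(tokens, t)
	}
	return tokens
}

func main() {
	in := Tokens{{"cat", "noun"}, {"runs", "verb"}, {"cat", "noun"}, {"dog", "noun"}}
	fmt.Println(in.DistinctByNoun()) // [{cat noun} {dog noun}]
}
```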

func (Tokens) Len

func (m Tokens) Len() int

Len implements sort.Interface.

func (Tokens) Less

func (m Tokens) Less(i, j int) bool

Less implements sort.Interface.

func (Tokens) Sort

func (m Tokens) Sort() Tokens

Sort sorts the Tokens by Token.Surface in ascending order and returns them.

func (Tokens) Swap

func (m Tokens) Swap(i, j int)

Swap implements sort.Interface.
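
Together, Len, Less, and Swap satisfy sort.Interface, which is what lets sort.Sort order the slice. A self-contained sketch of that pattern (Token is again a stand-in for kagome's tokenizer.Token):

```go
package main

import (
	"fmt"
	"sort"
)

// Token stands in for kagome's tokenizer.Token.
type Token struct{ Surface string }

type Tokens []Token

// Len, Less, and Swap implement sort.Interface; Less compares Surface
// ascending, matching the documented Sort order.
func (m Tokens) Len() int           { return len(m) }
func (m Tokens) Less(i, j int) bool { return m[i].Surface < m[j].Surface }
func (m Tokens) Swap(i, j int)      { m[i], m[j] = m[j], m[i] }

// Sort sorts in place via sort.Sort and returns the slice, allowing
// chained calls like t.Tokenize(text).Sort().
func (m Tokens) Sort() Tokens {
	sort.Sort(m)
	return m
}

func main() {
	ts := Tokens{{"b"}, {"c"}, {"a"}}.Sort()
	for _, t := range ts {
		fmt.Println(t.Surface) // prints a, b, c in order
	}
}
```

Returning the receiver from Sort is a small convenience for chaining; sort.Sort itself sorts in place and returns nothing.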
