# tokenize

## Overview

### Description

Separates text into tokens: words and punctuation.

### Implementation details

Uses the gopkg.in/jdkato/prose.v2 library.

### Compliance to spec

Roughly 100% compliant; further testing is planned.

## Documentation

### Index

- func New(ctx operation.InitContext) (operation.Operation, error)
- type Input
  - func (i *Input) FromMap(values map[string]interface{}) error
- type Operation
  - func (a *Operation) Eval(inputs map[string]interface{}) (interface{}, error)

### Constants

This section is empty.

### Variables

This section is empty.

### Functions

#### func New

```go
func New(ctx operation.InitContext) (operation.Operation, error)
```

### Types

#### type Input

```go
type Input struct {
	Str string `md:"str"`
}
```

#### func (*Input) FromMap

```go
func (i *Input) FromMap(values map[string]interface{}) error
```

#### type Operation

```go
type Operation struct {
	// contains filtered or unexported fields
}
```

#### func (*Operation) Eval

```go
func (a *Operation) Eval(inputs map[string]interface{}) (interface{}, error)
```

### Source Files

- metadata.go
- operation.go